A predictive data-driven state-dependent decision approach to determine inventory system states for critical spare parts

Document Type: Research Paper


1 Department of Industrial Engineering, South Tehran Branch, Islamic Azad University, Tehran, Iran

2 Department of Industrial Engineering, Islamic Azad University, Bonab Branch, Bonab, Iran



The Markov chain is widely used in state-dependent inventory control of spare parts because of its ability to model the gradual degradation of components and predict their condition. Previous studies have also shown that incorporating system information significantly reduces costs. The present study therefore extracts system information with a machine learning algorithm and supplies it, in the form of a transition matrix, to a Markov decision process (MDP) that determines the future states of a critical-spare-parts inventory system. In the presented method, the machine learning algorithm, here an Adaptive Neuro-Fuzzy Inference System (ANFIS), is responsible for learning from the data, and the Markov chain uses the trained output to predict the future states of the inventory system. For this purpose, four states are considered, each representing a level of stress and demand in the inventory system. Applying the model to data collected for a critical component showed that it predicts the next states of the system with good accuracy. The presented model also yields lower error rates (RMSE and MAPE) than an ARIMA model for predicting the next state of the inventory system.
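The core mechanism described above, a four-state Markov chain whose transition matrix drives one-step-ahead state prediction, can be sketched as follows. This is a minimal illustration only: the state labels and transition probabilities are hypothetical placeholders, whereas in the paper the transition matrix would be estimated from the ANFIS-trained data.

```python
# Hedged sketch: one-step-ahead prediction of an inventory-system state
# with a 4-state Markov chain. All numbers below are illustrative;
# the paper derives the transition matrix from ANFIS-trained data.

STATES = ["S1", "S2", "S3", "S4"]  # four stress/demand levels (hypothetical labels)

# Row i gives P(next state = j | current state = i); each row sums to 1.
P = [
    [0.70, 0.20, 0.08, 0.02],
    [0.15, 0.60, 0.20, 0.05],
    [0.05, 0.25, 0.55, 0.15],
    [0.02, 0.08, 0.30, 0.60],
]

def next_distribution(dist, matrix):
    """One-step state distribution: the vector-matrix product dist @ matrix."""
    n = len(matrix)
    return [sum(dist[i] * matrix[i][j] for i in range(n)) for j in range(n)]

def most_likely_next_state(current_state_index, matrix):
    """Index of the most probable successor of a known current state."""
    row = matrix[current_state_index]
    return max(range(len(row)), key=row.__getitem__)

if __name__ == "__main__":
    dist = [1.0, 0.0, 0.0, 0.0]        # system known to be in state S1
    print(next_distribution(dist, P))   # one-step-ahead state probabilities
    print(STATES[most_likely_next_state(0, P)])
```

With a learned transition matrix in place of `P`, repeated application of `next_distribution` gives the multi-step state forecasts against which a baseline such as ARIMA could be compared.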