P.Y. Boyko, E.M. Bikov, E.I. Sokolov, D.A. Yarotsky Application of Machine Learning to Incident Ranking at Moscow Railway


Moscow Railway, a large railway network comprising 8,800 kilometers of track and 549 stations, is equipped with tens of thousands of devices for the automatic registration of system failures. Alerts produced by these devices are processed by operators of the Infrastructure Management Center. The alert flow is very intense and places significant stress on the operators, while about 97% of the signals turn out to be false alarms. To optimize the operation of the Center, we used machine learning to develop an automated incident ranking model that estimates the probability of an actual failure from multiple features of the registered incident. The model was trained as an ensemble of decision trees with the XGBoost algorithm on a database of 5 million historical incidents. The model has been integrated into the software infrastructure of the Center and is used in the daily work of its operators.
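The ranking approach described above can be illustrated with a minimal sketch: train a gradient-boosted tree ensemble on a heavily imbalanced incident set, then sort incidents by the estimated failure probability. All data and feature names here are synthetic illustrations, and scikit-learn's GradientBoostingClassifier stands in for XGBoost (the library the article actually uses); this is not the authors' production pipeline.

```python
# Sketch of incident ranking with gradient-boosted trees.
# Synthetic data only; GradientBoostingClassifier stands in for XGBoost.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000

# Hypothetical incident features: device type code, alert duration, hour of day.
X = np.column_stack([
    rng.integers(0, 10, n),       # device type (categorical code)
    rng.exponential(30.0, n),     # alert duration, seconds
    rng.integers(0, 24, n),       # hour of day
])

# ~3% actual failures (matching the ~97% false-alarm rate in the abstract),
# weakly correlated with alert duration so there is signal to learn.
p_fail = 0.03 * (1 + X[:, 1] / X[:, 1].mean()) / 2
y = rng.random(n) < p_fail

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0, stratify=y)
model = GradientBoostingClassifier(n_estimators=100, max_depth=3, random_state=0)
model.fit(X_tr, y_tr)

# Rank test incidents by estimated failure probability, most suspicious first;
# operators would then review incidents from the top of this list.
proba = model.predict_proba(X_te)[:, 1]
ranking = np.argsort(-proba)
print(proba[ranking[:5]])
```

In this setup operators work down the ranked list instead of processing the raw alert stream, which is the operational change the model enables.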


Keywords: railroad monitoring, incident ranking, machine learning, feature engineering, ensemble of decision trees, XGBoost.

PP. 43-53.




