Yu. A. Dubnov, A. V. Boulytchev Approximate Estimation Using the Accelerated Maximum Entropy Method. Part 1. Problem Statement and Implementation for the Regression Problem

This work develops an entropy estimation method with "soft" randomization for recovering the parameters of probabilistic mathematical models from available observations. Soft randomization refers to adding a regularization term to the information entropy functional, which simplifies the optimization problem and speeds up learning compared to the traditional maximum entropy method. The paper develops the concept of the soft-randomization entropy estimation method, including the derivation of the entropy-optimal probability density functions in general form. In the experiments, several types of model regularization were tested on a classical regression analysis problem.
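The abstract describes replacing the hard data constraints of the classical maximum entropy method with a regularization term added to the entropy functional. A minimal sketch of that idea for linear regression is shown below; the grid ranges, the synthetic data, and the regularization weight `lam` are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Sketch of "soft" entropy regularization (illustrative assumptions):
# instead of maximizing entropy under exact data-balance constraints,
# maximize H(p) - lam * E_p[squared error]; the maximizer is the
# Gibbs-type density p(theta) ~ exp(-lam * loss(theta)).

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 50)
y = 2.0 * x + 0.5 + rng.normal(0, 0.1, size=x.size)   # synthetic data

# Discretize the parameter space (slope a, intercept b) on a grid.
a_grid = np.linspace(0, 4, 81)
b_grid = np.linspace(-1, 2, 61)
A, B = np.meshgrid(a_grid, b_grid, indexing="ij")

# Mean squared error for every grid point.
pred = A[..., None] * x + B[..., None]                # shape (81, 61, 50)
loss = ((pred - y) ** 2).mean(axis=-1)

lam = 50.0                                            # regularization weight (assumed)
logp = -lam * loss
p = np.exp(logp - logp.max())
p /= p.sum()                                          # entropy-optimal (Gibbs) PMF

# Point estimates: means of the randomized parameters.
a_hat = (p * A).sum()
b_hat = (p * B).sum()
print(a_hat, b_hat)
```

Larger `lam` concentrates the density near the least-squares solution, while smaller `lam` keeps the parameter distribution closer to uniform (maximum entropy), which is the trade-off the abstract refers to.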


Keywords: probabilistic mathematical model, maximum entropy method, linear regression, regularization.

Pp. 69–80.

DOI 10.14357/20718632220407

1. Björck Å. Numerical Methods for Least Squares Problems. SIAM, 1996.
2. Huang D.S. Regression and Econometric Methods. New York: John Wiley & Sons, 1970. Pp. 127–147.
3. Mishulina O.A. Statistical Analysis and Processing of Time Series [Statisticheskij analiz i obrabotka vremennyh ryadov]. Moscow: MEPhI, 2004. 180 p.
4. Woodward W.A., Gray H.L., Elliott A.C. Applied Time Series Analysis. CRC Press, 2012.
5. Golan A., Judge G.G., Miller D. Maximum Entropy Econometrics: Robust Estimation with Limited Data. Chichester, U.K.: John Wiley and Sons Ltd., 1996.
6. Popkov Yu.S., Dubnov Yu.A. Entropy-robust randomized forecasting under small sets of retrospective data // Automation and Remote Control. 2016. V. 77, Iss. 5. Pp. 839–854.
7. Popkov Y.S., Dubnov Y.A., Popkov A.Y. New Method of Randomized Forecasting Using Entropy-Robust Estimation: Application to the World Population Prediction // Mathematics. 2016. V. 4, Iss. 1. Pp. 1–16.
8. Jaynes E.T. Information Theory and Statistical Mechanics // Physical Review. 1957. V. 106. Pp. 620–630.
9. Levine R.D., Tribus M. The Maximum Entropy Formalism. MIT Press, 1979.
10. Jaynes E.T. Probability Theory: The Logic of Science. Cambridge Univ. Press, 2003.
11. Wu X. A Weighted Generalized Maximum Entropy Estimator with a Data-driven Weight // Entropy. 2009. No. 11.
12. Popkov Y., Popkov A., Dubnov Y. Elements of Randomized Forecasting and Its Application to Daily Electrical Load Prediction in a Regional Power System // Automation and Remote Control. 2020. V. 81, Iss. 7. Pp. 1286–1306.
13. Popkov Y.S., Popkov A.Y., Dubnov Y.A., Solomatine D. Entropy-Randomized Forecasting of Stochastic Dynamic Regression Models // Mathematics. 2020. V. 8, No. 1119.
14. Popkov Yu.S., Volkovich Z., Dubnov Yu.A., Avros R., Ravve E. Entropy "2"-Soft Classification of Objects // Entropy. 2017. V. 19, Iss. 4, No. 178.
15. Dubnov Yu.A. Entropy-Based Estimation in Classification Problems // Automation and Remote Control. 2019. V. 80, Iss. 3. Pp. 502–512.
16. Voevodin V.V., Kuznetsov Yu.A. Matrices and Computations [Matricy i vychisleniya]. Moscow: Nauka, 1984.
17. Tsypkin Ya.Z. Foundations of the Theory of Learning Systems [Osnovy teorii obuchayushchihsya sistem]. Moscow: Nauka, 1970.
18. Kaashoek M.A., Seatzu S., van der Mee C. Recent Advances in Operator Theory and Its Applications. Springer, 2006. 478 p.
