DATA PROCESSING AND ANALYSIS
Yu.A. Dubnov, A.V. Boulytchev. On an Approach to Tuning the Metropolis-Hastings Algorithm for the Task of Separating a Mixture of Gaussian Components
Abstract 

The article considers the problem of separating a mixture of Gaussian components, i.e., determining the parameters of the mixture components from available observations. An approach to solving this problem is proposed, based on Bayesian estimation with a maximally informative prior distribution (Maximal Data Information Prior, MDIP). The novelty of the described approach lies in the use of sample estimates both to compute the prior distribution and to set the parameters of the Metropolis-Hastings sampling algorithm, whose proposal distribution is adjusted adaptively in a stepwise manner.
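To illustrate the kind of procedure the abstract describes, the following is a minimal sketch of a random-walk Metropolis-Hastings sampler for the means of a two-component Gaussian mixture, with the proposal scale adapted stepwise toward a target acceptance rate during burn-in. All names, tuning constants, and the equal-weight, unit-variance mixture are illustrative assumptions, not the authors' exact algorithm or prior.

```python
import math
import random

random.seed(0)

# Synthetic data from an equal-weight mixture 0.5*N(-2,1) + 0.5*N(3,1).
data = [random.gauss(-2.0, 1.0) if random.random() < 0.5 else random.gauss(3.0, 1.0)
        for _ in range(500)]

def log_post(mu1, mu2):
    """Log-posterior of the two means under a flat prior (so it equals
    the log-likelihood of the equal-weight, unit-variance mixture)."""
    total = 0.0
    for x in data:
        p = 0.5 * math.exp(-0.5 * (x - mu1) ** 2) + 0.5 * math.exp(-0.5 * (x - mu2) ** 2)
        total += math.log(p + 1e-300)  # guard against log(0) underflow
    return total

# Random-walk Metropolis-Hastings with stepwise adaptation of the proposal scale.
mu = [0.0, 1.0]          # initial component means
step = 1.0               # proposal standard deviation, adapted during burn-in
target_acc = 0.35        # target acceptance rate (an assumed tuning constant)
burn_in, n_iter = 2000, 5000
samples, accepted = [], 0
lp = log_post(*mu)
for t in range(1, n_iter + 1):
    cand = [m + random.gauss(0.0, step) for m in mu]
    lp_cand = log_post(*cand)
    if math.log(random.random()) < lp_cand - lp:   # MH acceptance test
        mu, lp = cand, lp_cand
        accepted += 1
    # Every 100 iterations of burn-in, nudge the step toward the target rate.
    if t % 100 == 0 and t <= burn_in:
        step *= 1.1 if accepted / t > target_acc else 0.9
    if t > burn_in:
        samples.append(tuple(mu))

est = [sum(s[i] for s in samples) / len(samples) for i in (0, 1)]
print("estimated means:", sorted(est))
```

Because the mixture posterior is symmetric under swapping the component labels, the chain may settle in either of the two mirrored modes; sorting the posterior-mean estimates resolves this for comparison with the true means.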

Keywords: 

Gaussian mixture model, Bayesian approach, Prior distribution, Metropolis-Hastings algorithm.

Pp. 25-33.

DOI 10.14357/20718632200103
 
References

1. McLachlan G., Peel D. Finite Mixture Models. Hoboken, NJ: John Wiley & Sons, 2000.
2. Figueiredo M.A.T., Jain A.K. Unsupervised Learning of Finite Mixture Models // IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 24(3), pp. 381-396, 2002.
3. Reynolds D.A., Rose R.C. Robust Text-Independent Speaker Identification Using Gaussian Mixture Speaker Models // IEEE Transactions on Speech and Audio Processing, vol. 3(1), pp. 72-83, 1995.
4. Brigo D., Mercurio F. Lognormal-Mixture Dynamics and Calibration to Market Volatility Smiles // International Journal of Theoretical and Applied Finance, vol. 5(4), pp. 427-452, 2002.
5. Dubnov Yu.A., Boulytchev A.V. Bayesian Identification of a Gaussian Mixture Model // Journal of Information Technologies and Computing Systems, 2017, iss. 1, pp. 101-111.
6. Dempster A.P., Laird N.M., Rubin D.B. Maximum Likelihood from Incomplete Data via the EM Algorithm // Journal of the Royal Statistical Society, Series B, vol. 39(1), pp. 1-38, 1977.
7. Rolph J.E. Bayesian Estimation of Mixing Distributions // The Annals of Mathematical Statistics, vol. 39(4), pp. 1289-1302, 1968.
8. Gelman A. Bayes, Jeffreys, Prior Distributions and the Philosophy of Statistics // Statistical Science, vol. 24(2), pp. 176-178, 2009.
9. Kass R.E., Wasserman L. The Selection of Prior Distributions by Formal Rules // Journal of the American Statistical Association, vol. 91(435), pp. 1343-1370, 1996.
10. Feroze N., Aslam M. Bayesian Estimation of Two-Component Mixture of Gumbel Type II Distribution under Informative Priors // International Journal of Advanced Science and Technology, vol. 53, pp. 11-30, 2013.
11. Zellner A. Past and Recent Results on Maximal Data Information Priors // Technical Report, Graduate School of Business, University of Chicago, 1996.
12. Chib S., Greenberg E. Understanding the Metropolis-Hastings Algorithm // The American Statistician, vol. 49(4), pp. 327-335, 1995.
13. Gilks W.R., Roberts G.O. Strategies for Improving MCMC // Gilks W.R. et al. (eds.), Markov Chain Monte Carlo in Practice, Chapman & Hall/CRC, 1996.
14. Robert C., Casella G. Monte Carlo Statistical Methods. Springer, 2004.
 
