Likelihood-Based Inference and Model Selection for Discrete-Time Finite State-Space Hidden Markov Models

Postgraduate Thesis uoadl:2800185

Unit:
Specialization in Statistics and Operational Research
Library of the School of Science
Deposit date:
2018-10-01
Year:
2018
Author:
Katsianos Vasileios
Supervisors info:
Loukia Meligkotsidou, Assistant Professor, Department of Mathematics, National and Kapodistrian University of Athens
Original Title:
Likelihood-Based Inference and Model Selection for Discrete-Time Finite State-Space Hidden Markov Models
Languages:
English
Translated title:
Likelihood-Based Inference and Model Selection for Discrete-Time Finite State-Space Hidden Markov Models
Summary:
Hidden Markov Models (HMMs) are one of the most fruitful statistical modelling concepts to have appeared in the last fifty years. The use of latent states makes HMMs generic enough to handle a wide array of complex real-world time series, while their relatively straightforward dependence structure still allows for efficient computational procedures. This dissertation presents frequentist and Bayesian methods for statistical inference and model selection in the context of HMMs. These methods are then applied to real and simulated data in order to gauge their accuracy and efficiency.

HMMs belong to a general class of models referred to as missing data problems. In the context of frequentist statistics, the Expectation-Maximisation (EM) algorithm approximates the maximum likelihood estimator (MLE) of the parameter vector in a missing data problem, whereas, in the framework of Bayesian statistics, Markov Chain Monte Carlo (MCMC) methods, especially the full-conditional Gibbs sampler, can be applied to approximate the posterior distribution of the parameter vector. These methods are first applied for parameter estimation in finite mixture models, which may be regarded as special cases of HMMs in which no dependence whatsoever is allowed between successive observations.
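As a concrete illustration of the EM algorithm in the mixture setting, the following is a minimal sketch for a two-component univariate Gaussian mixture. It is not code from the thesis: the simulated data, initial values, and use of numpy are assumptions made purely for illustration.

    # Minimal EM sketch for a two-component univariate Gaussian mixture.
    # Illustrative only: the data and initial values are assumptions.
    import numpy as np

    rng = np.random.default_rng(0)
    # Simulate 500 points from the mixture 0.4*N(-2, 1^2) + 0.6*N(3, 1.5^2).
    z = rng.random(500) < 0.4
    x = np.where(z, rng.normal(-2.0, 1.0, 500), rng.normal(3.0, 1.5, 500))

    # Initial guesses for the weights, means and standard deviations.
    w = np.array([0.5, 0.5])
    mu = np.array([-1.0, 1.0])
    sigma = np.array([1.0, 1.0])

    for _ in range(200):
        # E-step: responsibilities, i.e. posterior component probabilities.
        dens = np.stack([w[k] * np.exp(-0.5 * ((x - mu[k]) / sigma[k]) ** 2)
                         / (sigma[k] * np.sqrt(2 * np.pi)) for k in range(2)],
                        axis=1)
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: weighted maximum-likelihood updates of the parameters.
        nk = resp.sum(axis=0)
        w = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk)

    print(w, mu, sigma)  # should approach (0.4, 0.6), (-2, 3), (1, 1.5)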

In the case of HMMs, some form of forward-backward recursion is additionally required in order to compute the conditional distribution of the hidden variables given the observations. This so-called Forward-Backward algorithm may be combined with either the EM algorithm or an MCMC method for parameter estimation. Lastly, we examine methods for selecting the number of hidden states in an HMM. The frequentist approach usually entails approximating the generalised likelihood-ratio (LR) statistics through some bootstrap technique, while the Bayesian approach relies either on trans-dimensional MCMC methods, which incorporate moves between different models along with parameter estimation, or on simulation methods that approximate the marginal likelihoods of the competing models.
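To illustrate the forward pass of the Forward-Backward algorithm, the sketch below computes the scaled forward variables and the log-likelihood for a toy discrete-observation HMM. The transition and emission matrices are invented for illustration and do not come from the thesis.

    # Minimal sketch of the (scaled) forward recursion for a discrete-
    # observation HMM; the toy matrices below are illustrative assumptions.
    import numpy as np

    A = np.array([[0.9, 0.1],   # transition probabilities between 2 states
                  [0.2, 0.8]])
    B = np.array([[0.7, 0.3],   # emission probabilities per state
                  [0.1, 0.9]])
    pi = np.array([0.5, 0.5])   # initial state distribution
    obs = [0, 1, 1, 0, 1]       # observed symbol sequence y_1, ..., y_T

    # Initialisation: alpha_1(i) = pi_i * b_i(y_1), scaled to sum to one.
    alpha = pi * B[:, obs[0]]
    c = alpha.sum()
    alpha /= c
    loglik = np.log(c)
    for y in obs[1:]:
        # Recursion: alpha_t(j) = b_j(y_t) * sum_i alpha_{t-1}(i) * a_ij.
        alpha = (alpha @ A) * B[:, y]
        c = alpha.sum()          # scaling constant guards against underflow
        alpha /= c
        loglik += np.log(c)

    print(loglik)  # log p(y_1:T); alpha is the filtered state distribution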
Main subject category:
Science
Keywords:
Missing Data Problems, Finite Mixture Models, Hidden Markov Models, Expectation-Maximization Algorithm, Markov Chain Monte Carlo
Index:
No
Number of index pages:
0
Contains images:
Yes
Number of references:
40
Number of pages:
127
Thesis.pdf (2 MB)