Recursive Deviance Information Criterion for the Hidden Markov Model

  •  Safaa Kadhem    
  •  Paul Hewson    
  •  Irene Kaimi    


In Bayesian model selection, the deviance information criterion (DIC) has become a widely used criterion. It is, however, not directly available for hidden Markov models (HMMs): the main challenge in applying the DIC to HMMs is that the observed likelihood of such models is not available in closed form. A closed form for the observed likelihood can be obtained either by summing the complete likelihood over all possible hidden states using the so-called forward recursion, or by integrating the hidden states out of the conditional likelihood. We therefore propose two versions of the DIC for the model choice problem in the HMM context: the recursive deviance-based DIC and the conditional likelihood-based DIC. In this paper, we compare several normal HMMs after estimating them by Bayesian MCMC methods. We conduct a simulation study on synthetic data generated under two varying conditions, namely the level of heterogeneity and the number of states. We show that the recursive deviance-based DIC performs well in selecting the correct model, whereas the conditional likelihood-based DIC prefers more complicated models. A real application involving the waiting times of the Old Faithful geyser data was also used to assess these criteria. All simulations were conducted in Python v.2.7.10; code is available from the first author on request.
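The forward recursion mentioned in the abstract can be sketched as follows. This is a minimal illustrative implementation for a normal HMM, not the authors' code; the parameter names (pi, A, means, sds) are assumptions introduced here for illustration. It computes the observed log-likelihood by summing the complete likelihood over all hidden-state paths in O(T K^2) time, with per-step scaling to avoid numerical underflow:

```python
import numpy as np

def forward_log_likelihood(y, pi, A, means, sds):
    """Observed log-likelihood of a K-state normal HMM via the forward recursion.

    y     : (T,) observations
    pi    : (K,) initial state distribution
    A     : (K, K) transition matrix, A[i, j] = P(state j | state i)
    means : (K,) state-specific normal means
    sds   : (K,) state-specific normal standard deviations
    """
    def emis(t):
        # Normal density of observation y[t] under each of the K states
        return np.exp(-0.5 * ((y[t] - means) / sds) ** 2) / (sds * np.sqrt(2 * np.pi))

    # Initialisation: alpha_1(k) = pi_k * f(y_1 | state k)
    alpha = pi * emis(0)
    log_lik = np.log(alpha.sum())
    alpha /= alpha.sum()  # rescale so the recursion stays in floating-point range

    # Recursion: alpha_t = (alpha_{t-1} A) * f(y_t | states)
    for t in range(1, len(y)):
        alpha = (alpha @ A) * emis(t)
        log_lik += np.log(alpha.sum())
        alpha /= alpha.sum()
    return log_lik
```

Evaluating this log-likelihood at the posterior mean, and averaging the corresponding deviance over MCMC draws, is what a recursive deviance-based DIC requires; the conditional-likelihood version instead conditions on sampled hidden states.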

This work is licensed under a Creative Commons Attribution 4.0 License.
  • ISSN(Print): 1927-7032
  • ISSN(Online): 1927-7040
  • Started: 2012
  • Frequency: bimonthly

Journal Metrics

  • h-index (December 2021): 20
  • i10-index (December 2021): 51
  • h5-index (December 2021): N/A
  • h5-median (December 2021): N/A

(The data was calculated based on Google Scholar Citations.)