HiddenMarkovModel&lt;TDistribution&gt; Class

**Inheritance:** BaseHiddenMarkovModel

**Namespace:** Accord.Statistics.Models.Markov

**Assembly:** Accord.Statistics (in Accord.Statistics.dll) Version: 2.10.0.0 (2.10.0.4632)

**Type parameter:** TDistribution

Hidden Markov Models (HMM) are stochastic methods to model temporal and sequence data. They are especially known for their application in temporal pattern recognition such as speech, handwriting, gesture recognition, part-of-speech tagging, musical score following, partial discharges and bioinformatics.

This page refers to the arbitrary-density (continuous emission distributions) version of the model. For discrete distributions, please see HiddenMarkovModel.

A dynamical system of a discrete nature, assumed to be governed by a Markov chain, emits a sequence of observable outputs. Under the Markov assumption, the latest output depends only on the current state of the system. These states are often hidden from the observer, who can see only the output values.

Hidden Markov Models attempt to model such systems and allow, among other things,

- To infer the most likely sequence of states that produced a given output sequence,
- To infer the most likely next state (and thus to predict the next output),
- To calculate the probability that a given sequence of outputs originated from the system (allowing hidden Markov models to be used for sequence classification).
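In the Accord.NET API, these three operations correspond to the Decode, Predict and Evaluate methods, respectively. A minimal sketch, assuming `hmm` is an already-created model and `sequence` is an observation sequence compatible with its emission distributions:

```csharp
// Sketch only: `hmm` is assumed to be an existing
// HiddenMarkovModel<GeneralDiscreteDistribution>, and `sequence`
// an array of observations such as: double[] sequence = { 0, 1, 2 };

// 1. Most likely (Viterbi) state path that produced the sequence,
//    along with the log-likelihood of that path:
double viterbiLikelihood;
int[] path = hmm.Decode(sequence, out viterbiLikelihood);

// 2. Most likely next observation following the sequence:
double next = hmm.Predict(sequence);

// 3. Log-likelihood that the model generated the sequence
//    (useful for scoring and sequence classification):
double logLikelihood = hmm.Evaluate(sequence);
```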

The “hidden” in Hidden Markov Models comes from the fact that the observer does not know which state the system is in, but has only probabilistic insight into where it should be.

The arbitrary-density Hidden Markov Model can use any probability density function (such as a Gaussian mixture model) to compute the state emission probabilities. In other words, in a continuous HMM the matrix of emission probabilities B is replaced by an array of either discrete or continuous probability density functions.

If a general discrete distribution is used as the underlying probability density function, the model becomes equivalent to the discrete Hidden Markov Model.
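For a genuinely continuous model, any density from Accord.Statistics.Distributions can serve as TDistribution. A minimal sketch using Gaussian emissions; the two-state topology and the standard-normal starting parameters are illustrative assumptions, not values prescribed by the library:

```csharp
// Two hidden states, each emitting real-valued observations from a
// Gaussian density. The supplied distribution is used as the initial
// emission density for every state (illustrative starting point).
var hmm = new HiddenMarkovModel<NormalDistribution>(
    2, new NormalDistribution(0, 1));

// Evaluate the log-likelihood of a real-valued sequence
// (made-up observations, for illustration only):
double[] sequence = { 0.1, -0.2, 1.5 };
double logLikelihood = hmm.Evaluate(sequence);
```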

For a more thorough explanation on some fundamentals on how Hidden Markov Models work, please see the HiddenMarkovModel documentation page. To learn a Markov model, you can find a list of both supervised and unsupervised learning algorithms in the Accord.Statistics.Models.Markov.Learning namespace.
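For instance, a model with unknown parameters can be fitted to observed sequences with BaumWelchLearning&lt;TDistribution&gt; from that namespace. A sketch under assumed data and convergence settings (the training sequences and tolerance below are made up for illustration):

```csharp
// Some made-up training sequences of real-valued observations:
double[][] sequences =
{
    new double[] { 0.1, 0.2, 0.3, 0.4 },
    new double[] { 0.2, 0.2, 0.4, 0.5 },
};

// Start from a fresh 2-state model with Gaussian emissions:
var hmm = new HiddenMarkovModel<NormalDistribution>(
    2, new NormalDistribution());

// Configure the Baum-Welch (expectation-maximization) teacher:
var teacher = new BaumWelchLearning<NormalDistribution>(hmm)
{
    Tolerance = 0.0001,  // stop when log-likelihood improvement is small
    Iterations = 0       // no hard cap on the number of iterations
};

// Run the learning algorithm, which estimates the transition,
// emission and initial probabilities from the data:
double logLikelihood = teacher.Run(sequences);
```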

References:

- Wikipedia contributors. "Hidden Markov model." Wikipedia, the Free Encyclopedia. Available at: http://en.wikipedia.org/wiki/Hidden_Markov_model
- Bishop, Christopher M.; Pattern Recognition and Machine Learning. Springer; 1st ed. 2006.

The example below reproduces the example given in the Wikipedia entry for the Viterbi algorithm (http://en.wikipedia.org/wiki/Viterbi_algorithm). As an arbitrary-density model, it can be used with any available probability distribution, including discrete ones. In the following example, the generic model is used with a GeneralDiscreteDistribution to reproduce the same example given in the discrete HiddenMarkovModel documentation. Below, the model's parameters are initialized manually; however, it is also possible to learn them automatically using BaumWelchLearning&lt;TDistribution&gt;.

```csharp
// Create the transition matrix A
double[,] transitions =
{
    { 0.7, 0.3 },
    { 0.4, 0.6 }
};

// Create the vector of emission densities B
GeneralDiscreteDistribution[] emissions =
{
    new GeneralDiscreteDistribution(0.1, 0.4, 0.5),
    new GeneralDiscreteDistribution(0.6, 0.3, 0.1)
};

// Create the initial probabilities pi
double[] initial = { 0.6, 0.4 };

// Create a new hidden Markov model with discrete probabilities
var hmm = new HiddenMarkovModel<GeneralDiscreteDistribution>(
    transitions, emissions, initial);

// After that, one could, for example, query the probability
// of a sequence occurring. We will consider the sequence
double[] sequence = new double[] { 0, 1, 2 };

// And now we will evaluate its likelihood
double logLikelihood = hmm.Evaluate(sequence);

// At this point, the log-likelihood of the sequence
// occurring within the model is -3.3928721329161653.

// We can also get the Viterbi path of the sequence
int[] path = hmm.Decode(sequence, out logLikelihood);

// At this point, the state path will be 1-0-0 and the
// log-likelihood will be -4.3095199438871337
```