BaumWelchLearning Class (Accord.NET Framework)
Baum-Welch learning algorithm for discrete-density Hidden Markov Models.
Inheritance Hierarchy

System.Object
  Accord.Statistics.Models.Markov.Learning.BaseBaumWelchLearning
    Accord.Statistics.Models.Markov.Learning.BaumWelchLearning

Namespace: Accord.Statistics.Models.Markov.Learning
Assembly: Accord.Statistics (in Accord.Statistics.dll) Version: 2.10.0.0 (2.10.0.4632)
Syntax

public class BaumWelchLearning : BaseBaumWelchLearning, 
	IUnsupervisedLearning, IConvergenceLearning
Remarks

The Baum-Welch algorithm is an unsupervised algorithm used to learn a single hidden Markov model from a set of observation sequences. It uses a variant of the Expectation-Maximization algorithm to search for the set of model parameters (i.e. the transition probability matrix A, the emission probability matrix B, and the initial probability vector π) that maximizes the likelihood of the training sequences given to the algorithm.
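
For reference, each iteration re-estimates the parameters using the standard textbook formulas (this summary is not taken verbatim from the Accord.NET source). With γ_t(i) = P(q_t = i | O, λ) and ξ_t(i, j) = P(q_t = i, q_{t+1} = j | O, λ) computed by the forward-backward procedure, the update step is

  π_i    ← γ_1(i)
  A_ij   ← Σ_{t = 1..T-1} ξ_t(i, j)  /  Σ_{t = 1..T-1} γ_t(i)
  B_j(k) ← Σ_{t : o_t = k} γ_t(j)    /  Σ_{t = 1..T} γ_t(j)

where the sums also run over all training sequences when more than one is given. The iterations stop once the likelihood no longer improves by more than the requested tolerance (see the Tolerance and Iterations properties in the example below).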

For numerical stability, this class performs all of its computations using log-probabilities, avoiding the underflow that occurs when long products of small probabilities are evaluated directly.
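
The following standalone sketch (illustrative only; it does not use the Accord.NET API) shows why working in log-space matters:

using System;

class LogSpaceDemo
{
    static void Main()
    {
        // Multiplying many small probabilities underflows to zero in
        // double precision, while summing their logarithms stays finite.
        double product = 1.0;
        double logSum = 0.0;

        for (int t = 0; t < 1000; t++)
        {
            product *= 1e-5;          // reaches 0 after roughly 65 factors
            logSum += Math.Log(1e-5); // 1000 * ln(1e-5)
        }

        Console.WriteLine(product); // 0
        Console.WriteLine(logSum);  // approximately -11512.9
    }
}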

For a more thorough explanation of hidden Markov models, with practical examples on gesture recognition, please see Sequence Classifiers in C#, Part I: Hidden Markov Models [1].

[1]: http://www.codeproject.com/Articles/541428/Sequence-Classifiers-in-Csharp-Part-I-Hidden-Marko

Examples

// We will try to create a Hidden Markov Model which 
//  can detect if a given sequence starts with a zero 
//  and has any number of ones after that. 
int[][] sequences = new int[][] 
{
    new int[] { 0,1,1,1,1,0,1,1,1,1 },
    new int[] { 0,1,1,1,0,1,1,1,1,1 },
    new int[] { 0,1,1,1,1,1,1,1,1,1 },
    new int[] { 0,1,1,1,1,1         },
    new int[] { 0,1,1,1,1,1,1       },
    new int[] { 0,1,1,1,1,1,1,1,1,1 },
    new int[] { 0,1,1,1,1,1,1,1,1,1 },
};

// Creates a new Hidden Markov Model with 3 states for 
//  an output alphabet of two characters (zero and one)
HiddenMarkovModel hmm = new HiddenMarkovModel(3, 2);

// Try to fit the model to the data until the change in the 
//  average log-likelihood between iterations is less than 0.0001 
var teacher = new BaumWelchLearning(hmm) { Tolerance = 0.0001, Iterations = 0 };
double ll = teacher.Run(sequences);

// Calculate the probability that the given 
//  sequences originated from the model 
double l1 = hmm.Evaluate(new int[] { 0, 1 });       // 0.999 
double l2 = hmm.Evaluate(new int[] { 0, 1, 1, 1 }); // 0.916 

// Sequences which do not start with zero have a much lower probability. 
double l3 = hmm.Evaluate(new int[] { 1, 1 });       // 0.000 
double l4 = hmm.Evaluate(new int[] { 1, 0, 0, 0 }); // 0.000 

// Sequences which contain a few errors have a higher probability 
//  than the ones which do not start with zero. This shows some 
//  of the temporal elasticity and error tolerance of HMMs. 
double l5 = hmm.Evaluate(new int[] { 0, 1, 0, 1, 1, 1, 1, 1, 1 }); // 0.034 
double l6 = hmm.Evaluate(new int[] { 0, 1, 1, 1, 1, 1, 1, 0, 1 }); // 0.034
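
// A simple way to use the trained model as a detector is to compare 
//  the evaluated probability against a threshold. The threshold below 
//  is chosen by hand for illustration; it is not part of the framework. 
double threshold = 0.01;
bool accepted = hmm.Evaluate(new int[] { 0, 1, 1 }) > threshold; // true, given the values above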
See Also