Hidden Markov Model
Hidden Markov models (HMMs) are among the most popular methods for temporal
classification. They have found application in areas such as speech,
handwriting and gesture recognition.

Informally speaking, a hidden Markov model is a variant of a finite state
machine. Unlike an ordinary finite state machine, however, it is not
deterministic. A normal finite state machine emits a fixed symbol in a given
state and then deterministically transitions to another state. A hidden
Markov model does neither deterministically: it both transitions and emits
under a probabilistic model.
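
To make this concrete, here is a minimal sketch in Python of how an HMM
generates a sequence: at each step it emits a symbol drawn from the current
state's emission distribution, then moves to a next state drawn from the
transition distribution. The function name and the table layout are our own
illustration (the example later in this document uses the same layout), not
a standard API.

     import random

     def generate(length, start_p, trans_p, emit_p):
         """Sample a hidden state path and its emitted symbols from an HMM."""
         def draw(dist):
             # Choose a key of `dist` with probability equal to its value.
             return random.choices(list(dist), weights=list(dist.values()))[0]

         state = draw(start_p)                    # probabilistic start state
         path, symbols = [], []
         for _ in range(length):
             path.append(state)
             symbols.append(draw(emit_p[state]))  # probabilistic emission
             state = draw(trans_p[state])         # probabilistic transition
         return path, symbols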

Usually, with a finite state machine, a string of symbols can be given and
it can be easily determined (a) whether the string could have been generated
by the finite state machine in the first place, and (b) if it could, what
sequence of state transitions the machine undertook. With a hidden Markov
model, (a) is replaced with the probability that the HMM generated the
string, and (b) is replaced with nothing: in general, the exact sequence of
state transitions undertaken is "hidden", hence the name.
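
Point (a) is computed in practice with the forward algorithm, which sums the
probabilities of every state path that could have produced the string. Below
is a minimal sketch, assuming the model is given as probability tables of
the kind defined in the example later in this document; the function name is
our own.

     def forward_probability(obs, states, start_p, trans_p, emit_p):
         """Return P(obs | HMM), summed over all possible state paths."""
         # alpha[s]: probability of the prefix seen so far, ending in state s
         alpha = {s: start_p[s] * emit_p[s][obs[0]] for s in states}
         for symbol in obs[1:]:
             alpha = {s: emit_p[s][symbol]
                         * sum(alpha[prev] * trans_p[prev][s] for prev in states)
                      for s in states}
         return sum(alpha.values())

Applied to the weather model below, forward_probability(('walk', 'shop',
'clean'), states, start_probability, transition_probability,
emission_probability) comes out to about 0.0336: the model's total
probability of producing that three-day sequence of reports.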




[Figure: A diagram illustrating an HMM and the different ways a, a, b, c can
be generated by the HMM.]
An example
Assume you have a friend who lives far away and who you call daily to talk
about what each of you did that day. Your friend has only three things he's
interested in: walking in the park, shopping, and cleaning his apartment.
The choice of what to do is determined exclusively by the weather on a
given day. You have no definite information about the weather where your
friend lives, but you know general trends. Based on what he tells you he
did each day, you try to guess what the weather must have been like.

You believe that the weather operates as a discrete Markov chain. There are
two states, "Rainy" and "Sunny", but you cannot observe them directly, that
is, they are hidden from you. On each day, there is a certain chance that
your friend will perform one of the following activities, depending on the
weather: "walk", "shop", or "clean". Since your friend tells you about his
activities, those are the observations. The entire system is that of a
hidden Markov model (HMM).

You know the general weather trends in the area and you know what your
friend likes to do on average. In other words, the parameters of the HMM
are known. In fact, you can write them down in the Python programming
language:

     states = ('Rainy', 'Sunny')
     observations = ('walk', 'shop', 'clean')

     start_probability = {'Rainy': 0.6, 'Sunny': 0.4}

     transition_probability = {
         'Rainy': {'Rainy': 0.7, 'Sunny': 0.3},
         'Sunny': {'Rainy': 0.4, 'Sunny': 0.6},
     }

     emission_probability = {
         'Rainy': {'walk': 0.1, 'shop': 0.4, 'clean': 0.5},
         'Sunny': {'walk': 0.6, 'shop': 0.3, 'clean': 0.1},
     }

In this fragment, start_probability represents your uncertainty about which
state the HMM is in when your friend first calls you (all you know is that
it tends to be rainy on average). The transition_probability describes how
the weather changes from day to day in the underlying Markov chain. In this
example, there is only a 30% chance that tomorrow will be sunny if today is
rainy. The emission_probability tells you how likely your friend is to
perform each activity on a given day. If it's rainy, there is a 50% chance
that he is cleaning his apartment; if it's sunny, there is a 60% chance that
he will go outside for a walk.
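
Given these parameters, point (b) from earlier can at least be answered in a
best-guess sense: the Viterbi algorithm finds the single most probable
weather sequence behind a sequence of reported activities. The following is
a minimal sketch written against the dictionaries above; the function itself
is our illustration, not part of the original fragment.

     def viterbi(obs, states, start_p, trans_p, emit_p):
         """Return (probability, path) for the most likely hidden state path."""
         # best[s]: (probability, path) of the best path ending in state s
         best = {s: (start_p[s] * emit_p[s][obs[0]], [s]) for s in states}
         for symbol in obs[1:]:
             best = {s: max((prob * trans_p[prev][s] * emit_p[s][symbol],
                             path + [s])
                            for prev, (prob, path) in best.items())
                     for s in states}
         return max(best.values())

     prob, path = viterbi(('walk', 'shop', 'clean'), states,
                          start_probability, transition_probability,
                          emission_probability)
     # prob is approximately 0.01344; path is ['Sunny', 'Rainy', 'Rainy']

So if your friend walked, then shopped, then cleaned, your best single guess
is that the first day was sunny and the next two were rainy.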

Figure 1 - Hidden Markov models form the basis for most gene-prediction
algorithms. From the article "How does eukaryotic gene prediction work?",
Michael R Brent, Nature Biotechnology 25, 883-885 (2007),
doi:10.1038/nbt0807-883.

(a) Use of a hidden Markov model (HMM) to interpret a message containing
typographical errors: transition probabilities model letter sequences in
correctly spelled English words, whereas emission probabilities model the
probability of each possible typographical error. (b) De novo gene
predictors use generalized hidden Markov models (GHMMs), in which states
correspond to variable-length segments of the DNA sequence sharing some
common function in transcription, RNA processing or translation.
(c) Sequence logos representing weight matrices for the last six bases of an
intron (left) and the first six bases of an intron (right). (d,e) For
dual-genome predictors, the observations are segments of an alignment
between two genomes. The pattern of mismatches and gaps in d suggests a
protein-encoding region, whereas the pattern of mismatches and gaps in e
suggests a noncoding region.
