Post on 30-Dec-2015
Hidden Markov Models
戴玉書
L. R. Rabiner, B. H. Juang, "An Introduction to Hidden Markov Models"
Ara V. Nefian and Monson H. Hayes III, "Face Detection and Recognition Using Hidden Markov Models"
Outline
Markov Chain & Markov Models
Hidden Markov Models
HMM Problems
- Evaluation
- Decoding
- Learning
Application
Markov chain property: the probability of each subsequent state depends only on the previous state:
$$P(s_{i_k} \mid s_{i_1}, s_{i_2}, \ldots, s_{i_{k-1}}) = P(s_{i_k} \mid s_{i_{k-1}})$$

By the chain rule and the Markov property, the joint probability of a state sequence factorizes:

$$P(s_{i_1}, s_{i_2}, \ldots, s_{i_k}) = P(s_{i_k} \mid s_{i_1}, \ldots, s_{i_{k-1}})\, P(s_{i_1}, \ldots, s_{i_{k-1}}) = P(s_{i_k} \mid s_{i_{k-1}})\, P(s_{i_1}, \ldots, s_{i_{k-1}}) = P(s_{i_k} \mid s_{i_{k-1}})\, P(s_{i_{k-1}} \mid s_{i_{k-2}}) \cdots P(s_{i_2} \mid s_{i_1})\, P(s_{i_1})$$
Markov Models
[Figure: state-transition diagram between states]
Outline
Markov Chain & Markov Models
Hidden Markov Models
HMM Problems
- Evaluation
- Decoding
- Learning
Application
Hidden Markov Models
If you don't have complete state information, but only some observations at each state, you have a Hidden Markov Model.
N - number of states: $\{s_1, s_2, \ldots, s_N\}$
M - number of observables: $\{v_1, v_2, \ldots, v_M\}$
At each time step the model is in a hidden state $q_t$ and emits an observation $o_t$:
$$q_1 \to q_2 \to q_3 \to q_4 \to \cdots$$
$$o_1,\; o_2,\; o_3,\; o_4,\; \ldots$$
Hidden Markov Models
[Figure: example HMM with three hidden states and two observables; the diagram labels the transition/emission arrows with probabilities 0.1, 0.9, 0.8, 0.2, 0.3, 0.7]
Hidden Markov Models
Transition probabilities: $A = (a_{ij})$, $a_{ij} = P(s_j \mid s_i)$
Emission probabilities: $B = (b_i(v_m))$, $b_i(v_m) = P(v_m \mid s_i)$
Initial probabilities: $\pi = (\pi_i)$, $\pi_i = P(s_i)$
Model: $M = (A, B, \pi)$
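The three parameter groups can be written down directly as arrays. This is a minimal sketch with made-up numbers (the slides' example diagram is not recoverable):

```python
import numpy as np

# Hypothetical 2-state, 2-observable HMM M = (A, B, pi); illustrative numbers.
A  = np.array([[0.7, 0.3],   # a_ij = P(s_j | s_i): transition probabilities
               [0.4, 0.6]])
B  = np.array([[0.9, 0.1],   # b_i(v_m) = P(v_m | s_i): emission probabilities
               [0.2, 0.8]])
pi = np.array([0.6, 0.4])    # pi_i = P(s_i): initial probabilities

# Each row of A and B, and pi itself, must be a probability distribution.
assert np.allclose(A.sum(axis=1), 1)
assert np.allclose(B.sum(axis=1), 1)
assert np.isclose(pi.sum(), 1)
```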
Outline
Markov Chain & Markov Models
Hidden Markov Models
HMM Problems
- Evaluation
- Decoding
- Learning
Application
Evaluation
Determine the probability that a particular sequence of symbols O was generated by a given model M.
For a state sequence $Q = q_1 q_2 \cdots q_T$:

$$P(Q \mid M) = \pi_{q_1}\, a_{q_1 q_2}\, a_{q_2 q_3} \cdots a_{q_{T-1} q_T}$$

$$P(O \mid Q, M) = b_{q_1}(o_1)\, b_{q_2}(o_2) \cdots b_{q_T}(o_T)$$

$$P(O \mid M) = \sum_{\text{all } Q} P(O \mid Q, M)\, P(Q \mid M)$$
Forward recursion
Define $\alpha_t(i) = P(o_1, o_2, \ldots, o_t, q_t = s_i \mid M)$.
Initialization: $\alpha_1(i) = \pi_i\, b_i(o_1)$
Forward recursion: $\alpha_{t+1}(j) = \left[\sum_{i=1}^{N} \alpha_t(i)\, a_{ij}\right] b_j(o_{t+1})$
Termination: $P(O \mid M) = \sum_{i=1}^{N} P(o_1, o_2, \ldots, o_T, q_T = s_i \mid M) = \sum_{i=1}^{N} \alpha_T(i)$
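The three steps of the forward recursion can be sketched directly in Python; the model parameters below are illustrative, not from the slides:

```python
import numpy as np

def forward(A, B, pi, O):
    """Forward recursion: alpha_t(i) = P(o_1..o_t, q_t = s_i | M).
    Returns the alpha table and the likelihood P(O | M)."""
    T, N = len(O), A.shape[0]
    alpha = np.zeros((T, N))
    alpha[0] = pi * B[:, O[0]]                    # initialization
    for t in range(1, T):
        alpha[t] = (alpha[t-1] @ A) * B[:, O[t]]  # sum over i, times b_j(o_{t+1})
    return alpha, alpha[-1].sum()                 # termination

# Illustrative 2-state, 2-observable model (not from the slides).
A  = np.array([[0.7, 0.3], [0.4, 0.6]])
B  = np.array([[0.9, 0.1], [0.2, 0.8]])
pi = np.array([0.6, 0.4])
alpha, likelihood = forward(A, B, pi, [0, 1, 0])
```

The recursion costs $O(N^2 T)$, versus the exponential cost of summing over all $N^T$ state sequences directly.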
Backward recursion
Define $\beta_t(i) = P(o_{t+1}, o_{t+2}, \ldots, o_T \mid q_t = s_i, M)$.
Initialization: $\beta_T(i) = 1$
Backward recursion: $\beta_t(i) = \sum_{j=1}^{N} a_{ij}\, b_j(o_{t+1})\, \beta_{t+1}(j)$
Termination: $P(O \mid M) = \sum_{i=1}^{N} P(o_2, \ldots, o_T \mid q_1 = s_i)\, P(q_1 = s_i)\, b_i(o_1) = \sum_{i=1}^{N} \beta_1(i)\, \pi_i\, b_i(o_1)$
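The backward pass mirrors the forward one; a minimal sketch, again with illustrative parameters (not from the slides). Its termination formula must give the same likelihood as the forward recursion:

```python
import numpy as np

def backward(A, B, pi, O):
    """Backward recursion: beta_t(i) = P(o_{t+1}..o_T | q_t = s_i, M)."""
    T, N = len(O), A.shape[0]
    beta = np.ones((T, N))                           # initialization: beta_T(i) = 1
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, O[t+1]] * beta[t+1])     # sum over j
    likelihood = (pi * B[:, O[0]] * beta[0]).sum()   # termination
    return beta, likelihood

# Illustrative 2-state, 2-observable model (not from the slides).
A  = np.array([[0.7, 0.3], [0.4, 0.6]])
B  = np.array([[0.9, 0.1], [0.2, 0.8]])
pi = np.array([0.6, 0.4])
beta, likelihood = backward(A, B, pi, [0, 1, 0])
```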
Outline
Markov Chain & Markov Models
Hidden Markov Models
HMM Problems
- Evaluation
- Decoding
- Learning
Application
Decoding
Given a sequence of symbols O, determine the most likely sequence of hidden states Q that led to the observations.
We want to find the state sequence Q that maximizes $P(Q \mid o_1, o_2, \ldots, o_T)$.
Viterbi algorithm
General idea: if the best path ending in $q_t = s_j$ goes through $q_{t-1} = s_i$, then it must coincide with the best path ending in $q_{t-1} = s_i$.
[Figure: trellis column from time $t-1$ to $t$; transitions $a_{1j}, \ldots, a_{ij}, \ldots, a_{Nj}$ lead from states $s_1, \ldots, s_i, \ldots, s_N$ into $s_j$]
Viterbi algorithm
Define $\delta_t(i) = \max_{q_1, \ldots, q_{t-1}} P(q_1, \ldots, q_{t-1}, q_t = s_i, o_1, o_2, \ldots, o_t)$.
Initialization: $\delta_1(i) = P(q_1 = s_i, o_1) = \pi_i\, b_i(o_1)$
Forward recursion: $\delta_t(j) = \max_i [\delta_{t-1}(i)\, a_{ij}\, b_j(o_t)]$
Termination: $\max_i [\delta_T(i)]$; backtracking from the best final state recovers the most likely path.
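The recursion plus backtracking can be sketched as follows; the model parameters are illustrative, not from the slides:

```python
import numpy as np

def viterbi(A, B, pi, O):
    """Most likely hidden state sequence for observations O (Viterbi)."""
    T, N = len(O), A.shape[0]
    delta = np.zeros((T, N))           # delta_t(j): best path prob. ending in s_j
    psi = np.zeros((T, N), dtype=int)  # backpointers: best predecessor of s_j
    delta[0] = pi * B[:, O[0]]
    for t in range(1, T):
        scores = delta[t-1][:, None] * A   # scores[i, j] = delta_{t-1}(i) * a_ij
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) * B[:, O[t]]
    path = [int(delta[-1].argmax())]       # best final state
    for t in range(T - 1, 0, -1):          # backtrack
        path.append(int(psi[t, path[-1]]))
    return path[::-1], delta[-1].max()

# Illustrative 2-state, 2-observable model (not from the slides).
A  = np.array([[0.7, 0.3], [0.4, 0.6]])
B  = np.array([[0.9, 0.1], [0.2, 0.8]])
pi = np.array([0.6, 0.4])
path, prob = viterbi(A, B, pi, [0, 1, 0])
```

Replacing the sum in the forward recursion with a max (and storing backpointers) is the only change needed to go from evaluation to decoding.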
Outline
Markov Chain & Markov Models
Hidden Markov Models
HMM Problems
- Evaluation
- Decoding
- Learning
Application
Learning problem
Given a coarse structure of the model, determine the HMM parameters M = (A, B, π) that best fit the training data.
$$a_{ij} = P(s_j \mid s_i) = \frac{\text{number of transitions from state } s_i \text{ to state } s_j}{\text{number of transitions out of state } s_i}$$

$$b_i(v_m) = P(v_m \mid s_i) = \frac{\text{number of times observation } v_m \text{ occurs in state } s_i}{\text{number of times in state } s_i}$$

$$\pi_i = \text{expected frequency in state } s_i \text{ at time } t = 1$$
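When the state sequence is observable in the training data, these counting estimates are direct to compute. A minimal sketch with hypothetical labeled data (the sequences below are invented for illustration):

```python
from collections import Counter

# Hypothetical labeled training data: parallel state and observation sequences.
states = [0, 0, 1, 1, 0, 1, 1, 1, 0, 0]
obs    = [0, 1, 1, 0, 0, 1, 1, 1, 0, 1]

# a_ij: transitions i -> j divided by transitions out of i.
trans = Counter(zip(states, states[1:]))
a = {(i, j): trans[(i, j)] / sum(v for (p, _), v in trans.items() if p == i)
     for (i, j) in trans}

# b_i(v): times v is observed in state i divided by times in state i.
emit = Counter(zip(states, obs))
b = {(i, v): emit[(i, v)] / Counter(states)[i] for (i, v) in emit}
```

The harder case, where the states are hidden, is exactly what the Baum-Welch algorithm below addresses.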
Baum-Welch algorithm
Initialization: start from an initial model $M_0$.
Define the variable $\xi_t(i,j)$ as the probability of being in state $s_i$ at time $t$ and in state $s_j$ at time $t+1$, given the observation sequence $o_1, o_2, \ldots, o_T$:

$$\xi_t(i,j) = P(q_t = s_i, q_{t+1} = s_j \mid o_1, \ldots, o_T) = \frac{P(q_t = s_i, o_1, \ldots, o_t)\, a_{ij}\, b_j(o_{t+1})\, P(o_{t+2}, \ldots, o_T \mid q_{t+1} = s_j)}{P(o_1, o_2, \ldots, o_T)} = \frac{\alpha_t(i)\, a_{ij}\, b_j(o_{t+1})\, \beta_{t+1}(j)}{\sum_i \sum_j \alpha_t(i)\, a_{ij}\, b_j(o_{t+1})\, \beta_{t+1}(j)}$$
Baum-Welch algorithm
Define the variable $\gamma_t(i)$ as the probability of being in state $s_i$ at time $t$, given the observation sequence $o_1, o_2, \ldots, o_T$:

$$\gamma_t(i) = \sum_{j=1}^{N} \xi_t(i,j)$$

Re-estimation formulas:

$$a_{ij} = \frac{\sum_{t=1}^{T-1} \xi_t(i,j)}{\sum_{t=1}^{T-1} \gamma_t(i)}, \qquad b_j(v_m) = \frac{\sum_{t=1,\; o_t = v_m}^{T} \gamma_t(j)}{\sum_{t=1}^{T} \gamma_t(j)}, \qquad \pi_i = \gamma_1(i)$$
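One re-estimation step, built from the forward and backward tables and the $\xi$ and $\gamma$ formulas above, can be sketched as follows (illustrative model parameters, not from the slides):

```python
import numpy as np

def forward(A, B, pi, O):
    # alpha_t(i) = P(o_1..o_t, q_t = s_i | M)
    T, N = len(O), A.shape[0]
    alpha = np.zeros((T, N))
    alpha[0] = pi * B[:, O[0]]
    for t in range(1, T):
        alpha[t] = (alpha[t-1] @ A) * B[:, O[t]]
    return alpha

def backward(A, B, O):
    # beta_t(i) = P(o_{t+1}..o_T | q_t = s_i, M)
    T, N = len(O), A.shape[0]
    beta = np.ones((T, N))
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, O[t+1]] * beta[t+1])
    return beta

def baum_welch_step(A, B, pi, O):
    """One Baum-Welch re-estimation step via xi_t(i,j) and gamma_t(i)."""
    T, N = len(O), A.shape[0]
    alpha, beta = forward(A, B, pi, O), backward(A, B, O)
    # xi[t,i,j] = alpha_t(i) a_ij b_j(o_{t+1}) beta_{t+1}(j), normalized over (i,j)
    xi = np.zeros((T - 1, N, N))
    for t in range(T - 1):
        num = alpha[t][:, None] * A * B[:, O[t+1]][None, :] * beta[t+1][None, :]
        xi[t] = num / num.sum()
    # gamma_t(i) = sum_j xi_t(i,j) for t < T; gamma_T from alpha_T (beta_T = 1)
    gamma = np.vstack([xi.sum(axis=2), alpha[-1] / alpha[-1].sum()])
    new_A = xi.sum(axis=0) / gamma[:-1].sum(axis=0)[:, None]
    obs = np.array(O)
    new_B = np.stack([gamma[obs == m].sum(axis=0) for m in range(B.shape[1])],
                     axis=1) / gamma.sum(axis=0)[:, None]
    new_pi = gamma[0]
    return new_A, new_B, new_pi

# Illustrative 2-state, 2-observable model (not from the slides).
A  = np.array([[0.7, 0.3], [0.4, 0.6]])
B  = np.array([[0.9, 0.1], [0.2, 0.8]])
pi = np.array([0.6, 0.4])
O  = [0, 1, 0, 0, 1]
A2, B2, pi2 = baum_welch_step(A, B, pi, O)
```

Baum-Welch is an EM algorithm, so each step is guaranteed not to decrease the likelihood $P(O \mid M)$; iterating until convergence yields a local maximum.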
Outline
Markov Chain & Markov Models
Hidden Markov Models
HMM Problems
- Evaluation problem
- Decoding problem
- Learning problem
Application
Example 1 - character recognition
The structure of hidden states: $s_1, s_2, s_3$
Observation = number of "islands" in the vertical slice of the character image
Example 1 - character recognition
After character image segmentation, the following sequence of island numbers was observed in 4 slices: {1, 3, 2, 1}
Example 2 - face detection & recognition
The structure of hidden states:
[Figure: structure of hidden states of the face model]
Example 2 - face detection
A set of face images is used to train one HMM model.
N = 6 states
Images: 48, training images: 9, correct detection: 90%, resolution: 60×90 pixels
Example 2 - face recognition
Each individual in the database is represented by an HMM face model.
A set of images representing different instances of the same face is used to train each HMM.
N = 6 states
Images: 400, training: half, individuals: 40, resolution: 92×112 pixels