Hidden Markov Model: Viterbi Algorithm

Lecture 10

Department of CSE, DIU
 
CONTENTS

1. Hidden Markov Model - Notations
2. Hidden Markov Model - Viterbi Algorithm
 
1. Hidden Markov Model
 
 
Hidden Markov Model (HMM)

Components:
- Observed variables
- Emitted symbols
- Hidden variables
- Relationships between them
- Represented by a graph with transition probabilities

Goal: Find the most likely explanation for the observed variables
 
Hidden Markov Model: Notations

- States are decoupled from symbols
- x is the sequence of symbols emitted by the model
- x_i is the symbol emitted at time i
- A path, π, is a sequence of states
- The i-th state in π is π_i
- a_kr is the probability of making a transition from state k to state r:
      a_kr = Pr(π_i = r | π_(i-1) = k)
- e_k(b) is the probability that symbol b is emitted when in state k:
      e_k(b) = Pr(x_i = b | π_i = k)
 
Scenario: The Occasionally Dishonest Casino Problem

A casino uses a fair die most of the time, but occasionally switches to a loaded one.

Fair die:   Prob(1) = Prob(2) = ... = Prob(6) = 1/6
Loaded die: Prob(1) = Prob(2) = ... = Prob(5) = 1/10, Prob(6) = 1/2

These are the emission probabilities.

Transition probabilities:
Prob(Fair → Loaded) = 0.01
Prob(Loaded → Fair) = 0.2

Transitions between states obey a Markov process.
An HMM for the Occasionally Dishonest Casino

[State diagram, with transition probabilities a_kl and emission probabilities e_k(b)]

Fair state:   e_Fair(b) = 1/6 for b = 1, ..., 6;  self-transition 0.99
Loaded state: e_Loaded(1) = ... = e_Loaded(5) = 1/10, e_Loaded(6) = 1/2;  self-transition 0.80
Fair → Loaded: 0.01;  Loaded → Fair: 0.2
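As a concrete reference, the model in the diagram above can be written down as plain Python tables. The state names, dict layout, and the uniform 0.5/0.5 start distribution are assumptions for illustration; only the probabilities come from the slide.

```python
# The casino HMM as plain Python tables (a sketch; the 0.5/0.5 start
# distribution is an assumption, matching the worked example later).
STATES = ("Fair", "Loaded")

START = {"Fair": 0.5, "Loaded": 0.5}

# TRANS[k][r] = a_kr = Pr(state r at time i | state k at time i-1)
TRANS = {"Fair":   {"Fair": 0.99, "Loaded": 0.01},
         "Loaded": {"Fair": 0.20, "Loaded": 0.80}}

# EMIT[k][b] = e_k(b) = Pr(symbol b emitted in state k)
EMIT = {"Fair":   {b: 1 / 6 for b in range(1, 7)},
        "Loaded": {**{b: 1 / 10 for b in range(1, 6)}, 6: 1 / 2}}

# Sanity check: each transition row and each emission row sums to 1.
for k in STATES:
    assert abs(sum(TRANS[k].values()) - 1) < 1e-9
    assert abs(sum(EMIT[k].values()) - 1) < 1e-9
```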
 
Important Questions

- How likely is a given sequence?
  → the Forward algorithm
- What is the most probable "path" for generating a given sequence?
  → the Viterbi algorithm
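The first question can be sketched directly in code. The following Forward-algorithm sketch uses the casino model; the function name, dict encoding, and the uniform 0.5/0.5 start distribution are illustrative assumptions, not from the slides.

```python
# A minimal sketch of the Forward algorithm: "how likely is a given
# sequence?" Unlike Viterbi, it SUMS over predecessor states rather
# than taking the max.
STATES = ("Fair", "Loaded")
START = {"Fair": 0.5, "Loaded": 0.5}          # assumed start distribution
TRANS = {"Fair":   {"Fair": 0.99, "Loaded": 0.01},
         "Loaded": {"Fair": 0.20, "Loaded": 0.80}}
EMIT = {"Fair":   {b: 1 / 6 for b in range(1, 7)},
        "Loaded": {**{b: 1 / 10 for b in range(1, 6)}, 6: 1 / 2}}

def forward(x):
    """Return Pr(x): total probability of the symbol sequence x."""
    # f[k] = Pr(x_1..x_i, state_i = k), updated one symbol at a time
    f = {k: START[k] * EMIT[k][x[0]] for k in STATES}
    for sym in x[1:]:
        f = {k: EMIT[k][sym] * sum(f[r] * TRANS[r][k] for r in STATES)
             for k in STATES}
    return sum(f.values())   # marginalize over the final state
```

For the roll sequence 6, 2, 6 used later in the lecture, this gives roughly 0.0125, the probability of the rolls summed over all state paths.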
 
The Most Probable Path

The most likely path π* satisfies

    π* = argmax_π Pr(x, π)

To find π*, consider all possible ways the last symbol of x could have been emitted.

Let v_k(i) = Prob. of the path π_1, ..., π_i most likely to emit x_1, ..., x_i such that π_i = k.

Then

    v_k(i) = e_k(x_i) · max_r ( v_r(i-1) · a_rk )
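The recurrence above follows from one line of standard HMM algebra; this short derivation is supplied here as a supplement, not taken from the slide.

```latex
\begin{aligned}
v_k(i) &= \max_{\pi_1,\dots,\pi_{i-1}} \Pr(x_1,\dots,x_i,\; \pi_1,\dots,\pi_{i-1},\; \pi_i = k) \\
       &= e_k(x_i)\,\max_{r}\Big[\, a_{rk} \max_{\pi_1,\dots,\pi_{i-2}} \Pr(x_1,\dots,x_{i-1},\; \pi_1,\dots,\pi_{i-2},\; \pi_{i-1} = r) \Big] \\
       &= e_k(x_i)\,\max_{r}\, v_r(i-1)\, a_{rk}
\end{aligned}
```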
 
2. Viterbi Algorithm
 
 
The Viterbi Algorithm

Initialization (i = 0):
    v_0(0) = 1,  v_k(0) = 0 for k ≠ 0

Recursion (i = 1, ..., L): For each state k
    v_k(i) = e_k(x_i) · max_r ( v_r(i-1) · a_rk )

Termination:
    Pr(x, π*) = max_k ( v_k(L) · a_k0 )

To find π*, use trace-back, as in dynamic programming.
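The three steps above can be sketched as a small Python function. Note one deliberate deviation, flagged here as an assumption: the slide terminates with a transition a_k0 back to a begin state, while this sketch uses an explicit 0.5/0.5 start distribution and no end state, matching the worked example that follows. Log probabilities are used to avoid underflow on long sequences.

```python
import math

STATES = ("Fair", "Loaded")
START = {"Fair": 0.5, "Loaded": 0.5}          # assumed start distribution
TRANS = {"Fair":   {"Fair": 0.99, "Loaded": 0.01},
         "Loaded": {"Fair": 0.20, "Loaded": 0.80}}
EMIT = {"Fair":   {b: 1 / 6 for b in range(1, 7)},
        "Loaded": {**{b: 1 / 10 for b in range(1, 6)}, 6: 1 / 2}}

def viterbi(x):
    """Return (most probable state path, its joint probability) for x."""
    # Initialization: v_k(1) = start_k * e_k(x_1), in log space
    v = {k: math.log(START[k]) + math.log(EMIT[k][x[0]]) for k in STATES}
    back = []                       # back-pointers for trace-back
    # Recursion: v_k(i) = e_k(x_i) * max_r v_r(i-1) * a_rk
    for sym in x[1:]:
        nv, ptr = {}, {}
        for k in STATES:
            best = max(STATES, key=lambda r: v[r] + math.log(TRANS[r][k]))
            ptr[k] = best
            nv[k] = math.log(EMIT[k][sym]) + v[best] + math.log(TRANS[best][k])
        back.append(ptr)
        v = nv
    # Termination + trace-back, as in dynamic programming
    last = max(STATES, key=v.get)
    path = [last]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    path.reverse()
    return path, math.exp(v[last])
```

On the casino rolls 6, 2, 6 this returns the all-Loaded path, agreeing with the table on the next slide.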
The Viterbi Algorithm: Example

Outcome = 6, 2, 6. The start probability of each state is 0.5. Using the casino model (a_FF = 0.99, a_LL = 0.80, a_FL = 0.01, a_LF = 0.2) and the recurrence

    v_k(i) = e_k(x_i) · max_r ( v_r(i-1) · a_rk )

the table of v_k(i) values is:

        | 6                   | 2                                             | 6
Fair    | (1/6)×(1/2) = 1/12  | (1/6)×max{(1/12)×0.99, (1/4)×0.2} = 0.01375   | (1/6)×max{0.01375×0.99, 0.02×0.2} = 0.00226875
Loaded  | (1/2)×(1/2) = 1/4   | (1/10)×max{(1/12)×0.01, (1/4)×0.8} = 0.02     | (1/2)×max{0.01375×0.01, 0.02×0.8} = 0.008

(The last entry is 0.5 × 0.016 = 0.008, correcting the 0.08 printed on the original slide.) Since 0.008 > 0.00226875, the most likely path ends in the Loaded state.
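The table can be recomputed column by column as a check. This is a sketch: the dict layout is an illustrative choice, and the 1/2 start probability matches the slide.

```python
# Recompute the Viterbi table for outcome 6, 2, 6 with start probability 1/2.
STATES = ("Fair", "Loaded")
TRANS = {"Fair":   {"Fair": 0.99, "Loaded": 0.01},
         "Loaded": {"Fair": 0.20, "Loaded": 0.80}}
EMIT = {"Fair":   {b: 1 / 6 for b in range(1, 7)},
        "Loaded": {**{b: 1 / 10 for b in range(1, 6)}, 6: 1 / 2}}

v = {k: 0.5 * EMIT[k][6] for k in STATES}    # column for the first roll, 6
cols = [dict(v)]
for sym in (2, 6):                           # remaining rolls
    v = {k: EMIT[k][sym] * max(v[r] * TRANS[r][k] for r in STATES)
         for k in STATES}
    cols.append(dict(v))

# cols[2]["Loaded"] = 0.5 * (0.02 * 0.8) = 0.008, which exceeds
# cols[2]["Fair"] = 0.00226875, so the best path ends in Loaded.
```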


Download Presentation

Please find below an Image/Link to download the presentation.

The content on the website is provided AS IS for your information and personal use only. It may not be sold, licensed, or shared on other websites without obtaining consent from the author.If you encounter any issues during the download, it is possible that the publisher has removed the file from their server.

You are allowed to download the files provided on this website for personal or commercial use, subject to the condition that they are used lawfully. All files are the property of their respective owners.

The content on the website is provided AS IS for your information and personal use only. It may not be sold, licensed, or shared on other websites without obtaining consent from the author.

E N D

