Markov Chains and Applications

 
Markov Chains
 
 
What is a “Markov Chain”?
 
We have a set of states, S = {s_1, s_2, ..., s_r}.
A process starts in one of these states and moves successively from one state to another. Each move is called a step.
If the chain is currently in state s_i, then it moves to state s_j at the next step with a probability denoted by p_ij, and this probability does not depend upon which states the chain was in before the current state.
The probabilities p_ij are called transition probabilities. The process can remain in the state it is in, and this occurs with probability p_ii.
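To make the transition mechanism concrete, here is a minimal Python sketch (the function name and the example call are illustrative, not from the original slides) that simulates one path of a chain whose next move depends only on the current state:

    import random

    def simulate_chain(P, states, start, n_steps, seed=None):
        # P[i][j] is the probability of moving from states[i] to states[j];
        # each row of P must sum to 1.
        rng = random.Random(seed)
        i = states.index(start)
        path = [start]
        for _ in range(n_steps):
            # The next state depends only on the current state i (the Markov property).
            i = rng.choices(range(len(states)), weights=P[i])[0]
            path.append(states[i])
        return path

    # Example: the Land of Oz chain described later in these notes.
    P = [[0.5, 0.25, 0.25], [0.5, 0.0, 0.5], [0.25, 0.25, 0.5]]
    print(simulate_chain(P, ["R", "N", "S"], "R", 10, seed=1))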
 
Andrey Markov
 
(14 June 1856 – 20 July 1922) was a Russian mathematician. He is best known for his work on stochastic processes. The primary subject of his research later became known as Markov chains and Markov processes.
Markov and his younger brother Vladimir Andreevich Markov (1871–1897) proved the Markov brothers' inequality.
His son, another Andrei Andreevich Markov (1903–1979), was also a notable mathematician, making contributions to constructive mathematics and recursive function theory.
 
Example: Land of Oz
 
They never have two nice days in a row.
If they have a nice day, they are just as likely to have snow as rain
the next day.
If they have snow or rain, they have an even chance of having the
same the next day.
If there is a change from snow or rain, only half of the time is this a change to a nice day.
 
Transition Matrix for the Land of Oz
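The matrix on the original slide is not reproduced in this transcript; the table below is a reconstruction from the rules above, with the states ordered Rain (R), Nice (N), Snow (S) purely for presentation. Rows give today's weather, columns give tomorrow's:

            R     N     S
      R    1/2   1/4   1/4
      N    1/2    0    1/2
      S    1/4   1/4   1/2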
 
2-Day Forecast for the Land of Oz
 
We consider the question of determining the probability that, given the chain is in state i today, it will be in state j two days from now. We denote this probability by p_ij^(2).
We see that if it is rainy today, then the event that it is snowy two days from now is the disjoint union of the following three events:
1) it is rainy tomorrow and snowy two days from now,
2) it is nice tomorrow and snowy two days from now, and
3) it is snowy tomorrow and snowy two days from now.
 
2 Day Forecast Probabilities
 
We have:
p_13^(2) = p_11 p_13 + p_12 p_23 + p_13 p_33
and so forth, so that in general p_ij^(2) = Σ_k p_ik p_kj, which is the ij-th entry of the matrix P^2 = P · P.
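As a quick check with the Land of Oz matrix reconstructed above (numbering the states R = 1, N = 2, S = 3), this gives p_13^(2) = (1/2)(1/4) + (1/4)(1/2) + (1/4)(1/2) = 3/8.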
 
Theorem: Let P be the transition matrix of a Markov chain. The ij-th entry p_ij^(n) of the matrix P^n gives the probability that the Markov chain, starting in state s_i, will be in state s_j after n steps.
 
6-Day Forecast for Land of Oz
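The 6-day matrix shown on the slide was an image and is not reproduced here, but it can be recomputed directly from the theorem above. A minimal numpy sketch, using the Land of Oz matrix reconstructed earlier:

    import numpy as np

    # Land of Oz transition matrix, states ordered R, N, S.
    P = np.array([[0.50, 0.25, 0.25],
                  [0.50, 0.00, 0.50],
                  [0.25, 0.25, 0.50]])

    P2 = np.linalg.matrix_power(P, 2)  # 2-day forecast; entry (0, 2) is p_13^(2) = 0.375
    P6 = np.linalg.matrix_power(P, 6)  # 6-day forecast; every row is close to (0.4, 0.2, 0.4)
    print(P2)
    print(P6)

The fact that the rows of P^6 are nearly identical already hints at the long-run behaviour of this chain: after several days the forecast barely depends on today's weather.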
 
 
Example: Drunkard’s Walk
 
A man walks along a four-block stretch of Park Avenue:
If he is at corner 1, 2, or 3, then he walks to the left or right with
equal probability. He continues until he reaches corner 4, which is
a bar, or corner 0, which is his home.
If he reaches either home or the bar, he stays there.
 
Transition Matrix for the Drunkard’s Walk
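The slide's matrix is again an image; the following reconstruction follows from the rules above, with states 0 (home), 1, 2, 3, 4 (bar):

            0     1     2     3     4
      0     1     0     0     0     0
      1    1/2    0    1/2    0     0
      2     0    1/2    0    1/2    0
      3     0     0    1/2    0    1/2
      4     0     0     0     0     1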
 
Transient vs Absorbing
 
Definition: A state s_i of a Markov chain is called absorbing if it is impossible to leave it (i.e., p_ii = 1).
 
Definition: A Markov chain is absorbing if it has at least one absorbing state, and if from every state it is possible to go to an absorbing state (not necessarily in one step).

Definition: In an absorbing Markov chain, a state which is not absorbing is called transient.
 
Long-term Drunken Walk
 
 
Questions to Ask about a Markov Chain
 
What is the probability that the process will eventually reach an
absorbing state?
What is the probability that the process will end up in a given
absorbing state?
On the average, how long will it take for the process to be
absorbed?
On the average, how many times will the process be in each
transient state?
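All four questions can be answered from the transient part Q of the transition matrix through the fundamental matrix N = (I - Q)^(-1): N_ij is the expected number of visits to transient state j starting from i, t = N·1 is the expected number of steps until absorption, and B = N·R gives the probability of ending in each absorbing state. A minimal numpy sketch for the drunkard's walk (this machinery goes slightly beyond what the slides state, but the numbers follow from the matrix reconstructed above):

    import numpy as np

    # Transient states of the drunkard's walk: corners 1, 2, 3.
    Q = np.array([[0.0, 0.5, 0.0],
                  [0.5, 0.0, 0.5],
                  [0.0, 0.5, 0.0]])
    # Moves from transient states into the absorbing states 0 (home) and 4 (bar).
    R = np.array([[0.5, 0.0],
                  [0.0, 0.0],
                  [0.0, 0.5]])

    N = np.linalg.inv(np.eye(3) - Q)  # expected visits to each transient state
    t = N @ np.ones(3)                # expected steps until absorption: [3, 4, 3]
    B = N @ R                         # absorption probabilities: from corner 1,
                                      # P(home) = 3/4 and P(bar) = 1/4
    print(N)
    print(t)
    print(B)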
 
Example: Running for Election
 
The President of the United States tells person A his or her intention
to run or not to run in the next election. Then A relays the news to
B, who in turn relays the message to C, and so forth, always to
some new person. We assume that there is a probability a that a
person will change the answer from yes to no when transmitting it
to the next person and a probability b that he or she will change it
from no to yes. We choose as states the message, either yes or no.
 
Transition Matrix
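The matrix image is not reproduced in this transcript; from the description above, with the states ordered yes, no, it would be:

             yes      no
      yes   1 - a     a
      no      b     1 - b

When 0 < a, b < 1, neither state is absorbing, and it can be checked that in the long run the probability that the message is "yes" approaches b / (a + b), regardless of the President's original intention.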
 
 
Example: Elite Colleges
 
In the Dark Ages, Harvard, Dartmouth, and Yale admitted only male
students. Assume that, at that time:
80 percent of the sons of Harvard men went to Harvard and the rest went to Yale;
40 percent of the sons of Yale men went to Yale, and the rest split evenly between Harvard and Dartmouth;
of the sons of Dartmouth men, 70 percent went to Dartmouth, 20 percent to Harvard, and 10 percent to Yale.
 
Transition Matrix
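Again the matrix itself was an image; reconstructing it from the percentages above, with the states ordered Harvard (H), Yale (Y), Dartmouth (D) (rows are fathers, columns are sons):

             H     Y     D
      H     0.8   0.2    0
      Y     0.3   0.4   0.3
      D     0.2   0.1   0.7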
 