
Chapter 17: Markov Chains
(to accompany Operations Research: Applications and Algorithms)

The probability that the chain starts in state i is qi; in other words, P(X0 = i) = qi. We call the vector q = (q1, q2, ..., qs) the initial probability distribution for the Markov chain. In most applications, the transition probabilities are displayed as an s x s transition probability matrix P, which may be written as

P = | p11  p12  ...  p1s |
    | p21  p22  ...  p2s |
    | ...                |
    | ps1  ps2  ...  pss |

Each entry pij in P must be nonnegative, and for each i the entries in row i must sum to 1: pi1 + pi2 + ... + pis = 1.
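As a quick illustration, here is a minimal NumPy sketch (the states and probabilities are hypothetical, not from the slides) that represents q and P and checks the two requirements just stated:

```python
import numpy as np

# Hypothetical 3-state chain, used only to illustrate the definitions above.
q = np.array([0.5, 0.3, 0.2])            # initial distribution: P(X0 = i) = q[i]
P = np.array([[0.7, 0.2, 0.1],           # P[i, j] = P(next state = j | current state = i)
              [0.1, 0.8, 0.1],
              [0.2, 0.3, 0.5]])

# The two requirements on a transition probability matrix:
assert (P >= 0).all()                    # every entry of P is nonnegative
assert np.allclose(P.sum(axis=1), 1.0)   # each row of P sums to 1
assert np.isclose(q.sum(), 1.0)          # q is itself a probability distribution
```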

5.3 n-Step Transition Probabilities

A question of interest when studying a Markov chain is: if a Markov chain is in state i at time m, what is the probability that n periods later the Markov chain will be in state j? This probability is independent of m, so we may write

P(Xm+n = j | Xm = i) = P(Xn = j | X0 = i) = Pij(n)

where Pij(n) is called the n-step probability of a transition from state i to state j. For n >= 1, Pij(n) is the ijth element of P^n.
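Since Pij(n) is just the ijth element of the matrix power P^n, the n-step probabilities can be read off a single NumPy call; a one-function sketch:

```python
import numpy as np

def n_step_matrix(P, n):
    """Return P^n; its (i, j) entry is Pij(n) = P(Xn = j | X0 = i)."""
    return np.linalg.matrix_power(np.asarray(P), n)
```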

The Cola Example

Suppose the entire cola industry produces only two colas. Given that a person last purchased cola 1, there is a 90% chance that her next purchase will be cola 1. Given that a person last purchased cola 2, there is an 80% chance that her next purchase will be cola 2.

1. If a person is currently a cola 2 purchaser, what is the probability that she will purchase cola 1 two purchases from now?
2. If a person is currently a cola 1 purchaser, what is the probability that she will purchase cola 1 three purchases from now?

We view each person's purchases as a Markov chain, with the state at any given time being the type of cola the person last purchased. Hence, each person's cola purchases may be represented by a two-state Markov chain, where

State 1 = person has last purchased cola 1
State 2 = person has last purchased cola 2

If we define Xn to be the type of cola purchased by a person on her nth future cola purchase, then X0, X1, ... may be described as the Markov chain with the following transition matrix:

P = | .90  .10 |
    | .20  .80 |
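Before computing the exact answers, the chain can also be explored by simulation; the sketch below (an illustration, not part of the original slides) samples a long purchase history from this P, with states 0-indexed so that 0 means cola 1 and 1 means cola 2:

```python
import numpy as np

P = np.array([[0.90, 0.10],    # row 0: last purchase was cola 1
              [0.20, 0.80]])   # row 1: last purchase was cola 2

rng = np.random.default_rng(seed=1)
state, counts = 1, np.zeros(2)            # start as a cola 2 purchaser
for _ in range(100_000):
    state = rng.choice(2, p=P[state])     # draw the next purchase from row `state`
    counts[state] += 1
print(counts / counts.sum())              # empirical long-run purchase frequencies
```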

We can now answer questions 1 and 2. For question 1 we seek P(X2 = 1 | X0 = 2) = P21(2) = the 21 element of P²:

P² = | .90  .10 | | .90  .10 | = | .83  .17 |
     | .20  .80 | | .20  .80 |   | .34  .66 |

Hence, P21(2) = .34. This means that the probability is .34 that, two purchases in the future, a cola 2 drinker will purchase cola 1. By using basic probability theory, we may obtain this answer in a different way: P21(2) = .20(.90) + .80(.20) = .34.

For question 2 we seek P11(3) = the 11 element of P³ = P² · P. Therefore, P11(3) = .83(.90) + .17(.20) = .781.

Many times we do not know the state of the Markov chain at time 0. Then we can determine the probability that the system is in state j at time n by the following reasoning:

Probability of being in state j at time n = q1 P1j(n) + q2 P2j(n) + ... + qs Psj(n)

where q = (q1, q2, ..., qs).

To illustrate the behavior of the n-step transition probabilities for large values of n, we have computed several of the n-step transition probabilities for the Cola example; they show that for large n, no matter what the initial state, there is a .67 chance that a person will be a cola 1 purchaser. We can easily multiply matrices on a spreadsheet using the MMULT command.
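The MMULT spreadsheet computation has a direct NumPy analogue; this sketch reproduces the numbers quoted above for the cola chain (the initial distribution q below is hypothetical, chosen only to show the time-n formula):

```python
import numpy as np

P = np.array([[0.90, 0.10],
              [0.20, 0.80]])

P2 = np.linalg.matrix_power(P, 2)
P3 = np.linalg.matrix_power(P, 3)
print(P2[1, 0])    # P21(2) = 0.34   (NumPy indices are 0-based)
print(P3[0, 0])    # P11(3) = 0.781

# Distribution at time n for an initial distribution q (this q is hypothetical):
q = np.array([0.4, 0.6])
print(q @ np.linalg.matrix_power(P, 50))   # ~ (.67, .33), regardless of q
```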

5.4 Classification of States in a Markov Chain

To understand the n-step transition probabilities in more detail, we need to study how mathematicians classify the states of a Markov chain. A transition matrix illustrating most of the following definitions, together with its graphical representation, is shown in the book.

Definition: Given two states i and j, a path from i to j is a sequence of transitions that begins in i and ends in j, such that each transition in the sequence has a positive probability of occurring.
Definition: A state j is reachable from state i if there is a path leading from i to j.
Definition: Two states i and j are said to communicate if j is reachable from i, and i is reachable from j.
Definition: A set of states S in a Markov chain is a closed set if no state outside of S is reachable from any state in S.
Definition: A state i is an absorbing state if pii = 1.
Definition: A state i is a transient state if there exists a state j that is reachable from i, but state i is not reachable from state j.

The importance of these concepts will become clear after the next two sections.
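These definitions are all reachability statements, so they can be checked mechanically. A sketch (the 4-state matrix is hypothetical, used only for illustration) that finds the absorbing and transient states by breadth-first search over the positive-probability transitions:

```python
import numpy as np
from collections import deque

def reachable_from(P, i):
    """States reachable from i via transitions with positive probability."""
    seen, queue = {i}, deque([i])
    while queue:
        u = queue.popleft()
        for v in np.nonzero(P[u] > 0)[0]:
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return seen

def classify(P):
    s = len(P)
    reach = [reachable_from(P, i) for i in range(s)]
    absorbing = [i for i in range(s) if P[i, i] == 1]
    # i is transient if some j is reachable from i but i is not reachable from j
    transient = [i for i in range(s)
                 if any(i not in reach[j] for j in reach[i])]
    return absorbing, transient

# Hypothetical 4-state chain: states 2 and 3 are absorbing, 0 and 1 transient.
P = np.array([[0.5, 0.3, 0.2, 0.0],
              [0.2, 0.5, 0.0, 0.3],
              [0.0, 0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0, 1.0]])
print(classify(P))   # -> ([2, 3], [0, 1])
```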

5.5 Steady-State Probabilities and Mean First Passage Times

Steady-state probabilities are used to describe the long-run behavior of a Markov chain.

Theorem 1: Let P be the transition matrix for an s-state ergodic chain. Then there exists a vector π = (π1, π2, ..., πs) such that

lim (n → ∞) Pij(n) = πj

Theorem 1 tells us that for any initial state i, the n-step transition probability into state j settles down to πj. The vector π = (π1, π2, ..., πs) is often called the steady-state distribution, or equilibrium distribution, for the Markov chain.

For an ergodic chain, mij is called the mean first passage time from state i to state j: the expected number of transitions needed to reach state j for the first time, given that we are currently in state i. With probability pij, it takes exactly one transition to go from state i to state j. For k ≠ j, we instead go with probability pik to state k, in which case it takes an average of 1 + mkj transitions to go from i to j. This reasoning implies

mij = 1 + Σ (k ≠ j) pik mkj

By solving this system of linear equations, we find all the mean first passage times. It can also be shown that

mii = 1 / πi
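Both quantities reduce to small linear systems (this is exactly what the LINDO/LINGO approach on the next slides exploits); here is a NumPy sketch, solving them for the cola chain:

```python
import numpy as np

P = np.array([[0.90, 0.10],
              [0.20, 0.80]])
s = len(P)

# Steady state: solve pi = pi P together with sum(pi) = 1.
A = np.vstack([P.T - np.eye(s), np.ones(s)])
b = np.append(np.zeros(s), 1.0)
pi = np.linalg.lstsq(A, b, rcond=None)[0]
print(pi)                    # [0.6667, 0.3333] -- the .67 figure quoted earlier

# Mean first passage times: m[i, j] = 1 + sum over k != j of P[i, k] * m[k, j].
# For each target j, solve the s-1 equations in the unknowns m[i, j], i != j.
m = np.zeros((s, s))
for j in range(s):
    idx = [i for i in range(s) if i != j]
    A = np.eye(len(idx)) - P[np.ix_(idx, idx)]
    x = np.linalg.solve(A, np.ones(len(idx)))
    for r, i in enumerate(idx):
        m[i, j] = x[r]
    m[j, j] = 1.0 / pi[j]    # m_ii = 1 / pi_i, as stated above
print(m)                     # e.g. m[1, 0] = 5: a cola 2 buyer needs 5 purchases on average
```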

Solving for Steady-State Probabilities and Mean First Passage Times on the Computer

Since we find steady-state probabilities and mean first passage times by solving a system of linear equations, we may use LINDO to determine them: simply type in an objective function of 0, and type the equations you need to solve as your constraints. Alternatively, you may use the LINGO model in the file Markov.lng to determine steady-state probabilities and mean first passage times for an ergodic chain.

5.6 Absorbing Chains

Many interesting applications of Markov chains involve chains in which some of the states are absorbing and the rest are transient. This type of chain is called an absorbing chain. To see why we are interested in absorbing chains, we consider the following accounts receivable example.

Accounts Receivable Example

The accounts receivable situation of a firm is often modeled as an absorbing Markov chain. Suppose a firm assumes that an account is uncollectible if the account is more than three months overdue. Then at the beginning of each month, each account may be classified into one of the following states:

State 1 New account
State 2 Payment on account is one month overdue
State 3 Payment on account is two months overdue
State 4 Payment on account is three months overdue
State 5 Account has been paid
State 6 Account is written off as bad debt

Suppose that past data indicate that a Markov chain (its transition matrix is given in the book) describes how the status of an account changes from one month to the next. To simplify our example, we assume that after three months, a debt is either collected or written off as a bad debt. Once a debt is paid up or written off as a bad debt, the account is closed, and no further transitions occur. Hence, Paid and Bad Debt are absorbing states. Since every account will eventually be paid or written off as a bad debt, New, 1 month, 2 months, and 3 months are transient states. A typical new account will be absorbed as either a collected debt or a bad debt.
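The probability of ending in each absorbing state can be computed from the standard block decomposition of an absorbing chain's transition matrix, P = [[Q, R], [0, I]] with transient rows first: the matrix (I − Q)⁻¹ R gives, for each starting transient state, the probability of absorption into each absorbing state. The actual accounts receivable matrix appears only as an image in the original slides, so the numbers in this sketch are hypothetical placeholders; only the technique is illustrated.

```python
import numpy as np

# States 0-3 transient (New, 1 month, 2 months, 3 months overdue),
# states 4-5 absorbing (Paid, Bad Debt). All probabilities are HYPOTHETICAL.
P = np.array([
    [0.0, 0.6, 0.0, 0.0, 0.4, 0.0],   # New
    [0.0, 0.0, 0.5, 0.0, 0.5, 0.0],   # 1 month overdue
    [0.0, 0.0, 0.0, 0.4, 0.6, 0.0],   # 2 months overdue
    [0.0, 0.0, 0.0, 0.0, 0.7, 0.3],   # 3 months overdue
    [0.0, 0.0, 0.0, 0.0, 1.0, 0.0],   # Paid (absorbing)
    [0.0, 0.0, 0.0, 0.0, 0.0, 1.0],   # Bad Debt (absorbing)
])

Q = P[:4, :4]                          # transient -> transient block
R = P[:4, 4:]                          # transient -> absorbing block
N = np.linalg.inv(np.eye(4) - Q)       # fundamental matrix
print(N @ R)                           # row i: P(Paid), P(Bad Debt) starting from state i
```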
