Introduction to Probability - Chapter 11

Chapter 11: Markov Chains

Introduction

Most of our study of probability has dealt with independent trials processes. These processes are the basis of classical probability theory and much of statistics. We have discussed two of the principal theorems for these processes: the Law of Large Numbers and the Central Limit Theorem. We have seen that when a sequence of chance experiments forms an independent trials process, the possible outcomes for each experiment are the same and occur with the same probability. Further, knowledge of the outcomes of the previous experiments does not influence our predictions for the outcomes of the next experiment. The distribution for the outcomes of a single experiment is sufficient to construct a tree and a tree measure for a sequence of n experiments, and we can answer any probability question about these experiments by using this tree measure.

Modern probability theory studies chance processes for which the knowledge of previous outcomes influences predictions for future experiments. In principle, when we observe a sequence of chance experiments, all of the past outcomes could influence our predictions for the next experiment. For example, this should be the case in predicting a student's grades on a sequence of exams in a course. But to allow this much generality would make it very difficult to prove general results. In 1907, A. A. Markov began the study of an important new type of chance process. In this process, the outcome of a given experiment can affect the outcome of the next experiment. This type of process is called a Markov chain.

Specifying a Markov Chain

We describe a Markov chain as follows: We have a set of states, S = {s1, s2, ..., sr}.
The process starts in one of these states and moves successively from one state to another. Each move is called a step. If the chain is currently in state si, then it moves to state sj at the next step with a probability denoted by pij, and this probability does not depend upon which states the chain was in before the current state.
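As a concrete illustration of this specification, the following is a minimal sketch of simulating such a chain in Python. The three-state "weather" example used here (states and transition probabilities) is purely illustrative and not taken from the text; the essential points are that pij is stored as row i, column j of a transition matrix, and that each step depends only on the current state.

```python
import random

# Illustrative (hypothetical) states and transition matrix.
# p[i][j] is the probability of moving from state i to state j;
# each row of p must sum to 1.
states = ["rain", "nice", "snow"]
p = [
    [0.50, 0.25, 0.25],  # transitions from "rain"
    [0.50, 0.00, 0.50],  # transitions from "nice"
    [0.25, 0.25, 0.50],  # transitions from "snow"
]

def step(i):
    """Choose the index of the next state, given current state index i.

    Only row p[i] is consulted: the Markov property in action.
    """
    return random.choices(range(len(states)), weights=p[i])[0]

def simulate(start, n_steps):
    """Run the chain for n_steps moves and return the list of visited states."""
    i = states.index(start)
    path = [start]
    for _ in range(n_steps):
        i = step(i)
        path.append(states[i])
    return path

print(simulate("rain", 5))
```

Note that the simulation never looks at earlier states in `path` when choosing the next move; the current state index alone determines the distribution of the next step.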
