M118 Weeks 14 & 15

NOW IS NOT THE TIME TO START SLACKING!!! CHAPTER 8 IS DIFFICULT AND IS HEAVILY WEIGHTED ON THE FINAL EXAM! YOUR WEBWORK ASSIGNMENTS FROM NOW UNTIL THE END OF THE SEMESTER WILL HAVE REVIEW PROBLEMS FROM CHAPTERS 1 THROUGH 3!!

CHAPTER 8 - MARKOV CHAINS

TWO CLASSIC EXAMPLES:

A.) An experiment consists of watching DJ White shoot free-throws, each time noting whether or not he hits the shot. If he makes the current shot, he has a 70% chance of making the next shot, and if he misses the current shot, he has a 40% chance of hitting the next shot.

Q: If he makes the current shot, what is the probability that he also makes the shot after the next one?

B.) An experiment consists of watching a rat in a maze with three compartments, noting every 15 minutes which compartment the rat is in. If the rat is in compartment A on the current observation, then it is equally likely to be in each of the three compartments on the next observation. If the rat is in compartment B on the current observation, then it is equally likely to stay there as to move to another compartment, and if it moves, it is equally likely to be in each of the other two compartments. Finally, if the rat is in compartment C on the current observation, it is always in compartment A on the next observation.

Q: If the rat is in compartment A now, what is the probability that it is in compartment C 30 minutes from now?

Q: WHAT MAKES THESE TWO EXPERIMENTS MARKOV CHAINS?

DEFN: An experiment is a Markov Chain if it satisfies:
a.) At each stage of the experiment, the outcome is one of a fixed number of states (state = possible outcome at each stage), and
b.) The conditional probability of moving from one state to another on the next observation (observation = stage of the experiment) depends only on the two states in question and nothing else.

NOTE: The experiment usually STARTS in one of the possible states.

TERMINOLOGY:
1.) We number the states 1, 2, ..., up to the number of states, and
2.) Pr[moving to state k | in state j now] = p_jk = transition probability

FREE-THROW EXAMPLE:
state 1 = miss shot
state 2 = hit shot
p_11 = .60   p_12 = .40
p_21 = .30   p_22 = .70

RAT IN MAZE EXAMPLE:
state 1 = in compartment A
state 2 = in compartment B
state 3 = in compartment C
p_11 = 1/3   p_12 = 1/3   p_13 = 1/3
p_21 = 1/4   p_22 = 1/2   p_23 = 1/4
p_31 = 1.0   p_32 = 0     p_33 = 0

TWO METHODS OF SUMMARIZING THE INFO IN A MARKOV CHAIN:

A.) Transition Diagram - a diagram that shows each of the possible states along with the probabilities of moving between states on the next observation.

B.) Transition Matrix - a matrix whose entries are the conditional probabilities of moving from the state in that row of the matrix to the state in that column of the matrix on the next observation.
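In matrix form (the Transition Matrix just defined), the two examples can be written as below. This is a sketch in LaTeX, with the entries copied from the probability lists above; the labels P_free-throw and P_rat are just names chosen here for reference.

% Requires amsmath for the pmatrix environment.
\[
P_{\text{free-throw}} =
\begin{pmatrix}
  .60 & .40 \\
  .30 & .70
\end{pmatrix},
\qquad
P_{\text{rat}} =
\begin{pmatrix}
  1/3 & 1/3 & 1/3 \\
  1/4 & 1/2 & 1/4 \\
  1   & 0   & 0
\end{pmatrix}
\]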
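The two questions posed with the examples can be answered by conditioning on the intermediate observation (total probability over the state one step from now). The following is a worked sketch using only the transition probabilities listed above.

% Requires amsmath for \tfrac.
% Example A: start in state 2 (hit now); probability of hitting the shot after the next one,
% summing over what happens on the next shot:
\[
\Pr[\text{hit shot after next} \mid \text{hit now}]
  = p_{22}\,p_{22} + p_{21}\,p_{12}
  = (.70)(.70) + (.30)(.40)
  = .61
\]

% Example B: start in state 1 (compartment A now); probability of being in state 3
% (compartment C) two observations, i.e. 30 minutes, from now:
\[
\Pr[\text{in C in 30 minutes} \mid \text{in A now}]
  = p_{11}\,p_{13} + p_{12}\,p_{23} + p_{13}\,p_{33}
  = \tfrac{1}{3}\cdot\tfrac{1}{3} + \tfrac{1}{3}\cdot\tfrac{1}{4} + \tfrac{1}{3}\cdot 0
  = \tfrac{7}{36} \approx .19
\]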