Lecture Slides 3

AMS 210: Applied Linear Algebra, Fall 2009
Lecture 3 (September 8, 2009)
Topics Today

- Summary of previous lecture
- Some remarks on Maxima
- Markov Chains (cont'd)
- Dynamic Models
- Underdetermined Linear Systems
- Linear Programming
Summary of Previous Lecture

Markov chains are iterated, linear, probabilistic models consisting of a set of states, a set of transition probabilities, and an initial state. A Markov chain can, for example, be iterated forward step by step or solved for its steady-state distribution.
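Since the course uses Maxima, a minimal sketch of both operations is below. The 2-state chain and its transition probabilities are made up purely for illustration, not taken from the lecture.

```maxima
/* Hypothetical 2-state chain; column j of A holds the probabilities
   of moving out of state j, so each column sums to 1. */
A: matrix([0.9, 0.2],
          [0.1, 0.8]);
x0: matrix([1], [0]);        /* start with certainty in state 1 */

/* Iterate: state distribution after three steps. */
x3: A . A . A . x0;

/* "Solve": steady-state distribution p with A . p = p and p1 + p2 = 1. */
solve([0.9*p1 + 0.2*p2 = p1, p1 + p2 = 1], [p1, p2]);
```

For this made-up matrix the steady state works out to p1 = 2/3, p2 = 1/3.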
A Couple of Maxima Concepts

- %: the previous Maxima output.
- Lists: sequences of items.
- Loops: a control construct for automatically and repeatedly evaluating an expression.

No, these won't be on any test. They're here to answer the (reasonable and) otherwise inevitable questions I'd get anyway.
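For concreteness, here is a short hypothetical Maxima session showing how each of the three concepts looks at the prompt.

```maxima
/* %: the previous output */
2 + 3;                     /* => 5  */
% * 10;                    /* => 50, since % refers to the 5 above */

/* Lists: sequences of items (indexed from 1) */
L: [2, 4, 6];
L[3];                      /* => 6 */
makelist(i^2, i, 1, 5);    /* => [1, 4, 9, 16, 25] */

/* Loops: evaluate an expression repeatedly */
for i: 1 thru 3 do print(i, i^2);
```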
Markov Chains for Nonsense Words

One way to illustrate how a Markov chain model works is to run it on something people have an intuitive sense of, and allowable English words are one such thing. One can train a model on freely available text - in the example I'm going to show, 250MB of Project Gutenberg - and derive the Markov chain transition probabilities from it.
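The slide does not show the training step; the sketch below is one plausible version of it in Maxima, counting letter-to-letter transitions in a tiny in-line string where the lecture's example uses the 250MB Project Gutenberg corpus. The corpus string and the name trans_prob are mine, not the lecture's.

```maxima
load("stringproc")$        /* for charlist, sdowncase, alphacharp, cint */

/* Tiny stand-in corpus; the real training data is ~250MB of text. */
corpus: "the cat sat on the mat and the rat sat on the cat"$

/* counts[i, j]: how often letter i (1 = a, ..., 26 = z) is followed by letter j. */
counts: zeromatrix(26, 26)$
chars: charlist(sdowncase(corpus))$
for k: 1 thru length(chars) - 1 do
    if alphacharp(chars[k]) and alphacharp(chars[k+1]) then
        block([i: cint(chars[k]) - 96, j: cint(chars[k+1]) - 96],
            counts[i, j]: counts[i, j] + 1)$

/* Estimated transition probability P(next letter = j | current letter = i). */
trans_prob(i, j) := counts[i, j] / apply("+", counts[i])$

trans_prob(cint("t") - 96, cint("h") - 96);   /* P(h follows t) in this corpus */
```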
Markov Chains for Nonsense Words

Basic operation: begin with a probability state vector, then pick the next state by a random, weighted choice based on the current state. In full probability-matrix form the system is rather large: on the order of 26 states when 1-grams are used, or 26 × 26 states when 2-grams are used. Iterating through the Markov chain model of the text generates the nonsense words.
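A minimal sketch of this generation loop in Maxima follows. The 3-letter alphabet and the transition matrix P are made up for illustration (a real run would use the 26 × 26 matrix trained on actual text), and next_state / nonsense_word are names I've chosen, not the lecture's.

```maxima
letters: ["a", "b", "c"]$
P: matrix([0.2, 0.5, 0.3],      /* row i: distribution of the letter  */
          [0.6, 0.1, 0.3],      /* that follows letter i; rows sum to 1 */
          [0.4, 0.4, 0.2])$

/* Weighted random choice: return j with probability P[s, j]. */
next_state(P, s) := block([r: random(1.0), cum: 0.0, choice: 0],
    for j: 1 thru length(P[s]) do
        if choice = 0 then (cum: cum + P[s][j],
                            if r < cum then choice: j),
    if choice = 0 then length(P[s]) else choice)$

/* Iterate the chain to spell out a nonsense "word" of length n. */
nonsense_word(n) := block([s: 1 + random(length(letters)), w: ""],
    for k: 1 thru n do (w: sconcat(w, letters[s]), s: next_state(P, s)),
    w)$

nonsense_word(6);   /* e.g. "acbaca"; the output differs on every run */
```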