Algorithms Lecture 5: Dynamic Programming [Fa10]

    Those who cannot remember the past are doomed to repeat it.
    — George Santayana, The Life of Reason, Book I: Introduction and Reason in Common Sense (1905)

    The 1950s were not good years for mathematical research. We had a very interesting gentleman in Washington named Wilson. He was Secretary of Defense, and he actually had a pathological fear and hatred of the word "research". I'm not using the term lightly; I'm using it precisely. His face would suffuse, he would turn red, and he would get violent if people used the term "research" in his presence. You can imagine how he felt, then, about the term "mathematical". The RAND Corporation was employed by the Air Force, and the Air Force had Wilson as its boss, essentially. Hence, I felt I had to do something to shield Wilson and the Air Force from the fact that I was really doing mathematics inside the RAND Corporation. What title, what name, could I choose?
    — Richard Bellman, on the origin of his term "dynamic programming" (1984)

    If we all listened to the professor, we may be all looking for professor jobs.
    — Pittsburgh Steelers head coach Bill Cowher, responding to David Romer's dynamic-programming analysis of football strategy (2003)

5 Dynamic Programming

5.1 Fibonacci Numbers

5.1.1 Recursive Definitions Are Recursive Algorithms

The Fibonacci numbers F_n, named after Leonardo Fibonacci Pisano, the mathematician who popularized algorism in Europe in the 13th century, are defined as follows: F_0 = 0, F_1 = 1, and F_n = F_{n-1} + F_{n-2} for all n ≥ 2. The recursive definition of Fibonacci numbers immediately gives us a recursive algorithm for computing them:

    RecFibo(n):
        if (n < 2)
            return n
        else
            return RecFibo(n-1) + RecFibo(n-2)

How long does this algorithm take? Except for the recursive calls, the entire algorithm requires only a constant number of steps: one comparison and possibly one addition.
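The pseudocode above translates almost verbatim into a real language. Here is a minimal Python sketch; the name rec_fibo is just a transliteration of RecFibo:

```python
def rec_fibo(n):
    """Naive recursive Fibonacci: a direct transliteration of RecFibo."""
    if n < 2:
        return n
    return rec_fibo(n - 1) + rec_fibo(n - 2)

print(rec_fibo(10))  # F_10 = 55
```

Even at this small size the same subproblems are recomputed many times, which is exactly the inefficiency the running-time analysis below quantifies.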
If T(n) represents the number of recursive calls to RecFibo, we have the recurrence

    T(0) = 1,    T(1) = 1,    T(n) = T(n-1) + T(n-2) + 1.

This looks an awful lot like the recurrence for Fibonacci numbers! The annihilator method gives us an asymptotic bound of Θ(φ^n), where φ = (√5 + 1)/2 ≈ 1.61803398875, the so-called golden ratio, is the largest root of the polynomial r² − r − 1. But it's fairly easy to prove (hint, hint) the exact solution T(n) = 2 F_{n+1} − 1. In other words, computing F_n using this algorithm takes more than twice as many steps as just counting to F_n!

Another way to see this is that RecFibo is building a big binary tree of additions, with nothing but zeros and ones at the leaves. Since the eventual output is F_n, our algorithm must call RecFibo(1) (which returns 1) exactly F_n times. A quick inductive argument implies that RecFibo(0) is called exactly F_{n-1} times. Thus, the recursion tree has F_n + F_{n-1} = F_{n+1} leaves, and therefore, because it's a full binary tree, it must have 2 F_{n+1} − 1 nodes.
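The exact solution T(n) = 2 F_{n+1} − 1 can be checked empirically by instrumenting the recursion to count its own calls. The helper names below (counted_fibo, fib) are illustrative, not from the notes:

```python
def counted_fibo(n, counter):
    """RecFibo instrumented to tally the total number of recursive calls."""
    counter[0] += 1
    if n < 2:
        return n
    return counted_fibo(n - 1, counter) + counted_fibo(n - 2, counter)

def fib(n):
    """Iterative Fibonacci, used only to compute the expected count."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

# Verify T(n) = 2*F_{n+1} - 1 for small n.
for n in range(15):
    counter = [0]
    counted_fibo(n, counter)
    assert counter[0] == 2 * fib(n + 1) - 1
print("T(n) = 2*F_{n+1} - 1 holds for n < 15")
```

This is just a sanity check on the closed form; the inductive argument above is the actual proof.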