Lecture ??: 04/09/2007
Recall: we've been talking about the SGA as a Markov chain on population space (with fitness-proportional selection). Early examples (without mutation) had some absorbing states: these are the monomorphic populations (all individuals the same); all other states are transient.
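A minimal sketch of the no-mutation case, assuming an illustrative setup not given in the lecture (length-4 bitstrings, population size 6, fitness = number of ones + 1): with fitness-proportional selection and no mutation, the chain eventually absorbs into a monomorphic population.

```python
import random

# Hedged sketch of an SGA with fitness-proportional selection and NO mutation.
# All parameters (string length 4, population size 6, fitness = ones + 1)
# are hypothetical choices for illustration, not from the lecture.

def fitness(ind):
    return sum(ind) + 1  # +1 keeps selection well-defined for the all-0 string

def select(pop):
    # Fitness-proportional (roulette-wheel) selection of one parent.
    return random.choices(pop, weights=[fitness(i) for i in pop])[0]

def step(pop):
    # Next generation: each slot filled by an independently selected parent.
    return [select(pop)[:] for _ in pop]

def run_until_monomorphic(pop, max_gens=10_000):
    for gen in range(max_gens):
        if all(ind == pop[0] for ind in pop):
            return pop, gen  # absorbed in a monomorphic population
        pop = step(pop)
    return pop, max_gens

random.seed(0)
pop = [[random.randint(0, 1) for _ in range(4)] for _ in range(6)]
final, gens = run_until_monomorphic(pop)
print(all(ind == final[0] for ind in final))  # the chain has absorbed
```

Once all individuals are identical, selection can only copy that one string, which is exactly why these states are absorbing.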
Mutation: introducing mutation takes the non-generic Markov chain on population space and makes it generic, with very small ε's (i.e., irreducible: the whole state space is one big positively recurrent class). The idea here is that it's possible to move in one step from any population to any other, albeit often very hard (not really seen in practice, because it involves too many low-probability events).
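One way to see the irreducibility claim concretely (with a hypothetical per-bit mutation rate ε, string length L, and population size N, all illustrative): each child's bit matches any target value with probability at least min(ε, 1-ε), regardless of which parent selection chose, so every one-step transition probability has a tiny but positive lower bound.

```python
# Hedged sketch of the irreducibility bound; eps, L, N are illustrative.
def transition_lower_bound(eps, L, N):
    # Each of the N children independently matches any target string with
    # probability at least min(eps, 1-eps)**L, no matter which parent was
    # selected; hence P(population P -> P') >= this bound for ALL pairs P, P'.
    return min(eps, 1 - eps) ** (L * N)

bound = transition_lower_bound(0.01, 4, 6)
print(bound > 0)  # positive, so every population reaches every other in one step
```

The bound is astronomically small (here on the order of 10^-48), which matches the remark above: the transitions exist, but you essentially never see the hard ones.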
Mathwise: there is a unique stationary distribution π*, with π*_P > 0 for any population P; if you run the chain forever, you'll visit every P infinitely often; etc.
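A tiny illustration of that math fact, on a hypothetical 3-state chain rather than the SGA itself: for an irreducible chain, repeatedly pushing any starting distribution through the transition matrix converges to the unique stationary π*, and every entry of π* is strictly positive.

```python
# Hypothetical 3-state transition matrix (rows sum to 1), chosen only to
# illustrate the stationary-distribution claim; it is not the SGA chain.
P = [[0.90, 0.05, 0.05],
     [0.05, 0.90, 0.05],
     [0.10, 0.10, 0.80]]

def step_dist(pi, P):
    # One step of the chain applied to a distribution: (pi P)_j = sum_i pi_i P_ij.
    n = len(P)
    return [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]

pi = [1.0, 0.0, 0.0]  # start anywhere; the limit is the same
for _ in range(10_000):
    pi = step_dist(pi, P)

print(all(p > 0 for p in pi))  # every state carries positive stationary mass
```

For this particular matrix the limit works out to π* = (0.4, 0.4, 0.2): state 2 leaks probability faster than it gains it, so it gets less stationary mass, mirroring the "some P's are weighted more than others" point below.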
In “real life”: π*_P is larger for some P's than for others; when you run the algorithm, you usually see this kind of thing:
• The population evolves toward a monomorphic one and stays “near” it for a while
• Then it shoots (via some mutations) to another monomorphic population and stays there for a while
User choices (e.g., selection method, p_m, p_c, encoding) affect which monomorphic populations you see a lot, how long you linger around them, etc.
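The punctuated behavior described above can be sketched in simulation, again with illustrative parameters (ε = 0.01, length-4 strings, population size 6) that are not from the lecture: with a small mutation rate, the chain spends most of its time at or near monomorphic populations, with occasional jumps between them.

```python
import random

# Hedged sketch: run the mutation-equipped SGA and record how often the
# population is "near-monomorphic" (at least 5 of 6 copies of one string).
# Parameters eps=0.01, L=4, N=6 are hypothetical illustrative choices.

def fitness(ind):
    return sum(ind) + 1

def mutate(ind, eps, rng):
    # Flip each bit independently with probability eps.
    return [b ^ (rng.random() < eps) for b in ind]

def step(pop, eps, rng):
    weights = [fitness(i) for i in pop]
    return [mutate(rng.choices(pop, weights=weights)[0], eps, rng)
            for _ in pop]

def max_copies(pop):
    tups = [tuple(i) for i in pop]
    return max(tups.count(t) for t in tups)

rng = random.Random(1)
pop = [[rng.randint(0, 1) for _ in range(4)] for _ in range(6)]
near = 0
steps = 2000
for _ in range(steps):
    pop = step(pop, 0.01, rng)
    near += max_copies(pop) >= 5
print(near / steps)  # typically well above 0.5 for a small mutation rate
```

Raising the mutation rate (or changing the selection method or encoding, as noted above) changes both which monomorphic populations dominate and how long the chain lingers near them.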
Spring '07, DELCHAMPS