# Relative Overgeneralization and Miscoordination



*Figure 44 Relative Overgeneralization. (Joint space: Populations A and B, individuals A1 and A2, a broad suboptimum, and a narrow optimum.)*

Laziness is the tip of the iceberg though. How do you assess the fitness of a cooperative coevolutionary individual based on tests? Early on it was thought that you might base it on the *average* of the test results with various collaborators from the other population(s). Let's say that there is one optimal joint solution, but the hill leading to it is very small, whereas there's a large suboptimal peak elsewhere, as in Figure 44. If we tested individuals A1 and A2 with many individuals from Population B and took the average, A1 would appear fitter *on average* even though A2 was actually a collaborator in the optimum. A1 is a jack-of-all-trades-but-master-of-none individual which is never phenomenal anywhere, but most of the time it's involved in a joint solution that's better than average. This situation leads to a pathological condition called **relative overgeneralization**, where the populations converge to joint solutions which are suboptimal but involve lots of jacks-of-all-trades. Paul Wiegand discovered this unfortunate situation.⁹⁵

The way to fix this is to assess fitness as the *maximum* of the tests rather than their average. However, to get good results you may need to do a *lot* of tests, perhaps even against the *entire* other population. It turns out that usually there are just a few "special" collaborators in the other population(s) which, if you tested just with them, would compute fitness orderings for your entire population in exactly the same way as testing against everyone. Liviu Panait, a former student of mine, developed a 2-population cooperative algorithm, iCCEA, which computes this *archive* of special collaborators, resulting in far fewer tests.⁹⁶

Finally, if your fitness function has multiple global optima, or near-optima, you could also wind up a victim of **miscoordination**.⁹⁷ Let's say you have two cooperating populations, A and B, and two global optima, 1 and 2. The two optima are offset from one another as shown in Figure 45. Population A has discovered an individual A1 who is part of global optimum 1 (yay!), and likewise Population B has discovered an individual B2 who is part of global optimum 2 (yay!). But neither of these individuals will survive, because Population A hasn't yet discovered individual A2 who, when collaborating with B2, would help B2 shine.

*Figure 45 Miscoordination. (Joint space: Populations A and B, individuals A1, A2, B1, B2, a suboptimal region, and Global Optima 1 and 2.)*

⁹⁵ See Paul's thesis: R. Paul Wiegand, 2004, *An Analysis of Cooperative Coevolutionary Algorithms*, Ph.D. thesis, George Mason University, Fairfax, Virginia.

⁹⁶ See his thesis: Liviu Panait, 2006, *The Analysis and Design of Concurrent Learning Algorithms for Cooperative Multiagent Systems*, Ph.D. thesis, George Mason University, Fairfax, Virginia.

⁹⁷ Miscoordination isn't a disaster: an explorative enough system will find its way out. But it's worthwhile mentioning that it *is* a disaster in a sister technique in artificial intelligence, multiagent reinforcement learning.
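The average-versus-maximum distinction can be sketched concretely. This is a minimal illustrative sketch, not code from the text: the toy landscape `joint`, the population ranges, and all names are my own assumptions, chosen so that a broad suboptimal plateau beats a narrow optimum under averaging, echoing the Figure 44 scenario.

```python
def joint(a, b):
    """Toy joint fitness (hypothetical): a broad suboptimal plateau
    favoring small values, plus one narrow optimal peak at (9, 9)."""
    if a == 9 and b == 9:
        return 10.0                   # narrow optimum: one special collaborator
    return 5.0 - 0.1 * (a + b)        # broad plateau elsewhere

pop_b = range(10)                     # candidate collaborators from Population B

def fitness_avg(a):
    """Assess a's fitness as the average over all collaborators in B."""
    return sum(joint(a, b) for b in pop_b) / len(pop_b)

def fitness_max(a):
    """Assess a's fitness as the maximum over all collaborators in B."""
    return max(joint(a, b) for b in pop_b)

a1, a2 = 0, 9    # A1: jack-of-all-trades; A2: part of the narrow optimum

# Averaging makes the jack-of-all-trades A1 look fitter, even though A2
# is the one that participates in the true optimum...
assert fitness_avg(a1) > fitness_avg(a2)
# ...while taking the maximum over tests ranks A2 correctly.
assert fitness_max(a2) > fitness_max(a1)
```

Note that `fitness_max` here tests against the *entire* other population; the point of an archive method like iCCEA is to get the same fitness ordering while testing against only a few special collaborators (here, just `b = 9` and one plateau representative would suffice).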
