Consider playing Tic-Tac-Toe against an opponent who plays randomly. In particular, assume the opponent chooses with uniform probability any open space, unless there is a forced move (in which case it makes the obvious correct move).

(a) Formulate the problem of learning an optimal Tic-Tac-Toe strategy in this case as a Q-learning task. What are the states, transitions, and rewards in this nondeterministic Markov decision process?

(b) Will your program succeed if the opponent plays optimally rather than randomly?
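As a reference point for part (a), a tabular Q-learner for a nondeterministic MDP could be sketched as below. The board encoding, discount factor, and function names are illustrative assumptions, not part of the problem; the update uses the decaying learning rate Mitchell recommends for nondeterministic settings.

```python
from collections import defaultdict

# Q[(state, action)] -> estimated value; states could be 9-character board
# strings such as "XO_X_O___", actions the index of the square to mark.
Q = defaultdict(float)
visits = defaultdict(int)   # visit counts, used for the decaying learning rate
GAMMA = 0.9                 # discount factor (an assumed value)

def q_update(state, action, reward, next_state, next_actions):
    """One Q-learning step: Q <- (1-a)Q + a(r + gamma * max_a' Q(s',a'))."""
    visits[(state, action)] += 1
    alpha = 1.0 / (1 + visits[(state, action)])  # decays toward 0 over visits
    best_next = max((Q[(next_state, a)] for a in next_actions), default=0.0)
    Q[(state, action)] = ((1 - alpha) * Q[(state, action)]
                          + alpha * (reward + GAMMA * best_next))
```

Because the opponent moves randomly, the same (state, action) pair can lead to different successor states, which is exactly why the decaying learning rate is needed for convergence.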

Consider a hypothesis space made of bit strings of length 20. You are trying to learn a concept over this space; the concept corresponds to the subset of strings that are its positive instances. For each new 20-bit string you generate (using a GA), the teacher tells you whether it is a '+' or a '-' instance of the concept. Therefore, you can learn the concept with certainty only if you can generate all possible bit strings.

(a) How many bit strings are there? How many distinct concepts are possible over this set of bit strings?
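As a sanity check on the counting in part (a), the sizes involved can be computed directly (the variable names are just for illustration):

```python
# Part (a): count the instances, then the concepts definable over them.
n_strings = 2 ** 20          # each of the 20 bits is an independent 0/1 choice
n_concepts = 2 ** n_strings  # a concept is an arbitrary subset of the strings

print(n_strings)  # -> 1048576
# n_concepts = 2^(2^20), far too large to print in full
```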

Now (for parts (b)-(d) of this problem) assume your initial population consists of the following four strings:

{x1 = 1^10 0^10, x2 = 1^10 0^8 01, x3 = 1^10 0^8 10, x4 = 1^10 0^8 11}, where 1^10 denotes a run of ten 1s and 0^8 a run of eight 0s (so each xi is a 20-bit string).

Assume further that each generation consists of exactly four strings, where the next generation is obtained from the previous one by some combination of selection, crossover, and (when available) mutation.

(b) Assume you use the crossover operation that combines the first ten bits of one string in the current generation with the last ten bits of another, like this:

X(1-10)X(11-20) and Y(1-10)Y(11-20) crossed over yield X(1-10)Y(11-20) and Y(1-10)X(11-20),

where X(1-10) is shorthand for the first 10 bits of the original string X, and similarly for the other substrings. You have only some form of selection, plus this particular form of crossover, available to generate new configurations (i.e., no mutation). Will this restricted version of the GA, in general, be able to learn an arbitrary concept over the set of 20-bit strings?
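The crossover operator described above can be sketched directly (the function name and example strings are illustrative):

```python
def crossover(x: str, y: str) -> tuple[str, str]:
    """Fixed-point crossover at position 10, as in part (b):
    X(1-10)X(11-20), Y(1-10)Y(11-20) -> X(1-10)Y(11-20), Y(1-10)X(11-20)."""
    assert len(x) == len(y) == 20
    return x[:10] + y[10:], y[:10] + x[10:]

# Two arbitrary 20-bit strings for illustration:
a, b = crossover("1" * 10 + "0" * 10, "0" * 10 + "1" * 10)
print(a)  # -> 11111111111111111111
print(b)  # -> 00000000000000000000
```

Note that every bit of an offspring is copied unchanged from one of its parents, which is the key observation for part (b).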

(c) Now assume you also have random mutation that you can apply to any of the last 5 bits of any string. Does mutation make a difference to the GA's learning ability in this setting? Can you learn more concepts than before? Can you learn an arbitrary concept? (That is, answer the question from part (b), but for the modified version of the GA that includes this restricted mutation.)

(d) Now assume you can apply random mutation to any bit of any string (i.e., the GA now has unrestricted mutation at its disposal). Answer the questions from parts (b) and (c) for this setting. Explain carefully whether an arbitrary concept can be learned when the GA has unrestricted mutation available.
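For parts (c) and (d), the two mutation variants can be sketched as follows (the function names and single-bit-flip formulation are illustrative assumptions):

```python
import random

def mutate_restricted(s: str) -> str:
    """Part (c): flip one randomly chosen bit among the last 5 positions."""
    i = random.randrange(15, 20)
    return s[:i] + ("1" if s[i] == "0" else "0") + s[i + 1:]

def mutate_unrestricted(s: str) -> str:
    """Part (d): flip one randomly chosen bit anywhere in the 20-bit string."""
    i = random.randrange(20)
    return s[:i] + ("1" if s[i] == "0" else "0") + s[i + 1:]

# Restricted mutation never touches bits 1-15, so strings sharing a fixed
# 15-bit prefix stay confined to that prefix; unrestricted mutation can, in
# principle, reach any of the 2^20 strings from any starting string.
```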

Consider again the question from Section 12.2.1 of Tom Mitchell's Machine Learning regarding what criterion to use for choosing among hypotheses when both data and prior knowledge are available. Give your own viewpoint on this issue.
