How do you factor the difference of two squares? How do you factor a perfect square trinomial? How do you factor the sum and difference of two cubes? Which of these three makes the most sense to you? Explain why.

To factor the difference of two squares, reverse the expansion of (a+b)(a-b). For example, a^2 - b^2 factors as (a+b)(a-b), since (a+b)(a-b) = a^2 - ab + ab - b^2 = a^2 - b^2. The middle terms -ab and +ab cancel each other out, leaving the difference of squares a^2 - b^2.

To factor a perfect square trinomial, find the number whose double equals the middle coefficient and whose square equals the constant term. For example, x^2 + 24x + 144 factors as (x+12)(x+12), or (x+12)^2, because 2(12) = 24 and 12^2 = 144.

To factor the sum of two cubes, start with a^3 + b^3 and factor it as (a+b)(a^2 - ab + b^2). The sign in the binomial factor matches the sign in the original expression, the first sign in the trinomial factor is always the opposite of that sign, and the last sign is always plus. The difference of two cubes, a^3 - b^3, follows the same rule with the signs reversed, so it factors as (a-b)(a^2 + ab + b^2).

Of these three, I think factoring the difference of two squares makes the most sense. The equations are easy to understand, finding the square of a number is fairly simple, and there are only two terms to work with. Dealing with just two terms is much easier than finding a cube or factoring a perfect square trinomial.
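One quick way to convince yourself these patterns are correct is to expand each factored form and compare it to the original expression over a range of values. Here is a minimal Python sketch that does that (the function names are my own, chosen for illustration; this is a spot check at integer values, not a proof):

```python
# Spot-check the three factoring patterns by comparing each
# factored form against the expanded expression.

def diff_of_squares(a, b):
    return (a + b) * (a - b)               # should equal a**2 - b**2

def perfect_square(x):
    return (x + 12) ** 2                   # should equal x**2 + 24*x + 144

def sum_of_cubes(a, b):
    return (a + b) * (a**2 - a*b + b**2)   # should equal a**3 + b**3

def diff_of_cubes(a, b):
    return (a - b) * (a**2 + a*b + b**2)   # should equal a**3 - b**3

for a in range(-5, 6):
    for b in range(-5, 6):
        assert diff_of_squares(a, b) == a**2 - b**2
        assert sum_of_cubes(a, b) == a**3 + b**3
        assert diff_of_cubes(a, b) == a**3 - b**3
    assert perfect_square(a) == a**2 + 24*a + 144

print("all four factoring identities check out")
```

If any factorization had a wrong sign (for example, writing (a+b)(a^2+ab+b^2) for the sum of cubes), one of the assertions would fail, which is a handy way to memorize the sign rule.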
