Likewise, the difference of two convergent series converges, and its sum is the difference of their respective sums. Finally, if you multiply the terms of a convergent series by a constant, the resulting series converges to the sum of the original series times that constant.

One necessary condition for a series to converge is that its terms approach zero; otherwise the sum of infinitely many terms would have to be infinite. But watch out! It is a classic mistake to assume that any series whose terms approach zero must converge. Some series have terms that approach zero, but not fast enough, and so the series diverges. What you do know is that if a series converges, the limit of its sequence of terms is zero. Is it possible to use that fact in another way?
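A quick numerical sketch can make the "approach zero, but not fast enough" pitfall concrete. The standard example (an assumption here, not taken from the excerpt) is the harmonic series 1 + 1/2 + 1/3 + ..., whose terms go to zero while its partial sums grow without bound:

```python
def partial_sum(n):
    """Sum of the first n terms of the harmonic series 1 + 1/2 + 1/3 + ..."""
    return sum(1.0 / k for k in range(1, n + 1))

# The terms 1/n clearly shrink toward zero...
print(1.0 / 10, 1.0 / 1000, 1.0 / 100000)

# ...yet the partial sums keep climbing (they grow roughly like ln(n)),
# so the series diverges despite its terms vanishing.
for n in (10, 1_000, 100_000):
    print(n, partial_sum(n))
```

Running this shows the partial sums increasing without any sign of leveling off, which is exactly why "terms approach zero" is necessary but not sufficient for convergence.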