# Infinite Sequences and Series

## Sequences

A sequence is an ordered list of objects and can be considered as a function whose domain is the natural numbers.

### Learning Objectives

Distinguish between a sequence and a set

### Key Takeaways

#### Key Points

- Like a set, a sequence contains members (also called elements). Unlike a set, order matters in a sequence, and the same elements can appear multiple times at different positions.
- The terms of a sequence are commonly denoted by a single variable, say $a_n$, where the index $n$ indicates the $n$th element of the sequence.
- Sequences whose elements are related to the previous elements in a straightforward way are often specified using recursion.

#### Key Terms

**set**: a collection of distinct objects, considered as an object in its own right

**recursion**: the act of defining an object (usually a function) in terms of that object itself

Examples: $(M, A, R, Y)$ is a different sequence from $(A, R, M, Y)$. Also, the sequence $(1, 1, 2, 3, 5, 8)$, which contains the number $1$ at two different positions, is a valid sequence. Sequences can be finite, as in this example, or infinite, such as the sequence of all even positive integers $(2, 4, 6, \cdots)$. Finite sequences are sometimes known as strings or words, and infinite sequences as streams. The empty sequence $( \quad )$ is included in most notions of sequence, but may be excluded depending on the context.

### Indexing

The terms of a sequence are commonly denoted by a single variable, say $a_n$, where the index $n$ indicates the $n$th element of the sequence. Indexing notation is used to refer to a sequence in the abstract. It is also a natural notation for sequences whose elements are related to the index $n$ (the element's position) in a simple way. For instance, the sequence of the first 10 square numbers could be written as:

$(a_1,a_2, \cdots,a_{10}), \quad a_k = k^2$

This represents the sequence $(1,4,9, \cdots, 100)$. Sequences can be indexed beginning and ending from any integer. The infinity symbol, $\infty$, is often used as the superscript to represent the sequence that includes all integer $k$-values starting with a certain one. The sequence of all positive squares is then denoted as:

$\displaystyle{(a_k)_{k=1}^\infty, \quad a_k = k^2}$
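The indexed description translates directly into code. A small sketch (the function name `squares` is just an illustrative choice):

```python
# The sequence a_k = k^2 for k = 1..10, i.e. (a_1, ..., a_10) = (1, 4, ..., 100).
def squares(k_max):
    """Return the terms a_k = k^2 for k = 1..k_max as a list."""
    return [k**2 for k in range(1, k_max + 1)]

print(squares(10))  # [1, 4, 9, 16, 25, 36, 49, 64, 81, 100]
```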

### Specifying a Sequence by Recursion

Sequences whose elements are related to the previous elements in a straightforward way are often specified using recursion. This is in contrast to the specification of sequence elements in terms of their position. To specify a sequence by recursion requires a rule to construct each consecutive element in terms of the ones before it. In addition, enough initial elements must be specified so that new elements of the sequence can be specified by the rule.

### Example

The Fibonacci sequence can be defined using a recursive rule along with two initial elements. The rule is that each element is the sum of the previous two elements, and the first two elements are $0$ and $1$: $a_n = a_{n-1} + a_{n-2}$ with $a_0 = 0, \, a_1=1$. The first ten terms of this sequence are $(0, 1, 1, 2, 3, 5, 8, 13, 21, 34)$.
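The recursive specification can be sketched in a few lines of code (the function name is an illustrative choice):

```python
# Specifying the Fibonacci sequence by recursion: each term is the sum
# of the two before it, seeded with the initial elements a_0 = 0, a_1 = 1.
def fibonacci(n):
    """Return the first n terms of the Fibonacci sequence as a list."""
    terms = [0, 1]                           # the two initial elements the rule needs
    while len(terms) < n:
        terms.append(terms[-1] + terms[-2])  # a_n = a_{n-1} + a_{n-2}
    return terms[:n]

print(fibonacci(10))  # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
```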

## Series

A series is the sum of the terms of a sequence.

### Learning Objectives

State the requirements for a series to converge to a limit

### Key Takeaways

#### Key Points

- Given an infinite sequence of numbers $\{ a_n \}$, a series is informally the result of adding all those terms together: $\sum_{n=0}^\infty a_n$.
- Unlike finite summations, infinite series need tools from mathematical analysis, specifically the notion of limits, to be fully understood and manipulated.
- By definition, a series converges to a limit $L$ if and only if the associated sequence of partial sums converges to $L$: $L = \sum_{n=0}^{\infty}a_n \Leftrightarrow L = \lim_{k \rightarrow \infty} S_k$.

#### Key Terms

**sequence**: an ordered list of objects

**Zeno's dichotomy**: That which is in locomotion must arrive at the half-way stage before it arrives at the goal.

Given an infinite sequence of numbers $\{ a_n \}$, a series is informally the result of adding all those terms together: $a_1 + a_2 + a_3 + \cdots$. These can be written more compactly using the summation symbol $\Sigma$. An example is the famous series from Zeno's dichotomy and its mathematical representation:

$\displaystyle{\sum_{n=1}^\infty \frac{1}{2^n} = \frac{1}{2}+ \frac{1}{4}+ \frac{1}{8}+\cdots}$

The terms of the series are often produced according to a certain rule, such as by a formula or by an algorithm. As there are an infinite number of terms, this notion is often called an infinite series. Unlike finite summations, infinite series need tools from mathematical analysis, specifically the notion of limits, to be fully understood and manipulated. In addition to their ubiquity in mathematics, infinite series are also widely used in other quantitative disciplines such as physics, computer science, and finance.

### Definition

For any sequence of rational numbers, real numbers, complex numbers, functions thereof, etc., the associated series is defined as the ordered formal sum:

$\displaystyle{\sum_{n=0}^{\infty}a_n = a_0 + a_1 + a_2 + \cdots}$

The sequence of partial sums $\{S_k\}$ associated to a series $\sum_{n=0}^\infty a_n$ is defined for each $k$ as the sum of the sequence $\{a_n\}$ from $a_0$ to $a_k$:

$\displaystyle{S_k = \sum_{n=0}^{k}a_n = a_0 + a_1 + \cdots + a_k}$

By definition, the series $\sum_{n=0}^{\infty} a_n$ converges to a limit $L$ if and only if the associated sequence of partial sums $\{S_k\}$ converges to $L$. This definition is usually written as follows:

$\displaystyle{L = \sum_{n=0}^{\infty}a_n \Leftrightarrow L = \lim_{k \rightarrow \infty} S_k}$
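The definition can be illustrated numerically with Zeno's series from above, whose partial sums $S_k = 1 - \frac{1}{2^k}$ approach $L = 1$ (a small sketch; the helper name is an illustrative choice):

```python
# Partial sums of Zeno's series sum 1/2^n (n starting at 1) approach L = 1,
# illustrating convergence of a series via its sequence of partial sums.
def partial_sum(k):
    """S_k = a_1 + ... + a_k with a_n = 1/2^n."""
    return sum(1 / 2**n for n in range(1, k + 1))

for k in (1, 5, 10, 20):
    print(k, partial_sum(k))  # S_k = 1 - 1/2^k, approaching 1
```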

## The Integral Test and Estimates of Sums

The integral test is a method of testing infinite series of nonnegative terms for convergence by comparing them to an improper integral.

### Learning Objectives

Describe the purpose of the integral test

### Key Takeaways

#### Key Points

- The integral test uses a monotonically decreasing function $f$ defined on the unbounded interval $[N, \infty)$ (where $N$ is an integer).
- The infinite series $\sum_{n=N}^\infty f(n)$ converges to a real number if and only if the improper integral $\int_N^\infty f(x)\,dx$ is finite. In other words, if the integral diverges, then the series diverges as well.
- The integral test proves that the harmonic series $\sum_{n=1}^\infty \frac1n$ diverges.

#### Key Terms

**improper integral**: an integral where at least one of the endpoints is taken as a limit, either to a specific number or to infinity

**natural logarithm**: the logarithm in base $e$

### Statement of the test

Consider an integer $N$ and a non-negative function $f$ defined on the unbounded interval $[N, \infty )$, on which it is monotonically decreasing. The infinite series $\sum_{n=N}^\infty f(n)$ converges to a real number if and only if the improper integral $\int_N^\infty f(x)\,dx$ is finite. In other words, if the integral diverges, then the series diverges as well. Although we won't go into the details, the proof of the test also gives the lower and upper bounds:

$\displaystyle{\int_N^\infty f(x)\,dx\le\sum_{n=N}^\infty f(n)\le f(N)+\int_N^\infty f(x)\,dx}$

for the infinite series.
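The bracketing inequality can be checked numerically. A sketch using $f(x) = \frac{1}{x^2}$ with $N = 1$, where $\int_1^\infty x^{-2}\,dx = 1$ and the series sums to $\frac{\pi^2}{6} \approx 1.645$ (the truncation at 200000 terms is an illustrative choice):

```python
import math

# Numerical sketch of the integral-test bounds with f(x) = 1/x^2, N = 1:
# integral <= sum f(n) <= f(N) + integral, i.e. 1 <= pi^2/6 <= 2.
integral = 1.0                                    # value of ∫_1^∞ x^(-2) dx
series = sum(1 / n**2 for n in range(1, 200001))  # ≈ pi^2/6
print(integral <= series <= 1 + integral)  # True
print(series, math.pi**2 / 6)
```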

### Applications

The harmonic series $\sum_{n=1}^\infty \frac1n$ diverges because, using the natural logarithm (whose derivative is $\frac{1}{x}$) and the fundamental theorem of calculus, we get:

$\displaystyle{\int_1^M\frac1x\,dx=\ln x\Bigr|_1^M=\ln M\to\infty \quad\text{for }M\to\infty}$

On the other hand, the series $\sum_{n=1}^\infty \frac1{n^{1+\varepsilon}}$ converges for every $\varepsilon > 0$ because, by the power rule:

$\displaystyle{\int_1^M\frac1{x^{1+\varepsilon}}\,dx =-\frac1{\varepsilon x^\varepsilon}\biggr|_1^M= \frac1\varepsilon\Bigl(1-\frac1{M^\varepsilon}\Bigr) \le\frac1\varepsilon }$
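A rough numerical illustration of the contrast (an informal sketch, not a proof): partial sums of $\frac{1}{n}$ keep pace with $\ln M$ and grow without bound, while partial sums of $\frac{1}{n^{1+\varepsilon}}$ with $\varepsilon = 1$ stay bounded:

```python
import math

# Partial sums of 1/n grow like ln(M) (divergence), while partial sums
# of 1/n^2 (epsilon = 1) stay below the bound 1 + 1/epsilon = 2 (convergence).
def harmonic(M):
    return sum(1 / n for n in range(1, M + 1))

def p_sum(M, eps=1.0):
    return sum(1 / n**(1 + eps) for n in range(1, M + 1))

for M in (10, 100, 1000):
    print(M, round(harmonic(M), 4), round(math.log(M), 4), round(p_sum(M), 4))
```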

The above examples involving the harmonic series raise the question of whether there are monotone sequences such that $f(n)$ decreases to $0$ faster than $\frac{1}{n}$ but slower than $\frac{1}{n^{1 + \varepsilon}}$ in the sense that:

$\displaystyle{\lim_{n\to\infty}\frac{f(n)}{\frac{1}{n}}=0}$

and:

$\displaystyle{\lim_{n\to\infty}\frac{f(n)}{\frac{1}{n^{1+\varepsilon}}}=\infty}$

for every $\varepsilon > 0$, and whether the corresponding series of the $f(n)$ still diverges. Once such a sequence is found, a similar question can be asked with $f(n)$ taking the role of $\frac{1}{n}$, and so on. In this way, it is possible to investigate the borderline between divergence and convergence of infinite series.

## Comparison Tests

Comparison test may mean either the limit comparison test or the direct comparison test, both of which can be used to test the convergence of a series.

### Learning Objectives

Distinguish between the limit comparison and the direct comparison tests

### Key Takeaways

#### Key Points

- For sequences $\{a_n \}$, $\{b_n \}$, both with non-negative terms only, if $\lim_{n \to \infty} \frac{a_n}{b_n} = c$ with $0 < c < \infty$, then either both series converge or both series diverge.
- If the infinite series $\sum b_n$ converges and $0 \le a_n \le b_n$ for all sufficiently large $n$ (that is, for all $n > N$ for some fixed value $N$), then the infinite series $\sum a_n$ also converges.
- If the infinite series $\sum b_n$ diverges and $a_n \ge b_n \ge 0$ for all sufficiently large $n$, then the infinite series $\sum a_n$ also diverges.

#### Key Terms

**integral test**: a method used to test infinite series of non-negative terms for convergence by comparing it to improper integrals

**improper integral**: an integral where at least one of the endpoints is taken as a limit, either to a specific number or to infinity

### Limit Comparison Test

Statement: Suppose that we have two series, $\Sigma_n a_n$ and $\Sigma_n b_n$, where $a_n$, $b_n$ are greater than or equal to $0$ for all $n$. If $\lim_{n \to \infty} \frac{a_n}{b_n} = c$ with $0 < c < \infty$, then either both series converge or both series diverge.

Example: We want to determine if the series $\Sigma \frac{n+1}{2n^2}$ converges or diverges. For this we compare it with the series $\Sigma \frac{1}{n}$, which diverges. As $\lim_{n \to \infty} \frac{n+1}{2n^2} \cdot \frac{n}{1} = \frac{1}{2}$, we have that the original series also diverges.

### Direct Comparison Test

The direct comparison test provides a way of deducing the convergence or divergence of an infinite series or an improper integral. In both cases, the test works by comparing the given series or integral to one whose convergence properties are known. In this atom, we will check the series case only. For sequences $\{a_n\}$, $\{b_n\}$ with non-negative terms:

- If the infinite series $\sum b_n$ converges and $0 \le a_n \le b_n$ for all sufficiently large $n$ (that is, for all $n>N$ for some fixed value $N$), then the infinite series $\sum a_n$ also converges.
- If the infinite series $\sum b_n$ diverges and $a_n \ge b_n \ge 0$ for all sufficiently large $n$, then the infinite series $\sum a_n$ also diverges.
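The term-by-term comparison can be seen numerically. A small sketch comparing $\frac{1}{n^3 + 2n}$ against the convergent series $\Sigma \frac{1}{n^3}$ (the truncation at 10000 terms is an illustrative choice):

```python
# Direct comparison, term by term: 0 <= 1/(n^3 + 2n) <= 1/n^3, so the
# partial sums of the smaller series are trapped below those of the
# convergent series sum 1/n^3.
N = 10000
a = [1 / (n**3 + 2 * n) for n in range(1, N + 1)]
b = [1 / n**3 for n in range(1, N + 1)]
assert all(x <= y for x, y in zip(a, b))  # 0 <= a_n <= b_n for every n
print(sum(a), sum(b))  # both partial sums stay bounded
```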

### Example

The series $\Sigma \frac{1}{n^3 + 2n}$ converges because $\frac{1}{n^3 + 2n} < \frac{1}{n^3}$ for $n > 0$ and $\Sigma \frac{1}{n^3}$ converges.

## Alternating Series

An alternating series is an infinite series of the form $\sum_{n=0}^\infty (-1)^n\,a_n$ or $\sum_{n=0}^\infty (-1)^{n-1}\,a_n$ with $a_n > 0$ for all $n$.

### Learning Objectives

Describe the properties of an alternating series and the requirements for one to converge

### Key Takeaways

#### Key Points

- The theorem known as the "Leibniz Test," or the alternating series test, tells us that an alternating series will converge if the terms $a_n$ converge to $0$ monotonically.
- The signs of the general terms alternate between positive and negative.
- The sum $\sum_{n=1}^\infty \frac{(-1)^{n+1}}{n}$ converges by the alternating series test.

#### Key Terms

**monotone**: property of a function to be either always decreasing or always increasing

**Cauchy sequence**: a sequence whose elements become arbitrarily close to each other as the sequence progresses

An alternating series is an infinite series of the form:

$\displaystyle{\sum_{n=0}^\infty (-1)^n\,a_n}$

or:

$\displaystyle{\sum_{n=0}^\infty (-1)^{n-1}\,a_n}$

with $a_n > 0$ for all $n$. The signs of the general terms alternate between positive and negative. Like any series, an alternating series converges if and only if the associated sequence of partial sums converges.

### Alternating Series Test

The theorem known as the "Leibniz Test," or the alternating series test, tells us that an alternating series will converge if the terms $a_n$ converge to $0$ monotonically.

Proof: Suppose the sequence $a_n$ converges to $0$ and is monotone decreasing. If $m$ is odd and $m < n$, we obtain $S_n - S_m \le a_{m}$ via the following calculation:

$\begin{align} S_n - S_m & = \sum_{k=0}^n(-1)^k\,a_k\,-\,\sum_{k=0}^m\,(-1)^k\,a_k \\ & = \sum_{k=m+1}^n\,(-1)^k\,a_k \\ & =a_{m+1}-a_{m+2}+a_{m+3}-a_{m+4}+\cdots+a_n\\ & =\displaystyle a_{m+1}-(a_{m+2}-a_{m+3}) -\cdots-a_n \le a_{m+1}\le a_{m} \\& \quad [a_{n} \text{ decreasing}]. \end{align}$

Since $a_n$ is monotonically decreasing, each bracketed term $(a_{m+2}-a_{m+3}), \ldots$ is non-negative, so subtracting them only makes the sum smaller. Thus, we have the final inequality $S_n - S_m \le a_{m}$. Similarly, it can be shown that, since $a_m$ converges to $0$, $S_n - S_m$ converges to $0$ for $m, n \rightarrow \infty$. Therefore, our partial sum $S_m$ converges. (The sequence $\{ S_m \}$ is said to form a Cauchy sequence, meaning that elements of the sequence become arbitrarily close to each other as the sequence progresses.) The argument for $m$ even is similar.

Example:

$\displaystyle{\sum_{n=1}^\infty \frac{(-1)^{n+1}}{n} = 1 - \frac{1}{2} + \frac{1}{3} - \frac{1}{4} + \cdots}$

$a_n = \frac1n$ converges to $0$ monotonically. Therefore, the sum $\sum_{n=1}^\infty \frac{(-1)^{n+1}}{n}$ converges by the alternating series test.

## Absolute Convergence and Ratio and Root Tests

An infinite series of numbers is said to converge absolutely if the sum of the absolute value of the summand is finite.

### Learning Objectives

State the conditions under which an infinite series of numbers converges absolutely

### Key Takeaways

#### Key Points

- A real or complex series $\textstyle\sum_{n=0}^\infty a_n$ is said to converge absolutely if $\textstyle\sum_{n=0}^\infty \left|a_n\right| = L$ for some real number $L$.
- The ratio test is a convergence test of an infinite series that makes use of the limit $L = \lim_{n\rightarrow\infty}\left|\frac{a_{n+1}}{a_n}\right|$.
- The root test is a criterion for the convergence of an infinite series using the limit superior $C = \limsup_{n\rightarrow\infty}\sqrt[n]{|a_n|}$.

#### Key Terms

**summand**: something which is added or summed

**improper integral**: an integral where at least one of the endpoints is taken as a limit, either to a specific number or to infinity

**limit superior**: the supremum of the set of accumulation points of a given sequence or set

A real or complex series $\textstyle\sum_{n=0}^\infty a_n$ is said to converge absolutely if $\textstyle\sum_{n=0}^\infty \left|a_n\right| = L$ for some real number $L$. Similarly, an improper integral of a function, $\textstyle\int_0^\infty f(x)\,dx$, is said to converge absolutely if the integral of the absolute value of the integrand is finite—that is, if $\int_0^\infty \left|f(x)\right|dx = L$.

Absolute convergence is important for the study of infinite series because its definition is strong enough to have properties of finite sums that not all convergent series possess, yet is broad enough to occur commonly. (A convergent series that is not absolutely convergent is called conditionally convergent.)

### Ratio Test

The ratio test is a test (or "criterion") for the convergence of a series $\sum_{n=1}^\infty a_n$, where each term is a real or complex number and $a_n$ is nonzero when $n$ is large. The test was first published by Jean le Rond d'Alembert and is sometimes known as d'Alembert's ratio test.

The usual form of the test makes use of the limit $L = \lim_{n\rightarrow\infty}\left|\frac{a_{n+1}}{a_n}\right|$. The ratio test states that:

- if $L < 1$, then the series converges absolutely;
- if $L > 1$, then the series does not converge;
- if $L = 1$ or the limit fails to exist, then the test is inconclusive, because there exist both convergent and divergent series that satisfy this case.
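The limit in the test can be estimated numerically. A small sketch for $a_n = \frac{n}{2^n}$, where the ratios $\left|\frac{a_{n+1}}{a_n}\right| = \frac{n+1}{2n}$ approach $L = \frac{1}{2} < 1$, so the series converges absolutely:

```python
# Ratio test sketch for a_n = n / 2^n: successive ratios approach L = 1/2.
def a(n):
    return n / 2**n

ratios = [a(n + 1) / a(n) for n in range(1, 50)]
print(ratios[-1])  # close to 0.5, well below 1
```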

### Root Test

The root test is a criterion for the convergence (a convergence test) of an infinite series. It depends on the quantity $\limsup_{n\rightarrow\infty}\sqrt[n]{|a_n|}$, where $a_n$ are the terms of the series, and states that the series converges absolutely if this quantity is less than one but diverges if it is greater than one. It is particularly useful in connection with power series. The root test was developed first by Augustin-Louis Cauchy and so is sometimes known as the Cauchy root test, or Cauchy's radical test.

For a series $\sum_{n=1}^\infty a_n$, the root test uses the number $C = \limsup_{n\rightarrow\infty}\sqrt[n]{ \left|a_n \right|}$, where "lim sup" denotes the limit superior, possibly $\infty$. Note that if $\lim_{n\rightarrow\infty}\sqrt[n]{ \left|a_n \right|}$ converges, then it equals $C$ and may be used in the root test instead. The root test states that:

- if $C < 1$, then the series converges absolutely;
- if $C > 1$, then the series diverges;
- if $C = 1$ and the limit approaches strictly from above, then the series diverges;
- otherwise the test is inconclusive (the series may diverge, converge absolutely, or converge conditionally).

There are some series for which $C = 1$ and the series converges, e.g. $\displaystyle{\sum{\frac{1}{n^2}}}$, and there are others for which $C = 1$ and the series diverges, e.g. $\displaystyle{\sum{\frac{1}{n}}}$.
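The quantity $\sqrt[n]{|a_n|}$ can be evaluated numerically for large $n$. A small sketch contrasting $a_n = \frac{n}{2^n}$ (where $C = \frac{1}{2} < 1$) with the borderline case $a_n = \frac{1}{n}$ (where $C = 1$):

```python
# Root test sketch: |a_n|^(1/n) for large n approximates C.
def nth_root_term(a, n):
    return abs(a(n)) ** (1.0 / n)

geometric_like = nth_root_term(lambda n: n / 2**n, 200)  # approaches 1/2
harmonic_term = nth_root_term(lambda n: 1 / n, 200)      # approaches 1
print(geometric_like, harmonic_term)
```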

## Tips for Testing Series

Convergence tests are methods of testing for the convergence or divergence of an infinite series.

### Learning Objectives

Formulate three techniques that will help when testing the convergence of a series

### Key Takeaways

#### Key Points

- There is no single convergence test that works for all series.
- Practice and training will help you choose the right test for a given series.
- We have learned about the root/ratio tests, the integral test, and the direct/limit comparison tests.

#### Key Terms

**conditional convergence**: A series or integral is said to be conditionally convergent if it converges but does not converge absolutely.

Here is a summary of the convergence tests that we have learned:

### List of Tests

Limit of the Summand: If the limit of the summand is undefined or nonzero, then the series must diverge.

Ratio test: For $r = \lim_{n \to \infty} \left|\frac{a_{n+1}}{a_n}\right|$: if $r < 1$, the series converges; if $r > 1$, the series diverges; if $r = 1$, the test is inconclusive.

Root test: For $r = \limsup_{n \to \infty}\sqrt[n]{ \left|a_n \right|}$: if $r < 1$, then the series converges; if $r > 1$, then the series diverges; if $r = 1$, the root test is inconclusive.

Integral test: For a positive, monotone decreasing function $f(x)$ such that $f(n)=a_n$, if $\int_{1}^{\infty} f(x)\, dx = \lim_{t \to \infty} \int_{1}^{t} f(x)\, dx < \infty$, then the series converges. But if the integral diverges, then the series does so as well.

Direct comparison test: If the series $\sum_{n=1}^\infty b_n$ is an absolutely convergent series and $\left |a_n \right | \le \left | b_n \right|$ for sufficiently large $n$, then the series $\sum_{n=1}^\infty a_n$ converges absolutely.

Limit comparison test: If $\left \{ a_n \right \}, \left \{ b_n \right \} > 0$, and $\lim_{n \to \infty} \frac{a_n}{b_n}$ exists and is not zero, then $\sum_{n=1}^\infty a_n$ converges if and only if $\sum_{n=1}^\infty b_n$ converges.

## Power Series

A power series (in one variable) is an infinite series of the form $f(x) = \sum_{n=0}^\infty a_n \left( x-c \right)^n$, where $a_n$ is the coefficient of the $n$th term and $x$ varies around $c$.

### Learning Objectives

Express a power series in a general form

### Key Takeaways

#### Key Points

- Power series usually arise as the Taylor series of some known function.
- In many situations $c$ is equal to zero—for instance, when considering a Maclaurin series. In such cases, the power series takes the simpler form $f(x) = \sum_{n=0}^\infty a_n x^n = a_0 + a_1 x + a_2 x^2 + a_3 x^3 + \cdots$.
- A power series will converge for some values of the variable $x$ and may diverge for others. If there exists a number $r$ with $0 < r \leq \infty$ such that the series converges when $\left| x-c \right| <r$ and diverges when $\left| x-c \right| >r$, the number $r$ is called the radius of convergence of the power series.

#### Key Terms

**Z-transform**: transform that converts a discrete time-domain signal into a complex frequency-domain representation

**combinatorics**: a branch of mathematics that studies (usually finite) collections of objects that satisfy specified criteria and their structures

A power series (in one variable) is an infinite series of the form:

$\displaystyle{f(x) = \sum_{n=0}^\infty a_n \left( x-c \right)^n = a_0 + a_1 (x-c)^1 + a_2 (x-c)^2 + \cdots}$

where $a_n$ represents the coefficient of the $n$th term, $c$ is a constant, and $x$ varies around $c$ (for this reason one sometimes speaks of the series as being *centered* at $c$). This series usually arises as the Taylor series of some known function. Any polynomial can be easily expressed as a power series around any center $c$, albeit one with most coefficients equal to zero. For instance, the polynomial $f(x) = x^2 + 2x + 3$ can be written as a power series around the center $c=1$ as:

$f(x) = 6 + 4 (x-1) + 1(x-1)^2 + 0(x-1)^3 + 0(x-1)^4 + \cdots$

or, indeed, around any other center $c$.

In many situations $c$ is equal to zero—for instance, when considering a Maclaurin series. In such cases, the power series takes the simpler form:

$\displaystyle{f(x) = \sum_{n=0}^\infty a_n x^n = a_0 + a_1 x + a_2 x^2 + a_3 x^3 + \cdots}$
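The recentered polynomial from the example above can be checked numerically. A small sketch (the coefficients $6, 4, 1$ come from the example; the helper names are illustrative choices):

```python
# Evaluating f(x) = x^2 + 2x + 3 both directly and via its power-series
# form around the center c = 1: f(x) = 6 + 4(x-1) + 1(x-1)^2.
def f_direct(x):
    return x**2 + 2 * x + 3

def f_centered(x, c=1, coeffs=(6, 4, 1)):
    return sum(a * (x - c)**n for n, a in enumerate(coeffs))

for x in (-2.0, 0.0, 1.0, 3.5):
    print(x, f_direct(x), f_centered(x))  # the two forms agree
```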

These power series arise primarily in real and complex analysis, but also occur in combinatorics (under the name of generating functions) and in electrical engineering (under the name of the $Z$-transform). The familiar decimal notation for real numbers can also be viewed as an example of a power series, with integer coefficients, but with the argument $x$ fixed at $\frac{1}{10}$. In number theory, the concept of $p$-adic numbers is also closely related to that of a power series.

### Radius of Convergence

A power series will converge for some values of the variable $x$ and may diverge for others. All power series $f(x)$ in powers of $(x-c)$ will converge at $x=c$. If $c$ is not the only convergent point, then there is always a number $r$ with $0 < r \leq \infty$ such that the series converges whenever $\left| x-c \right| <r$ and diverges whenever $\left| x-c \right| >r$. The number $r$ is called the radius of convergence of the power series. The radius $r$ can be computed from

$\displaystyle{r^{-1}=\lim_{n\to\infty}\left|{a_{n+1}\over a_n}\right|}$

if this limit exists; more generally, the Cauchy-Hadamard theorem gives $r^{-1}=\limsup_{n\to\infty}\sqrt[n]{\left|a_n\right|}$.
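The ratio formula can be estimated numerically for concrete coefficient sequences. A small sketch (the helper name is an illustrative choice): for $a_n = 2^n$, the ratios approach $2$, so $r = \frac{1}{2}$; for $a_n = 3^{-n}$, they approach $\frac{1}{3}$, so $r = 3$.

```python
# Estimating the radius of convergence r = 1 / lim |a_{n+1}/a_n|
# by evaluating a single large-n ratio.
def radius_estimate(a, n):
    """Approximate r using the ratio of consecutive coefficients at index n."""
    return 1.0 / abs(a(n + 1) / a(n))

print(radius_estimate(lambda n: 2.0**n, 100))        # 0.5
print(radius_estimate(lambda n: 1.0 / 3.0**n, 100))  # ≈ 3.0
```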

## Expressing Functions as Power Functions

A power function is a function of the form $f(x) = cx^r$, where $c$ and $r$ are constant real numbers.

### Learning Objectives

Describe the relationship between power functions and infinitely differentiable functions

### Key Takeaways

#### Key Points

- Since all infinitely differentiable functions can be represented in power series, any infinitely differentiable function can be represented as a sum of many power functions (of integer exponents).
- Therefore, an arbitrary function that is infinitely differentiable is expressed as an infinite sum of power functions ($x^n$) of integer exponent: $f(x) = \sum_{n=0} ^ {\infty} \frac {f^{(n)}(0)}{n! } \, x^{n}$.
- Functions of the form $f(x) = x^{3}$, $f(x) = x^{1.2}$, $f(x) = x^{-4}$ are all power functions.

#### Key Terms

**differentiable**: having a derivative, said of a function whose domain and co-domain are manifolds

**power law**: any of many mathematical relationships in which something is related to something else by an equation of the form $f(x) = a x^k$

A power function is a function of the form $f(x) = cx^r$, where $c$ and $r$ are constant real numbers. Polynomials are made of power functions. Since all infinitely differentiable functions can be represented in power series, any infinitely differentiable function can be represented as a sum of many power functions (of integer exponents). The domain of a power function can sometimes be all real numbers, but generally a non-negative value is used to avoid problems with simplifying. The domain of definition is determined by each individual case. Power functions are a special case of power law relationships, which appear throughout mathematics and statistics.

The Taylor series of a real or complex-valued function $f(x)$ that is infinitely differentiable in a neighborhood of a real or complex number $a$ is the power series:

$\displaystyle{\sum_{n=0} ^ {\infty} \frac {f^{(n)}(a)}{n! } \, (x-a)^{n}}$

where $n!$ denotes the factorial of $n$ and $f^{(n)}(a)$ denotes the $n$th derivative of $f$ evaluated at the point $x=a$. Any finite number of initial terms of the Taylor series of a function is called a Taylor polynomial. If the Taylor series is centered at zero, then that series is also called a Maclaurin series:

$\displaystyle{f(x) = \sum_{n=0} ^ {\infty} \frac {f^{(n)}(0)}{n! } \, x^{n}}$

Therefore, an arbitrary function that is infinitely differentiable is expressed as an infinite sum of power functions ($x^n$) of integer exponent.

### Examples

Functions of the form $f(x) = x^3$, $f(x) = x^{1.2}$, $f(x) = x^{-4}$ are all power functions.

## Taylor and Maclaurin Series

A Taylor series represents a function as an infinite sum of terms calculated from the values of the function's derivatives at a single point.

### Learning Objectives

Identify a Maclaurin series as a special case of a Taylor series

### Key Takeaways

#### Key Points

- Any finite number of initial terms of the Taylor series of a function is called a Taylor polynomial.
- A function that is equal to its Taylor series in an open interval (or a disc in the complex plane) is known as an analytic function.
- The Taylor series of a real or complex-valued function $f(x)$ that is infinitely differentiable in a neighborhood of a real or complex number $a$ is the power series $f(x) = \sum_{n=0} ^ {\infty} \frac {f^{(n)}(a)}{n! } \, (x-a)^{n}$. If $a = 0$, the series is called a Maclaurin series.

#### Key Terms

**differentiable**: having a derivative, said of a function whose domain and co-domain are manifolds

**analytic function**: a real valued function which is uniquely defined through its derivatives at one point

It is common practice to approximate a function by using a finite number of terms of its Taylor series. Taylor's theorem gives quantitative estimates on the error in this approximation. Any finite number of initial terms of the Taylor series of a function is called a Taylor polynomial. The Taylor series of a function is the limit of that function's Taylor polynomials, provided that the limit exists. A function may not be equal to its Taylor series, even if its Taylor series converges at every point. A function that is equal to its Taylor series in an open interval (or a disc in the complex plane) is known as an analytic function.

The Taylor series of a real or complex-valued function $f(x)$ that is infinitely differentiable in a neighborhood of a real or complex number $a$ is the power series:

$\begin{align} f(x) &= f(a)+\frac {f'(a)}{1! } (x-a)+ \frac{f''(a)}{2! } (x-a)^2+ \cdots \\ &= \sum_{n=0} ^ {\infty} \frac {f^{(n)}(a)}{n! } \, (x-a)^{n} \end{align}$

where $n!$ denotes the factorial of $n$ and $f^{(n)}(a)$ denotes the $n$th derivative of $f$ evaluated at the point $x=a$. The derivative of order zero of $f$ is defined to be $f$ itself, and $(x - a)^0$ and $0!$ are both defined to be $1$. In the case that $a=0$, the series is also called a Maclaurin series.

### Example 1

The Maclaurin series for $(1 - x)^{-1}$ for $\left| x \right| < 1$ is the geometric series $1+x+x^2+x^3+\cdots$, so the Taylor series for $x^{-1}$ at $a=1$ is:

$1-(x-1)+(x-1)^2-(x-1)^3+\cdots$
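The geometric Maclaurin series can be checked numerically for $|x| < 1$. A small sketch (the truncation order $N$ is an illustrative choice):

```python
# Truncated Maclaurin series for 1/(1 - x): the partial sums
# 1 + x + x^2 + ... + x^N approach 1/(1 - x) when |x| < 1.
def geometric_partial(x, N):
    return sum(x**n for n in range(N + 1))

x = 0.3
print(geometric_partial(x, 40), 1 / (1 - x))  # both ≈ 1/(0.7)
```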

### Example 2

The Taylor series for the exponential function $e^x$ at $a=0$ is:

$\displaystyle{1 + \frac{x^1}{1! } + \frac{x^2}{2! } + \frac{x^3}{3! } + \frac{x^4}{4! } + \frac{x^5}{5! }+ \cdots = 1 + x + \frac{x^2}{2} + \frac{x^3}{6} + \frac{x^4}{24} + \frac{x^5}{120} + \cdots = \sum_{n=0}^\infty \frac{x^n}{n! }}$
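The truncated series for $e^x$ converges quickly. A small sketch (the helper name and truncation orders are illustrative choices):

```python
import math

# Truncated Maclaurin series for e^x: sum of x^n / n! for n = 0..N.
def exp_taylor(x, N):
    return sum(x**n / math.factorial(n) for n in range(N + 1))

for N in (2, 5, 10, 20):
    print(N, exp_taylor(1.0, N))  # approaches e ≈ 2.71828
```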

## Applications of Taylor Series

Taylor series expansions can help in approximating values of functions and in evaluating definite integrals.

### Learning Objectives

Describe applications of the Taylor series expansion

### Key Takeaways

#### Key Points

- The partial sums of the series, which are called Taylor polynomials, can be used as approximations of the entire function.
- Differentiation and integration of power series can be performed term by term, and hence could be easier than directly working with the original function.
- The (truncated) series can be used to compute function values numerically. This is particularly useful in evaluating special mathematical functions (such as the Bessel functions).

#### Key Terms

**definite integral**: the integral of a function between an upper and lower limit**complex analysis**: theory of functions of a complex variable; a branch of mathematical analysis that investigates functions of complex numbers**analytic function**: a real valued function which is uniquely defined through its derivatives at one point

1. The partial sums (the Taylor polynomials) of the series can be used as approximations of the entire function. These approximations are often good enough if sufficiently many terms are included. Approximations using the first few terms of a Taylor series can make otherwise unsolvable problems possible for a restricted domain; this approach is often used in physics.

2. Differentiation and integration of power series can be performed term by term and is hence particularly easy. Taylor series are especially useful in evaluating definite integrals. For example, to evaluate $\int_{0}^{1} e^{x^3} \, dx$, the Taylor series expansion $e^t= \sum_{n=0}^{\infty} \frac{1}{n! } \, t^n$ and the substitution $t=x^3$ can be used. Since each term in the summation can be integrated separately, we can evaluate the definite integral as long as the sum converges.

3. An analytic function is uniquely extended to a holomorphic function on an open disk in the complex plane. This makes the machinery of complex analysis available.

4. The (truncated) series can be used to compute function values numerically. This is particularly useful in evaluating special mathematical functions (such as the Bessel functions).

5. Algebraic operations can be done readily on the power series representation; for instance, Euler's formula follows from Taylor series expansions for trigonometric and exponential functions. $e^{ix}$ can be found from the Taylor expansions of $\cos(x)$ and $\sin(x)$:

$\displaystyle{\cos(x) = 1-\frac{x^2}{2!}+\frac{x^4}{4! } -\frac{x^6}{6! }+ \cdots}$

$\displaystyle{\sin(x) = x - \frac{x^3}{3!}+\frac{x^5}{5! } -\frac{x^7}{7! } + \cdots}$

and adding the two (with the sine series multiplied by $i$) yields:

$\begin{align} \cos(x)+i\,\sin(x) & = \Bigl(1-\frac{x^2}{2!}+\frac{x^4}{4! } - \cdots\Bigr) + i \Bigl(x - \frac{x^3}{3! } + \frac{x^5}{5! } - \cdots\Bigr) \\ & = 1 + ix + \frac{(ix)^2}{2! } + \frac{(ix)^3}{3! }+ \frac{(ix)^4}{4! } + \cdots \\ & = e^{ix} \end{align}$

This result is of fundamental importance in many fields of mathematics (for example, in complex analysis), physics and engineering.
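The identity can be verified numerically by summing the power series for $e^z$ with a complex argument. A small sketch (the sample point $x = 0.75$ and truncation order are illustrative choices):

```python
import math

# Summing the terms 1 + ix + (ix)^2/2! + (ix)^3/3! + ... numerically
# reproduces cos(x) + i sin(x), i.e. Euler's formula e^{ix}.
def exp_series(z, N=30):
    """Truncated power series for e^z with a complex argument z."""
    return sum(z**n / math.factorial(n) for n in range(N + 1))

x = 0.75
series_value = exp_series(1j * x)
euler_value = complex(math.cos(x), math.sin(x))
print(abs(series_value - euler_value))  # agreement at rounding level
```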

## Summing an Infinite Series

Infinite sequences and series can either converge or diverge.

### Learning Objectives

Describe the properties of infinite series

### Key Takeaways

#### Key Points

- Infinite sequences and series continue indefinitely.
- A series is said to converge when the sequence of partial sums has a finite limit.
- A series is said to diverge when the limit is infinite or does not exist.

#### Key Terms

**limit**: a value to which a sequence or function converges

**sequence**: an ordered list of objects

For any infinite sequence of real or complex numbers, the associated series is defined as the ordered formal sum

$\displaystyle{\sum_{n=0}^{\infty}a_n = a_0 + a_1 + a_2 + \cdots}$

The sequence of partial sums

$\{S_k\}$

associated to a series $\sum_{n=0}^\infty a_n$

is defined for each $k$ as the sum of the sequence $\{a_n\}$

from $a_0$

to $a_k$:

$\displaystyle{S_k = \sum_{n=0}^{k}a_n = a_0 + a_1 + \cdots + a_k}$

Infinite sequences and series can either converge or diverge. A series is said to converge when the sequence of partial sums has a finite limit. By definition the series

$\sum_{n=0}^\infty a_n$

converges to a limit $L$

if and only if the associated sequence of partial sums converges to $L$

. This definition is usually written as:

$\displaystyle{L = \sum_{n=0}^{\infty}a_n \Leftrightarrow L = \lim_{k \rightarrow \infty} S_k}$

If the limit of $S_k$ is infinite or does not exist, the series is said to diverge.
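The behavior of partial sums can be explored numerically. The sketch below (the helper name `partial_sum` is our own) contrasts a convergent geometric series with the divergent harmonic series:

```python
def partial_sum(a, k: int) -> float:
    """k-th partial sum S_k = a(0) + a(1) + ... + a(k)."""
    return sum(a(n) for n in range(k + 1))

# Convergent: the partial sums of sum (1/2)^n approach the limit 2.
print(partial_sum(lambda n: 0.5 ** n, 50))

# Divergent: the harmonic series 1 + 1/2 + 1/3 + ... grows without
# bound (roughly like ln k), so its partial sums have no finite limit.
print(partial_sum(lambda n: 1 / (n + 1), 10 ** 6))
```

The geometric partial sums stabilize quickly, while the harmonic partial sums keep growing no matter how many terms are added.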

An easy way that an infinite series can converge is if all the

$a_{n}$

are zero for sufficiently large $n$. Such a series can be identified with a finite sum, so it is only infinite in a trivial sense. Working out the properties of series that converge even when infinitely many terms are non-zero is therefore the essence of the study of series. In the following atoms, we will study how to tell whether a series converges and how to compute its sum when such a value exists.

## Convergence of Series with Positive Terms

For a sequence $\{a_n\}$

, where $a_n$

is a non-negative real number for every $n$

, the sum $\sum_{n=0}^{\infty}a_n$

can either converge or diverge to $\infty$.

### Learning Objectives

Identify convergence conditions for a series with positive terms

### Key Takeaways

#### Key Points

- Because the partial sum $S_k$ of a series with non-negative terms can only increase as $k$ becomes larger, the sequence of partial sums either converges to a finite limit or diverges to $\infty$.
- A geometric sum $1 + \frac{1}{2}+ \frac{1}{4}+ \frac{1}{8}+\cdots+ \frac{1}{2^n}+\cdots$ converges to $2$, which can be understood visually.
- The series $\sum_{n \ge 1} \frac{1}{n^2}$ is convergent. This can be seen by comparing individual terms of the series with the sequence $b_n = \frac{1}{n-1} - \frac{1}{n}$, whose telescoping sum is known to converge.

#### Key Terms

**converge**: of a sequence, to have a limit

**convergence test**: methods of testing for the convergence, conditional convergence, absolute convergence, interval of convergence, or divergence of an infinite series

For a sequence $\{a_n\}$

, where $a_n$

is a non-negative real number for every $n$

, the sequence of partial sums

$\displaystyle{S_k = \sum_{n=0}^{k}a_n = a_0 + a_1 + \cdots + a_k}$

is non-decreasing. Because the partial sum

$S_k$

can only increase as $k$

becomes larger, the limit of the partial sum can either converge or diverge to $\infty$

. Therefore, it follows that a series $\sum_{n=0}^{\infty} a_n$

with non-negative terms converges if and only if the sequence $S_k$

of partial sums is bounded.

### Example 1

The series $\sum_{n \ge 1} \frac{1}{n^2}$

is convergent because of the inequality:

$\displaystyle{\frac{1}{n^2} \le \frac{1}{n-1} - \frac{1}{n} \quad (n \ge 2)}$

and because:

$\displaystyle{\sum_{n \ge 2} \left(\frac{1}{n-1} - \frac{1}{n} \right) =\left(1-\frac{1}{2}\right) + \left(\frac{1}{2}-\frac{1}{3}\right) + \left(\frac{1}{3}-\frac{1}{4}\right) + \cdots = 1}$
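The comparison bounds every partial sum of $\sum 1/n^2$ by $1 + 1 = 2$, so the non-decreasing partial sums must converge. Numerically they approach $\pi^2/6 \approx 1.6449$, the famous value of the Basel problem. A minimal sketch (the function name `partial_basel` is our own):

```python
import math

def partial_basel(k: int) -> float:
    """Partial sum sum_{n=1}^{k} 1/n^2."""
    return sum(1 / n ** 2 for n in range(1, k + 1))

for k in (10, 100, 10_000):
    print(k, partial_basel(k))  # increasing, but bounded above by 2

print(math.pi ** 2 / 6)  # the limit, approximately 1.6449
```

Note that boundedness alone is what the comparison proves; identifying the exact limit $\pi^2/6$ requires entirely different tools.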

### Example 2

Would the series

$\displaystyle{S = 1 + \frac{1}{2}+ \frac{1}{4}+ \frac{1}{8}+\cdots+ \frac{1}{2^n}+\cdots}$

converge? It is possible to "visualize" its convergence on the real number line. We can imagine a line of length

$2$

, with successive segments marked off of lengths $1$

, $\frac{1}{2}$

, $\frac{1}{4}$

, etc. There is always room to mark the next segment, because the amount of line remaining is always the same as the last segment marked: when we have marked off $\frac{1}{2}$

, we still have a piece of length $\frac{1}{2}$

unmarked, so we can certainly mark the next $\frac{1}{4}$

. This argument does not prove that the sum is equal to $2$

(although it is), but it does prove that it is at most $2$

. In other words, the series has an upper bound. Proving that the series is equal to $2$

requires only elementary algebra, however. If the series is denoted $S$

, it can be seen that:

$\displaystyle{\frac{S}{2} = \frac{1+ \frac{1}{2}+ \frac{1}{4}+ \frac{1}{8}+\cdots}{2} = \frac{1}{2}+ \frac{1}{4}+ \frac{1}{8}+ \frac{1}{16} +\cdots}$

Therefore:

$\displaystyle{S-\frac{S}{2} = 1 \quad \Longrightarrow \quad S = 2}$
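The marking argument can be simulated directly. Because powers of two are represented exactly in binary floating point, the unmarked remainder equals the segment just marked at every step, exactly as the argument above claims:

```python
remaining = 2.0   # total length of the line
segment = 1.0     # length of the next segment to mark

for _ in range(20):
    remaining -= segment
    # The unmarked remainder always equals the last segment marked,
    # so there is always room for the next (half-length) segment.
    assert remaining == segment
    segment /= 2

print(remaining)  # what is left after 20 segments: 2^-19
```

The comparisons here are exact (not approximate) because each quantity is a power of two; with a ratio other than $\frac{1}{2}$, floating-point rounding would require tolerance-based comparisons instead.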

For these specific examples, there are easy ways to check convergence. In general, however, no easy check may exist, and we can turn to one of several well-known convergence tests (such as the ratio test or the integral test). We will learn some of these tests in the following atoms.