6 Power Series
6.1 From Polynomials to Power Series
In the preceding chapter, we established that polynomials form a vector space. The space \mathcal{P}_n of polynomials of degree at most n has dimension n+1, with standard basis \{1, x, x^2, \ldots, x^n\}. Every polynomial in \mathcal{P}_n is a finite linear combination: p(x) = c_0 \cdot 1 + c_1 \cdot x + c_2 \cdot x^2 + \cdots + c_n \cdot x^n = \sum_{k=0}^{n} c_k x^k.
The space \mathcal{P} of all polynomials is infinite-dimensional, but each element uses only finitely many basis functions. The coordinate vector (c_0, c_1, c_2, \ldots) has only finitely many nonzero entries.
Power series remove this restriction. A power series f(x) = \sum_{n=0}^{\infty} c_n x^n = c_0 + c_1 x + c_2 x^2 + c_3 x^3 + \cdots allows infinitely many nonzero coefficients. This is an infinite linear combination in the basis \{1, x, x^2, \ldots\}.
The central question: when does such an infinite sum define a function? For each fixed x, we obtain a numerical series \sum c_n x^n. Whether this converges depends on both the coefficients c_n and the value of x. The theory of series convergence, developed in previous sections, determines the answer.
6.2 Power Series as Functions
Definition 6.1 (Power Series) A power series centered at a is a formal infinite linear combination \sum_{n=0}^{\infty} c_n (x-a)^n = c_0 + c_1(x-a) + c_2(x-a)^2 + c_3(x-a)^3 + \cdots, where c_n \in \mathbb{R} are coefficients and x is a variable.
For each fixed x, this produces a numerical series. The set of all x where the series converges is the domain of convergence.
Contrast with polynomials:
Polynomial: Finite sum, converges everywhere, always defines a function
Power series: Infinite sum, converges on a subset, defines a function only where convergent
Geometric interpretation: Just as a polynomial p(x) = \sum_{k=0}^{n} c_k x^k represents a vector in the (n+1)-dimensional space \mathcal{P}_n, a power series f(x) = \sum_{n=0}^{\infty} c_n x^n represents a vector in an infinite-dimensional function space. The coordinates (c_0, c_1, c_2, \ldots) form an infinite sequence, and convergence determines whether this infinite linear combination produces a well-defined function.
Convention. We focus on series centered at a = 0. The general case \sum c_n (x-a)^n reduces via substitution u = x-a.
6.3 Three Fundamental Examples
Before developing general theory, examine three power series with distinct convergence behavior.
Example 6.1 (Convergence Only at Origin) Consider \sum_{n=0}^{\infty} n! x^n. For any x \neq 0, apply the ratio test: \frac{(n+1)!|x|^{n+1}}{n!|x|^n} = (n+1)|x| \to \infty. The series diverges for all x \neq 0, converging only at the origin.
Example 6.2 (Convergence Everywhere) Consider \sum_{n=0}^{\infty} \frac{x^n}{n!}. For any fixed x, apply the ratio test: \frac{|x|^{n+1}/(n+1)!}{|x|^n/n!} = \frac{|x|}{n+1} \to 0 < 1. The series converges absolutely for all x \in \mathbb{R}. (This represents e^x.)
Example 6.3 (Convergence on an Interval) The geometric series \sum_{n=0}^{\infty} x^n converges for |x| < 1 and diverges for |x| \geq 1. From Section 2.6, when |x| < 1: \sum_{n=0}^{\infty} x^n = \frac{1}{1-x}.
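The three behaviors can be probed numerically through partial sums. A minimal sketch (the helper `partial_sum` and the cutoffs are illustrative choices, not from the text):

```python
import math

def partial_sum(coef, x, N):
    """Sum of coef(n) * x**n for n = 0 .. N-1."""
    return sum(coef(n) * x**n for n in range(N))

# Example 6.2: sum x^n / n! converges everywhere (to e^x).
exp_coef = lambda n: 1.0 / math.factorial(n)
print(partial_sum(exp_coef, 3.0, 30), math.exp(3.0))  # partial sum ~ e^3

# Example 6.3: the geometric series converges to 1/(1-x) for |x| < 1 ...
geo = partial_sum(lambda n: 1.0, 0.5, 60)
print(geo, 1 / (1 - 0.5))

# ... but its partial sums grow without bound for |x| >= 1.
print(partial_sum(lambda n: 1.0, 2.0, 20))

# Example 6.1: in sum n! x^n the terms n! * x^n eventually grow for any
# x != 0, so the series cannot converge there.
terms = [math.factorial(n) * 0.1**n for n in range(30)]
print(terms[29] > terms[0])
```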
These examples suggest that convergence is determined by distance from the center. The next section makes this precise.
6.4 Radius of Convergence
A fundamental observation: if \sum c_n x_0^n converges for some x_0 \neq 0, then the series converges absolutely for all |x| < |x_0|.
Lemma 6.1 (Comparison Lemma) If \sum c_n x_0^n converges for some x_0 \neq 0, then \sum c_n x^n converges absolutely for all x with |x| < |x_0|.
Convergence of \sum c_n x_0^n implies c_n x_0^n \to 0, hence the sequence \{c_n x_0^n\} is bounded: |c_n x_0^n| \leq M for some M > 0.
For |x| < |x_0|, define r = |x|/|x_0| < 1. Then |c_n x^n| = |c_n x_0^n| \cdot \left|\frac{x}{x_0}\right|^n \leq M r^n. Since \sum M r^n is geometric with ratio r < 1, it converges. By comparison, \sum |c_n x^n| converges. \square
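The geometric bound at the heart of the lemma can be checked numerically. A sketch with the illustrative choices c_n = n, x_0 = 0.8, x = 0.5 (the series \sum n x^n does converge at x_0 = 0.8):

```python
# Verify |c_n x^n| <= M r^n for the sample values above, where
# M bounds |c_n x_0^n| and r = |x|/|x_0| < 1.
c = lambda n: n
x0, x = 0.8, 0.5

M = max(abs(c(n) * x0**n) for n in range(200))  # bound on |c_n x_0^n|
r = abs(x) / abs(x0)                            # r < 1

# Small slack absorbs floating-point rounding.
ok = all(abs(c(n) * x**n) <= M * r**n + 1e-12 for n in range(200))
print(ok)  # True: the terms are dominated by a convergent geometric series
```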
Contrapositive. If \sum c_n x_0^n diverges, then \sum c_n x^n diverges for all |x| > |x_0|.
These results establish that convergence is determined by distance from the center.
Theorem 6.1 (Existence of Radius of Convergence) For any power series \sum_{n=0}^{\infty} c_n x^n, exactly one of the following holds:
(i) The series converges only for x = 0 (set R = 0)
(ii) The series converges for all x \in \mathbb{R} (set R = \infty)
(iii) There exists R \in (0, \infty) such that the series converges absolutely for |x| < R and diverges for |x| > R
The value R is called the radius of convergence.
Define S = \{|x| : \sum c_n x^n \text{ converges}\}. This set is nonempty since 0 \in S.
Case 1: If S = \{0\}, we have case (i).
Case 2: If S is unbounded, then for any M > 0, there exists x_0 with |x_0| > M and \sum c_n x_0^n convergent. By the comparison lemma, \sum c_n x^n converges absolutely for all |x| < |x_0|, hence for all |x| < M. Since M was arbitrary, we have case (ii).
Case 3: Otherwise, S is bounded and contains a point other than 0. By the completeness of \mathbb{R}, R = \sup S exists, and 0 < R < \infty.
If |x| < R, then |x| is not an upper bound for S, so there exists x_0 with |x| < |x_0| and \sum c_n x_0^n convergent. By the comparison lemma, \sum c_n x^n converges absolutely.
If |x| > R, suppose for contradiction that \sum c_n x^n converges. Then |x| \in S, contradicting that R is an upper bound. Thus the series diverges. \square
The radius completely determines convergence except at the boundary points x = \pm R, where separate testing is required.
6.5 Computing the Radius
Two formulas compute R directly from coefficients, derived by applying ratio and root tests to the power series.
Theorem 6.2 (Ratio Formula) If \lim_{n \to \infty} \left|\frac{c_{n+1}}{c_n}\right| = L exists (possibly infinite), then R = \begin{cases} 1/L & \text{if } 0 < L < \infty \\ \infty & \text{if } L = 0 \\ 0 & \text{if } L = \infty \end{cases}
Apply the ratio test to \sum c_n x^n with x fixed: \lim_{n \to \infty} \frac{|c_{n+1} x^{n+1}|}{|c_n x^n|} = |x| \lim_{n \to \infty} \left|\frac{c_{n+1}}{c_n}\right| = L|x|. The ratio test gives absolute convergence when L|x| < 1, i.e., |x| < 1/L, and divergence when L|x| > 1, i.e., |x| > 1/L. Thus R = 1/L.
When L = 0, the inequality holds for all x, giving R = \infty. When L = \infty, it holds only for x = 0, giving R = 0. \square
Theorem 6.3 (Root Formula) If \lim_{n \to \infty} \sqrt[n]{|c_n|} = L exists (possibly infinite), then R = \begin{cases} 1/L & \text{if } 0 < L < \infty \\ \infty & \text{if } L = 0 \\ 0 & \text{if } L = \infty \end{cases}
The proof follows the same pattern using the root test.
Computation Examples
Example 6.4 (Computing Radius for \sum \frac{x^n}{n}) For \sum_{n=1}^{\infty} \frac{x^n}{n}: \lim_{n \to \infty} \frac{n}{n+1} = 1, so R = 1.
Example 6.5 (Computing Radius for \sum \frac{x^n}{n!}) For \sum_{n=0}^{\infty} \frac{x^n}{n!}: \lim_{n \to \infty} \frac{n!}{(n+1)!} = 0, so R = \infty.
Example 6.6 (Computing Radius for \sum n! x^n) For \sum_{n=0}^{\infty} n! x^n: \lim_{n \to \infty} \frac{(n+1)!}{n!} = \infty, so R = 0.
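The ratio formula can be approximated numerically: R is the limit of |c_n / c_{n+1}|. A sketch for the three examples (the helper `radius_estimate` and the cutoff n = 50, standing in for the limit, are illustrative choices):

```python
import math

def radius_estimate(coef, n=50):
    """Approximate R by the ratio |c_n / c_{n+1}| at a single large n."""
    return abs(coef(n) / coef(n + 1))

# Example 6.4: sum x^n / n  ->  estimate near 1 (R = 1).
print(radius_estimate(lambda n: 1.0 / n))

# Example 6.5: sum x^n / n!  ->  estimate n + 1, growing without bound (R = infinity).
print(radius_estimate(lambda n: 1.0 / math.factorial(n)))

# Example 6.6: sum n! x^n  ->  estimate 1 / (n + 1), shrinking to 0 (R = 0).
print(radius_estimate(lambda n: float(math.factorial(n))))
```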
6.6 The Interval of Convergence
The radius R determines convergence on the open interval (-R, R). At endpoints x = \pm R (when R < \infty), the series must be tested separately.
Definition 6.2 (Interval of Convergence) The interval of convergence is the set of all x where \sum c_n x^n converges. It has one of the forms: \{0\}, \quad (-R, R), \quad [-R, R), \quad (-R, R], \quad [-R, R], \quad \mathbb{R}
Procedure:
Compute R using ratio or root formula
Test convergence at x = R (if R < \infty)
Test convergence at x = -R (if R < \infty)
State the interval, including/excluding endpoints as appropriate
Example. Consider \sum_{n=1}^{\infty} \frac{x^n}{n}. We found R = 1.
At x = 1: Series becomes \sum \frac{1}{n} (harmonic series), which diverges.
At x = -1: Series becomes \sum \frac{(-1)^n}{n}, which converges by alternating series test.
Interval of convergence: [-1, 1).
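The endpoint behavior is visible in the partial sums. A numerical sketch (the helper `partial` and the cutoffs are illustrative; recall \sum_{n=1}^{\infty} (-1)^n/n = -\ln 2):

```python
import math

def partial(x, N):
    """Partial sum of x^n / n for n = 1 .. N."""
    return sum(x**n / n for n in range(1, N + 1))

# x = -1: alternating harmonic series; partial sums settle near -ln 2.
print(partial(-1.0, 10_000), -math.log(2))

# x = 1: harmonic series; partial sums keep growing (roughly like ln N),
# so there is no limit.
print(partial(1.0, 10_000) - partial(1.0, 1_000))
```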
6.7 Vector Space of Analytic Functions
A power series with R > 0 defines a function on its interval of convergence: f(x) = \sum_{n=0}^{\infty} c_n x^n.
The set of all such functions forms a vector space under pointwise operations.
Definition 6.3 (Analytic Function) A function f is analytic at a if there exists R > 0 and coefficients c_n such that f(x) = \sum_{n=0}^{\infty} c_n (x-a)^n for all x with |x-a| < R.
The space of functions analytic at a is denoted \mathcal{A}_a.
Vector space structure:
Addition: If f(x) = \sum a_n x^n with radius R_1 and g(x) = \sum b_n x^n with radius R_2, then (f+g)(x) = \sum (a_n + b_n) x^n with radius at least \min(R_1, R_2).
Scalar multiplication: (\lambda f)(x) = \sum (\lambda a_n) x^n has the same radius as f.
Zero element: The constant function 0(x) = 0 corresponds to the series with all coefficients zero.
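The vector-space operations act coefficientwise. A sketch with truncated coefficient lists (the truncation length N = 25 and the helper names are illustrative choices):

```python
import math

N = 25
exp_coefs = [1.0 / math.factorial(n) for n in range(N)]   # e^x, R = infinity
geo_coefs = [1.0] * N                                     # 1/(1-x), R = 1

# Addition: (f + g) has coefficients a_n + b_n.
sum_coefs = [a + b for a, b in zip(exp_coefs, geo_coefs)]

def evaluate(coefs, x):
    return sum(ck * x**k for k, ck in enumerate(coefs))

# Evaluate inside min(R_1, R_2) = 1; the truncated sum matches e^x + 1/(1-x).
x = 0.3
print(evaluate(sum_coefs, x))
print(math.exp(x) + 1 / (1 - x))
```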
The space \mathcal{A}_0 of functions analytic at 0 strictly contains the polynomial space \mathcal{P}. Functions like e^x, \sin(x), and \frac{1}{1-x} have power series representations but are not polynomials.
6.8 Differentiation
Power series inherit the smoothness of polynomials. Just as differentiating \sum_{k=0}^{n} c_k x^k gives \sum_{k=1}^{n} kc_k x^{k-1}, we can differentiate infinite sums term-by-term.
Theorem 6.4 (Term-by-Term Differentiation) Let f(x) = \sum_{n=0}^{\infty} c_n x^n with radius R > 0. Then f is differentiable on (-R, R), and f'(x) = \sum_{n=1}^{\infty} n c_n x^{n-1}. The differentiated series has the same radius R.
Proof sketch. Apply the ratio test to \sum n c_n x^{n-1}, whose successive coefficients are n c_n and (n+1) c_{n+1}: \lim_{n \to \infty} \frac{(n+1)|c_{n+1}|}{n|c_n|} = \lim_{n \to \infty} \frac{n+1}{n} \cdot \frac{|c_{n+1}|}{|c_n|} = 1 \cdot L = L, assuming the limit L = \lim |c_{n+1}/c_n| exists. The radii match. Proving that the differentiated series actually equals f' requires uniform convergence.
Consequence. Power series are C^\infty (infinitely differentiable): f^{(k)}(x) = \sum_{n=k}^{\infty} n(n-1)\cdots(n-k+1) c_n x^{n-k}.
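Term-by-term differentiation is a coefficient operation: the derivative of \sum c_n x^n has coefficients (n+1) c_{n+1}. A sketch on the series for e^x, whose derivative series should reproduce the original (the helper names and cutoff are illustrative):

```python
import math

N = 30
c = [1.0 / math.factorial(n) for n in range(N)]   # coefficients of e^x

def derivative_coefs(coefs):
    """Coefficients of f': the x^n coefficient is (n+1) * c_{n+1}."""
    return [(n + 1) * coefs[n + 1] for n in range(len(coefs) - 1)]

dc = derivative_coefs(c)

def evaluate(coefs, x):
    return sum(ck * x**k for k, ck in enumerate(coefs))

# (e^x)' = e^x: the differentiated series matches math.exp.
x = 1.5
print(evaluate(dc, x), math.exp(x))
```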
6.9 Uniqueness
If a function equals a power series, the coefficients are uniquely determined.
Theorem 6.5 (Uniqueness of Coefficients) If f(x) = \sum_{n=0}^{\infty} c_n x^n for all x in some interval (-r, r) with r > 0, then c_n = \frac{f^{(n)}(0)}{n!}.
Differentiate k times and evaluate at x = 0: f^{(k)}(x) = \sum_{n=k}^{\infty} n(n-1)\cdots(n-k+1) c_n x^{n-k}. At x = 0, all terms vanish except n = k: f^{(k)}(0) = k! c_k. \quad \square
The coefficients encode the derivatives at the center. In the normalized basis \{1, x, \frac{x^2}{2!}, \frac{x^3}{3!}, \ldots\}, the coordinates of f are precisely (f(0), f'(0), f''(0), \ldots).
This connects to the polynomial framework: just as polynomials have coordinates in \{1, x, x^2, \ldots\}, analytic functions have infinite coordinate vectors determined by their derivatives.
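The identity f^{(k)}(0) = k! c_k can be checked directly on coefficient lists: differentiate k times and read off the constant term. A sketch with arbitrary sample coefficients (the values and helper names are illustrative):

```python
import math

c = [2.0, -1.0, 0.5, 3.0, -0.25]   # f(x) = 2 - x + 0.5 x^2 + 3 x^3 - 0.25 x^4

def derivative_coefs(coefs):
    """Coefficients of f': the x^n coefficient is (n+1) * c_{n+1}."""
    return [(n + 1) * coefs[n + 1] for n in range(len(coefs) - 1)]

for k in range(len(c)):
    d = c
    for _ in range(k):
        d = derivative_coefs(d)
    # The constant term of the k-th derivative series is f^{(k)}(0) = k! c_k.
    print(d[0], math.factorial(k) * c[k])
```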
6.10 Summary
Power series extend polynomials from finite to infinite linear combinations. The theory developed here establishes:
Convergence domains: Every power series has a radius R \in [0, \infty] determining where it converges
Vector space structure: Analytic functions form a vector space under pointwise operations
Calculus operations: Power series can be differentiated term-by-term within their radius of convergence, and the radius is unchanged
Uniqueness: Coefficients are determined by derivatives at the center
The next chapter develops Taylor series, showing how to construct power series representations for C^\infty functions by extracting their derivatives at a point.