Conditional convergence


In mathematics, a series or integral is said to be conditionally convergent if it converges but does not converge absolutely.

Definition

More precisely, a series \sum_{n=0}^\infty a_n is said to converge conditionally if \lim_{m\to\infty} \sum_{n=0}^m a_n exists and is a finite number (not ∞ or −∞), but \sum_{n=0}^\infty |a_n| = \infty.

A classic example is the alternating series given by

1 - \frac{1}{2} + \frac{1}{3} - \frac{1}{4} + \frac{1}{5} - \cdots = \sum_{n=1}^\infty \frac{(-1)^{n+1}}{n}

which converges to \ln(2), but is not absolutely convergent (see Harmonic series).
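
A quick numerical illustration (a minimal Python sketch; the cutoff N = 100000 is an arbitrary choice): the partial sums of the signed series settle near \ln(2), while the partial sums of the absolute values keep growing.

  import math

  N = 100_000  # arbitrary cutoff for the illustration

  # Partial sum of the alternating series 1 - 1/2 + 1/3 - ...
  signed_sum = sum((-1) ** (n + 1) / n for n in range(1, N + 1))
  # Partial sum of the absolute values, i.e. the harmonic series
  absolute_sum = sum(1 / n for n in range(1, N + 1))

  print(signed_sum, math.log(2))  # the two values agree to several decimal places
  print(absolute_sum)             # grows like ln(N), i.e. without bound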

Bernhard Riemann proved that a conditionally convergent series may be rearranged to converge to any sum at all, including ∞ or −∞; see Riemann series theorem.
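
The rearrangement can be made explicit by a greedy procedure: add positive terms until the running sum exceeds the target, then negative terms until it falls below, and repeat. The sketch below (Python; the target \pi/4 and the number of steps are arbitrary choices for the illustration) applies this to the alternating harmonic series above.

  import math

  # Positive terms of the series are 1, 1/3, 1/5, ...; negative terms are -1/2, -1/4, ...
  target = math.pi / 4   # arbitrary target value
  pos, neg = 1, 2        # next unused odd / even denominator
  total = 0.0
  for _ in range(100_000):
      if total <= target:
          total += 1 / pos   # take the next positive term
          pos += 2
      else:
          total -= 1 / neg   # take the next negative term
          neg += 2
  print(total, target)       # the rearranged partial sums approach the target

This works because the positive and negative parts of the series each diverge while the individual terms tend to zero, so the running sum can always be pushed past the target in either direction and the overshoot shrinks to zero.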

A typical example of a conditionally convergent integral is the integral of \sin(x^2) over the non-negative real axis (see Fresnel integral).
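
Numerically, \int_0^\infty \sin(x^2)\,dx = \sqrt{\pi/8} \approx 0.6267, while the integral of |\sin(x^2)| diverges. A minimal check of the partial integrals, assuming SciPy is available (scipy.special.fresnel uses the convention S(z) = \int_0^z \sin(\pi t^2/2)\,dt, so \int_0^L \sin(x^2)\,dx = \sqrt{\pi/2}\,S(L\sqrt{2/\pi})):

  import numpy as np
  from scipy.special import fresnel

  # Partial integral of sin(x^2) from 0 to L, rewritten via the Fresnel integral S.
  for L in [5, 10, 50, 200]:          # arbitrary cutoffs for the illustration
      S, _ = fresnel(L * np.sqrt(2 / np.pi))
      print(L, np.sqrt(np.pi / 2) * S)

  print(np.sqrt(np.pi / 8))           # limiting value, about 0.6267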
