Uniformly bounded derivatives implies globally analytic

==Statement==

===Global statement===


Suppose <math>f</math> is an infinitely differentiable function on <math>\R</math> such that, for any fixed <math>a,b \in \R</math> with <math>a < b</math>, there is a constant <math>C</math> (possibly dependent on <math>a,b</math>) such that for all nonnegative integers <math>n</math>, we have:


<math>|f^{(n)}(t)| \le C \ \forall t \in [a,b]</math>


Then, <math>f</math> is a [[globally analytic function]]: the [[Taylor series]] of <math>f</math> about any point in <math>\R</math> converges to <math>f</math>. In particular, the Taylor series of <math>f</math> about 0 converges to <math>f</math>.
==Facts used==
# [[uses::Max-estimate version of Lagrange formula]]


==Examples==


The functions <math>\exp, \sin, \cos</math> all fit this description.
If <math>f = \exp</math>, we know that each of the derivatives equals <math>\exp</math>, so <math>f^{(n)}(t) = f(t)</math> for all <math>t \in [a,b]</math>. Since <math>\exp</math> is continuous, it is bounded on the closed interval <math>[a,b]</math>, and the upper bound for <math>\exp</math> thus serves as a uniform bound for all its derivatives. (In fact, since <math>f</math> is increasing, we can explicitly take <math>C = \exp(b)</math>).
For <math>f = \sin</math> or <math>f = \cos</math>, we know that all the derivatives are <math>\pm \sin</math> or <math>\pm \cos</math>, so their magnitude is at most 1. Thus, we can take <math>C = 1</math>.
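The bound <math>C = 1</math> for <math>\sin</math> can be checked numerically. The sketch below (the helper name <code>taylor_sin</code> is our own, not part of the article) compares the true remainder of the degree-<math>n</math> Taylor polynomial about 0 with the Lagrange estimate <math>|x|^{n+1}/(n+1)!</math> that the proof below relies on:

```python
import math

def taylor_sin(x, n):
    """Degree-n Taylor polynomial of sin about 0, evaluated at x."""
    total = 0.0
    for k in range(1, n + 1, 2):  # only odd powers appear in sin's series
        total += (-1) ** ((k - 1) // 2) * x ** k / math.factorial(k)
    return total

x = 2.0
for n in (3, 7, 11):
    remainder = abs(math.sin(x) - taylor_sin(x, n))
    bound = abs(x) ** (n + 1) / math.factorial(n + 1)  # C = 1 for sin
    print(f"n={n}: remainder={remainder:.2e} <= bound={bound:.2e}")
```

In each case the remainder stays below the Lagrange bound, and both shrink factorially as <math>n</math> grows.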
==Proof==
'''Given''': <math>f</math> is an infinitely differentiable function on <math>\R</math> such that, for any fixed <math>a,b \in \R</math>, there is a constant <math>C</math> (possibly dependent on <math>a,b</math>) such that for all nonnegative integers <math>n</math>, we have:
<math>|f^{(n)}(t)| \le C \ \forall t \in [a,b]</math>
A point <math>x_0 \in \R</math> and a point <math>x \in \R</math>.
'''To prove''': The Taylor series of <math>f</math> at <math>x_0</math>, evaluated at <math>x</math>, converges to <math>f(x)</math>.
'''Proof''': Note that if <math>x_0 = x</math>, there is nothing to prove, so we consider the case <math>x \ne x_0</math>.
In order to show this, it suffices to show that <math>\lim_{n \to \infty} P_n(f;x_0)(x) = f(x)</math> where <math>P_n(f;x_0)(x)</math> denotes the <math>n^{th}</math> Taylor polynomial of <math>f</math> at <math>x_0</math>, evaluated at <math>x</math>.
This in turn is equivalent to showing that the ''remainder'' approaches zero:
'''Want to show''': <math>\lim_{n \to \infty} R_n(f;x_0)(x) = 0</math>
where <math>R_n(f;x_0)(x) = f(x) - P_n(f;x_0)(x)</math>.
'''Proof of what we want to show''': By Fact (1), we have that:
<math>|R_n(f;x_0)(x)| \le \left(\max_{t \in J} |f^{(n+1)}(t)|\right) \frac{|x - x_0|^{n+1}}{(n + 1)!}</math>
where <math>J</math> is the interval joining <math>x_0</math> to <math>x</math>. Let <math>a = \min \{ x,x_0 \}</math> and <math>b = \max \{ x, x_0 \}</math>. The interval <math>J</math> is the interval <math>[a,b]</math>.
Now, from the given data, there exists <math>C</math>, dependent on <math>x</math> and <math>x_0</math> but ''not'' on <math>n</math>, such that:
<math>\max_{t \in J} |f^{(n+1)}(t)| \le C \ \forall \ n</math>
Plugging this in, we get that:
<math>|R_n(f;x_0)(x)| \le C \frac{|x - x_0|^{n+1}}{(n + 1)!}</math>
Now taking the limit as <math>n \to \infty</math>, we get:
<math>\lim_{n \to \infty} |R_n(f;x_0)(x)| \le C \lim_{n \to \infty} \frac{|x - x_0|^{n+1}}{(n + 1)!}</math>
Since factorials grow faster than exponentials, <math>\frac{|x - x_0|^{n+1}}{(n+1)!} \to 0</math> as <math>n \to \infty</math>, so the right side is zero. Since <math>|R_n(f;x_0)(x)| \ge 0</math>, the squeeze theorem forces <math>\lim_{n \to \infty} |R_n(f;x_0)(x)| = 0</math>, and we are done.
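The key limit is easy to see via the ratio test: consecutive terms of <math>\frac{|x - x_0|^{n+1}}{(n+1)!}</math> differ by a factor of <math>\frac{|x - x_0|}{n+2}</math>, which eventually drops below 1 no matter how large <math>|x - x_0|</math> is. A small numerical sketch, using the hypothetical value <math>|x - x_0| = 10</math>:

```python
import math

d = 10.0  # stands for |x - x0|; deliberately chosen large
# Terms of the remainder bound d^(n+1)/(n+1)! at n = 0, 10, 20, ..., 50.
for n in range(0, 60, 10):
    term = d ** (n + 1) / math.factorial(n + 1)
    print(f"n={n}: {term:.3e}")
# The terms grow while n + 2 < d, but each step multiplies the previous
# term by d / (n + 2), so they eventually collapse toward zero.
```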

Latest revision as of 20:27, 12 July 2012
