Chain rule for differentiation

This article is about a differentiation rule, i.e., a rule for differentiating a function expressed in terms of other functions whose derivatives are known.

Statement for two functions

The chain rule is stated in many versions:

Version type | Statement
specific point, named functions | Suppose $f$ and $g$ are functions such that $g$ is differentiable at a point $x_0$ and $f$ is differentiable at $g(x_0)$. Then the composite $f \circ g$ is differentiable at $x_0$, and we have: $(f \circ g)'(x_0) = f'(g(x_0)) g'(x_0)$.
generic point, named functions, point notation | Suppose $f$ and $g$ are functions of one variable. Then, we have $\frac{d}{dx}[f(g(x))] = f'(g(x)) g'(x)$ wherever the right side expression makes sense.
generic point, named functions, point-free notation | Suppose $f$ and $g$ are functions of one variable. Then, $(f \circ g)' = (f' \circ g) \cdot g'$ wherever the right side expression makes sense; here $\cdot$ denotes the pointwise product of functions.
pure Leibniz notation | Suppose $y$ is a function of $u$, and $u$ is in turn a function of $x$ (so that $y$ is also a function of $x$). Then, $\frac{dy}{dx} = \frac{dy}{du} \cdot \frac{du}{dx}$.
MORE ON THE WAY THIS DEFINITION OR FACT IS PRESENTED: We first present the version that deals with a specific point (typically with a subscript) in the domain of the relevant functions, and then discuss the version that deals with a point that is free to move in the domain, by dropping the subscript. Why do we do this?
The purpose of the specific point version is to emphasize that the point is fixed for the duration of the definition, i.e., it does not move around while we are defining the construct or applying the fact. However, the definition or fact applies not just for a single point but for all points satisfying certain criteria, and thus we can get further interesting perspectives on it by varying the point we are considering. This is the purpose of the second, generic point version.
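
For illustration, here is a small worked instance of the specific point version, with illustrative choices of functions and point: take $f(u) = u^3$, $g(x) = 2x + 1$, and $x_0 = 1$. Then

$g(1) = 3, \quad g'(1) = 2, \quad f'(3) = 3 \cdot 3^2 = 27,$

so the chain rule gives $(f \circ g)'(1) = f'(g(1)) g'(1) = 27 \cdot 2 = 54$. As a check, expanding first gives $f(g(x)) = (2x + 1)^3$, whose derivative $6(2x + 1)^2$ also equals $54$ at $x = 1$.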

One-sided version

A one-sided version of sorts holds, but we need to be careful, since we want the direction of differentiability of $f$ at $g(x_0)$ to match the direction from which $g(x)$ approaches $g(x_0)$ as $x$ approaches $x_0$ from the relevant side. The following are true (an illustration follows the table):

Condition on $g$ at $x_0$ | Condition on $f$ at $g(x_0)$ | Conclusion
left differentiable at $x_0$ | differentiable at $g(x_0)$ | The left hand derivative of $f \circ g$ at $x_0$ is $f'(g(x_0))$ times the left hand derivative of $g$ at $x_0$.
right differentiable at $x_0$ | differentiable at $g(x_0)$ | The right hand derivative of $f \circ g$ at $x_0$ is $f'(g(x_0))$ times the right hand derivative of $g$ at $x_0$.
left differentiable at $x_0$, and increasing for $x$ on the immediate left of $x_0$ | left differentiable at $g(x_0)$ | The left hand derivative of $f \circ g$ at $x_0$ is the left hand derivative of $f$ at $g(x_0)$ times the left hand derivative of $g$ at $x_0$.
right differentiable at $x_0$, and increasing for $x$ on the immediate right of $x_0$ | right differentiable at $g(x_0)$ | The right hand derivative of $f \circ g$ at $x_0$ is the right hand derivative of $f$ at $g(x_0)$ times the right hand derivative of $g$ at $x_0$.
left differentiable at $x_0$, and decreasing for $x$ on the immediate left of $x_0$ | right differentiable at $g(x_0)$ | The left hand derivative of $f \circ g$ at $x_0$ is the right hand derivative of $f$ at $g(x_0)$ times the left hand derivative of $g$ at $x_0$.
right differentiable at $x_0$, and decreasing for $x$ on the immediate right of $x_0$ | left differentiable at $g(x_0)$ | The right hand derivative of $f \circ g$ at $x_0$ is the left hand derivative of $f$ at $g(x_0)$ times the right hand derivative of $g$ at $x_0$.
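
As an illustration of the first two rows, with an example of our own choosing: take $g(x) = |x|$ and $f(u) = e^u$, so that $f(g(x)) = e^{|x|}$, and consider $x_0 = 0$. Here $g$ has left hand derivative $-1$ and right hand derivative $1$ at $0$, while $f$ is differentiable at $g(0) = 0$ with $f'(0) = 1$. The table gives:

left hand derivative of $e^{|x|}$ at $0$: $f'(0) \cdot (-1) = -1$
right hand derivative of $e^{|x|}$ at $0$: $f'(0) \cdot 1 = 1$

so $e^{|x|}$ has a corner at $0$, as expected.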

Statement for multiple functions

Suppose $f_1, f_2, \dots, f_n$ are functions. Then, the following is true wherever the right side makes sense:

$(f_1 \circ f_2 \circ \dots \circ f_n)' = (f_1' \circ f_2 \circ f_3 \circ \dots \circ f_n) \cdot (f_2' \circ f_3 \circ \dots \circ f_n) \cdot \dots \cdot (f_{n-1}' \circ f_n) \cdot f_n'$

For instance, in the case $n = 3$, we get:

$(f_1 \circ f_2 \circ f_3)' = (f_1' \circ f_2 \circ f_3) \cdot (f_2' \circ f_3) \cdot f_3'$

In point notation, this is:

$\frac{d}{dx}[f_1(f_2(f_3(x)))] = f_1'(f_2(f_3(x))) \cdot f_2'(f_3(x)) \cdot f_3'(x)$
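
As a concrete instance, with functions of our own choosing: take $f_1(u) = \sin u$, $f_2(v) = e^v$, and $f_3(x) = x^2$, so the composite is $\sin\left(e^{x^2}\right)$. The rule gives

$\frac{d}{dx}\left[\sin\left(e^{x^2}\right)\right] = \cos\left(e^{x^2}\right) \cdot e^{x^2} \cdot 2x.$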

Related rules

Similar facts in single variable calculus

Similar facts in multivariable calculus

Reversal for integration

If a function is differentiated using the chain rule, then retrieving the original function from the derivative typically requires a method of integration called integration by substitution. Specifically, that method of integration targets expressions of the form:

$\int f'(g(x)) g'(x) \, dx$

The $u$-substitution idea is to set $u = g(x)$, so that $du = g'(x) \, dx$, and obtain:

$\int f'(u) \, du$

We now need to find a function whose derivative is $f'$; the function $f$ itself works, so the integral is $f(u) + C$. Plugging back $u = g(x)$, we obtain that the indefinite integral is $f(g(x)) + C$.
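
For example, consider $\int 2x \cos(x^2) \, dx$, where $g(x) = x^2$ and $f'(u) = \cos u$. Setting $u = x^2$ gives $du = 2x \, dx$, so the integral becomes $\int \cos u \, du = \sin u + C$, and plugging back $u = x^2$ gives

$\int 2x \cos(x^2) \, dx = \sin(x^2) + C.$

Differentiating $\sin(x^2)$ by the chain rule recovers $2x \cos(x^2)$, confirming the reversal.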

Significance

Why more naive chain rules don't make sense

There are two naive versions of the chain rule one might come up with, neither of which holds:

$\frac{d}{dx}[f(g(x))] = f'(x) g'(x)$

and

$\frac{d}{dx}[f(g(x))] = f'(g'(x))$

Even without doing any mathematics, we can deduce that neither of these rules can be correct. How? Any rule that holds generically must involve evaluating $f$ or $f'$ only at points that we know to be in the domain of $f$. The only such point in this context is $g(x)$. Therefore, the chain rule cannot involve evaluating $f$ or $f'$ at any point other than $g(x)$.

Note that our actual chain rule:

$\frac{d}{dx}[f(g(x))] = f'(g(x)) g'(x)$

is quite similar to the naive but false rule $f'(x) g'(x)$, and can be viewed as the corrected version of that rule once we account for the fact that $f'$ can only be evaluated after transforming the input $x$ to $g(x)$.
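
A concrete counterexample, with functions of our own choosing: take $f(u) = u^2$ and $g(x) = x^3$, so $f(g(x)) = x^6$ and the true derivative is $6x^5$. The first naive rule gives $f'(x) g'(x) = 2x \cdot 3x^2 = 6x^3$, and the second gives $f'(g'(x)) = 2 \cdot 3x^2 = 6x^2$; neither matches. The correct chain rule gives $f'(g(x)) g'(x) = 2x^3 \cdot 3x^2 = 6x^5$.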

Qualitative and existential significance

Each of the versions has its own qualitative significance:

Version type | Significance
specific point, named functions | This tells us that if $g$ is differentiable at a point $x_0$ and $f$ is differentiable at $g(x_0)$, then $f \circ g$ is differentiable at $x_0$.
generic point, named functions, point notation | If $g$ is a differentiable function and $f$ is a differentiable function on the intersection of its domain with the range of $g$, then $f \circ g$ is a differentiable function.
generic point, named functions, point-free notation | We can deduce properties of $(f \circ g)' = (f' \circ g) \cdot g'$ based on properties of $f'$, $g'$, and $g$. In particular, if $f'$ and $g'$ are both continuous functions, so is $(f \circ g)'$. Another way of putting this is that if $f$ and $g$ are both continuously differentiable functions, so is $f \circ g$.

Computational feasibility significance

Each of the versions has its own computational feasibility significance:

Version type | Significance
specific point, named functions | If we know the values (in the sense of numerical values) $g(x_0)$, $g'(x_0)$, and $f'(g(x_0))$, we can use these to compute $(f \circ g)'(x_0) = f'(g(x_0)) g'(x_0)$.
generic point, named functions | This tells us that knowledge of the general expressions for the derivatives of $f$ and $g$ (along with expressions for the functions themselves) allows us to compute the general expression for the derivative of $f \circ g$. Note that we do not need to know $f$ itself (it suffices to know $f'$, which determines $f$ only up to additive constants), but we do need to know what $g$ is; it does not suffice to know $g$ merely up to additive constants (see the worked instance below).
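
For instance, with an illustration of our own choosing: suppose we only know that $f'(u) = 2u$ (so $f(u) = u^2 + C$ for some unknown constant $C$) and that $g(x) = \sin x$. The chain rule still yields $(f \circ g)'(x) = f'(\sin x) \cdot \cos x = 2 \sin x \cos x$, with no dependence on $C$. By contrast, if we knew $g$ only up to an additive constant, say $g(x) = \sin x + c$, the answer $2(\sin x + c)\cos x$ would genuinely depend on $c$.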

Computational results significance

Shorthand | Significance
significance of derivative being zero | If $g'(x_0) = 0$ and $f$ is differentiable at $g(x_0)$, then $(f \circ g)'(x_0) = 0$. Note that the conclusion need not follow if $f$ is not differentiable at $g(x_0)$. Also, if $f'(g(x_0)) = 0$ and $g$ is differentiable at $x_0$, then $(f \circ g)'(x_0) = 0$. Note that it is essential in both cases that the other function be differentiable at the appropriate point; counterexamples exist when it is not.
significance of sign of derivative | The product of the signs of $g'(x_0)$ and $f'(g(x_0))$ gives the sign of $(f \circ g)'(x_0)$. In particular, if both have the same sign, then $(f \circ g)'(x_0)$ is positive, and if they have opposite signs, then $(f \circ g)'(x_0)$ is negative. This is related to the idea that a composite of increasing functions is increasing, and similar ideas.
significance of uniform bounds on derivatives | If $f'$ and $g'$ are uniformly bounded, then so is $(f \circ g)'$, with a possible uniform bound being the product of the uniform bounds for $f'$ and $g'$ (see the illustration below).
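
As a small illustration of the last two rows, with an example of our own choosing: for $\frac{d}{dx}[\sin(\sin x)] = \cos(\sin x) \cos x$, both factors are bounded by $1$ in absolute value, so the derivative of the composite is uniformly bounded by $1 \cdot 1 = 1$; and on any interval where $\cos(\sin x)$ and $\cos x$ are both positive, the composite $\sin(\sin x)$ is increasing.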

Compatibility checks

Associative symmetry

This is a compatibility check showing that for a composite of three functions $f \circ g \circ h$, the formula for the derivative obtained using the chain rule is the same whether we associate the composite as $f \circ (g \circ h)$ or as $(f \circ g) \circ h$.

  • Derivative as $(f \circ (g \circ h))'$. We first apply the chain rule for the pair of functions $f, g \circ h$ and then for the pair of functions $g, h$:

In point-free notation:

$(f \circ (g \circ h))' = (f' \circ (g \circ h)) \cdot (g \circ h)' = (f' \circ g \circ h) \cdot (g' \circ h) \cdot h'$

In point notation (i.e., including a symbol for the point where the function is applied):

$\frac{d}{dx}[f(g(h(x)))] = f'(g(h(x))) \cdot \frac{d}{dx}[g(h(x))] = f'(g(h(x))) \cdot g'(h(x)) \cdot h'(x)$

  • Derivative as $((f \circ g) \circ h)'$. We first apply the chain rule for the pair of functions $f \circ g, h$ and then for the pair of functions $f, g$:

In point-free notation:

$((f \circ g) \circ h)' = ((f \circ g)' \circ h) \cdot h' = (((f' \circ g) \cdot g') \circ h) \cdot h' = (f' \circ g \circ h) \cdot (g' \circ h) \cdot h'$

In point notation (i.e., including a symbol for the point where the function is applied):

$\frac{d}{dx}[f(g(h(x)))] = (f \circ g)'(h(x)) \cdot h'(x) = f'(g(h(x))) \cdot g'(h(x)) \cdot h'(x)$

Both associations yield the same expression, as desired.

Compatibility with linearity

Consider functions $f$, $g$, and $h$. We have that:

$(f + g) \circ h = (f \circ h) + (g \circ h)$

The function can be differentiated either by differentiating the left side or by differentiating the right side. The compatibility check is to ensure that we get the same result from both methods:

  • Left side: In point-free notation:

$((f + g) \circ h)' = ((f + g)' \circ h) \cdot h' = ((f' + g') \circ h) \cdot h' = ((f' \circ h) + (g' \circ h)) \cdot h'$

In point notation (i.e., including a symbol for the point of application):

$\frac{d}{dx}[(f + g)(h(x))] = (f + g)'(h(x)) \cdot h'(x) = (f'(h(x)) + g'(h(x))) \cdot h'(x)$

  • Right side: In point-free notation:

$((f \circ h) + (g \circ h))' = (f \circ h)' + (g \circ h)' = (f' \circ h) \cdot h' + (g' \circ h) \cdot h'$

We get $((f' \circ h) + (g' \circ h)) \cdot h'$ after factoring out $h'$.

In point notation:

$f'(h(x)) \cdot h'(x) + g'(h(x)) \cdot h'(x) = (f'(h(x)) + g'(h(x))) \cdot h'(x)$

Thus, we get the same result on both sides, indicating compatibility.

Note that it is not in general true that $h \circ (f + g) = (h \circ f) + (h \circ g)$, so there is no compatibility check to be made there.

Compatibility with product rule

Consider functions $f$, $g$, and $h$. We have that:

$(f \cdot g) \circ h = (f \circ h) \cdot (g \circ h)$

The function can be differentiated either by differentiating the left side or by differentiating the right side. The two processes use the product rule for differentiation in different ways. The compatibility check is to ensure that we get the same result from both methods:

  • Left side: In point-free notation:

$((f \cdot g) \circ h)' = ((f \cdot g)' \circ h) \cdot h' = ((f' \cdot g + f \cdot g') \circ h) \cdot h' = ((f' \circ h) \cdot (g \circ h) + (f \circ h) \cdot (g' \circ h)) \cdot h'$

In point notation:

$\frac{d}{dx}[(f \cdot g)(h(x))] = (f'(h(x)) \cdot g(h(x)) + f(h(x)) \cdot g'(h(x))) \cdot h'(x)$

  • Right side: In point-free notation:

$((f \circ h) \cdot (g \circ h))' = (f \circ h)' \cdot (g \circ h) + (f \circ h) \cdot (g \circ h)' = (f' \circ h) \cdot h' \cdot (g \circ h) + (f \circ h) \cdot (g' \circ h) \cdot h'$

In point notation:

$f'(h(x)) \cdot h'(x) \cdot g(h(x)) + f(h(x)) \cdot g'(h(x)) \cdot h'(x)$

which matches the left side computation after factoring out $h'(x)$.

Note that it is not in general true that $h \circ (f \cdot g) = (h \circ f) \cdot (h \circ g)$, so no compatibility check needs to be made there.

Compatibility with notions of order

This section explains why the chain rule is compatible with notions of order that satisfy the following: the order of a composite is the product of the orders, the order of a pointwise product is the sum of the orders, and differentiation pushes the order down by one.

Suppose $f$ has order $m$ and $g$ has order $n$. Then we have the following:

  • $(f \circ g)'$ has order $mn - 1$: First, note that $f \circ g$ has order $mn$ by the product relation for order (the order of a composite is the product of the orders). Next, note that differentiating pushes the order down by one.
  • $(f' \circ g) \cdot g'$ has order $mn - 1$: Note that $f' \circ g$ has order $(m - 1)n$ and $g'$ has order $n - 1$. Adding, we get $(m - 1)n + n - 1 = mn - 1$.
Note that this compatibility check fails on both the false chain rules discussed in the significance section.

Some examples of the notion of order which illustrate this are:

  • For nonzero polynomials, the order notion above can be taken as the degree of the polynomial (see the worked instance below).
  • For functions that are zero at a particular point, the order notion above can be taken as the order of the zero at the point. Note that in this case, the order of the zero for $f$ will be calculated at $0 = g(x_0)$ rather than at the original point $x_0$ at which $g$ is evaluated.
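
As a worked instance using degrees, with polynomials of our own choosing: take $f(u) = u^3$ (degree $m = 3$) and $g(x) = x^2 + 1$ (degree $n = 2$). Then $f(g(x)) = (x^2 + 1)^3$ has degree $6 = mn$, so its derivative has degree $5 = mn - 1$. On the chain rule side, $f'(g(x)) = 3(x^2 + 1)^2$ has degree $(m - 1)n = 4$ and $g'(x) = 2x$ has degree $n - 1 = 1$, and their product has degree $4 + 1 = 5$, as expected.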

Examples

Sanity checks

We first consider examples where the chain rule for differentiation confirms something we already knew by other means:

Case on $f$ | Case on $g$ | $(f \circ g)'$ | Direct justification, without using the chain rule | Justification using the chain rule, i.e., by computing $(f' \circ g) \cdot g'$
a constant function | any differentiable function | the zero function | $f \circ g$ is a constant function, so its derivative is the zero function. | By the chain rule, $(f \circ g)' = (f' \circ g) \cdot g'$. $f$ being constant forces $f'$ to be zero everywhere, hence the product is also zero everywhere. Thus, $(f \circ g)'$ is also zero everywhere.
any differentiable function | a constant function with value $c$ | the zero function | $f \circ g$ is a constant function with value $f(c)$, so its derivative is the zero function. | By the chain rule, $(f \circ g)' = (f' \circ g) \cdot g'$. $g$ being constant forces $g' = 0$ everywhere, hence the product is also zero everywhere. Thus, $(f \circ g)'$ is also zero everywhere.
the identity function, i.e., the function $x \mapsto x$ | any differentiable function | $g'$ | $f \circ g = g$, so $(f \circ g)' = g'$. | $(f \circ g)' = (f' \circ g) \cdot g'$. Since $f$ is the function $x \mapsto x$, its derivative is the constant function $1$. Plugging this in, we get that $f' \circ g$ is also the constant function $1$, so $(f \circ g)' = g'$.
any differentiable function | the identity function | $f'$ | $f \circ g = f$, so $(f \circ g)' = f'$. | $(f \circ g)' = (f' \circ g) \cdot g'$. Since $g$ is the identity function, $f' \circ g$ is the function $f'$. Also, $g' = 1$. Thus, $(f \circ g)' = f' \cdot 1 = f'$.
the square function $x \mapsto x^2$ | any differentiable function | $2g \cdot g'$ | $f \circ g = g \cdot g$ and hence its derivative can be computed using the product rule for differentiation. It comes out as $g' \cdot g + g \cdot g' = 2g \cdot g'$. | $(f \circ g)' = (f' \circ g) \cdot g'$. $f'$ is the derivative of the square function, and therefore is $x \mapsto 2x$. Thus, $f' \circ g = 2g$. We thus get $(f \circ g)' = 2g \cdot g'$.
a one-one differentiable function | the inverse function of $f$ | the constant function $1$ | $f(g(x)) = x$ for all $x$, so the derivative is the function $1$. | $(f \circ g)' = (f' \circ g) \cdot g'$. By the inverse function theorem, we know that $g'(x) = 1/f'(g(x))$, so plugging in, we get $(f \circ g)'(x) = f'(g(x)) \cdot \frac{1}{f'(g(x))} = 1$ (a concrete instance follows the table).
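
To make the last row concrete, with a pair of our own choosing: take $f(x) = e^x$ and $g(x) = \ln x$ for $x > 0$, so $f(g(x)) = e^{\ln x} = x$. The chain rule gives $\frac{d}{dx}\left[e^{\ln x}\right] = e^{\ln x} \cdot \frac{1}{x} = x \cdot \frac{1}{x} = 1$, matching the direct computation $\frac{d}{dx}[x] = 1$.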

Nontrivial examples

The chain rule is necessary for computing the derivatives of functions whose definition requires one to compose functions. Strictly speaking, the chain rule isn't the only option: one can always compute the derivative as a limit of a difference quotient. But it is the only option if one restricts oneself to operating within the family of differentiation rules.

Some examples of functions for which the chain rule needs to be used include:

  • A trigonometric function applied to a nonlinear algebraic function
  • An exponential function applied to a nonlinear algebraic function
  • A composite of two trigonometric functions, two exponential functions, or an exponential and a trigonometric function

A few examples are below.

Sine of square function

Consider the sine of square function:

$x \mapsto \sin(x^2)$

We use the chain rule for differentiation, viewing the function as the composite of the square function on the inside and the sine function on the outside:

$\frac{d}{dx}\left[\sin(x^2)\right] = \cos(x^2) \cdot \frac{d}{dx}\left[x^2\right] = 2x \cos(x^2)$

Sine of sine function

Consider the sine of sine function:

$x \mapsto \sin(\sin x)$

The derivative is:

$\frac{d}{dx}\left[\sin(\sin x)\right] = \cos(\sin x) \cdot \cos x$