Chain rule for differentiation
(Revision as of 15:57, 12 April 2015)
This article is about a differentiation rule, i.e., a rule for differentiating a function expressed in terms of other functions whose derivatives are known.
Statement for two functions
The chain rule is stated in many versions:
Version type | Statement |
---|---|
specific point, named functions | Suppose <math>g</math> and <math>f</math> are functions such that <math>g</math> is differentiable at a point <math>x_0</math>, and <math>f</math> is differentiable at <math>g(x_0)</math>. Then the composite <math>f \circ g</math> is differentiable at <math>x_0</math>, and we have: <math>(f \circ g)'(x_0) = f'(g(x_0))g'(x_0)</math> |
generic point, named functions, point notation | Suppose <math>f</math> and <math>g</math> are functions of one variable. Then, we have <math>(f \circ g)'(x) = f'(g(x))g'(x)</math> wherever the right side expression makes sense. |
generic point, named functions, point-free notation | Suppose <math>f</math> and <math>g</math> are functions of one variable. Then, <math>(f \circ g)' = (f' \circ g) \cdot g'</math> where the right side expression makes sense, where <math>\cdot</math> denotes the pointwise product of functions. |
pure Leibniz notation | Suppose <math>y</math> is a function of <math>u</math> and <math>u</math> is a function of <math>x</math>. Then, <math>\frac{dy}{dx} = \frac{dy}{du} \cdot \frac{du}{dx}</math> |
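The generic-point version lends itself to a quick numerical sanity check. The sketch below (plain Python; the helper `num_deriv` is an illustrative finite-difference approximation, not part of the statement) compares a direct numerical derivative of a composite against the chain rule formula, taking <math>f = \sin</math> and <math>g</math> the squaring function as example choices:

```python
import math

def num_deriv(func, x, h=1e-6):
    """Central-difference approximation to func'(x)."""
    return (func(x + h) - func(x - h)) / (2 * h)

# f is the outer function, g the inner one, as in (f o g)' = (f' o g) . g'
f, f_prime = math.sin, math.cos
g, g_prime = (lambda x: x ** 2), (lambda x: 2 * x)

x = 1.3
lhs = num_deriv(lambda t: f(g(t)), x)   # direct derivative of the composite
rhs = f_prime(g(x)) * g_prime(x)        # chain rule formula
assert math.isclose(lhs, rhs, rel_tol=1e-5)
```

The tolerance absorbs the discretization error of the finite-difference step; the agreement is what the generic-point statement predicts.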
MORE ON THE WAY THIS DEFINITION OR FACT IS PRESENTED: We first present the version that deals with a specific point (typically with a subscript) in the domain of the relevant functions, and then discuss the version that deals with a point that is free to move in the domain, by dropping the subscript. Why do we do this?
The purpose of the specific point version is to emphasize that the point is fixed for the duration of the definition, i.e., it does not move around while we are defining the construct or applying the fact. However, the definition or fact applies not just for a single point but for all points satisfying certain criteria, and thus we can get further interesting perspectives on it by varying the point we are considering. This is the purpose of the second, generic point version.
One-sided version
A one-sided version of sorts holds, but we need to be careful, since we want the direction of differentiability of <math>f</math> to be the same as the direction of approach of <math>g(x)</math> to <math>g(x_0)</math>. The following are true:
Condition on <math>g</math> at <math>x_0</math> | Condition on <math>f</math> at <math>g(x_0)</math> | Conclusion |
---|---|---|
left differentiable at <math>x_0</math> | differentiable at <math>g(x_0)</math> | The left hand derivative of <math>f \circ g</math> at <math>x_0</math> is <math>f'(g(x_0))</math> times the left hand derivative of <math>g</math> at <math>x_0</math>. |
right differentiable at <math>x_0</math> | differentiable at <math>g(x_0)</math> | The right hand derivative of <math>f \circ g</math> at <math>x_0</math> is <math>f'(g(x_0))</math> times the right hand derivative of <math>g</math> at <math>x_0</math>. |
left differentiable at <math>x_0</math>, and <math>g</math> increasing for <math>x</math> on the immediate left of <math>x_0</math> | left differentiable at <math>g(x_0)</math> | the left hand derivative of <math>f \circ g</math> at <math>x_0</math> is the left hand derivative of <math>f</math> at <math>g(x_0)</math> times the left hand derivative of <math>g</math> at <math>x_0</math>. |
right differentiable at <math>x_0</math>, and <math>g</math> increasing for <math>x</math> on the immediate right of <math>x_0</math> | right differentiable at <math>g(x_0)</math> | the right hand derivative of <math>f \circ g</math> at <math>x_0</math> is the right hand derivative of <math>f</math> at <math>g(x_0)</math> times the right hand derivative of <math>g</math> at <math>x_0</math>. |
left differentiable at <math>x_0</math>, and <math>g</math> decreasing for <math>x</math> on the immediate left of <math>x_0</math> | right differentiable at <math>g(x_0)</math> | the left hand derivative of <math>f \circ g</math> at <math>x_0</math> is the right hand derivative of <math>f</math> at <math>g(x_0)</math> times the left hand derivative of <math>g</math> at <math>x_0</math>. |
right differentiable at <math>x_0</math>, and <math>g</math> decreasing for <math>x</math> on the immediate right of <math>x_0</math> | left differentiable at <math>g(x_0)</math> | the right hand derivative of <math>f \circ g</math> at <math>x_0</math> is the left hand derivative of <math>f</math> at <math>g(x_0)</math> times the right hand derivative of <math>g</math> at <math>x_0</math>. |
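The direction-matching can be illustrated numerically. In the sketch below (illustrative Python, with a hypothetical one-sided difference-quotient helper), <math>f</math> is the absolute value function, which is only one-sided differentiable at 0, and <math>g</math> is a decreasing function, so the left-hand derivative of the composite uses the right-hand derivative of <math>f</math>:

```python
def one_sided(func, x, side, h=1e-7):
    """One-sided difference quotient: side=+1 for right, side=-1 for left."""
    return (func(x + side * h) - func(x)) / (side * h)

f = abs                 # not differentiable at 0; LHD = -1, RHD = +1
g = lambda x: -x        # decreasing, g(0) = 0, one-sided derivatives both -1

# g decreasing on the immediate left of 0 means g(x) approaches g(0) from the
# right, so the rule uses the RIGHT-hand derivative of f at g(0) times the
# left-hand derivative of g at 0.
lhd_composite = one_sided(lambda t: f(g(t)), 0.0, -1)
rhd_f = one_sided(f, 0.0, +1)
lhd_g = one_sided(g, 0.0, -1)
assert abs(lhd_composite - rhd_f * lhd_g) < 1e-5
```

Here the composite is <math>|{-x}| = |x|</math>, whose left-hand derivative at 0 is <math>-1</math>, matching <math>(+1) \cdot (-1)</math> as the table predicts.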
Statement for multiple functions
Suppose <math>f_1, f_2, \dots, f_n</math> are functions. Then, the following is true wherever the right side makes sense:
<math>(f_1 \circ f_2 \circ \dots \circ f_n)' = (f_1' \circ f_2 \circ f_3 \circ \dots \circ f_n) \cdot (f_2' \circ f_3 \circ \dots \circ f_n) \cdot \dots \cdot (f_{n-1}' \circ f_n) \cdot f_n'</math>
For instance, in the case <math>n = 3</math>, we get:
<math>(f_1 \circ f_2 \circ f_3)' = (f_1' \circ f_2 \circ f_3) \cdot (f_2' \circ f_3) \cdot f_3'</math>
In point notation, this is:
<math>(f_1 \circ f_2 \circ f_3)'(x) = f_1'(f_2(f_3(x)))f_2'(f_3(x))f_3'(x)</math>
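The three-function case can be checked numerically. In this sketch (plain Python; `num_deriv` is an illustrative finite-difference helper, and the choices of <math>f_1, f_2, f_3</math> are arbitrary examples), the direct derivative of the triple composite is compared against the three-factor product:

```python
import math

def num_deriv(func, x, h=1e-6):
    """Central-difference approximation to func'(x)."""
    return (func(x + h) - func(x - h)) / (2 * h)

# f1 o f2 o f3 with f1 = sin, f2 = exp, f3 = cube (illustrative choices)
f1, f1p = math.sin, math.cos
f2, f2p = math.exp, math.exp
f3, f3p = (lambda x: x ** 3), (lambda x: 3 * x ** 2)

x = 0.7
lhs = num_deriv(lambda t: f1(f2(f3(t))), x)
rhs = f1p(f2(f3(x))) * f2p(f3(x)) * f3p(x)   # three-factor chain rule product
assert math.isclose(lhs, rhs, rel_tol=1e-4)
```

Each factor evaluates a derivative at the output of the functions further inside, exactly as in the point notation above.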
Related rules
Similar facts in single variable calculus
- Chain rule for higher derivatives
- Product rule for differentiation
- Product rule for higher derivatives
- Differentiation is linear
- Inverse function theorem (gives formula for derivative of inverse function).
- Chain rule for differentiation of formal power series
Similar facts in multivariable calculus
Reversal for integration
If a function is differentiated using the chain rule, then retrieving the original function from the derivative typically requires a method of integration called integration by substitution. Specifically, that method of integration targets expressions of the form:
<math>\int f(g(x))g'(x) \, dx</math>
The <math>u</math>-substitution idea is to set <math>u = g(x)</math> (so that <math>du = g'(x) \, dx</math>) and obtain:
<math>\int f(u) \, du</math>
We now need to find a function <math>F</math> such that <math>F' = f</math>. The integral is <math>F(u) + C</math>. Plugging back <math>u = g(x)</math>, we obtain that the indefinite integral is <math>F(g(x)) + C</math>.
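The substitution can be verified numerically on a definite integral. In this sketch (plain Python; `integrate` is an illustrative trapezoidal-rule stand-in, and the choices <math>f = \cos</math>, <math>g(x) = x^2</math>, <math>F = \sin</math> are example assumptions), the integral of <math>f(g(x))g'(x)</math> over <math>[0,1]</math> is compared against <math>F(g(1)) - F(g(0))</math>:

```python
import math

def integrate(func, a, b, n=10000):
    """Composite trapezoidal rule, a simple numerical stand-in for the integral."""
    h = (b - a) / n
    total = 0.5 * (func(a) + func(b)) + sum(func(a + i * h) for i in range(1, n))
    return total * h

# Integrand of the form f(g(x)) g'(x), with f = cos, g = squaring, F = sin (F' = f)
f = math.cos
g, g_prime = (lambda x: x ** 2), (lambda x: 2 * x)
F = math.sin

value = integrate(lambda x: f(g(x)) * g_prime(x), 0.0, 1.0)
# Substitution predicts F(g(x)) + C as the antiderivative, so the definite
# integral over [0, 1] should equal F(g(1)) - F(g(0)) = sin(1) - sin(0).
assert math.isclose(value, F(g(1.0)) - F(g(0.0)), rel_tol=1e-6)
```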
Significance
Qualitative and existential significance
Each of the versions has its own qualitative significance:
Version type | Significance |
---|---|
specific point, named functions | This tells us that if <math>g</math> is differentiable at a point <math>x_0</math> and <math>f</math> is differentiable at <math>g(x_0)</math>, then <math>f \circ g</math> is differentiable at <math>x_0</math>. |
generic point, named functions, point notation | If <math>g</math> is a differentiable function and <math>f</math> is a differentiable function on the intersection of its domain with the range of <math>g</math>, then <math>f \circ g</math> is a differentiable function. |
generic point, named functions, point-free notation | We can deduce properties of <math>(f \circ g)'</math> based on properties of <math>f'</math>, <math>g</math>, and <math>g'</math>. In particular, if <math>f'</math> and <math>g'</math> are both continuous functions, so is <math>(f \circ g)'</math>. Another way of putting this is that if <math>f</math> and <math>g</math> are both continuously differentiable functions, so is <math>f \circ g</math>. |
Computational feasibility significance
Each of the versions has its own computational feasibility significance:
Version type | Significance |
---|---|
specific point, named functions | If we know the values (in the sense of numerical values) <math>g(x_0)</math>, <math>g'(x_0)</math> and <math>f'(g(x_0))</math>, we can use these to compute <math>(f \circ g)'(x_0)</math>. |
generic point, named functions | This tells us that knowledge of the general expressions for the derivatives of <math>f</math> and <math>g</math> (along with expressions for the functions themselves) allows us to compute the general expression for the derivative of <math>f \circ g</math>. |
Computational results significance
Shorthand | Significance |
---|---|
significance of derivative being zero | If <math>g'(x_0) = 0</math>, and <math>f</math> is differentiable at <math>g(x_0)</math>, then <math>(f \circ g)'(x_0) = 0</math>. Note that the conclusion need not follow if <math>f</math> is not differentiable at <math>g(x_0)</math>. Also, if <math>f'(g(x_0)) = 0</math> and <math>g</math> is differentiable at <math>x_0</math>, then <math>(f \circ g)'(x_0) = 0</math>. Note that it is essential in both cases that the other function be differentiable at the appropriate point. |
significance of sign of derivative | The product of the signs of <math>f'(g(x_0))</math> and <math>g'(x_0)</math> gives the sign of <math>(f \circ g)'(x_0)</math>. In particular, if both have the same sign, then <math>(f \circ g)'(x_0)</math> is positive. If both have opposite signs, then <math>(f \circ g)'(x_0)</math> is negative. This is related to the idea that a composite of increasing functions is increasing, and similar ideas. |
significance of uniform bounds on derivatives | If <math>f'</math> and <math>g'</math> are uniformly bounded, then so is <math>(f \circ g)'</math>, with a possible uniform bound being the product of the uniform bounds for <math>f'</math> and <math>g'</math>. |
Compatibility checks
Associative symmetry
This is a compatibility check for showing that for a composite of three functions <math>f \circ g \circ h</math>, the formula for the derivative obtained using the chain rule is the same whether we associate it as <math>(f \circ g) \circ h</math> or as <math>f \circ (g \circ h)</math>.
- Derivative as <math>((f \circ g) \circ h)'</math>. We first apply the chain rule for the pair of functions <math>f \circ g, h</math> and then for the pair of functions <math>f, g</math>:
In point-free notation:
<math>((f \circ g) \circ h)' = ((f \circ g)' \circ h) \cdot h' = (((f' \circ g) \cdot g') \circ h) \cdot h' = (f' \circ g \circ h) \cdot (g' \circ h) \cdot h'</math>
In point notation (i.e., including a symbol for the point where the function is applied):
<math>((f \circ g) \circ h)'(x) = (f \circ g)'(h(x))h'(x) = f'(g(h(x)))g'(h(x))h'(x)</math>
- Derivative as <math>(f \circ (g \circ h))'</math>. We first apply the chain rule for the pair of functions <math>f, g \circ h</math> and then for the pair of functions <math>g, h</math>:
In point-free notation:
<math>(f \circ (g \circ h))' = (f' \circ (g \circ h)) \cdot (g \circ h)' = (f' \circ g \circ h) \cdot ((g' \circ h) \cdot h')</math>
In point notation (i.e., including a symbol for the point where the function is applied):
<math>(f \circ (g \circ h))'(x) = f'(g(h(x)))(g \circ h)'(x) = f'(g(h(x)))g'(h(x))h'(x)</math>
Thus, both ways of associating give the same derivative, indicating compatibility.
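The two associations can be computed side by side. In this sketch (plain Python, with <math>f = \sin</math>, <math>g = \exp</math>, and <math>h</math> the squaring function as arbitrary example choices), one route differentiates <math>f \circ g</math> as the outer function, the other differentiates <math>g \circ h</math> as the inner function:

```python
import math

f, fp = math.sin, math.cos
g, gp = math.exp, math.exp
h_, hp = (lambda x: x ** 2), (lambda x: 2 * x)

x = 0.4
# Associate as (f o g) o h: outer derivative is (f o g)'(u) = f'(g(u)) g'(u)
fg_prime = lambda u: fp(g(u)) * gp(u)
left = fg_prime(h_(x)) * hp(x)
# Associate as f o (g o h): inner derivative is (g o h)'(x) = g'(h(x)) h'(x)
gh_prime = gp(h_(x)) * hp(x)
right = fp(g(h_(x))) * gh_prime
assert math.isclose(left, right, rel_tol=1e-12)
```

Both routes multiply the same three factors, so the agreement is exact up to floating-point rounding.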
Compatibility with linearity
Consider functions <math>f_1, f_2, g</math>. We have that:
<math>(f_1 + f_2) \circ g = (f_1 \circ g) + (f_2 \circ g)</math>
The function <math>(f_1 + f_2) \circ g</math> can be differentiated either by differentiating the left side or by differentiating the right side. The compatibility check is to ensure that we get the same result from both methods:
- Left side: In point-free notation:
<math>((f_1 + f_2) \circ g)' = ((f_1 + f_2)' \circ g) \cdot g' = ((f_1' + f_2') \circ g) \cdot g' = ((f_1' \circ g) + (f_2' \circ g)) \cdot g' = ((f_1' \circ g) \cdot g') + ((f_2' \circ g) \cdot g')</math>
In point notation (i.e., including a symbol for the point of application):
<math>((f_1 + f_2) \circ g)'(x) = (f_1 + f_2)'(g(x))g'(x) = (f_1'(g(x)) + f_2'(g(x)))g'(x) = f_1'(g(x))g'(x) + f_2'(g(x))g'(x)</math>
- Right side: In point-free notation:
<math>(f_1 \circ g + f_2 \circ g)' = (f_1 \circ g)' + (f_2 \circ g)' = ((f_1' \circ g) \cdot g') + ((f_2' \circ g) \cdot g')</math>
In point notation:
<math>(f_1 \circ g + f_2 \circ g)'(x) = (f_1 \circ g)'(x) + (f_2 \circ g)'(x) = f_1'(g(x))g'(x) + f_2'(g(x))g'(x)</math>
Thus, we get the same result on both sides, indicating compatibility.
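The agreement can be spot-checked numerically. In this sketch (plain Python; `num_deriv` is an illustrative finite-difference helper, and <math>f_1 = \sin</math>, <math>f_2 = \cos</math>, <math>g(x) = x^3</math> are example choices), the direct derivative of <math>(f_1 + f_2) \circ g</math> is compared against the point-notation formula <math>(f_1'(g(x)) + f_2'(g(x)))g'(x)</math>:

```python
import math

def num_deriv(func, x, h=1e-6):
    """Central-difference approximation to func'(x)."""
    return (func(x + h) - func(x - h)) / (2 * h)

f1, f2 = math.sin, math.cos        # f1' = cos, f2' = -sin
g, g_prime = (lambda x: x ** 3), (lambda x: 3 * x ** 2)

x = 0.9
lhs = num_deriv(lambda t: f1(g(t)) + f2(g(t)), x)          # ((f1 + f2) o g)'
rhs = (math.cos(g(x)) - math.sin(g(x))) * g_prime(x)       # chain rule + linearity
assert math.isclose(lhs, rhs, rel_tol=1e-5)
```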
Compatibility with product rule
Consider functions <math>f_1, f_2, g</math>. We have that:
<math>(f_1 \cdot f_2) \circ g = (f_1 \circ g) \cdot (f_2 \circ g)</math>
The function <math>(f_1 \cdot f_2) \circ g</math> can be differentiated either by differentiating the left side or by differentiating the right side. The two processes use the product rule for differentiation in different ways. The compatibility check is to ensure that we get the same result from both methods:
- Left side: We get <math>((f_1 \cdot f_2) \circ g)' = ((f_1 \cdot f_2)' \circ g) \cdot g' = ((f_1' \cdot f_2 + f_1 \cdot f_2') \circ g) \cdot g' = ((f_1' \circ g) \cdot (f_2 \circ g) \cdot g') + ((f_1 \circ g) \cdot (f_2' \circ g) \cdot g')</math>.
- Right side: We get <math>((f_1 \circ g) \cdot (f_2 \circ g))' = (f_1 \circ g)' \cdot (f_2 \circ g) + (f_1 \circ g) \cdot (f_2 \circ g)' = ((f_1' \circ g) \cdot g') \cdot (f_2 \circ g) + (f_1 \circ g) \cdot ((f_2' \circ g) \cdot g')</math>.
Since the pointwise product is commutative, the two expressions are equal, indicating compatibility.
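This check, too, can be run numerically. In this sketch (plain Python; `num_deriv` is an illustrative finite-difference helper, with <math>f_1 = \sin</math>, <math>f_2 = \exp</math>, <math>g(x) = x^2</math> as example choices), the direct derivative of <math>(f_1 \cdot f_2) \circ g</math> is compared against the right-side expression above, evaluated at a point:

```python
import math

def num_deriv(func, x, h=1e-6):
    """Central-difference approximation to func'(x)."""
    return (func(x + h) - func(x - h)) / (2 * h)

f1, f1p = math.sin, math.cos
f2, f2p = math.exp, math.exp
g, gp = (lambda x: x ** 2), (lambda x: 2 * x)

x = 0.6
lhs = num_deriv(lambda t: f1(g(t)) * f2(g(t)), x)   # ((f1 . f2) o g)'
# Product rule applied to (f1 o g) . (f2 o g), each factor via the chain rule
rhs = f1p(g(x)) * gp(x) * f2(g(x)) + f1(g(x)) * f2p(g(x)) * gp(x)
assert math.isclose(lhs, rhs, rel_tol=1e-5)
```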
Examples
Sanity checks
We first consider examples where the chain rule for differentiation confirms something we already knew by other means:
Case on <math>f</math> | Case on <math>g</math> | What is <math>(f \circ g)'</math>? | Direct justification, without using the chain rule | Justification using the chain rule, i.e., by computing <math>(f' \circ g) \cdot g'</math> |
---|---|---|---|---|
a constant function | any differentiable function | zero function | <math>f \circ g</math> is a constant function, so its derivative is the zero function. | By the chain rule, <math>(f \circ g)' = (f' \circ g) \cdot g'</math>. <math>f</math> being constant forces <math>f'</math> to be zero everywhere, hence the product <math>(f' \circ g) \cdot g'</math> is also zero everywhere. Thus, <math>(f \circ g)'</math> is also zero everywhere. |
any differentiable function | a constant function with value <math>c</math> | zero function | <math>f \circ g</math> is a constant function with value <math>f(c)</math>, so its derivative is the zero function. | By the chain rule, <math>(f \circ g)' = (f' \circ g) \cdot g'</math>. <math>g</math> being constant forces that <math>g' = 0</math> everywhere, hence the product <math>(f' \circ g) \cdot g'</math> is also zero everywhere. Thus, <math>(f \circ g)'</math> is also zero everywhere. |
the identity function, i.e., the function <math>x \mapsto x</math> | any differentiable function | <math>g'</math> | <math>f \circ g = g</math>, so <math>(f \circ g)' = g'</math>. | <math>(f \circ g)' = (f' \circ g) \cdot g'</math>. Since <math>f</math> is the function <math>x \mapsto x</math>, its derivative <math>f'</math> is the constant function <math>1</math>. Plugging this in, we get that <math>f' \circ g</math> is also the constant function <math>1</math>, so <math>(f \circ g)' = g'</math>. |
any differentiable function | the identity function | <math>f'</math> | <math>f \circ g = f</math>, so <math>(f \circ g)' = f'</math>. | <math>(f \circ g)' = (f' \circ g) \cdot g'</math>. Since <math>g</math> is the identity function, <math>f' \circ g</math> is the function <math>f'</math>. Also, <math>g' = 1</math>. Thus, <math>(f \circ g)' = f'</math>. |
the square function <math>x \mapsto x^2</math> | any differentiable function | <math>2g \cdot g'</math> | <math>f \circ g = g \cdot g</math> and hence its derivative can be computed using the product rule for differentiation. It comes out as <math>g' \cdot g + g \cdot g' = 2g \cdot g'</math>. | <math>(f \circ g)' = (f' \circ g) \cdot g'</math>. <math>f'</math> is the derivative of the square function, and therefore is <math>x \mapsto 2x</math>. Thus, <math>f' \circ g = 2g</math>. We thus get <math>(f \circ g)' = 2g \cdot g'</math>. |
a one-one differentiable function | the inverse function of <math>f</math> | 1 | <math>(f \circ g)(x) = x</math> for all <math>x</math>, so the derivative is the function 1. | <math>(f \circ g)' = (f' \circ g) \cdot g'</math>. By the inverse function theorem, we know that <math>g' = 1/(f' \circ g)</math>, so plugging in, we get <math>(f \circ g)' = 1</math>. |
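The inverse-function row in particular makes a nice numerical check. In this sketch (plain Python; `num_deriv` is an illustrative finite-difference helper, and <math>f = \exp</math> with inverse <math>g = \log</math> is an example choice), the derivative of the composite is confirmed to be 1:

```python
import math

def num_deriv(func, x, h=1e-6):
    """Central-difference approximation to func'(x)."""
    return (func(x + h) - func(x - h)) / (2 * h)

# f one-one differentiable, g its inverse: here f = exp, g = log
f, g = math.exp, math.log

x = 2.5
# f o g is the identity, so its derivative should be 1 wherever it is defined
composite_deriv = num_deriv(lambda t: f(g(t)), x)
assert math.isclose(composite_deriv, 1.0, rel_tol=1e-5)
```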
Nontrivial examples
Here are some examples that cannot be computed using methods other than the chain rule:
Consider the sine of square function:
<math>x \mapsto \sin(x^2)</math>.
We use the chain rule for differentiation viewing the function as the composite of the square function on the inside and the sine function on the outside:
<math>\frac{d}{dx}(\sin(x^2)) = \cos(x^2) \cdot \frac{d}{dx}(x^2) = 2x\cos(x^2)</math>
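The result for the sine of square function can be spot-checked numerically (plain Python; `num_deriv` is an illustrative finite-difference helper):

```python
import math

def num_deriv(func, x, h=1e-6):
    """Central-difference approximation to func'(x)."""
    return (func(x + h) - func(x - h)) / (2 * h)

composite = lambda x: math.sin(x ** 2)   # sine outside, square inside

x = 1.1
# Chain rule prediction: derivative is 2x cos(x^2)
assert math.isclose(num_deriv(composite, x), 2 * x * math.cos(x ** 2), rel_tol=1e-5)
```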