Chain rule for differentiation

This article is about a differentiation rule, i.e., a rule for differentiating a function expressed in terms of other functions whose derivatives are known.

Statement for two functions

The chain rule is stated in many versions:

  • Specific point, named functions: Suppose f and g are functions such that g is differentiable at a point x = x_0, and f is differentiable at g(x_0). Then the composite f \circ g is differentiable at x_0, and we have:

\! \frac{d}{dx}[f(g(x))]|_{x = x_0} = f'(g(x_0))g'(x_0)

  • Generic point, named functions, point notation: Suppose f and g are functions of one variable. Then, wherever the right side expression makes sense, we have:

\! \frac{d}{dx}[f(g(x))] = f'(g(x))g'(x)

  • Generic point, named functions, point-free notation: Suppose f and g are functions of one variable. Then, wherever the right side expression makes sense:

\! (f \circ g)' = (f' \circ g) \cdot g'

where \cdot denotes the pointwise product of functions.

  • Pure Leibniz notation: Suppose u = g(x) is a function of x and v = f(u) is a function of u. Then:

\frac{dv}{dx} = \frac{dv}{du}\frac{du}{dx}
A note on the way this fact is presented: we first present the version that deals with a specific point (typically indicated with a subscript of 0) in the domain of the relevant functions, and then discuss the version that deals with a point that is free to move in the domain, obtained by dropping the subscript. Why do we do this?
The purpose of the specific point version is to emphasize that the point is fixed for the duration of the definition, i.e., it does not move around while we are defining the construct or applying the fact. However, the definition or fact applies not just for a single point but for all points satisfying certain criteria, and thus we can get further interesting perspectives on it by varying the point we are considering. This is the purpose of the second, generic point version.
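The generic-point version lends itself to a quick numerical spot-check. The sketch below uses arbitrary illustrative choices (f = sin, g(x) = x^2, x_0 = 0.7) and a small central-difference helper named `derivative` (a hypothetical name, not from the article) to compare the direct derivative of the composite against the chain-rule product:

```python
import math

def derivative(h, x, eps=1e-6):
    # Central-difference approximation to h'(x).
    return (h(x + eps) - h(x - eps)) / (2 * eps)

f = math.sin            # outer function f (illustrative choice)
g = lambda x: x ** 2    # inner function g (illustrative choice)
x0 = 0.7                # an arbitrary specific point x_0

# Left side: direct numerical derivative of the composite f(g(x)) at x_0.
lhs = derivative(lambda x: f(g(x)), x0)

# Right side: the chain-rule product f'(g(x_0)) * g'(x_0).
rhs = derivative(f, g(x0)) * derivative(g, x0)

print(abs(lhs - rhs) < 1e-5)  # the two agree up to numerical error
```

Any other differentiable choices of f, g, and x_0 would work equally well here; the agreement is only up to the error of the finite-difference approximation.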

One-sided version

A one-sided version of sorts holds, but we need to be careful, since we want the direction of differentiability of f to be the same as the direction of approach of g(x) to g(x_0). The following are true:

  • If g is left differentiable at x_0 and f is differentiable at g(x_0), then the left hand derivative of f \circ g at x_0 is f'(g(x_0)) times the left hand derivative of g at x_0.
  • If g is right differentiable at x_0 and f is differentiable at g(x_0), then the right hand derivative of f \circ g at x_0 is f'(g(x_0)) times the right hand derivative of g at x_0.
  • If g is left differentiable at x_0 and increasing for x on the immediate left of x_0, and f is left differentiable at g(x_0), then the left hand derivative of f \circ g at x_0 is the left hand derivative of f at g(x_0) times the left hand derivative of g at x_0.
  • If g is right differentiable at x_0 and increasing for x on the immediate right of x_0, and f is right differentiable at g(x_0), then the right hand derivative of f \circ g at x_0 is the right hand derivative of f at g(x_0) times the right hand derivative of g at x_0.
  • If g is left differentiable at x_0 and decreasing for x on the immediate left of x_0, and f is right differentiable at g(x_0), then the left hand derivative of f \circ g at x_0 is the right hand derivative of f at g(x_0) times the left hand derivative of g at x_0.
  • If g is right differentiable at x_0 and decreasing for x on the immediate right of x_0, and f is left differentiable at g(x_0), then the right hand derivative of f \circ g at x_0 is the left hand derivative of f at g(x_0) times the right hand derivative of g at x_0.

Statement for multiple functions

Suppose f_1,f_2,\dots,f_n are functions. Then, the following is true wherever the right side makes sense:

(f_1 \circ f_2 \circ f_3 \dots \circ f_n)' = (f_1' \circ f_2 \circ \dots \circ f_n) \cdot (f_2' \circ \dots \circ f_n) \cdot \dots \cdot (f_{n-1}' \circ f_n) \cdot f_n'

For instance, in the case n = 3, we get:

(f_1 \circ f_2 \circ f_3)' = (f_1' \circ f_2 \circ f_3) \cdot (f_2' \circ f_3) \cdot f_3'

In point notation, this is:

\! \frac{d}{dx}[f_1(f_2(f_3(x)))] = f_1'(f_2(f_3(x)))f_2'(f_3(x))f_3'(x)
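The three-function case can also be spot-checked numerically. The sketch below uses arbitrary illustrative choices f_1 = sin, f_2 = exp, f_3(x) = x^3 and a hypothetical central-difference helper `derivative`:

```python
import math

def derivative(h, x, eps=1e-6):
    # Central-difference approximation to h'(x).
    return (h(x + eps) - h(x - eps)) / (2 * eps)

# Arbitrary illustrative choices of f_1, f_2, f_3.
f1, f2, f3 = math.sin, math.exp, lambda x: x ** 3
x0 = 0.4

# Left side: direct derivative of f_1(f_2(f_3(x))) at x_0.
lhs = derivative(lambda x: f1(f2(f3(x))), x0)

# Right side: f_1'(f_2(f_3(x_0))) * f_2'(f_3(x_0)) * f_3'(x_0).
rhs = (derivative(f1, f2(f3(x0)))
       * derivative(f2, f3(x0))
       * derivative(f3, x0))

print(abs(lhs - rhs) < 1e-4)
```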

Reversal for integration

If a function is differentiated using the chain rule, then retrieving the original function from the derivative typically requires a method of integration called integration by substitution. Specifically, that method of integration targets expressions of the form:

\int h(g(x))g'(x) \, dx

The u-substitution idea is to set u = g(x) and obtain:

\int h(u) \, du

We now need to find a function f such that f' = h. The integral is f(u) + C. Plugging back u = g(x), we obtain that the indefinite integral is f(g(x)) + C.
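For instance, here is a short worked example of this reversal, with the illustrative choices g(x) = x^2 and h = \cos (so that f = \sin serves as the antiderivative of h):

```latex
\int \cos(x^2)\, 2x \, dx
  = \int \cos(u)\, du \qquad (u = x^2,\ du = 2x\,dx)
  = \sin(u) + C
  = \sin(x^2) + C
```

Differentiating \sin(x^2) + C by the chain rule recovers the integrand 2x\cos(x^2), closing the loop.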

Significance

Why more naive chain rules don't make sense

There are two naive versions of the chain rule one might come up with, neither of which holds:

(f \circ g)'(x) = f'(g'(x))

and

(f \circ g)'(x) = f'(x)g'(x)

Even without doing any mathematics, we can deduce that neither of these rules can be correct. How? Any rule that holds generically must involve evaluating f or f' only at points that we know to be in the domain of f. The only such point in this context is g(x). Therefore, the chain rule cannot involve evaluating f or f' at any point other than g(x).

Note that our actual chain rule:

(f \circ g)'(x) = f'(g(x))g'(x)

is quite similar to the naive but false rule \! (f \circ g)'(x) = f'(x)g'(x), and can be viewed as the corrected version of the rule once we account for the fact that f' can only be calculated after transforming x to g(x).
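This can also be seen concretely with numbers. The sketch below (with arbitrary illustrative choices f = exp, g(x) = x^2, x_0 = 1.5, and a hypothetical central-difference helper `derivative`) shows that both naive rules miss the true value by a wide margin while the actual chain rule matches:

```python
import math

def derivative(h, x, eps=1e-6):
    # Central-difference approximation to h'(x).
    return (h(x + eps) - h(x - eps)) / (2 * eps)

f = math.exp            # arbitrary choice of outer function
g = lambda x: x ** 2    # arbitrary choice of inner function
x0 = 1.5

true_value = derivative(lambda x: f(g(x)), x0)      # (f o g)'(x_0)
naive_1 = derivative(f, derivative(g, x0))          # f'(g'(x_0)): wrong
naive_2 = derivative(f, x0) * derivative(g, x0)     # f'(x_0)g'(x_0): wrong
correct = derivative(f, g(x0)) * derivative(g, x0)  # f'(g(x_0))g'(x_0)

print(abs(true_value - correct) < 1e-3)  # chain rule matches
print(abs(true_value - naive_1) > 1)     # naive rules are far off
print(abs(true_value - naive_2) > 1)
```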

Qualitative and existential significance

Each of the versions has its own qualitative significance:

  • Specific point, named functions: This tells us that if g is differentiable at a point x_0 and f is differentiable at g(x_0), then f \circ g is differentiable at x_0.
  • Generic point, named functions, point notation: If g is a differentiable function and f is a differentiable function on the intersection of its domain with the range of g, then f \circ g is a differentiable function.
  • Generic point, named functions, point-free notation: We can deduce properties of (f \circ g)' based on properties of f', g', f, g. In particular, if f' and g' are both continuous functions, so is (f \circ g)'. Another way of putting this is that if f and g are both continuously differentiable functions, so is f \circ g.

Computational feasibility significance

Each of the versions has its own computational feasibility significance:

  • Specific point, named functions: If we know the numerical values g'(x_0) and f'(g(x_0)), we can use these to compute (f \circ g)'(x_0).
  • Generic point, named functions: Knowledge of the general expressions for the derivatives of f and g (along with expressions for the functions themselves) allows us to compute the general expression for the derivative of f \circ g. Note that we do not need to know f itself (it suffices to know f', which determines f up to additive constants), but we do need to know what g is; it does not suffice to know g merely up to additive constants.

Computational results significance

  • Significance of derivative being zero: If \! g'(x_0) = 0 and f is differentiable at g(x_0), then \! (f \circ g)'(x_0) = 0. Similarly, if \! f'(g(x_0)) = 0 and g is differentiable at x_0, then (f \circ g)'(x_0) = 0. Note that it is essential in both cases that the other function be differentiable at the appropriate point; the conclusion need not follow otherwise.
  • Significance of sign of derivative: The product of the signs of \! f'(g(x_0)) and \! g'(x_0) gives the sign of (f \circ g)'(x_0). In particular, if both have the same sign, then (f \circ g)'(x_0) is positive; if they have opposite signs, then (f \circ g)'(x_0) is negative. This is related to the idea that a composite of increasing functions is increasing, and similar ideas.
  • Significance of uniform bounds on derivatives: If \! f' and \! g' are uniformly bounded, then so is (f \circ g)', with a possible uniform bound being the product of the uniform bounds for \! f' and \! g'.

Compatibility checks

Associative symmetry

This is a compatibility check for showing that for a composite of three functions f_1 \circ f_2 \circ f_3, the formula for the derivative obtained using the chain rule is the same whether we associate it as f_1 \circ (f_2 \circ f_3) or as (f_1 \circ f_2) \circ f_3.

  • Derivative as f_1 \circ (f_2 \circ f_3). We first apply the chain rule for the pair of functions (f_1, f_2 \circ f_3) and then for the pair of functions (f_2, f_3):

In point-free notation:

(f_1 \circ (f_2 \circ f_3))' = (f_1' \circ (f_2 \circ f_3)) \cdot (f_2 \circ f_3)' = (f_1' \circ (f_2 \circ f_3)) \cdot (f_2' \circ f_3) \cdot f_3'

In point notation (i.e., including a symbol for the point where the function is applied):

(f_1 \circ (f_2 \circ f_3))'(x) = f_1'(f_2 \circ f_3(x))(f_2 \circ f_3)'(x) = f_1'(f_2(f_3(x)))(f_2 \circ f_3)'(x) =  f_1'(f_2(f_3(x)))f_2'(f_3(x))f_3'(x)

  • Derivative as (f_1 \circ f_2) \circ f_3. We first apply the chain rule for the pair of functions (f_1 \circ f_2, f_3) and then for the pair of functions (f_1, f_2):

In point-free notation:

((f_1 \circ f_2) \circ f_3)' = ((f_1 \circ f_2)' \circ f_3) \cdot f_3' = (((f_1' \circ f_2) \cdot f_2') \circ f_3) \cdot f_3' = ((f_1' \circ f_2) \circ f_3) \cdot (f_2' \circ f_3) \cdot f_3'

In point notation (i.e., including a symbol for the point where the function is applied):

((f_1 \circ f_2) \circ f_3)'(x) = ((f_1 \circ f_2)' \circ f_3)(x)f_3'(x) = (f_1 \circ f_2)'(f_3(x))f_3'(x) = f_1'(f_2(f_3(x)))f_2'(f_3(x))f_3'(x)

Compatibility with linearity

Consider functions f_1,f_2,g. We have that:

(f_1 + f_2) \circ g = (f_1 \circ g) + (f_2 \circ g)

The function (f_1 + f_2) \circ g can be differentiated either by differentiating the left side or by differentiating the right side. The compatibility check is to ensure that we get the same result from both methods:

  • Left side: In point-free notation:

\! ((f_1 + f_2) \circ g)' = ((f_1 + f_2)' \circ g) \cdot g' = ((f_1' + f_2') \circ g) \cdot g' = ((f_1' \circ g) + (f_2' \circ g)) \cdot g' = ((f_1' \circ g) \cdot g') + ((f_2' \circ g) \cdot g')

In point notation (i.e., including a symbol for the point of application):

\! ((f_1 + f_2) \circ g)'(x) = (f_1 + f_2)'(g(x))g'(x) = (f_1'(g(x)) + f_2'(g(x)))g'(x) = f_1'(g(x))g'(x) + f_2'(g(x))g'(x)

  • Right side: In point-free notation:

We get \! (f_1 \circ g + f_2 \circ g)' = (f_1 \circ g)' + (f_2 \circ g)' = ((f_1' \circ g) \cdot g') + ((f_2' \circ g) \cdot g').

In point notation:

(f_1 \circ g + f_2 \circ g)'(x) = (f_1 \circ g)'(x) + (f_2 \circ g)'(x) = f_1'(g(x))g'(x) + f_2'(g(x))g'(x)

Thus, we get the same result on both sides, indicating compatibility.

Note that it is not in general true that f \circ (g_1 + g_2) = (f \circ g_1) + (f \circ g_2), so there is no compatibility check to be made there.
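This compatibility can be spot-checked numerically as well. The sketch below (with arbitrary illustrative choices f_1 = sin, f_2 = cos, g = exp and a hypothetical central-difference helper `derivative`) differentiates each side and compares:

```python
import math

def derivative(h, x, eps=1e-6):
    # Central-difference approximation to h'(x).
    return (h(x + eps) - h(x - eps)) / (2 * eps)

f1, f2, g = math.sin, math.cos, math.exp  # arbitrary illustrative choices
x0 = 0.5

# Differentiate the left side: ((f_1 + f_2) o g)'(x_0), computed directly.
left = derivative(lambda x: f1(g(x)) + f2(g(x)), x0)

# Differentiate the right side termwise:
# f_1'(g(x_0))g'(x_0) + f_2'(g(x_0))g'(x_0).
right = (derivative(f1, g(x0)) * derivative(g, x0)
         + derivative(f2, g(x0)) * derivative(g, x0))

print(abs(left - right) < 1e-5)
```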

Compatibility with product rule

Consider functions f_1,f_2,g. We have that:

(f_1 \cdot f_2) \circ g = (f_1 \circ g) \cdot (f_2 \circ g)

The function (f_1 \cdot f_2) \circ g can be differentiated either by differentiating the left side or by differentiating the right side. The two processes use the product rule for differentiation in different ways. The compatibility check is to ensure that we get the same result from both methods:

  • Left side: In point-free notation:

\! ((f_1 \cdot f_2) \circ g)' = ((f_1 \cdot f_2)' \circ g) \cdot g' = ((f_1' \cdot f_2 + f_1 \cdot f_2') \circ g) \cdot g' = ((f_1' \cdot f_2) \circ g) \cdot g' + ((f_1 \cdot f_2') \circ g) \cdot g'

In point notation:

\! ((f_1 \cdot f_2) \circ g)'(x) = (f_1 \cdot f_2)'(g(x)) g'(x) = (f_1'(g(x))f_2(g(x)) + f_1(g(x))f_2'(g(x))) g'(x)

  • Right side: In point-free notation:

\! ((f_1 \circ g) \cdot (f_2 \circ g))' = (f_1 \circ g)' \cdot (f_2 \circ g) + (f_1 \circ g) \cdot (f_2 \circ g)' = (f_1' \circ g) \cdot g' \cdot (f_2 \circ g) + (f_1 \circ g) \cdot (f_2' \circ g) \cdot g' \! = [(f_1' \circ g) \cdot (f_2 \circ g)] \cdot g' +  [(f_1 \circ g) \cdot (f_2' \circ g)] \cdot g' = ((f_1' \cdot f_2) \circ g) \cdot g' + ((f_1 \cdot f_2') \circ g) \cdot g'

In point notation:

\! ((f_1 \circ g) \cdot (f_2 \circ g))'(x) = (f_1 \circ g)'(x)(f_2 \circ g)(x) + (f_1 \circ g)(x)(f_2 \circ g)'(x) = f_1'(g(x))g'(x)f_2(g(x)) + f_1(g(x))g'(x)f_2'(g(x))

\! = (f_1'(g(x))f_2(g(x)) + f_1(g(x))f_2'(g(x))) g'(x)

Note that it is not in general true that f \circ (g_1 \cdot g_2) = (f \circ g_1) \cdot (f \circ g_2), so no compatibility check needs to be made there.
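As with linearity, this compatibility can be spot-checked numerically. The sketch below (arbitrary illustrative choices f_1 = sin, f_2 = cos, g = exp, and a hypothetical central-difference helper `derivative`) differentiates via each route and compares:

```python
import math

def derivative(h, x, eps=1e-6):
    # Central-difference approximation to h'(x).
    return (h(x + eps) - h(x - eps)) / (2 * eps)

f1, f2, g = math.sin, math.cos, math.exp  # arbitrary illustrative choices
x0 = 0.3

# Left side route: chain rule applied to (f_1 * f_2) o g,
# i.e., (f_1 * f_2)'(g(x_0)) * g'(x_0).
left = derivative(lambda u: f1(u) * f2(u), g(x0)) * derivative(g, x0)

# Right side route: product rule applied to (f_1 o g) * (f_2 o g).
right = (derivative(f1, g(x0)) * derivative(g, x0) * f2(g(x0))
         + f1(g(x0)) * derivative(f2, g(x0)) * derivative(g, x0))

print(abs(left - right) < 1e-5)
```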

Compatibility with notions of order

This section explains why the chain rule is compatible with notions of order \operatorname{ord} that satisfy:

  • \operatorname{ord}(f') = \operatorname{ord}(f) - 1
  • \operatorname{ord}(f \circ g) = \operatorname{ord}(f)\operatorname{ord}(g)
  • \operatorname{ord}(f \cdot g) = \operatorname{ord}(f) + \operatorname{ord}(g)

Suppose \operatorname{ord}(f) = m and \operatorname{ord}(g) = n. Then we have the following:

  • (f \circ g)' has order mn -1: First, note that f \circ g has order mn by the product relation for order. Next, note that differentiating pushes the order down by one.
  • (f' \circ g) \cdot g' has order mn - 1: Note that f' \circ g has order (m - 1)n and g' has order n - 1. Adding, we get (m - 1)n + n - 1 = mn - 1.

Note that this compatibility check fails for both of the false chain rules discussed in the significance section.

Some examples of the notion of order which illustrate this are:

  • For nonzero polynomials, the order notion above can be taken to be the degree of the polynomial.
  • For functions that are zero at a particular point, the order notion above can be taken to be the order of the zero at that point. Note that in this case, the order of the zero of f is calculated at 0 (the value of g at the point under consideration) rather than at the original point at which g is evaluated.
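The polynomial-degree instance of this check can be verified directly. The sketch below uses a minimal coefficient-list representation (helper names `poly_mul`, `poly_compose`, `poly_diff`, `degree` are made up for this illustration) and confirms that the derivative of a composite of degrees m = 3 and n = 2 has degree mn - 1 = 5:

```python
# Polynomials as coefficient lists [a_0, a_1, ...], lowest degree first.

def poly_mul(p, q):
    # Product of two polynomials.
    out = [0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            out[i + j] += a * b
    return out

def poly_compose(p, q):
    # Evaluate p at the polynomial q, using Horner's scheme.
    out = [p[-1]]
    for a in reversed(p[:-1]):
        out = poly_mul(out, q)
        out[0] += a
    return out

def poly_diff(p):
    # Formal derivative: drop a_0, multiply a_i by i.
    return [i * a for i, a in enumerate(p)][1:]

def degree(p):
    return max(i for i, a in enumerate(p) if a != 0)

f = [0, 0, 0, 1]   # x^3, degree m = 3 (arbitrary example)
g = [1, 2, 1]      # (1 + x)^2, degree n = 2 (arbitrary example)

comp = poly_compose(f, g)
print(degree(poly_diff(comp)))        # degree of (f o g)'
print(degree(f) * degree(g) - 1)      # mn - 1, for comparison
```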

Examples

Sanity checks

We first consider examples where the chain rule for differentiation confirms something we already knew by other means:

  • f a constant function, g any differentiable function: (f \circ g)' is the zero function. Direct justification: f \circ g is a constant function, so its derivative is the zero function. Via the chain rule: (f \circ g)'(x) = f'(g(x))g'(x); since f is constant, f'(g(x)) is zero everywhere, hence the product f'(g(x))g'(x) is zero everywhere.
  • f any differentiable function, g a constant function with value k: (f \circ g)' is the zero function. Direct justification: f \circ g is a constant function with value f(k), so its derivative is the zero function. Via the chain rule: (f \circ g)'(x) = f'(g(x))g'(x); since g is constant, g'(x) = 0 everywhere, hence the product f'(g(x))g'(x) is zero everywhere.
  • f the identity function x \mapsto x, g any differentiable function: (f \circ g)' = \! g'. Direct justification: f \circ g = g, so (f \circ g)' = g'. Via the chain rule: (f \circ g)' = (f' \circ g) \cdot g'; since f is the function x \mapsto x, its derivative f' is the constant function x \mapsto 1, so f' \circ g is also the constant function x \mapsto 1, and (f \circ g)' = 1 \cdot g' = g'.
  • f any differentiable function, g the identity function: (f \circ g)' = \! f'. Direct justification: f \circ g = f, so (f \circ g)' = f'. Via the chain rule: (f \circ g)' = (f' \circ g) \cdot g'; since g is the identity function, g' is the constant function x \mapsto 1 and f' \circ g = f', so (f \circ g)' = f' \cdot 1 = f'.
  • f the square function, g any differentiable function: (f \circ g)' is \! x \mapsto 2g(x)g'(x). Direct justification: f(g(x)) = (g(x))^2, whose derivative can be computed using the product rule for differentiation; it comes out as 2g(x)g'(x). Via the chain rule: f' is the derivative of the square function, i.e., x \mapsto 2x, so \! f'(g(x)) = 2g(x) and (f \circ g)'(x) = 2g(x)g'(x).
  • f a one-one differentiable function, g the inverse function of f: (f \circ g)' is the constant function 1. Direct justification: f(g(x)) = x for all x, so the derivative is the constant function 1. Via the chain rule: by the inverse function theorem, g' = 1/(f' \circ g), so (f \circ g)' = (f' \circ g) \cdot 1/(f' \circ g) = 1.
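The inverse-function case can be spot-checked numerically. The sketch below uses the illustrative pair f = exp, g = log (so g is the inverse of f) and a hypothetical central-difference helper `derivative`:

```python
import math

def derivative(h, x, eps=1e-6):
    # Central-difference approximation to h'(x).
    return (h(x + eps) - h(x - eps)) / (2 * eps)

f, g = math.exp, math.log   # g is the inverse function of f
x0 = 2.0                    # an arbitrary point in the domain of g

# Chain rule: (f o g)'(x_0) = f'(g(x_0)) * g'(x_0), which should equal 1.
value = derivative(f, g(x0)) * derivative(g, x0)
print(abs(value - 1) < 1e-6)
```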

Nontrivial examples

The chain rule is necessary for computing the derivatives of functions whose definition requires one to compose functions. The chain rule isn't the only option in principle: one can always compute the derivative as a limit of a difference quotient. But it is the only option if one restricts oneself to operating within the family of differentiation rules.

Some examples of functions for which the chain rule needs to be used include:

  • A trigonometric function applied to a nonlinear algebraic function
  • An exponential function applied to a nonlinear algebraic function
  • A composite of two trigonometric functions, two exponential functions, or an exponential and a trigonometric function

A few examples are below.

Sine of square function

Consider the sine of square function:

x \mapsto \sin(x^2).


We use the chain rule for differentiation viewing the function as the composite of the square function on the inside and the sine function on the outside:

\frac{d}{dx}[\sin(x^2)] = \frac{d(\sin(x^2))}{d(x^2)} \frac{d(x^2)}{dx} = (\cos(x^2))(2x) = 2x\cos(x^2)
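The formula 2x\cos(x^2) can be spot-checked numerically at a few points, using a hypothetical central-difference helper `derivative`:

```python
import math

def derivative(h, x, eps=1e-6):
    # Central-difference approximation to h'(x).
    return (h(x + eps) - h(x - eps)) / (2 * eps)

# Compare the numerical derivative of sin(x^2) against 2x cos(x^2)
# at a few arbitrary sample points.
for x0 in (0.3, 1.1, 2.0):
    numeric = derivative(lambda x: math.sin(x ** 2), x0)
    formula = 2 * x0 * math.cos(x0 ** 2)
    assert abs(numeric - formula) < 1e-5
print("ok")
```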


Sine of sine function

Consider the sine of sine function:

x \mapsto \sin(\sin x)

The derivative is:

(\sin \circ \sin)'(x) = (\sin' \circ \sin)(x)\sin'(x) = \cos(\sin x)\cos(x)
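This derivative, too, can be spot-checked numerically, again using a hypothetical central-difference helper `derivative`:

```python
import math

def derivative(h, x, eps=1e-6):
    # Central-difference approximation to h'(x).
    return (h(x + eps) - h(x - eps)) / (2 * eps)

x0 = 0.8  # an arbitrary sample point

# Compare the numerical derivative of sin(sin x) against cos(sin x) cos(x).
numeric = derivative(lambda x: math.sin(math.sin(x)), x0)
formula = math.cos(math.sin(x0)) * math.cos(x0)
print(abs(numeric - formula) < 1e-6)
```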