College:Product rule for differentiation

From Calculus

This page uses material from product rule for differentiation and Practical:Product rule for differentiation.



Name

This statement is called the product rule, product rule for differentiation, or Leibniz rule.

Statement for two functions

Statement in multiple versions

The product rule is stated in many versions:

Version type Statement
specific point, named functions Suppose f and g are functions of one variable, both of which are differentiable at a real number x = x_0. Then, the product function f \cdot g, defined as x \mapsto f(x)g(x) is also differentiable at x = x_0, and the derivative at x_0 is given as follows:

\! \frac{d}{dx} [f(x)g(x)]|_{x = x_0} = f'(x_0)g(x_0) + f(x_0)g'(x_0)
or equivalently:
\! \frac{d}{dx} [f(x)g(x)]|_{x = x_0} = \frac{d(f(x))}{dx}|_{x=x_0} \cdot g(x_0) + f(x_0)\cdot \frac{d(g(x))}{dx}|_{x = x_0}

generic point, named functions, point notation Suppose f and g are functions of one variable. Then the following is true wherever the right side expression makes sense (see concept of equality conditional to existence of one side):
\! \frac{d}{dx}[f(x)g(x)] = f'(x)g(x) + f(x)g'(x)
generic point, named functions, point-free notation Suppose f and g are functions of one variable. Then, we have the following equality of functions on the domain where the right side expression makes sense (see concept of equality conditional to existence of one side):
\! (f \cdot g)' = (f'\cdot g) + (f \cdot g')
We could also write this more briefly as:
\! (fg)' = f'g + fg'
Note that the domain of (fg)' may be strictly larger than the intersection of the domains of f' and g', so the equality need not hold in the sense of equality as functions if we care about the domains of definition.
Pure Leibniz notation using dependent and independent variables Suppose u,v are variables both of which are functionally dependent on x. Then:
\! \frac{d(uv)}{dx} = \left(\frac{du}{dx}\right) v + u \frac{dv}{dx}
In terms of differentials Suppose u,v are both variables functionally dependent on x. Then,
\! d(uv) = v (du) + u (dv).
MORE ON THE WAY THIS DEFINITION OR FACT IS PRESENTED: We first present the version that deals with a specific point (typically indicated with a subscript of 0) in the domain of the relevant functions, and then discuss the version that deals with a point that is free to move in the domain, by dropping the subscript. Why do we do this?
The purpose of the specific point version is to emphasize that the point is fixed for the duration of the definition, i.e., it does not move around while we are defining the construct or applying the fact. However, the definition or fact applies not just for a single point but for all points satisfying certain criteria, and thus we can get further interesting perspectives on it by varying the point we are considering. This is the purpose of the second, generic point version.
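The statement can also be checked computationally. Here is a quick symbolic sanity check (not part of the original page; it assumes Python with the sympy library, and the choices of f and g are arbitrary):

```python
# Symbolic sanity check of the product rule for two functions.
import sympy as sp

x = sp.symbols('x')
f = sp.exp(x)   # arbitrary sample choice for f
g = sp.sin(x)   # arbitrary sample choice for g

lhs = sp.diff(f * g, x)                        # d/dx [f(x) g(x)]
rhs = sp.diff(f, x) * g + f * sp.diff(g, x)    # f'(x) g(x) + f(x) g'(x)

assert sp.simplify(lhs - rhs) == 0             # the two sides agree
```

Any other pair of differentiable functions would work equally well in place of these samples.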


One-sided version

The product rule for differentiation has analogues for one-sided derivatives. More explicitly, we can replace all occurrences of derivatives with left-hand derivatives and the statements remain true. Alternatively, we can replace all occurrences of derivatives with right-hand derivatives and the statements remain true.

Partial differentiation

For further information, refer: product rule for partial differentiation

The product rule is also valid if we consider functions of more than one variable and replace the ordinary derivative by the partial derivative, directional derivative, or gradient vector.

Statement for multiple functions

Below, we formulate the many versions of this product rule:

Version type Statement
specific point, named functions Suppose f_1, f_2, \dots, f_n are functions defined and differentiable at a point x_0. Then the product f_1 \cdot f_2 \cdot \dots \cdot f_n is also differentiable at x_0, and we have:
\! \frac{d}{dx}[f_1(x)f_2(x)\dots f_n(x)]|_{x = x_0} = f_1'(x_0)f_2(x_0) \dots f_n(x_0) + f_1(x_0)f_2'(x_0) \dots f_n(x_0) + \dots + f_1(x_0)f_2(x_0) \dots f_{n-1}(x_0)f_n'(x_0)
generic point, named functions, point notation Suppose f_1, f_2, \dots, f_n are functions. Then the product f_1 \cdot f_2 \cdot \dots \cdot f_n satisfies:
\! (f_1 \cdot f_2 \cdot \dots \cdot f_n)'(x) = f_1'(x)f_2(x) \dots f_n(x) + f_1(x)f_2'(x) \dots f_n(x) + \dots + f_1(x)f_2(x) \dots f_{n-1}(x)f_n'(x) wherever the right side makes sense.
generic point, named functions, point-free notation Suppose f_1, f_2, \dots, f_n are functions. Then the product f_1 \cdot f_2 \cdot \dots \cdot f_n satisfies:
\! (f_1 \cdot f_2 \cdot \dots \cdot f_n)' = f_1' \cdot f_2 \cdot \dots \cdot f_n + f_1 \cdot f_2' \cdot \dots \cdot f_n + \dots + f_1 \cdot f_2 \cdot \dots \cdot f_{n-1} \cdot f_n' wherever the right side makes sense. We could also write this more briefly as
\! (f_1 f_2 \dots f_n)' = f_1' f_2 \dots f_n + f_1 f_2' \dots f_n + \dots + f_1 f_2 \dots f_{n-1} f_n'
Pure Leibniz notation using dependent and independent variables Suppose u_1,u_2,\dots,u_n are variables functionally dependent on x. Then \frac{d(u_1u_2\dots u_n)}{dx} = \left(\frac{du_1}{dx}\right)(u_2u_3 \dots u_n) + u_1 \left(\frac{du_2}{dx}\right) (u_3 \dots u_n) + \dots + u_1u_2 \dots u_{n-1} \left(\frac{du_n}{dx}\right) wherever the right side makes sense.
In terms of differentials Suppose u_1,u_2,\dots,u_n are variables functionally dependent on x. Then d(u_1u_2\dots u_n) = u_2u_3 \dots u_n(du_1) + u_1u_3 \dots u_n(du_2) + \dots + u_1u_2 \dots u_{n-1}(du_n)

For instance, using the generic point, named functions notation for n = 3, we get:

\! (f_1 \cdot f_2 \cdot f_3)'(x) = f_1'(x)f_2(x)f_3(x) + f_1(x)f_2'(x)f_3(x) + f_1(x)f_2(x)f_3'(x)
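The n = 3 case can also be verified symbolically. This is a sketch (assuming Python with sympy; the three factors are arbitrary sample choices, not from the page):

```python
# Check the three-function product rule on sample factors.
import sympy as sp

x = sp.symbols('x')
f1, f2, f3 = x**2, sp.sin(x), sp.exp(x)   # arbitrary sample factors

lhs = sp.diff(f1 * f2 * f3, x)
rhs = (sp.diff(f1, x) * f2 * f3
       + f1 * sp.diff(f2, x) * f3
       + f1 * f2 * sp.diff(f3, x))

assert sp.simplify(lhs - rhs) == 0
```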


Significance

Qualitative and existential significance

Each of the versions has its own qualitative significance:

Version type Significance
specific point, named functions This tells us that if f and g are both differentiable at a point, so is f \cdot g. The one-sided versions allow us to make similar statements for left and right differentiability.
generic point, named functions, point notation This tells us that if both f and g are differentiable on an open interval, then so is f \cdot g. The one-sided versions allow us to make similar statements for closed intervals where we require the appropriate one-sided differentiability at the endpoints.
generic point, point-free notation This can be used to deduce more, namely that the nature of (f \cdot g)' depends strongly on the nature of f and that of g. In particular, if f and g are both continuously differentiable functions on an interval (i.e., f' and g' are both continuous on that interval), then (f \cdot g) is also continuously differentiable on that interval. This uses the sum theorem for continuity and product theorem for continuity.


Computational feasibility significance

Each of the versions has its own computational feasibility significance:

Version type Significance
specific point, named functions This tells us that knowledge of the values (in the sense of numerical values) \! f(x_0), g(x_0), f'(x_0), g'(x_0) at a specific point x_0 is sufficient to compute the value of (f \cdot g)'(x_0). For instance, if we are given that f(1) = 5, g(1) = 11, f'(1) = 4, g'(1) = 13, we obtain that (f \cdot g)'(1) = 4 \cdot 11 + 5 \cdot 13 = 44 + 65 = 109.
A note on contrast with the (false) freshman product rule: the freshman product rule would instead give \! f'(1)g'(1) = 4 \cdot 13 = 52, which differs from the correct answer of 109.
generic point, named functions This tells us that knowledge of the general expressions for f and g and the derivatives of f and g is sufficient to compute the general expression for the derivative of f \cdot g. See the #Examples section of this page for more examples.
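The numeric example above can be sketched in a few lines of Python (plain arithmetic, no libraries; the helper function name is ours, not from the page):

```python
def product_rule_value(f0, g0, fp0, gp0):
    """(f * g)'(x0) computed from the four numbers f(x0), g(x0), f'(x0), g'(x0)."""
    return fp0 * g0 + f0 * gp0

# The example from the text: f(1) = 5, g(1) = 11, f'(1) = 4, g'(1) = 13.
val = product_rule_value(5, 11, 4, 13)
assert val == 4 * 11 + 5 * 13 == 109

# The (false) freshman product rule would give f'(1) * g'(1) instead:
freshman = 4 * 13
assert freshman == 52 and freshman != val
```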


Computational results significance

Each of the versions has its own computational results significance:

Shorthand Significance What would happen if the freshman product rule were true instead of the product rule?
significance of derivative being zero If \! f'(x_0) and \! g'(x_0) are both equal to 0, then so is (f \cdot g)'(x_0). In other words, if the tangents to the graphs of f,g are both horizontal at the point x = x_0, so is the tangent to the graph of f \cdot g. Under the freshman product rule, this result would still hold, but so would a stronger result: namely, that if either f'(x_0) or g'(x_0) is zero, so is (f \cdot g)'(x_0).
significance of sign of derivative \! f'(x_0) and \! g'(x_0) both being positive is not sufficient to ensure that (f \cdot g)'(x_0) is positive. However, if all four of \! f(x_0), g(x_0), f'(x_0), g'(x_0) are positive, then (f \cdot g)'(x_0) is positive. This is related to the fact that a product of increasing functions need not be increasing. Under the freshman product rule, it would be true that \! f'(x_0) and \! g'(x_0) both being positive is sufficient to ensure that (f \cdot g)'(x_0) is positive.
significance of uniform bounds \! f',g' both being uniformly bounded is not sufficient to ensure that (f \cdot g)' is uniformly bounded. However, if all four functions \! f,g,f',g' are uniformly bounded, then indeed (f \cdot g)' is uniformly bounded. Under the freshman product rule, it would be true that \! f' and \! g' both being uniformly bounded is sufficient to ensure that (f \cdot g)' is uniformly bounded.


Examples

For practical tips and explanations on how to apply the product rule in practice, check out Practical:Product rule for differentiation

Sanity checks

We first consider examples where the product rule for differentiation confirms something we already knew through other means. In all examples, we assume that both f and g are differentiable functions:

Case The derivative of x \mapsto f(x)g(x) Direct justification (without use of product rule) Justification using product rule, i.e., computing it as \! f'(x)g(x) + f(x)g'(x)
g is the zero function. zero function \! f(x)g(x) = 0 for all x, so its derivative is also zero. Both \! g(x) and \! g'(x) are zero functions, so \! f'(x)g(x) + f(x)g'(x) is everywhere zero.
g is a constant nonzero function with value \lambda. \! \lambda f'(x) The function is x \mapsto \lambda f(x), and the derivative is \! \lambda f'(x), because the constant can be pulled out of the differentiation process. \! f'(x)g(x) simplifies to \! \lambda f'(x). Since g is constant, g'(x) is the zero function, hence so is \! f(x)g'(x). The sum is thus \! \lambda f'(x).
f = g \! 2f(x)f'(x) The derivative is \! 2f(x)f'(x) by the chain rule for differentiation: we are composing the square function and f. We get \! f'(x)f(x) + f(x)f'(x) = 2f(x)f'(x).
g = 1/f zero function The product is 1, which is a constant function, so its derivative is zero. We get f(x)(1/f)'(x) + f'(x)/f(x). By the chain rule, (1/f)'(x) = -f'(x)/(f(x))^2, so plugging in, we get -f(x)f'(x)/(f(x))^2 + f'(x)/f(x), which simplifies to zero.
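The sanity checks above can also be run numerically. The following sketch (stdlib Python only; the sample functions, point, and tolerance are our assumptions) compares a central-difference estimate of (f \cdot g)' against f'g + fg':

```python
import math

def numderiv(fn, x, h=1e-6):
    """Central-difference approximation to fn'(x)."""
    return (fn(x + h) - fn(x - h)) / (2 * h)

f, g = math.sin, math.exp   # arbitrary differentiable sample functions
x0 = 0.7

lhs = numderiv(lambda t: f(t) * g(t), x0)                 # (f * g)'(x0), numerically
rhs = numderiv(f, x0) * g(x0) + f(x0) * numderiv(g, x0)   # f'(x0) g(x0) + f(x0) g'(x0)

assert abs(lhs - rhs) < 1e-6
```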



Nontrivial examples where simple alternate methods exist

Here is a simple trigonometric example:

\! f(x) := \sin x \cos x.

Using the product rule, we get:

\! f'(x) = (\cos x)(\cos x) + (\sin x)(-\sin x) = \cos^2 x - \sin^2 x = \cos(2x)

Alternatively, using the double angle identity \sin x \cos x = (1/2)\sin(2x) and the chain rule, the derivative is (1/2)(2\cos(2x)) = \cos(2x), which agrees.

Nontrivial examples where simple alternate methods do not exist

Consider a product of the form:

\! f(x) := x \sin x

Using the product rule, we get:

f'(x) = \frac{dx}{dx} \sin x + x \frac{d}{dx}(\sin x) = 1(\sin x) + x \cos x = \sin x + x \cos x
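A quick symbolic confirmation of this computation (a sketch, assuming Python with sympy):

```python
import sympy as sp

x = sp.symbols('x')
derivative = sp.diff(x * sp.sin(x), x)

# Should match sin x + x cos x from the product rule.
assert sp.simplify(derivative - (sp.sin(x) + x * sp.cos(x))) == 0
```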



Procedure to apply the product rule for differentiation

The product rule for differentiation is useful as a technique for differentiating functions that are expressed in the form of products of simpler functions.

Most explicit procedure

The explicit procedure is outlined below:

  1. Identify the two functions whose product is the given function. In other words, explicitly decompose the function as a product of two functions. We will here call the functions f and g, though you may choose to give them different names.
  2. Calculate the derivatives of f and g separately, on the side.
  3. Plug into the product rule formula the expressions for the functions and their derivatives.
  4. Simplify the expression thus obtained (this is optional).

Here is an example of a differentiation problem where we use this explicit procedure:

Differentiate the function p(x) := (x^2 + 1)(x^3 + 2) with respect to x

We proceed step by step:

  1. Identify the two functions: Define f(x) = x^2 + 1 and g(x) = x^3 + 2. Then, p(x)= f(x)g(x) by definition.
  2. Calculate the derivatives: The derivative of f(x) is f'(x) = 2x and the derivative of g(x) is g'(x) = 3x^2.
  3. Plug into the product rule formula: We get p'(x) = f'(x)g(x) + f(x)g'(x) = (2x)(x^3 + 2) + (x^2 + 1)(3x^2).
  4. Simplify the expression obtained: We get p'(x) = 2x^4 + 4x + 3x^4 + 3x^2 = 5x^4 + 3x^2 + 4x.
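The four steps above can be double-checked symbolically (a sketch, assuming Python with sympy):

```python
import sympy as sp

x = sp.symbols('x')
p = (x**2 + 1) * (x**3 + 2)

# Expanding the product-rule output should give 5x^4 + 3x^2 + 4x.
assert sp.expand(sp.diff(p, x)) == 5*x**4 + 3*x**2 + 4*x
```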

Here is another example of a differentiation problem where we use this explicit procedure:

Differentiate the function p(x) := e^x \sin x
  1. Identify the two functions: Define f(x) = e^x and g(x) = \sin x
  2. Calculate the derivatives: The derivative of f(x) is f'(x) = e^x and the derivative of g(x) is g'(x) = \cos x.
  3. Plug into the product rule formula: We get p'(x) = f'(x)g(x) + f(x)g'(x) = (e^x)(\sin x) + (e^x)(\cos x).
  4. Simplify the expression obtained: The expression is already simplified. If we wish to collect terms, we can rewrite as e^x(\sin x + \cos x).
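Again, a quick check of this example (a sketch, assuming Python with sympy):

```python
import sympy as sp

x = sp.symbols('x')
p = sp.exp(x) * sp.sin(x)

# Product rule result, with terms collected: e^x (sin x + cos x).
assert sp.simplify(sp.diff(p, x) - sp.exp(x) * (sp.sin(x) + sp.cos(x))) == 0
```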

More inline procedure using Leibniz notation

Although the explicit procedure above is fairly clear, Step (2) can be time-consuming, since the derivative calculations are done separately on the side. If you are more experienced with doing differentiation quickly, you can combine Steps (2) and (3) by calculating the derivatives while plugging into the formula, rather than doing the calculations separately beforehand. Further, we do not need to explicitly name the functions if we use the Leibniz notation to compute the derivatives inline.

The shorter procedure is outlined below:

  1. Identify the two functions being multiplied (but you don't have to give them names).
  2. Plug into the formula for the product rule, using the Leibniz notation for derivatives that have not yet been computed.
  3. Compute derivatives and simplify.

For instance, consider the problem:

Differentiate the function p(x) := (2x^3 + 3x - \sin x)(x^2 + \cos x - 3)

The procedure is:

  1. Identify the two functions: The functions are x \mapsto 2x^3 + 3x - \sin x and x \mapsto x^2 + \cos x - 3.
  2. Plug into the formula for the product rule: We get:
    \frac{d(2x^3 + 3x - \sin x)}{dx}(x^2 + \cos x - 3) + (2x^3 + 3x - \sin x)\frac{d(x^2 + \cos x - 3)}{dx}
  3. Compute derivatives and simplify: we get:
    (6x^2 + 3 - \cos x)(x^2 + \cos x - 3) + (2x^3 + 3x - \sin x)(2x - \sin x). The expression can be expanded and simplified if desired.
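The inline computation above can be verified symbolically (a sketch, assuming Python with sympy):

```python
import sympy as sp

x = sp.symbols('x')
p = (2*x**3 + 3*x - sp.sin(x)) * (x**2 + sp.cos(x) - 3)

expected = ((6*x**2 + 3 - sp.cos(x)) * (x**2 + sp.cos(x) - 3)
            + (2*x**3 + 3*x - sp.sin(x)) * (2*x - sp.sin(x)))

assert sp.simplify(sp.diff(p, x) - expected) == 0
```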

Shortest inline procedure

If you are really experienced with doing derivatives in your head, you can shorten the procedure even further by combining Steps (2) and (3) in the previous procedure. The procedure has two steps:

  1. Identify the two functions being multiplied (but you don't have to give them names).
  2. Use the formula for the product rule, computing the derivatives of the functions while plugging them into the formula.

For instance, going back to the example used in the beginning:

Differentiate the function p(x) := (x^2 + 1)(x^3 + 2) with respect to x

This can be done quickly:

  1. Identify the two functions: They are x \mapsto x^2 + 1 and x \mapsto x^3 + 2.
  2. Use the formula for the product rule, computing the derivatives of the functions while plugging them into the formula: We get (2x)(x^3 + 2) + (x^2 + 1)(3x^2) = 5x^4 + 3x^2 + 4x.

Choosing between procedures

The procedures are not fundamentally different, but they differ in the degree of explicitness of the steps. Generally speaking, the following are recommended:

  • If the functions being multiplied are fairly easy to differentiate mentally, use the shortest inline procedure -- this is fast and reliable.
  • If the functions being multiplied are somewhat more difficult to differentiate, then choose between the other two more explicit procedures, based on whether you are more comfortable with writing large inline expressions or with doing separate work on the side.

Error types

Incorrect formula

A common mistake in differentiating products of functions is the freshman product rule, i.e., the false rule that the derivative of the product is the product of the derivatives. The good news is that, generally speaking, it is easy to avoid this rule once you have enough experience with the actual product rule.

Writing only one piece of the product rule

This is an error of the incomplete task form and is harder to avoid. What happens here is that you forget to write one of the two pieces being added for the product rule, so perhaps you end up doing:

\frac{d}{dx}[f(x)g(x)] = f'(x)g(x) \qquad \mbox{WRONG! Forgot one half of the product rule}

or

\frac{d}{dx}[f(x)g(x)] = f(x)g'(x) \qquad\mbox{WRONG! Forgot one half of the product rule}

A slight variant involves forgetting one of the factors being multiplied:

\frac{d}{dx}[f(x)g(x)] = f'(x) + f(x)g'(x) \qquad \mbox{WRONG! Forgot one of the pieces being multiplied}

Why this error occurs: Usually, this error is common if you are trying to use the shortest inline procedure, i.e., differentiating the functions and applying the product rule simultaneously, and one of the functions being differentiated is rather tricky to differentiate, requiring a product rule or chain rule for differentiation in and of itself.

How to avoid this error:

  • When the functions being differentiated are tricky to differentiate, use either the fully explicit procedure or the inline procedure with Leibniz notation. Do not try to simultaneously differentiate the pieces and use the product rule.
  • After finishing a product rule problem, ask the following sanity check question: did I get a sum of two products after the application of the product rule? If the answer is no, then check your work.
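To see concretely why the freshman product rule fails, here is a minimal symbolic counterexample (assuming Python with sympy; the choice f = g = x is ours):

```python
import sympy as sp

x = sp.symbols('x')
f = x
g = x

correct = sp.diff(f * g, x)               # product rule: d/dx [x^2] = 2x
freshman = sp.diff(f, x) * sp.diff(g, x)  # (false) freshman rule: 1 * 1 = 1

assert correct == 2*x
assert sp.simplify(correct - freshman) != 0   # the two rules disagree
```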


How to remember the formula

Different versions of the formula

Recall that we stated the product rule as:

\frac{d}{dx}[f(x)g(x)] = f'(x)g(x) + f(x)g'(x)

The right side could be written in any of eight equivalent ways:

  1. f'(x)g(x) + f(x)g'(x)
  2. g(x)f'(x) + f(x)g'(x)
  3. g(x)f'(x) + g'(x)f(x)
  4. f'(x)g(x) + g'(x)f(x)
  5. f(x)g'(x) + f'(x)g(x)
  6. g'(x)f(x) + f'(x)g(x)
  7. g'(x)f(x) + g(x)f'(x)
  8. f(x)g'(x) + g(x)f'(x)

All of these are equivalent, so in some sense it does not matter which of the versions you choose to remember. However, the first version is somewhat preferable to the others, for the reasons mentioned below:

  • The order of multiplication is the same as in the expression being differentiated: the f-part is on the left and the g-part is on the right. This is true only of versions (1) and (5). This is particularly important when we consider generalization to non-commutative situations such as product rule for differentiation of cross product.
  • The differentiation symbol (the prime) hops from left to right: Remembering the rule this way makes it easier to generalize to differentiating products of more than two functions (see product rule for differentiation#Statement for multiple functions).

Quick rationalizations for the formula

If you are a little shaky about the product rule for differentiation, how do you do a reality check on the formula? A quick, intuitive version of the proof of product rule for differentiation using chain rule for partial differentiation will help. Basically, what it says is that to determine how the product changes, we need to count the contributions of each factor being multiplied, keeping the other constant. The contribution of f keeping g constant is f'(x)g(x) and the contribution of g keeping f constant is f(x)g'(x).