# Product rule for differentiation

This article is about a differentiation rule, i.e., a rule for differentiating a function expressed in terms of other functions whose derivatives are known.

## Name

This statement is called the product rule, product rule for differentiation, or Leibniz rule.

## Statement for two functions

### Verbal statement

If two (possibly equal) functions are differentiable at a given real number, then their pointwise product is also differentiable at that number and the derivative of the product is the sum of two terms: the derivative of the first function times the second function and the first function times the derivative of the second function.

### Statement with symbols

The product rule is stated in many versions:

**Specific point, named functions**: Suppose $f$ and $g$ are functions of one variable, both of which are differentiable at a real number $x = x_0$. Then the product function $f \cdot g$, defined as $x \mapsto f(x)g(x)$, is also differentiable at $x = x_0$, and the derivative at $x_0$ is given as follows:

$\! \frac{d}{dx} [f(x)g(x)]|_{x = x_0} = f'(x_0)g(x_0) + f(x_0)g'(x_0)$

or equivalently:

$\! \frac{d}{dx} [f(x)g(x)]|_{x = x_0} = \frac{d(f(x))}{dx}|_{x=x_0} \cdot g(x_0) + f(x_0)\cdot \frac{d(g(x))}{dx}|_{x = x_0}$

**Generic point, named functions, point notation**: Suppose $f$ and $g$ are functions of one variable. Then the following is true wherever the right side expression makes sense:

$\! \frac{d}{dx}[f(x)g(x)] = f'(x)g(x) + f(x)g'(x)$

**Generic point, named functions, point-free notation**: Suppose $f$ and $g$ are functions of one variable. Then we have the following equality of functions on the domain where the right side expression makes sense:

$\! (f \cdot g)' = (f' \cdot g) + (f \cdot g')$

**Pure Leibniz notation using dependent and independent variables**: Suppose $u$ and $v$ are variables, both of which are functionally dependent on $x$. Then:

$\! \frac{d(uv)}{dx} = v \frac{du}{dx} + u \frac{dv}{dx}$

**In terms of differentials**: Suppose $u$ and $v$ are variables, both of which are functionally dependent on $x$. Then:

$\! d(uv) = v \, (du) + u \, (dv)$

More on the way this fact is presented: we first present the version that deals with a specific point (typically with a $\{ \}_0$ subscript) in the domain of the relevant functions, and then discuss the version that deals with a point that is free to move in the domain, obtained by dropping the subscript. Why do we do this?

The purpose of the specific point version is to emphasize that the point is fixed for the duration of the definition, i.e., it does not move around while we are defining the construct or applying the fact. However, the definition or fact applies not just to a single point but to all points satisfying certain criteria, and thus we can get further interesting perspectives on it by varying the point under consideration. This is the purpose of the second, generic point version.
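
To see why the generic point version holds, here is the standard derivation sketch from the difference quotient (it uses the fact that $g$, being differentiable at $x$, is also continuous at $x$):

$\! \frac{d}{dx}[f(x)g(x)] = \lim_{h \to 0} \frac{f(x+h)g(x+h) - f(x)g(x)}{h} = \lim_{h \to 0}\left[\frac{f(x+h) - f(x)}{h} g(x+h) + f(x)\frac{g(x+h) - g(x)}{h}\right] = f'(x)g(x) + f(x)g'(x)$

The middle step comes from adding and subtracting $f(x)g(x+h)$ in the numerator, and the last step uses $g(x+h) \to g(x)$ as $h \to 0$.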

### One-sided version

The product rule for differentiation has analogues for one-sided derivatives. More explicitly, we can replace all occurrences of derivatives with left hand derivatives, or alternatively with right hand derivatives, and the statements remain true.
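
For instance, with the hypothetical choice $f(x) = g(x) = |x|$ at $x_0 = 0$: the two-sided derivatives do not exist there, but the right hand derivatives do, and the right-handed version of the rule gives:

$\! (f \cdot g)'_+(0) = f'_+(0)g(0) + f(0)g'_+(0) = 1 \cdot 0 + 0 \cdot 1 = 0$

This agrees with differentiating $(f \cdot g)(x) = x^2$ directly.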

## Statement for multiple functions

If $f_1,f_2,\dots,f_n$ are all functions, and we define $F(x) := f_1(x)f_2(x) \dots f_n(x)$, then we have:

$F'(x) = f_1'(x)f_2(x) \dots f_n(x) + f_1(x)f_2'(x) \dots f_n(x) + \dots + f_1(x)f_2(x) \dots f_{n-1}(x)f_n'(x)$

In other words, we get a sum of $n$ terms, each of which is a product of $n$ factors, exactly one of which is differentiated, and the position of the differentiated factor ranges over all $n$ possibilities.

For instance, if $n = 3$, we get:

$\! F'(x) = f_1'(x)f_2(x)f_3(x) + f_1(x)f_2'(x)f_3(x) + f_1(x)f_2(x)f_3'(x)$
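
For a quick symbolic check of the three-function case, here is a minimal sketch using the sympy library (the choice of factors $\sin x$, $e^x$, $x^2$ is hypothetical, purely for illustration):

```python
import sympy as sp

x = sp.symbols('x')

# Hypothetical choice of three factors, purely for illustration
f1, f2, f3 = sp.sin(x), sp.exp(x), x**2

# Left side: differentiate the product directly
lhs = sp.diff(f1 * f2 * f3, x)

# Right side: sum of three terms, each with exactly one factor differentiated
rhs = (sp.diff(f1, x) * f2 * f3
       + f1 * sp.diff(f2, x) * f3
       + f1 * f2 * sp.diff(f3, x))

print(sp.simplify(lhs - rhs))  # 0, confirming the identity for this choice
```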

## Reversal for integration

The reverse of this rule, which is helpful for indefinite integration, is a method called integration by parts.
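
Concretely, integrating both sides of $(f \cdot g)' = f' \cdot g + f \cdot g'$ and rearranging gives the integration by parts formula:

$\! \int f(x)g'(x) \, dx = f(x)g(x) - \int f'(x)g(x) \, dx$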

## Significance

### Qualitative and existential significance

Each of the versions has its own qualitative significance:

| Version type | Significance |
| --- | --- |
| specific point, named functions | This tells us that if $f$ and $g$ are both differentiable at a point, so is $f \cdot g$. The one-sided versions allow us to make similar statements for left and right differentiability. |
| generic point, named functions, point notation | This tells us that if both $f$ and $g$ are differentiable on an open interval, then so is $f \cdot g$. The one-sided versions allow us to make similar statements for closed intervals, where we require the appropriate one-sided differentiability at the endpoints. |
| generic point, point-free notation | This can be used to deduce more, namely that the nature of $(f \cdot g)'$ depends strongly on the nature of $f$ and that of $g$. In particular, if $f$ and $g$ are both continuously differentiable functions on an interval (i.e., $f'$ and $g'$ are both continuous on that interval), then $f \cdot g$ is also continuously differentiable on that interval, i.e., $(f \cdot g)'$ is continuous there. This uses the sum theorem for continuity and the product theorem for continuity. |

### Computational feasibility significance

Each of the versions has its own computational feasibility significance:

| Version type | Significance |
| --- | --- |
| specific point, named functions | This tells us that knowledge of the numerical values $f(x_0), g(x_0), f'(x_0), g'(x_0)$ at a specific point $x_0$ is sufficient to compute the value of $(f \cdot g)'(x_0)$. For instance, if we are given that $f(1) = 5, g(1) = 11, f'(1) = 4, g'(1) = 13$, we obtain that $(f \cdot g)'(1) = 4 \cdot 11 + 5 \cdot 13 = 44 + 65 = 109$. |
| generic point, named functions | This tells us that knowledge of the general expressions for $f$ and $g$ and for their derivatives is sufficient to compute the general expression for the derivative of $f \cdot g$. See the #Examples section of this page for more examples. |
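
A minimal numeric check of the worked value above (plain Python; the four input numbers are taken directly from the example):

```python
# The only inputs are the four numbers f(1), g(1), f'(1), g'(1);
# no formula for f or g is needed.
f_x0, g_x0 = 5, 11
fprime_x0, gprime_x0 = 4, 13

# Product rule evaluated at the point x0 = 1
fg_prime_x0 = fprime_x0 * g_x0 + f_x0 * gprime_x0
print(fg_prime_x0)  # 109
```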

### Computational results significance

Each of the versions has its own computational results significance:

| Shorthand | Significance | What would happen if the (false) freshman product rule $(f \cdot g)' = f' \cdot g'$ were true instead of the product rule? |
| --- | --- | --- |
| significance of derivative being zero | If $f'(x_0)$ and $g'(x_0)$ are both equal to 0, then so is $(f \cdot g)'(x_0)$. In other words, if the tangents to the graphs of $f$ and $g$ are both horizontal at the point $x = x_0$, so is the tangent to the graph of $f \cdot g$. | This result would still hold. |
| significance of sign of derivative | $f'(x_0)$ and $g'(x_0)$ both being positive is not sufficient to ensure that $(f \cdot g)'(x_0)$ is positive. However, if all four of $f(x_0), g(x_0), f'(x_0), g'(x_0)$ are positive, then $(f \cdot g)'(x_0)$ is positive. This is related to the fact that a product of increasing functions need not be increasing. | In that case, $f'(x_0)$ and $g'(x_0)$ both being positive would be sufficient to ensure that $(f \cdot g)'(x_0)$ is positive. |
| significance of uniform bounds | $f'$ and $g'$ both being uniformly bounded is not sufficient to ensure that $(f \cdot g)'$ is uniformly bounded. However, if all four functions $f, g, f', g'$ are uniformly bounded, then $(f \cdot g)'$ is indeed uniformly bounded. | In that case, $f'$ and $g'$ both being uniformly bounded would be sufficient to ensure that $(f \cdot g)'$ is uniformly bounded. |
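
As a small illustration of the sign row, consider the hypothetical choice $f(x) = g(x) = x$ at $x_0 = -1$: both derivatives are positive everywhere, yet the product $x^2$ is decreasing at $x_0 = -1$. A minimal sketch using the sympy library:

```python
import sympy as sp

x = sp.symbols('x')

# Hypothetical choice: f and g are increasing everywhere (f' = g' = 1 > 0),
# but at x0 = -1 both function values are negative.
f = x
g = x
x0 = -1

fg_prime = sp.diff(f * g, x)          # 2*x, by the product rule
print(sp.diff(f, x).subs(x, x0))      # 1  -> f'(x0) > 0
print(sp.diff(g, x).subs(x, x0))      # 1  -> g'(x0) > 0
print(fg_prime.subs(x, x0))           # -2 -> (f*g)'(x0) < 0 nonetheless
```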

## Case of infinite or undefined values

The product rule for differentiation has analogues for infinities, with the appropriate caveats about indeterminate forms. Specifically, we have the following:

| $f(x_0)$ | $g(x_0)$ | $f'(x_0)$ | $g'(x_0)$ | Conclusion about $(f \cdot g)'(x_0)$ | Explanation |
| --- | --- | --- | --- | --- | --- |
| finite | finite | undefined | undefined | insufficient information (could be finite or undefined) | We don't know the details behind why the derivatives are undefined. |
| nonzero | nonzero and same sign as $f(x_0)$ | vertical tangent | vertical tangent of the same type as for $f$ (i.e., either both are increasing or both are decreasing) | vertical tangent; the type (increasing/decreasing) is determined by the signs of $f, g$ and the types of the vertical tangents for $f, g$ | |
| nonzero | nonzero and opposite sign to $f(x_0)$ | vertical tangent | vertical tangent of the same type as for $f$ (i.e., either both are increasing or both are decreasing) | insufficient information | |
| nonzero | nonzero and same sign as $f(x_0)$ | vertical tangent | vertical tangent of the opposite type to that for $f$ (i.e., one is increasing and one is decreasing) | insufficient information | |
| nonzero | nonzero and opposite sign to $f(x_0)$ | vertical tangent | vertical tangent of the opposite type to that for $f$ (i.e., one is increasing and one is decreasing) | vertical tangent; the type depends on the signs | |
| zero | known whether it is zero, positive, or negative | known whether it is finite, a vertical tangent, etc. | vertical tangent | insufficient information in all cases | |

## Examples

### Trivial examples

We first consider examples where the product rule for differentiation confirms something we already knew through other means:

| Case | Derivative of $x \mapsto f(x)g(x)$ | Direct justification (without use of product rule) | Justification using product rule, i.e., computing $f'(x)g(x) + f(x)g'(x)$ |
| --- | --- | --- | --- |
| $g$ is the zero function | zero function | $f(x)g(x) = 0$ for all $x$, so its derivative is also zero. | Both $g(x)$ and $g'(x)$ are zero functions, so $f'(x)g(x) + f(x)g'(x)$ is everywhere zero. |
| $g$ is a constant nonzero function with value $\lambda$ | $\lambda f'(x)$ | The function is $x \mapsto \lambda f(x)$, and the derivative is $\lambda f'(x)$, because the constant can be pulled out of the differentiation process. | $f'(x)g(x)$ simplifies to $\lambda f'(x)$. Since $g$ is constant, $g'$ is the zero function, hence so is $f(x)g'(x)$. The sum is thus $\lambda f'(x)$. |
| $f = g$ | $2f(x)f'(x)$ | The derivative is $2f(x)f'(x)$ by the chain rule for differentiation: we are composing the square function and $f$. | We get $f'(x)f(x) + f(x)f'(x) = 2f(x)f'(x)$. |

### Nontrivial examples where simple alternate methods exist

Here is a simple trigonometric example:

$\! f(x) := \sin x \cos x$

By the product rule:

$\! f'(x) = \frac{d}{dx}(\sin x) \cos x + \sin x \frac{d}{dx}(\cos x) = \cos x \cos x + \sin x(-\sin x) = \cos^2 x - \sin^2 x = \cos(2x)$

An alternate method is to first rewrite $f(x) = \frac{1}{2}\sin(2x)$ using the double angle formula and then differentiate using the chain rule, obtaining the same answer.

Here is another example:

$\! f(x) := x \sin x$

$\! f'(x) = \frac{dx}{dx} \sin x + x \frac{d}{dx}(\sin x) = 1(\sin x) + x \cos x = \sin x + x \cos x$
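
As a quick machine check of the first example, here is a minimal sketch using the sympy library:

```python
import sympy as sp

x = sp.symbols('x')

expr = sp.sin(x) * sp.cos(x)

# The derivative computed by sympy should agree with the hand computation
# cos^2(x) - sin^2(x) = cos(2x) obtained via the product rule above.
print(sp.simplify(sp.diff(expr, x) - sp.cos(2 * x)))  # 0
```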