Product rule for differentiation
Revision as of 15:49, 15 October 2011

This article is about a differentiation rule, i.e., a rule for differentiating a function expressed in terms of other functions whose derivatives are known.

Name

This statement is called the product rule, product rule for differentiation, or Leibniz rule.

Statement for two functions

Verbal statement

If two (possibly equal) functions are differentiable at a given real number, then their pointwise product is also differentiable at that number and the derivative of the product is the sum of two terms: the derivative of the first function times the second function and the first function times the derivative of the second function.

Statement with symbols

The product rule is stated in many versions:

Version type: specific point, named functions
Statement: Suppose f and g are functions of one variable, both of which are differentiable at a real number x = x_0. Then, the product function f \cdot g, defined as x \mapsto f(x)g(x), is also differentiable at x = x_0, and the derivative at x_0 is given as follows:

\frac{d}{dx}[f(x)g(x)]\Big|_{x = x_0} = f'(x_0)g(x_0) + f(x_0)g'(x_0)

or equivalently:

\frac{d}{dx}[f(x)g(x)]\Big|_{x = x_0} = \frac{d(f(x))}{dx}\Big|_{x = x_0} \cdot g(x_0) + f(x_0) \cdot \frac{d(g(x))}{dx}\Big|_{x = x_0}

Version type: general expressions, point notation
Statement: Suppose f and g are functions of one variable. Then the following is true wherever the right side expression makes sense:

\frac{d}{dx}[f(x)g(x)] = f'(x)g(x) + f(x)g'(x)

Version type: general expression, point-free notation
Statement: Suppose f and g are functions of one variable. Then, we have the following equality of functions on the domain where the right side expression makes sense:

(f \cdot g)' = (f' \cdot g) + (f \cdot g')

Version type: pure Leibniz notation using dependent and independent variables
Statement: Suppose u and v are variables, both of which are functionally dependent on x. Then:

\frac{d(uv)}{dx} = v \frac{du}{dx} + u \frac{dv}{dx}

Version type: in terms of differentials
Statement: Suppose u and v are both variables functionally dependent on x. Then, d(uv) = v \, du + u \, dv.
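As a quick numerical sanity check (an illustrative sketch, not part of the statement itself; the choices of f, g, and the point x_0 below are arbitrary), the specific-point version can be verified with central finite differences:

```python
# Numerically verify d/dx[f(x)g(x)] at x0 equals f'(x0)g(x0) + f(x0)g'(x0).
import math

def derivative(h_func, x0, h=1e-6):
    """Central-difference approximation to h_func'(x0)."""
    return (h_func(x0 + h) - h_func(x0 - h)) / (2 * h)

f = math.sin   # example choice of f
g = math.exp   # example choice of g
x0 = 0.7       # arbitrary point

lhs = derivative(lambda x: f(x) * g(x), x0)
rhs = derivative(f, x0) * g(x0) + f(x0) * derivative(g, x0)

print(abs(lhs - rhs) < 1e-6)  # the two sides agree up to discretization error
```

Any other differentiable choices of f and g should give the same agreement, up to the error of the finite-difference approximation.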

MORE ON THE WAY THIS DEFINITION OR FACT IS PRESENTED: We first present the version that deals with a specific point (typically with a 0 subscript) in the domain of the relevant functions, and then discuss the version that deals with a point that is free to move in the domain, by dropping the subscript. Why do we do this?
The purpose of the specific point version is to emphasize that the point is fixed for the duration of the definition, i.e., it does not move around while we are defining the construct or applying the fact. However, the definition or fact applies not just for a single point but for all points satisfying certain criteria, and thus we can get further interesting perspectives on it by varying the point we are considering. This is the purpose of the second, generic point version.

Statement for multiple functions

If f_1, f_2, \ldots, f_n are all functions, and we define F(x) := f_1(x)f_2(x)\cdots f_n(x), then we have:

F'(x) = f_1'(x)f_2(x)\cdots f_n(x) + f_1(x)f_2'(x)\cdots f_n(x) + \ldots + f_1(x)f_2(x)\cdots f_{n-1}(x)f_n'(x)

In other words, we get a sum of n terms, each of which is a product of n evaluations, of which only one is a derivative, and the one we choose as the derivative cycles through all the n possibilities.

For instance, if n = 3, we get:

F'(x) = f_1'(x)f_2(x)f_3(x) + f_1(x)f_2'(x)f_3(x) + f_1(x)f_2(x)f_3'(x)
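The multiple-function version can likewise be checked numerically (an illustrative sketch; the three functions and the point below are arbitrary choices):

```python
# Check the n-function product rule: F'(x) is a sum of n terms, where the
# i-th term replaces the i-th factor's value by its derivative.
import math

def derivative(h_func, x0, h=1e-6):
    """Central-difference approximation to h_func'(x0)."""
    return (h_func(x0 + h) - h_func(x0 - h)) / (2 * h)

funcs = [math.sin, math.cos, math.exp]   # example choice, n = 3
x0 = 0.3                                 # arbitrary point
n = len(funcs)

F = lambda x: math.prod(fn(x) for fn in funcs)
lhs = derivative(F, x0)

# One term per function: differentiate that factor, evaluate the rest.
rhs = sum(
    derivative(funcs[i], x0)
    * math.prod(funcs[j](x0) for j in range(n) if j != i)
    for i in range(n)
)

print(abs(lhs - rhs) < 1e-6)  # agreement up to discretization error
```

Note that math.prod requires Python 3.8 or later.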

Related rules

Examples

Trivial examples

We first consider examples where the product rule for differentiation confirms something we already knew through other means:

Case: g is the zero function.
What we know about the derivative of x \mapsto f(x)g(x): The derivative is the zero function, because f(x)g(x) = 0 for all x.
What we know about f'(x)g(x) + f(x)g'(x): Both g(x) and g'(x) are zero functions, so f'(x)g(x) + f(x)g'(x) is everywhere zero.

Case: g is a constant nonzero function with value \lambda.
What we know about the derivative of x \mapsto f(x)g(x): The derivative is \lambda f'(x), because the constant can be pulled out of the differentiation process.
What we know about f'(x)g(x) + f(x)g'(x): f'(x)g(x) simplifies to \lambda f'(x). Since g is constant, g'(x) is the zero function, hence so is f(x)g'(x). The sum is thus \lambda f'(x).

Case: f = g.
What we know about the derivative of x \mapsto f(x)g(x): The derivative is 2f(x)f'(x) by the chain rule for differentiation: we are composing the square function and f.
What we know about f'(x)g(x) + f(x)g'(x): We get f'(x)f(x) + f(x)f'(x) = 2f(x)f'(x).
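The f = g case above can also be confirmed numerically (an illustrative sketch; the choice of f and the point are arbitrary):

```python
# Verify that the product rule applied to f * f matches the chain-rule
# answer 2 f(x) f'(x) for the square of a function.
import math

def derivative(h_func, x0, h=1e-6):
    """Central-difference approximation to h_func'(x0)."""
    return (h_func(x0 + h) - h_func(x0 - h)) / (2 * h)

f = math.cos   # example choice of f
x0 = 1.1       # arbitrary point

lhs = derivative(lambda x: f(x) ** 2, x0)   # derivative of the square
rhs = 2 * f(x0) * derivative(f, x0)         # chain-rule form 2 f(x) f'(x)

print(abs(lhs - rhs) < 1e-6)
```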

Nontrivial examples where simple alternate methods exist

Fill this in later

Nontrivial examples where simple alternate methods do not exist

Fill this in later