Product rule for differentiation
Latest revision as of 15:26, 5 October 2014
This article is about a differentiation rule, i.e., a rule for differentiating a function expressed in terms of other functions whose derivatives are known.
View other differentiation rules
Name
This statement is called the product rule, product rule for differentiation, or Leibniz rule.
Statement for two functions
Statement in multiple versions
The product rule is stated in many versions:
Version type | Statement
specific point, named functions | Suppose $f$ and $g$ are functions of one variable, both of which are differentiable at a real number $x_0$. Then, the product function $f \cdot g$, defined as $(f \cdot g)(x) := f(x)g(x)$, is also differentiable at $x_0$, and the derivative at $x_0$ is given as follows: $\frac{d}{dx}[f(x)g(x)]_{x = x_0} = \frac{d(f(x))}{dx}_{x = x_0} \cdot g(x_0) + f(x_0) \cdot \frac{d(g(x))}{dx}_{x = x_0}$
generic point, named functions, point notation | Suppose $f$ and $g$ are functions of one variable. Then the following is true wherever the right side expression makes sense (see concept of equality conditional to existence of one side): $\frac{d}{dx}[f(x)g(x)] = f'(x)g(x) + f(x)g'(x)$
generic point, named functions, point-free notation | Suppose $f$ and $g$ are functions of one variable. Then, we have the following equality of functions on the domain where the right side expression makes sense (see concept of equality conditional to existence of one side): $(f \cdot g)' = (f' \cdot g) + (f \cdot g')$. We could also write this more briefly as $(fg)' = f'g + fg'$. Note that the domain of $(fg)'$ may be strictly larger than the intersection of the domains of $f'$ and $g'$, so the equality need not hold in the sense of equality as functions if we care about the domains of definition.
Pure Leibniz notation using dependent and independent variables | Suppose $u, v$ are variables both of which are functionally dependent on $x$. Then: $\frac{d(uv)}{dx} = \left(\frac{du}{dx}\right) v + u \frac{dv}{dx}$
In terms of differentials | Suppose $u, v$ are both variables functionally dependent on $x$. Then, $d(uv) = v \, (du) + u \, (dv)$.
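The two-function rule is easy to check symbolically. The sketch below uses the sympy library; the particular functions chosen for $f$ and $g$ are arbitrary illustrations, and any differentiable pair would do:

```python
import sympy as sp

x = sp.symbols('x')
f = sp.exp(x)   # arbitrary illustrative choice for f
g = sp.sin(x)   # arbitrary illustrative choice for g

# Differentiate the product directly...
lhs = sp.diff(f * g, x)
# ...and via the product rule formula f'(x) g(x) + f(x) g'(x).
rhs = sp.diff(f, x) * g + f * sp.diff(g, x)

# The two results agree identically.
assert sp.simplify(lhs - rhs) == 0
```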
MORE ON THE WAY THIS DEFINITION OR FACT IS PRESENTED: We first present the version that deals with a specific point (typically with a subscript) in the domain of the relevant functions, and then discuss the version that deals with a point that is free to move in the domain, by dropping the subscript. Why do we do this?
The purpose of the specific point version is to emphasize that the point is fixed for the duration of the definition, i.e., it does not move around while we are defining the construct or applying the fact. However, the definition or fact applies not just for a single point but for all points satisfying certain criteria, and thus we can get further interesting perspectives on it by varying the point we are considering. This is the purpose of the second, generic point version.
One-sided version
The product rule for differentiation has analogues for one-sided derivatives. More explicitly, we can replace all occurrences of derivatives with left-hand derivatives and the statements remain true. Alternatively, we can replace all occurrences of derivatives with right-hand derivatives and the statements remain true.
Partial differentiation
For further information, refer: product rule for partial differentiation
The product rule is also valid if we consider functions of more than one variable and replace the ordinary derivative by the partial derivative, directional derivative, or gradient vector.
Statement for multiple functions
Below, we formulate the many versions of this product rule:
Version type | Statement
specific point, named functions | Suppose $f_1, f_2, \dots, f_n$ are functions defined and differentiable at a point $x_0$. Then the product $f_1 \cdot f_2 \cdot \dots \cdot f_n$ is also differentiable at $x_0$, and we have: $(f_1 \cdot f_2 \cdot \dots \cdot f_n)'(x_0) = f_1'(x_0)f_2(x_0) \dots f_n(x_0) + f_1(x_0)f_2'(x_0) \dots f_n(x_0) + \dots + f_1(x_0)f_2(x_0) \dots f_{n-1}(x_0)f_n'(x_0)$
generic point, named functions, point notation | Suppose $f_1, f_2, \dots, f_n$ are functions. Then the product $f_1 \cdot f_2 \cdot \dots \cdot f_n$ satisfies: $(f_1 \cdot f_2 \cdot \dots \cdot f_n)'(x) = f_1'(x)f_2(x) \dots f_n(x) + f_1(x)f_2'(x) \dots f_n(x) + \dots + f_1(x)f_2(x) \dots f_{n-1}(x)f_n'(x)$ wherever the right side makes sense.
generic points, named functions, point-free notation | Suppose $f_1, f_2, \dots, f_n$ are functions. Then the product $f_1 \cdot f_2 \cdot \dots \cdot f_n$ satisfies: $(f_1 \cdot f_2 \cdot \dots \cdot f_n)' = f_1' \cdot f_2 \cdot \dots \cdot f_n + f_1 \cdot f_2' \cdot \dots \cdot f_n + \dots + f_1 \cdot f_2 \cdot \dots \cdot f_{n-1} \cdot f_n'$ wherever the right side makes sense. We could also write this more briefly as $(f_1 f_2 \dots f_n)' = f_1' f_2 \dots f_n + f_1 f_2' \dots f_n + \dots + f_1 f_2 \dots f_{n-1} f_n'$
Pure Leibniz notation using dependent and independent variables | Suppose $u_1, u_2, \dots, u_n$ are variables functionally dependent on $x$. Then $\frac{d(u_1 u_2 \dots u_n)}{dx} = \left(\frac{du_1}{dx}\right)(u_2 u_3 \dots u_n) + u_1 \left(\frac{du_2}{dx}\right)(u_3 \dots u_n) + \dots + u_1 u_2 \dots u_{n-1} \left(\frac{du_n}{dx}\right)$ wherever the right side makes sense.
In terms of differentials | Suppose $u_1, u_2, \dots, u_n$ are variables functionally dependent on $x$. Then $d(u_1 u_2 \dots u_n) = (du_1) u_2 \dots u_n + u_1 (du_2) u_3 \dots u_n + \dots + u_1 u_2 \dots u_{n-1} (du_n)$
For instance, using the generic point, named functions notation for $n = 3$, we get:
$(f_1 \cdot f_2 \cdot f_3)'(x) = f_1'(x)f_2(x)f_3(x) + f_1(x)f_2'(x)f_3(x) + f_1(x)f_2(x)f_3'(x)$
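The $n = 3$ instance can be verified symbolically. In this sketch, sympy's generic Function objects stand in for unspecified differentiable functions $f_1, f_2, f_3$:

```python
import sympy as sp

x = sp.symbols('x')
# Generic (unspecified) differentiable functions of x.
f1, f2, f3 = [sp.Function(name)(x) for name in ('f1', 'f2', 'f3')]

# sympy applies the product rule when differentiating the triple product.
derivative = sp.expand(sp.diff(f1 * f2 * f3, x))

# The three-term formula from the table above.
expected = (sp.diff(f1, x) * f2 * f3
            + f1 * sp.diff(f2, x) * f3
            + f1 * f2 * sp.diff(f3, x))

assert sp.simplify(derivative - sp.expand(expected)) == 0
```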
Related rules
Similar rules in single variable calculus
 Differentiation is linear: The derivative of the sum is the sum of the derivatives, and scalars can be pulled out of differentiation.
 Chain rule for differentiation
 Product rule for higher derivatives
 Chain rule for higher derivatives
 Logarithmic differentiation is a version of the product rule for differentiation that is useful for differentiating lengthy products.
 Product rule for differentiation of formal power series
Similar rules in multivariable calculus
 Product rule for partial differentiation
 Product rule for differentiation of dot product
 Product rule for differentiation of cross product
 Product rule for differentiation of scalar triple product
Reversal for integration
The reverse of this rule, which is helpful for indefinite integration, is a method called integration by parts.
Significance
Qualitative and existential significance
Each of the versions has its own qualitative significance:
Version type | Significance
specific point, named functions | This tells us that if $f$ and $g$ are both differentiable at a point, so is $f \cdot g$. The one-sided versions allow us to make similar statements for left and right differentiability.
generic point, named functions, point notation | This tells us that if both $f$ and $g$ are differentiable on an open interval, then so is $f \cdot g$. The one-sided versions allow us to make similar statements for closed intervals where we require the appropriate one-sided differentiability at the endpoints.
generic point, point-free notation | This can be used to deduce more, namely that the nature of $(f \cdot g)'$ depends strongly on the nature of $f$ and that of $g$. In particular, if $f$ and $g$ are both continuously differentiable functions on an interval (i.e., $f'$ and $g'$ are both continuous on that interval), then $f \cdot g$ is also continuously differentiable on that interval. This uses the sum theorem for continuity and product theorem for continuity.
Computational feasibility significance
Each of the versions has its own computational feasibility significance:
Version type | Significance
specific point, named functions | This tells us that knowledge of the values (in the sense of numerical values) of $f(x_0), g(x_0), f'(x_0), g'(x_0)$ at a specific point $x_0$ is sufficient to compute the value of $(f \cdot g)'(x_0)$. For instance, if we are given those four values, we obtain that $(f \cdot g)'(x_0) = f'(x_0)g(x_0) + f(x_0)g'(x_0)$.
generic point, named functions | This tells us that knowledge of the general expressions for $f$ and $g$ and the derivatives of $f$ and $g$ is sufficient to compute the general expression for the derivative of $f \cdot g$. See the #Examples section of this page for more examples.
Computational results significance
Each of the versions has its own computational results significance:
Shorthand | Significance | What would happen if the freshman product rule were true instead of the product rule?
significance of derivative being zero | If $f'(x_0)$ and $g'(x_0)$ are both equal to 0, then so is $(f \cdot g)'(x_0)$. In other words, if the tangents to the graphs of $f, g$ are both horizontal at the point $x = x_0$, so is the tangent to the graph of $f \cdot g$. | This result would still hold, but so would a stronger result: namely, that if either $f'(x_0)$ or $g'(x_0)$ is zero, so is $(f \cdot g)'(x_0)$.
significance of sign of derivative | $f'(x_0)$ and $g'(x_0)$ both being positive is not sufficient to ensure that $(f \cdot g)'(x_0)$ is positive. However, if all four of $f(x_0), g(x_0), f'(x_0), g'(x_0)$ are positive, then $(f \cdot g)'(x_0)$ is positive. This is related to the fact that a product of increasing functions need not be increasing. | In that case, it would be true that $f'(x_0)$ and $g'(x_0)$ both being positive is sufficient to ensure that $(f \cdot g)'(x_0)$ is positive.
significance of uniform bounds | $f', g'$ both being uniformly bounded is not sufficient to ensure that $(f \cdot g)'$ is uniformly bounded. However, if all four functions $f, g, f', g'$ are uniformly bounded, then indeed $(f \cdot g)'$ is uniformly bounded. | In that case, it would be true that $f'$ and $g'$ both being uniformly bounded is sufficient to ensure that $(f \cdot g)'$ is uniformly bounded.
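The "sign of derivative" row can be made concrete with a minimal numeric check. The choice $f(x) = g(x) = x$ at $x_0 = -1$ is an illustrative assumption: both factors are increasing everywhere, yet the product rule shows that their product is decreasing at that point:

```python
# f(x) = g(x) = x is increasing everywhere (f' = g' = 1 > 0), yet the
# product f*g = x^2 is decreasing for x < 0: the product rule gives
# (fg)'(x) = 1*x + x*1 = 2x, which is negative at x0 = -1.
x0 = -1.0
f_val, g_val = x0, x0        # f(x0), g(x0): both negative here
fp, gp = 1.0, 1.0            # f'(x0), g'(x0): both positive

product_derivative = fp * g_val + f_val * gp   # = -2.0

assert fp > 0 and gp > 0
assert product_derivative < 0   # the product is decreasing at x0
```

Note that this does not contradict the table: not all four of $f(x_0), g(x_0), f'(x_0), g'(x_0)$ are positive here.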
Compatibility checks
Symmetry in the functions being multiplied
We know that the product of two functions is symmetric in them, i.e., $f \cdot g = g \cdot f$. Thus, the product rule for differentiation should satisfy the condition that the formula for $(f \cdot g)'$ is symmetric in $f$ and $g$, i.e., we get the same formula for $(g \cdot f)'$. This is indeed true using the commutativity of addition and multiplication:
$(f \cdot g)' = f' \cdot g + f \cdot g' = g \cdot f' + g' \cdot f = g' \cdot f + g \cdot f' = (g \cdot f)'$
Associative symmetry
This is a compatibility check showing that for a product of three functions $f_1 \cdot f_2 \cdot f_3$, we get the same product rule formula whether we associate this product as $(f_1 \cdot f_2) \cdot f_3$ or as $f_1 \cdot (f_2 \cdot f_3)$.
* Associating as $(f_1 \cdot f_2) \cdot f_3$: $((f_1 \cdot f_2) \cdot f_3)' = (f_1 \cdot f_2)' \cdot f_3 + (f_1 \cdot f_2) \cdot f_3' = (f_1' \cdot f_2 + f_1 \cdot f_2') \cdot f_3 + f_1 \cdot f_2 \cdot f_3' = f_1' \cdot f_2 \cdot f_3 + f_1 \cdot f_2' \cdot f_3 + f_1 \cdot f_2 \cdot f_3'$
* Associating as $f_1 \cdot (f_2 \cdot f_3)$: $(f_1 \cdot (f_2 \cdot f_3))' = f_1' \cdot (f_2 \cdot f_3) + f_1 \cdot (f_2 \cdot f_3)' = f_1' \cdot f_2 \cdot f_3 + f_1 \cdot (f_2' \cdot f_3 + f_2 \cdot f_3') = f_1' \cdot f_2 \cdot f_3 + f_1 \cdot f_2' \cdot f_3 + f_1 \cdot f_2 \cdot f_3'$
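Both associations can also be checked mechanically. This sketch applies the two-function rule to each bracketing (using sympy with generic function symbols) and confirms that the expanded results agree:

```python
import sympy as sp

x = sp.symbols('x')
f1, f2, f3 = [sp.Function(name)(x) for name in ('f1', 'f2', 'f3')]

d = lambda h: sp.diff(h, x)   # shorthand for d/dx

# Two-function product rule applied to each bracketing.
left_assoc = d(f1 * f2) * f3 + (f1 * f2) * d(f3)    # ((f1*f2)*f3)'
right_assoc = d(f1) * (f2 * f3) + f1 * d(f2 * f3)   # (f1*(f2*f3))'

# Both expand to the same three-term sum.
assert sp.simplify(sp.expand(left_assoc) - sp.expand(right_assoc)) == 0
```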
Compatibility with linearity
Consider functions $f, g, h$ and the expression:
$f \cdot (g + h) = (f \cdot g) + (f \cdot h)$
This can be differentiated in two ways, using the product rule on the left side and then linearity of differentiation, or by differentiating the right side and using the product rule in each term. We verify that both yield the same result:
* Left side: Differentiating, we get $f' \cdot (g + h) + f \cdot (g + h)' = f' \cdot g + f' \cdot h + f \cdot g' + f \cdot h'$.
* Right side: Differentiating, we get $(f \cdot g)' + (f \cdot h)' = f' \cdot g + f \cdot g' + f' \cdot h + f \cdot h'$.
Thus, both sides are equal and the product rule for differentiation checks out.
Compatibility with notions of order
This section explains why the product rule is compatible with notions of order that satisfy:
* If $f$ has order $a$ and $g$ has order $b$, and both are positive (in a suitable sense), then $f \cdot g$ has order $a + b$. Even if they are not both positive, $f \cdot g$ usually has the same order.
Suppose $f$ has order $a$ and $g$ has order $b$. Then we have the following:
* $(f \cdot g)'$ has order $a + b - 1$: First, note that $f \cdot g$ has order $a + b$ by the product relation for order. Next, note that differentiating pushes the order down by one.
* $f' \cdot g + f \cdot g'$ also (plausibly) has order $a + b - 1$: Note that $f' \cdot g$ has order $(a - 1) + b$ and $f \cdot g'$ has order $a + (b - 1)$. Adding them should give something of order $a + b - 1$.
Thus, the product rule is compatible with the order notion.
Note that the freshman product rule is incompatible with notions of order. Some examples of the notion of order which illustrate this are:
* For nonzero polynomials, the order notion above can be taken as the degree of the polynomial (though the zero polynomial creates some trouble for multiplication). The notion of positive can be taken as having a positive leading coefficient.
* For functions that are zero at a particular point, the order notion above can be taken as the order of the zero at that point.
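For the polynomial-degree instance of the order notion, the claim that $(f \cdot g)'$ has order $a + b - 1$ can be spot-checked. The particular polynomials below are arbitrary illustrations:

```python
import sympy as sp

x = sp.symbols('x')
f = 3*x**4 + x + 1    # degree a = 4 (arbitrary illustrative choice)
g = x**3 - 2*x        # degree b = 3 (arbitrary illustrative choice)

product = sp.expand(f * g)          # order of the product: a + b = 7
derivative = sp.diff(product, x)    # differentiation drops order by one

assert sp.degree(product, x) == 7
assert sp.degree(derivative, x) == 6   # a + b - 1

# Each product-rule term also has degree a + b - 1 = 6, consistent
# with the order analysis above.
assert sp.degree(sp.expand(sp.diff(f, x) * g), x) == 6
assert sp.degree(sp.expand(f * sp.diff(g, x)), x) == 6
```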
Case of infinite or undefined values
For further information, refer: Using the product rule for differentiation for limiting behavior at points with undefined derivative
The product rule for differentiation has analogues for infinities, with the appropriate caveats about indeterminate forms. Specifically, we have the following:
$f(x_0)$ | $g(x_0)$ | Behavior of $f'$ at $x_0$ | Behavior of $g'$ at $x_0$ | Conclusion about $(f \cdot g)'$ at $x_0$ | Explanation
finite | finite | undefined | undefined | insufficient information (could be finite or undefined) | We don't know the details behind the undefined derivatives.
nonzero | nonzero and same sign as $f(x_0)$ | vertical tangent | vertical tangent of same type as for $f$ (i.e., either both are increasing or both are decreasing) | vertical tangent, type (increasing/decreasing) is determined by the signs of $f(x_0), g(x_0)$ and the types of vertical tangent for $f, g$ |
nonzero | nonzero and opposite sign to $f(x_0)$ | vertical tangent | vertical tangent of same type as for $f$ (i.e., either both are increasing or both are decreasing) | insufficient information |
nonzero | nonzero and same sign as $f(x_0)$ | vertical tangent | vertical tangent of opposite type as for $f$ (i.e., one is increasing and one is decreasing) | insufficient information |
nonzero | nonzero and opposite sign to $f(x_0)$ | vertical tangent | vertical tangent of opposite type as for $f$ (i.e., one is increasing and one is decreasing) | vertical tangent, type depends on the signs |
zero | known whether it is zero, positive, or negative | known whether it is finite, vertical tangent, etc. | vertical tangent | insufficient information in all cases |
Examples
For practical tips and explanations on how to apply the product rule in practice, check out Practical:Product rule for differentiation
Sanity checks
We first consider examples where the product rule for differentiation confirms something we already knew through other means. In all examples, we assume that both $f$ and $g$ are differentiable functions:
Case | The derivative of $f \cdot g$ | Direct justification (without use of product rule) | Justification using product rule, i.e., computing it as $f'(x)g(x) + f(x)g'(x)$
$g$ is the zero function. | zero function | $f(x)g(x) = 0$ for all $x$, so its derivative is also zero. | Both $g$ and $g'$ are zero functions, so $f'(x)g(x) + f(x)g'(x)$ is everywhere zero.
$g$ is a constant nonzero function with value $\lambda$. | $\lambda f'(x)$ | The function is $\lambda f(x)$, and the derivative is $\lambda f'(x)$, because the constant can be pulled out of the differentiation process. | $f'(x)g(x) + f(x)g'(x)$ simplifies to $\lambda f'(x) + f(x)g'(x)$. Since $g$ is constant, $g'$ is the zero function, hence so is $f(x)g'(x)$. The sum is thus $\lambda f'(x)$.
$g = f$ | $2f(x)f'(x)$ | The derivative is $2f(x)f'(x)$ by the chain rule for differentiation: we are composing the square function and $f$. | We get $f'(x)f(x) + f(x)f'(x) = 2f(x)f'(x)$.
$g = 1/f$ | zero function | The product is $1$, which is a constant function, so its derivative is zero. | We get $f(x)(1/f)'(x) + f'(x)/f(x)$. By the chain rule, $(1/f)'(x) = -f'(x)/(f(x))^2$, so plugging in, we get $-f(x)f'(x)/(f(x))^2 + f'(x)/f(x)$, which simplifies to zero.
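The last row of the table ($g = 1/f$) can be confirmed symbolically; the nonvanishing choice of $f$ below is an arbitrary illustration:

```python
import sympy as sp

x = sp.symbols('x')
f = sp.exp(x) + 2    # arbitrary choice that never vanishes, so 1/f is defined
g = 1 / f

# The product f*g is identically 1, so its derivative must be zero;
# the product rule computes the same thing term by term.
via_rule = sp.diff(f, x) * g + f * sp.diff(g, x)

assert sp.simplify(via_rule) == 0
assert sp.simplify(sp.diff(f * g, x)) == 0
```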
Nontrivial examples where simple alternate methods exist
Here is a simple trigonometric example. Let $f(x) := x \sin x$. Using the product rule, we get:
$f'(x) = \frac{d(x)}{dx} \sin x + x \frac{d}{dx}(\sin x) = 1 \cdot (\sin x) + x \cos x = \sin x + x \cos x$
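The same computation can be reproduced with a computer algebra system as a quick check:

```python
import sympy as sp

x = sp.symbols('x')

# Differentiate x*sin(x); sympy applies the product rule internally.
derivative = sp.diff(x * sp.sin(x), x)

# Agrees with the hand computation sin(x) + x*cos(x).
assert sp.simplify(derivative - (sp.sin(x) + x * sp.cos(x))) == 0
```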
Nontrivial examples where simple alternate methods do not exist
Consider a product of the form:
Using the product rule, we get:
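As a hypothetical illustration of such a product (the choice $x^2 e^x \sin x$ is an assumed example, not necessarily the one originally intended here), the multiple-function product rule can be applied and the result checked symbolically:

```python
import sympy as sp

x = sp.symbols('x')
# Assumed three-factor product with no obvious differentiation shortcut.
f = x**2 * sp.exp(x) * sp.sin(x)

# Product rule for three functions: differentiate one factor at a time.
by_rule = (2*x * sp.exp(x) * sp.sin(x)       # (x^2)' * e^x * sin(x)
           + x**2 * sp.exp(x) * sp.sin(x)    # x^2 * (e^x)' * sin(x)
           + x**2 * sp.exp(x) * sp.cos(x))   # x^2 * e^x * (sin x)'

assert sp.simplify(sp.diff(f, x) - by_rule) == 0
```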
Proof
There are many different versions of the proof, given below:
* Proof of product rule for differentiation using chain rule for partial differentiation
* Proof of product rule for differentiation using logarithmic differentiation