# Derivative

This page describes a core term of calculus. The term is used widely, and a thorough understanding of its definition is critical.

See a complete list of core terminology.

## Name

The term derivative is used for the notion defined here. However, there are many variations of the concept of derivative that are described by using adjectives to modify the noun. When these variations are being talked about, it is helpful to provide a similar adjective to indicate that we are talking about the usual notion of derivative. The variations and corresponding terminological clarification are below:

| Variation of the notion of derivative | Modified name for the usual notion, emphasizing that it is the original notion and not the variation |
| --- | --- |
| one-sided derivative (left-hand derivative and right-hand derivative), defined on this page | two-sided derivative |
| higher derivative (obtained by repeated differentiation) | first derivative |
| partial derivative (derivative of a function of multiple variables with respect to one of the variables, holding the others constant) | ordinary derivative |
| discrete derivative (not commonly used) | continuous derivative (not commonly used) |

## Definition at a point

### Conceptual definition

Suppose $f$ is a function defined on a subset of the reals and $x_0$ is a point in the interior of the domain of $f$, i.e., the domain of $f$ contains an open interval surrounding $x_0$. The derivative of $f$ at $x_0$, denoted $\! f'(x_0)$, is the instantaneous rate of change of $f(x)$ with respect to $x$ at $x_0$. It is defined as the limit of the average rate of change of $f$ between $x$ and $x_0$, as $x$ approaches $x_0$.

In the more formal definitions below, we will see that:

• Difference quotient formalizes the notion of average rate of change.
• Derivative formalizes the notion of instantaneous rate of change, and is the limit of the difference quotient.

### Algebraic definition

Suppose $f$ is a function defined on a subset of the reals and $x_0$ is a point in the interior of the domain of $f$, i.e., the domain of $f$ contains an open interval surrounding $x_0$. The derivative (also called the first derivative) of $f$ at $x_0$, denoted $f'(x_0)$, is defined as the limit of the difference quotient of $f$ between $x_0$ and $x$, as $x \to x_0$. Explicitly:

$\! f'(x_0) := \lim_{x \to x_0} \Delta f(x,x_0) = \lim_{x \to x_0} \frac{f(x) - f(x_0)}{x - x_0}$

If this limit exists, then we say that the derivative exists and has this value, and we say that the function is differentiable at the point. If the limit does not exist, then we say that the function is not differentiable at the point and the derivative does not exist.

### Computationally useful version of algebraic definition

This is obtained from the previous definition by the variable substitution $h := x - x_0$ so $x = x_0 + h$. Explicitly:

$\! f'(x_0) := \lim_{h \to 0} \Delta f(x_0 + h,x_0) = \lim_{h \to 0} \frac{f(x_0 + h) - f(x_0)}{h}$
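As a numerical sketch (not part of the definition), the limit above can be approximated by evaluating the difference quotient for progressively smaller $h$. The helper name below is illustrative; here $f(x) = x^2$ at $x_0 = 3$, where the true derivative is $6$:

```python
def difference_quotient(f, x0, h):
    # Delta f(x0 + h, x0) = (f(x0 + h) - f(x0)) / h
    return (f(x0 + h) - f(x0)) / h

f = lambda x: x ** 2  # example function with known derivative f'(x) = 2x
x0 = 3.0

# As h shrinks, the quotients approach f'(3) = 6
# (for this f they equal 6 + h, up to floating point).
approximations = [difference_quotient(f, x0, h) for h in (1e-1, 1e-3, 1e-6)]
```

Note that this only approximates the limit with a fixed small $h$; it does not replace the limit itself.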

### Geometric definition

Suppose $f$ is a function and $x_0$ is a point in the interior of the domain of $f$, i.e., the domain of $f$ contains an open interval surrounding $x_0$. The derivative of $f$ at $x_0$ is the slope of the tangent line to the graph of $f$ through the point $(x_0,f(x_0))$.

### Algebraic definition elaborated in terms of epsilon-delta definition of limits

Suppose $f$ is a function defined on a subset of the reals and $x_0$ is a point in the interior of the domain of $f$, i.e., the domain of $f$ contains an open interval surrounding $x_0$. The derivative (also called first derivative) of $f$ at $x_0$, denoted $\! f'(x_0)$, is defined as a real number $L$ such that:

For every $\!\varepsilon > 0$
there exists $\!\delta > 0$ such that
if $\! 0 < |x - x_0| < \delta$
then $\! |f(x) - f(x_0) - L(x - x_0)| < \varepsilon|x - x_0|$.
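A small sketch of how this inequality can be checked concretely: for $f(x) = x^2$ the candidate $L = 2x_0$ works, since $|f(x) - f(x_0) - L(x - x_0)| = (x - x_0)^2 < \varepsilon|x - x_0|$ whenever $0 < |x - x_0| < \delta = \varepsilon$. The function name below is illustrative:

```python
def satisfies_condition(f, x0, L, eps, x):
    # The epsilon-delta condition: |f(x) - f(x0) - L(x - x0)| < eps * |x - x0|
    return abs(f(x) - f(x0) - L * (x - x0)) < eps * abs(x - x0)

f = lambda x: x ** 2
x0 = 3.0
L = 6.0        # candidate derivative value, L = 2 * x0

eps = 1e-2
delta = eps    # for f(x) = x^2 the choice delta = eps works, since
               # |f(x) - f(x0) - L(x - x0)| = (x - x0)^2 < eps * |x - x0|

# Spot-check a few points x with 0 < |x - x0| < delta.
ok = all(satisfies_condition(f, x0, L, eps, x0 + t)
         for t in (delta / 2, -delta / 2, delta / 10))
```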

## Definition as a function

Suppose $f$ is a function defined on a subset of the reals. Its derivative or first derivative, denoted $\! f'$, is a function defined as follows:

• The domain is the following subset of the domain of $\! f$: An element in the domain of $\! f$ is in the domain of $\! f'$ if and only if it is in the interior of the domain of $\! f$ and the derivative of $\! f$ exists at the point.
• The function value at any point in the domain is simply the value of the derivative of $\! f$ at that point.
Note on the way this definition is presented: we first present the version that deals with a specific point (typically marked with a $0$ subscript, as in $x_0$) in the domain of the relevant function, and then discuss the version that deals with a point that is free to move in the domain, obtained by dropping the subscript. Why do we do this?

The purpose of the specific-point version is to emphasize that the point is fixed for the duration of the definition, i.e., it does not move around while we are defining the construct or applying the fact. However, the definition applies not just to a single point but to all points satisfying certain criteria, so we gain further perspectives by varying the point under consideration. This is the purpose of the second, generic-point version.
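This passage from a pointwise value to a function can be sketched numerically. The (hypothetical) helper below takes $f$ and returns an approximation to $f'$ as a function, using a fixed small $h$ in place of a true limit:

```python
def derivative_function(f, h=1e-6):
    # Hypothetical numerical stand-in for the map f -> f': at each point x
    # it evaluates the difference quotient (f(x + h) - f(x)) / h with a
    # fixed small h rather than taking an actual limit.
    def f_prime(x):
        return (f(x + h) - f(x)) / h
    return f_prime

cube = lambda x: x ** 3          # its derivative is 3x^2
cube_prime = derivative_function(cube)
value_at_2 = cube_prime(2.0)     # close to 3 * 2^2 = 12
```

Unlike the true $f'$, this numerical version is defined wherever $f$ is defined near the point, so it does not capture the domain restriction in the definition above.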

## One-sided notions

### Left-hand derivative

Suppose $f$ is a function defined at a point $x_0 \in \R$ and also to the immediate left of $x_0$. The left-hand derivative of $f$ at $x_0$ is defined as the left-hand limit of the difference quotient of $f$ between $x$ and $x_0$. In other words, it is:

$\operatorname{LHD}(f)(x_0) = f'_-(x_0) := \lim_{x \to x_0^-}\Delta f(x,x_0) = \lim_{x \to x_0^-} \frac{f(x) - f(x_0)}{x - x_0} = \lim_{h \to 0^-} \frac{f(x_0 + h) - f(x_0)}{h}$

### Right-hand derivative

Suppose $f$ is a function defined at a point $x_0 \in \R$ and also to the immediate right of $x_0$. The right-hand derivative of $f$ at $x_0$ is defined as the right-hand limit of the difference quotient of $f$ between $x$ and $x_0$. In other words, it is:

$\operatorname{RHD}(f)(x_0) = f'_+(x_0) := \lim_{x \to x_0^+} \Delta f(x,x_0) = \lim_{x \to x_0^+} \frac{f(x) - f(x_0)}{x - x_0} = \lim_{h \to 0^+} \frac{f(x_0 + h) - f(x_0)}{h}$

### Relation between one-sided derivatives and the usual (two-sided) derivative

The derivative $f'(x_0)$ exists if and only if both the left-hand derivative and the right-hand derivative exist at $x_0$ and their values are equal. In that case, the derivative equals this common value.
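The standard example is $f(x) = |x|$ at $x_0 = 0$: the left-hand derivative is $-1$, the right-hand derivative is $+1$, so the two-sided derivative does not exist. A quick numerical sketch (the helper name is illustrative):

```python
def one_sided_quotient(f, x0, h):
    # Difference quotient (f(x0 + h) - f(x0)) / h; negative h probes the
    # left-hand limit, positive h the right-hand limit.
    return (f(x0 + h) - f(x0)) / h

f = abs   # f(x) = |x| has a corner at 0

lhd_estimate = one_sided_quotient(f, 0.0, -1e-8)   # left-hand quotient: -1
rhd_estimate = one_sided_quotient(f, 0.0, +1e-8)   # right-hand quotient: +1
# The one-sided values disagree, so f'(0) does not exist.
```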

## Leibniz notation for derivative

The Leibniz notation for derivative views the derivative as the relative rate of change of two variables and is thus a somewhat different perspective on the derivative.

Suppose $f$ is a function, and $x,y$ are variables related by $y := f(x)$. Here, $x$ is an independent variable and $y$ is the dependent variable (with the dependency being described by the function $f$). We then define:

$\frac{dy}{dx} := f'(x)$

In particular, $dy/dx$ is a function of $x$. Its value at $x = x_0$ is defined as $f'(x_0)$ and is denoted as follows:

$\! \frac{dy}{dx} |_{x = x_0} := f'(x_0)$

Note that the $dy/dx$ notation does not mean that a number $dy$ is being divided by a number $dx$. One way of justifying this notation is by expressing it as a limit of a difference quotient; here, $y_0 = f(x_0)$:

$\! \frac{dy}{dx} |_{x = x_0} := \lim_{x \to x_0} \frac{y - y_0}{x - x_0} = \lim_{x \to x_0} \frac{\Delta y}{\Delta x}$

where $\Delta y = y - y_0$ denotes the difference in $y$-values and $\Delta x = x - x_0$ denotes the difference in $x$-values.
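Computer algebra systems mirror this notation directly. As a sketch (assuming the SymPy library is available), `sympy.diff` produces $dy/dx$ as an expression in $x$, and `subs` evaluates it at a point, matching $dy/dx|_{x = x_0}$:

```python
import sympy as sp

x = sp.symbols('x')
y = x ** 2                      # y = f(x) with f(x) = x^2

dy_dx = sp.diff(y, x)           # dy/dx = 2x, itself a function of x
value_at_3 = dy_dx.subs(x, 3)   # dy/dx evaluated at x = 3, i.e. f'(3) = 6
```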

### Quotient notation is misleading but salvageable

The difference quotient is actually a quotient of numbers, and the derivative is a limit of this. Hence, many of the formal manipulations involving fractions of numbers work with this notation, even though $dy/dx$ itself is not a quotient of numbers (see chain rule for differentiation and inverse function theorem).

### Expressive advantage of Leibniz notation

The Leibniz notation is advantageous for carrying out computations by hand and writing derivative expressions because it does not require us to name every function in order to differentiate it. On the other hand, the prime notation requires us to name a function before we can talk of its derivatives.

Thus, the Leibniz notation is crucial for constructing complicated expressions involving derivatives. For instance, consider the expression:

$\frac{d}{dx}\left[\left(\frac{d}{dx}(x - \cos x)\right)\sin^2\left(\frac{d}{dx}(x^2 \cos(x^3))\right)\right]$

In order to write this expression in the prime notation, we would first need to give names to the functions $x - \cos x$ and $x^2 \cos(x^3)$, then give a name to the entire expression within the square brackets, and only then talk of differentiating it.

### Expressive disadvantage of Leibniz notation

The Leibniz notation is not point-free, i.e., we have to use a symbol to denote the point at which the function is being applied. In contrast, with the prime notation, we can make statements like $\! \sin' = \cos$.

## Physical applications

### Note on units

In applications to the natural and social sciences, the units used for measuring $dy/dx$ are the units used for measuring $y$ divided by the units used for measuring $x$. This is because the derivative is a limit of a difference quotient, which is a quantity measured in units of $y$ divided by a quantity measured in units of $x$.

If the dimensions are expressed using a framework such as the MLT framework for physical quantities, then the MLT exponents subtract.
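For instance, with position $s$ measured in length ($L$) and time $t$ measured in time ($T$), the velocity $ds/dt$ and the acceleration $dv/dt$ from the table below have dimensions:

$\! \left[\frac{ds}{dt}\right] = \frac{L}{T} = LT^{-1}, \qquad \left[\frac{dv}{dt}\right] = \frac{LT^{-1}}{T} = LT^{-2}$

so each differentiation with respect to time lowers the $T$-exponent by one, illustrating how the exponents subtract.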

| Context | Example of derivative from real-world application | Functionally dependent variable being differentiated | Independent variable in terms of which differentiation is happening | Corresponding difference quotient | Term for numerator of difference quotient | Term for denominator of difference quotient | Comment |
| --- | --- | --- | --- | --- | --- | --- | --- |
| kinematics (classical mechanics, physics) | instantaneous velocity, measured in units of length/time | position, measured in units of length | time, measured in units of time | average velocity | displacement | time elapsed | strictly speaking, this is a vector-valued derivative, but we can use single variable calculus if we restrict to motion along a straight line |
| kinematics (classical mechanics, physics) | instantaneous acceleration, measured in units of length/(time)^2 | velocity, measured in units of length/time | time, measured in units of time | average acceleration | change in velocity | time elapsed | strictly speaking, this is a vector-valued derivative, but we can use single variable calculus if we restrict to motion along a straight line |
| chemical reaction (chemistry) | rate of change of concentration of a particular reaction product, measured in units of (concentration measure)/(time) | concentration; a suitable concentration measure could be molarity (for reactions in solution) or partial pressure (for gaseous reactions) | time, measured in units of time | average rate of change of concentration of the product, measured in units of (concentration measure)/(time) | change in concentration of product | time elapsed | |

## Related notions

| Notion | How it relates to the derivative |
| --- | --- |
| higher derivative | differentiate again the function obtained by differentiating a particular function, and apply this process repeatedly; specifically, the $k^{th}$ derivative is the function obtained by applying the differentiation operation $k$ times |
| antiderivative | a function that has the given derivative; antidifferentiation is the reverse of differentiation, and the general expression for the antiderivative is also called the indefinite integral |
| partial derivative | a function of more than one variable is differentiated with respect to one of the variables, keeping the others constant |
| higher partial derivative | obtained by repeatedly applying the partial differentiation operation to a function of more than one variable; the pure higher partials are those where all the partial differentiation operations are with respect to the same variable, while the mixed higher partials involve partial differentiation with respect to more than one variable |
| differential | Fill this in later |

## Significance

### Significance of sign on intervals

The derivative represents the rate of change, and roughly speaking, the sign of derivative represents the direction of change. We list the loose and precise statements below:

| Loose statement | Precise versions |
| --- | --- |
| increasing function means positive derivative | positive derivative implies increasing<br>nonnegative derivative that is not identically zero on any interval implies increasing<br>increasing and differentiable implies nonnegative derivative that is not identically zero on any interval |
| decreasing function means negative derivative | negative derivative implies decreasing<br>decreasing and differentiable implies nonpositive derivative that is not identically zero on any interval |
| constant function means zero derivative | constant function implies zero derivative<br>zero derivative implies locally constant |
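The distinction between "positive derivative" and "nonnegative derivative that is not identically zero on any interval" matters: $f(x) = x^3$ is increasing on all of $\R$ even though $f'(0) = 0$. A quick sketch checking this on sample points:

```python
f = lambda x: x ** 3                 # increasing on all of R
f_prime = lambda x: 3 * x ** 2       # closed-form derivative, >= 0 everywhere

xs = [i / 10 for i in range(-20, 21)]   # sample points in [-2, 2], including 0

# f is strictly increasing on the sample...
increasing_on_sample = all(f(a) < f(b) for a, b in zip(xs, xs[1:]))
# ...while f' is nonnegative and vanishes at the single point x = 0.
derivative_nonnegative = all(f_prime(t) >= 0 for t in xs)
derivative_at_zero = f_prime(0)
```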

### Significance of sign at points

This is quite similar to the significance on an interval, but the behavior at individual points can be anomalous and can also represent transitions between different kinds of intervals.

| Loose statement | Precise versions |
| --- | --- |
| comparison of the function value with points on the immediate left/right tells us the sign of the one-sided derivative, and vice versa | local maximum from the left implies the left-hand derivative is nonnegative, if it exists<br>local maximum from the right implies the right-hand derivative is nonpositive, if it exists<br>local minimum from the left implies the left-hand derivative is nonpositive, if it exists<br>local minimum from the right implies the right-hand derivative is nonnegative, if it exists |
| a local maximum/minimum value must occur at a critical point, which is a point where the derivative is zero or does not exist; moreover, the sign of the derivative on the immediate left and right helps determine whether it is a local max, min, or neither | first derivative test; see also second derivative test and higher derivative tests |
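A sketch of the first derivative test on the standard example $f(x) = x^3 - 3x$: here $f'(x) = 3x^2 - 3$ vanishes at $x = \pm 1$, and the sign of $f'$ on either side classifies each critical point (sample points are illustrative):

```python
f_prime = lambda x: 3 * x ** 2 - 3   # derivative of f(x) = x^3 - 3x
sign = lambda t: (t > 0) - (t < 0)   # -1, 0, or +1

# f' vanishes at the critical points x = -1 and x = 1.
# Sign of f' just to the left and right of each critical point:
around_minus_one = (sign(f_prime(-1.1)), sign(f_prime(-0.9)))  # (+1, -1): local max
around_plus_one = (sign(f_prime(0.9)), sign(f_prime(1.1)))     # (-1, +1): local min
```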

## Computation of derivative

### List of most commonly used rules

For a full list, see Category:Differentiation rules.

| Method for constructing new functions from old | In symbols | Derivative in terms of the old functions and their derivatives | Proof |
| --- | --- | --- | --- |
| pointwise sum | $f + g$ is the function $x \mapsto f(x) + g(x)$; more generally, $f_1 + f_2 + \dots + f_n$ is the function $x \mapsto f_1(x) + f_2(x) + \dots + f_n(x)$ | sum of the derivatives of the functions being added (the derivative of the sum is the sum of the derivatives): $\! f' + g'$, respectively $\! f_1' + f_2' + \dots + f_n'$ | differentiation is linear |
| pointwise difference | $f - g$ is the function $x \mapsto f(x) - g(x)$ | difference of the derivatives, i.e., $f' - g'$ | differentiation is linear |
| scalar multiple by a constant | $af$ is the function $x \mapsto af(x)$, where $a$ is a real number | $x \mapsto af'(x)$ | differentiation is linear |
| pointwise product | $f \cdot g$ (sometimes denoted $fg$) is the function $x \mapsto f(x)g(x)$; more generally, $f_1 \cdot f_2 \cdot \dots \cdot f_n$ (sometimes denoted $f_1f_2\dots f_n$) is the function $x \mapsto f_1(x)f_2(x) \dots f_n(x)$ | for two functions, $x \mapsto f'(x)g(x) + f(x)g'(x)$; for multiple functions, $x \mapsto f_1'(x)f_2(x) \dots f_n(x) + f_1(x)f_2'(x) \dots f_n(x) + \dots + f_1(x)f_2(x) \dots f_n'(x)$ | product rule for differentiation |
| pointwise quotient | $f/g$ is the function $x \mapsto f(x)/g(x)$ | $x \mapsto \frac{g(x)f'(x) - f(x)g'(x)}{(g(x))^2}$ | quotient rule for differentiation |
| composite of two functions | $f \circ g$ is the function $x \mapsto f(g(x))$ | $x \mapsto f'(g(x))g'(x)$ | chain rule for differentiation |
| inverse function of a one-one function | $f^{-1}$ sends $x$ to the unique $y$ such that $f(y) = x$ | $x \mapsto \frac{1}{f'(f^{-1}(x))}$ | inverse function theorem |
| piecewise definition | $f(x) := \left\lbrace \begin{array}{rl} f_1(x), & x < c \\ f_2(x), & x > c \\ v, & x = c \end{array}\right.$ where $f_1, f_2$ can be extended to differentiable functions on all reals | $f' = f_1'$ to the left of $c$ and $f' = f_2'$ to the right of $c$; at $c$, $f$ is differentiable iff $f_1(c) = f_2(c) = v$ and $f_1'(c) = f_2'(c)$ | differentiation rule for piecewise definition by interval |

### Significance of differentiation rules

The list of differentiation rules is fairly complete with respect to the most typical ways of constructing functions from other functions. This means that if we have explicit expressions for the derivatives of a collection of functions, we can obtain explicit expressions for the derivatives of any other function constructed from them via any of the methods covered in the table above (pointwise sum, pointwise difference, scalar multiple, pointwise product, pointwise quotient, composite, inverse, and piecewise definition).

Computer programs that implement symbolic mathematics (such as Mathematica) generally have all these rules coded in, and are able to use them to differentiate any function constructed by applying these operations to functions whose derivative is already known by the program.
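As a small illustration of such a program applying the rules mechanically (assuming the SymPy library is available), the examples below exercise the chain rule and the product rule from the table above:

```python
import sympy as sp

x = sp.symbols('x')

# Chain rule: d/dx sin(x^2) = 2x cos(x^2)
chain = sp.diff(sp.sin(x ** 2), x)

# Product rule: d/dx (x sin x) = sin(x) + x cos(x)
product = sp.diff(x * sp.sin(x), x)
```

The program never numerically approximates a limit here; it recursively decomposes each expression by the construction methods in the table and applies the matching rule.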