Jacobian matrix

This article describes the analogue, for functions of multiple variables, of the following notion for functions of one variable: derivative

Importance

The Jacobian matrix is the appropriate notion of derivative for a function that has multiple inputs (or equivalently, vector-valued inputs) and multiple outputs (or equivalently, vector-valued outputs).

Definition at a point

Direct epsilon-delta definition
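As a sketch of the standard epsilon-delta formulation, using the notation of the sections below (the precise wording here is supplied for reference and is an assumption, not a quotation): suppose $f$ has $n$-dimensional inputs and $m$-dimensional outputs, and $\mathbf{a}$ is an interior point of its domain. Then $f$ is differentiable at $\mathbf{a}$ with Jacobian matrix $J_f(\mathbf{a})$ (a $m \times n$ matrix) if, for every $\epsilon > 0$, there exists $\delta > 0$ such that:

$$\| f(\mathbf{a} + \mathbf{h}) - f(\mathbf{a}) - J_f(\mathbf{a})\,\mathbf{h} \| \le \epsilon \|\mathbf{h}\| \quad \text{whenever } 0 < \|\mathbf{h}\| < \delta$$

Equivalently, $f(\mathbf{a} + \mathbf{h}) = f(\mathbf{a}) + J_f(\mathbf{a})\,\mathbf{h} + o(\|\mathbf{h}\|)$ as $\mathbf{h} \to \mathbf{0}$.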

Definition at a point in terms of gradient vectors as row vectors

Suppose $f$ is a vector-valued function with $n$-dimensional inputs and $m$-dimensional outputs. Explicitly, suppose $f$ is a function with inputs $x_1, x_2, \dots, x_n$ and outputs $f_1, f_2, \dots, f_m$ (each a real-valued function of $x_1, x_2, \dots, x_n$). Suppose $\mathbf{a} = (a_1, a_2, \dots, a_n)$ is a point in the domain of $f$ such that $f_i$ is differentiable at $\mathbf{a}$ for each $1 \le i \le m$. Then, the Jacobian matrix of $f$ at $\mathbf{a}$ is a $m \times n$ matrix of numbers whose $i^{\text{th}}$ row is given by the gradient vector of $f_i$ at $\mathbf{a}$.

Explicitly, in terms of rows, it looks like:

$$J_f(\mathbf{a}) = \begin{pmatrix} (\nabla f_1)(\mathbf{a}) \\ (\nabla f_2)(\mathbf{a}) \\ \vdots \\ (\nabla f_m)(\mathbf{a}) \end{pmatrix}$$

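As a concrete illustration (the specific function and point below are chosen for this example and are not part of the original statement), take $f(x_1, x_2) = (x_1^2 x_2, \ x_1 + \sin x_2)$, so $n = m = 2$, and take $\mathbf{a} = (1, 0)$. The gradient vectors are $\nabla f_1 = (2 x_1 x_2, \ x_1^2)$ and $\nabla f_2 = (1, \ \cos x_2)$, so:

$$J_f(1, 0) = \begin{pmatrix} (\nabla f_1)(1, 0) \\ (\nabla f_2)(1, 0) \end{pmatrix} = \begin{pmatrix} 0 & 1 \\ 1 & 1 \end{pmatrix}$$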

Definition at a point in terms of partial derivatives

Suppose $f$ is a vector-valued function with $n$-dimensional inputs and $m$-dimensional outputs. Explicitly, suppose $f$ is a function with inputs $x_1, x_2, \dots, x_n$ and outputs $f_1, f_2, \dots, f_m$. Suppose $\mathbf{a} = (a_1, a_2, \dots, a_n)$ is a point in the domain of $f$ such that $f_i$ is differentiable at $\mathbf{a}$ for each $1 \le i \le m$. Then, the Jacobian matrix of $f$ at $\mathbf{a}$ is a $m \times n$ matrix of numbers whose $(i,j)^{\text{th}}$ entry (the entry in row $i$ and column $j$) is given by:

$$\frac{\partial f_i}{\partial x_j}(\mathbf{a})$$

Here's how the matrix looks:

$$J_f(\mathbf{a}) = \begin{pmatrix} \dfrac{\partial f_1}{\partial x_1}(\mathbf{a}) & \dfrac{\partial f_1}{\partial x_2}(\mathbf{a}) & \cdots & \dfrac{\partial f_1}{\partial x_n}(\mathbf{a}) \\ \dfrac{\partial f_2}{\partial x_1}(\mathbf{a}) & \dfrac{\partial f_2}{\partial x_2}(\mathbf{a}) & \cdots & \dfrac{\partial f_2}{\partial x_n}(\mathbf{a}) \\ \vdots & \vdots & \ddots & \vdots \\ \dfrac{\partial f_m}{\partial x_1}(\mathbf{a}) & \dfrac{\partial f_m}{\partial x_2}(\mathbf{a}) & \cdots & \dfrac{\partial f_m}{\partial x_n}(\mathbf{a}) \end{pmatrix}$$

Note that for this definition to be correct, it is still necessary that the gradient vectors exist, i.e., that each $f_i$ is differentiable at $\mathbf{a}$. If the gradient vectors do not exist but the partial derivatives do, a matrix can still be constructed using this recipe, but it need not exhibit the nice behavior that the Jacobian matrix does.

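The entry-by-entry recipe can be checked symbolically. Below is a minimal sketch using SymPy, applied to the illustrative function and point from the example above (the function is an assumption chosen for illustration); SymPy's Matrix.jacobian assembles exactly the matrix of partial derivatives described here.

<syntaxhighlight lang="python">
import sympy as sp

x1, x2 = sp.symbols('x1 x2')

# Illustrative component functions (assumed example, not from this article)
f = sp.Matrix([x1**2 * x2, x1 + sp.sin(x2)])

# Entry (i, j) of the Jacobian is the partial derivative of f_i with respect to x_j
J = f.jacobian([x1, x2])
print(J)                        # Matrix([[2*x1*x2, x1**2], [1, cos(x2)]])

# Evaluate at the point a = (1, 0)
print(J.subs({x1: 1, x2: 0}))   # Matrix([[0, 1], [1, 1]])
</syntaxhighlight>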

Definition as a function

Definition in terms of gradient vectors as row vectors

Suppose $f$ is a vector-valued function with $n$-dimensional inputs and $m$-dimensional outputs. Explicitly, suppose $f$ is a function with inputs $x_1, x_2, \dots, x_n$ and outputs $f_1, f_2, \dots, f_m$. Then, the Jacobian matrix of $f$ is a $m \times n$ matrix of functions whose $i^{\text{th}}$ row is given by the gradient vector of $f_i$. Explicitly, it looks like this:

$$J_f = \begin{pmatrix} \nabla f_1 \\ \nabla f_2 \\ \vdots \\ \nabla f_m \end{pmatrix}$$

Note that the domain of this function is the set of points at which all the $f_i$s individually are differentiable.
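Continuing the illustrative function used earlier (again an assumed example, not part of the original text), $f(x_1, x_2) = (x_1^2 x_2, \ x_1 + \sin x_2)$ has Jacobian matrix, as a matrix of functions:

$$J_f = \begin{pmatrix} \nabla f_1 \\ \nabla f_2 \end{pmatrix} = \begin{pmatrix} 2 x_1 x_2 & x_1^2 \\ 1 & \cos x_2 \end{pmatrix}$$

Here both component functions are differentiable everywhere, so the domain of $J_f$ is all of $\mathbb{R}^2$.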

Definition in terms of partial derivatives

Suppose $f$ is a vector-valued function with $n$-dimensional inputs and $m$-dimensional outputs. Explicitly, suppose $f$ is a function with inputs $x_1, x_2, \dots, x_n$ and outputs $f_1, f_2, \dots, f_m$. Then, the Jacobian matrix of $f$ is a $m \times n$ matrix of functions whose $(i,j)^{\text{th}}$ entry is given by:

$$\frac{\partial f_i}{\partial x_j}$$

wherever all the $f_i$s individually are differentiable in the sense of the gradient vectors existing. Here's how the matrix looks:

$$J_f = \begin{pmatrix} \dfrac{\partial f_1}{\partial x_1} & \dfrac{\partial f_1}{\partial x_2} & \cdots & \dfrac{\partial f_1}{\partial x_n} \\ \dfrac{\partial f_2}{\partial x_1} & \dfrac{\partial f_2}{\partial x_2} & \cdots & \dfrac{\partial f_2}{\partial x_n} \\ \vdots & \vdots & \ddots & \vdots \\ \dfrac{\partial f_m}{\partial x_1} & \dfrac{\partial f_m}{\partial x_2} & \cdots & \dfrac{\partial f_m}{\partial x_n} \end{pmatrix}$$

If the gradient vectors do not exist but the partial derivatives do, a matrix can still be constructed using this recipe, but it need not exhibit the nice behavior that the Jacobian matrix does.

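As a rough numerical sanity check of the entry-by-entry description, each entry $\partial f_i / \partial x_j$ can be approximated by a finite difference. The sketch below uses a forward difference with an arbitrary step size and the same illustrative function as above (both are assumptions made for the example, not part of the article).

<syntaxhighlight lang="python">
from math import sin

# Illustrative vector-valued function (assumed example, not from this article)
def f(x1, x2):
    return [x1**2 * x2, x1 + sin(x2)]

def numerical_jacobian(func, point, h=1e-6):
    """Forward-difference approximation of the Jacobian matrix of func at point."""
    base = func(*point)
    J = []
    for i in range(len(base)):           # one row per output f_i
        row = []
        for j in range(len(point)):      # one column per input x_j
            shifted = list(point)
            shifted[j] += h
            row.append((func(*shifted)[i] - base[i]) / h)
        J.append(row)
    return J

print(numerical_jacobian(f, (1.0, 0.0)))  # approximately [[0, 1], [1, 1]]
</syntaxhighlight>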

Particular cases

Case | What happens in that case?
$m = n = 1$, so $f$ is a real-valued function of one variable. | The Jacobian matrix is a $1 \times 1$ matrix whose unique entry is the ordinary derivative.
$n = 1$, so $f$ is a vector-valued function of one variable. We can think of it as a parametric curve in $\mathbb{R}^m$. | The Jacobian matrix is a $m \times 1$ matrix which, read as a column vector, is the parametric derivative of the vector-valued function.
$m = 1$, so $f$ is a real-valued function of multiple variables. | The Jacobian matrix is a $1 \times n$ matrix which, read as a row vector, is the gradient vector function.
$f$ is a linear or affine map. | The Jacobian matrix is the same as the matrix describing $f$ (or, if $f$ is affine, the matrix describing the linear part of $f$).
$m = n$, and we are identifying the spaces of inputs and outputs of $f$. | The Jacobian matrix can then be thought of as a linear self-map from the $n$-dimensional space to itself. In this context, we can consider the Jacobian determinant.
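For instance, the linear-map row of the table can be checked directly (writing $A = (a_{ij})$ for the matrix of the map; this notation is introduced here only for the check):

$$f(\mathbf{x}) = A\mathbf{x} \ \Rightarrow\ f_i(x_1, \dots, x_n) = \sum_{j=1}^{n} a_{ij} x_j \ \Rightarrow\ \frac{\partial f_i}{\partial x_j} = a_{ij}$$

so the Jacobian matrix of $f$ at every point is $A$ itself.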