Hessian matrix

From Calculus
{{multivariable analogue of|second derivative}}
==Definition==
 
===Definition in terms of Jacobian matrix and gradient vector===
 
Suppose <math>f</math> is a real-valued function of <math>n</math> variables <math>x_1,x_2,\dots,x_n</math>. The '''Hessian matrix''' of <math>f</math> is an <math>n \times n</math>-matrix-valued function with [[domain]] a subset of the domain of <math>f</math>, defined as follows: the Hessian matrix at any point in the domain is the [[Jacobian matrix]] of the [[gradient vector]] of <math>f</math> at that point. In point-free notation, we denote by <math>H(f)</math> the Hessian matrix function, and we define it as:
 
<math>H(f) = J(\nabla f)</math>
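This relation can be checked directly in a computer algebra system. The sketch below uses SymPy with an arbitrary example function (not taken from this article): it forms the gradient vector, takes its Jacobian, and compares the result against SymPy's built-in Hessian.

```python
import sympy as sp

x, y = sp.symbols('x y')
f = x**3 * y + sp.exp(x * y)  # arbitrary example function

# Gradient vector of f, written as a column matrix
grad_f = sp.Matrix([sp.diff(f, x), sp.diff(f, y)])

# Jacobian matrix of the gradient: this realizes H(f) = J(grad f)
H_via_jacobian = grad_f.jacobian([x, y])

# SymPy's built-in Hessian, for comparison
H_direct = sp.hessian(f, [x, y])

assert sp.simplify(H_via_jacobian - H_direct) == sp.zeros(2, 2)
```

The assertion passes because both constructions produce the same matrix of second-order partials.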
 
===Interpretation as second derivative===
 
The Hessian matrix function is the correct notion of second derivative for a real-valued function of <math>n</math> variables. Here's why:
 
* The correct notion of ''first'' derivative for a scalar-valued function of multiple variables is the [[gradient vector]], so the correct notion of first derivative for <math>f</math> is <math>\nabla f</math>.
* The gradient vector <math>\nabla f</math> is itself a vector-valued function with <math>n</math>-dimensional inputs and <math>n</math>-dimensional outputs. The correct notion of derivative for ''that'' is the [[Jacobian matrix]], with <math>n</math>-dimensional inputs and outputs valued in <math>n \times n</math>-matrices.
 
Thus, the Hessian matrix is the correct notion of second derivative.
 
===Definition in terms of second-order partial derivatives===
 
{{further|[[Relation between Hessian matrix and second-order partial derivatives]]}}
 
Wherever the Hessian matrix for a function exists, its entries can be described as second-order partial derivatives of the function. Explicitly, if <math>f</math> is a real-valued function of <math>n</math> variables <math>x_1,x_2,\dots,x_n</math>, the Hessian matrix <math>H(f)</math> is an <math>n \times n</math>-matrix-valued function whose <math>(ij)^{th}</math> entry is the second-order partial derivative <math>\frac{\partial^2 f}{\partial x_j \, \partial x_i}</math>, which is the same as <math>f_{x_ix_j}</math>. Note that the diagonal entries give second-order pure partial derivatives whereas the off-diagonal entries give [[second-order mixed partial derivative]]s.
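When no symbolic expression is available, each such entry can be estimated numerically from this description. The sketch below is plain Python; the example function <math>f(x,y) = x^2 y + \sin y</math>, the point, and the step size are illustrative choices, not from the article. It approximates one mixed partial with a central difference.

```python
import math

def hessian_entry(f, p, i, j, h=1e-5):
    """Central-difference estimate of the second-order partial of f
    with respect to x_i and x_j, evaluated at the point p."""
    def shift(q, k, d):
        r = list(q)
        r[k] += d
        return tuple(r)
    return (f(shift(shift(p, i, h), j, h))
            - f(shift(shift(p, i, h), j, -h))
            - f(shift(shift(p, i, -h), j, h))
            + f(shift(shift(p, i, -h), j, -h))) / (4 * h * h)

# Hypothetical example: f(x, y) = x^2 y + sin(y), so f_xy = 2x
f = lambda p: p[0] ** 2 * p[1] + math.sin(p[1])
print(round(hessian_entry(f, (1.0, 0.0), 0, 1), 4))  # ≈ 2.0, since 2x = 2 at x = 1
```

The truncation error of this stencil is <math>O(h^2)</math>, so the estimate agrees with the exact value <math>2</math> to several decimal places.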
 
==Computationally useful definition at a point==


===For a function of two variables at a point===


Suppose <math>f</math> is a real-valued function of two variables <math>x,y</math> and <math>(x_0,y_0)</math> is a point in the domain of <math>f</math> at which <math>f</math> is twice differentiable. In particular, this means that all four second-order partial derivatives exist at <math>(x_0,y_0)</math>, i.e., the two pure second-order partials <math>f_{xx}(x_0,y_0),f_{yy}(x_0,y_0)</math> exist, and so do the two [[second-order mixed partial derivative]]s <math>f_{xy}(x_0,y_0)</math> and <math>f_{yx}(x_0,y_0)</math>. Then, the Hessian matrix of <math>f</math> at <math>(x_0,y_0)</math>, denoted <math>H(f)(x_0,y_0)</math>, can be expressed explicitly as a <math>2 \times 2</math> matrix of real numbers defined as follows:


<math>H(f)(x_0,y_0) = \begin{pmatrix} f_{xx}(x_0,y_0) & f_{xy}(x_0,y_0) \\ f_{yx}(x_0,y_0) & f_{yy}(x_0,y_0) \end{pmatrix}</math>
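As a concrete illustration of this matrix, the SymPy sketch below evaluates the four second-order partials at a point; the function <math>f(x,y) = x^2 y + y^3</math> and the point <math>(1,2)</math> are hypothetical choices, not from the article.

```python
import sympy as sp

x, y = sp.symbols('x y')
f = x**2 * y + y**3            # hypothetical example function

H = sp.hessian(f, [x, y])      # matrix of all four second-order partials
H_at_point = H.subs({x: 1, y: 2})

# f_xx = 2y, f_xy = f_yx = 2x, f_yy = 6y
print(H_at_point)  # Matrix([[4, 2], [2, 12]])
```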
===For a function of multiple variables at a point===


Suppose <math>f</math> is a real-valued function of multiple variables <math>(x_1,x_2,\dots,x_n)</math>. Suppose <math>(a_1,a_2,\dots,a_n)</math> is a point in the domain of <math>f</math> at which <math>f</math> is twice differentiable. In other words, <math>a_1,a_2,\dots,a_n</math> are real numbers and the point has coordinates <math>x_1 = a_1, x_2 = a_2, \dots,x_n = a_n</math>. Suppose, further, that all the second-order partials (pure and mixed) of <math>f</math> with respect to these variables exist at the point <math>(a_1,a_2,\dots,a_n)</math>. Then, the Hessian matrix of <math>f</math> at <math>(a_1,a_2,\dots,a_n)</math>, denoted <math>H(f)(a_1,a_2,\dots,a_n)</math>, is an <math>n \times n</math> matrix of real numbers that can be expressed explicitly as follows:


The <math>(ij)^{th}</math> entry (i.e., the entry in the <math>i^{th}</math> row and <math>j^{th}</math> column) is <math>f_{x_ix_j}(a_1,a_2,\dots,a_n)</math>. This is the same as <math>\frac{\partial^2}{\partial x_j \partial x_i}f(x_1,x_2,\dots,x_n)|_{(x_1,x_2,\dots,x_n) = (a_1,a_2,\dots,a_n)}</math>. Note that in the two notations, the order in which we write the partials differs because the convention differs (left-to-right versus right-to-left). The matrix looks like this:

<math>\begin{pmatrix} f_{x_1x_1} & f_{x_1x_2} & \dots & f_{x_1x_n} \\ f_{x_2x_1} & f_{x_2x_2} & \dots & f_{x_2x_n} \\ \vdots & \vdots & \ddots & \vdots \\ f_{x_nx_1} & f_{x_nx_2} & \dots & f_{x_nx_n} \end{pmatrix}</math>

where all the second-order partials are evaluated at the point <math>(a_1,a_2,\dots,a_n)</math>.
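The indexing convention (row <math>i</math>, column <math>j</math> holds <math>f_{x_ix_j}</math>) can be spot-checked symbolically. The SymPy sketch below uses a hypothetical three-variable example function, not one from the article.

```python
import sympy as sp

x1, x2, x3 = sp.symbols('x1 x2 x3')
f = x1 * x2**2 + x2 * x3**3    # hypothetical 3-variable example

H = sp.hessian(f, [x1, x2, x3])

# Row i, column j holds f_{x_i x_j}; e.g. the (1,2) entry is f_{x1 x2} = 2*x2
assert H[0, 1] == sp.diff(f, x1, x2)
assert H.shape == (3, 3)
```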

Revision as of 16:12, 12 May 2012


==Definition as a function==

===For a function of two variables===

Suppose <math>f</math> is a real-valued function of two variables <math>x,y</math>. The Hessian matrix of <math>f</math>, denoted <math>H(f)</math>, is a matrix-valued function that sends each point <math>(x,y)</math> to the Hessian matrix at that point, if that matrix is defined. It is defined as:

<math>H(f)(x,y) = \begin{pmatrix} f_{xx}(x,y) & f_{xy}(x,y) \\ f_{yx}(x,y) & f_{yy}(x,y) \end{pmatrix}</math>

In the point-free notation, we can write this as:

<math>H(f) = \begin{pmatrix} f_{xx} & f_{xy} \\ f_{yx} & f_{yy} \end{pmatrix}</math>

===For a function of multiple variables===

Suppose <math>f</math> is a function of <math>n</math> variables <math>x_1,x_2,\dots,x_n</math>. The Hessian matrix of <math>f</math>, denoted <math>H(f)</math>, is a matrix-valued function that sends each point <math>(x_1,x_2,\dots,x_n)</math> to the Hessian matrix at that point, if the matrix is defined. It is defined as the matrix whose <math>(ij)^{th}</math> entry is the function <math>f_{x_ix_j}</math>.

In the point-free notation, we can write it as:

<math>H(f) = \begin{pmatrix} f_{x_1x_1} & f_{x_1x_2} & \dots & f_{x_1x_n} \\ f_{x_2x_1} & f_{x_2x_2} & \dots & f_{x_2x_n} \\ \vdots & \vdots & \ddots & \vdots \\ f_{x_nx_1} & f_{x_nx_2} & \dots & f_{x_nx_n} \end{pmatrix}</math>

==Under continuity assumptions==

If we assume that all the second-order partials of <math>f</math> are continuous functions everywhere, then the following happens:

* The Hessian matrix of <math>f</math> at any point is a symmetric matrix, i.e., its <math>(ij)^{th}</math> entry equals its <math>(ji)^{th}</math> entry. This follows from [[Clairaut's theorem on equality of mixed partials]].
* We can think of the Hessian matrix as the second derivative of the function, i.e., it is a matrix-valued function describing the second derivative.
* <math>f</math> is twice differentiable as a function. Hence, the Hessian matrix of <math>f</math> is the same as the Jacobian matrix of the gradient vector <math>\nabla f</math>, where the latter is viewed as a vector-valued function.

Note that the final conclusion actually only requires the existence of the gradient vector <math>\nabla f</math>, hence it holds even if the second-order partials are not continuous.
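The symmetry conclusion can be verified symbolically. The SymPy sketch below uses an arbitrary smooth example function (so all of its second-order partials are continuous) and confirms that the Hessian equals its transpose, as Clairaut's theorem predicts.

```python
import sympy as sp

x, y = sp.symbols('x y')
f = sp.exp(x * y) * sp.cos(y)  # arbitrary smooth function: all second partials continuous

H = sp.hessian(f, [x, y])

# Clairaut's theorem: f_xy = f_yx, so H is symmetric
assert sp.simplify(H - H.T) == sp.zeros(2, 2)
```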