In vector calculus, the Jacobian matrix is the matrix of all first-order partial derivatives of a vector-valued function. Its importance lies in the fact that it represents the best linear approximation to a differentiable function near a given point.
The Jacobian matrix is named after the mathematician Carl Gustav Jacobi; the term "Jacobian" is pronounced as "yah-KO-bee-un".
Suppose F : R^{n} → R^{m} is a function from Euclidean n-space to Euclidean m-space. Such a function is given by m real-valued component functions, y_{1}(x_{1},...,x_{n}), ..., y_{m}(x_{1},...,x_{n}). The partial derivatives of all these functions (if they exist) can be organized in an m-by-n matrix, the Jacobian matrix of F, whose entry in row i and column j is ∂y_{i}/∂x_{j}:

            [ ∂y_{1}/∂x_{1}  ...  ∂y_{1}/∂x_{n} ]
    J_{F} = [      ...       ...       ...      ]
            [ ∂y_{m}/∂x_{1}  ...  ∂y_{m}/∂x_{n} ]
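When the partial derivatives are not available analytically, each entry of the matrix can be approximated numerically. The sketch below (the helper `jacobian` is hypothetical, not part of the article) builds the m-by-n matrix by forward differences, with entry (i, j) approximating ∂y_{i}/∂x_{j}:

```python
import math

# Forward-difference sketch of the Jacobian matrix: entry (i, j) is
# (y_i(x + h*e_j) - y_i(x)) / h, an approximation of the partial
# derivative of component y_i with respect to x_j.
def jacobian(F, x, h=1e-6):
    Fx = F(x)
    m, n = len(Fx), len(x)
    J = [[0.0] * n for _ in range(m)]
    for j in range(n):
        xh = list(x)
        xh[j] += h          # perturb only the j-th coordinate
        Fxh = F(xh)
        for i in range(m):
            J[i][j] = (Fxh[i] - Fx[i]) / h
    return J

# Example map F : R^2 -> R^2 with F(x, y) = (x^2 * y, 5x + sin y),
# whose exact Jacobian is [[2xy, x^2], [5, cos y]].
def F(v):
    x, y = v
    return [x**2 * y, 5 * x + math.sin(y)]
```

At the point (1, 0), for instance, the computed matrix is close to [[0, 1], [5, 1]], matching the analytic entries 2xy, x², 5, and cos y evaluated there.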
If p is a point in R^{n} and F is differentiable at p, then its derivative is given by J_{F}(p); this is usually the easiest way to compute the derivative. In this case, the linear map described by J_{F}(p) is the best linear approximation of F near the point p, in the sense that

    F(x) ≈ F(p) + J_{F}(p) (x − p)

for x close to p, with an error that vanishes faster than ||x − p|| as x → p.
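This approximation property can be checked numerically. The example below (the map and point are chosen here for illustration, not taken from the article) compares F near p with its linearization F(p) + J_{F}(p)(x − p), using a map whose Jacobian is easy to write down analytically:

```python
# F : R^2 -> R^2 with F(x, y) = (x*y, x + y^2);
# its exact Jacobian is [[y, x], [1, 2y]].
def F(x, y):
    return (x * y, x + y**2)

def linearization(p, x, y):
    px, py = p
    f1, f2 = F(px, py)
    J = [[py, px], [1.0, 2.0 * py]]   # Jacobian evaluated at p
    dx, dy = x - px, y - py
    # F(p) + J_F(p) (x - p), computed component by component
    return (f1 + J[0][0] * dx + J[0][1] * dy,
            f2 + J[1][0] * dx + J[1][1] * dy)

p = (1.0, 2.0)
exact = F(1.01, 2.01)
approx = linearization(p, 1.01, 2.01)
err = max(abs(exact[0] - approx[0]), abs(exact[1] - approx[1]))
```

For a displacement of size 0.01, the error `err` is on the order of 0.01² = 1e-4: the linearization misses only the second-order terms, which is exactly the sense in which J_{F}(p) is the best linear approximation.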
The Jacobian determinant at a given point gives important information about the behavior of F near that point. For instance, if the continuously differentiable function F has a non-zero Jacobian determinant at p, then F is invertible near p; this is the inverse function theorem. (The converse does not hold: F(x) = x^{3} is invertible near 0 even though its derivative there is zero.) Furthermore, if the Jacobian determinant at p is positive, then F preserves orientation near p; if it is negative, F reverses orientation. The absolute value of the Jacobian determinant at p gives us the factor by which the function F expands or shrinks volumes near p; this is why it occurs in the general substitution rule.
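The volume-scaling role of the Jacobian determinant is illustrated by a standard change of variables (the example is a common one, chosen here rather than taken from the article): for polar coordinates F(r, t) = (r cos t, r sin t), the Jacobian determinant works out to r, the familiar factor in dx dy = r dr dt.

```python
import math

# Jacobian determinant of the polar-coordinate map
# F(r, t) = (r cos t, r sin t).  The Jacobian matrix is
#   [[cos t, -r sin t],
#    [sin t,  r cos t]]
# and its determinant simplifies to r (cos^2 t + sin^2 t) = r.
def polar_jacobian_det(r, t):
    a, b = math.cos(t), -r * math.sin(t)
    c, d = math.sin(t),  r * math.cos(t)
    return a * d - b * c
```

So a small coordinate rectangle [r, r + h] x [t, t + h] is mapped to a region of area approximately r h², consistent with the determinant giving the local volume expansion factor.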