
Let $f(x,y) = x^2-3xy+y^2$. Determine whether the point $(0,0)$ is a local maximum, a local minimum, or a saddle point, using the eigenvalues of the Hessian of $f$ at $(0,0)$ or the eigenvalues of the associated symmetric matrix of $f$.

This is the start of my calculation. The Hessian is a $2 \times 2$ matrix and looks like this:

$$ \text{Hess}(f) = \begin{bmatrix} \frac{\partial^2 f}{\partial x^2} & \frac{\partial^2 f}{\partial x \partial y} \\ \frac{\partial^2 f}{\partial y \partial x} & \frac{\partial^2 f}{\partial y^2} \end{bmatrix} $$

Now, calculate the second derivatives of $f$:

$$ \frac{\partial^2 f}{\partial x^2} = 2 $$

$$ \frac{\partial^2 f}{\partial y^2} = 2 $$

$$ \frac{\partial^2 f}{\partial x \partial y} = \frac{\partial^2 f}{\partial y \partial x} = -3 $$

So the Hessian of $f$ is:

$$ \text{Hess}(f) = \begin{bmatrix} 2 & -3 \\ -3 & 2 \end{bmatrix} $$

Now we need to calculate the eigenvalues of this matrix. The eigenvalues can be calculated by solving the determinant equation:

$$ \text{det}(\text{Hess}(f) - \lambda E) = 0 $$

where $\lambda$ is the eigenvalue we are looking for and $E$ is the identity matrix. For our Hessian matrix we get:

$$ \text{det}\left(\begin{bmatrix} 2-\lambda & -3 \\ -3 & 2-\lambda \end{bmatrix}\right) = 0 $$

How to continue?

Tim

4 Answers


Well, continuing from where you stopped, \begin{align*} \det \begin{pmatrix} 2 - \lambda & -3 \\ -3 & 2 - \lambda \end{pmatrix} &= 0 \\ (2-\lambda)(2-\lambda) - (-3)(-3) &= 0 \\ \lambda^2 - 4\lambda + 4 - 9 &= 0 \\ \lambda^2 - 4\lambda - 5 &= 0 \\ (\lambda - 5)(\lambda+1) &= 0. \end{align*} So the eigenvalues are $\lambda = 5$ and $\lambda = -1$.

What does this tell you about what type of stationary point $(0,0)$ is?

The eigenvalues have different signs, so $f$ increases in some directions around $(0,0)$ and decreases in others. Thus $(0,0)$ is a saddle point.

You can actually deduce the signs of the eigenvalues in the $2 \times 2$ case straight from the Hessian $H$. If the eigenvalues are $\lambda_1$ and $\lambda_2$, then here $\det H = \lambda_1 \lambda_2 = -5$ so the eigenvalues must have opposite signs.
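As a sanity check, the eigenvalues of a $2 \times 2$ symmetric matrix follow directly from its trace and determinant via the quadratic formula. A minimal Python sketch (the function name is just for illustration):

```python
import math

def hessian_eigenvalues_2x2(a, b, c):
    """Eigenvalues of the symmetric matrix [[a, b], [b, c]], found by
    solving the characteristic polynomial
    lambda^2 - (a + c)*lambda + (a*c - b^2) = 0."""
    trace = a + c
    det = a * c - b * b
    # The discriminant is nonnegative for symmetric matrices,
    # so both eigenvalues are real.
    disc = math.sqrt(trace * trace - 4 * det)
    return (trace + disc) / 2, (trace - disc) / 2

# Hessian of f(x, y) = x^2 - 3xy + y^2: [[2, -3], [-3, 2]]
lam1, lam2 = hessian_eigenvalues_2x2(2, -3, 2)
print(lam1, lam2)  # 5.0 -1.0: opposite signs, so (0, 0) is a saddle point
```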

kipf

Take the determinant of $H - \lambda I$, set the result equal to zero, and then solve for $\lambda$. In this case you get $$ \begin{vmatrix} 2-\lambda & -3 \\ -3 & 2-\lambda \end{vmatrix} = (2-\lambda)^2 - (-3)(-3) = 0 $$

The above is quadratic in $\lambda$, meaning you will get two values for $\lambda$, which may or may not be distinct. These are your eigenvalues.

To determine whether the point is a minimum, a maximum, or a saddle point, look at the eigenvalues you just found. What are the conditions on the eigenvalues that would allow you to make this classification?

AdamsK

For dimension less than 3, it is not necessary to compute the eigenvalues of the Hessian to check for a local maximum, local minimum, or saddle point.

You can compute $\det(H)$ and $f_{xx}$ to determine the result. In your case, $\det(H) < 0$, which implies $(0,0)$ is a saddle point.
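The second partial derivative test described here can be sketched in a few lines of Python (the helper name is hypothetical):

```python
def classify_critical_point(fxx, fxy, fyy):
    """Second partial derivative test for the Hessian [[fxx, fxy], [fxy, fyy]]
    at a critical point of a function of two variables."""
    det = fxx * fyy - fxy * fxy
    if det < 0:
        return "saddle point"
    if det > 0:
        return "local minimum" if fxx > 0 else "local maximum"
    return "inconclusive"  # det == 0: the test gives no information

print(classify_critical_point(2, -3, 2))  # det = -5 < 0 -> saddle point
```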

For more details, you may check the second partial derivative test and Sylvester's criterion.

Tim

The graph of $f(x,y) = x^2 - 3xy + y^2$ is either a paraboloid or a hyperbolic paraboloid. If it is a paraboloid, $(0,0)$ will be a minimum. If it is a hyperbolic paraboloid, $(0,0)$ is a saddle point.

Consider $x^2 - 3xy + y^2 = 0$. Solving for $x$ with the quadratic formula gives the roots $x = \frac{3 \pm \sqrt{5}}{2}\,y$, so

$$f(x,y) = \left(x - \frac{3 + \sqrt{5}}{2}y\right)\left(x - \frac{3 - \sqrt{5}}{2}y\right)$$

Since $f$ factors into two distinct linear factors, we are dealing with a hyperbolic paraboloid, and $(0,0)$ will be a saddle point.
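The factorization can be spot-checked numerically. A quick Python sketch, assuming the roots $x = \frac{3 \pm \sqrt{5}}{2}\,y$ obtained from the quadratic formula:

```python
import math

def f(x, y):
    return x * x - 3 * x * y + y * y

def factored(x, y):
    # roots of x^2 - 3xy + y^2 = 0, solved for x via the quadratic formula
    r1 = (3 + math.sqrt(5)) / 2
    r2 = (3 - math.sqrt(5)) / 2
    return (x - r1 * y) * (x - r2 * y)

# spot-check that the two forms agree at a few points
for x, y in [(1.0, 2.0), (-3.0, 0.5), (0.1, -4.0)]:
    assert abs(f(x, y) - factored(x, y)) < 1e-9
```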

If you really want to do calculus...

If $f_{xx} f_{yy} < (f_{xy})^2$ we have a saddle point, while $f_{xx} f_{yy} > (f_{xy})^2$ gives a maximum or a minimum.

If $\det\pmatrix {f_{xx}& f_{xy}\\f_{xy} & f_{yy}} < 0$ then we have a saddle point.

If the eigenvalues of $\pmatrix {f_{xx}& f_{xy}\\f_{xy} & f_{yy}}$ are one positive and one negative we have a saddle point.

Or, we could take the original function.

$f(x,y) = \begin{bmatrix}x&y\end{bmatrix}\begin{bmatrix}1&-\frac 32\\-\frac 32& 1\end{bmatrix}\begin{bmatrix}x\\y\end{bmatrix}$

Diagonalize that matrix, and the same implications of the eigenvalues hold.
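Evaluating $f$ along the eigenvector directions of that matrix makes the saddle visible directly. For this particular matrix the eigenvectors work out by hand to $(1,-1)$ and $(1,1)$, since the two diagonal entries are equal:

```python
def f(x, y):
    return x * x - 3 * x * y + y * y

# Eigenvector directions of [[1, -3/2], [-3/2, 1]]: (1, -1) and (1, 1).
print(f(1, -1))  # 5: f is positive along (1, -1)
print(f(1, 1))   # -1: f is negative along (1, 1), confirming a saddle at (0, 0)
```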

user317176