Questions tagged [matrices]

For any topic related to matrices. This includes: systems of linear equations, eigenvalues and eigenvectors (diagonalization, triangularization), determinant, trace, characteristic polynomial, adjugate and adjoint, transpose, Jordan normal form, matrix algorithms (e.g. LU, Gauss elimination, SVD, QR), invariant factors, quadratic forms, etc. For questions specifically concerning matrix equations, use the (matrix-equations) tag.

A matrix is a rectangular array of elements, usually numbers or variables, arranged in rows and columns. A matrix with $m$ rows and $n$ columns has $mn$ entries and is called an $m \times n$ matrix. Matrices are a part of linear algebra.

Matrices of the same shape can be added and subtracted, and matrices with compatible shapes can be multiplied. More precisely, given two matrices $A$ and $B$, the product $AB$ is defined when the number of columns of $A$ equals the number of rows of $B$. In particular, for a natural number $n$, any two $n \times n$ matrices $A$ and $B$ can be multiplied in either order (that is, both $AB$ and $BA$ exist).
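The compatibility rule can be checked in a few lines of NumPy; the matrices below are arbitrary illustrative examples:

```python
import numpy as np

# A is 2x3 and B is 3x4: the product AB is defined (a 2x4 matrix),
# but BA is not, since B has 4 columns while A has only 2 rows.
A = np.arange(6).reshape(2, 3)
B = np.arange(12).reshape(3, 4)
AB = A @ B
assert AB.shape == (2, 4)

# Two square matrices of the same size multiply in either order,
# but multiplication is not commutative in general: AB != BA.
C = np.array([[1, 2], [3, 4]])
D = np.array([[0, 1], [1, 0]])
assert not np.array_equal(C @ D, D @ C)
```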


57452 questions
847
votes
20 answers

What's an intuitive way to think about the determinant?

In my linear algebra class, we just talked about determinants. So far I’ve been understanding the material okay, but now I’m very confused. I get that when the determinant is zero, the matrix doesn’t have an inverse. I can find the determinant of a…
Jamie Banks
  • 13,410
465
votes
4 answers

What is the intuitive relationship between SVD and PCA?

Singular value decomposition (SVD) and principal component analysis (PCA) are two eigenvalue methods used to reduce a high-dimensional data set into fewer dimensions while retaining important information. Online articles say that these methods are…
405
votes
36 answers

If $AB = I$ then $BA = I$

If $A$ and $B$ are square matrices such that $AB = I$, where $I$ is the identity matrix, show that $BA = I$. I do not understand anything more than the following. Elementary row operations. Linear dependence. Row reduced forms and their…
Dilawar
  • 6,353
369
votes
11 answers

What is the importance of eigenvalues/eigenvectors?

What is the importance of eigenvalues/eigenvectors?
350
votes
0 answers

Limit of sequence of growing matrices

Let $$ H=\left(\begin{array}{cccc} 0 & 1/2 & 0 & 1/2 \\ 1/2 & 0 & 1/2 & 0 \\ 1/2 & 0 & 0 & 1/2\\ 0 & 1/2 & 1/2 & 0 \end{array}\right), $$ $K_1=\left(\begin{array}{c}1 \\ 0\end{array}\right)$ and consider the sequence of matrices defined by $$ K_L =…
333
votes
9 answers

Is a matrix multiplied with its transpose something special?

In my math lectures, we talked about the Gram determinant, where a matrix is multiplied by its transpose. Is $A A^\mathrm T$ something special for any matrix $A$?
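Two properties that always hold can be checked directly: $AA^\mathrm{T}$ is symmetric and positive semidefinite (it is the Gram matrix of the rows of $A$). A quick NumPy check on a random matrix:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(3, 5))
G = A @ A.T                       # Gram matrix of the rows of A

assert np.allclose(G, G.T)        # A A^T is always symmetric
eigvals = np.linalg.eigvalsh(G)
assert np.all(eigvals >= -1e-12)  # eigenvalues nonnegative: positive semidefinite
```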
285
votes
3 answers

How does one prove the determinant inequality $\det\left(6(A^3+B^3+C^3)+I_{n}\right)\ge 5^n\det(A^2+B^2+C^2)$?

Reposted on MathOverflow Let $\,A,B,C\in M_{n}(\mathbb C)\,$ be Hermitian and positive definite matrices such that $A+B+C=I_{n}$, where $I_{n}$ is the identity matrix. Show that $$\det\left(6(A^3+B^3+C^3)+I_{n}\right)\ge 5^n \det…
math110
  • 94,932
  • 17
  • 148
  • 519
271
votes
6 answers

Eigenvectors of real symmetric matrices are orthogonal

Can someone point me to a paper, or show here, why symmetric matrices have orthogonal eigenvectors? In particular, I'd like to see proof that for a symmetric matrix $A$ there exists decomposition $A = Q\Lambda Q^{-1} = Q\Lambda Q^{T}$ where…
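The claimed decomposition can be observed numerically: for a real symmetric matrix, `np.linalg.eigh` returns an orthonormal eigenvector matrix $Q$, so $Q^{-1} = Q^\mathrm{T}$. A sketch with a random symmetric matrix:

```python
import numpy as np

rng = np.random.default_rng(2)
M = rng.normal(size=(4, 4))
S = (M + M.T) / 2                  # symmetrize a random matrix

lam, Q = np.linalg.eigh(S)         # eigh is specialized for symmetric matrices
assert np.allclose(Q.T @ Q, np.eye(4))         # columns are orthonormal
assert np.allclose(Q @ np.diag(lam) @ Q.T, S)  # S = Q Lambda Q^T
```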
253
votes
8 answers

Proof that the trace of a matrix is the sum of its eigenvalues

I have looked extensively for a proof on the internet but all of them were too obscure. I would appreciate if someone could lay out a simple proof for this important result. Thank you.
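While the proof follows from the characteristic polynomial (the trace is the coefficient of $\lambda^{n-1}$ up to sign, which equals the eigenvalue sum), the identity is easy to test numerically on a random matrix:

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.normal(size=(5, 5))        # a generic real matrix
eigvals = np.linalg.eigvals(A)     # eigenvalues may be complex

# The trace (sum of diagonal entries) equals the eigenvalue sum; the
# sum is real here because complex eigenvalues come in conjugate pairs.
assert np.isclose(np.trace(A), eigvals.sum().real)
```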
248
votes
8 answers

What are the Differences Between a Matrix and a Tensor?

What is the difference between a matrix and a tensor? Or, what makes a tensor, a tensor? I know that a matrix is a table of values, right? But, a tensor?
Aurelius
  • 2,881
241
votes
13 answers

Inverse of the sum of matrices

I have two square matrices: $A$ and $B$. $A^{-1}$ is known and I want to calculate $(A+B)^{-1}$. Are there theorems that help with calculating the inverse of the sum of matrices? In general case $B^{-1}$ is not known, but if it is necessary then it…
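One well-known special case is worth illustrating: when $B = uv^\mathrm{T}$ is rank one, the Sherman–Morrison formula expresses $(A+B)^{-1}$ in terms of the already-known $A^{-1}$, avoiding a fresh inversion. A sketch with random data (the shift by $4I$ just keeps $A$ well conditioned):

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.normal(size=(4, 4)) + 4 * np.eye(4)
A_inv = np.linalg.inv(A)           # assumed known in advance
u = rng.normal(size=(4, 1))
v = rng.normal(size=(4, 1))
B = u @ v.T                        # rank-one update

# Sherman-Morrison:
# (A + u v^T)^{-1} = A^{-1} - (A^{-1} u v^T A^{-1}) / (1 + v^T A^{-1} u)
correction = (A_inv @ u @ v.T @ A_inv) / (1 + v.T @ A_inv @ u)
assert np.allclose(A_inv - correction, np.linalg.inv(A + B))
```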
229
votes
6 answers

Why does this matrix give the derivative of a function?

I happened to stumble upon the following matrix: $$ A = \begin{bmatrix} a & 1 \\ 0 & a \end{bmatrix} $$ And after trying a bunch of different examples, I noticed the following remarkable pattern. If $P$ is a polynomial,…
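The pattern in question, $P(A) = \begin{bmatrix} P(a) & P'(a) \\ 0 & P(a) \end{bmatrix}$, can be confirmed for a concrete polynomial (the choice $P(x) = x^3 - 5x + 1$ below is arbitrary):

```python
import numpy as np

a = 2.0
A = np.array([[a, 1.0], [0.0, a]])
I = np.eye(2)

# Take P(x) = x^3 - 5x + 1, so P'(x) = 3x^2 - 5.
PA = A @ A @ A - 5 * A + I           # P evaluated at the matrix
P  = a**3 - 5 * a + 1                # P(a)  = -1 for a = 2
dP = 3 * a**2 - 5                    # P'(a) =  7 for a = 2

# The observed pattern: P(A) = [[P(a), P'(a)], [0, P(a)]].
assert np.allclose(PA, np.array([[P, dP], [0.0, P]]))
```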
224
votes
6 answers

What is the geometric interpretation of the transpose?

I can follow the definition of the transpose algebraically, i.e. as a reflection of a matrix across its diagonal, or in terms of dual spaces, but I lack any sort of geometric understanding of the transpose, or even symmetric matrices. For example,…
221
votes
7 answers

Transpose of inverse vs inverse of transpose

Given a square matrix, is the transpose of the inverse equal to the inverse of the transpose? $$ (A^{-1})^T = (A^T)^{-1} $$
Void Star
  • 2,665
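The identity holds for every invertible matrix (both sides are the unique inverse of $A^\mathrm{T}$), and a random well-conditioned example confirms it:

```python
import numpy as np

rng = np.random.default_rng(5)
A = rng.normal(size=(3, 3)) + 3 * np.eye(3)   # almost surely invertible

# Transpose of the inverse equals inverse of the transpose.
assert np.allclose(np.linalg.inv(A).T, np.linalg.inv(A.T))
```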
212
votes
7 answers

How could we define the factorial of a matrix?

Suppose I have a square matrix $\mathsf{A}$ with $\det \mathsf{A}\neq 0$. How could we define the following operation? $$\mathsf{A}!$$ Maybe we could make some simple example, admitted it makes any sense, with $$\mathsf{A} = \left(\begin{matrix} 1…
user266764