2

After seeing this answer to a post about the origins of the determinant, I have a couple of questions that only mathematicians will be able to answer.

The author writes that the formula pops up after solving a system of linear equations by hand. I have tried but cannot work it out (for the system of equations with 3 variables); would anyone be able to derive it for this case?

What is the underlying algebraic process, and why does it work for square systems of any size?

Also, I am wondering about the progression in understanding the properties of the determinant, such as multilinearity. Cramer did not give a proof of his rule, and said only that he "thinks" he might have a solution to solvable systems of linear equations. A proof would have been challenging without knowing the properties of the determinant.

So I am wondering roughly when, and by whom, the idea that the determinant is multilinear was developed.

Thank you.

Frazer
  • 554
  • 1
    Basic multilinearity properties were recognized early on, but the characterization of a determinant as an alternating multilinear function is certainly associated with Weierstrass and Kronecker, late 1800s. For details, a classic reference is Muir's history of determinants. – blargoner Apr 04 '24 at 16:15

2 Answers

3

OK, so, you want a derivation of Cramer's Rule for $2\times2$ systems.

We have $ax+by=c$, $dx+ey=f$, to solve for $x,y$, with $a,b,c,d,e,f$ all given.

Multiply the first equation by $e$, the second by $b$:

$aex+bey=ce$, $bdx+bey=bf$.

Subtract the second equation from the first:

$(ae-bd)x=ce-bf$.

Assuming $ae-bd\ne0$, we can divide by $ae-bd$, to get $$ x={ce-bf\over ae-bd} $$ Now we have to recognize $ce-bf=\det\pmatrix{c&b\cr f&e\cr}$ and $ae-bd=\det\pmatrix{a&b\cr d&e\cr}$, and voila – Cramer's Rule!

Well, half of it – we still have to do the $y$-part – the idea is the same – can you work out the details?
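
One compact way to package the same multiply-and-subtract steps (just a sketch of the elimination above, rewritten with matrices): multiplying by $e$ and $b$ and subtracting is the same as multiplying the system on the left by the adjugate $\pmatrix{e&-b\cr -d&a\cr}$ (the transposed matrix of cofactors), since $$ \pmatrix{e&-b\cr -d&a\cr}\pmatrix{a&b\cr d&e\cr}=\pmatrix{ae-bd&0\cr 0&ae-bd\cr}, $$ so applying it to both sides of $\pmatrix{a&b\cr d&e\cr}\pmatrix{x\cr y\cr}=\pmatrix{c\cr f\cr}$ isolates $(ae-bd)x$ and $(ae-bd)y$ in one stroke.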

Gerry Myerson
  • 185,413
  • But what about the derivation for the system with 3 variables? – Frazer Apr 04 '24 at 09:51
  • 2
    You wrote, "I have tried but cannot work it out (for the system of equations with 2 variables), would anyone be able to derive it for this case?" so that's what I did, Frazer. After I answered, you edited your question to ask about $3\times3$. If I do that, you'll edit your question again, and ask me to do the $4\times4$. No, thanks. Instead, why don't you show us what happens when you try to follow my lead, and do the $3\times3$ yourself? – Gerry Myerson Apr 04 '24 at 11:45
  • 1
    Gerry, I figured it out for the 3-variable case! However, I actually had to use a formula for the $3\times3$ determinant to motivate the algebraic manipulation of the equations, instead of coming up with the manipulation myself. What's the underlying algebraic process that is going on here? Why does the permutation of the rows and columns show up? – Frazer Apr 04 '24 at 23:09
  • Good! I'm not sure what you mean by "the permutation of the rows and columns". – Gerry Myerson Apr 05 '24 at 00:25
  • Ignore that; I just want to know the underlying algebraic process that governs which terms you multiply the equations by and then add and subtract. I see that it is essentially the cofactor expansion. Is this how the cofactor expansion was discovered? Is it provable by induction? – Frazer Apr 05 '24 at 00:46
  • A lot depends on what you take as your definition of the determinant. If you take it to be the sum, over all permutations $\sigma$, of the sign of $\sigma$ times the product of the $a_{i\sigma(i)}$ (written out below), then you can get cofactor expansion by induction. I don't know the history. – Gerry Myerson Apr 05 '24 at 04:59
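
For reference, the two formulas mentioned in that last comment, written out (standard definitions; here $A_{1j}$ denotes $A$ with row $1$ and column $j$ deleted): $$ \det A=\sum_{\sigma\in S_n}\operatorname{sgn}(\sigma)\prod_{i=1}^n a_{i\sigma(i)}, \qquad \det A=\sum_{j=1}^n(-1)^{1+j}a_{1j}\det A_{1j}. $$ The second formula (cofactor expansion along the first row) follows from the first by grouping the permutations $\sigma$ according to the value of $\sigma(1)$; that grouping is the inductive step.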
0

A historical survey, with references to the literature including Cramer's paper, is:

G. B. Price, "Some Identities in the Theory of Determinants", The American Mathematical Monthly, Vol. 54, No. 2 (Feb. 1947), pp. 75–90.

Roland F
  • 5,122