6

The standard proof that a $k$-algebra $A$ that is finite-dimensional as a $k$-vector space is Artinian goes as follows:

"Suppose we have an infinite descending chain of ideals $I_1 \supseteq I_2 \supseteq I_3 \supseteq ...$ in $A$. Then each $I_j$ is also a finite-dimensional $k$-vector space (since $A$ is), so $\dim_k I_1 \ge \dim_k I_2 \ge \dim_k I_3 \ge ...$. Since we cannot have an infinite strictly descending sequence of non-negative integers, the sequence of dimensions must stabilize, so the chain of ideals must stabilize."

What I'm wondering is why it isn't possible to eventually have $I_n \supseteq I_{n+1} \supseteq I_{n+2} \supseteq ...$ where $\dim_k I_n = \dim_k I_{n+1} = \dim_k I_{n+2} = ...$ but $I_n$, $I_{n+1}$, $I_{n+2}$, etc. are all distinct ideals in $A$. In other words, why should the fact that the dimensions stabilize imply that the ideals also stabilize?

Thanks in advance for any help.

  • 1
    How many $n$-dimensional subspaces are there of an $n$-dimensional vector space? Adapt the answer to your chain of ideals. Can the $I_{n+i}$, for non-negative integers $i,$ be distinct? – Chris Leary Jan 23 '19 at 00:25
  • 1
Note that if an algebra is unital, then any ideal must be a subspace. – David Holden Jan 23 '19 at 00:30

1 Answer

3

If $V$ is a finite-dimensional vector space and $W\subseteq V$ is a subspace with $\dim W=\dim V$, then $W=V$. Indeed, $\dim V=\dim W+\dim V/W$, so $\dim V/W=0$, which means $V/W$ is trivial and hence $W=V$. Applied to your chain: each $I_{n+1}$ is a subspace of $I_n$ with $\dim_k I_{n+1}=\dim_k I_n$, so $I_{n+1}=I_n$; the ideals cannot be distinct once the dimensions agree.
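The same fact can be checked numerically over $k=\mathbb{R}$: a subspace is given by a spanning set, its dimension is the rank of the matrix of spanning vectors, and $W=V$ exactly when appending any basis vector of $V$ does not increase that rank. A minimal sketch (the particular spanning set for $W$ is an arbitrary choice for illustration):

```python
import numpy as np

# V = R^3 with its standard basis; W is a subspace given by a spanning set.
V_basis = np.eye(3)
W_span = np.array([[1.0, 0.0, 0.0],
                   [0.0, 1.0, 0.0],
                   [1.0, 1.0, 1.0]])  # spans a 3-dimensional subspace of R^3

# dim W = dim V = 3: the spanning matrices both have full rank.
dim_V = np.linalg.matrix_rank(V_basis)
dim_W = np.linalg.matrix_rank(W_span)
assert dim_V == dim_W == 3

# W = V: adjoining any basis vector of V to W's spanning set
# leaves the rank unchanged, so every such vector already lies in W.
for v in V_basis:
    assert np.linalg.matrix_rank(np.vstack([W_span, v])) == dim_W
```

This mirrors the proof: a spanning set of $W$ already has $\dim V$ independent vectors, so no vector of $V$ can enlarge it.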

Eric Wofsey