The standard proof that a $k$-algebra $A$ that is finite-dimensional as a $k$-vector space is Artinian goes as follows:
"Suppose we have an infinite descending chain of ideals $I_1 \supseteq I_2 \supseteq I_3 \supseteq \cdots$ in $A$. Then each $I_j$ is also a finite-dimensional $k$-vector space (since $A$ is), so $\dim_k I_1 \ge \dim_k I_2 \ge \dim_k I_3 \ge \cdots$. Since we cannot have an infinite strictly decreasing sequence of non-negative integers, the sequence of dimensions must stabilize, so the chain of ideals must stabilize."
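(For concreteness, here is my own small example of the dimension count in this proof, not part of the quoted argument: take $A = k[x]/(x^3)$.)

```latex
% In A = k[x]/(x^3) the nonzero proper ideals are (x) and (x^2), giving
%   A \supsetneq (x) \supsetneq (x^2) \supsetneq (0),
% with strictly decreasing k-dimensions
\dim_k A = 3, \qquad \dim_k (x) = 2, \qquad \dim_k (x^2) = 1, \qquad \dim_k (0) = 0,
% so any descending chain of ideals in A can strictly decrease at most three times.
```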
What I'm wondering is why it isn't possible to eventually have $I_n \supseteq I_{n+1} \supseteq I_{n+2} \supseteq \cdots$ where $\dim_k I_n = \dim_k I_{n+1} = \dim_k I_{n+2} = \cdots$ but $I_n$, $I_{n+1}$, $I_{n+2}$, etc. are all distinct ideals of $A$. In other words, why does the fact that the dimensions stabilize imply that the ideals themselves stabilize?
Thanks in advance for any help.