
Does there exist a finite commutative ring $R$ with 100 elements where $x^2=x$ for every $x\in R$? I know finite Boolean rings have this property, but they have cardinality $2^n$ for some $n$. Also, Boolean rings are rings with identity.

Does there exist a ring $R$ (with 100 elements) without identity that satisfies $x^2=x$ for every $x\in R$?

And why does such a ring exist or fail to exist?

Learner

1 Answer


Without assuming the existence of an identity, we cannot immediately prove that $R \cong (\mathbb Z / 2 \mathbb Z)^n$, as the comments point out (but see the update below).

But we can still prove $\lvert R \rvert = 2^n$ by focusing on the additive structure. Notice that $(x + x)^2 = x + x \implies 2 x^2 = 0 \implies x + x = 0$. That is, the underlying additive group is a Boolean group, and the order of a finite Boolean group is a power of $2$.
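
In more detail (as Mike's comment below also spells out):
$$ (x+x)^2 = 4x^2 \qquad\text{and}\qquad (x+x)^2 = x + x = x^2 + x^2 = 2x^2 \text{,} $$
so $4x^2 = 2x^2$, i.e. $2x^2 = 0$; since $x^2 = x$, this is exactly $x + x = 0$. In particular, since $100$ is not a power of $2$, no such ring with $100$ elements exists.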


Update: We can actually prove that such a ring must be isomorphic to $(\mathbb Z / 2 \mathbb Z)^n$, and an identity must exist (provided $R \ne 0$).

Define a relation $\subseteq$ on $R$ (the notation is chosen deliberately) by $x \subseteq y \iff x = x y$. Then,

  1. $x \subseteq x$: $x = x x$.
  2. $x \subseteq y \land y \subseteq z \implies x \subseteq z$: $x = x y = x y z = x z$.
  3. $x \subseteq y \land y \subseteq x \implies x = y$: $x = x y$ and $y = x y$ imply $x = y$.
  4. $0 \subseteq x$: Trivial.
  5. $x y \subseteq x$: Trivial.

So we’ve shown $\subseteq$ is a partial order on $R$ with a unique minimum $0$.
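
(Not part of the proof, just a quick sanity check.) Here is a small Python sketch that brute-force verifies properties 1–5 on the toy Boolean ring $(\mathbb Z / 2 \mathbb Z)^3$ with componentwise operations; the helper names `mul` and `leq` are my own.

```python
from itertools import product

# Toy Boolean ring to check against: (Z/2Z)^3 with componentwise operations.
R = list(product((0, 1), repeat=3))

def mul(x, y): return tuple(a & b for a, b in zip(x, y))  # componentwise AND

def leq(x, y):  # the relation "x ⊆ y", i.e. x == x*y
    return x == mul(x, y)

zero = (0, 0, 0)
assert all(leq(x, x) for x in R)                                      # 1. reflexive
assert all(leq(x, z) for x in R for y in R for z in R
           if leq(x, y) and leq(y, z))                                # 2. transitive
assert all(x == y for x in R for y in R if leq(x, y) and leq(y, x))   # 3. antisymmetric
assert all(leq(zero, x) for x in R)                                   # 4. 0 is the minimum
assert all(leq(mul(x, y), x) for x in R for y in R)                   # 5. xy ⊆ x
print("partial-order checks pass")
```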

Now remove this $0$ and consider the minimal elements among what remains; say there are $n$ of them, denoted $\{ e_1, e_2, \ldots, e_n \}$ (they exist since $R$ is finite). We have $x \subseteq e_i \implies x = 0 \text{ or } x = e_i$.

Claim 1. $e_i e_j = 0$ for all $i \ne j$.
Proof. Clearly $e_i e_j \subseteq e_i$, so $e_i e_j$ is either $0$ or $e_i$. We only need to exclude $e_i e_j = e_i$; but that would mean $e_i \subseteq e_j$, and then the minimality of $e_j$ (together with $e_i \ne 0$) would force $e_i = e_j$, contradicting $i \ne j$.

Define $S(x) = \{ i : e_i \subseteq x \}$. Since $e_i x \subseteq e_i$ and $e_i x = e_i \iff e_i \subseteq x$, we get $e_i x = \begin{cases} e_i, & i \in S(x), \\ 0, & i \notin S(x). \end{cases}$

Claim 2. $\displaystyle x = \sum_{j \in S(x)} e_j$.
Proof. By induction on $\lvert S(x) \rvert$:

  • When $\lvert S(x) \rvert = 0$: if $x \ne 0$, then by the finiteness of $R$ there is some minimal element $e_i \subseteq x$, so $S(x) \ne \emptyset$. Hence $x = 0$ and the claim holds (as an empty sum).
  • When $\lvert S(x) \rvert \ge 1$, choose an arbitrary $i \in S(x)$ and consider what happens to $S(x + e_i)$:
    • For $e_i$, we have $e_i (x + e_i) = e_i x + e_i^2 = e_i + e_i = 0$.
    • For $e_j$ ($j \ne i$ and $j \in S(x)$), we have $e_j (x + e_i) = e_j x + e_j e_i = e_j + 0 = e_j$.
    • For $e_j$ ($j \ne i$ and $j \notin S(x)$), we have $e_j (x + e_i) = e_j x + e_j e_i = 0 + 0 = 0$.
    • Therefore, $S(x + e_i) = S(x) \setminus \{ i \}$, and by the induction hypothesis: $$ x + e_i = \sum_{\substack{j \in S(x) \\ j \ne i}} e_j \implies x = \Biggl( \sum_{\substack{j \in S(x) \\ j \ne i}} e_j \Biggr) - e_i = \sum_{j \in S(x)} e_j \text{,} $$ using $-e_i = e_i$ in the last step.
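
For instance, if $S(x) = \{1, 2\}$, one step of the induction passes from $x$ to $x + e_1$, which has $S(x + e_1) = \{2\}$, and a second step reaches $x + e_1 + e_2$ with empty support, hence $x + e_1 + e_2 = 0$; unwinding gives $x = e_1 + e_2$.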

From Claim 2 we observe that every $x \in R$ can be written as a subset sum of $\{ e_1, \ldots, e_n \}$.
That is, $\{ e_1, \ldots, e_n \}$ spans $R$ as an $\mathbb{F}_2$-vector space.

To prove that $\{ e_1, \ldots, e_n \}$ is a basis, it remains to show linear independence:
For any nonempty $S \subseteq \{ 1, \ldots, n \}$, choose $i \in S$; then $\displaystyle e_i \sum_{j \in S} e_j = \sum_{j \in S} e_i e_j = e_i \ne 0$, so $\displaystyle \sum_{j \in S} e_j \ne 0$.

We now claim that $\displaystyle e := \sum_{i = 1}^{n} e_i$ is the identity in $R$:
For any $x \in R$, writing $\displaystyle x = \sum_{j \in S(x)} e_j$ and using Claim 1, we have $$ \begin{aligned} x e &= \sum_{j \in S(x)} e_j \sum_{i = 1}^{n} e_i \\ &= \sum_{j \in S(x)} \sum_{i = 1}^{n} e_j e_i \\ &= \sum_{j \in S(x)} e_j \\ &= x \text{.} \end{aligned} $$

Finally we are done: such an $R$ must have the following structure: its elements are the subsets of $\{ 1, \ldots, n \}$, addition is symmetric difference of subsets, and multiplication is intersection of subsets. This coincides with $R \cong (\mathbb Z / 2 \mathbb Z)^n$ as in A Finite Boolean Ring is Generated by Finitely Many Copies of $\mathbb{Z} / 2 \mathbb{Z}$.
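
For a concrete sanity check of this description, here is a short Python sketch (my own; the names `universe`, `add`, `mul` are made up for the illustration) that builds the subset model for $n = 3$ and verifies $x^2 = x$, that the full set is the identity, and Claim 2.

```python
from functools import reduce
from itertools import combinations

# The subset model for n = 3: subsets of {1, ..., n},
# with symmetric difference as addition and intersection as multiplication.
n = 3
universe = frozenset(range(1, n + 1))
R = [frozenset(c) for k in range(n + 1) for c in combinations(universe, k)]

def add(x, y): return x ^ y   # symmetric difference
def mul(x, y): return x & y   # intersection

# Every element is idempotent, and the whole set acts as the identity e.
assert all(mul(x, x) == x for x in R)
assert all(mul(x, universe) == x and mul(universe, x) == x for x in R)

# Claim 2 in this model: x is the sum of the minimal nonzero elements
# (the singletons) lying below it.
for x in R:
    below = [frozenset({i}) for i in universe if mul(frozenset({i}), x) == frozenset({i})]
    assert reduce(add, below, frozenset()) == x

print("subset-model checks pass")
```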

PinkRabbit
  • +1. Maybe another intermediate step or two to add would be helpful though: on the one hand, $(x+x)^2 = (2x)^2=4x^2$. On the other hand $(x+x)^2= x+x = x^2+x^2 = 2x^2$. Putting these together gives $4x^2=2x^2$ giving on the one hand $4x^2-2x^2=0$ and $4x^2-2x^2=2x^2$ on the other, giving $2x^2=0$. – Mike Jul 02 '24 at 17:24
  • Neat! I will note for completion that while the second part uses commutativity, it is not required; it can be shown that commutativity is automatic from the Boolean property – Milten Jul 02 '24 at 18:47
  • My proof is more or less a duplicate; see https://math.stackexchange.com/q/3555733 for earlier answers, and https://math.stackexchange.com/questions/391169 for counterexamples when the finiteness assumption is dropped, together with some other proofs. The answer https://math.stackexchange.com/a/391197 mentions the fact that if $R \otimes_{\mathbb Z} R \to R$ is surjective and $R$ is finite, then $R$ is unital (which I did not know and have no idea how to prove). And https://math.stackexchange.com/a/305389 gives another proof. – PinkRabbit Jul 02 '24 at 19:06