Without assuming the existence of an identity, we cannot (? see below) prove that $R \cong (\mathbb Z / 2 \mathbb Z)^n$, as the comment pointed out.
But we can still prove $\lvert R \rvert = 2^n$ by focusing on the additive structure. Notice that $2 x = (x + x)^2 = 4 x^2 = 4 x$, hence $2 x = 0$, i.e. $x + x = 0$. That is, the underlying group is a Boolean group (every element is its own inverse), and the order of a finite Boolean group is a power of $2$, since it is a vector space over $\mathbb F_2$.
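As a quick sanity check (not part of the argument), here is a small Python sketch of these two facts in the concrete ring $(\mathbb Z / 2 \mathbb Z)^n$; the bitmask encoding, the names `add`/`mul`, and the choice $n = 3$ are mine for illustration:

```python
# (Z/2Z)^n modeled as n-bit masks: XOR is coordinatewise addition mod 2,
# AND is coordinatewise multiplication.
n = 3
R = range(2 ** n)
add = lambda x, y: x ^ y
mul = lambda x, y: x & y

assert all(mul(x, x) == x for x in R)   # idempotency: x^2 = x
assert all(add(x, x) == 0 for x in R)   # x + x = 0, a Boolean group
print(f"|R| = {len(R)} = 2^{n}")        # the order is a power of 2
```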
Update: We can actually prove that such a ring must be isomorphic to $(\mathbb Z / 2 \mathbb Z)^n$, and an identity must exist (provided $R \ne 0$).
First note that every Boolean ring is commutative: expanding $x + y = (x + y)^2 = x^2 + x y + y x + y^2 = x + y + x y + y x$ gives $x y + y x = 0$, hence $x y = y x$ since $z + z = 0$. Now define a relation $\subseteq$ on $R$ (the notation is chosen deliberately) by $x \subseteq y \iff x = x y$. Then,
- $x \subseteq x$: $x = x x$.
- $x \subseteq y \land y \subseteq z \implies x \subseteq z$: $x = x y = x y z = x z$.
- $x \subseteq y \land y \subseteq x \implies x = y$: $x = x y$ and $y = y x = x y$ imply $x = y$.
- $0 \subseteq x$: $0 = 0 x$.
- $x y \subseteq x$: $(x y) x = x^2 y = x y$.
So we’ve shown $\subseteq$ is a partial order on $R$ with a unique minimum $0$.
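If it helps, here is a hedged Python sketch (the same illustrative bitmask model of $(\mathbb Z / 2 \mathbb Z)^n$ as above, with `leq` standing for $\subseteq$) that brute-force checks these five properties, and also that $\subseteq$ is literally bitmask containment, which is why the notation is apt:

```python
n = 3
R = range(2 ** n)
leq = lambda x, y: x == (x & y)   # x ⊆ y  ⟺  x = xy

assert all(leq(x, x) for x in R)                          # reflexivity
assert all(not (leq(x, y) and leq(y, x)) or x == y
           for x in R for y in R)                         # antisymmetry
assert all(not (leq(x, y) and leq(y, z)) or leq(x, z)
           for x in R for y in R for z in R)              # transitivity
assert all(leq(0, x) and leq(x & y, x)
           for x in R for y in R)                         # 0 ⊆ x, xy ⊆ x
assert all(leq(x, y) == ((x | y) == y)
           for x in R for y in R)                         # = containment
```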
Now remove this $0$ and consider the minimal elements of $R \setminus \{ 0 \}$; say there are $n$ of them, denoted $\{ e_1, e_2, \ldots, e_n \}$. Minimality means $x \subseteq e_i \implies \text{$x = 0$ or $x = e_i$}$.
Claim 1. $e_i e_j = 0$ for all $i \ne j$.
Proof. Clearly $e_i e_j \subseteq e_i$, so $e_i e_j$ is either $0$ or $e_i$. We only need to exclude $e_i e_j = e_i$: this would mean $e_i \subseteq e_j$, and the minimality of $e_j$ would force $e_i = 0$ or $e_i = e_j$, both impossible.
Define $S(x) = \{ i : e_i \subseteq x \}$. Since $e_i x \subseteq e_i$, we have $e_i x \in \{ 0, e_i \}$, and $e_i x = e_i$ exactly when $i \in S(x)$; that is, $e_i x = \begin{cases} e_i & \text{if } i \in S(x), \\ 0 & \text{if } i \notin S(x). \end{cases}$
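In the same illustrative model, the atoms come out as the one-bit masks, and both Claim 1 and the displayed formula for $e_i x$ can be checked mechanically (all variable names are mine):

```python
n = 3
R = range(2 ** n)
leq = lambda x, y: x == (x & y)

# minimal nonzero elements: nonzero x whose only lower bounds are 0 and x
atoms = [x for x in R if x != 0
         and all(not leq(y, x) or y in (0, x) for y in R)]
assert atoms == [1 << i for i in range(n)]

for i, ei in enumerate(atoms):
    for j, ej in enumerate(atoms):
        assert ei & ej == (ei if i == j else 0)            # Claim 1
for x in R:
    S = {i for i in range(n) if leq(atoms[i], x)}          # S(x)
    for i in range(n):
        assert atoms[i] & x == (atoms[i] if i in S else 0)
```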
Claim 2. $\displaystyle x = \sum_{j \in S(x)} e_j$.
Proof. By induction on $\lvert S(x) \rvert$:
- When $\lvert S(x) \rvert = 0$, we must have $x = 0$ (otherwise, by the finiteness of $R$, there would be a minimal nonzero element below $x$, i.e. some $e_i \subseteq x$), so the claim holds.
- When $\lvert S(x) \rvert \ge 1$, choose an arbitrary $i \in S(x)$ and compute $S(x + e_i)$:
- For $e_i$, we have $e_i (x + e_i) = e_i x + e_i^2 = e_i + e_i = 0$.
- For $e_j$ ($j \ne i$ and $j \in S(x)$), we have $e_j (x + e_i) = e_j x + e_j e_i = e_j + 0 = e_j$.
- For $e_j$ ($j \ne i$ and $j \notin S(x)$), we have $e_j (x + e_i) = e_j x + e_j e_i = 0 + 0 = 0$.
- Therefore, $S(x + e_i) = S(x) \setminus \{ i \}$, and by the induction hypothesis:
$$ x + e_i = \sum_{\substack{j \in S(x) \\ j \ne i}} e_j \implies x = \Biggl( \sum_{\substack{j \in S(x) \\ j \ne i}} e_j \Biggr) + e_i = \sum_{j \in S(x)} e_j \text{,} $$
where we used $- e_i = e_i$.
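A one-loop check of Claim 2 in the illustrative bitmask model (XOR plays the role of the sum):

```python
from functools import reduce

n = 3
atoms = [1 << i for i in range(n)]
for x in range(2 ** n):
    below = [e for e in atoms if e & x == e]          # { e_j : j ∈ S(x) }
    assert reduce(lambda a, b: a ^ b, below, 0) == x  # x = Σ_{j ∈ S(x)} e_j
```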
From Claim 2 we see that every $x \in R$ can be written as a subset sum of $\{ e_1, \ldots, e_n \}$.
That is, $\{ e_1, \ldots, e_n \}$ spans $R$ as an $\mathbb{F}_2$-vector space (which it is, since $x + x = 0$).
To prove $\{ e_1, \ldots, e_n \}$ is a basis, it remains to show linear independence:
For any nonempty $S \subseteq \{ 1, \ldots, n \}$, choose $i \in S$; then $\displaystyle e_i \sum_{j \in S} e_j = \sum_{j \in S} e_i e_j = e_i \ne 0$ (by Claim 1, only the $j = i$ term survives), so $\displaystyle \sum_{j \in S} e_j \ne 0$.
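The same independence can be confirmed exhaustively in the bitmask model, where no nonempty XOR of distinct one-bit masks can vanish:

```python
from functools import reduce
from itertools import combinations

n = 3
atoms = [1 << i for i in range(n)]
for k in range(1, n + 1):
    for subset in combinations(atoms, k):
        assert reduce(lambda a, b: a ^ b, subset, 0) != 0
```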
We now claim that $\displaystyle e := \sum_{i = 1}^{n} e_i$ is the identity in $R$:
For any $x \in R$, write $\displaystyle x = \sum_{j \in S(x)} e_j$; then
$$ \begin{aligned} x e &= \sum_{j \in S(x)} e_j \sum_{i = 1}^{n} e_i \\ &= \sum_{j \in S(x)} \sum_{i = 1}^{n} e_j e_i \\ &= \sum_{j \in S(x)} e_j \\ &= x \text{.} \end{aligned} $$
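In the bitmask model, $e$ is the all-ones mask, and the computation above reduces to the familiar $x \mathbin{\&} e = x$:

```python
n = 3
e = (1 << n) - 1                               # e = e_1 + ... + e_n
assert all(x & e == x for x in range(2 ** n))  # x e = x for every x
```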
Finally we’re done: such an $R$ is exactly the ring whose elements are the subsets of $\{ 1, \ldots, n \}$, whose addition is symmetric difference, and whose multiplication is intersection. This coincides with the aforementioned $R \cong (\mathbb Z / 2 \mathbb Z)^n$ in A Finite Boolean Ring is Generated by Finitely Many Copies of $\mathbb{Z} / 2 \mathbb{Z}$.
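For completeness, a minimal sketch of this concluding description, with subsets represented as Python `frozenset`s (symmetric difference `^`, intersection `&`); again $n = 3$ and all names are illustrative:

```python
from itertools import combinations

n = 3
U = frozenset(range(1, n + 1))
R = [frozenset(c) for k in range(n + 1) for c in combinations(U, k)]
add = lambda a, b: a ^ b   # symmetric difference
mul = lambda a, b: a & b   # intersection

assert len(R) == 2 ** n
assert all(mul(a, a) == a and add(a, a) == frozenset() for a in R)
assert all(mul(a, U) == a for a in R)          # {1, ..., n} is the identity
assert all(mul(a, add(b, c)) == add(mul(a, b), mul(a, c))
           for a in R for b in R for c in R)   # distributivity
```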