
I have read the so-called Bit-Sum Lemma from

Neil Immerman. "Descriptive Complexity" (Lemma 1.18) and from

Barrington, Immerman and Straubing. "On uniformity within $NC^1$" (Lemma 7.2)

but I am unable to understand a small part of the proof.

The statement of the Bit-Sum Lemma goes as follows:

Let $BSUM(x,y)$ hold iff $y$ equals the sum of the bits in the binary representation of $x$. Then $BSUM$ is first-order expressible using a total order on the universe and the BIT predicate (recall that $BIT(x,y)$ holds iff the $y$-th bit of $x$ is $1$).

A rough sketch of the proof, as given in both of the above sources, goes as follows. Assume that $\log n \geq (\log \log n)^2$ (the finitely many $n$ for which this fails can be handled by a first-order formula). We think of the binary representation of every element of the universe as divided into $\log \log n$ parts, each of length $\log \log n$ bits. We guess (i.e., we existentially quantify over) some $z$ such that part $i$ of $z$ contains the sum of the bits of $x$ up to and including part $i$. In order to assert that this choice of $z$ is correct, we guess another element of the universe $w$ such that part $i$ of $w$ contains exactly the sum of the bits of part $i$ of $x$. Using carry look-ahead addition we then verify that our choice of $z$ is consistent: part $i$ of $z$ equals part $i-1$ of $z$ plus part $i$ of $w$. Finally, $y$ is equal to the final part of $z$.
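To see the bookkeeping concretely, here is a minimal Python sketch of the witnesses $w$ and $z$ from the proof sketch (the function name and the concrete parameters are mine, purely for illustration; the lemma itself is about FO-expressibility, which Python of course does not capture):

```python
def bit_sum_witnesses(x, num_blocks, B):
    """Split x into num_blocks blocks of B bits (block 0 = least significant)
    and return the witnesses w (per-block bit sums) and z (prefix sums)."""
    blocks = [(x >> (i * B)) & ((1 << B) - 1) for i in range(num_blocks)]
    # w: block i holds the sum of the bits of block i of x
    w = [bin(b).count("1") for b in blocks]
    # z: block i holds the sum of the bits of blocks 0..i of x
    z, total = [], 0
    for s in w:
        total += s
        z.append(total)
    return w, z

# Example: x = 0b1101_0111 split into two blocks of 4 bits.
w, z = bit_sum_witnesses(0b11010111, 2, 4)
assert w == [3, 3]                            # per-block bit sums
assert z == [3, 6]                            # running (prefix) sums
assert z[-1] == bin(0b11010111).count("1")    # y = final part of z
# The FO formula only checks the local condition z[i] = z[i-1] + w[i]:
assert all(z[i] == (z[i - 1] if i else 0) + w[i] for i in range(len(w)))
```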

The only thing I am missing in the above proof is how we can assert that part $i$ of $z$ equals the sum of the bits in part $i$ of $x$. Both sources treat this verification as trivial. But if this verification is trivial, why is it not trivial to verify from the very beginning that $y$ equals the bit sum of $x$? The answer should probably depend on the fact that $x$ is $\log n$ bits long, whereas part $i$ is only $\log \log n$ bits long. I am unable to see why this restriction in length makes computing the bit sum easier. Any help would be appreciated.

1 Answer


This is better explained in Lemmas 1 and 2 of Durand, Lautemann, and More, "Counting Results in Weak Formalisms":

  • Lemma 1: There's a FO[BIT] formula $B(L, x, i, y)$ which holds iff $x < 2^L$ and the first block of length $L$ in $x$ is the same as the $i$th block of length $L$ in $y$.
    Their proof uses multiplication (to form $iL$) and addition (to range within the block). However, if you assume that $L$ is a power of $2$, then $iL$ is easy to compute with BIT and addition, since it is a bit-shift. That addition is expressible in FO[BIT] is Proposition 1.9 in Immerman (carry look-ahead).
  • Lemma 2 (reworked for your specific question): There's an FO[BIT] formula $C(L, z, P)$ which holds iff $z$ is the number of values $x \in \{0, \ldots, 2^L-1\}$ such that $P(x)$ holds.
    Proof: We existentially quantify a $y$ whose first $z$ blocks of length $L$ are all distinct and are exactly the values that satisfy $P$. In symbols, there exists $y$ such that:
    • $(\forall i, i' < z)(\forall x < 2^L)[(B(L, x, i, y) \land B(L, x, i', y)) \to (i = i')]$, i.e., all the blocks of $y$ are distinct (note that $x < 2^L$ is easy to express with BIT);
    • $(\forall x < 2^L)[P(x) \leftrightarrow (\exists i < z)[B(L, x, i, y)]]$, i.e., every $x$ that satisfies $P$ is a block of $y$.
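To make the semantics of the two lemmata concrete, here is a hedged Python sketch (the function names mirror the formulas $B$ and $C$, but the code merely checks the *meaning* of the formulas by brute force; the content of the paper is that these checks are FO[BIT]-definable, which Python does not show):

```python
def B(L, x, i, y):
    """Holds iff x < 2**L and the i-th length-L block of y equals x."""
    return x < (1 << L) and ((y >> (i * L)) & ((1 << L) - 1)) == x

def C(L, z, P):
    """Holds iff z = |{x < 2**L : P(x)}|, verified via a witness y as in
    the proof of Lemma 2 (here we construct y instead of quantifying it)."""
    sat = [x for x in range(1 << L) if P(x)]
    if len(sat) != z:
        return False
    # Witness y: pack the z satisfying values as its first z blocks.
    y = sum(x << (i * L) for i, x in enumerate(sat))
    # First condition: the first z blocks of y are pairwise distinct.
    distinct = all(not (B(L, x, i, y) and B(L, x, j, y)) or i == j
                   for x in range(1 << L)
                   for i in range(z) for j in range(z))
    # Second condition: P(x) holds iff x occurs among the first z blocks.
    covers = all(P(x) == any(B(L, x, i, y) for i in range(z))
                 for x in range(1 << L))
    return distinct and covers

# Example: there are exactly 4 odd numbers below 2**3.
assert C(3, 4, lambda x: x % 2 == 1)
assert not C(3, 3, lambda x: x % 2 == 1)
```

Note the design point the lemma exploits: since the counted values all fit in $L$ bits, listing *all* of them inside a single element $y$ is feasible, which is exactly what fails when one tries to count the bits of $x$ directly.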

Clearly, this technique cannot be extended beyond log counting, since $y$ has only $\log n$ bits and can therefore hold at most $\log n$ blocks.

  • 1
    It is possible to count polylog bits by iterating log counting. – Emil Jeřábek Jun 17 '25 at 05:37
  • @EmilJeřábek: Yes! And this is the purpose of the lemmata that follow Lemma 2 in the paper I quote. Therein, they rely on induction on $s$ to count up to $\log^s n$. I think you're pointing at the statement "this technique cannot be extended beyond log counting" as being maybe an overstatement — that's a fine point; the extra lemmata do not rely on this exact technique, but inductively build on it. In any case, I hope these comments will point readers in the right direction ☺ Cheers! – Michaël Cadilhac Jun 19 '25 at 04:57