The Elementary Recursive functions are, roughly speaking, all the functions $\mathbb{N}^k\to \mathbb{N}$ which are definable using only $\sum$ summation and $\prod$ product notation. The complexity class $\sf{ELEMENTARY}$ consists of every decision problem which can be solved in time bounded by some elementary recursive function (equivalently, a finitely iterated exponential). We consider the related class of computable functions which can be computed within $\sf{ELEMENTARY}$ time. It's somewhat trivial that every Elementary Recursive function is within that set. My question is about whether the converse is true.

If $F$ is a total computable function, and $F$ can be computed within $\sf{ELEMENTARY}$ time, then is $F$ necessarily an Elementary Recursive function? That is, can we define $F$ using only the operations $\sum,\prod,0,1$?

My question is essentially about how expressive the elementary recursive functions are. If they can implement any total function which can be computed in a "reasonable" amount of time, then we can say the ER functions are highly expressive. The analogous question about primitive recursive functions has an affirmative answer, and this conveys how expressive PR functions really are. I get the feeling that Elementary Recursive functions have a similar degree of expressiveness.

Jade Vanadium

1 Answer

Yes, the functions computable in $\sf{ELEMENTARY}$ time are necessarily Elementary Recursive. This ultimately just comes down to the fact that elementary recursive functions can encode arbitrary $\Delta_0$ arithmetical formulae, as well as the bounded minimization operator. In fact, even the Lower Elementary Recursive functions can do this. We can also implement any typical Gödel coding of finite sequences.

With the help of exponentiation, it becomes possible to simulate arbitrary computer programs by reconstructing their program trace. Provided a computable function $F$ terminates in $\sf{ELEMENTARY}$-time, the output $F(x)$ can be obtained from any program trace longer than the computation time, and so $F$ itself is Elementary Recursive.


We start by recovering some basic arithmetic. Note that $\mathrel{\dot-}$ denotes the truncated subtraction. $$\begin{align} x\cdot y &:= \sum_{k=1}^y x \\ x+1 &:= \sum_{k=0}^x 1 \\ y\mathrel{\dot-}x &:= \sum_{k=x+1}^y 1 \\ x+y &:= (x+1)\cdot (y+1)\mathrel{\dot-}((x+1)\cdot (y+1)\mathrel{\dot-}x\mathrel{\dot-}y) \end{align}$$

The above definition for addition works since $(x+1)\cdot (y+1) > x+y$, so the subtractions never get truncated. Since the successor function is definable from nothing but $0$ and $1$, this confirms that all the other constant functions are dispensable.
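As a sanity check, the definitions above can be transcribed into Python. The helper `sig` stands in for the $\sum$ operator, and the function names are mine, not part of the formal system:

```python
# A sketch of the bounded-sum arithmetic above.  `sig` plays the role of
# the Sigma operator; every other function is defined purely in terms of
# it plus the constants 0 and 1, mirroring the displayed equations.

def sig(lo, hi, body):
    """Sum of body(k) for k = lo, ..., hi; empty (hence 0) when lo > hi."""
    return sum(body(k) for k in range(lo, hi + 1))

def mul(x, y):    # x * y: add up x, exactly y times
    return sig(1, y, lambda k: x)

def succ(x):      # x + 1: add up 1, exactly x+1 times
    return sig(0, x, lambda k: 1)

def monus(y, x):  # y ∸ x, truncated subtraction
    return sig(x + 1, y, lambda k: 1)

def add(x, y):    # x + y = (x+1)(y+1) ∸ ((x+1)(y+1) ∸ x ∸ y)
    p = mul(succ(x), succ(y))
    return monus(p, monus(monus(p, x), y))
```

For instance, `add(2, 3)` yields `5` and `monus(3, 7)` yields `0`, confirming that the subtractions inside `add` never truncate.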

To implement more complex behavior, we need material logic. We can have $0,1$ represent false/true respectively, and the predicate "$x=0$" can be implemented by $1\mathrel{\dot-}x$, which doubles as a definition for negation. The entirety of material logic, including the bounded quantifiers, can thus be implemented like so. $$\begin{align} \neg x &:= (1\mathrel{\dot-}x) \\ x\land y &:= x\cdot y \\ x\lor y &:= \neg(\neg x \land \neg y) \\ \exists[k<x : \phi(k)] &:= \neg\neg\sum_{k=1}^x \phi(k-1) \\ \forall[k<x : \phi(k)] &:= \neg\exists[k<x : \neg\phi(k)] \\ \end{align}$$
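These connectives can likewise be sketched in Python. Here ordinary (clamped) subtraction stands in for the derived $\mathrel{\dot-}$, predicates are functions returning $0$ or $1$, and the names `bexists`/`bforall` are my own:

```python
# Boolean connectives and bounded quantifiers over 0/1 values, mirroring
# the displayed definitions.  max(1 - x, 0) stands in for 1 ∸ x.

def neg(x):      # ¬x = 1 ∸ x
    return max(1 - x, 0)

def land(x, y):  # x ∧ y = x · y
    return x * y

def lor(x, y):   # x ∨ y = ¬(¬x ∧ ¬y)
    return neg(land(neg(x), neg(y)))

def bexists(x, phi):  # ∃[k < x : phi(k)]; double negation clamps to 0/1
    return neg(neg(sum(phi(k - 1) for k in range(1, x + 1))))

def bforall(x, phi):  # ∀[k < x : phi(k)]
    return neg(bexists(x, lambda k: neg(phi(k))))
```

Note that `bforall(0, phi)` correctly returns `1`: a universal claim over an empty range is vacuously true.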

To implement any $\Delta_0$ arithmetical predicate, we just need the equality relation. We can implement $\leq$ by comparing a truncated subtraction against $0$, and equality can be obtained from that. Finally, the bounded minimization operator should return the least object satisfying a property in a given range, if such an object exists. That minimum can be found by counting how many elements lie strictly below it.

$$\begin{align} [x\leq y] &:= \neg(x\mathrel{\dot-}y) \\ [x=y] &:= (x\leq y)\land (y\leq x) \\ \min\{k<x : \phi(k)\} &:= \sum_{k=1}^x \forall[j<k : \neg\phi(j)] \end{align}$$
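The counting trick behind bounded minimization is easy to verify in Python (with built-ins again standing in for the derived operations; `bmin` is my name for it):

```python
# min{k < x : phi(k)} by counting: for each k = 1..x, add 1 exactly when
# no j < k satisfies phi.  If the least witness is m, then k = 1..m each
# contribute 1 and every later k contributes 0, so the sum is m.  When no
# witness exists below x, every k contributes 1 and the result is x.

def bmin(x, phi):
    return sum(
        1 if all(not phi(j) for j in range(k)) else 0
        for k in range(1, x + 1)
    )
```

For example, `bmin(10, lambda k: k >= 4)` returns `4`, while `bmin(5, lambda k: False)` returns the default value `5`.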

Since we have addition, multiplication, the $=$ relation, all logical connectives, and bounded quantification, it's clear that any $\Delta_0$ arithmetical formula can be implemented by some Lower Elementary Recursive predicate, meaning an ER predicate defined without invoking $\prod$ products. We also have the bounded minimization operator, as promised. The majority of Gödel coding can also be implemented using LER functions. We'll use the standard technique via prime factorizations, locating the $n$-th prime $P_n$ beneath the bound $n\cdot n+2$. $$\begin{align} \lfloor x/y\rfloor &:= \min\{d<x+1 : x<y\cdot(d+1)\}\\ \mathbb{P}(n) &:= (1<n)\land \forall[k<n : \forall[j<n : \neg(n=j\cdot k)]] \\ \pi(n) &:= \sum_{k=0}^n \mathbb{P}(k)\\ P_n &:= \min\{p<n\cdot n+2 : \pi(p)=n\} \\ (d|n) &:= \exists[j<n+1 : d\cdot j = n] \\ \nu(x,n) &:= \sum_{k=P_n}^x (k|x)\land \forall[d<k+1 : \neg(d|k) \lor \neg\mathbb{P}(d) \lor d=P_n] \\ f[n] &:= \nu(f+1,n+1) \end{align}$$

For each eventually-zero $F:\mathbb{N}\to\mathbb{N}$ (i.e. $F(x)=0$ for all sufficiently large $x$), there's exactly one $f\in\mathbb{N}$ where $f[x]=F(x)$ for every $x$. Moreover, $f$ can be obtained from $f=\left(\prod_{n=0}^N P_{n+1}^{F(n)}\right)-1$ for sufficiently large $N$.
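The coding functions translate directly as well. The Python below (names mine, with built-ins in place of the derived bounded operators) checks that $\nu$-decoding inverts the product formula:

```python
# Prime coding of sequences.  nth_prime(n) is P_n, nu(x, n) counts the
# powers of P_n dividing x (i.e. the exponent of P_n in x), and
# decode(f, n) is the f[n] of the text.  Slow but faithful.

def is_prime(n):   # P(n): no factorization of n with both parts below n
    return n > 1 and all(n != j * k for k in range(2, n) for j in range(2, n))

def pi(n):         # prime-counting function, summing P(k) for k <= n
    return sum(1 for k in range(n + 1) if is_prime(k))

def nth_prime(n):  # P_n = min{p < n*n + 2 : pi(p) = n}; so P_1 = 2, P_2 = 3
    return min(p for p in range(n * n + 2) if pi(p) == n)

def nu(x, n):      # powers of P_n dividing x: counts k = P_n, ..., P_n^e
    p = nth_prime(n)
    return sum(
        1
        for k in range(p, x + 1)
        if x % k == 0 and all(k % d != 0 or not is_prime(d) or d == p
                              for d in range(1, k + 1))
    )

def decode(f, n):  # f[n] := nu(f + 1, n + 1)
    return nu(f + 1, n + 1)

def encode(values):  # f = (product of P_{n+1}^values[n]) - 1
    f = 1
    for n, v in enumerate(values):
        f *= nth_prime(n + 1) ** v
    return f - 1
```

For example, `decode(encode([3, 0, 5]), 0)` recovers `3` and `decode(encode([3, 0, 5]), 2)` recovers `5`.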


From here, it's a standard result that each total computable function can be defined by a $\Sigma_1$ arithmetical formula. This means that a total computable $F:x\mapsto y$ can be defined by a formula of the form $\exists n, \phi(x,y,n)$, where $\phi$ is some $\Delta_0$ arithmetical formula. Basically, $n$ represents a program trace of the relevant computation continued up to a halting state, and $\phi$ is just defined in terms of Kleene's $T$ predicate.

Using $N(x):=\min\{n : \exists y, \phi(x,y,n)\}$, we can see $F(x)=\min\{y : \phi(x,y,N(x))\}$. Under our coding of sequences, we may assume $F(x)\leq N(x)$, and moreover, if $F(x)$ can be computed in $\sf{ELEMENTARY}$ time, then $N(x)$ is bounded above by an ER function, say $N(x)\leq B(x)$. This works because the trace code $N(x)$, as a number, is at most exponential in the length of the computation. Since we assumed the computation time for $F$ is bounded by an Elementary Recursive function, and ER functions are closed under exponentiation, $N(x)$ is bounded similarly.

Finally, $\phi(x,y,n)$ is an Elementary Recursive predicate since it's $\Delta_0$, the function $N(x)=\min\{n\leq B(x) : \exists(y\leq n), \phi(x,y,n)\}$ is Elementary Recursive since we can implement the bounded minimization operator, and lastly $F(x) = \min\{y\leq N(x) : \phi(x,y,N(x))\}$ is Elementary Recursive for the same reason. This concludes the proof.
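As a toy illustration of this final step, the Python below extracts a function from its defining predicate by bounded search. The predicate `phi` here is an invented stand-in for the Kleene-style $\Delta_0$ predicate, and `B` an invented stand-in for the ER bound, chosen for the toy target $F(x)=x^2$; they are not the real constructions:

```python
# Toy version of the final extraction.  phi(x, y, n) is an INVENTED
# predicate declaring n a valid "trace" certifying y = F(x) for the toy
# function F(x) = x^2: the trace must be long enough (n >= y) and must
# certify the correct output.  B is a stand-in for the ER bound on N.

def phi(x, y, n):
    return n >= y and y == x * x

def B(x):      # assumed ER bound on the least valid trace
    return x * x + 1

def N(x):      # N(x) = min{n <= B(x) : exists y <= n with phi(x, y, n)}
    return min(n for n in range(B(x) + 1)
               if any(phi(x, y, n) for y in range(n + 1)))

def F(x):      # F(x) = min{y <= N(x) : phi(x, y, N(x))}
    return min(y for y in range(N(x) + 1) if phi(x, y, N(x)))
```

Both searches are bounded, so under the assumptions of the proof each is an instance of the bounded minimization operator implemented earlier.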

Jade Vanadium
  • As a corollary: ER can perform any instance of primitive recursion where the resulting function is bounded above by an ER function. Suppose $g(n+1)=f(g(n))$ recursively, where $f$ is ER. Then there are monotone ER functions $B$ and $T$ where $g(n)\leq B(n)$, and where $T(b)$ bounds how long it takes to compute $f(b)$. We easily prove that the time it takes to compute $g(n)$ is bounded by roughly $\sum_{x<n} T(B(n))$, which is ER. Therefore $g$ is $\sf{ELEMENTARY}$-time computable, and thus Elementary Recursive. – Jade Vanadium Dec 23 '24 at 01:51