
Two closely related questions about ordinals that I found quite confusing at first and couldn't find a satisfactory answer online (self-answering):

  • I've heard sentences like "$\omega^{CK}$ is the least ordinal that cannot be represented on a computer". What does this mean, precisely? Surely after exhausting notations for all computable ordinals, I could just represent $\omega^{CK}$ with something else? E.g. even in Kleene's $\mathcal{O}$ notation, I could just denote $\omega^{CK}$ by $-1$ or something?

  • We say things like "the strength of a theory corresponds to the smallest recursive ordinal it cannot prove well-founded" -- but we know every ordinal is well-founded, right? So are we better than every theory?

  • If we know there exist ordinals we aren't even capable of thinking about, maybe we know all ordinals are well-founded, but can't prove each ordinal is well-founded. – aschepler Jun 16 '23 at 16:43
  • @aschepler Socrates is an ordinal, all ordinals are well-founded, therefore Socrates is well-founded. TL;DR of my answer is that it's better to phrase it as: $T$ can't prove each well-founded recursion is well-founded // i.e. each total computable function is total. Of course each well-founded recursion corresponds to an ordinal, but it is not clear that a given recursion represents an ordinal either, so that doesn't help us. – Abhimanyu Pallavi Sudhir Jun 16 '23 at 19:00
  • Sure. I was getting more at the Gödel Incompleteness angle, and that knowing something is not the same as being able to prove something. This is heading toward philosophy.stackexchange, though. – aschepler Jun 16 '23 at 19:53
  • @AbhimanyuPallaviSudhir Yes, that's a reasonable way to look at it. But deciding whether a partial recursive function is total is much easier than deciding whether a recursively presented ordering is well-founded. (The former is arithmetical, but the latter is not.) – Mitchell Spector Jun 16 '23 at 20:09
  • @aschepler That is not the point of Gödel's incompleteness theorem, and the theorem applies to "knowing" in any sense of the term; see my previous answer. – Abhimanyu Pallavi Sudhir Jun 16 '23 at 22:38
  • @MitchellSpector I'm confused -- isn't a "recursively presented ordering" always a partial recursive function? – Abhimanyu Pallavi Sudhir Jun 16 '23 at 22:38
  • @AbhimanyuPallaviSudhir Determining whether a partial recursive function $\varphi_e$ is total is arithmetic: $\forall m \exists n (\varphi_e(m)\text{ converges in }n\text{ steps}).$ In contrast, determining whether an ordering is well-founded requires a second-order quantifier; you need to check that every function from $\omega$ to the ordering fails to be an infinite descending chain. The first is much simpler than the second. – Mitchell Spector Jun 17 '23 at 01:45
  • @MitchellSpector I get that, I'm just confused because doesn't a recursively presented ordering always define a partial recursive function? Why can't we just check if that function is total? – Abhimanyu Pallavi Sudhir Jun 17 '23 at 18:01
  • You can represent it as a partial recursive function if you want to (although more commonly it would be as a recursive set or an r.e. set). But there's no way to set it up so that testing if the partial recursive function is total is the same thing as testing whether the ordering is well-founded. Testing a partial recursive function to see if it's total is low in the arithmetical hierarchy $(\Pi^0_2),$ while testing for well-foundedness is only $\Pi^1_1$ (and not arithmetical or even hyperarithmetical). – Mitchell Spector Jun 17 '23 at 23:41

2 Answers


First, a minor notational point: The ordinal in question is normally denoted $\omega_1^\mathrm{CK},$ with a subscript $1,$ not just $\omega^\mathrm{CK}.$ (It's Church and Kleene's effective analogue to $\omega_1,$ the first uncountable ordinal, not an analogue to $\omega.)$

As to the actual question, an ordinal $\alpha$ is said to be computable (or recursive) if there is a computable subset of $\omega\times\omega$ which, as a relation, is a well-ordering of $\omega$ of order type $\alpha.$

(By the way, this is a robust notion, in that you can look at recursively enumerable subsets of $\omega\times\omega,$ or arithmetically definable subsets, or even hyperarithmetically definable subsets, and you'll get exactly the same computable ordinals.)
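
For concreteness, here is one possible sketch in Python (illustrative only; the name lt_omega_times_2 is made up) of a computable well-ordering of $\omega$ of order type $\omega\cdot 2$: the even numbers in their usual order, followed by the odd numbers in their usual order. This witnesses that $\omega\cdot 2$ is a computable ordinal.

def lt_omega_times_2(m : int, n : int) -> bool:
  # True iff m precedes n in a well-ordering of the naturals of order type omega * 2
  if m % 2 != n % 2:
    return m % 2 == 0    # every even number precedes every odd number
  return m < n           # within each block, use the usual order

# sanity check: in this ordering, 4 precedes 100, which precedes 1, which precedes 7
assert lt_omega_times_2(4, 100) and lt_omega_times_2(100, 1) and lt_omega_times_2(1, 7)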

Every computable ordinal is clearly a countable ordinal, since it is the order type of a well-ordering of $\omega.$

There are only countably many computable subsets of $\omega\times\omega,$ so there are only countably many computable ordinals.

So there are ordinals that are not computable, and the least such ordinal must be countable. That least non-computable ordinal is called $\omega_1^\mathrm{CK}.$

It's easy to see that if $\alpha$ is a computable ordinal and $\beta<\alpha,$ then $\beta$ is also computable.

It follows that the computable ordinals are precisely the ordinals that are less than $\omega_1^\mathrm{CK}.$

Of course, as suggested in the question, you can give $\omega_1^\mathrm{CK}$ a name (for example, "$\omega_1^\mathrm{CK}$") and "represent" it in the sense that it now has a name. But that doesn't give us a recursive representation of an actual ordering of order type $\omega_1^\mathrm{CK}.$

What happens if we take recursive representations of all the smaller ordinals (all those smaller ordinals are computable, after all) and piece them together? It turns out that there is no computable way to do this, so we can't get a computable well-ordering in this fashion. And this makes sense, since, by definition, the ordinal $\omega_1^\mathrm{CK}$ is not computable.


A second way of looking at this is via notations for computable ordinals.

Define what it means for a natural number to represent a computable ordinal by induction, as follows:

• The natural number $0$ represents the ordinal $0.$

• If $n$ represents $\alpha,$ then $2^n$ represents $\alpha+1.$

• If $\varphi_e$ happens to be a total recursive function such that $\varphi_e(0), \varphi_e(1), \varphi_e(2), \dots$ represent computable ordinals $\alpha_0, \alpha_1, \alpha_2, \dots,$ then $3^e$ represents $\sup_{n<\omega}\alpha_n.$ (Here $\varphi_e$ represents the $e^\mathrm{th}$ partial recursive function in some standard ordering.)

(This is a bit of a simplification. You'd actually define the natural ordering of these representations as part of the same induction, so you can require that $\alpha_0<\alpha_1<\alpha_2<\dots.$ By the way, if you read this in the literature, usually $3\cdot 5^e$ is used instead of $3^e,$ but this is just a historical accident that doesn't matter for anything. Also, just to be clear, although this is a definition by induction, it can't be formulated in Peano arithmetic. You can do it in ZF, of course.)
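
As a toy illustration of these clauses (a sketch only; the helper fin_notation is made up, and the enumeration $\varphi_e$ is of course not implemented), here is the canonical chain of notations for the finite ordinals in Python:

def fin_notation(k : int) -> int:
  # the canonical notation for the finite ordinal k: 0, then 2^0, then 2^(2^0), ...
  n = 0
  for _ in range(k):
    n = 2 ** n
  return n

assert [fin_notation(k) for k in range(5)] == [0, 1, 2, 4, 16]

# If e is an index with phi_e(k) == fin_notation(k) for all k (such an e exists, since
# fin_notation is computable), then 3^e is a notation for omega, 2^(3^e) for omega + 1, etc.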

The set of all numbers that represent some computable ordinal is called Kleene's $\scr O.$ The natural ordering of these representations is called $<_\scr O.$

Every computable ordinal has a representation in $\scr O,$ and every computable ordinal greater than or equal to $\omega$ has infinitely many representations in $\scr O.$

You can see that $\langle \scr O, <_\scr O\rangle$ is a tree of height $\omega_1^\mathrm{CK}.$

But we can't use this ordering to get a recursive ordering of order type $\omega_1^\mathrm{CK}$ (for example, by finding a recursive branch through the ordering), since $\scr O$ isn't recursive. (It's not even arithmetically definable; $\scr O$ is a complete $\Pi^1_1$ set.)


For completeness, I'll mention that there's a third way of looking at all this: $\omega_1^\mathrm{CK}$ is the least ordinal $\alpha$ such that $L_\alpha$ (in Gödel's constructible hierarchy) is a model of Kripke-Platek set theory (which is like ZF but with separation and collection axioms limited to bounded formulas, and without the power set axiom).

This shows that $\omega_1^\mathrm{CK}$ is a very natural closure ordinal.


As for the last question — we're better than many theories. For example, $\epsilon_0$ is a computable ordinal, and we can write down a recursive definition of a well-ordering of $\omega$ of order type $\epsilon_0.$ But Peano arithmetic can't prove this is a well-ordering.

So this is an example of something that we know is true but that PA can't prove (the fact that the recursive ordering in question is a well-ordering).
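
To make that concrete, here is a sketch in Python (the coding and the name less are just for illustration) of a computable comparison of Cantor-normal-form codes: a code $(a_1,\dots,a_k)$ with $a_1\ge a_2\ge\dots\ge a_k$ (as codes) stands for $\omega^{a_1}+\dots+\omega^{a_k},$ and the empty tuple stands for $0.$ Composing this with any computable bijection between $\omega$ and the set of well-formed codes gives a computable well-ordering of $\omega$ of order type $\epsilon_0$ -- and PA cannot prove it is a well-ordering.

def less(a : tuple, b : tuple) -> bool:
  # compare two well-formed codes by comparing their exponent lists lexicographically
  if a == ():
    return b != ()               # 0 is below every nonzero ordinal
  if b == ():
    return False
  if a[0] != b[0]:
    return less(a[0], b[0])      # compare the leading exponents recursively
  return less(a[1:], b[1:])      # equal leading terms: compare the remainders

# examples: 1 = ((),), omega = (((),),), omega + 1 = (((),), ()), omega^omega = ((((),),),)
assert less(((),), (((),),))           # 1 < omega
assert less((((),), ()), ((((),),),))  # omega + 1 < omega^omega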

When you go up to more powerful theories (for example, ZFC with large cardinals), we don't even know whether these theories are consistent, and, if they are, whether they actually have well-founded models. So who is correct – the theory ZFC + “there exists a ‘super-duper’ cardinal”, or the person who says “I don’t believe that ‘super-duper’ cardinals exist”? We just don’t know.

As far as I'm aware, the proof-theoretic ordinals of ZFC or more powerful theories haven't been extensively studied; it doesn't seem that there's a whole lot one can say about them (except that they're very complicated countable ordinals, much larger than $\epsilon_0.)$

  • Forgive me if I've just missed something in your post, but what does $\varepsilon_0$ mean? – Rob Arthan Jun 29 '23 at 20:59
  • @RobArthan $\epsilon_0$ is the least ordinal $\alpha$ such that $\omega^\alpha=\alpha.$ Alternatively, you can define it as the sup of the sequence $\omega,\omega^\omega,\omega^{\omega^\omega}, \omega^{\omega^{\omega^\omega}},\dots.$ There's some info on Wikipedia at: https://en.wikipedia.org/wiki/Epsilon_number – Mitchell Spector Jun 30 '23 at 02:49

implementing the ordinals

The thing that really clarified this for me was to actually implement ordinals in Python. Implementations in Haskell are plentiful of course (e.g.) but Haskell doesn't feel as grounded, since you don't actually see the computations being done.

from __future__ import annotations
from typing import Callable

class Ordinal:
  def __init__(self, children : None | Ordinal | Callable[[int], Ordinal]):
    self.children = children

That's it. An ordinal is defined by its "children" -- if that's None, it's 0; if that's a single ordinal, it's the successor of that ordinal; if that's an infinite sequence of ordinals, it's the limit of that sequence.

You can then define some obvious operations.

Zero = Ordinal(None)

def Succ(x : Ordinal) -> Ordinal: return Ordinal(x)

def Lim(xs : Callable[[int], 'Ordinal']) -> Ordinal: return Ordinal(xs)

def Add(x : Ordinal, y : Ordinal) -> Ordinal:
  # recursion on y: x + 0 = x, x + (z+1) = (x+z) + 1, x + lim = lim of x + children
  if not y.children:
    return x
  elif type(y.children) == Ordinal:
    return Succ(Add(x, y.children))
  else:
    return Lim(lambda n : Add(x, y.children(n)))

def Mul(x : Ordinal, y : Ordinal) -> Ordinal:
  # recursion on y: x * 0 = 0, x * (z+1) = x*z + x, x * lim = lim of x * children
  if not y.children:
    return Zero
  elif type(y.children) == Ordinal:
    return Add(Mul(x, y.children), x)
  else:
    return Lim(lambda n : Mul(x, y.children(n)))

def Pow(x : Ordinal, y : Ordinal) -> Ordinal:
  # recursion on y: x^0 = 1, x^(z+1) = x^z * x, x^lim = lim of x^children
  if not y.children:
    return Succ(Zero)
  elif type(y.children) == Ordinal:
    return Mul(Pow(x, y.children), x)
  else:
    return Lim(lambda n : Pow(x, y.children(n)))

And represent some small ordinals:

# ordinal corresponding to some finite n
def Fin(n : int) -> Ordinal:
  x = Zero
  for i in range(n):
    x = Succ(x)
  return x

Omega = Lim(Fin)

# ordinal corresponding to the tower omega^omega^...^omega of height n
def Foo(n : int) -> Ordinal:
  x = Succ(Zero)
  for i in range(n):
    x = Pow(Omega, x)
  return x

Epsilon0 = Lim(Foo)
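
As a quick sanity check of these definitions, you can see ordinal arithmetic's non-commutativity directly in the structure of the objects: $1+\omega$ comes out as a limit (it equals $\omega$), while $\omega+1$ comes out as a successor.

one = Succ(Zero)
print(callable(Add(one, Omega).children))         # True: 1 + omega is a limit node
print(type(Add(Omega, one).children) == Ordinal)  # True: omega + 1 is a successor node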


explicitly seeing them as functions

What stops us from making an infinite sequence of all (computable) ordinals? That would be paradoxical, of course -- if you had an ordinal whose children/descendants included itself, its definition would not be a well-founded recursion; it would be circular. But it's easy to enumerate all programs -- is it really so hard to enumerate specifically the programs that represent objects of the Ordinal class?

Let's think of a way to reduce an ordinal into a simple-to-understand program. The natural way to do this is:

def call(x : Ordinal) -> Callable | None:
  if not x.children:
    return None
  elif type(x.children) == Ordinal:
    return lambda: call(x.children)
  else:
    return lambda n: call(x.children(n))

So e.g. call(Epsilon0) returns a function that you can keep calling (with an index at limit stages, and with no argument at successor stages), as follows:

$$\epsilon_0(2)=\omega^\omega$$ $$\omega^\omega(4) = \omega^4$$ $$\omega^4(5) = \omega^3\cdot 5$$ $$\omega^3\cdot 5(2) = \omega^3\cdot4+\omega^2\cdot2$$ $$\omega^3\cdot4+\omega^2\cdot2(4) = \omega^3\cdot4+\omega^2+\omega\cdot4$$ $$\omega^3\cdot4+\omega^2+\omega\cdot4(3)=\omega^3\cdot4+\omega^2+\omega\cdot3+3$$ $$\omega^3\cdot4+\omega^2+\omega\cdot3+3()()()=\omega^3\cdot4+\omega^2+\omega\cdot3$$ ... and ultimately $\epsilon_0(2)(4)(5)(2)(4)(3)()()()(1)()(1)()(1)()(2)(2)()()(1)()(0)(0)(0)(1)(2)(3)()()()(2)()()=0$, which you can check by running

print(call(Epsilon0)(2)(4)(5)(2)(4)(3)()()()(1)()(1)()(1)()(2)(2)()()(1)()(0)(0)(0)(1)(2)(3)()()()(2)()())

You may think this is quite long, but I've actually chosen fairly small numbers at each step. The ridiculous thing is that despite these paths being absurdly long, all of them do, in fact, terminate! This should surprise you at first -- you might think you can just provide larger and larger numbers at each step and get a non-terminating sequence, but nope.

$$\epsilon_0(2)(3)(4)(5)(6)()^6(7)()^7(8)()^8(9)()^9(10)(11)(12)()^{12}(13)()^{13}\dots(22)()^{22}(23)(24)^{24}(25)^{25}\dots(45)()^{45}(46)(47)(48)()^{48}(49)^{49}\dots(94)()^{94}=0$$
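
If you want to watch this happen without typing out a path by hand, here is a small helper (descend and choose are names I'm making up for this sketch) that walks down from an ordinal, asking a callback which child to take at each limit stage, and counts the steps:

def descend(x : Ordinal, choose : Callable[[int], int]) -> int:
  # walk down from x until we hit Zero; choose(step) picks the child at limit stages.
  # The claim above is that this loop halts no matter what choose does.
  steps = 0
  while x.children is not None:
    if type(x.children) == Ordinal:
      x = x.children                   # successor: step down by one
    else:
      x = x.children(choose(steps))    # limit: jump to the chosen child
    steps += 1
  return steps

print(descend(Epsilon0, lambda step: 2))  # always choosing 2 bottoms out after a few steps
# choosing larger and larger values makes the descent astronomically longer -- but it
# still terminates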

In general, the main point is that these sequences always terminate, i.e. the ordinals represent total functions. But given an arbitrary such function, it is not clear whether it represents an ordinal. This is still a bit of a claim to unpack, at least for me, so it is helpful to think of ordinals from an entirely different perspective --


nested for loops

Remember the first time you actually bothered to implement recursion, because nested for loops wouldn't suffice? while loops would do it, of course, but often that's inelegant.

Generally speaking, if the only function we are allowed to use without defining it is the successor function (as is the case with for-loop programs), then a single for loop suffices to define addition $f_1(n)$, two nested loops to define multiplication $f_2(n)$, three to define exponentiation $f_3(n)$, and so on for any hyperoperation.

And indeed this demonstrates that the hyperoperations exhaust (in terms of growth rate) the primitive recursive functions -- but there are total computable functions that grow faster than any hyperoperation (i.e. faster than any primitive recursive function), which you can define by diagonalization: $f_\omega(n)$ uses $n$ nested for loops, where $n$ is the input value (this is basically the "Ackermann function"), and you can go further and further. This is called the fast-growing hierarchy.
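
To tie this back to the Ordinal class above, here is one way to sketch the fast-growing hierarchy indexed by those objects (fgh is my name for it; the fundamental sequences at limit stages are whatever the Lim constructors happen to bake in):

def fgh(alpha : Ordinal, n : int) -> int:
  if alpha.children is None:
    return n + 1                       # f_0(n) = n + 1
  elif type(alpha.children) == Ordinal:
    for _ in range(n):                 # f_{a+1}(n) = f_a applied n times to n
      n = fgh(alpha.children, n)
    return n
  else:
    return fgh(alpha.children(n), n)   # f_lambda(n) = f_{lambda[n]}(n)

print(fgh(Fin(2), 3))  # f_2(3) = 24; here f_1 doubles and f_2(n) = n * 2^n
print(fgh(Omega, 2))   # f_omega(2) = f_2(2) = 8 -- diagonalizing past the f_k
# fgh(Epsilon0, n) is total for every n, but (with standard fundamental sequences)
# that totality is exactly the kind of statement PA cannot prove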

Quite generally, such total computable functions can be written recursively, by Kleene's fixed-point theorem (such a function is the "fixed point" of an operator like $\lambda (f, x) \mapsto xf(x-1)$ or whatever). A reference that seems good:

Fairtlough & Wainer (1992). Ordinal Complexity of Recursive Definitions.

TL;DR: Specifying an ordinal is equivalent to specifying a particular recursion. It is not immediately obvious that a given recursion is well-founded. ZFC can't actually prove "bla bla recursion equals bla bla big ordinal", even though it (non-constructively) "has" the latter ordinal.

some general concluding tl;dr comments

  • The right way to phrase the statement is "$T$ cannot prove that a given recursive definition is well-founded", i.e. that a given recursively defined function is total. Every recursively defined function is equivalent to a while loop (a partial computable function), and every well-founded one is equivalent to a total computable function, so this is equivalent to saying that $T$ cannot enumerate the total computable functions (don't take it personally -- no one can!)
  • Yes, well-founded recursions can be represented by ordinals, but ZFC does not know which ordinal (if any) corresponds to a given recursive definition. I think this has something to do with Veblen hierarchies and the Feferman-Schütte ordinal, but I'm not sure.