
In computability theory, computable functions are also called recursive functions. At least at first sight, they do not seem to have anything in common with what is called "recursive" in day-to-day programming (i.e., functions that call themselves).

What is the actual meaning of recursive in the context of computability? Why are those functions called "recursive"?

To put it in other words: What is the connection between the two meanings of "recursiveness"?

Raphael
Golo Roden

3 Answers


The founders of computability theory were mathematicians. They founded what is now called computability theory before there were any computers. How did mathematicians define functions that could be computed? By recursive definitions!

So there were recursive functions before there was any other model of computation like Turing machines, the lambda calculus, or register machines, and people referred to these functions as recursive functions. The fact that they turned out to be exactly what Turing machines and the other models can compute is a later result (mostly proven by Kleene).

We have the simple kind of recursive definition, which gives what are now called primitive recursive functions. These were not general enough (e.g. Ackermann's function is computable but not primitive recursive), so people developed more general notions like $\mu$-recursive functions and Herbrand-Gödel general recursive functions, which do capture all computable functions (assuming Church's thesis). Church claimed that his model, the lambda calculus, captured all computable functions. Many people, in particular Gödel, were not convinced that these models capture all functions that can be computed, until Turing's analysis of computation and the introduction of his machine model.

The name of the field used to be recursion theory. However, there has been a successful push in recent decades to change the name from recursion theory to something more appealing and more computer-sciency (vs. mathy). As a result, the field is now called computability theory. But if you look at books, papers, conferences, etc. from the early decades, they say recursion theory and not computability theory. Even Soare (who was the main person behind the push to change the name to computability theory) titled his own 1987 book "Recursively Enumerable Sets and Degrees".

If you want to know more about the history, a fun and good place to read about it is the first chapter of Classical Recursion Theory by Odifreddi.

Kaveh

Define some basic functions:

  • zero function

    $$ zero: \mathbb{N} \rightarrow \mathbb{N} : x \mapsto 0 $$

  • successor function

    $$ succ: \mathbb{N} \rightarrow \mathbb{N} : x \mapsto x + 1 $$

  • projection function

$$p_i^n: \mathbb{N}^n \rightarrow \mathbb{N} : (x_1, x_2, \dots, x_n) \mapsto x_i $$

From now on I will use $\bar{x_n}$ to denote $(x_1, x_2, \dots, x_n)$.
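For concreteness, here is a minimal Python sketch of these basic functions (the names `zero`, `succ` and `proj` are my own choice for illustration, not standard):

```python
def zero(x):
    """zero : N -> N, x |-> 0"""
    return 0

def succ(x):
    """succ : N -> N, x |-> x + 1"""
    return x + 1

def proj(i, n):
    """Return the projection p_i^n : N^n -> N, (x_1, ..., x_n) |-> x_i (1-indexed)."""
    def p(*xs):
        assert len(xs) == n, "p_i^n expects exactly n arguments"
        return xs[i - 1]
    return p
```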

Define a composition:

Given functions

  • $g_1, g_2, \dots, g_m$ each with signature $\mathbb{N}^k \rightarrow \mathbb{N}$
  • $f : \mathbb{N}^m \rightarrow \mathbb{N}$

Construct the following function:

$$ h : \mathbb{N}^k \rightarrow \mathbb{N} : \bar{x_k} \mapsto f\big( g_1(\bar{x_k}), g_2(\bar{x_k}), \dots, g_m(\bar{x_k}) \big) $$
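As a rough Python sketch of this scheme (the name `compose` is mine), reusing the basic functions above:

```python
def compose(f, *gs):
    """Given f : N^m -> N and g_1, ..., g_m : N^k -> N, build
    h(x_1, ..., x_k) = f(g_1(x_1, ..., x_k), ..., g_m(x_1, ..., x_k))."""
    def h(*xs):
        return f(*(g(*xs) for g in gs))
    return h
```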

Define primitive recursion:

Given functions

  • $ f: \mathbb{N}^k \rightarrow \mathbb{N} $
  • $ g: \mathbb{N}^{k+2} \rightarrow \mathbb{N} $

Construct the following (piecewise) function:

$$ h : \mathbb{N}^{k+1} \rightarrow \mathbb{N} : (\bar{x_k}, y) \mapsto \begin{cases} f(\bar{x_k}), & y = 0 \\ g\big(\bar{x_k},\, y - 1,\, h(\bar{x_k}, y - 1)\big), & y > 0 \end{cases} $$
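Assuming the earlier sketches, this scheme can be written as a small Python combinator (the name `primitive_recursion` is mine); the loop simply unfolds the equations $h(\bar{x_k}, 0) = f(\bar{x_k})$ and $h(\bar{x_k}, y+1) = g(\bar{x_k}, y, h(\bar{x_k}, y))$:

```python
def primitive_recursion(f, g):
    """Given f : N^k -> N and g : N^(k+2) -> N, build h : N^(k+1) -> N with
    h(xs, 0) = f(xs) and h(xs, y + 1) = g(xs, y, h(xs, y))."""
    def h(*args):
        *xs, y = args
        acc = f(*xs)            # base case: last argument is 0
        for i in range(y):      # unfold the recursion from 0 up to y - 1
            acc = g(*xs, i, acc)
        return acc
    return h
```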


All functions that can be made using composition and primitive recursion on the basic functions are called primitive recursive. They are called that way by definition. While a link with functions that call themselves exists, there's no need to try to link the two notions to each other; you might consider "recursive" a homonym here.
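To make this concrete, here is a small worked example of my own (not part of the definitions above): addition is primitive recursive, via $add(x, 0) = p_1^1(x)$ and $add(x, y+1) = succ(p_3^3(x, y, add(x, y)))$. Using the Python sketches above:

```python
# Addition built only from the basic functions, composition and primitive recursion.
add = primitive_recursion(proj(1, 1), compose(succ, proj(3, 3)))
print(add(3, 4))  # 7
```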

The definitions and constructions above were put together by Gödel (a few other people were involved too) in an attempt to capture all functions that are computable, i.e. all functions for which a Turing machine exists. Note that at that point the concept of a Turing machine had not yet been described, or was at best still very vague.

(Un)fortunately, someone called Ackermann came along and defined the following function:

  • $Ack : \mathbb{N}^2 \rightarrow \mathbb{N}$
  • $Ack(0, y) = y+1$
  • $Ack(x+1, 0) = Ack(x, 1)$
  • $Ack(x+1, y+1) = Ack(x, Ack(x+1,y))$

This function is computable, but there's no way to construct it using only the constructions above (i.e. $Ack$ is not primitive recursive)! This means that Gödel and his posse had failed to capture all computable functions in their construction!
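Here is a direct Python transcription of the three equations, just to see the growth (a small illustrative sketch):

```python
import sys
sys.setrecursionlimit(100_000)  # Ack recurses very deeply even for tiny inputs

def ack(x, y):
    """Ackermann function, following the three equations above."""
    if x == 0:
        return y + 1
    if y == 0:
        return ack(x - 1, 1)
    return ack(x - 1, ack(x, y - 1))

print(ack(2, 3))  # 9
print(ack(3, 3))  # 61
```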

Gödel had to expand his class of functions so $Ack$ could be constructed. He did this by defining the following:

Unbounded minimisation

Given a function

  • $f : \mathbb{N}^{k+1} \rightarrow \mathbb{N}$

construct the following (partial) function:

$$ g : \mathbb{N}^k \rightarrow \mathbb{N} : \bar{x_k} \mapsto \begin{cases} y, & \text{if } f(\bar{x_k}, y) = 0 \text{ and, for all } z < y,\ f(\bar{x_k}, z) \text{ is defined and } \neq 0 \\ \text{undefined}, & \text{if no such } y \text{ exists} \end{cases} $$

This may be hard to grasp, but it basically means that $g(\bar{x_k})$ is the smallest root of $y \mapsto f(\bar{x_k}, y)$ (if a root exists).
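A rough Python sketch of unbounded minimisation (the name `minimise` is mine); the unbounded `while` loop is exactly where partiality enters: if $f$ has no root, the call never returns:

```python
def minimise(f):
    """Given f : N^(k+1) -> N, build g : N^k -> N returning the least y with
    f(xs, y) == 0; the search never terminates if no such y exists."""
    def g(*xs):
        y = 0
        while f(*xs, y) != 0:   # keep searching for a root
            y += 1
        return y
    return g

# Example: the successor function has no roots, so this call would loop forever:
# minimise(lambda y: y + 1)()
```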


All functions that can be constructed with the constructions defined above are called recursive. Again, the name recursive is just by definition, and it is not necessarily related to functions that call themselves. Truly, consider it a homonym.

Recursive functions can be either partial recursive functions or total recursive functions. All total recursive functions are partial recursive functions, and all primitive recursive functions are total. As an example of a partial recursive function that is not total, consider the minimisation of the successor function: the successor function has no roots, so its minimisation is not defined anywhere. An example of a total recursive function that is not primitive recursive (and hence needs minimisation in its construction) is $Ack$.

Now Gödel was able to construct the $Ack$ function as well with his expanded class of functions. As a matter of fact, every function that can be computed by a Turing machine can be represented using the constructions above, and vice versa: every function built with these constructions can be computed by a Turing machine.

If you're intrigued, you could try to make Gödel's class bigger still. You could try to define the 'opposite' of unbounded minimisation, that is, unbounded maximisation, i.e. the function that finds the biggest root. However, you'll find that computing that function is hard (in fact, impossible). You can read up on the Busy Beaver problem, which can be seen as an attempt at unbounded maximisation.

Auberon

Robert Soare wrote an essay (Wayback Machine) in 1995 about this issue. According to him, the term (general) recursive functions was coined by Gödel, who defined them using some sort of mutual recursion. The name stuck, though later on other equivalent definitions were found.

For more information, I recommend Soare's essay.

Yuval Filmus