
Has there been any attempt at a general theory describing how an algorithm can be "deformed" into one that solves the same problem more efficiently?

For example, suppose we have an algorithm (say, for sorting a list of numbers) which solves the problem in $O(n^2)$ time. Can we deform this algorithm (in the "space of algorithms") into an algorithm which solves the problem in $O(n \log n)$ time?

My motivation for asking this question comes from analysis, where, if we want to solve the equation $f(x) = 0$, one technique is to first guess an $x_0$ such that $f(x_0)$ is small, then search in the neighborhood of that $x_0$ to find $x$.
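To make this concrete, here is a minimal sketch of that idea in Python, using Newton's method as the local search. It assumes $f$ is differentiable and the initial guess is close enough to a root; the function names and tolerance are just illustrative:

```python
def newton(f, df, x0, tol=1e-12, max_iter=100):
    """Refine an initial guess x0 toward a root of f by local search,
    assuming f is differentiable and x0 is close enough to a root."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:  # f(x) is small enough: accept x as the root
            return x
        x = x - fx / df(x)  # move to where the local linearization vanishes
    return x

# Example: sqrt(2) as the positive root of x^2 - 2, starting from the guess 1.5
print(newton(lambda x: x * x - 2, lambda x: 2 * x, 1.5))
```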

Of course, I'm guessing the answer is no, but I'm sure a question similar to the above must have been posed somewhere in the literature.


1 Answer


There is no general way to do this. The "space of algorithms" is not a nice one: it lacks a natural metric or other useful structure, unlike, e.g., the real numbers. Note that even in the case of trying to solve $f(x)=0$, where your search space is $\mathbb{R}$, most algorithms work only under several assumptions on $f$, e.g. continuity (there is no algorithm which can solve/approximate $f(x)=0$ for an arbitrary $f:\mathbb{R}\rightarrow\mathbb{R}$).
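To illustrate how heavily such algorithms lean on these assumptions, here is a minimal bisection sketch in Python: it is only correct when $f$ is continuous and changes sign on the interval, since that is exactly what the intermediate value theorem needs (the tolerance and the example function are illustrative):

```python
import math

def bisect(f, a, b, tol=1e-12):
    """Find a root of f in [a, b], assuming f is continuous and
    f(a), f(b) have opposite signs (so the IVT guarantees a root)."""
    fa, fb = f(a), f(b)
    if fa * fb > 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    while b - a > tol:
        m = (a + b) / 2
        fm = f(m)
        if fa * fm <= 0:   # the root lies in [a, m]
            b, fb = m, fm
        else:              # the root lies in [m, b]
            a, fa = m, fm
    return (a + b) / 2

# Example: root of cos(x) - x on [0, 1] (approximately 0.7390851)
print(bisect(lambda x: math.cos(x) - x, 0.0, 1.0))
```

Drop the continuity assumption and the sign test tells you nothing about where (or whether) a root lies, which is the point above.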

See the answers here, and also here for simple impossibility results regarding a general approach for optimizing the running time of an algorithm.
