I am wondering: is there a method for automatic runtime analysis that works at least on a relevant subset of algorithms (those that can be analyzed)?

I googled "Automatic algorithm analysis" which gave me this but it is too mathy. I just want a simple example in psuedocode that I can understand. Might be too specific, but I thought it was worth a shot.

Nathvi

4 Answers


No algorithm can decide whether a given algorithm ever halts or not, so in particular no algorithm can tightly analyze the complexity of a given algorithm: a tight running-time bound is finite exactly when the algorithm halts, so computing it would decide the halting problem.
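
To see the reduction concretely, here is a minimal sketch (the analyzer is hypothetical; no such function can exist): if we had a perfect analyzer that returns the exact running time of a program on an input, or infinity when it does not halt, we could immediately decide the halting problem.

```python
# Hypothetical: a perfect analyzer that returns the exact number of steps
# `program` takes on `inp`, or float("inf") if it never halts.
# No such function can exist for all programs; that is the point.
def tight_running_time(program, inp):
    raise NotImplementedError("cannot exist in general")

# If it did exist, this would decide the halting problem, a contradiction.
def halts(program, inp):
    return tight_running_time(program, inp) != float("inf")
```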

Yuval Filmus

The COSTA tool does just this, although it fails in many cases, as you can imagine, due to computability problems. There are many papers about this; Cost Analysis of Java Bytecode by E. Albert, P. Arenas, S. Genaim, G. Puebla, D. Zanardini is a good starting point.

The approach taken is to infer a run-time recurrence from the Java bytecode, then convert it to a closed form. The tool also computes space usage bounds.
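
To make the recurrence-to-closed-form idea concrete, here is a toy sketch (hand-written Python, not COSTA, which works on Java bytecode and infers everything automatically): for binary search an analyzer would infer the cost recurrence $T(n) = T(n/2) + 1$, $T(1) = 1$, whose closed form is $\lfloor\log_2 n\rfloor + 1$; the code below only checks that bound empirically.

```python
import math

def binary_search(xs, target):
    """Returns (found, probes); probes is the cost measure we want to bound."""
    lo, hi, probes = 0, len(xs) - 1, 0
    while lo <= hi:
        mid = (lo + hi) // 2
        probes += 1
        if xs[mid] == target:
            return True, probes
        if xs[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return False, probes

# Worst case (missing key): the number of probes stays within the closed
# form floor(log2(n)) + 1 derived from T(n) = T(n/2) + 1, T(1) = 1.
for n in [1, 2, 10, 1000, 10**6]:
    _, cost = binary_search(list(range(n)), -1)
    bound = math.floor(math.log2(n)) + 1
    assert cost <= bound, (n, cost, bound)
    print(n, cost, bound)
```

A real cost analyzer does the hard part that is skipped here: extracting the recurrence from the code itself and solving it symbolically.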

Dave Clarke

I know one approach to (semi-)automated average case analysis, namely MaLiJAn¹. It closely resembles the kind of analysis Knuth uses in TAoCP. The core idea is to

  • model the program (flow) as a Markov chain,
  • train its transition probabilities for some fixed input sizes $n$ by counting over a set of program runs (which yields maximum likelihood estimators),
  • extrapolate to probability functions in $n$ and
  • use computer algebra to derive the average cost (w.r.t. these functions).

Note that only additive cost measures (e.g. comparisons, "time") work, and only the expected value is accurate (assuming perfect probability functions); higher moments cannot be derived.
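
Here is a minimal sketch of the first two steps plus the expected-cost computation, for a single made-up set of traces; it is not MaLiJAn and omits the extrapolation in $n$ and the computer-algebra machinery entirely:

```python
from collections import Counter

# Hypothetical traces of basic-block labels, e.g. recorded from instrumented runs.
traces = [
    ["entry", "loop", "loop", "loop", "exit"],
    ["entry", "loop", "exit"],
    ["entry", "loop", "loop", "exit"],
]

# Additive cost of executing each block (e.g. comparisons performed in it).
block_cost = {"entry": 1, "loop": 2, "exit": 0}

# Train transition probabilities: maximum likelihood = normalised counts.
pair_counts, state_counts = Counter(), Counter()
for trace in traces:
    for a, b in zip(trace, trace[1:]):
        pair_counts[(a, b)] += 1
        state_counts[a] += 1
prob = {(a, b): c / state_counts[a] for (a, b), c in pair_counts.items()}

# Expected cost until "exit" from each block,
#   E[s] = cost(s) + sum_t p(s, t) * E[t],
# solved by fixed-point iteration (good enough for this tiny chain).
states = list(state_counts) + ["exit"]
expected = {s: 0.0 for s in states}
for _ in range(1000):
    for s in states:
        expected[s] = block_cost[s] + sum(
            prob.get((s, t), 0.0) * expected[t] for t in states
        )

print(prob)               # {('entry', 'loop'): 1.0, ('loop', 'loop'): 0.5, ('loop', 'exit'): 0.5}
print(expected["entry"])  # 5.0, matching the average cost of the three traces
```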

All steps but the extrapolation are rigorous [2], and the method has been demonstrated to reproduce well-known results with high precision, given suitable random sample inputs, of course. While there is no proof or even an approximation guarantee for the results (the extrapolation step is, so far, purely heuristic), the results obtained with the tool serve well to experiment with hard-to-analyse algorithms and to formulate hypotheses [3,4].


  1. Full disclosure: I was a member of this research group and was involved in the development of the tool.
  2. Maximum Likelihood Analysis of Algorithms and Data Structures by U. Laube and M. Nebel (2010) [preprint]
  3. Engineering Java 7's Dual Pivot Quicksort Using MaLiJAn by S. Wild et al. (2012) [preprint]
  4. Maximum Likelihood Analysis of the Ford–Fulkerson Method on Special Graphs by U. Laube and M. Nebel (2015) [preprint]
Raphael

Of course, as noted by Yuval Filmus, one should not expect a general solution to such problems. But as is usually the case, solutions can be found for interesting subsets of the general case.

I am in no way an expert, or even significantly knowledgeable, in this area, but I happen to know of some work of this kind. It concerns automatic average-case complexity analysis, and the work was done by Philippe Flajolet and his colleagues.

From what I understood when it was explained to me, the authors designed a small language (nothing Turing-complete, as you might expect, but significant enough) so that any algorithm written within the constraints of that language could have its average complexity analyzed automatically. The system was called at the time Lambda-Upsilon-Omega, i.e. $\lambda\acute\upsilon\omega$ (I unbind).
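
To give a flavour of the kind of output such a system aims for (this toy example is mine, not taken from the $\lambda\acute\upsilon\omega$ papers): for sequential search in an array of $n$ distinct elements, with the target equally likely to be at each position, the average number of comparisons is

$$\bar{C}(n) = \frac{1}{n}\sum_{i=1}^{n} i = \frac{n+1}{2},$$

and producing such closed forms automatically, by translating a program in the restricted language into recurrences or generating functions and handing those to computer algebra, is the kind of task such a system automates.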

One paper I found on the web is a 1990 paper: Automatic average-case analysis of algorithms by Philippe Flajolet, Paul Zimmermann, and Bruno Salvy.

I would expect that later papers have extended this work, but I do not really know. The work was quite heavily cited, and searching the web for it should yield more recent work on the same topic.

Now, I am afraid that the work of Flajolet and his colleagues was very mathematical, and I would not expect much easy reading.

babou