
Suppose we have a set of equations like this:

p7=f(p1+p6); p6=f(p2+p5); p5=f(p3+p4); p4=f(p3); p3=f(p2); p2=f(p1); p1=f()

It can be represented by the computational graph below.

If each intermediate value takes one unit of memory, you need at least 4 units to compute p7 without duplicating any computation.
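
For a graph this small you can check the claim by brute force. The Python sketch below is only an illustration (not from the question): it enumerates all topological orders and tracks how many values are live at once, assuming a value is freed right after its last consumer is computed and that a node's result may reuse a slot freed in the same step. Under that model the graph above needs exactly 4 units.

```python
from itertools import permutations

def peak_memory(order, deps):
    """Peak number of simultaneously stored values for one evaluation order.

    Assumes each value occupies one unit, a value is freed right after its
    last consumer has been computed, and a node's result may reuse a slot
    freed in the same step (inputs used for the last time are released
    before the output is counted).
    """
    pos = {v: i for i, v in enumerate(order)}
    # Last step at which each value is still needed (its own step if unused).
    last_use = {v: pos[v] for v in order}
    for v in order:
        for d in deps[v]:
            last_use[d] = max(last_use[d], pos[v])
    live, peak = set(), 0
    for i, v in enumerate(order):
        live -= {d for d in deps[v] if last_use[d] == i}  # free dead inputs
        live.add(v)                                       # store the new value
        peak = max(peak, len(live))
    return peak

def min_peak_memory(deps):
    """Minimum peak memory over all topological orders (brute force, tiny DAGs only)."""
    best = None
    for order in permutations(deps):
        pos = {v: i for i, v in enumerate(order)}
        if all(pos[d] < pos[v] for v in deps for d in deps[v]):
            cost = peak_memory(order, deps)
            if best is None or cost < best:
                best = cost
    return best

# The DAG from the question: deps[v] lists the arguments of v.
deps = {
    "p1": [], "p2": ["p1"], "p3": ["p2"], "p4": ["p3"],
    "p5": ["p3", "p4"], "p6": ["p2", "p5"], "p7": ["p1", "p6"],
}
print(min_peak_memory(deps))  # prints 4
```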

Is there an algorithm for estimating the memory needed in this setting for a general DAG?

I found a paper called "Adjoint Dataflow Analysis" that estimates this for a restricted class of graphs, but it feels like this problem ought to be covered more generally in graph theory.

Yaroslav Bulatov

1 Answer


Your problem sounds similar to one-shot (black) pebbling, where placing a pebble on a node corresponds to storing its value and the cost of a pebbling is the maximum number of pebbles on the graph at any time. Wu, Austrin, Pitassi, and Liu, in their paper "Inapproximability of Treewidth, One-Shot Pebbling, and Related Layout Problems" (J. Artificial Intelligence Res. 49 (2014), 569–600), show that it is (probably) hard to compute the optimal cost (which corresponds to your memory requirement).

Yuval Filmus