
I am exploring how a Dynamic Programming design approach relates to the underlying combinatorial properties of problems.

For this, I am looking at the canonical instance of the coin exchange problem: Let S = [d_1, d_2, ..., d_m] and n > 0 be a requested amount. In how many ways can we add up to n using nothing but the elements in S?

If we follow a Dynamic Programming approach to design an algorithm for this problem, aiming for a solution whose running time is polynomial in m and n (i.e., pseudo-polynomial in the input size), we would start by looking at how the problem relates to smaller and simpler sub-problems. This yields a recurrence relation expressing the problem in terms of the solutions to its sub-problems. We can then implement this recurrence efficiently with either memoization (top-down) or tabulation (bottom-up).

A recurrence relation could be the following (Python 3.6 syntax and 0-based indexing):

def C(S, m, n):
    # Overshot the amount: no valid way to complete this choice of coins.
    if n < 0:
        return 0
    # Reached the amount exactly: exactly one valid way (use no further coins).
    if n == 0:
        return 1
    # Amount still positive but no coins left to use.
    if m <= 0:
        return 0
    # Ways that never use the highest remaining coin S[m - 1].
    count_wout_high_coin = C(S, m - 1, n)
    # Ways that use S[m - 1] at least once more.
    count_with_high_coin = C(S, m, n - S[m - 1])
    return count_wout_high_coin + count_with_high_coin
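
As an illustration of the memoization technique mentioned above, the recursion could be wrapped as follows (a minimal sketch; the wrapper name count_memoized is hypothetical):

from functools import lru_cache

def count_memoized(S, n):
    coins = tuple(S)  # tuples are hashable, so the cached helper can take m and the amount only

    @lru_cache(maxsize=None)
    def C(m, amount):
        if amount < 0:
            return 0
        if amount == 0:
            return 1
        if m <= 0:
            return 0
        # Skip the highest remaining coin, or use it (possibly again).
        return C(m - 1, amount) + C(m, amount - coins[m - 1])

    return C(len(coins), n)

# count_memoized([1, 2, 6], 6) == 5

Each (m, amount) pair is solved at most once, so there are O(m * n) sub-problems in total.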

However, when drawing the sub-problem DAG, one can see that any DP-based algorithm implementing this recurrence counts the solutions correctly only if order is disregarded.

For example, for S = [1, 2, 6] and n = 6, one can identify the following ways (assuming order matters):

  1. 1 + 1 + 1 + 1 + 1 + 1
  2. 2 + 1 + 1 + 1 + 1
  3. 1 + 2 + 1 + 1 + 1
  4. 1 + 1 + 2 + 1 + 1
  5. 1 + 1 + 1 + 2 + 1
  6. 1 + 1 + 1 + 1 + 2
  7. 2 + 2 + 1 + 1
  8. 1 + 2 + 2 + 1
  9. 1 + 1 + 2 + 2
  10. 2 + 1 + 2 + 1
  11. 1 + 2 + 1 + 2
  12. 2 + 1 + 1 + 2
  13. 2 + 2 + 2
  14. 6

Assuming order does not matter, we can count the following solutions:

  1. 1 + 1 + 1 + 1 + 1 + 1
  2. 2 + 1 + 1 + 1 + 1
  3. 2 + 2 + 1 + 1
  4. 2 + 2 + 2
  5. 6
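
Both counts can be double-checked by brute force; the enumerate_ways helper below is hypothetical and exponential, intended only to reproduce the two lists above for small inputs:

def enumerate_ways(S, n, ordered):
    ways = set()

    def extend(remaining, partial):
        if remaining == 0:
            # Keep the sequence as-is for ordered counting,
            # or as a sorted multiset when order does not matter.
            ways.add(tuple(partial) if ordered else tuple(sorted(partial)))
            return
        for d in S:
            if d <= remaining:
                extend(remaining - d, partial + [d])

    extend(n, [])
    return ways

# len(enumerate_ways([1, 2, 6], 6, ordered=True)) == 14
# len(enumerate_ways([1, 2, 6], 6, ordered=False)) == 5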

When approaching a solution from the Dynamic Programming standpoint, how can I control whether order matters? Specifically, how could I write the following two functions?

  • count_with_order()
  • count_wout_order()

Or does requiring order to matter imply choosing pruned backtracking over a Dynamic Programming approach?

1 Answer


There is absolutely no problem adapting dynamic programming to count solutions without regard to order (i.e., when order doesn't matter). Let $D(S,m,n)$ be the number of ways to obtain a change of $n$ using the first $m$ coins of $S = S_1,\ldots,S_M$. We have $D(S,m,0) = 1$, $D(S,m,n) = 0$ when $n < 0$, and otherwise
$$ D(S,m,n) = \sum_{i=1}^m D(S,i,n-S_i). $$
This recurrence forces the indices of the coins used to be non-increasing: after using $S_i$, we are only allowed to use $S_1,\ldots,S_i$. Counting non-increasing (or non-decreasing) solutions is the same as counting all solutions without regard to order.
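
A top-down sketch of this recurrence (the helper name count_unordered is hypothetical; the 1-based $S_i$ becomes coins[i - 1] in Python) might look like:

from functools import lru_cache

def count_unordered(S, n):
    coins = tuple(S)

    @lru_cache(maxsize=None)
    def D(m, amount):
        if amount == 0:
            return 1
        if amount < 0:
            return 0
        # The next coin has some index i <= m, and the remaining change
        # may only use coins S_1, ..., S_i (non-increasing indices).
        return sum(D(i, amount - coins[i - 1]) for i in range(1, m + 1))

    return D(len(coins), n)

# count_unordered([1, 2, 6], 6) == 5

A bottom-up table indexed by (m, amount) would work equally well.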

Yuval Filmus