
I am considering attacks on Two-Key Triple-DES Encryption assuming $2^{32}$ known plaintext/ciphertext pairs (that's a mere 32GiB of ciphertext) by the method devised by Paul C. van Oorschot and Michael J. Wiener: A Known-Plaintext Attack on Two-Key Triple Encryption (in proceedings of Eurocrypt 1990), or another published method not requiring significantly more DES computations.

As summary information for decision makers, I am looking for an independent estimate of how much time this is expected to require, assuming all the RAM ever built by mankind to that day (as of April 2012) was put to full use.

Note: I'm purposely not asking when the attack could become feasible using all the RAM ever built by mankind, because estimates of the amount of RAM mankind will build, and when, are less falsifiable.

Update: I am not considering cost; neither of RAM, power, nor logic (including DES engines), as long as the number of DES operations remains within $2^{90}$. I am willing to assume that the amount of RAM used, and its effective speed, are the only factors to account for in determining the expected duration of the attack. This is similar to the hypothesis made by the authors of the linked paper, that their attack is limited by the amount (or cost) of RAM used, with all other factors of secondary importance.

Update: sadly, nobody dared answer the question and the bounty period is over. Thus here is a first order answer to criticize.

fgrieu

1 Answer


The original article rightfully neglects the cost of DES computations (there are fewer than $2^{90}$) and of everything except memory accesses to its Table 1 and Table 2. I go one step further: since Table 1 is initialized only once and then read-only, it could be in ROM, so I neglect everything except the accesses to Table 2. The attack requires an expected $2^{88}$ random writes and as many random reads to Table 2, organized as $2^{25}$ words of 124 bits.
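As a quick sanity check that this organization of Table 2 fits the memory budget used in the rest of this answer, here is a minimal Python sketch (all figures restated from the text, nothing new):

```python
# Size of one instance of Table 2, as organized above.
words = 2**25          # entries in Table 2
bits_per_word = 124    # bits per entry
size_gbyte = words * bits_per_word / 8 / 2**30
print(f"one instance of Table 2: {size_gbyte:.2f} GByte")  # about 0.5 GByte
```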

The cheap PC that I bought today (as of May 2012) came with 4 GByte of DDR3 DRAM, as a single 64-bit-wide DIMM with 16 DRAM chips of $2^{28}\cdot 8$ bits each, costing about \$1 per chip in volume. Bigger chips exist: my brand-new 32-GByte server uses 64 chips of $2^{29}\cdot 8$ bits each, and these are becoming increasingly common (though their price per bit is still higher than for the mainstream $2^{28}\cdot 8$-bit chips).

Two mainstream $2^{28}\cdot 8$-bit chips hold one instance of Table 2, and one 124-bit word can be accessed as 8 consecutive 8-bit locations in each of the two chips simultaneously (consecutive accesses are roughly 15 times faster than random accesses). One $2^{29}\cdot 8$-bit chip would be slightly slower.

Assuming DDR3-1066 with 7-cycle latency (resp. DDR3-1333 with 9-cycle latency), 8 consecutive accesses require at least $(7\cdot 2+7)/1066\approx 0.020$ µs (resp. $(9\cdot 2+7)/1333\approx 0.019$ µs). This is a decimal order of magnitude less than considered in the original article. For each 0.5-GByte instance of Table 2, we can thus perform at most $365\cdot 86400\cdot 10^6/0.019/2\approx 2^{49.6}$ read+write accesses per year using mainstream DRAM. Thus with $n$ GByte of mainstream DRAM, and unless I err somewhere, the expected duration is $2^{37.4}/n$ years.
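The arithmetic above can be reproduced with a short back-of-the-envelope script (assuming the DDR3-1333 timing figures from the text; this is a sketch of my estimate, not a simulation of the attack):

```python
import math

# Throughput per 0.5-GByte instance of Table 2, DDR3-1333, 9-cycle latency.
seconds_per_year = 365 * 86400
access_time_us = (9 * 2 + 7) / 1333       # ~0.019 µs for 8 consecutive accesses
pairs_per_year = seconds_per_year * 1e6 / access_time_us / 2  # read+write pairs

def expected_years(n_gbyte):
    # 2^88 read+write pairs, spread over 2*n instances (0.5 GByte each)
    return 2**88 / (2 * n_gbyte * pairs_per_year)

print(f"log2(read+write pairs/year per instance) = {math.log2(pairs_per_year):.1f}")  # ~49.6
print(f"log2(expected years with 1 GByte) = {math.log2(expected_years(1)):.1f}")      # ~37.4
```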

Based on press releases of a serious reference, there are fewer than $2^{31}$ PCs around, and assuming that my cheap PC is representative, that's $2^{33}$ GByte. Another way to look at it: each 0.25-GByte chip costs about \$1, and DRAM revenue in 2011 was less than \$$2^{35}$, thus enough for $2^{33}$ GByte (though note that most of that revenue is from chips not optimized for cost per bit). I'll guesstimate that all the RAM ever built is equivalent to at most $2^{35}$ GByte of mainstream DRAM for the purpose of the attack.
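The two independent estimates conveniently agree; a trivial cross-check (again just restating the figures above):

```python
import math

# Estimate 1: PCs in use times RAM per PC.
pcs = 2**31              # upper bound on PCs around
gbyte_per_pc = 4         # taking my cheap PC as representative
print(math.log2(pcs * gbyte_per_pc))         # 33.0

# Estimate 2: 2011 DRAM revenue times capacity per dollar.
revenue_dollars = 2**35  # upper bound on 2011 DRAM revenue
gbyte_per_dollar = 0.25  # one $1 chip holds 0.25 GByte
print(math.log2(revenue_dollars * gbyte_per_dollar))  # 33.0
```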

Thus, at the end of the day, my answer is: the attack of the original article, updated to use all the RAM chips ever built by mankind to mid-2012 at the maximum of their potential, has an expected duration of at least 5 years; equivalently, it has at best 20% odds of succeeding within one year.
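Plugging the guesstimate into the duration formula above (the ~19% result rounds to the 20% figure):

```python
# Expected duration 2^37.4/n years, with n = 2^35 GByte of mainstream DRAM.
n = 2**35
duration_years = 2**37.4 / n
print(f"expected duration ≈ {duration_years:.1f} years")        # ≈ 5.3 years
print(f"odds of success within a year ≈ {1/duration_years:.0%}")  # ≈ 19%
```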

Update: as noted by the authors of the original article, "the execution time is not particularly sensitive to the number of plaintext/ciphertext pairs $n$ (provided that $n$ is not too small) because as $n$ increases, the number of operations required for the attack ($2^{120-\log_2 n}$) decreases, but memory requirements increase, and the number of machines that can be built with a fixed amount of money decreases". By the same argument, our required amount of RAM is not much changed if we get more known plaintext/ciphertext pairs.
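The quoted trade-off is easy to tabulate (a sketch of the formula only; actual memory scaling is per the original paper):

```python
# Operation count 2^(120 - log2 n) versus number of known pairs n:
# doubling n halves the DES operations but doubles the memory needed.
for log2_n in (32, 40, 48):
    print(f"n = 2^{log2_n}: about 2^{120 - log2_n} DES operations")
```

With $n=2^{32}$ pairs this gives $2^{88}$ operations, within the $2^{90}$ bound assumed in the question.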

fgrieu