24

I was just reading the Stick Figure Guide to AES and came across an interesting table explaining how the winner was chosen:

[table from the Stick Figure Guide to AES, scoring the five finalists on criteria such as general security, performance, and implementation difficulty]

Unfortunately the NIST site is down so I can't gain further information about the approval process so I was hoping someone here would know in more detail.

  • Who or what decided on the numbers in this table which ranks each algorithm? I.e. can the exact analysis process be described?
  • Who created that process? People in NIST or an equally divided group of government, industry and public cryptographers?
  • What is included in "Design Features"? I.e. What were those features that were important?
  • "Performance" analysis could have been done with benchmarks but "Implementation Difficulty" sounds subjective. How was that quantified?
  • Could the numbers in that table have been tweaked to skew the results in favor of a particular algorithm?

Those questions aside, and assuming the numbers above were arrived at through a fair and equitable process without bias or hidden agendas, it seems to me this ranking system is still missing a weighting criterion. All categories appear to carry the same importance; that is, "Smart Card Performance" counts as much as "General Security". That seems incorrect. I would argue that security is of the utmost importance, so it should be weighted more heavily than the other, secondary criteria. A good quote here is:

Security at the expense of usability comes at the expense of security.

I wondered what would happen if I applied a high weighting factor to security and left the other points as they are. For example:

[the same table, with the security criterion given a higher weighting]

Now Serpent is first equal with Rijndael with Twofish coming in a close second. Interesting.
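The re-weighting idea above is easy to sketch in code. The scores below are hypothetical placeholders on a 1–3 scale (the actual table isn't reproduced here), so only the mechanism is meaningful, not the particular totals:

```python
# Hypothetical 1-3 scores for the five AES finalists -- placeholder
# values for illustration, NOT the actual numbers from the table.
SCORES = {
    "Rijndael": {"general_security": 2, "software_perf": 3, "hardware_perf": 3,
                 "smartcard_perf": 3, "implementation": 3},
    "Serpent":  {"general_security": 3, "software_perf": 1, "hardware_perf": 3,
                 "smartcard_perf": 3, "implementation": 3},
    "Twofish":  {"general_security": 3, "software_perf": 2, "hardware_perf": 2,
                 "smartcard_perf": 2, "implementation": 2},
    "MARS":     {"general_security": 3, "software_perf": 2, "hardware_perf": 1,
                 "smartcard_perf": 1, "implementation": 1},
    "RC6":      {"general_security": 2, "software_perf": 2, "hardware_perf": 1,
                 "smartcard_perf": 1, "implementation": 2},
}

def rank(weights):
    """Sort candidates by weighted total, highest first.

    Any criterion missing from `weights` defaults to weight 1;
    a weight of 0 drops that criterion entirely.
    """
    totals = {name: sum(weights.get(crit, 1) * score
                        for crit, score in per_crit.items())
              for name, per_crit in SCORES.items()}
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

print(rank({}))                        # equal weighting
print(rank({"general_security": 3}))   # security counts triple
```

Setting a weight to 0 (e.g. `rank({"hardware_perf": 0, "smartcard_perf": 0})`) also covers the "software product" scenario below, where hardware criteria are ruled out completely.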

What if I am developing a software product? I don't care about hardware performance or smart card performance, so I can rule those two criteria out completely in my decision. The table might look like this:

[the table with the hardware and smart card performance criteria removed]

Now Twofish is the winner, and Rijndael ties for second with Serpent and MARS. MARS might even be more attractive with its variable key size of up to 448 bits.

My overall point is that, by these rating criteria, Rijndael, Serpent, Twofish and MARS all appear to be in the same ballpark as block ciphers. There might be a more rigorous mathematical way to apply weighting factors. If I were revamping the security of a project and concerned about NSA involvement in weakening encryption standards, I could re-weight the criteria and priorities to suit my project's specific goals, and I might decide on a different algorithm from Rijndael. Back in 2000, Rijndael might have suited the US government's purposes and planned surveillance agendas, but not my project's. I would compare it to the way TrueCrypt gives you the option of choosing between three different algorithms. Would that be a reasonable call?

Patriot
J_M

2 Answers

29

At the time of the competition (I can talk about it; I was there), there was a lot of discussion and various people presented arguments. However, there was never an official, publicly known "board of scores" with totals and definite rules, as the pictures you show seem to suggest. It is possible that the NIST people made something similar internally, but they certainly did not publish it. From the outside, the choice was made by NIST in "some way", and they then provided qualitative reasons, not really quantitative ones. NIST never bound itself to following strict rules; they wanted to remain in control of the whole proceedings.

The picture still conveys the main reasons why Rijndael was chosen:

  • Its performance is nowhere bad. It was not the highest performer on every platform, but there was no platform where it would be abysmally slow, in contrast to almost any other candidate.

  • The "implementation difficulty" relates to features which, on some platforms, imply quite some work. For instance, RC6 requires a multiplier, which is hard on ASIC / FPGA (which translates to: it uses a lot of silicon space). MARS is very complex (many kinds of transforms piled up together) and implementers have reported that it took them quite some time to come up with working implementations, let alone optimized implementations. Twofish uses key-dependent S-boxes, which need RAM (bad for smart cards, bad for ASIC/FPGA). Rijndael was rather simple to implement (in retrospect, Serpent was better for that, especially if you want to implement the algorithm without lookup tables).

During the round conferences, where cryptographers met to talk about the AES candidates, some informal surveys were conducted in which people could give "scores" to the candidates. While Rijndael did not necessarily elicit the best marks from everybody, nobody really hated it, so it looked fine as a future standard.


All of this, of course, depends on quite arbitrary assumptions on the usage context. NIST wanted an all-purpose block cipher, suitable to a large range of hardware platforms. If you target a specific system (e.g. for disk encryption on your PC), then this may point at another algorithm; for instance, RC6 is faster than Rijndael on a PC (except if the said PC offers the AES-NI instructions, of course).

The really good thing about the AES competition is not that it came up with a good, strong algorithm; what matters is that almost all candidates turned out to be good and strong. Among the 15 candidates, only two were "broken", and then only in an academic way. The trust we can have in AES comes from that: the AES competition proved that we apparently knew how to design algorithms that nobody else knows how to break (that's about as good as you can get with symmetric cryptography).

Thomas Pornin
1

I also took part in this process, in that I helped to optimize one of the original 15 contenders (CERN's DFC). I think Thomas Pornin is exactly right when he writes: "The really good thing about the AES competition is not that it came up with a good, strong algorithm; what matters is that almost all candidates turned out to be good and strong. Among the 15 candidates, only two were "broken", and then only in an academic way. The trust we can have in AES comes from that: the AES competition proved that we apparently knew how to design algorithms that nobody else knows how to break (that's about as good as you can get with symmetric cryptography)." In hindsight, I'm quite pleased that I figured out a way to implement DFC with no possible timing attacks: such attacks were only theoretical at the time, but I still found a method that was less than 10% slower than the fastest (but timing-vulnerable) implementation.