
This question is related to this (but it is not the same).

Let's suppose I have a seed with 1024 bits of entropy and hash it with a counter, using a hash function whose digest is one quarter of the seed size in bits, such as BLAKE2s (256-bit digest size).

I hash the seed with counters and XOR the result to plaintext.

As said in this answer, some options are (the third one I propose myself):

  1. H(00∥F) ∥ H(01∥F) ∥ H(02∥F) ∥ H(03∥F)...
  2. H(H(F)∥00) ∥ H(H(F)∥01) ∥ H(H(F)∥02) ∥ H(H(F)∥03)...
  3. H(00∥H(F)) ∥ H(01∥H(F)) ∥ H(02∥H(F)) ∥ H(03∥H(F))...

Here H is the hash, F is the file (the seed), and 00, 01, 02, 03 are the counters.
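For concreteness, the three schemes can be sketched in Python with `hashlib.blake2s`. The 8-byte big-endian counter encoding is my own illustrative choice (any fixed-width encoding works), and this is a sketch, not a vetted cipher:

```python
import hashlib

def ctr(i: int) -> bytes:
    # Counter encoding: 8-byte big-endian (illustrative assumption)
    return i.to_bytes(8, "big")

def scheme1(F: bytes, n: int) -> bytes:
    # Scheme 1: H(counter || F) for counter = 0, 1, 2, ...
    return b"".join(hashlib.blake2s(ctr(i) + F).digest() for i in range(n))

def scheme2(F: bytes, n: int) -> bytes:
    # Scheme 2: H(H(F) || counter)
    h = hashlib.blake2s(F).digest()
    return b"".join(hashlib.blake2s(h + ctr(i)).digest() for i in range(n))

def scheme3(F: bytes, n: int) -> bytes:
    # Scheme 3: H(counter || H(F))
    h = hashlib.blake2s(F).digest()
    return b"".join(hashlib.blake2s(ctr(i) + h).digest() for i in range(n))

def xor_encrypt(plaintext: bytes, F: bytes, keystream=scheme1) -> bytes:
    # XOR the plaintext against as many 32-byte digest blocks as needed
    n = -(-len(plaintext) // 32)  # BLAKE2s digest is 32 bytes
    ks = keystream(F, n)
    return bytes(p ^ k for p, k in zip(plaintext, ks))
```

Since XOR is its own inverse, calling `xor_encrypt` twice with the same seed recovers the plaintext.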

PS: The case H(F∥00) ∥ H(F∥01) ∥ H(F∥02) ∥ H(F∥03)... was answered here.

Assuming that the hash function is not vulnerable to length-extension attacks, and not caring about optimizations, which of these is the most secure scheme in practice? And why?

phantomcraft
  • 887
  • 6
  • 14

1 Answer


I'm unsure of your exact definition of practical, but the first scheme is more secure than the other two. From an entropy point of view, the second and third schemes throw away roughly three-quarters of the key entropy. Put another way, they have very large families of equivalent keys: every pair of values $F$ and $F'$ with $H(F)=H(F')$ produces the same stream under schemes 2 and 3.
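This collapse is easy to see directly: under scheme 2 (and likewise scheme 3), the keystream depends on the seed only through the 256-bit digest $H(F)$. A small Python check (the 8-byte counter encoding is an illustrative assumption):

```python
import hashlib

def scheme2(F: bytes, n: int) -> bytes:
    # Scheme 2: the 1024-bit seed F enters only via its 256-bit digest
    h = hashlib.blake2s(F).digest()
    return b"".join(hashlib.blake2s(h + i.to_bytes(8, "big")).digest()
                    for i in range(n))

def scheme2_from_digest(digest: bytes, n: int) -> bytes:
    # The same keystream, computed without ever seeing the seed itself
    return b"".join(hashlib.blake2s(digest + i.to_bytes(8, "big")).digest()
                    for i in range(n))

seed = bytes(128)  # a 1024-bit seed
assert scheme2(seed, 4) == scheme2_from_digest(hashlib.blake2s(seed).digest(), 4)
```

Consequently, any $F'$ that collides with $F$ under $H$ yields the identical keystream, and an attacker only ever has to target the 256-bit digest, not the 1024-bit seed.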

In particular, if $2^{128}$ instances of the cipher are used, there are likely to be two instances with the same key stream. This may not count as practical, but is very undesirable.

On a slightly related picky note, you should require that the seed have a large min-entropy and not just a large Shannon entropy. It is possible to construct sources of random numbers that have over, say, 1024 bits of Shannon entropy but which output one particular number, say, half of the time.
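As a worked example of such a source (the parameters are my own, chosen to clear the 1024-bit threshold): a source that emits one fixed value with probability 1/2 and otherwise a uniform value from $2^{2047}$ alternatives has Shannon entropy 1024.5 bits but min-entropy of just 1 bit.

```python
import math

# Toy source: one fixed output with probability 1/2; otherwise uniform
# over 2**k other outputs. k = 2047 pushes Shannon entropy past 1024 bits.
k = 2047

# Shannon entropy -sum(p * log2(p)), evaluated symbolically to avoid
# floating-point underflow of 2**-2048:
#   0.5 * log2(2)  +  2**k * (0.5 / 2**k) * log2(2**(k + 1))
shannon_bits = 0.5 * 1 + 0.5 * (1 + k)  # = 1024.5 bits

# Min-entropy is -log2 of the most probable outcome:
min_entropy_bits = -math.log2(0.5)      # = 1.0 bit
```

Guessing this source's output succeeds half the time despite the impressive Shannon figure, which is why min-entropy is the right measure for cryptographic seeds.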

Daniel S
  • 29,316
  • 1
  • 33
  • 73