That seems like it would be a useful primitive. Using it in place of a block cipher in the Merkle–Damgård construction would give a variable-length hash, maybe without the tricky business with related keys? Using it to hash a key+nonce+position would give you a stream cipher. Though maybe the collision resistance we need from hashes is wasteful for a cipher?

Relatedly, why is it that block ciphers seem to be the low-level building block of choice for hashes and stream ciphers? Especially when the decryption operation doesn't seem to get used very often.

Jack O'Connor

2 Answers

Such a category of functions is not generally used as is, but compression functions, which are close to what you describe, are (as you note) used to build variable-length hash functions. For example, Merkle–Damgård hashes like SHA-2 have a compression function that takes an IV (or the previous block's output) and a fixed-size data block and produces a smaller fixed-size output.
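The Merkle–Damgård iteration can be sketched in a few lines. This is an illustration only: `hashlib` does not expose SHA-256's internal compression function, so a full SHA-256 call over `state || block` stands in for it, and the padding here is simplified compared to the real, suffix-free MD padding.

```python
import hashlib

BLOCK_SIZE = 64  # bytes per message block (matches SHA-256's block size)
IV = b"\x00" * 32  # illustrative fixed IV, not SHA-256's real initial value

def compress(state: bytes, block: bytes) -> bytes:
    # Stand-in compression function: maps (state, block) to a new
    # fixed-size state.  A real design would use a dedicated function.
    return hashlib.sha256(state + block).digest()

def md_hash(message: bytes) -> bytes:
    # Simplified padding for illustration: append the message length,
    # then zero-pad to a block boundary (real MD padding is more careful).
    padded = message + len(message).to_bytes(8, "big")
    padded += b"\x00" * (-len(padded) % BLOCK_SIZE)
    state = IV
    for i in range(0, len(padded), BLOCK_SIZE):
        state = compress(state, padded[i:i + BLOCK_SIZE])
    return state
```

The chaining is the whole trick: each block is absorbed into a fixed-size state, which is why a fixed-input-size compression function suffices to hash variable-length input.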

If you want to use a compression function as a cipher, you can. If you use the key as the data-block input, you get a block cipher, which in the case of SHA-1 and SHA-256 is called SHACAL-1 and SHACAL-2, respectively. You can then construct a stream cipher from these using CTR mode.
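The questioner's "hash key+nonce+position" idea is essentially CTR mode with a hash as the keystream generator. A minimal sketch, using SHA-256 as a stand-in (this illustrates the CTR construction; it is not SHACAL, and a vetted mode like AES-CTR should be used in practice):

```python
import hashlib

def ctr_xor(key: bytes, nonce: bytes, data: bytes) -> bytes:
    # Keystream block at position i is SHA-256(key || nonce || i);
    # XOR it with the data.  Decryption is the same operation.
    out = bytearray()
    for i in range(0, len(data), 32):
        counter = (i // 32).to_bytes(8, "big")
        keystream = hashlib.sha256(key + nonce + counter).digest()
        out.extend(b ^ k for b, k in zip(data[i:i + 32], keystream))
    return bytes(out)

key, nonce = b"k" * 32, b"n" * 16  # illustrative values only
ciphertext = ctr_xor(key, nonce, b"attack at dawn")
assert ctr_xor(key, nonce, ciphertext) == b"attack at dawn"
```

Note that, as with any CTR-style construction, the (nonce, counter) pair must never repeat under the same key.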

However, Merkle–Damgård does not require a block cipher, and Damgård suggested several number-theoretic problems for the compression function in "A Design Principle for Hash Functions".

Relatedly, why is it that block ciphers seem to be the low-level building block of choice for hashes and stream ciphers?

Many hashes and stream ciphers are built from something else, like Salsa20 or SHA-3.

However, it is true that block ciphers have been used a lot: partly because everyone was looking for ways to use DES (and later AES), which were standardized by NIST, and partly because the design of block ciphers is well understood.

otus

Hash functions generally (always?) take a variable-size input, which is why they are also called compression functions. Hash functions are also generally collision resistant, or designed with that aim for cryptographic applications. What you are describing is a particular category of one-way function (OWF). There is also a subcategory, random permutations, where the input and output sets have the same size (necessary but not sufficient: the sufficient condition is a one-to-one mapping).

In this function category, collision resistance is not always necessary; for instance, the Lamport signature scheme only requires a one-way function with second-preimage resistance.
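Lamport signatures make the point concretely: security rests only on the one-wayness of the underlying function, with no collision resistance assumed on the keys. A minimal one-time sketch, using SHA-256 as the one-way function:

```python
import hashlib
import secrets

def H(m: bytes) -> bytes:
    return hashlib.sha256(m).digest()

N = 256  # one preimage pair per bit of the 256-bit message digest

def keygen():
    # Secret key: two random preimages per message bit.
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(N)]
    # Public key: their images under the one-way function.
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def bits(digest: bytes):
    return [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(N)]

def sign(sk, message: bytes):
    # Reveal one preimage per bit of H(message).
    return [sk[i][b] for i, b in enumerate(bits(H(message)))]

def verify(pk, message: bytes, sig) -> bool:
    return all(H(s) == pk[i][b]
               for (i, b), s in zip(enumerate(bits(H(message))), sig))
```

Each key pair must sign only one message: every signature reveals half of the secret preimages, and reuse leaks enough to forge.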

For a "post-quantum" design it seems that the ideal primitive would be a 256-bit to 256-bit function. Does anyone know of such a primitive?

Fraktal