6

Preface: This question was originally asked on Theoretical Computer Science, and the kind people there referred me to this web site. It is being repeated here in an attempt to find a satisfying answer.


Over the years, two novel encryption techniques have come to mind, and both have been implemented as programming libraries that could be integrated into applications. However, it has never been clear how to analyze their security and vulnerability characteristics, so their use has been limited mainly to experimental tests. Are there tools available for automated examination of the parameters one might want to understand about an encryption library? Are there groups of people interested in being introduced to new encryption concepts so they can carry out their own analysis of such a process? I'm not sure where to look.

The first encryption algorithm is a mono-alphabetic simple substitution cipher. It requires two keys to operate and is designed to frustrate frequency analysis. The longer key forms a table through which the plain text has an ordinary substitution cipher applied. Each encoded byte is then split into four two-bit values. The second, shorter key partitions the byte space into four groups of sixty-four unique bytes each; each two-bit value selects one of the groups, from which a byte is chosen at random. Encoding has two disadvantages: the output is four times larger, and encoding repeated data may still allow some frequency analysis.
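As I read that description, a minimal sketch of the scheme might look like the following. All names, and the use of Python's `random` module as the key schedule and as the source of the random group member, are my own assumptions for illustration, not the actual library:

```python
import random

def make_keys(seed):
    """Derive the two keys. Deriving both from one seed via Python's
    random module is an assumption made for this sketch."""
    rng = random.Random(seed)
    table = rng.sample(range(256), 256)       # longer key: substitution table
    shuffled = rng.sample(range(256), 256)    # shorter key: partition of the
    groups = [shuffled[i * 64:(i + 1) * 64]   # byte space into 4 groups of 64
              for i in range(4)]
    return table, groups

def encrypt(data, table, groups, rng):
    out = []
    for b in data:
        s = table[b]                   # ordinary substitution first
        for shift in (6, 4, 2, 0):     # split the byte into four 2-bit values
            out.append(rng.choice(groups[(s >> shift) & 0b11]))
    return bytes(out)

def decrypt(data, table, groups):
    which = {m: g for g, members in enumerate(groups) for m in members}
    inv = {c: p for p, c in enumerate(table)}
    out = []
    for i in range(0, len(data), 4):
        s = 0
        for c in data[i:i + 4]:
            s = (s << 2) | which[c]    # group membership recovers the 2 bits
        out.append(inv[s])
    return bytes(out)
```

The fourfold expansion mentioned above is visible directly: every plaintext byte becomes four output bytes, each drawn from a group of sixty-four candidates.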

The second encryption algorithm is a stream cipher like the first, but it operates internally on blocks of data. It uses two keys: the first is a two-dimensional array describing how to construct a (virtual) multidimensional grid, and the second is an initialization vector for the encoding/decoding engine. It attempts to defeat frequency analysis by encoding each byte together with a window of preceding bytes (initialized from the second key). A byte and its preceding window of bytes form a multidimensional index into the aforementioned grid. Unfortunately, encoding duplicate blocks of data longer than the window size starts yielding identical output.
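Reading that description with a window of size one (the smallest case), the "grid" reduces to one keyed byte permutation per possible previous byte. A hedged sketch under that interpretation, with all names and the key derivation invented for illustration:

```python
import random

def make_grid(seed):
    """One keyed permutation of 0..255 per possible previous byte
    (window size 1). Deriving it from random.Random is an assumption."""
    rng = random.Random(seed)
    return [rng.sample(range(256), 256) for _ in range(256)]

def encrypt(data, grid, iv):
    prev, out = iv, []
    for b in data:
        out.append(grid[prev][b])   # index: (previous byte, current byte)
        prev = b                    # the window slides over the plaintext
    return bytes(out)

def decrypt(data, grid, iv):
    # invert each row of the grid, then walk the same window
    inv = [{c: p for p, c in enumerate(row)} for row in grid]
    prev, out = iv, []
    for c in data:
        b = inv[prev][c]
        out.append(b)
        prev = b
    return bytes(out)
```

The weakness noted above shows up immediately in this sketch: in a plaintext like `XYXYXY`, every (previous byte, current byte) pair repeats, so the ciphertext becomes periodic as soon as a window recurs.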

Noctis Skytower
  • 231
  • 3
  • 8

4 Answers

11

Typically, evaluating a new algorithm involves a number of steps. They start with a quick review:

  • is it already known?
  • does it differ only in irrelevant ways from what is known?

which is often enough to expose vulnerabilities in many amateur attempts at encryption. The point is that there are a number of well-known ways to translate strings of symbols into other strings of symbols, and many of those have already been evaluated in the cryptographic literature. Good books on cryptography cover them.

Then, the more intensive analyses begin. These include:

  • functional inversion
  • analysis of the symbol statistics
  • differential cryptanalysis

and a variety of other mathematical techniques for extracting information about the keys from the intercepted stream.
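As a small illustration of the "analysis of the symbol statistics" step, the index of coincidence is a classic such statistic: it is unchanged by any mono-alphabetic substitution, so substituted natural-language text still stands out from uniformly distributed bytes. (A sketch of mine, not something from the answer.)

```python
from collections import Counter

def index_of_coincidence(data):
    """Probability that two distinct positions hold the same symbol.
    Roughly 1/256 for uniform random bytes; noticeably higher for
    natural-language text, even after a byte-for-byte substitution."""
    n = len(data)
    counts = Counter(data)
    return sum(c * (c - 1) for c in counts.values()) / (n * (n - 1))
```

A substitution cipher permutes the byte values but leaves the multiset of counts intact, which is exactly why this statistic (and frequency analysis generally) survives it.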

Also, if the technique involves nontraditional information stores (it doesn't have keys, it has unusual communication channels and protocols, etc.), then you must also analyze the various impersonation and interception scenarios on each channel.

From the description given, the first cryptographic algorithm would appear to fail the quick review. Substitution ciphers are well known, and it does not really matter how many times you compose them: their properties remain the same. If you had ten keys and applied substitution ten times before sending, the result would be equivalent to a single-key substitution cipher and no more secure. Unless there is some other tweak to the algorithm, it would quickly be dismissed as already known.
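The claim that composing substitution ciphers gains nothing can be checked directly: composing two byte permutations yields another single byte permutation, so the ten-key chain collapses to one equivalent key. (A quick illustration; the key names are mine.)

```python
import random

def substitute(data, table):
    """Apply a mono-alphabetic substitution given a 256-entry table."""
    return bytes(table[b] for b in data)

rng = random.Random(0)
k1 = rng.sample(range(256), 256)            # first substitution key
k2 = rng.sample(range(256), 256)            # second substitution key
composed = [k2[k1[b]] for b in range(256)]  # single equivalent key

msg = b"composition adds no security"
assert substitute(substitute(msg, k1), k2) == substitute(msg, composed)
```

The same collapse applies inductively to any number of chained substitutions, which is why the count of keys buys no extra security.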

ex0du5
  • 603
  • 3
  • 9

10

This is not an answer to your question. But as is often stated in crypto circles:

Cryptographic protocols and algorithms are difficult to get right, so do not create your own. Instead, where you can, use protocols and algorithms that are widely-used, heavily analyzed, and accepted as secure. When you must create anything, give the approach wide public review and make sure that professional security analysts examine it for problems. In particular, do not create your own encryption algorithms unless you are an expert in cryptology, know what you're doing, and plan to spend years in professional review of the algorithm. Creating encryption algorithms (that are any good) is a task for experts only.

Dave Clarke
  • 20,345
  • 4
  • 70
  • 114
3

First, I must agree with Dave's answer: in order to compose a proper cipher, one needs to be knowledgeable enough to analyze ciphers. Otherwise, there is a high probability that your cipher suffers from a well-known attack, and you will be dismissed as a crank.

That being said, there are several groups that specialize in cryptanalysis. Breaking codes is their bread and butter. All day long they wait for other groups to come up with new ciphers so that they can try to break them (or improve a previous cryptanalysis, etc.). They also try to break other primitives, such as hash functions.

Several groups come to mind, but instead of listing all the people (and forgetting half of them), let me refer you to NIST's recent SHA-3 competition, in which groups were (actually, still are!) invited to design hash functions and to break them. Look at the list of candidates and their designers (say, those remaining in round 2). Those people know a great deal about cryptanalysis and can evaluate a new cipher. Look at their publication lists for more information on the current state of the art in cryptanalysis techniques and methods.

A little older but also interesting is NIST's AES competition, in which several groups submitted new candidate block ciphers. Those groups (as well as the wider crypto community) analyzed the candidate ciphers until one was selected to become NIST's standard.

Ran G.
  • 20,884
  • 3
  • 61
  • 117
1

The answer to this problem is more general: if you need an expert to review a novel algorithm in a field with which you are not completely familiar, your best bet is to read up on and study that field until you become that expert yourself. And by "expert" I mean somebody who does research and publishes in that field!

If you're serious about your interest in an area, then you should also be willing to take on the challenge of gaining some deeper insight into it. Chances are you'll find the flaws in your algorithms yourself, but you may just as well find improvements. Most importantly, though, you will only be able to convince others to invest the time needed to consider your work once you've established some credibility in that area yourself.

Pedro
  • 1,116
  • 5
  • 14