"Post-Quantum Cryptography" is Not Going to Work

by Dave D'Rave

In the community, it is widely thought that within the next ten to twenty years, quantum computers will make most existing cipher systems obsolete.  For this reason, NIST has a fairly large program to develop "post-quantum cryptography."  They have even had a public contest at: csrc.nist.gov/projects/post-quantum-cryptography

There are various problems with this program, due to the fact that a bunch of government people are in charge, and due to the fact that almost all of the proposals are coming from the usual black budget contractors.  Specifically, the algorithms which are being considered are things like elliptic curve, integer programming methods, lattice methods, and similar legacy ideas.  Very few people in the program management have quantum computer expertise, and it looks like nobody is a hacker.

Let's consider a major use case of cryptography: encrypting a file.  Today, this means a plain old block cipher, usually with some kind of block chaining mode or an incrementing counter key.  Padding, scrambling, and similar methods are added on top.
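The "incrementing counter key" idea is essentially counter (CTR) mode.  Here is a minimal sketch in Python, using a SHA-256-derived keystream as a stand-in for a real block cipher (the construction is illustrative, not production crypto):

```python
import hashlib

BLOCK = 16  # bytes per cipher block

def keystream_block(key: bytes, counter: int) -> bytes:
    # Toy stand-in for a real block cipher: derive one 16-byte
    # keystream block from the key and a per-block counter.
    return hashlib.sha256(key + counter.to_bytes(8, "big")).digest()[:BLOCK]

def ctr_encrypt(key: bytes, data: bytes) -> bytes:
    # XOR each 16-byte chunk of the file with its own keystream block.
    out = bytearray()
    for i in range(0, len(data), BLOCK):
        chunk = data[i:i + BLOCK]
        pad = keystream_block(key, i // BLOCK)
        out.extend(c ^ p for c, p in zip(chunk, pad))
    return bytes(out)

# Counter mode is its own inverse: decryption is the same operation.
ctr_decrypt = ctr_encrypt
```

Note that every block is keyed off the same secret; only the counter changes.  That key reuse is exactly what the attack below exploits.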

Consider how a quantum hacker is going to attack this problem: The most likely situation is one where we have intercepted the ciphertext and have also obtained the matching plaintext.  (Boris and Natasha are a good source of plaintext; websites in .gov are another good place to look.)  Brute-force attacks using a quantum computer can recover the key of individual cipher blocks, and this is often enough to break the entire system.  (Obviously, if you use a different key for each block of, say, an AES-256 message, then you have reinvented the one-time pad.  Key reuse is a feature of all practical crypto systems.)
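The known-plaintext brute force can be sketched classically; a quantum computer running Grover's algorithm would search the same key space in roughly the square root of the number of steps.  The 16-bit-key toy cipher here is a hypothetical stand-in for something like AES-256:

```python
import hashlib

def toy_encrypt(key: int, block: bytes) -> bytes:
    # Hypothetical block cipher with a 16-bit key, standing in for AES.
    return hashlib.sha256(key.to_bytes(2, "big") + block).digest()[:8]

def brute_force_key(plaintext: bytes, ciphertext: bytes):
    # Classical exhaustive search: up to 2**16 trial encryptions.
    # Grover's algorithm would need only on the order of 2**8 queries
    # against the same search problem.
    for k in range(2 ** 16):
        if toy_encrypt(k, plaintext) == ciphertext:
            return k
    return None
```

Once one block's key falls, every block encrypted under the reused key falls with it.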

You are wondering how this works, exactly.  Quantum computers are able to represent a plaintext block as a vector in an appropriate Hilbert space, and are able to represent the ciphertext block as another vector in the same Hilbert space.  The key is a function which describes the n-dimensional angle between these two vectors.  It turns out that there are transforms which can turn this situation into a one-dimensional representation suitable for a Fourier transform.  It also turns out that there are transforms which simplify the construction of practical quantum computer algorithms, for example by defining a set of permutations which map the ciphertext into a standard format, such as 0x0000000000000000.  The central idea is that if you present a quantum computer with a problem which has one, and only one, solution, it will find that solution efficiently.
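That "one and only one solution" claim matches the textbook statement of Grover's algorithm.  With N candidate keys and M of them consistent with the plaintext/ciphertext pair, the required number of Grover iterations is roughly (this is the standard bound, stated independently of the geometric picture above):

```latex
% Grover search over N = 2^k candidate keys, M of which match
\theta = \arcsin\sqrt{M/N}, \qquad
R \approx \frac{\pi}{4}\sqrt{\frac{N}{M}} \text{ iterations}
% With M = 1 (a unique key), R is about (pi/4) * 2^{k/2}, and a
% measurement then yields the key with high probability.
% With M large (many consistent keys), a measurement returns one
% key sampled from the M candidates -- not a unique answer.
```

The M-large case is precisely the defense the next sections describe.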

If the goal is to build a classical encryption algorithm which cannot (easily) be broken by a quantum computer, then one approach is to build crypto-systems which have ambiguous keys.  Another approach is to build a system which has an ambiguous mapping between plaintext and ciphertext.

Block cipher systems which have ambiguous keys generally have the property that the key size is larger than the block size.  If a block cipher has ambiguous keys, then there are many different keys which will transform the plaintext into the ciphertext.  Typical numbers for the degree of ambiguity are 64K and 4G (16 bits and 32 bits of ambiguity).  If, given the plaintext and the ciphertext, a straightforward quantum computer algorithm attempts to reverse the encryption algorithm, it will return a superposition of some large number of possible keys.  This is not useful for further processing.
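The ambiguity claim is easy to check at toy scale.  With a 16-bit key and an 8-bit block (a hypothetical construction, chosen so the whole key space can be enumerated), on average 2^16 / 2^8 = 256 distinct keys map any given plaintext byte to any given ciphertext byte:

```python
import hashlib

def toy_encrypt8(key: int, block: int) -> int:
    # Toy cipher: 16-bit key, 8-bit block.  The key space is 256
    # times larger than the block space, so keys must collide.
    data = bytes([key >> 8, key & 0xFF, block])
    return hashlib.sha256(data).digest()[0]

def matching_keys(pt: int, ct: int) -> list:
    # Every key that transforms this plaintext into this ciphertext.
    return [k for k in range(2 ** 16) if toy_encrypt8(k, pt) == ct]
```

An attacker who recovers "the" key for one block has actually recovered a crowd of candidates, and must test each against further blocks.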

Similarly, you can obtain ambiguous text mapping as follows: If the crypto system uses a 16-byte block cipher, you would normally break up a long message into 16-byte chunks and encrypt them individually.  Instead, you break the message up into 12-byte chunks, append a four-byte random bit string to each, and then encrypt each resulting block using the 16-byte block cipher as usual.  This technique means that, for each chunk of plaintext, there are 4G possible ciphertext blocks.
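A sketch of that chunk-and-salt layout, using a throwaway XOR pad as a stand-in for the real 16-byte block cipher (the pad itself is insecure; only the 12-plus-4 block structure is the point):

```python
import hashlib
import os

CHUNK, SALT, BLOCK = 12, 4, 16

def toy_block_cipher(key: bytes, block: bytes) -> bytes:
    # Placeholder for AES: XOR with a key-derived pad (insecure toy,
    # and conveniently its own inverse).
    pad = hashlib.sha256(key).digest()[:BLOCK]
    return bytes(b ^ p for b, p in zip(block, pad))

def encrypt_salted(key: bytes, msg: bytes) -> bytes:
    # 12 bytes of message plus 4 random bytes per 16-byte block,
    # so each plaintext chunk has 2**32 possible ciphertexts.
    out = bytearray()
    for i in range(0, len(msg), CHUNK):
        block = msg[i:i + CHUNK].ljust(CHUNK, b"\x00") + os.urandom(SALT)
        out += toy_block_cipher(key, block)
    return bytes(out)

def decrypt_salted(key: bytes, ct: bytes) -> bytes:
    # Decrypt each block and discard the 4 random trailing bytes.
    out = bytearray()
    for i in range(0, len(ct), BLOCK):
        out += toy_block_cipher(key, ct[i:i + BLOCK])[:CHUNK]
    return bytes(out)
```

Encrypting the same message twice now yields different ciphertexts, which is what frustrates a search for a unique plaintext-to-ciphertext mapping.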

Similar methods are used to frustrate statistical attacks, language-recognition attacks, etc.

If you read the NIST website, it looks nothing like this.  Instead, they are talking about algorithms, without any discussion of what characteristics of a code system would be most vulnerable to quantum computer cryptanalysis.  While it is possible that there are some smart people in the back room who are going to make the final choice about which algorithms (if any) get chosen as the next standard, it is more likely that NIST will be under pressure to "do something," and will choose the best set of algorithms available.  After all, increasing the key size and moving to somewhat more complex algorithms will push back the day when algorithmic crypto systems are obsolete.  It's just that this approach will not be good enough.

"Post-quantum cryptography" is not going to work.
