Quantum computing is a novel computing paradigm that was introduced as a concept in the 1980s and has enjoyed a lot of attention in recent years, as the research and development effort towards building actual quantum computers has started to bear fruit. Quantum computing holds great promise for solving some of the most difficult computational problems and is expected to bring major advantages, for example, in drug development, weather forecasting, and various optimization problems. Unfortunately, quantum computing also has a darker side: if large enough quantum computers become a reality, they may solve the computational problems that are the basis of modern computer security.
Specifically, a quantum algorithm introduced by Peter Shor in the mid-1990s, now known as Shor's algorithm, can perform integer factorisation and find discrete logarithms in polynomial time (that is, significantly faster than is possible with classical computers). RSA and Elliptic Curve Cryptography (ECC), which together cover practically all currently deployed public key cryptosystems, are based on integer factorisation and discrete logarithms, respectively. Consequently, quantum computing poses a threat to RSA and ECC, and to the security of the modern computation and communication infrastructure as a whole. The state of the art in quantum computers is still far from being able to break practical cryptosystems, and certain difficult technical problems must be solved before quantum computers can be scaled to sizes that pose a practical threat. Nevertheless, the threat of quantum computing must be taken seriously and addressed proactively, because data often needs to remain secure for decades, and rolling any new cryptosystem into practical use takes a long time.
Not all cryptography is at risk
Despite the gloomy sky, it is important to understand that not all cryptography is at risk and that there is a clear roadmap for protecting systems even in the era of quantum computing. First of all, symmetric cryptography (for example, AES or ChaCha20) is not affected by quantum computing in any significant way. There exists a quantum algorithm called Grover's algorithm which solves generic search problems faster and, consequently, also affects symmetric cryptography, but it can be countered simply by doubling key lengths. That is, you can use 256-bit keys in systems where you currently use 128-bit keys and you will be safe: for instance, replace AES-128 with AES-256 and you are done.
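The key-doubling rule follows from simple arithmetic: Grover's algorithm offers a quadratic speedup on brute-force key search, so an n-bit key effectively provides only about n/2 bits of security against a quantum attacker. A minimal back-of-the-envelope sketch (which deliberately ignores the very large constant factors of running Grover's algorithm in practice):

```python
# Grover's algorithm searches an unsorted space of 2**n keys in roughly
# 2**(n/2) quantum operations, halving the effective security level of a
# symmetric key. Doubling the key length restores the original margin.

def effective_security_bits(key_bits: int, quantum: bool = False) -> int:
    """Rough security level of an ideal cipher with a key_bits-bit key."""
    return key_bits // 2 if quantum else key_bits

# AES-128: 128-bit classical security, but only ~64-bit quantum security.
print(effective_security_bits(128, quantum=True))   # 64
# AES-256 retains a comfortable ~128-bit margin even against Grover.
print(effective_security_bits(256, quantum=True))   # 128
```

This is why AES-256 is considered quantum-safe: even a halved security level of 128 bits remains out of reach for any foreseeable attacker.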
Although the currently deployed public key cryptosystems are vulnerable to Shor's algorithm, the general idea of public key cryptography (asymmetric cryptography) is not, and new public key cryptosystems can be designed based on computational problems that cannot be solved with Shor's algorithm. Such algorithms are already available and continue to be developed and studied further. They are often referred to as Post-Quantum Cryptography (PQC), but are sometimes also called quantum-proof, quantum-safe, or quantum-resistant cryptography. The cryptographic research community has studied PQC for more than 15 years (the first academic conference on PQC was held in 2006). It is essential to realize that when it comes to implementing PQC algorithms, they are like any other algorithms and can be implemented with existing classical computers.
NIST PQC competition and its current state
In December 2016, the American National Institute of Standards and Technology (NIST) announced that it would organize a competition-like process for developing a standard for PQC. NIST aims to standardize PQC algorithms in two categories: key-encapsulation mechanisms and digital signature algorithms. The PQC competition received 69 proposals that fulfilled the initial requirements and entered Round 1. At the time of writing this blog, the competition has proceeded to Round 3, and 15 algorithms remain, seven of which are finalists and the rest alternates. After Round 3, NIST will select the first winners from among the finalists. They will be included in the forthcoming PQC standard, which is expected to be ready within a couple of years.
NIST said in the Round 2 status report that Kyber and Saber are its favorite key-encapsulation mechanisms and that Dilithium and Falcon are favored for digital signatures. Our view at Xiphera is that the situation has not changed, except that for digital signatures Dilithium and Falcon have become even stronger favorites because Rainbow has been broken. The most promising algorithms (Kyber, Saber, and Dilithium) are based on different variations of the learning with errors problem over rings. The advantages of such cryptosystems over other algorithms in the competition are that they have relatively small key sizes and good performance, and that they lack the obvious weak points of many other candidates (for example, Classic McEliece provides good confidence in its security and fast computations, but suffers from extremely large key sizes).
But, in the end, we must wait for NIST to make the final decision before we have any certainty about the algorithms that will form the basis of the future PQC standard. NIST has stated that the winners will be announced very soon, by the end of March 2022. Even after that, the competition will continue with Round 4, which includes selected algorithms that may be added to the standard later on.
Certain European nations (for instance, Germany and France) have also published their own recommendations on PQC. Anyone who designs new systems that require public key cryptography should study them as well.
What can be expected to change with PQC?
The new PQC algorithms will require changes to currently used security protocols, and it can be expected that the algorithms can rarely be swapped in a simple plug-and-play manner. One notable difference, especially compared with current ECC-based solutions, is that key sizes will grow regardless of which algorithm is announced as a winner, and for certain algorithms the growth would be significant. Additionally, protocols that currently rely on Diffie-Hellman key exchange (for instance, TLS 1.3) will need to change to use a key-encapsulation mechanism for the key exchange. There may also be differences in computation speed compared to current highly optimized ECC implementations, but the difference in this respect is probably smaller than people typically believe. Algorithms such as Kyber and Saber may even be faster than their current ECC counterparts.
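To make the difference from Diffie-Hellman concrete, the sketch below shows the three-operation interface (key generation, encapsulation, decapsulation) that the NIST key-encapsulation mechanisms share: the sender encapsulates a fresh shared secret under the receiver's public key, rather than both parties combining public values symmetrically. The toy "cipher" inside is deliberately insecure and exists only to make the example self-contained; the function names are illustrative, not taken from any real library:

```python
import hashlib
import secrets

def keygen():
    """Generate a keypair. The 'public key' here is a toy (NOT secure)."""
    sk = secrets.token_bytes(32)
    pk = hashlib.sha256(sk).digest()
    return pk, sk

def encaps(pk):
    """Sender: produce a ciphertext and a shared secret from pk alone."""
    m = secrets.token_bytes(32)                # fresh random value
    ct = bytes(a ^ b for a, b in zip(m, pk))   # toy "encryption" (NOT secure)
    ss = hashlib.sha256(m).digest()            # derived shared secret
    return ct, ss

def decaps(sk, ct):
    """Receiver: recover the same shared secret from the ciphertext."""
    pk = hashlib.sha256(sk).digest()
    m = bytes(a ^ b for a, b in zip(ct, pk))
    return hashlib.sha256(m).digest()

pk, sk = keygen()
ct, ss_sender = encaps(pk)
ss_receiver = decaps(sk, ct)
assert ss_sender == ss_receiver                # both sides share the secret
```

Real KEMs such as Kyber implement exactly this interface shape, with the hard lattice problem replacing the toy construction above.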
Another aspect to consider is that PQC algorithms have not yet gained the same level of confidence in their security that RSA and ECC enjoy today. Therefore, it is not completely out of the question that a severe weakness, even against classical attacks, could be found, and Rainbow already gives us a frightening example. For this reason, many experts (for example, the French ANSSI) recommend designing hybrid systems that combine PQC with classical public key cryptography such as ECC. Such hybrid protocols are already under development (see, for example, the proposals for TLS).
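One common way to build such a hybrid is a concatenate-then-derive construction: run both a classical and a PQC key exchange, then feed both shared secrets through a key-derivation step, so the resulting session key stays secure as long as either exchange remains unbroken. A minimal sketch of this idea, with an illustrative function name and context label (not taken from any standard):

```python
import hashlib
import hmac

def combine_shared_secrets(ss_classical: bytes, ss_pqc: bytes) -> bytes:
    # Concatenate both shared secrets and derive the session key with
    # HMAC-SHA-256. An attacker must break BOTH key exchanges (the
    # classical one AND the post-quantum one) to recover the session key.
    return hmac.new(b"hybrid-kex-v1", ss_classical + ss_pqc,
                    hashlib.sha256).digest()

# ss_classical would come from e.g. an ECDH exchange, ss_pqc from a PQC KEM.
session_key = combine_shared_secrets(b"\x01" * 32, b"\x02" * 32)
assert len(session_key) == 32
```

Deployed hybrid schemes use standardized key-derivation functions and bind additional protocol context into the derivation, but the underlying principle is the same.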
We at Xiphera are eagerly waiting for the NIST decision and planning to start offering PQC IP cores and solutions to our customers soon after that.
Xiphera’s webinar: State of Play of Post Quantum Cryptography
Dr. Matti Tommiska reviews the current status of quantum computing, and specifically its impact on the currently used public-key cryptographic algorithms. The winning algorithms of the NIST (National Institute of Standards and Technology) PQC (Post Quantum Cryptography) competition will form the foundation of future public-key cryptographic algorithms and protocols. Since the finalization and adoption of PQC algorithms and protocols will likely take years, the crypto agility of FPGAs has clear advantages over fixed-function silicon.
Find the full recording of the webinar and presentation slides here.
If you want to receive information about Xiphera’s upcoming webinars, sign up for Xiphera’s webinar subscription list here, and you’ll never miss any of our fascinating webinars!