
Post-Quantum Cryptography



Cryptography Secured Against Quantum Computers

Not to be confused with Quantum cryptography, which, I assure you, is an entirely different flavor of existential dread.

Post-quantum cryptography (PQC), a term that rolls off the tongue with all the grace of a brick, is also known by the equally evocative titles of quantum-proof, quantum-safe, or quantum-resistant cryptography. This field is dedicated to the rather pressing task of developing cryptographic algorithms—primarily those of the public-key variety—that are designed to withstand a cryptanalytic attack from a sufficiently powerful quantum computer. It’s a race against an inevitable future, one where our current digital fortresses crumble under the sheer, brute-force elegance of quantum mechanics.

The vast majority of widely deployed public-key algorithms today hang their entire security on the perceived computational difficulty of one of three specific mathematical problems: the venerable integer factorization problem, the discrete logarithm problem, or its more fashionable cousin, the elliptic-curve discrete logarithm problem. These problems, once considered intractable even for the most formidable classical supercomputers, are, unfortunately for us, mere trifles for a sufficiently powerful quantum computer. Such a machine, armed with Shor’s algorithm and its elegant, terrifying efficiency, could slice through these mathematical foundations with alarming ease. And who’s to say Shor’s algorithm is the only threat? Other quantum-accelerated alternatives are perpetually lurking, waiting to be discovered or refined, further underscoring the urgency of this transition.
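
The reason Shor’s algorithm is so devastating can be seen even classically: once the period r of a modulo n is known, factoring reduces to two gcd computations. The sketch below (illustrative only; function names are ours) brute-forces the period that a quantum computer would find efficiently:

```python
from math import gcd

def order(a: int, n: int) -> int:
    """Multiplicative order of a modulo n, found by brute force.
    Finding this period is the step a quantum computer accelerates."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n: int, a: int):
    """Classical post-processing of Shor's algorithm: given the period r
    of a mod n, gcd(a^(r/2) +- 1, n) yields nontrivial factors."""
    g = gcd(a, n)
    if g != 1:                      # lucky guess: a already shares a factor
        return tuple(sorted((g, n // g)))
    r = order(a, n)
    if r % 2:                       # odd period: pick a different base a
        return None
    y = pow(a, r // 2, n)
    if y == n - 1:                  # trivial square root of 1: retry
        return None
    return tuple(sorted((gcd(y - 1, n), gcd(y + 1, n))))
```

For n = 15 and base a = 7, the period is 4, and the gcds of 7² ± 1 with 15 yield the factors 3 and 5; the only step a classical machine cannot do efficiently is order(), which quantum period-finding replaces.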

As of the current calendar cycle (2025, for those still keeping track), the processing power of existing quantum computers remains, thankfully, insufficient to pose a real and present danger to the cryptographic algorithms that underpin global digital security. They lack the computational muscle, the sheer number of stable qubits, to execute Shor’s algorithm at a scale that matters. However, the migration to quantum-safe cryptography is not a trivial undertaking; it’s a monumental, multi-year endeavor requiring careful planning and execution across entire digital infrastructures. This lengthy transition period is precisely why cryptographers aren’t twiddling their thumbs. They are already diligently designing and validating new algorithms, bracing for what has been colloquially dubbed “Y2Q” or “Q-Day”—the day when current cryptographic standards will be rendered obsolete and vulnerable to the relentless power of quantum computing attacks. To assist organizations in navigating this looming threat, Mosca’s theorem offers a practical risk analysis framework, helping them to quantify their exposure and determine the necessary speed of their cryptographic migration efforts. Because, apparently, humanity needs a theorem to tell it when to act.
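
Mosca’s theorem itself is refreshingly simple arithmetic: if x is how long data must stay confidential, y is how long migration takes, and z is the time until a cryptographically relevant quantum computer arrives, you are already at risk whenever x + y > z. A minimal sketch (the function name is our own):

```python
def mosca_at_risk(shelf_life_years: float,
                  migration_years: float,
                  years_to_quantum: float) -> bool:
    """Mosca's inequality: data is at risk when the time it must stay
    secret (x) plus the time needed to migrate (y) exceeds the time
    until a cryptographically relevant quantum computer exists (z)."""
    return shelf_life_years + migration_years > years_to_quantum
```

Data that must stay secret for 10 years, with a 5-year migration, is already exposed if quantum computers arrive within 12 years, even though "Q-Day" is still in the future.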

This critical work has not gone unnoticed, garnering significant attention from both the academic ivory tower and the pragmatic halls of industry. Key platforms for collaboration and dissemination include the PQCrypto conference series, which has been hosted regularly since 2006, along with numerous workshops on Quantum Safe Cryptography organized by the European Telecommunications Standards Institute (ETSI) and the esteemed Institute for Quantum Computing. Furthermore, the unsettling whispers and rumored existence of widespread “harvest now, decrypt later” programs—where encrypted data is collected today with the intent of decrypting it years down the line once quantum capabilities are realized—have provided a particularly potent, if dystopian, motivation for the accelerated introduction of post-quantum algorithms. After all, information that is sensitive today will likely remain sensitive tomorrow, regardless of technological advancements designed to expose it.

In stark contrast to the existential threat that quantum computing poses to our current suite of public-key algorithms, most contemporary symmetric cryptographic algorithms and hash functions are generally considered to be relatively robust against attacks by nascent quantum computers. While Grover’s quantum algorithm does indeed offer a theoretical speed-up for attacking symmetric ciphers, this threat can be effectively mitigated by the rather straightforward measure of simply doubling the key size. It’s almost disappointingly simple. Thus, the landscape of post-quantum symmetric cryptography doesn’t necessitate a radical departure from existing symmetric cryptographic practices; it merely demands a bit more elbow room, or rather, key space.

In a significant step towards this quantum-resistant future, the U.S. National Institute of Standards and Technology (NIST) took a definitive stride in 2024, releasing the final versions of its first three Post-Quantum Cryptography Standards. A small victory, perhaps, but a victory nonetheless.

Preparation

The bedrock of any modern digital infrastructure is its robust cybersecurity, and at the heart of this lies a complex web of cryptographic systems. These systems are not merely optional extras; they are absolutely vital for safeguarding the confidentiality and authenticity of data, whether it’s at rest or in transit. The advent of practical quantum computing represents an unprecedented threat to many of the cryptographic algorithms that currently uphold these critical protection goals.

Consider the data that exists today, meticulously encrypted and stored, or constantly flowing across networks. If this data is not “quantum-safe” – meaning it’s secured by algorithms vulnerable to quantum attacks – and if it must retain its confidentiality for an extended period, it faces a grim prognosis. In the future, once quantum computers mature, this data could be compromised through what is ominously termed “store now, decrypt later” attacks. The implication is chilling: secrets thought secure for decades could be laid bare. Beyond confidentiality, the very authenticity of digital communications and data will also be fundamentally jeopardized by quantum capabilities, potentially leading to widespread forgery and manipulation.

The only credible countermeasure to the profound threat that quantum computing poses to our global cybersecurity infrastructure is a timely, comprehensive, and meticulously coordinated transition to post-quantum cryptography (PQC). This isn’t just about swapping out a few algorithms; it’s about re-architecting the very foundations of digital trust. It demands foresight, international collaboration, and an understanding that the consequences of inaction are, quite frankly, catastrophic. Because, as always, humans tend to wait until the last possible moment to address an obvious problem.

Algorithms

The intellectual vanguard of post-quantum cryptography research is currently focused on developing, refining, and, one hopes, securing algorithms across six distinct mathematical landscapes. It’s a bit like a digital Noah’s Ark, trying to gather two of every kind before the flood.

Lattice-based cryptography

This approach, a rather elegant one if you appreciate the geometry of numbers, forms the basis for several promising cryptographic systems. It leverages the inherent computational difficulty of certain problems within mathematical lattices. Among the most prominent examples are systems built upon the learning with errors (LWE) problem, and its more structured variant, ring learning with errors (ring-LWE). These include the ring learning with errors key exchange and the ring learning with errors signature.
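
The LWE problem at the heart of these schemes is easy to state: given many pairs (a, b) with b = ⟹a, s⟩ + e (mod q) for a small random error e, recover the secret s. A toy sample generator (insecure, illustrative parameters only; real schemes use dimensions in the hundreds):

```python
import random

def lwe_samples(n, m, q, secret, max_err=1):
    """Generate m LWE samples (a_i, b_i) with b_i = <a_i, s> + e_i (mod q).
    Toy sketch: the small error e_i is what makes recovering s hard."""
    samples = []
    for _ in range(m):
        a = [random.randrange(q) for _ in range(n)]
        e = random.randint(-max_err, max_err)   # the small "error"
        b = (sum(ai * si for ai, si in zip(a, secret)) + e) % q
        samples.append((a, b))
    return samples
```

Without the error term this would be ordinary linear algebra, solvable by Gaussian elimination; with it, recovering s is believed to be as hard as worst-case lattice problems.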

Beyond these newer constructions, the field also encompasses older, yet still relevant, schemes such as NTRU (specifically, NTRUEncrypt) and GGH encryption. It’s worth noting that schemes like NTRU encryption have undergone extensive scrutiny for many years, remarkably, without anyone managing to discover a feasible attack that would render them insecure. This longevity, in the often-fleeting world of cryptography, is a testament to their underlying robustness. Furthermore, some of the more modern algorithms, particularly those based on the ring-LWE problem, benefit from rigorous proofs that reduce their security to a worst-case problem in lattice theory. This means that if you can break the cryptography, you’ve essentially solved a problem that’s widely believed to be extremely difficult, offering a strong theoretical guarantee.

Historically, the Post-Quantum Cryptography Study Group, a body sponsored by the European Commission, once suggested that the Stehle–Steinfeld variant of NTRU should be prioritized for standardization over the original NTRU algorithm. This recommendation was made, in part, due to the fact that, at the time, the original NTRU was still protected by patents. However, subsequent studies have hinted that NTRU may, in fact, possess more secure properties than some other lattice-based algorithms, proving that even old dogs can learn new tricks, or at least, be re-evaluated. Most recently, two lattice-based algorithms, CRYSTALS-Kyber for key encapsulation and CRYSTALS-Dilithium for digital signatures, were among the very first post-quantum algorithms to achieve standardization by NIST, marking a significant milestone in the journey towards quantum resistance.

Multivariate cryptography

This branch of cryptography delves into the notoriously complex world of solving systems of multivariate equations over finite fields. It’s a mathematical landscape littered with broken schemes and the ghosts of past attempts. It includes cryptographic systems like the Rainbow scheme, which is itself an iteration of the Unbalanced Oil and Vinegar (UOV) scheme. The security of such systems hinges on the sheer difficulty of solving these intricate systems of equations.

While numerous past attempts to construct robust multivariate equation encryption schemes have, shall we say, failed spectacularly, multivariate signature schemes, such as Rainbow, hold considerable promise. They could potentially serve as the bedrock for a quantum-secure digital signature, providing integrity and authenticity in a post-quantum world. It’s worth noting, for those who appreciate the finer points of intellectual property, that the Rainbow Signature Scheme is patented, though its patent is set to expire in August 2029.

Hash-based cryptography

Born from the foundational work of Ralph Merkle in the late 1970s, hash-based digital signatures represent an interesting, if somewhat constrained, alternative to the number-theoretic signatures like RSA and DSA that have dominated the cryptographic landscape for decades. This category encompasses schemes such as Lamport signatures, the seminal Merkle signature scheme (also known as Merkle Hash Tree signatures), and its more sophisticated descendants like XMSS (eXtended Merkle Signature Scheme), SPHINCS, and WOTS (Winternitz One-Time Signature scheme).
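
A Lamport signature, the simplest member of this family, needs nothing but a hash function: the private key is 256 pairs of random values, the public key is their hashes, and signing reveals one value per bit of the message digest. A minimal sketch (assuming SHA-256; naming is ours):

```python
import hashlib
import secrets

def _h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def lamport_keygen():
    """Private key: 256 pairs of random 32-byte preimages.
    Public key: the SHA-256 hashes of those preimages."""
    sk = [[secrets.token_bytes(32), secrets.token_bytes(32)] for _ in range(256)]
    pk = [[_h(x) for x in pair] for pair in sk]
    return sk, pk

def _bits(msg: bytes):
    digest = _h(msg)
    return [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def lamport_sign(msg: bytes, sk):
    """Reveal one preimage per digest bit -- hence strictly one-time."""
    return [sk[i][b] for i, b in enumerate(_bits(msg))]

def lamport_verify(msg: bytes, sig, pk) -> bool:
    return all(_h(sig[i]) == pk[i][b] for i, b in enumerate(_bits(msg)))
```

Each revealed preimage can never safely be reused, which is exactly the one-time limitation that historically held these schemes back.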

The primary drawback that historically limited interest in these schemes was a significant one: for any given hash-based public key, there’s an inherent limit to the number of signatures that can be generated using its corresponding set of private keys. This “one-time” or “bounded-time” nature made them less practical for widespread, general-purpose use. However, the looming threat of quantum computing has dramatically revived interest in these algorithms. Their security, rooted in the properties of cryptographic hash functions rather than number theory, appears to be far more resistant to quantum attacks. Conveniently, there appear to be no patents on the core Merkle signature scheme, and a plethora of non-patented hash functions are available to be used with these constructions. The stateful hash-based signature scheme XMSS, a creation of a research team led by Johannes Buchmann, is meticulously detailed in RFC 8391, providing a clear path for its implementation.
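
The Merkle construction that lifts such one-time keys into a bounded-many-time scheme is a binary hash tree: many one-time public keys become leaves, and only the root needs to be published. A sketch of root computation and authentication paths (assuming SHA-256 and a power-of-two leaf count; naming is ours):

```python
import hashlib

def _h(b: bytes) -> bytes:
    return hashlib.sha256(b).digest()

def merkle_root(leaves):
    """Hash adjacent pairs upward until a single root remains."""
    level = [_h(leaf) for leaf in leaves]
    while len(level) > 1:
        level = [_h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def auth_path(leaves, index):
    """Sibling hashes needed to recompute the root from leaf `index`."""
    level = [_h(leaf) for leaf in leaves]
    path = []
    while len(level) > 1:
        path.append(level[index ^ 1])          # sibling at this level
        level = [_h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return path

def verify_path(leaf, index, path, root) -> bool:
    node = _h(leaf)
    for sibling in path:
        node = _h(sibling + node) if index & 1 else _h(node + sibling)
        index //= 2
    return node == root
```

A signature in the Merkle scheme is then a one-time signature plus such an authentication path, letting a verifier check any leaf against the single published root.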

It’s also worth noting that while many of these schemes are one-time or bounded-time, Moni Naor and Moti Yung introduced the concept of Universal One-Way Hash Functions (UOWHF) in 1989. They subsequently designed a signature scheme (the Naor-Yung scheme) based on this hashing paradigm that, crucially, can be unlimited-time in its use. This marked a significant advancement, as it was one of the first such signatures that didn’t rely on the “trapdoor” properties characteristic of number-theoretic cryptography.

Code-based cryptography

This category of cryptographic systems derives its security from the well-established mathematical theory of error-correcting codes. It’s a field that has seen algorithms like the foundational McEliece and Niederreiter encryption schemes, along with the related Courtois, Finiasz and Sendrier Signature scheme. The original McEliece cryptosystem, utilizing random Goppa codes, has shown remarkable resilience, having withstood over four decades of intense cryptographic scrutiny without succumbing to any practical attacks. This kind of longevity is almost unheard of in this business.

However, a cautionary tale exists within this family: many variants of the McEliece scheme, often designed to reduce the size of the cryptographic keys by introducing more structured codes, have unfortunately been found to be insecure. It seems that trying to optimize for convenience can sometimes compromise the very security you’re trying to achieve. Nevertheless, the Post-Quantum Cryptography Study Group, under the patronage of the European Commission, has specifically recommended the original McEliece public key encryption system as a strong candidate for long-term protection against the looming threat of quantum computer attacks. Furthermore, in 2025, NIST announced its intention to standardize the code-based HQC encryption algorithm, solidifying the importance of this approach.

Isogeny-based cryptography

These cryptographic systems tap into the intricate properties of isogeny graphs of elliptic curves (and, for the truly adventurous, higher-dimensional abelian varieties) over finite fields. Specifically, they often leverage the unique structure of supersingular isogeny graphs to construct cryptographic primitives. Among the more recognized constructions in this intellectually demanding field is CSIDH (Commutative Supersingular Isogeny Diffie–Hellman), a Diffie–Hellman-like key exchange protocol. CSIDH presents itself as a remarkably straightforward, quantum-resistant replacement for the ubiquitous Diffie–Hellman and elliptic curve Diffie–Hellman key-exchange methods that currently underpin much of our secure digital communication.

Another notable representative is the signature scheme SQIsign, which derives its security from the deep mathematical concept of the categorical equivalence between supersingular elliptic curves and maximal orders within specific types of quaternion algebras. However, not all ventures in this field have been equally successful. A widely recognized construction, SIDH/SIKE (Supersingular Isogeny Diffie–Hellman / Supersingular Isogeny Key Encapsulation), suffered a rather spectacular and public break in 2022. This incident, while a significant setback for that particular family of schemes, is crucial to contextualize: the attack was highly specific to the SIDH/SIKE family and does not, fortunately, generalize to other, distinct isogeny-based constructions. The field, it seems, is still evolving, with its share of triumphs and humbling defeats.

Symmetric key quantum resistance

In a refreshing display of stability amidst the quantum upheaval, most symmetric-key cryptographic systems, such as the widely adopted Advanced Encryption Standard (AES) and the stream cipher SNOW 3G, are already considered robust against attacks by a quantum computer, provided they utilize sufficiently large key sizes. The primary quantum threat to symmetric ciphers comes from Grover’s algorithm, which can theoretically speed up a brute-force search by a quadratic factor. However, this speed-up can be effectively counteracted by simply doubling the key size. For instance, if a 128-bit key is considered secure against classical attacks, a 256-bit key would offer equivalent security against a quantum adversary employing Grover’s algorithm. It’s almost too easy, isn’t it?
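
The arithmetic behind the key-doubling rule fits in one line: Grover searches a space of 2^k keys in roughly 2^(k/2) steps, halving the effective security level. A sketch (the function name is ours):

```python
def grover_effective_bits(key_bits: int) -> int:
    """Grover's algorithm brute-forces 2^k keys in about 2^(k/2) steps,
    so a k-bit symmetric key offers roughly k/2 bits of quantum security."""
    return key_bits // 2
```

Hence AES-128 retains roughly 64 bits of security against a quantum adversary, while AES-256 retains the full 128, which is why doubling the key size suffices.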

Furthermore, existing key management systems and protocols that fundamentally rely on symmetric-key cryptography, rather than public-key cryptography, inherently possess a degree of security against quantum computer attacks. Prime examples include Kerberos, a network authentication protocol widely used in enterprise environments, and the 3GPP Mobile Network Authentication Structure, which secures mobile communications. Given their widespread deployment and inherent quantum resistance (with appropriate key lengths), some researchers advocate for the expanded use of Kerberos-like symmetric key management as a pragmatic and efficient pathway to achieving post-quantum cryptography today. Sometimes, the old ways are the best ways, or at least, the most readily adaptable.

Security reductions

In the often-rigorous world of cryptography research, one of the most coveted achievements is the ability to formally prove that a cryptographic algorithm’s security is mathematically equivalent to the difficulty of solving a known, hard mathematical problem. These formal proofs, often referred to as “security reductions,” serve as powerful assurances, demonstrating that cracking the encryption algorithm is at least as difficult as solving that underlying hard problem. In essence, the security of a given cryptographic algorithm is “reduced” to the security of a problem that has been extensively studied and is widely believed to be computationally intractable. It’s the closest thing to a guarantee you’ll get in this business.

Cryptographers are, quite naturally, intensely focused on discovering and establishing these security reductions for prospective post-quantum cryptography algorithms. The more robust the reduction, the more confidence one can place in the algorithm’s long-term security. The current state of these crucial reductions for various PQC candidates is outlined below.

Lattice-based cryptography – Ring-LWE Signature

For certain iterations of Ring-LWE (Learning With Errors over Rings), a security reduction has been formally established. This reduction links the security of the cryptographic scheme to the difficulty of solving the shortest-vector problem (SVP) in a lattice, serving as a lower bound on its security. The SVP is a problem of significant computational complexity and is famously known to be NP-hard, meaning that no efficient algorithm for solving it in general is believed to exist.

Specific ring-LWE systems that boast these provable security reductions include a variant of Lyubashevsky’s ring-LWE signatures, meticulously defined in a paper by GĂŒneysu, Lyubashevsky, and Pöppelmann. The GLYPH signature scheme, for instance, is a further refinement and variant of the GĂŒneysu, Lyubashevsky, and Pöppelmann (GLP) signature, incorporating more recent research findings that have emerged since the GLP signature’s initial publication in 2012. Another notable Ring-LWE signature scheme is Ring-TESLA.

Furthermore, a “derandomized variant” of LWE exists, known as Learning with Rounding (LWR). It improves both speed and bandwidth: where the traditional LWE scheme obscures the lower bits of data by adding a small random error sampled from a Gaussian-like distribution, LWR achieves the same obfuscation with a deterministic rounding operation, simplifying implementation without compromising security.
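
The difference between the two variants fits in a few lines: LWE adds a sampled error, LWR rounds deterministically. A toy comparison (tiny illustrative parameters; real moduli are far larger, and the function names are ours):

```python
def lwe_b(a, s, q, e):
    """LWE: hide the inner product <a, s> by adding a small error e."""
    return (sum(x * y for x, y in zip(a, s)) + e) % q

def lwr_b(a, s, q, p):
    """LWR: hide the low bits of <a, s> deterministically by rounding
    from modulus q down to a smaller modulus p (no error sampling)."""
    inner = sum(x * y for x, y in zip(a, s)) % q
    return round(inner * p / q) % p
```

With q = 256, p = 16, a = (3, 5) and s = (7, 2), the inner product 31 rounds to 2 under LWR: the low bits are discarded rather than masked by a sampled error.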

Lattice-based cryptography – NTRU, BLISS

The perceived security of the NTRU encryption scheme and the BLISS signature scheme is widely believed to be intricately related to the difficulty of solving the closest vector problem (CVP) in a lattice. The CVP, like its cousin SVP, is also recognized as an NP-hard problem, underscoring its computational intractability. However, it’s crucial to distinguish this belief from a formal, provable reduction. While the connection is strong, it’s not a direct, provable equivalence in the same way as some Ring-LWE schemes.

Recognizing this distinction, the Post-Quantum Cryptography Study Group, under the aegis of the European Commission, specifically advocated for the study of the Stehle–Steinfeld variant of NTRU. This particular variant, unlike the original NTRU algorithm, does possess a formal security reduction, making it a more theoretically sound choice for long-term cryptographic deployment. It’s a subtle but significant difference, one that cryptographers take very seriously.

Multivariate cryptography – Unbalanced oil and vinegar

Unbalanced Oil and Vinegar (UOV) signature schemes are a fascinating class of asymmetric cryptographic primitives built upon the intricate structures of multivariate polynomials defined over a finite field (denoted as $\mathbb{F}$). The security of these schemes, as with all multivariate cryptography, is tied to the difficulty of solving systems of such equations.

Bulygin, Petzoldt, and Buchmann conducted important research demonstrating a reduction of generic multivariate quadratic UOV systems to the multivariate quadratic equation solving problem. This problem itself is known to be NP-hard, providing a strong theoretical underpinning for the security of these schemes. This means that if you could efficiently break a generic UOV system, you would essentially have found a fast way to solve a problem that is considered one of the hardest in computational complexity theory.
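
The underlying MQ problem is easy to demonstrate: even for a random quadratic system over GF(2), the only generic approach is exhaustive search over all 2^n assignments. A toy system generator and brute-force solver (our own construction, illustrative only):

```python
import itertools
import random

def random_mq_system(n: int, m: int, seed: int = 0):
    """m random quadratic polynomials in n variables over GF(2), each
    stored as (quadratic coeffs, linear coeffs, constant)."""
    rng = random.Random(seed)
    system = []
    for _ in range(m):
        quad = {(i, j): rng.randrange(2) for i in range(n) for j in range(i, n)}
        lin = [rng.randrange(2) for _ in range(n)]
        const = rng.randrange(2)
        system.append((quad, lin, const))
    return system

def evaluate(poly, x):
    """Evaluate one polynomial at the 0/1 point x (arithmetic mod 2)."""
    quad, lin, const = poly
    v = const
    for (i, j), c in quad.items():
        v ^= c & x[i] & x[j]
    for i, c in enumerate(lin):
        v ^= c & x[i]
    return v

def brute_force_solve(system, n):
    """Exhaustive search: 2^n evaluations -- the cost MQ security rests on."""
    for x in itertools.product((0, 1), repeat=n):
        if all(evaluate(p, x) == 0 for p in system):
            return x
    return None
```

For n = 6 this search is instant; at the n ≈ 100–250 used by real multivariate schemes, the same 2^n search is hopeless, which is the whole point.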

Hash-based cryptography – Merkle signature scheme

The enduring appeal of hash-based cryptography, particularly the Merkle signature scheme, lies in its relatively clear security foundations. In 2005, Luis Garcia provided a compelling proof demonstrating a security reduction of Merkle Hash Tree signatures to the security of the underlying cryptographic hash function. Garcia’s paper established that, provided computationally one-way hash functions exist (a widely accepted premise in modern cryptography), then the Merkle Hash Tree signature is provably secure. This is a significant claim, as it means the security of the signature scheme is no weaker than the security of its fundamental building block.

Consequently, if a chosen hash function itself possesses a provable reduction of security to a known hard problem (for instance, certain hash functions might be provably secure if specific assumptions about collision resistance hold), then the Merkle tree signature constructed with that hash function would, by extension, inherit a provable security reduction to that same known hard problem. This layered approach to security proofs is highly desirable in cryptographic design. Unsurprisingly, the Post-Quantum Cryptography Study Group, sponsored by the European Commission, has explicitly recommended the use of the Merkle signature scheme for long-term security protection against the impending threat of quantum computers. When in doubt, go with what’s provably solid.

Code-based cryptography – McEliece

The McEliece Encryption System, a stalwart of code-based cryptography, also benefits from a robust security reduction. Its security is formally reduced to the syndrome decoding problem (SDP). The SDP is a well-studied problem in coding theory, and it is known to be NP-hard, placing it firmly in the category of problems considered intractable for classical computers. This strong mathematical foundation is a key reason for the McEliece cryptosystem’s enduring respect.
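
The syndrome decoding problem can be illustrated with the tiny [7,4] Hamming code: given a syndrome s = He, find a low-weight error vector e. For a small structured code the search is instant; for the large random-looking codes McEliece hides behind its public key, the same search is the NP-hard barrier. A sketch (our own naming):

```python
import itertools

# Parity-check matrix of the [7,4] Hamming code over GF(2)
H = [
    [1, 0, 1, 0, 1, 0, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]

def syndrome(word):
    """Compute s = H * word over GF(2)."""
    return tuple(sum(h * w for h, w in zip(row, word)) % 2 for row in H)

def decode_syndrome(s, max_weight=1):
    """Syndrome decoding by brute force: find an error vector of weight
    <= max_weight whose syndrome matches s.  For large random codes this
    search is infeasible -- the basis of McEliece's security."""
    n = len(H[0])
    for w in range(max_weight + 1):
        for positions in itertools.combinations(range(n), w):
            e = [1 if i in positions else 0 for i in range(n)]
            if syndrome(e) == s:
                return e
    return None
```

The legitimate receiver knows the hidden Goppa structure and decodes efficiently; an attacker sees only what looks like a random code and is stuck with this kind of search.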

Given this solid theoretical backing, it is hardly surprising that the Post-Quantum Cryptography Study Group, operating under the auspices of the European Commission, has specifically recommended the use of this cryptography for long-term protection against the potentially devastating attacks by a quantum computer. It’s one of the few algorithms that has truly stood the test of time, a rare feat in the fast-evolving world of encryption.

Code-based cryptography – RLCE

In 2016, Wang introduced a novel random linear code encryption scheme, dubbed RLCE (Random Linear Code Encryption). This scheme is conceptually rooted in the robust principles of McEliece schemes, building upon their established foundations. A key feature of the RLCE scheme is its versatility: it can be constructed using virtually any standard linear code, such as a Reed-Solomon code, by strategically inserting random columns into the underlying linear code generator matrix. This randomization is crucial for its security, adding a layer of complexity that makes it difficult for an adversary to exploit the underlying code structure. The ability to leverage well-understood linear codes while introducing randomness for security makes RLCE an intriguing candidate in the post-quantum landscape.
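
The column-insertion idea can be sketched in a few lines over GF(2). This is a simplified illustration only (the actual RLCE key generation additionally scrambles the inserted columns with secret invertible matrices), and the function name is ours:

```python
import random

def insert_random_columns(G, num_random, seed=0):
    """RLCE-style keygen step: interleave random columns into the
    generator matrix G (over GF(2)) at random positions.  Toy sketch."""
    rng = random.Random(seed)
    k = len(G)                     # code dimension (rows)
    n = len(G[0])                  # code length (columns)
    positions = sorted(rng.sample(range(n + num_random), num_random))
    cols = [[row[j] for row in G] for j in range(n)]   # column-major view
    orig = iter(cols)
    out_cols = []
    for j in range(n + num_random):
        if j in positions:
            out_cols.append([rng.randrange(2) for _ in range(k)])
        else:
            out_cols.append(next(orig))
    return [[col[i] for col in out_cols] for i in range(k)]  # back to rows
```

The published matrix is wider than the original, and an attacker cannot tell which columns carry the Reed-Solomon structure and which are noise.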

Supersingular elliptic curve isogeny cryptography

The security of cryptographic systems built upon supersingular elliptic curve isogenies is intricately linked to the computational difficulty of constructing an isogeny (a special type of mapping) between two supersingular curves that share the same number of points. This problem is considered computationally challenging, forming the basis of their security.

The most recent published investigations into the difficulty of this problem, notably by Delfs and Galbraith, have generally corroborated the claims made by the inventors of the isogeny-based key exchange schemes, indicating that the problem is indeed as hard as initially suggested. However, and this is a critical distinction, there is currently no formal security reduction for these schemes to a known NP-hard problem. While the problem is empirically difficult and has withstood analysis, the lack of a formal reduction means its security relies more on the current absence of known attacks rather than a proven equivalence to a universally accepted hard problem. This doesn’t necessarily make it insecure, but it means the cryptographic community’s confidence is built on a different kind of evidence.

Comparison

One rather unavoidable characteristic shared by many, though certainly not all, post-quantum cryptography algorithms is their proclivity for requiring significantly larger key sizes compared to the “pre-quantum” public key algorithms we currently employ. It’s a trade-off, as most things in life are. There’s a delicate balance, an almost artistic tension, between the size of the cryptographic keys, the computational efficiency required for encryption and decryption, and the resulting size of the ciphertext or digital signature. You can’t have everything.

The table below provides a glimpse into some of these values for various schemes, all calibrated to achieve a respectable 128-bit post-quantum security level.

Algorithm                                  Type            Public key   Private key   Signature
ML-DSA                                     Lattice         1,312 B      2,560 B       2,420 B
NTRU Encrypt                               Lattice         766.25 B     842.875 B     –
Streamlined NTRU Prime [citation needed]   Lattice         154 B        –             –
Rainbow                                    Multivariate    124 kB       95 kB         –
SPHINCS                                    Hash Signature  1 kB         1 kB          41 kB
SPHINCS+                                   Hash Signature  32 B         64 B          8 kB
BLISS-II                                   Lattice         7 kB         2 kB          5 kB
GLP-Variant GLYPH Signature                Ring-LWE        2 kB         0.4 kB        1.8 kB
NewHope                                    Ring-LWE        2 kB         2 kB          –
Goppa-based McEliece                       Code-based      1 MB         11.5 kB       –
Random Linear Code based encryption        RLCE            115 kB       3 kB          –
Quasi-cyclic MDPC-based McEliece           Code-based      1,232 B      2,464 B       –
SIDH                                       Isogeny         564 B        48 B          –
SIDH (compressed keys)                     Isogeny         330 B        48 B          –
3072-bit Discrete Log                      not PQC         384 B        32 B          96 B
256-bit Elliptic Curve                     not PQC         32 B         32 B          65 B

A rather practical consideration, often overlooked in the theoretical debates, is the sheer effort involved in transmitting these public keys across the internet. In this regard, algorithms like Ring-LWE, NTRU, and SIDH offer a distinct advantage, with key sizes that conveniently remain under 1 kilobyte (kB). Hash-signature public keys are also reasonably compact, typically falling under 5 kB, while MDPC-based McEliece keys hover around 1 kB. On the other end of the spectrum, Rainbow schemes demand a heftier transmission, requiring approximately 125 kB, and the traditional Goppa-based McEliece system, while robust, necessitates a public key that approaches a full megabyte (MB) in size. This isn’t just a matter of aesthetics; larger keys can impact network latency, bandwidth consumption, and the efficiency of cryptographic operations, especially in resource-constrained environments.

Lattice-based cryptography – LWE key exchange and Ring-LWE key exchange

The foundational concept of employing Learning With Errors (LWE) and Ring-LWE for robust key exchange protocols was initially proposed and formally filed at the University of Cincinnati in 2011 by Jintai Ding. The core brilliance of this approach stems from a rather elegant observation: the associativity of matrix multiplication. This mathematical property allows two parties to independently construct a shared secret key without ever explicitly exchanging it, while the judicious introduction of “errors” serves as the ingenious mechanism for ensuring the scheme’s security against eavesdroppers. Ding’s seminal paper outlining this idea subsequently appeared in 2012, following the filing of a provisional patent application earlier that year.
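
The associativity observation is worth seeing in miniature: Alice publishes s_AᔀA + e_Aᔀ, Bob publishes A·s_B + e_B, and each side's private product agrees with the other's up to a small additive error, from which matching key bits can then be extracted. A toy demonstration (insecure parameters, our own naming):

```python
import random

def approx_shared_key(n=8, q=3329, err=1, seed=0):
    """Toy version of the associativity trick behind LWE key exchange:
    Alice computes s_A^T (A s_B + e_B), Bob computes (s_A^T A + e_A^T) s_B;
    both equal s_A^T A s_B up to a small additive error."""
    rng = random.Random(seed)
    A = [[rng.randrange(q) for _ in range(n)] for _ in range(n)]
    sA = [rng.randrange(-2, 3) for _ in range(n)]       # small secret
    sB = [rng.randrange(-2, 3) for _ in range(n)]       # small secret
    eA = [rng.randrange(-err, err + 1) for _ in range(n)]
    eB = [rng.randrange(-err, err + 1) for _ in range(n)]
    # public values exchanged over the wire
    pA = [(sum(sA[i] * A[i][j] for i in range(n)) + eA[j]) % q for j in range(n)]
    pB = [(sum(A[i][j] * sB[j] for j in range(n)) + eB[i]) % q for i in range(n)]
    kA = sum(sA[i] * pB[i] for i in range(n)) % q       # Alice's view
    kB = sum(pA[j] * sB[j] for j in range(n)) % q       # Bob's view
    return kA, kB
```

The two values differ only by s_Aᔀe_B − e_Aᔀs_B, which stays small when secrets and errors are small; Ding's scheme (and Peikert's 1-bit rounding signal, discussed next) reconciles the two close values into identical key bits.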

Building upon this pioneering work, Chris Peikert presented a key transport scheme in 2014 that adopted the same fundamental principles elucidated by Ding. Peikert’s contribution further refined the concept by introducing the novel idea of sending an additional 1-bit signal specifically for rounding within Ding’s original construction, optimizing its practical application. For security levels slightly exceeding 128 bits of security, Vikram Singh later published a set of parameters for Peikert’s scheme, which notably featured public keys of 6956 bits. The corresponding private key, as one might expect, would be roughly twice that size, clocking in at approximately 14,000 bits.

Further advancements in this area were showcased at Eurocrypt 2015, where an authenticated key exchange protocol, also based on Ding’s foundational ideas, was unveiled. This protocol offered the highly desirable property of provable forward secrecy, a critical feature for long-term security. This particular construction was presented as an extension of the HMQV (HMQV: A High-Performance Secure Diffie–Hellman Protocol) construction, which had itself been introduced at Crypto 2005. The paper meticulously detailed various parameters for different security levels, ranging from a baseline of 80 bits all the way up to a robust 350 bits, along with their corresponding key sizes, providing a comprehensive guide for implementation.

Lattice-based cryptography – NTRU encryption

For those seeking to achieve a solid 128 bits of security with the NTRU encryption scheme, Hirschhorn, Hoffstein, Howgrave-Graham, and Whyte, a collective of prominent researchers, have provided specific recommendations regarding parameter choices. They suggest employing a public key that is represented as a polynomial of degree 613, with coefficients operating modulo $2^{10}$. This configuration results in a public key of approximately 6130 bits in length. The corresponding private key, which is essential for decryption, would consequently be around 6743 bits. While not the smallest key size in the post-quantum landscape, it represents a well-studied and empirically robust option.
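
The quoted key size is simple arithmetic: the text's figure of about 6130 bits corresponds to 613 coefficients at 10 bits each.

```python
# Sizes quoted above: 613 coefficients, each reduced modulo 2^10
coefficients = 613
bits_per_coefficient = 10              # coefficients mod 2^10
public_key_bits = coefficients * bits_per_coefficient
assert public_key_bits == 6130         # "approximately 6130 bits"
```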

Multivariate cryptography – Rainbow signature

When aiming for 128 bits of security with the Rainbow multivariate quadratic equation signature scheme, particularly when prioritizing the smallest possible signature size, Petzoldt, Bulygin, and Buchmann offer a specific set of parameters. Their recommendation involves using equations defined over the finite field GF(31). This choice leads to a public key size of just over 991,000 bits (approximately 124 kB), a private key of just over 740,000 bits (approximately 92 kB), and digital signatures that are a concise 424 bits in length. The trade-off here is evident: while the signatures themselves are compact, the public and private key sizes are considerably larger than those found in many other post-quantum schemes, posing practical challenges for storage and transmission.

Hash-based cryptography – Merkle signature scheme

To achieve a robust 128 bits of security for hash-based signatures, particularly when the requirement is to sign a substantial number of messages—say, 1 million—the fractal Merkle tree method, as devised by Naor, Shenhav, and Wool, presents a viable solution. Under these specific conditions, the public and private key sizes for such a scheme would each be roughly 36,000 bits in length. This approach cleverly manages the expansion inherent in hash-based signatures, allowing for a higher volume of messages to be signed while maintaining a manageable key footprint, albeit one still larger than traditional public-key cryptosystems.
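The core mechanics can be made concrete with a toy Merkle tree: one root serves as the entire public key, and each signature carries an authentication path of sibling hashes. This sketch uses plain SHA-256 and is an illustration of the general Merkle construction, not the fractal traversal scheme of Naor, Shenhav, and Wool itself:

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Hash the leaves pairwise, level by level, until one root remains."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def auth_path(leaves, index):
    """Collect the sibling hash at each level needed to rebuild the root."""
    level = [h(leaf) for leaf in leaves]
    path = []
    while len(level) > 1:
        path.append(level[index ^ 1])  # sibling of the current node
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return path

def verify(leaf, index, path, root):
    """Recompute the root from one leaf and its authentication path."""
    node = h(leaf)
    for sibling in path:
        node = h(node + sibling) if index % 2 == 0 else h(sibling + node)
        index //= 2
    return node == root

leaves = [b"one-time key %d" % i for i in range(8)]  # 2**3 one-time keys
root = merkle_root(leaves)                           # the 256-bit public key
path = auth_path(leaves, 5)
assert verify(leaves[5], 5, path, root)
# Signing 2**20 (~1 million) messages needs a height-20 tree, so each
# signature carries 20 sibling hashes = 20 * 256 = 5120 bits of path.
```

The tree height grows only logarithmically in the number of messages, which is why a million signatures remain tractable.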

Code-based cryptography – McEliece

For those aiming for a stringent 128 bits of security with a McEliece scheme, the European Commission’s Post-Quantum Cryptography Study Group has put forth detailed recommendations. They advocate for the use of a binary Goppa code with a length of at least $n = 6960$ and a dimension of at least $k = 5413$. Furthermore, this code must be capable of correcting $t = 119$ errors. With these precise parameters, the public key for the McEliece system will manifest as a systematic generator matrix. The non-identity portion of this matrix alone will consume $k \times (n - k) = 5413 \times (6960 - 5413) = 5413 \times 1547 = 8,373,911$ bits, which translates to approximately 1 MB. The corresponding private key, a more compact affair, consists of the code support (with $n = 6960$ elements drawn from GF($2^{13}$)) and a generator polynomial (with $t = 119$ coefficients from GF($2^{13}$)), summing up to a length of 92,027 bits. It’s a hefty key, but it offers a proven level of security.

The study group is also actively investigating the potential of Quasi-cyclic MDPC (Moderate Density Parity-Check) codes. For a comparable security level, they consider codes with a length of at least $n = 2^{16} + 6 = 65542$ and a dimension of at least $k = 2^{15} + 3 = 32771$, capable of correcting $t = 264$ errors. In this configuration, the public key for the McEliece system is represented by the first row of a systematic generator matrix, whose non-identity part requires only $k = 32771$ bits. The private key, a quasi-cyclic parity-check matrix with $d = 274$ nonzero entries per column (or twice that per row), can be compactly represented by the coordinates of the nonzero entries on its first row, requiring no more than $d \times 16 = 4384$ bits. This offers a significant reduction in key size compared to the traditional Goppa code approach.
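The MDPC size arithmetic is equally direct; this sketch just restates the figures above in code (the 16-bit-per-coordinate encoding is the one quoted in the text):

```python
# Key-size arithmetic for the quasi-cyclic MDPC parameters quoted above.
n = 2**16 + 6   # code length: 65542
k = 2**15 + 3   # code dimension: 32771
d = 274         # nonzero entries per column of the parity-check matrix

public_key_bits = k        # first row of the systematic generator matrix
private_key_bits = d * 16  # 16-bit coordinates of the first row's nonzeros
print(public_key_bits, private_key_bits)  # 32771 4384
```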

Separately, Barreto et al. have recommended an alternative set of parameters for a binary Goppa code within a McEliece scheme, aiming for similar security. Their suggestion involves a code of length at least $n = 3307$ and a dimension of at least $k = 2515$, capable of correcting $t = 66$ errors. With these parameters, the public key’s non-identity part of the systematic generator matrix would require $k \times (n - k) = 2515 \times (3307 - 2515) = 2515 \times 792 = 1,991,880$ bits (approximately 249 kB). The corresponding private key, comprising the code support ($n = 3307$ elements from GF($2^{12}$)) and a generator polynomial ($t = 66$ coefficients from GF($2^{12}$)), would be 40,476 bits in length. Clearly, there are many paths to quantum resistance, each with its own set of trade-offs.
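Both binary Goppa parameter sets reduce to the same arithmetic: public key $k \times (n - k)$ bits, private key $(n + t) \times m$ bits over GF($2^m$). A small helper (illustrative only) reproduces the figures quoted above:

```python
def goppa_mceliece_sizes(n, k, t, m):
    """Key sizes in bits for a binary Goppa McEliece instance, following
    the arithmetic in the text: public key = non-identity part of a
    systematic generator matrix; private key = code support (n elements
    of GF(2**m)) plus a generator polynomial (t coefficients)."""
    public_bits = k * (n - k)
    private_bits = n * m + t * m
    return public_bits, private_bits

# Study Group parameters: (n, k, t) = (6960, 5413, 119) over GF(2**13)
print(goppa_mceliece_sizes(6960, 5413, 119, 13))  # (8373911, 92027)
# Barreto et al. parameters: (3307, 2515, 66) over GF(2**12)
print(goppa_mceliece_sizes(3307, 2515, 66, 12))   # (1991880, 40476)
```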

Supersingular elliptic curve isogeny cryptography

For those venturing into the realm of supersingular isogeny Diffie–Hellman (SIDH) to achieve 128 bits of security, De Feo, Jao, and Plut provided early guidance. They recommended utilizing a supersingular curve modulo a 768-bit prime number. If one employs standard elliptic curve point compression, the resulting public key would need to be no more than $8 \times 768 = 6144$ bits in length. This was considered a respectable size at the time, particularly given the novelty of the approach.

However, the field is dynamic, and progress is relentless. A March 2016 paper by authors Azarderakhsh, Jao, Kalach, Koziel, and Leonardi demonstrated an ingenious method to halve the number of bits required for transmission. This advancement was further refined by Costello, Jao, Longa, Naehrig, Renes, and Urbanik, leading to a highly optimized, compressed-key version of the SIDH protocol. This improved version boasted public keys that were a mere 2640 bits in size. This significant reduction in key size brings the number of bits transmitted for SIDH roughly into parity with the key sizes of non-quantum secure algorithms like RSA and Diffie–Hellman at equivalent classical security levels, making it a highly attractive option for practical deployment.

Symmetric-key–based cryptography

As a general, and rather refreshing, rule of thumb, to achieve a robust 128 bits of security within a symmetric-key–based system, one can confidently employ key sizes of 256 bits. The reasoning is straightforward: the most potent quantum attack against an arbitrary symmetric-key system is an application of Grover’s algorithm. This algorithm, while powerful, requires computational work proportional to the square root of the size of the key space. Therefore, doubling the key size effectively squares the search space for a quantum computer, making a 256-bit key quantum-resistant to the same degree that a 128-bit key is classically secure.
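The square-root relationship can be checked directly: Grover searching a $2^{256}$ key space costs roughly $\sqrt{2^{256}} = 2^{128}$ evaluations, the same work factor as classical brute force on a 128-bit key.

```python
from math import isqrt

def quantum_search_cost(key_bits):
    """Grover's algorithm searches an unstructured space of size 2**k
    in about sqrt(2**k) = 2**(k/2) oracle evaluations."""
    return isqrt(2 ** key_bits)

classical_128 = 2 ** 128                # brute force on a 128-bit key
quantum_256 = quantum_search_cost(256)  # Grover against a 256-bit key
assert quantum_256 == classical_128     # identical work factors
```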

Furthermore, the act of transmitting an encrypted key to a device that already possesses the necessary symmetric key for decryption also typically requires a modest 256 bits. It becomes unequivocally clear that, among all the approaches to post-quantum cryptography, symmetric-key systems consistently offer the smallest and most efficient key sizes. Sometimes the simplest solutions are the most elegant, and the most robust.

Forward secrecy

A public-key system truly demonstrates the highly desirable property referred to as perfect forward secrecy when it consistently generates ephemeral, random public keys for each individual session, solely for the purpose of establishing a shared key agreement. The profound implication of this design choice is that the compromise of a single message, or even an entire session key, cannot possibly lead to the compromise of other, unrelated messages or past communications. More critically, it means that there is no single, long-term secret value—such as a static private key—whose compromise could unilaterally expose multiple past or future messages.

Security experts, with good reason, strongly advocate for the widespread adoption and use of cryptographic algorithms that inherently support forward secrecy over those that do not. The rationale is compelling: forward secrecy acts as a crucial protective barrier against the catastrophic consequences of a long-term private key compromise. Should an adversary, whether a nation-state actor or a sophisticated criminal organization, manage to obtain a long-term private key associated with a public/private key pair (a “master key,” if you will), forward secrecy ensures that all past communications secured by ephemeral session keys remain impenetrable. This property is widely regarded as an indispensable tool in the ongoing battle against pervasive mass surveillance by intelligence agencies, as it frustrates attempts to retroactively decrypt vast archives of intercepted data.

In the realm of post-quantum cryptography, both the Ring-LWE key exchange and the supersingular isogeny Diffie–Hellman (SIDH) key exchange are commendably designed to support forward secrecy within a single exchange with the other communicating party. This means they can establish a new, unique, and ephemeral shared secret for each communication session, thereby ensuring that past and future sessions remain secure even if the current session’s keys are compromised. Importantly, both Ring-LWE and SIDH are also flexible enough to be deployed without forward secrecy, typically by creating a variant akin to the classic ElGamal encryption adaptation of Diffie–Hellman, though this is generally not the recommended mode of operation for maximum security.
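The ephemeral-key pattern underlying forward secrecy can be sketched with a toy, deliberately classical Diffie–Hellman (the prime below is a toy Mersenne prime, not a secure or post-quantum group); the point is the key lifecycle, not the hard problem:

```python
import secrets

P = 2**127 - 1  # toy Mersenne prime; illustration only, NOT secure
G = 3

def new_session():
    """One ephemeral key exchange; the secret exponents exist only in
    this scope and are discarded when it returns, so no long-term value
    survives whose later compromise could expose this session."""
    a = secrets.randbelow(P - 2) + 2   # Alice's ephemeral secret
    b = secrets.randbelow(P - 2) + 2   # Bob's ephemeral secret
    A, B = pow(G, a, P), pow(G, b, P)  # the only values exchanged
    k_alice = pow(B, a, P)
    k_bob = pow(A, b, P)
    assert k_alice == k_bob            # both sides agree on the secret
    return k_alice

# Independent sessions yield independent secrets: compromising one
# reveals nothing about another, which is forward secrecy in miniature.
assert new_session() != new_session()
```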

Conversely, other algorithms discussed in this article, such as NTRU, do not inherently support forward secrecy in their standard configurations. While they can still provide robust encryption, their reliance on static key pairs means that a compromise of the long-term private key would indeed jeopardize all communications secured with that key. It’s a fundamental architectural difference. It is important to note, however, that any authenticated public key encryption system can, in principle, be leveraged and extended to construct a key exchange protocol that does offer forward secrecy, demonstrating the modularity and adaptability of cryptographic design.

Open Quantum Safe project

The Open Quantum Safe (OQS) project, a rather ambitious undertaking, commenced in late 2016 with the explicit goal of developing and prototyping quantum-resistant cryptography. It’s a proactive measure, a digital shield being forged before the arrows start flying. The project’s central aim is to consolidate and integrate the most promising current post-quantum schemes into a single, cohesive library, aptly named liboqs.

liboqs itself is an open-source C library specifically designed for quantum-resistant cryptographic algorithms. Initially, its focus was primarily on key exchange algorithms, recognizing the immediate need for quantum-secure session establishment. However, its scope has since expanded to include several critical signature schemes, broadening its utility. It provides a standardized and common application programming interface (API) tailored for post-quantum key exchange algorithms, serving as a central repository for various implementations and fostering interoperability. Beyond merely housing these implementations, liboqs also incorporates a comprehensive test harness and benchmarking routines, allowing researchers and developers to rigorously compare the performance characteristics (speed, memory footprint, etc.) of different post-quantum algorithm implementations. Furthermore, the OQS project extends its reach by providing seamless integration of liboqs into OpenSSL, the widely used toolkit for TLS and other cryptographic protocols, thereby facilitating the adoption of quantum-resistant capabilities in existing infrastructure.

As of March 2023, the liboqs library supported a range of key exchange algorithms, demonstrating the breadth of active research and development in the field.

In a landmark announcement in August 2024, NIST officially published three algorithms as FIPS standards, marking them as recommended for federal use. A fourth is anticipated by the end of the year, further solidifying the transition to a quantum-resistant future.

Algorithm                      Type
BIKE                           Codes
Classic McEliece               Goppa codes
FIPS-203: CRYSTALS-Kyber       ML-KEM: Module Learning With Errors
FIPS-204: CRYSTALS-Dilithium   ML-DSA: Module Short Integer Solution
FIPS-205: SPHINCS+             SLH-DSA: hash-based
FIPS-206: Falcon               FN-DSA: Short Integer Solution
Frodo                          Learning with errors
HQC                            Codes
NTRU                           Lattice-based cryptography

It’s a dynamic field, and the progression of the NIST Post-Quantum Cryptography Standardization Project means that some algorithms, once supported, have since been removed from the active roster. These include:

Algorithm   Type
BCNS15      Ring learning with errors key exchange
McBits      Error-correcting codes
NewHope     Ring learning with errors key exchange
SIDH        Supersingular isogeny key exchange

Such is the nature of cutting-edge research; what is promising today may be superseded or, unfortunately, broken tomorrow.

Implementation

The theoretical development of quantum-safe algorithms is, of course, a critical first step. However, the true challenge lies in the arduous task of implementing these potentially quantum-secure algorithms into the sprawling, complex, and often archaic existing digital systems. It’s a bit like trying to upgrade a jet engine mid-flight.

Concrete progress is being made on this front. For instance, Microsoft Research has conducted extensive tests, successfully implementing the PICNIC signature scheme within a Public Key Infrastructure (PKI) that leverages Hardware Security Modules (HSMs) for secure key storage and cryptographic operations. Similarly, various HSM vendors have undertaken test implementations for Google’s NewHope algorithm, demonstrating the feasibility of integrating these new cryptographic primitives into specialized hardware designed for high-security applications. In a notable development in August 2023, Google released a FIDO2 security key implementation that employs an ECC/Dilithium hybrid signature schema. This groundbreaking work was performed in partnership with ETH ZĂŒrich, showcasing practical, multi-layered post-quantum security in a widely used authentication standard.

Messaging applications, a critical component of modern communication, are also rapidly adapting. The widely respected Signal Protocol has already incorporated Post-Quantum Extended Diffie–Hellman (PQXDH), providing an additional layer of quantum resistance to its renowned end-to-end encryption.

In a significant announcement on February 21, 2024, Apple Inc. revealed its plans to upgrade its proprietary iMessage protocol with a new, advanced PQC protocol dubbed “PQ3.” This new protocol will feature “ongoing keying,” a mechanism designed to enhance long-term security. Apple explicitly stated that, while fully capable quantum computers are not yet a reality, they are proactively mitigating the risks posed by future quantum machines, as well as the insidious “harvest now, decrypt later” attack scenarios. Apple confidently asserted that their PQ3 implementation provides protections that “surpass those in all other widely deployed messaging apps,” primarily due to its innovative use of ongoing keying, which periodically refreshes session keys to limit the impact of any potential compromise. The company intends to replace the existing iMessage protocol with PQ3 across all supported conversations by the end of 2024. To aid in comparing the security properties of messaging apps, Apple also introduced a security level scale, ranging from 0 to 3:

  • Level 0: no end-to-end encryption by default.
  • Level 1: pre-quantum end-to-end encryption by default.
  • Level 2: PQC key establishment only (e.g., PQXDH).
  • Level 3: the pinnacle, with PQC key establishment and ongoing rekeying, as implemented in PQ3.

The Internet Engineering Task Force (IETF) is also actively preparing, having drafted an Internet-Draft for integrating PQC algorithms into the Messaging Layer Security (MLS) protocol. MLS is slated for use in Rich Communication Services (RCS) text messaging, which will be adopted by both Google Messages and Messages (Apple), indicating a broad industry move towards quantum-resistant messaging.

Other notable implementations of post-quantum cryptography include:

  • bouncycastle: A widely used collection of cryptographic APIs for Java and C#.
  • liboqs: The aforementioned open-source C library for quantum-resistant cryptographic algorithms, serving as a foundational resource for many implementations.

Hybrid encryption

Ah, hybrid encryption. Because trusting one new, unproven thing is risky enough, so why not add another layer of complexity?

Google has consistently championed and maintained the use of “hybrid encryption” in its deployment of post-quantum cryptography. The philosophy behind this approach is eminently pragmatic: whenever a relatively novel post-quantum scheme is introduced and utilized, it is always combined with a more established, traditionally proven, non-PQ cryptographic scheme. This dual-layer strategy is designed to ensure that the data remains secure even if the newer, relatively untested PQC algorithm turns out to be vulnerable to unforeseen non-quantum attacks before Y2Q (the day quantum computers become a threat). It’s a belt-and-suspenders approach for a future that is, admittedly, still somewhat uncertain.
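A minimal sketch of the combiner idea: derive the session key from both a classical shared secret (e.g., an ECDH output) and a post-quantum one (e.g., a KEM encapsulation), so the result stays secret as long as either component does. The `combine` function, its HMAC label, and the random placeholder inputs are inventions for illustration, not any deployed construction:

```python
import hashlib
import hmac
import secrets

def combine(classical_ss: bytes, pq_ss: bytes) -> bytes:
    """HKDF-extract-style combiner: concatenate both shared secrets and
    run them through HMAC-SHA256 under a fixed label. An attacker must
    break BOTH components to recover the derived session key."""
    return hmac.new(b"hybrid-kdf", classical_ss + pq_ss,
                    hashlib.sha256).digest()

classical_ss = secrets.token_bytes(32)  # stand-in for an X25519 output
pq_ss = secrets.token_bytes(32)         # stand-in for a KEM shared secret
session_key = combine(classical_ss, pq_ss)
assert len(session_key) == 32
```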

This type of hybrid scheme was prominently featured in Google’s pioneering tests for post-quantum TLS in both 2016 and 2019, demonstrating a commitment to layered security. It was also adopted in their 2023 FIDO2 key implementation. The wisdom of this approach was starkly illustrated when one of the algorithms used in the 2019 test, SIKE, was spectacularly broken in 2022. Crucially, because of the hybrid design, the non-PQ X25519 layer (an elliptic curve Diffie–Hellman variant already widely used in TLS) still protected the data, preventing any actual compromise. This served as a powerful validation of the hybrid strategy. Similarly, Apple’s PQ3 protocol for iMessage and Signal’s Protocol with PQXDH are also hybrid constructions, reflecting a growing consensus on this cautious approach.

However, not everyone is in agreement. The NSA and GCHQ, for instance, have voiced arguments against hybrid encryption, primarily contending that it introduces unnecessary complexity into both the implementation and the transition process. Daniel J. Bernstein, a prominent cryptographer and staunch advocate for hybrid encryption, has rather bluntly dismissed these claims as “bogus,” arguing that the security benefits far outweigh any perceived increase in complexity. It’s a debate that highlights the differing priorities between intelligence agencies focused on operational agility and cryptographers prioritizing robust, future-proof security.

See also