Charles Bennett and Gilles Brassard won the Turing Award for their foundational work on quantum information science — specifically, for the BB84 quantum key distribution protocol they published in 1984. That's 40 years from paper to Turing Award, which tells you something about how long it takes for theoretical work in quantum computing to become relevant enough for the broader CS community to recognize it.
Their recognition comes at an interesting moment. Quantum computers capable of breaking RSA encryption don't exist yet — and might not for another decade — but the cryptographic community is already in full migration mode. NIST finalized its first post-quantum cryptography standards in 2024, major browsers are testing post-quantum key exchange, and Signal has already deployed it in production. The gap between 'quantum computers will break encryption someday' and 'we need to change our systems now' has closed.
What Quantum Computers Actually Threaten
Most of the popular coverage of quantum computing and cryptography is either terrifyingly alarmist ('all encryption is broken!') or dismissively skeptical ('it'll never work'). The reality is more specific and more interesting.
Quantum computers threaten asymmetric cryptography — the systems based on the mathematical difficulty of factoring large numbers (RSA) or computing discrete logarithms (Diffie-Hellman, ECC). Shor's algorithm, running on a sufficiently large quantum computer, can solve these problems in polynomial time. That means RSA-2048, which would take classical computers billions of years to break, could theoretically be cracked by a quantum computer in hours.
Quantum computers are much less threatening to symmetric cryptography. Grover's algorithm gives a quadratic speedup for brute-force search, which halves the effective security level in bits. AES-256 becomes equivalent to AES-128 against a quantum attacker — still impractical to brute-force. AES-128 drops to the equivalent of 64-bit security, which is concerning but not catastrophic.
What's threatened by quantum computers:
BROKEN (by Shor's algorithm):
├── RSA (all key sizes)
├── Diffie-Hellman key exchange
├── Elliptic Curve Cryptography (ECDSA, ECDH)
└── DSA
WEAKENED (by Grover's algorithm):
├── AES-128 → effectively 64-bit security (upgrade to AES-256)
├── AES-256 → effectively 128-bit security (still secure)
└── SHA-256 → effectively 128-bit preimage resistance (still secure)
NOT AFFECTED:
├── One-time pads
├── Hash-based signatures (SPHINCS+)
└── Symmetric encryption with sufficiently large keys
The practical implication: anything that uses public-key cryptography — TLS handshakes, SSH connections, code signing, cryptocurrency, digital signatures — will need to migrate to quantum-resistant algorithms. Symmetric encryption mostly just needs larger keys.
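The arithmetic behind the table above is simple enough to sketch in a few lines. This is a back-of-the-envelope model, not a security analysis: Grover halves the effective bit-security of symmetric keys and hash preimages, while Shor reduces the security of factoring- and discrete-log-based schemes to effectively zero.

```python
def quantum_security_bits(algorithm_class: str, classical_bits: int) -> int:
    """Rough effective security (in bits) against a quantum attacker."""
    if algorithm_class == "symmetric":   # AES keys, hash preimage resistance
        return classical_bits // 2       # Grover: quadratic speedup
    if algorithm_class == "asymmetric":  # RSA, Diffie-Hellman, ECC
        return 0                         # Shor: broken in polynomial time
    raise ValueError(f"unknown algorithm class: {algorithm_class}")

print(quantum_security_bits("symmetric", 256))    # AES-256 -> 128
print(quantum_security_bits("symmetric", 128))    # AES-128 -> 64
print(quantum_security_bits("asymmetric", 2048))  # RSA-2048 -> 0
```

This is why the advice for symmetric crypto is simply "use bigger keys," while asymmetric crypto needs entirely new algorithms.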
The 'Harvest Now, Decrypt Later' Problem
This is the reason migration is urgent even though quantum computers can't break anything yet. Adversaries — nation-states, primarily — are almost certainly recording encrypted traffic now with the intent to decrypt it once quantum computers become available.
Think about data that needs to remain confidential for 20+ years: diplomatic communications, intelligence reports, trade secrets, medical records. If that data is encrypted with RSA or ECDH today, and a capable quantum computer arrives in 15 years, the encryption retroactively fails. The data was always vulnerable — you just didn't know it yet.
This isn't speculative threat modeling. NSA guidance has explicitly recommended transitioning to quantum-resistant algorithms for classified systems. The assumption in the intelligence community is that state actors are already stockpiling encrypted traffic. If your data has a long secrecy requirement, the time to migrate was yesterday.
Post-Quantum Cryptography: What NIST Chose
NIST ran a multi-year competition to standardize post-quantum cryptographic algorithms — similar to how AES was selected. After evaluating dozens of candidates, they standardized three primary algorithms in 2024 (FIPS 203, 204, and 205):
- ML-KEM (Kyber) — A key encapsulation mechanism for key exchange. Based on the Module Learning With Errors (MLWE) problem from lattice cryptography. This replaces Diffie-Hellman and ECDH in TLS handshakes and similar protocols. It's fast, produces relatively small keys, and is the primary recommendation for general-purpose key exchange.
- ML-DSA (Dilithium) — A digital signature algorithm, also based on lattice cryptography. Replaces RSA and ECDSA for signing. Signatures are larger than ECDSA (about 2.5 KB vs 64 bytes), which has implications for certificate chains and protocols that transmit many signatures.
- SLH-DSA (SPHINCS+) — A hash-based digital signature scheme. Based on the security of hash functions rather than lattice problems. Slower and produces larger signatures than ML-DSA, but its security relies on well-understood hash function assumptions rather than newer lattice-based assumptions. It's the conservative fallback.
The lattice-based algorithms (ML-KEM, ML-DSA) are favored for performance reasons, but they're based on mathematical problems that are relatively new compared to the decades of analysis behind RSA and AES. There's a small but nonzero chance that a breakthrough in lattice cryptanalysis could weaken them. SPHINCS+ exists as insurance — its security is based on hash functions we've studied for 30+ years.
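To see why "security based on hash functions" is such a conservative bet, it helps to look at the simplest hash-based scheme: a Lamport one-time signature. This is a toy illustration of the idea SPHINCS+ builds on, not SPHINCS+ itself (which composes many one-time keys into a reusable scheme); the code below is a minimal sketch using SHA-256.

```python
import hashlib
import secrets

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def keygen():
    # Secret key: 256 pairs of random values. Public key: their hashes.
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(256)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def sign(msg: bytes, sk):
    digest = H(msg)
    bits = [(digest[i // 8] >> (i % 8)) & 1 for i in range(256)]
    # Reveal one preimage per message bit. CAUTION: strictly one-time —
    # signing a second message leaks enough secrets to allow forgery.
    return [sk[i][bit] for i, bit in enumerate(bits)]

def verify(msg: bytes, sig, pk) -> bool:
    digest = H(msg)
    bits = [(digest[i // 8] >> (i % 8)) & 1 for i in range(256)]
    return all(H(s) == pk[i][bit] for i, (s, bit) in enumerate(zip(sig, bits)))

sk, pk = keygen()
sig = sign(b"hello", sk)
print(verify(b"hello", sig, pk))     # True
print(verify(b"tampered", sig, pk))  # False
```

Forging a signature here requires finding a SHA-256 preimage — no lattice assumptions anywhere. The cost is size: this toy signature is 256 × 32 = 8 KB, which is roughly why SLH-DSA signatures are so much larger than lattice-based ones.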
What's Already Deployed
Post-quantum cryptography isn't theoretical anymore. It's in production systems you use today.
- Chrome and Firefox use hybrid key exchange (X25519 + ML-KEM-768) for TLS connections. The 'hybrid' part means they combine a classical key exchange with a post-quantum one — if either is broken, the connection is still secure. This adds about 1 KB to the TLS handshake.
- Signal deployed PQXDH, a post-quantum key agreement protocol, for initial key exchange. Every new Signal conversation now has post-quantum forward secrecy.
- Apple iMessage introduced PQ3, using post-quantum key exchange with periodic rekeying. Apple claims this provides 'Level 3' security — the highest level in their framework.
- Cloudflare supports post-quantum key exchange on its CDN. If you're behind Cloudflare, your connections may already be using ML-KEM without you knowing.
- AWS KMS supports hybrid post-quantum TLS for key management operations.
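The "hybrid" construction used in these deployments can be sketched simply: derive the session key from the concatenation of both shared secrets, so an attacker must break both key exchanges. Below is an illustrative sketch using HKDF (RFC 5869) built from Python's stdlib; the stand-in random secrets replace what a real handshake would get from X25519 and ML-KEM.

```python
import hashlib
import hmac
import os

def hkdf_extract(salt: bytes, ikm: bytes) -> bytes:
    return hmac.new(salt, ikm, hashlib.sha256).digest()

def hkdf_expand(prk: bytes, info: bytes, length: int = 32) -> bytes:
    out, block, counter = b"", b"", 1
    while len(out) < length:
        block = hmac.new(prk, block + info + bytes([counter]), hashlib.sha256).digest()
        out += block
        counter += 1
    return out[:length]

def hybrid_key(classical_secret: bytes, pq_secret: bytes) -> bytes:
    # Feed BOTH shared secrets into the KDF: recovering the session key
    # requires breaking both the classical and the post-quantum exchange.
    prk = hkdf_extract(b"hybrid-kex", classical_secret + pq_secret)
    return hkdf_expand(prk, b"session key")

# Stand-in secrets (a real handshake derives these from X25519 and ML-KEM).
classical, pq = os.urandom(32), os.urandom(32)
key = hybrid_key(classical, pq)
assert key != hybrid_key(os.urandom(32), pq)  # classical secret matters
assert key != hybrid_key(classical, os.urandom(32))  # PQ secret matters
```

If lattice cryptanalysis improves, the connection is still as strong as X25519; if a quantum computer arrives, it's still as strong as ML-KEM.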
The Migration Challenge for Developers
If you're building software that uses cryptography (which is almost all software), here's what migration actually looks like in practice.
TLS: Mostly Handled for You
If your application uses TLS through a standard library (OpenSSL, BoringSSL, Go's crypto/tls), post-quantum support is being added at the library level. You'll get it through dependency updates. The main action item is to make sure you're not pinning to old TLS library versions and that your systems handle the slightly larger handshake sizes.
The size increase matters more than you might think. ML-KEM-768 adds about 1,100 bytes to the TLS ClientHello message. Some middleboxes, firewalls, and poorly implemented TLS stacks don't handle ClientHello messages larger than ~512 bytes. Google's experience rolling out post-quantum key exchange found that about 0.5% of connections failed due to middlebox incompatibility. If your users are behind enterprise firewalls, test this.
Digital Signatures: More Disruptive
Post-quantum signatures are significantly larger than classical ones. An ECDSA signature is 64 bytes. An ML-DSA-65 signature is about 3,300 bytes. An SLH-DSA signature can be over 17,000 bytes. This has cascading effects:
- X.509 certificate chains become much larger. A typical chain of 3 certificates with ML-DSA signatures is roughly 10 KB larger than ECDSA. On bandwidth-constrained connections, this matters.
- Blockchain and cryptocurrency systems that rely on compact signatures face scalability challenges. A post-quantum signature takes roughly 50x more space per transaction than an ECDSA one.
- Code signing, package signing, and software update verification need to handle larger signatures without breaking size assumptions in existing tooling.
- Certificate transparency logs, OCSP responses, and CRL distribution all grow in size.
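The certificate-chain overhead is easy to check with the sizes quoted above (64 bytes for ECDSA, ~3,300 bytes for ML-DSA-65, three signatures in a typical chain):

```python
ECDSA_SIG = 64        # bytes, P-256 signature
ML_DSA_65_SIG = 3300  # bytes, approximate ML-DSA-65 signature

chain_length = 3  # leaf + intermediate + root
overhead = chain_length * (ML_DSA_65_SIG - ECDSA_SIG)
print(overhead)  # 9708 bytes -> roughly 10 KB extra per chain
```

And that counts only signatures; ML-DSA public keys embedded in the certificates add further size on top.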
Application-Layer Crypto: Your Problem
If your application implements its own cryptographic protocols — end-to-end encryption, custom key exchange, signed tokens, encrypted storage — you need to actively plan your migration. The general strategy:
- Inventory your cryptographic dependencies. Find every place your code uses RSA, ECDSA, ECDH, or Diffie-Hellman. This includes libraries, key management systems, certificate authorities, and hardware security modules.
- Adopt hybrid schemes first. Combine classical and post-quantum algorithms. If the post-quantum algorithm turns out to have a weakness, you fall back to classical security. If quantum computers arrive, you have post-quantum protection.
- Use established libraries. Don't implement post-quantum algorithms yourself. Use liboqs (Open Quantum Safe), which integrates with OpenSSL and provides tested implementations of ML-KEM, ML-DSA, and SPHINCS+.
- Test performance impact. Post-quantum operations are generally fast (ML-KEM key generation is comparable to ECDH), but signature verification is slower, and key/signature sizes affect bandwidth and storage.
- Plan for crypto agility. Design your protocols so the cryptographic algorithms can be swapped without breaking the protocol. This is hard to retrofit — it's much easier to build in from the start.
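One common pattern for crypto agility is to tag every signed or encrypted blob with an algorithm identifier and dispatch through a registry, so swapping in a post-quantum scheme later is a registry entry, not a wire-format change. The sketch below is hypothetical — the registry, envelope format, and placeholder "signer" (a keyed hash standing in for a real signature scheme) are all illustrative, not any standard's design:

```python
import hashlib

SIGNERS = {}  # algorithm ID byte -> implementation

def register(alg_id: int):
    def wrap(cls):
        SIGNERS[alg_id] = cls()
        return cls
    return wrap

@register(0x01)
class KeyedHashSigner:
    # Placeholder for, say, ECDSA today; an ML-DSA signer would register
    # under a new ID without touching the envelope format below.
    def sign(self, key: bytes, msg: bytes) -> bytes:
        return hashlib.sha256(key + msg).digest()
    def verify(self, key: bytes, msg: bytes, tag: bytes) -> bool:
        return self.sign(key, msg) == tag

def sign_envelope(alg_id: int, key: bytes, msg: bytes) -> bytes:
    # Envelope = 1-byte algorithm ID || tag. Verifiers dispatch on the ID.
    return bytes([alg_id]) + SIGNERS[alg_id].sign(key, msg)

def verify_envelope(key: bytes, msg: bytes, envelope: bytes) -> bool:
    alg_id, tag = envelope[0], envelope[1:]
    return SIGNERS[alg_id].verify(key, msg, tag)

env = sign_envelope(0x01, b"key", b"payload")
print(verify_envelope(b"key", b"payload", env))  # True
```

The point of the pattern: old verifiers can reject unknown IDs gracefully, and a migration becomes "start emitting ID 0x02" rather than a breaking protocol revision.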
What About Quantum Key Distribution?
Bennett and Brassard's BB84 — the work they won the Turing Award for — is quantum key distribution (QKD), which is a different approach entirely. Instead of using mathematical problems that quantum computers can't solve, QKD uses the physical properties of quantum mechanics to distribute encryption keys. Any attempt to eavesdrop on the key exchange disturbs the quantum states and is detectable.
QKD is theoretically beautiful and provably secure based on physics rather than computational assumptions. In practice, it has serious limitations: it requires dedicated fiber optic links (you can't run it over the internet), the maximum distance is a few hundred kilometers without quantum repeaters (which don't exist at scale yet), and it's enormously expensive. China has deployed a QKD network between Beijing and Shanghai, but it relies on trusted relay nodes that somewhat defeat the purpose.
For the foreseeable future, post-quantum cryptography (mathematical algorithms on classical computers) is the practical path. QKD is relevant for high-security government and military links, but it won't replace TLS for your web application.
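The basis-sifting step at the heart of BB84 is easy to simulate classically. The toy sketch below assumes no eavesdropper and no channel noise: Alice sends random bits in random bases, Bob measures in random bases, and the two keep only the positions where their bases happened to match.

```python
import random

random.seed(7)  # deterministic for the example
n = 16

# Alice picks random bits and random bases (0 = rectilinear, 1 = diagonal).
alice_bits  = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.randint(0, 1) for _ in range(n)]
# Bob independently picks a random measurement basis for each qubit.
bob_bases   = [random.randint(0, 1) for _ in range(n)]

# Matching basis -> Bob reads Alice's bit exactly; mismatched basis ->
# quantum mechanics gives him a coin flip, so those positions are useless.
bob_bits = [a if ab == bb else random.randint(0, 1)
            for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Sifting: compare bases publicly (never the bits), keep matching positions.
sifted_alice = [b for b, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
sifted_bob   = [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases) if ab == bb]
assert sifted_alice == sifted_bob  # no Eve, no noise -> shared secret key
print(sifted_alice)
```

What the simulation can't show is the part that makes BB84 remarkable: an eavesdropper measuring in transit would disturb the states and show up as errors when Alice and Bob compare a sample of their sifted bits.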
Timeline: When Does This Actually Matter?
Nobody knows when a cryptographically relevant quantum computer (CRQC) will exist — one large enough to break RSA-2048. Estimates range from 2030 to 'never,' with most experts clustering around 2035-2040. The current largest quantum computers have around 1,000 physical qubits; breaking RSA-2048 is estimated to require millions of error-corrected logical qubits.
But here's the thing: it doesn't matter exactly when. The migration itself takes years. Large organizations need to inventory their cryptographic usage, update libraries, test compatibility, rotate keys and certificates, and update protocols. NIST recommends completing the transition by 2035. Given that enterprise software migrations routinely take 5-10 years, starting now is already arguably late.
The practical advice is boring but correct: update your TLS libraries, plan your signature migration, adopt hybrid schemes where possible, and build crypto agility into new systems. You don't need to panic — but you do need to start. The organizations that will struggle most are the ones that treat post-quantum migration as a future problem until it becomes an emergency.