
The first instinct many people have when they hear about the quantum threat is straightforward: just use longer keys. If a quantum computer can break RSA-2048, why not use RSA-8192 or RSA-16384?
The answer is that Shor's Algorithm does not merely chip away at RSA's security margin -- it collapses the entire mathematical paradigm. Shor's Algorithm factors integers in polynomial time, meaning that doubling the key length does not exponentially increase the difficulty for a quantum attacker. It increases it only polynomially. To maintain security against a quantum computer running Shor's Algorithm, you would need RSA keys so astronomically large that they would be completely impractical -- by some estimates, keys measured in terabits, with encryption and decryption operations that would take minutes or hours on classical hardware. The same applies to ECC: no elliptic curve, regardless of how large, survives the discrete-logarithm variant of Shor's Algorithm.
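The asymmetry is easiest to see with rough arithmetic. The sketch below compares the textbook asymptotic cost of the General Number Field Sieve (the best known classical factoring algorithm) against an assumed ~n³ gate count for Shor's Algorithm; the constants are omitted and the quantum exponent is an illustrative assumption, so only the ratios matter.

```python
import math

def gnfs_cost(n_bits: int) -> float:
    # Classical factoring: GNFS asymptotic L_N[1/3, (64/9)^(1/3)],
    # constants omitted (illustrative only).
    ln_n = n_bits * math.log(2)  # ln(N) for an n-bit modulus
    c = (64 / 9) ** (1 / 3)
    return math.exp(c * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3))

def shor_cost(n_bits: int) -> float:
    # Quantum factoring: roughly O(n^3) gates is a common textbook
    # estimate (assumed exponent; real circuit costs vary).
    return n_bits ** 3

# Doubling the key size barely slows a quantum attacker...
quantum_ratio = shor_cost(4096) / shor_cost(2048)   # exactly 8x
# ...but massively slows a classical one.
classical_ratio = gnfs_cost(4096) / gnfs_cost(2048)

print(f"quantum cost ratio (4096 vs 2048): {quantum_ratio:.0f}x")
print(f"classical cost ratio (4096 vs 2048): {classical_ratio:.2e}x")
```

Doubling the modulus multiplies the quantum attacker's work by a small constant while multiplying the classical attacker's work by many orders of magnitude -- which is exactly why longer keys defend against the latter but not the former.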
This means we cannot incrementally improve our existing algorithms. We need an entirely different class of mathematical problems -- problems that remain hard even when an attacker has access to a quantum computer. This is the fundamental insight behind post-quantum cryptography: the mathematical foundation itself must change.
The cryptographic community recognized this need well before quantum computers became front-page news. Researchers had been developing and analyzing quantum-resistant algorithms for over a decade when, in 2016, NIST issued a formal call for proposals. The message was clear: the world's most critical standards body was signaling that the transition was not optional. It was inevitable, and the groundwork needed to start immediately.
NIST's Post-Quantum Cryptography Standardization Process was deliberately modeled after the open competitions that produced the AES block cipher and the SHA-3 hash function. The philosophy was the same: invite the global cryptographic community to submit their best candidates, then subject them to years of rigorous public analysis.
The response was massive. NIST received 82 initial submissions from research teams around the world. These proposals spanned a diverse range of mathematical approaches: lattice-based schemes, code-based schemes, multivariate polynomial systems, hash-based constructions, isogeny-based schemes, and others. Each submission had to include a complete specification, reference implementations, performance benchmarks, and a security analysis.
The evaluation process was grueling and methodical. NIST organized the competition into multiple rounds, each lasting approximately one to two years. At each stage, submissions were scrutinized across several dimensions:
Security analysis. The global cryptographic community -- including researchers not affiliated with any submission -- attempted to find weaknesses. Attacks were published, security proofs were challenged, and parameter sets were adjusted. Any scheme that showed a meaningful vulnerability was eliminated or required to strengthen its parameters.
Performance characteristics. Key generation time, encryption/decryption speed, signature generation/verification speed, key sizes, and ciphertext sizes were all benchmarked across diverse hardware platforms. A scheme that is mathematically beautiful but takes 10 seconds to generate a key on a web server is not a viable replacement for RSA.
Side-channel resistance. Unlike a textbook mathematical proof, real-world implementations must resist timing attacks, power analysis, and electromagnetic emanation analysis. Schemes whose implementations were inherently difficult to protect against side-channel attacks received additional scrutiny.
Implementation complexity. Algorithms that required highly specialized expertise to implement correctly posed a greater risk of deployment errors. Simplicity and clarity of specification were valued.
Round 1 narrowed the field from 82 submissions (69 of which met the minimum acceptance criteria) to 26 candidates. Round 3 reduced it to seven finalists plus a short list of alternates, and in 2022, NIST announced the initial selections for standardization. One notable casualty along the way was SIKE (Supersingular Isogeny Key Encapsulation), an alternate that advanced to the fourth round before a devastating mathematical attack in 2022 broke it completely -- using only a classical computer. This was a sobering reminder of why the multi-year evaluation process exists: cryptographic assumptions that appear solid can collapse under sustained analysis.
The timeline of the full process stretched from 2016 to 2024, when the first three final standards -- FIPS 203, FIPS 204, and FIPS 205 -- were officially published. An additional signature scheme, FN-DSA (based on FALCON), is expected to follow as FIPS 206. The entire effort represents one of the most consequential standardization exercises in the history of information security.
The algorithms that survived this gauntlet represent the new foundation of cryptographic security. Understanding what they are and how they work is essential for any security leader planning a migration.
ML-KEM (Module-Lattice-Based Key-Encapsulation Mechanism), standardized as FIPS 203, is the primary replacement for key exchange protocols like ECDHE. It is based on the CRYSTALS-Kyber algorithm and is the mechanism you will use to establish shared secrets over untrusted channels -- the same role that Diffie-Hellman and ECDHE play in today's TLS handshakes.
ML-KEM comes in three parameter sets -- ML-KEM-512, ML-KEM-768, and ML-KEM-1024 -- offering increasing levels of security. ML-KEM-768 is the recommended general-purpose choice, offering a security level roughly equivalent to AES-192.
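The size trade-offs between the parameter sets are easiest to see side by side. The figures below are the public-key (encapsulation-key), secret-key, and ciphertext sizes in bytes as published in FIPS 203, with X25519's 32-byte public key for comparison:

```python
# ML-KEM object sizes in bytes, per FIPS 203.
FIPS203_SIZES = {
    "ML-KEM-512":  {"public_key": 800,  "secret_key": 1632, "ciphertext": 768},
    "ML-KEM-768":  {"public_key": 1184, "secret_key": 2400, "ciphertext": 1088},
    "ML-KEM-1024": {"public_key": 1568, "secret_key": 3168, "ciphertext": 1568},
}
X25519_PUBLIC_KEY = 32  # bytes, per RFC 7748

for name, sizes in FIPS203_SIZES.items():
    factor = sizes["public_key"] / X25519_PUBLIC_KEY
    print(f"{name}: {sizes['public_key']} B public key ({factor:.0f}x X25519)")
```

The keys are one to two orders of magnitude larger than their elliptic-curve counterparts -- manageable for most web traffic, but a real constraint for embedded and bandwidth-limited systems.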
ML-DSA (Module-Lattice-Based Digital Signature Algorithm), standardized as FIPS 204, replaces RSA and ECDSA for digital signatures. It is based on the CRYSTALS-Dilithium algorithm and will be used for signing certificates, authenticating software updates, verifying identities, and every other application that currently relies on RSA or ECC signatures.
Like ML-KEM, it offers multiple parameter sets (ML-DSA-44, ML-DSA-65, ML-DSA-87) balancing security level against signature size and performance.
SLH-DSA (Stateless Hash-Based Digital Signature Algorithm), standardized as FIPS 205, is based on SPHINCS+ and serves as a conservative backup. Unlike the lattice-based schemes, SLH-DSA relies solely on the security of hash functions -- a well-understood and deeply analyzed primitive. The trade-off is performance: SLH-DSA signatures are significantly larger and slower to generate than ML-DSA signatures. However, its security assumptions are among the most conservative in all of cryptography, making it valuable as a hedge against the possibility that lattice-based assumptions are eventually weakened.
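SLH-DSA itself is far more sophisticated, but the core idea behind hash-based signatures can be seen in a Lamport one-time signature, sketched below with nothing but SHA-256: the secret key is 256 pairs of random values, the public key is their hashes, and signing reveals one value per bit of the message digest. (This toy scheme is strictly one-time -- signing two messages with the same key leaks enough secrets to enable forgery, which is exactly the state-management problem SLH-DSA's "stateless" construction eliminates.)

```python
import hashlib
import secrets

def keygen():
    # 256 pairs of random secrets -- one pair per message-hash bit.
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(256)]
    # The public key is the hash of every secret value.
    pk = [(hashlib.sha256(a).digest(), hashlib.sha256(b).digest()) for a, b in sk]
    return sk, pk

def digest_bits(message: bytes):
    d = hashlib.sha256(message).digest()
    return [(d[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(sk, message: bytes):
    # Reveal one secret from each pair, chosen by the corresponding bit.
    return [sk[i][bit] for i, bit in enumerate(digest_bits(message))]

def verify(pk, message: bytes, sig) -> bool:
    # Each revealed secret must hash to the matching public-key value.
    return all(hashlib.sha256(sig[i]).digest() == pk[i][bit]
               for i, bit in enumerate(digest_bits(message)))

sk, pk = keygen()
sig = sign(sk, b"firmware v2.1")
assert verify(pk, b"firmware v2.1", sig)
assert not verify(pk, b"firmware v2.2", sig)   # any tampering breaks it
```

Note the signature is 256 × 32 = 8 KB for a single message -- a concrete illustration of why hash-based schemes trade size and speed for their uniquely conservative security assumptions.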
FN-DSA, based on FALCON (Fast-Fourier Lattice-based Compact Signatures over NTRU), is an additional signature scheme expected to be standardized as a complement to ML-DSA. FALCON produces notably compact signatures -- smaller than ML-DSA at equivalent security levels -- but is more complex to implement, particularly regarding protection against side-channel attacks during signing. It is expected to be favored in applications where signature size is at a premium, such as certificate chains and blockchain systems.
Both ML-KEM and ML-DSA are built on lattice-based cryptography, specifically the Learning With Errors (LWE) problem and its structured variants. The core idea is elegant: given a system of approximate linear equations over a finite field -- where each equation includes a small, random error term -- recovering the secret solution is believed to be computationally infeasible, even for a quantum computer.
More formally, the LWE problem asks: given a matrix A and a vector b = As + e, where s is a secret vector and e is a vector of small errors, find s. Without the error terms, this is simple linear algebra. With them, the problem becomes extraordinarily hard. The module variant (Module-LWE) adds algebraic structure for efficiency, allowing operations over polynomial rings that dramatically reduce key sizes and computation time while preserving the underlying hardness.
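A toy LWE instance makes the structure concrete. The parameters below are deliberately tiny and offer no security whatsoever; they exist only to show how the small error vector turns clean linear algebra into noisy equations:

```python
import random

q, n, m = 97, 4, 8   # toy parameters: modulus, secret dimension, equations
random.seed(1)       # fixed seed so the example is reproducible

A = [[random.randrange(q) for _ in range(n)] for _ in range(m)]
s = [random.randrange(q) for _ in range(n)]        # the secret vector
e = [random.choice([-1, 0, 1]) for _ in range(m)]  # small random errors

# b = A*s + e (mod q): each row of (A, b) is an *approximate* linear
# equation in s. The attacker sees A and b, never s or e.
b = [(sum(A[i][j] * s[j] for j in range(n)) + e[i]) % q
     for i in range(m)]

# Without e, Gaussian elimination would recover s immediately. With e,
# every equation is slightly off, and nothing reveals by how much.
exact = [sum(A[i][j] * s[j] for j in range(n)) % q for i in range(m)]
print("noisy b:     ", b)
print("error offsets:", [(b[i] - exact[i]) % q for i in range(m)])
```

At real parameter sizes (dimensions in the hundreds over a much larger modulus), the best known attacks -- classical and quantum alike -- require exponential time, which is precisely the hardness ML-KEM and ML-DSA inherit.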
The LWE problem has been studied extensively since Oded Regev introduced it in 2005, and no efficient quantum algorithm for solving it is known. This nearly two-decade track record of resisting cryptanalysis -- including quantum cryptanalysis -- is what gives the cryptographic community confidence in these new standards.
Adopting PQC does not mean ripping out classical cryptography overnight. The recommended approach -- and the one being deployed by the world's leading technology companies -- is hybrid key exchange: running a classical algorithm and a PQC algorithm in parallel, combining their outputs to establish a shared secret.
The most widely deployed hybrid approach pairs X25519 (a classical elliptic-curve Diffie-Hellman scheme) with ML-KEM-768. In a hybrid TLS handshake, both key exchanges are performed simultaneously, and the final shared secret is derived from both results. The security guarantee is powerful: the connection remains secure as long as either algorithm remains unbroken.
This strategy addresses two risks simultaneously. First, it protects against the possibility that the new PQC algorithms have undiscovered weaknesses -- an unlikely but non-zero risk for any relatively new cryptographic scheme. If a flaw is found in ML-KEM, X25519 still protects the session. Second, it provides quantum resistance: if a CRQC breaks X25519, ML-KEM still protects the session. You only lose security if both are broken simultaneously, which is an extremely conservative security posture.
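The combining step is conceptually simple: concatenate both shared secrets and feed them through a key-derivation function, so the session key depends on both inputs. The sketch below uses HKDF (RFC 5869) built from Python's standard library; the two secrets are random stand-ins for what a real handshake would obtain from the X25519 exchange and the ML-KEM-768 encapsulation.

```python
import hashlib
import hmac
import secrets

def hkdf(salt: bytes, ikm: bytes, info: bytes, length: int = 32) -> bytes:
    # HKDF (RFC 5869) with SHA-256: extract, then expand.
    prk = hmac.new(salt, ikm, hashlib.sha256).digest()
    okm, t, counter = b"", b"", 1
    while len(okm) < length:
        t = hmac.new(prk, t + info + bytes([counter]), hashlib.sha256).digest()
        okm += t
        counter += 1
    return okm[:length]

# Stand-ins for the two handshake outputs (in a real hybrid handshake
# these come from X25519 and ML-KEM-768, respectively).
classical_secret = secrets.token_bytes(32)
pq_secret = secrets.token_bytes(32)

# Concatenating both secrets before derivation means an attacker must
# recover *both* inputs to reconstruct the session key.
session_key = hkdf(salt=b"", ikm=classical_secret + pq_secret,
                   info=b"hybrid handshake sketch")
print(session_key.hex())
```

Real deployments follow the same pattern with protocol-specific details (the TLS 1.3 key schedule defines exactly where the concatenated secret enters), but the security argument is identical: the derived key is unpredictable as long as either input remains secret.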
The real-world adoption of hybrid key exchange has been remarkably swift. Google Chrome enabled hybrid post-quantum key exchange by default starting in version 124 (early 2024) -- initially with the draft X25519+Kyber768 combination, later updated to the final X25519+ML-KEM-768 standard -- meaning hundreds of millions of TLS connections shifted to quantum-resistant key exchange with no user-visible change. Signal deployed its PQXDH protocol in 2023, adding PQC protection to its messaging platform. Cloudflare activated hybrid PQC support across its network, covering a significant fraction of global web traffic. Apple integrated PQ3 into iMessage, incorporating ML-KEM into its end-to-end encryption protocol. AWS, Microsoft Azure, and Google Cloud have all begun offering PQC-enabled TLS endpoints.
These are not experimental deployments. They are production systems handling billions of connections, proving that the performance overhead of hybrid PQC is manageable in practice.
Understanding PQC algorithms is necessary, but it is not sufficient. The critical challenge for most organizations is migration -- the practical work of transitioning an enterprise's cryptographic infrastructure from classical to post-quantum algorithms. This is a multi-year effort, and the organizations that start now will be in a dramatically stronger position than those that wait.
Here is a five-step framework for your PQC migration:
Inventory. You cannot migrate what you cannot find. The first step is a comprehensive inventory of every cryptographic asset in your organization. This includes TLS certificates, VPN configurations, SSH keys, code-signing certificates, database encryption, API authentication tokens, hardware security modules (HSMs), embedded device firmware, and third-party integrations that rely on cryptographic protocols. Many organizations discover that their cryptographic footprint is far larger and more complex than they assumed. Automated discovery tools are available from vendors including Venafi, Keyfactor, and InfoSec Global.
Prioritize. Apply the HNDL risk framework from Part 1. Data with long confidentiality requirements and high sensitivity should be prioritized for migration. Government and healthcare data, long-lived signing keys, and inter-organizational data sharing agreements typically top the list. Systems that are hardest to update -- embedded devices, SCADA systems, legacy mainframes -- should be identified early because they will require the longest migration timelines.
Test and benchmark. Stand up PQC-enabled test environments and benchmark performance. Measure the impact of larger key sizes and ciphertext sizes on your network bandwidth, database storage, and application latency. ML-KEM-768 public keys are 1,184 bytes compared to 32 bytes for X25519 -- this difference matters in constrained environments. Identify any systems that cannot handle the increased sizes and develop mitigation plans.
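The bandwidth impact is easy to estimate before any testing begins. The arithmetic below compares the raw key-share payload of a classical handshake against a hybrid one, using the public-key sizes cited above; it deliberately ignores TLS record framing and extension overhead, so treat it as a lower bound per handshake.

```python
# Raw key-share sizes in bytes: X25519 per RFC 7748,
# ML-KEM-768 encapsulation key per FIPS 203.
X25519_PUBKEY = 32
ML_KEM_768_PUBKEY = 1184

classical_only = X25519_PUBKEY
hybrid = X25519_PUBKEY + ML_KEM_768_PUBKEY   # both shares sent together

print(f"classical key share: {classical_only} B")
print(f"hybrid key share:    {hybrid} B "
      f"({hybrid / classical_only:.0f}x larger)")
```

An extra ~1.2 KB per handshake is negligible for a web server but can be decisive for a constrained radio link or a device with a tiny TLS buffer -- which is exactly what this benchmarking phase is meant to surface.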
Pilot. Deploy hybrid key exchange (classical + PQC) on non-critical production systems first. Monitor for compatibility issues, performance degradation, and any unexpected behavior. Hybrid mode is specifically designed for this phase: it provides quantum resistance while maintaining full backward compatibility with systems that do not yet support PQC.
Migrate at scale. Roll out PQC across all systems according to your prioritized roadmap. Update certificates, rekey HSMs, deploy firmware updates to embedded devices, and renegotiate cryptographic requirements with third-party vendors. This step will take the longest and should be treated as a multi-year program, not a one-time project.
The most important architectural principle to embed throughout this migration is crypto-agility: the ability to swap cryptographic algorithms without redesigning your systems. If the SIKE collapse taught us anything, it is that cryptographic assumptions can fail. Your architecture should be designed so that cryptographic algorithms are configuration parameters, not hardcoded dependencies. Abstracting cryptographic operations behind well-defined interfaces ensures that when the next algorithmic transition occurs -- and it will -- your organization can respond in weeks rather than years.
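In code, crypto-agility means hiding the algorithm behind an interface and selecting the implementation from configuration. The sketch below shows one way to do this in Python; the `Kem` protocol, registry, and `ToyKem` class are illustrative inventions, and `ToyKem` is deliberately insecure -- it exists only so the interface can be exercised end to end.

```python
import hashlib
import secrets
from typing import Protocol, Tuple

class Kem(Protocol):
    """Any KEM -- classical, post-quantum, or hybrid -- behind one interface."""
    def generate_keypair(self) -> Tuple[bytes, bytes]: ...
    def encapsulate(self, public_key: bytes) -> Tuple[bytes, bytes]: ...
    def decapsulate(self, secret_key: bytes, ciphertext: bytes) -> bytes: ...

class ToyKem:
    """INSECURE stand-in, present only so the interface can be exercised."""
    def generate_keypair(self):
        sk = secrets.token_bytes(32)
        return hashlib.sha256(sk).digest(), sk          # (public, secret)
    def encapsulate(self, public_key):
        ct = secrets.token_bytes(32)
        return ct, hashlib.sha256(public_key + ct).digest()
    def decapsulate(self, secret_key, ciphertext):
        pk = hashlib.sha256(secret_key).digest()
        return hashlib.sha256(pk + ciphertext).digest()

# The algorithm is a configuration value, not a hardcoded dependency:
# migrating means registering a new implementation and changing one string.
KEM_REGISTRY = {"toy-kem": ToyKem()}

kem = KEM_REGISTRY["toy-kem"]   # in practice, read this name from config
pk, sk = kem.generate_keypair()
ct, ss_sender = kem.encapsulate(pk)
assert kem.decapsulate(sk, ct) == ss_sender
```

When an ML-KEM binding (or its eventual successor) is registered under a new name, callers never change -- which is the property that turns the next algorithmic transition into a configuration rollout rather than a rewrite.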
The regulatory landscape is already moving. The NSA's CNSA 2.0 guidance mandates that National Security Systems begin transitioning to PQC algorithms immediately, with specific milestones: software and firmware signing, web servers, and cloud services by 2025; traditional networking equipment by 2026; operating systems by 2027; niche equipment by 2030; and legacy systems by 2033. While these deadlines apply specifically to NSS, they signal the direction for the broader industry. Organizations in regulated sectors -- finance, healthcare, critical infrastructure -- should expect similar requirements from their respective regulators in the near term.
The post-quantum cryptographic toolkit is no longer theoretical. The standards are published. The algorithms are tested. Production deployments are live and growing. The technology is ready.
What remains is execution. The migration to PQC is the largest cryptographic transition in the history of computing -- larger than the move from DES to AES, larger than the adoption of TLS. It touches every system, every protocol, and every device. It requires planning, investment, and sustained organizational commitment.
But the cost of inaction is clear. Every day of delay is another day that encrypted data flows across networks where adversaries may be recording it. The Harvest Now, Decrypt Later clock does not pause while we plan.
Start your cryptographic inventory this quarter. Stand up a PQC test environment this month. Deploy hybrid key exchange on your first production system before the end of the year. The tools exist. The standards are final. The only variable is whether your organization will act with the urgency this moment demands.
Next in this series: In Part 3, we shift perspective entirely. Quantum computing is not just a threat to defend against -- it is one of the most transformative technological opportunities of the century. We will explore how quantum computing is poised to revolutionize drug discovery in pharmaceuticals and reshape the foundations of computational finance.

Ryan previously served as a PCI Professional Forensic Investigator (PFI) of record for 3 of the top 10 largest data breaches in history. With over two decades of experience in cybersecurity, digital forensics, and executive leadership, he has served Fortune 500 companies and government agencies worldwide.
