
Every day, we bank, shop, and communicate online, placing our trust in a digital shield. That shield is built from asymmetric encryption: algorithms such as RSA and ECC. These systems rest on a simple mathematical premise: they rely on problems that even the most powerful classical supercomputers cannot solve in any practical amount of time.
The core of this security is prime factorization. It's easy to multiply two large prime numbers together to produce a massive result, which becomes part of the public key, but it is practically impossible for a classical computer to take that result and recover the original two primes, from which the private key is derived. A brute-force attempt could take thousands of years. We have built our entire modern economy on this mathematical "impossibility." That impossibility is about to vanish.
To appreciate what is at stake, it helps to understand how RSA key generation operates under the hood. The process begins by selecting two large, random prime numbers, typically denoted p and q. In a 2048-bit RSA key -- the minimum standard recommended by NIST -- each prime is roughly 1024 bits long, meaning each is a number with over 300 decimal digits. These primes are multiplied together to produce the modulus n = p * q, which forms the backbone of the public key. A public exponent e (commonly 65537) is chosen, and the private exponent d is computed such that e * d is congruent to 1 modulo the least common multiple of (p - 1) and (q - 1). The security of the entire scheme rests on one assumption: that given n, no attacker can efficiently recover p and q. For classical computers, the best known factoring algorithms -- such as the General Number Field Sieve -- run in sub-exponential time, making the problem computationally infeasible for sufficiently large keys.
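The steps above can be sketched end to end with toy numbers. Everything here (the primes 61 and 53, the exponent 17, the message 65) is illustrative; real keys use roughly 1024-bit primes and e = 65537:

```python
# A toy walk-through of RSA key generation using tiny primes for readability.
from math import gcd

p, q = 61, 53                                  # two (tiny) primes
n = p * q                                      # modulus: 3233
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)   # lcm(p-1, q-1) = 780
e = 17                                         # public exponent (65537 in practice)
d = pow(e, -1, lam)                            # private exponent: e*d ≡ 1 (mod lam)

m = 65                                         # a message, encoded as an integer
c = pow(m, e, n)                               # encrypt with the public key (n, e)
assert pow(c, d, n) == m                       # decrypt with the private key d
print(n, d, c)
```

Note how decryption is a single modular exponentiation once d is known; the attacker's problem is that computing d requires knowing p and q, which means factoring n.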
Elliptic Curve Cryptography (ECC) takes a different mathematical approach but shares the same fundamental reliance on computational hardness. ECC operates over the algebraic structure of elliptic curves defined over finite fields. The security premise is the Elliptic Curve Discrete Logarithm Problem (ECDLP): given a base point G on a curve and a point Q = kG (where k is the private key and Q is the public key), recovering k from Q and G is computationally infeasible. ECC achieves equivalent security to RSA with dramatically smaller key sizes -- a 256-bit ECC key provides roughly the same security as a 3072-bit RSA key -- which is why it has become the preferred choice for mobile devices, IoT endpoints, and modern TLS implementations.
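A minimal sketch of the Q = kG relationship, using a made-up curve over a tiny field. All parameters here (the prime 17, the curve, the base point, the scalar) are illustrative; real deployments use standardized 256-bit curves such as P-256 or secp256k1:

```python
# Toy elliptic-curve arithmetic over a tiny prime field.

P = 17          # field prime (real curves use ~256-bit primes)
A, B = 0, 7     # curve: y^2 = x^3 + 7 (mod 17)

def inv(x):
    return pow(x, P - 2, P)  # modular inverse via Fermat's little theorem

def add(p1, p2):
    """Add two curve points; None represents the point at infinity."""
    if p1 is None: return p2
    if p2 is None: return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return None                                # P + (-P) = infinity
    if p1 == p2:
        s = (3 * x1 * x1 + A) * inv(2 * y1) % P    # tangent slope
    else:
        s = (y2 - y1) * inv(x2 - x1) % P           # chord slope
    x3 = (s * s - x1 - x2) % P
    return (x3, (s * (x1 - x3) - y1) % P)

def mul(k, pt):
    """Double-and-add scalar multiplication: returns k * pt."""
    acc = None
    while k:
        if k & 1:
            acc = add(acc, pt)
        pt = add(pt, pt)
        k >>= 1
    return acc

G = (1, 5)        # base point on the curve
k = 5             # "private key" (a random scalar in practice)
Q = mul(k, G)     # "public key" Q = kG

# At this toy scale the ECDLP is trivially brute-forced; at 256 bits it is
# classically infeasible -- that asymmetry is the entire security model.
recovered = next(i for i in range(1, 20) if mul(i, G) == Q)
print(Q, recovered)
```

Computing Q from k takes a handful of doublings and additions even for 256-bit scalars; recovering k from Q is what the ECDLP makes infeasible at real sizes.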
The scale of our collective dependency on these algorithms is staggering. Every time you see the padlock icon in your browser, a TLS handshake has occurred. During that handshake, asymmetric encryption is used to exchange a symmetric session key. The client and server negotiate a cipher suite, the server presents its certificate (signed using RSA or ECDSA), and a key exchange protocol (often ECDHE -- Ephemeral Elliptic Curve Diffie-Hellman) establishes a shared secret. This entire chain of trust depends on the hardness assumptions we just described.
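As a small local illustration of the cipher-suite negotiation step, Python's standard library can list the suites a default client context is prepared to offer. The exact output depends on your OpenSSL build, so treat this as a sketch rather than a canonical list:

```python
# Inspect the cipher suites a default TLS client context would offer
# during the handshake described above. Uses only the standard library.
import ssl

ctx = ssl.create_default_context()
for cipher in ctx.get_ciphers():
    # Each entry is a dict; 'name' and 'protocol' are always present.
    print(cipher["protocol"], "-", cipher["name"])
```

On most builds you will see ECDHE-based suites near the top of the TLS 1.2 list, reflecting exactly the ephemeral elliptic-curve key exchange the handshake relies on.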
Consider the volume: major cloud providers and CDNs handle billions of TLS connections every single day. Cloudflare alone has reported processing over 55 million HTTP requests per second at peak. Every one of those connections relies on asymmetric cryptography during the handshake phase. But TLS is only the beginning.
Digital signatures built on RSA and ECC are embedded in virtually every layer of our digital infrastructure. Every software update pushed to your laptop, phone, or server is cryptographically signed. Package managers like npm, PyPI, and Maven rely on signature verification to ensure integrity. Certificate Authorities form a hierarchical Public Key Infrastructure (PKI) that underpins identity verification across the entire internet -- from website certificates to email signing (S/MIME) to code-signing certificates that operating systems use to verify drivers and kernel modules. Government identity systems, electronic passports, and secure communications between diplomatic missions all depend on these same mathematical foundations.
When we talk about the quantum threat to encryption, we are not talking about breaking one lock. We are talking about a skeleton key that opens every lock simultaneously -- across every industry, every government, and every personal device on Earth.
A quantum computer isn't just a faster classical computer. It's an entirely new kind of machine that operates on the laws of quantum mechanics, using qubits instead of bits. Where a bit is either a 0 or a 1, a qubit can exist in both states at once (superposition) and be intrinsically linked to other qubits (entanglement).
This allows a quantum computer to perform many calculations simultaneously. In 1994, a mathematician named Peter Shor developed an algorithm -- Shor's Algorithm -- designed specifically to run on a quantum computer. Its purpose? To find the prime factors of large numbers.
A large, fault-tolerant quantum computer running Shor's Algorithm won't take millennia to break our encryption. It will take hours, or even minutes. This isn't a theoretical vulnerability; it is a mathematical certainty. The only question is when, not if, a machine becomes capable of it.
To understand why Shor's Algorithm is so devastating, it helps to look at the mathematical mechanism. The algorithm converts the factoring problem into a period-finding problem. Given a composite number N that we want to factor, the algorithm selects a random integer a and seeks the period r of the function f(x) = a^x mod N. This period r reveals the factors of N through straightforward arithmetic: if r is even and a^(r/2) mod N is not equal to -1 mod N, then the greatest common divisor of (a^(r/2) - 1) and N yields a non-trivial factor.
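The classical half of that reduction can be demonstrated directly. In this sketch the period is found by brute force, which is only feasible because N is tiny; finding r efficiently for real key sizes is precisely the step that requires a quantum computer:

```python
# Classical sketch of the reduction Shor's Algorithm exploits: once the
# period r of f(x) = a^x mod N is known, factors of N fall out via gcd.
from math import gcd

N, a = 15, 7        # composite to factor, and a base with gcd(a, N) = 1

# Find the period r: the smallest r > 0 with a^r ≡ 1 (mod N).
r = next(r for r in range(1, N) if pow(a, r, N) == 1)

# The conditions from the text: r even, and a^(r/2) not ≡ -1 (mod N).
assert r % 2 == 0 and pow(a, r // 2, N) != N - 1

f1 = gcd(pow(a, r // 2) - 1, N)
f2 = gcd(pow(a, r // 2) + 1, N)
print(N, "=", f1, "*", f2)   # recovers the factors 3 and 5
```

For N = 15 and a = 7 the period is r = 4, and gcd(7² - 1, 15) = 3 and gcd(7² + 1, 15) = 5 recover the factors, exactly as the arithmetic above describes.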
The critical quantum component is the Quantum Fourier Transform (QFT), which efficiently extracts the period from a quantum superposition of all possible values of f(x). On a classical computer, finding this period requires evaluating the function an exponential number of times. The quantum computer, leveraging superposition and interference, evaluates all inputs simultaneously and uses constructive interference to amplify the correct period while destructive interference suppresses incorrect answers. The result is an algorithm that runs in polynomial time -- specifically O((log N)^3) -- compared to the sub-exponential time required by the best classical factoring algorithms.
The same mathematical framework applies to ECC. A variant of Shor's Algorithm solves the discrete logarithm problem on elliptic curves with comparable efficiency, meaning both RSA and ECC fall to the same quantum attack.
The critical question facing every security leader is timeline. When will a "cryptographically relevant quantum computer" (CRQC) -- one capable of running Shor's Algorithm against real-world key sizes -- actually exist?
A CRQC capable of breaking RSA-2048 would require approximately 4,000 error-corrected logical qubits. However, because quantum error correction demands significant overhead, the actual number of physical qubits required is far larger -- current estimates range from several million to tens of millions of physical qubits, depending on the error rates of the underlying hardware.
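The overhead can be made concrete with back-of-the-envelope arithmetic. The surface-code scaling formula and every parameter below are illustrative assumptions, not hardware specifications, and real estimates also include substantial overhead for magic-state factories:

```python
# Rough logical-to-physical qubit overhead under an assumed surface code.
logical_qubits = 4_000        # rough logical-qubit estimate for RSA-2048 (see text)
code_distance = 27            # assumed code distance for the target error rate
physical_per_logical = 2 * code_distance ** 2   # ~1,458 under this assumption
total_physical = logical_qubits * physical_per_logical
print(f"{total_physical:,} physical qubits")
```

Under these assumptions the total lands around 5.8 million physical qubits, consistent with the millions-range estimates cited above; higher hardware error rates push the required code distance, and therefore the total, considerably higher.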
As of early 2025, the largest quantum processors contain roughly 1,000-1,200 physical qubits, and these are noisy, non-error-corrected qubits far from the fault-tolerant variety needed for Shor's Algorithm. The gap between where we are and where we need to be is substantial -- but it is closing. IBM, Google, Quantinuum, and others have published aggressive roadmaps targeting thousands of logical qubits within the decade. Google's Willow chip demonstrated error correction below the fault-tolerance threshold, a milestone many experts considered years away.
Timeline estimates vary considerably. The NSA's Commercial National Security Algorithm Suite 2.0 (CNSA 2.0) guidance, published in 2022, set hard deadlines for transitioning to post-quantum algorithms -- with some milestones as early as 2025 and full transition required by 2033. This is not a casual recommendation; it reflects the intelligence community's assessment of the threat timeline. NIST has similarly accelerated its standardization efforts, finalizing the first post-quantum standards in 2024.
A 2024 Global Risk Institute survey of quantum computing experts found that roughly one in three assigned a greater than 50% probability that a CRQC will exist by 2034. Some credible researchers place the timeline even sooner, particularly if there are breakthroughs in error correction or alternative qubit architectures.
The consensus among security professionals is clear: even if the most optimistic timelines prove wrong, the migration to quantum-resistant cryptography takes years. If you wait until a CRQC is announced, you are already too late.
This brings us to the most immediate and silent threat: Harvest Now, Decrypt Later (HNDL).
Hostile actors -- be they nation-states or sophisticated criminal organizations -- are not waiting for quantum computers to arrive. They are actively stealing and stockpiling massive amounts of encrypted data today. They cannot read this data... yet. They are vacuuming up everything they can intercept: government communications, VPN traffic, financial records, and more.
They are storing this data in massive data centers, waiting for the day they gain access to a quantum computer capable of running Shor's Algorithm. The moment that happens, decades of secrets, protected by encryption we thought was unbreakable, will be decrypted all at once. The "shelf-life" of your data is everything. If a secret needs to remain secret for 10 years, it is already at risk.
HNDL is not a theoretical concern -- it is a documented reality. Intelligence agencies from multiple nations have publicly acknowledged the threat. In 2022, the White House issued National Security Memorandum 10 (NSM-10), explicitly citing the HNDL threat as a primary motivation for accelerating the transition to post-quantum cryptography across all federal systems. The memorandum specifically warned that adversaries could be "recording encrypted communications now with the intent to decrypt them in the future."
The Cybersecurity and Infrastructure Security Agency (CISA) has issued multiple advisories warning critical infrastructure operators about the HNDL threat. In their joint advisory with the NSA and NIST, they characterized the risk as one that "cannot wait" for quantum computers to materialize because the data exfiltration is happening in the present.
Reports from multiple cybersecurity firms have documented increased data exfiltration activity from state-sponsored advanced persistent threat (APT) groups -- activity that cannot be fully explained by current intelligence objectives. The volume of data being stolen in some operations far exceeds what could be immediately useful, consistent with a long-term HNDL strategy. When an adversary steals 50 terabytes of encrypted VPN traffic, they are not looking for today's intelligence -- they are building a library for tomorrow's quantum decryption.
Not all data faces equal HNDL risk. The critical factor is sensitivity duration -- how long the data must remain confidential. The categories most exposed include:
Classified government and military data. National security secrets often carry classification periods of 25 years or more. Diplomatic communications, intelligence sources and methods, and military operational plans stolen today could be devastating if decrypted even a decade from now. The geopolitical consequences of retroactively exposing years of diplomatic communications are difficult to overstate.
Healthcare records. Medical data is sensitive for the lifetime of the patient -- and often beyond. Genetic data, in particular, never loses its sensitivity; your genome does not change. Mental health records, substance abuse treatment records, and HIV status carry stigma and legal protections that extend indefinitely. The HIPAA framework requires protection of this data for at least 50 years after death.
Trade secrets and intellectual property. Pharmaceutical formulations, proprietary algorithms, manufacturing processes, and strategic business plans can retain competitive value for decades. A competitor or nation-state that decrypts a pharmaceutical company's R&D data could leapfrog years of research investment.
Financial data and cryptographic keys. Long-term financial records, merger and acquisition plans, and -- critically -- the cryptographic keys themselves used to sign certificates and validate identities. If root CA private keys are compromised retroactively, an adversary could forge certificates that appear to have been valid at the time of issuance.
Every organization should be conducting an HNDL risk assessment today. The framework is straightforward but requires honest analysis. For each category of data you hold, ask three questions:
How long must this data remain confidential? This is your data's security shelf-life. For some operational data, the answer might be months. For trade secrets or patient records, it could be decades.
How long until a CRQC could feasibly exist? Use conservative estimates. If expert consensus suggests a 10-20 year window, plan for the shorter end.
How long will it take your organization to migrate to quantum-resistant cryptography? This is the variable most organizations underestimate. Large enterprises with complex PKI infrastructure, embedded systems, and regulatory requirements can expect migration timelines of 5-10 years.
If your data's required confidentiality period plus your migration timeline exceeds the time until a CRQC exists, you are already exposed. This is Mosca's Theorem in practice, and it is the single most important framework for understanding HNDL urgency. For many organizations handling long-lived sensitive data, the math already does not work in their favor.
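Mosca's Theorem reduces to a one-line check: risk exists whenever shelf-life plus migration time exceeds the time remaining until a CRQC. The durations in this sketch are hypothetical; substitute your own assessments for each data category:

```python
# Mosca's Theorem as a simple per-category risk check.
def hndl_exposed(shelf_life_years: float,
                 migration_years: float,
                 years_to_crqc: float) -> bool:
    """Risk exists when shelf-life + migration time > time until a CRQC."""
    return shelf_life_years + migration_years > years_to_crqc

# Hypothetical assessments: patient records with a 50-year shelf-life and a
# 7-year migration vs. short-lived operational data, against a 12-year CRQC estimate.
print(hndl_exposed(50, 7, 12))    # long-lived data: already exposed
print(hndl_exposed(0.5, 2, 12))   # short-lived data: within the safe window
```

Running this across an honest inventory of your data categories is the fastest way to see which ones are already past the point of safety.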
The quantum threat is not a distant problem. The "harvest" is already happening. Every piece of encrypted data being sent today is vulnerable to decryption tomorrow. The tipping point is here. The question is no longer whether our encryption will fail, but what we are going to do about it.
The adversaries have already started their collection campaign. Nation-states with sophisticated signals intelligence capabilities are intercepting and archiving encrypted traffic at scale. The storage is cheap. The patience is strategic. And the eventual payoff -- retroactive access to years of encrypted communications, financial transactions, and state secrets -- is incalculably valuable.
The good news is that the cryptographic community has not been idle. A new generation of encryption algorithms -- designed to resist both classical and quantum attacks -- has been developed, tested, and standardized. The tools to defend against the quantum threat exist today. The question is whether organizations will deploy them before the window of protection closes.
Next in this series: In Part 2, we will explore the solution: Post-Quantum Cryptography (PQC). We will examine the new NIST-standardized algorithms, understand how hybrid key exchange provides a bridge strategy for the transition, and lay out a practical migration framework that your organization can begin implementing immediately.

Ryan previously served as a PCI Professional Forensic Investigator (PFI) of record for 3 of the top 10 largest data breaches in history. With over two decades of experience in cybersecurity, digital forensics, and executive leadership, he has served Fortune 500 companies and government agencies worldwide.
