
Every time you log into your bank, send an encrypted email, connect to a VPN, or complete an online purchase, you rely on asymmetric cryptography -- a system of mathematics that has protected digital communication for nearly half a century. Understanding how it works, and why it is vulnerable, is essential to grasping the quantum threat.
Asymmetric cryptography, also called public-key cryptography, uses two mathematically related keys: a public key that anyone can see, and a private key that only the owner possesses. The public key encrypts data or verifies signatures; the private key decrypts data or creates signatures. The security of the entire system rests on a mathematical trapdoor -- an operation that is easy to perform in one direction but computationally infeasible to reverse.
RSA, named after its inventors Rivest, Shamir, and Adleman, is the most widely deployed public-key algorithm in history. Its trapdoor is integer factorization. The key generation process works as follows: choose two very large prime numbers (p and q), multiply them together to produce a modulus (n = p * q), and derive the public and private keys from n and related values. The public key contains n, which is freely shared. The private key depends on knowing p and q individually.
The security assumption is straightforward: multiplying two large primes together is trivial (a modern processor can do it in microseconds), but factoring their product back into the original primes is extraordinarily difficult. For RSA-2048, the standard key size, n is a 617-digit number. The best known classical factoring algorithms (the General Number Field Sieve) would require computational effort on the order of 2^112 operations to factor such a number -- far beyond the reach of all classical computers that exist or could plausibly be built.
Elliptic Curve Cryptography (ECC) is the more modern alternative, standardized in the early 2000s and now dominant in mobile devices, IoT, and TLS connections. ECC's trapdoor is the Elliptic Curve Discrete Logarithm Problem (ECDLP). Rather than factoring large numbers, ECC operates on the algebraic structure of elliptic curves over finite fields. Given a base point G on the curve and a scalar k, computing Q = kG (adding G to itself k times using the curve's group operation) is efficient. But given Q and G, recovering k is computationally infeasible for sufficiently large curves.
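The group operation and the ECDLP can be sketched on a toy curve over a tiny field (real curves such as P-256 use a 256-bit prime; the curve, generator, and key below are textbook-scale examples, not real parameters):

```python
# Toy elliptic curve y^2 = x^3 + 2x + 2 over GF(17) -- illustrative only.
p, a = 17, 2
O = None  # the point at infinity, the group's identity element

def add(P, Q):
    # Standard affine point-addition formulas over a prime field.
    if P is None: return Q
    if Q is None: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return O
    if P == Q:
        s = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p   # tangent slope
    else:
        s = (y2 - y1) * pow(x2 - x1, -1, p) % p          # chord slope
    x3 = (s * s - x1 - x2) % p
    return (x3, (s * (x1 - x3) - y1) % p)

def scalar_mult(k, P):
    # Double-and-add: computes Q = kP in O(log k) group operations.
    R = O
    while k:
        if k & 1:
            R = add(R, P)
        P = add(P, P)
        k >>= 1
    return R

G = (5, 1)             # generator of this curve's 19-element group
k = 7                  # private scalar
Q = scalar_mult(k, G)  # public key: cheap to compute even for huge k

# The ECDLP: recovering k from Q and G. Brute force works at this scale
# but is infeasible on a 256-bit curve.
recovered = next(i for i in range(1, 19) if scalar_mult(i, G) == Q)
assert recovered == k
```

The one-way structure is visible in the loop counts: computing Q = kG takes about log2(k) additions, while the brute-force recovery at the end must walk the whole group.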
ECC's major advantage over RSA is key size efficiency. An ECC key of 256 bits (P-256, the most common curve) provides roughly equivalent security to an RSA key of 3072 bits. This translates to faster operations, smaller certificates, lower bandwidth consumption, and reduced power usage -- which is why ECC is the preferred choice for constrained environments like smartphones, embedded systems, and IoT devices.
To appreciate the security margins at play: RSA-2048 provides approximately 112 bits of classical security, meaning a classical attacker would need roughly 2^112 operations to break it. ECC P-256 provides approximately 128 bits of classical security. At current computational capabilities, breaking either would require more energy than the sun produces in its lifetime. Against classical computers, these algorithms are essentially unbreakable.
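To put a rough number on that, assume an attacker with an exascale machine sustaining 10^18 operations per second (an optimistic classical rate chosen for illustration) against the 2^112 figure above:

```python
# Back-of-envelope brute-force time for 2^112 operations -- illustrative.
ops = 2 ** 112
rate = 10 ** 18                     # ops/sec: an assumed exascale attacker
seconds_per_year = 60 * 60 * 24 * 365
years = ops / (rate * seconds_per_year)
print(f"~{years:.1e} years")        # on the order of a hundred million years
```

Even under these generous assumptions the attack takes on the order of 10^8 years, which is why the classical security margin is treated as effectively absolute.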
These security margins have been the bedrock of internet security since the late 1990s. The entire global infrastructure of digital trust -- TLS certificates securing web traffic, SSH keys protecting server access, S/MIME and PGP encrypting email, code-signing certificates validating software updates, VPN tunnels protecting corporate communications -- all of it depends on either RSA or ECC (or both) at some layer of the stack. The sheer ubiquity of these algorithms is difficult to overstate. Every HTTPS connection, every digital signature, every secure key exchange in modern networking relies on the hardness of either integer factorization or the elliptic curve discrete logarithm problem.
Against a sufficiently large quantum computer, they are both trivially broken.
In 1994, mathematician Peter Shor published a quantum algorithm that sent shockwaves through the cryptographic community. Shor's Algorithm demonstrated that a quantum computer could factor large integers and solve discrete logarithm problems in polynomial time -- exponentially faster than any known classical algorithm.
To understand the magnitude of this result, consider the distinction between exponential and polynomial time complexity. Classical factoring algorithms run in sub-exponential time: roughly exp(c * n^(1/3) * (log n)^(2/3)) operations for an n-bit number, where c is a small constant. Shor's Algorithm runs in O(n^3) time -- polynomial in the number of bits. For RSA-2048 (a 2048-bit key), this transforms the problem from one requiring billions of years on classical hardware to one requiring hours or minutes on a sufficiently capable quantum computer.
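The gap can be compared numerically. The sketch below plugs 2048 bits into the standard GNFS exponent and a simple n^3 count for Shor's circuit; both ignore constant factors and should be read as rough asymptotics, not precise security estimates:

```python
import math

# Rough operation counts for a b-bit modulus -- asymptotic estimates only.
b = 2048
ln_n = b * math.log(2)  # ln(n) for a b-bit modulus n

# GNFS: sub-exponential in the bit length,
# exp((64/9)^(1/3) * (ln n)^(1/3) * (ln ln n)^(2/3))
gnfs = math.exp((64 / 9) ** (1 / 3) * ln_n ** (1 / 3)
                * math.log(ln_n) ** (2 / 3))

# Shor: polynomial, on the order of b^3 quantum gate operations
shor = b ** 3

print(f"GNFS (classical): ~2^{math.log2(gnfs):.0f} operations")
print(f"Shor  (quantum):  ~2^{math.log2(shor):.0f} operations")
```

At 2048 bits the classical estimate lands near 2^117 operations (consistent with the ~112-bit security figure cited earlier, which includes refinements this sketch omits), while the quantum count is around 2^33 -- the difference between geologic time and an afternoon.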
Shor's Algorithm works by exploiting quantum superposition and interference to find the period of a modular exponential function. Finding this period is the hard step in factoring, and it maps naturally onto the quantum Fourier transform -- an operation that quantum computers perform exponentially faster than classical computers. Once the period is found, simple classical arithmetic extracts the factors.
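The division of labor can be sketched entirely classically. In the toy factorization below, only find_period would run on a quantum computer; everything around it is the ordinary arithmetic the paragraph describes. The brute-force period search stands in for the quantum step and is illustrative only:

```python
from math import gcd
from random import randint

def find_period(a, N):
    # The step Shor's Algorithm accelerates: the smallest r > 0 with
    # a^r = 1 (mod N). Brute force here; a quantum computer finds r
    # via superposition and the quantum Fourier transform.
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_factor(N):
    # Classical pre- and post-processing around period finding.
    while True:
        a = randint(2, N - 1)
        g = gcd(a, N)
        if g > 1:
            return g, N // g         # lucky draw: a shares a factor with N
        r = find_period(a, N)
        if r % 2:
            continue                 # need an even period
        y = pow(a, r // 2, N)
        if y == N - 1:
            continue                 # trivial square root; try another a
        f = gcd(y - 1, N)
        if 1 < f < N:
            return f, N // f         # gcd extracts a nontrivial factor

print(shor_factor(15))  # recovers {3, 5}
```

Everything except find_period is cheap classical arithmetic, which is why the quantum Fourier transform's speedup on that single step breaks the whole scheme.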
The same algorithmic framework extends directly to the discrete logarithm problem, which means ECC falls to Shor's Algorithm just as completely as RSA does. For ECC, the quantum attack recovers the private scalar k from the public point Q = kG by solving the discrete logarithm on the elliptic curve group. The quantum resources required are actually more modest than for RSA: breaking ECC P-256 requires fewer logical qubits than breaking RSA-2048, because the underlying mathematical structure is more efficiently exploitable by quantum algorithms.
There is no version of RSA with a larger key size or ECC with a different curve that survives. Doubling the RSA key size from 2048 to 4096 bits roughly doubles the quantum resources needed -- a linear increase that provides no meaningful defense against an exponential speedup. Similarly, moving to larger elliptic curves only incrementally increases the quantum effort. The vulnerability is not in the implementation or the key length -- it is in the mathematical structure these algorithms rely on.
It is also important to understand what Shor's Algorithm does not threaten. Symmetric cryptography (AES, ChaCha20) and hash functions (SHA-256, SHA-3) are not broken by Shor's Algorithm. Grover's Algorithm does provide a quadratic speedup for brute-force attacks on symmetric keys, effectively halving the security level (AES-256 provides 128 bits of security against a quantum attacker rather than 256). The practical remedy is straightforward: double your symmetric key sizes. AES-256 remains quantum-safe for the foreseeable future.
The critical qualifier is "sufficiently capable quantum computer." Running Shor's Algorithm to break RSA-2048 requires an estimated 4,000 to 20,000 logical qubits, depending on the implementation and error correction overhead. As we will explore in Part 4 of this series, a logical qubit requires thousands of physical qubits to construct reliably. Current quantum computers have on the order of 1,000 to 1,500 physical qubits with error rates far too high for Shor's Algorithm to execute successfully.
So when will a Cryptographically Relevant Quantum Computer (CRQC) -- one capable of breaking RSA-2048 or ECC P-256 -- actually exist? Estimates vary, but the consensus among researchers and intelligence agencies has been converging. The U.S. National Security Agency, NIST, and leading quantum computing companies generally project that CRQCs could emerge between the early 2030s and mid-2040s. Some aggressive estimates put the early end of that range within reach, particularly given the scale of well-funded national programs in the U.S., China, and the EU.
But the timeline for a full-scale CRQC may not be the right metric for risk assessment. The concept of a mosaic threat is gaining attention among security researchers. This model recognizes that partial quantum capability, combined with classical cryptanalytic techniques, advanced AI, and side-channel attacks, could weaken cryptographic systems well before a "clean" Shor's Algorithm implementation becomes feasible. A quantum computer that cannot fully factor RSA-2048 might still reduce the classical effort required by orders of magnitude, making previously infeasible attacks practical when combined with other techniques.
The mosaic threat means that the transition from "safe" to "broken" may not be a single dramatic moment. It could be a gradual erosion that is difficult to detect until significant damage has been done.
The timeline uncertainty creates a second, more immediate threat that does not require waiting for CRQCs at all. It is called "Harvest Now, Decrypt Later" (HNDL), and it is almost certainly happening right now.
The HNDL attack model is straightforward: a sophisticated adversary intercepts and stores encrypted communications today, knowing that the data cannot currently be decrypted. The adversary archives these encrypted data stores and waits. When a sufficiently capable quantum computer becomes available -- whether in five years, ten years, or twenty -- the adversary retroactively decrypts the archived data.
This is not a theoretical concern. Multiple intelligence agencies and cybersecurity firms have reported evidence of state-sponsored actors systematically harvesting encrypted data from high-value targets. The targets are predictable: government communications, diplomatic cables, military plans, intelligence data, corporate trade secrets, pharmaceutical research, financial transaction records, and personal data of high-value individuals.
The economic logic of HNDL is compelling for any nation-state adversary. Storage is cheap -- a petabyte of data costs a few thousand dollars to store annually. The potential intelligence value of decades of retroactively decrypted diplomatic and military communications is immense. Even for corporate espionage, the calculus often works: pharmaceutical development data, M&A strategy documents, and proprietary technology designs may retain their value for years or decades.
The critical question for any organization evaluating HNDL risk is data shelf-life -- how long does the data need to remain confidential? Financial transaction records may lose their sensitivity within a few years; pharmaceutical research, trade secrets, and government and military communications can require decades of protection.
If your data's required confidentiality period extends beyond the most optimistic CRQC timeline, you are already in the HNDL threat window. The encrypted data you transmit today could be readable by an adversary within its useful lifetime.
This is why the migration to post-quantum cryptography (PQC) is not a future problem -- it is a present one. Every day of delay extends the window of HNDL-vulnerable data. Organizations that wait until CRQCs are demonstrated before beginning their migration will find that years of their most sensitive communications were harvested and are now decryptable.
The urgency is compounded by the reality that cryptographic migrations are slow. The transition from SHA-1 to SHA-256 took over a decade. Moving from 3DES to AES took nearly as long. The PQC migration is more complex than either of those transitions, involving new key sizes (PQC keys and ciphertexts are significantly larger than their classical counterparts), new protocol behaviors, and potentially new hardware. Organizations that begin planning now are not being premature -- they are being responsible.
There is a useful formula for thinking about HNDL urgency, sometimes called the Mosca inequality: if X is the time your data needs to remain secure, Y is the time required to migrate your cryptographic infrastructure, and Z is the time until a CRQC exists, then you are at risk if X + Y > Z. If your healthcare records need 30 years of confidentiality (X=30) and your migration will take 5 years (Y=5), you need to start migration at least 35 years before a CRQC arrives. If a CRQC arrives in 2035, you needed to start migrating by 2000. The math is unforgiving, and for long-lived data, many organizations are already behind.
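The inequality is simple enough to express directly. In the sketch below, the CRQC horizon is a hypothetical input for illustration, not a prediction:

```python
# Mosca's inequality: data encrypted today is at risk if the required
# secrecy lifetime (X) plus the migration time (Y) exceeds the years
# until a CRQC exists (Z). All values in years.
def at_risk(x_secrecy, y_migration, z_crqc):
    return x_secrecy + y_migration > z_crqc

# Healthcare records needing 30 years of secrecy, a 5-year migration,
# and a CRQC assumed (hypothetically) to be 10 years away:
assert at_risk(30, 5, 10)       # 35 > 10: already inside the threat window
assert not at_risk(2, 1, 10)    # short-lived data: 3 < 10, still safe
```

Plugging in your own X and Y against any plausible Z makes the conclusion of this section concrete: for long-lived data, the inequality is already violated.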
The connection to post-quantum cryptography (PQC) migration is direct: NIST finalized its first PQC standards in 2024 (FIPS 203, 204, and 205), providing standardized algorithms that organizations can begin deploying. The tools exist. The standards exist. The threat model is well-understood. What remains is organizational will and execution -- the subject of Part 5 in this series.
The cryptographic foundation of our digital world -- RSA and ECC -- is built on mathematical problems that quantum computers will solve efficiently. Shor's Algorithm does not merely weaken these systems; it renders them fundamentally broken. While the timeline for a full-scale CRQC remains uncertain, the HNDL threat model means the risk is already active for any data with a long confidentiality requirement.
The mosaic threat compounds this urgency: partial quantum capabilities combined with classical techniques may erode cryptographic protections sooner than clean break timelines suggest. And the sheer duration of cryptographic migrations -- measured in years, not months -- means that starting now is a strategic imperative, not an overreaction.
In the next part of this series, we will shift from threats to opportunities, exploring how today's imperfect, noisy quantum computers are already solving real-world problems in optimization, simulation, and machine learning through hybrid quantum-classical approaches.

Ryan previously served as a PCI Professional Forensic Investigator (PFI) of record for 3 of the top 10 largest data breaches in history. With over two decades of experience in cybersecurity, digital forensics, and executive leadership, he has served Fortune 500 companies and government agencies worldwide.
