
There is a persistent narrative in boardrooms and media headlines that goes something like this: "Quantum computing is always ten years away." The implication is that quantum technology is perpetually in the future, that nothing useful exists today, and that organizations can safely defer any engagement until fault-tolerant, million-qubit machines arrive.
This narrative is wrong, and it is dangerous.
It is wrong because quantum computers exist today and are being used to solve real problems. It is dangerous because organizations that internalize this "wait and see" mentality will find themselves years behind competitors who built quantum capabilities, developed domain expertise, and identified use cases while others watched from the sidelines.
The reality is more nuanced than either extreme. We do not have fault-tolerant quantum computers capable of running Shor's algorithm against RSA-2048. We will not have them for years, possibly a decade or more. But we do have quantum computers with hundreds to over a thousand qubits that can execute meaningful computations for certain problem classes. These machines are imperfect, noisy, and limited -- but they are real, accessible, and improving rapidly.
The key is understanding what today's quantum computers can and cannot do, and then applying them strategically to problems where even imperfect quantum computation offers an advantage or, at minimum, builds organizational readiness for the fault-tolerant era ahead.
In 2018, Caltech physicist John Preskill coined the term NISQ -- Noisy Intermediate-Scale Quantum -- to describe the current generation of quantum hardware. The name captures the two defining characteristics of today's machines.
Intermediate-Scale refers to qubit count. NISQ devices range from roughly 50 to over 1,000 qubits. IBM's Eagle processor (127 qubits, 2021), Osprey (433 qubits, 2022), Condor (1,121 qubits, 2023), and subsequent generations have steadily pushed qubit counts upward. IonQ, Quantinuum, Google, and others have their own scaling roadmaps. These machines are large enough to perform computations that are extremely difficult or impossible to simulate on classical computers (Google's 2019 "quantum supremacy" demonstration established this threshold at around 53 qubits for specific tasks), but they are not large enough for the full error correction overhead required for fault-tolerant computing.
Noisy is the more important qualifier. Every quantum operation introduces errors. Gate fidelities on current hardware range from about 99% to 99.9% for single-qubit gates and 99% to 99.8% for two-qubit gates. These numbers sound high until you consider that a useful quantum computation might require thousands or millions of gate operations. At 99.5% two-qubit gate fidelity, a circuit of 200 two-qubit gates has only about a 37% chance of executing without any errors. This limits the depth of quantum circuits that can produce reliable results, which in turn limits the algorithms that can run effectively.
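The arithmetic behind that 37% figure is worth making concrete: assuming independent gate errors, the probability of a fully error-free run is just the fidelity raised to the gate count. A short sketch in Python (independence is a simplification; real device noise is correlated and this estimate is optimistic):

```python
def success_probability(gate_fidelity: float, gate_count: int) -> float:
    """Chance a circuit runs with zero gate errors, assuming independent errors."""
    return gate_fidelity ** gate_count

# Figures from the text: 99.5% two-qubit fidelity, 200 two-qubit gates.
p = success_probability(0.995, 200)
print(f"{p:.2%}")  # prints "36.70%"
```

The same arithmetic explains why circuit depth, not just qubit count, is the binding constraint: at the same fidelity, 1,000 gates leaves well under a 1% chance of an error-free run.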
NISQ machines also suffer from limited connectivity (not every qubit can directly interact with every other qubit, requiring additional SWAP operations that add noise), readout errors (the measurement process itself is imperfect), and crosstalk (operations on one qubit unintentionally affecting neighboring qubits).
Despite these limitations, NISQ devices are far from useless. The key insight is that certain algorithms are specifically designed to extract useful results from shallow, noisy circuits. These algorithms trade the theoretical perfection of fault-tolerant algorithms for practical tolerances that work within NISQ constraints. The dominant paradigm for doing this is the hybrid quantum-classical computing model.
The hybrid quantum-classical model is the defining computational strategy of the NISQ era. The idea is elegant: use the quantum processor for the part of the computation where it offers an advantage (exploring exponentially large state spaces through superposition and entanglement), and use a classical computer for everything else (optimization, parameter updates, error mitigation, pre- and post-processing). The quantum computer becomes a specialized coprocessor, not a standalone replacement for classical infrastructure.
Two algorithms exemplify this approach: the Quantum Approximate Optimization Algorithm (QAOA) and the Variational Quantum Eigensolver (VQE).
QAOA, introduced by Edward Farhi, Jeffrey Goldstone, and Sam Gutmann in 2014, addresses combinatorial optimization problems -- the class of problems where you must find the best solution from a finite but astronomically large set of possibilities. Think vehicle routing (which sequence of deliveries minimizes total distance?), job scheduling (which assignment of tasks to machines minimizes completion time?), or portfolio optimization (which mix of assets maximizes return for a given risk level?).
These problems are often NP-hard, meaning no known classical algorithm solves them efficiently in the worst case. Classical approaches rely on heuristics -- genetic algorithms, simulated annealing, mixed-integer programming -- that find good-but-not-optimal solutions.
QAOA works by encoding the optimization problem into a quantum circuit with adjustable parameters. The algorithm begins by preparing all qubits in superposition, then alternates between two types of quantum operations: a "problem" operation that encodes the cost function (the objective you are trying to minimize or maximize) and a "mixer" operation that explores the solution space. The circuit's parameters control the relative strength and duration of these operations.
After the quantum circuit runs, the qubits are measured, yielding a candidate solution. This solution and its cost are fed back to a classical optimizer, which adjusts the circuit parameters and sends an updated circuit back to the quantum processor. This loop repeats -- quantum execution, classical parameter update, quantum execution -- until the solution converges or a stopping criterion is met.
The power of QAOA lies in the quantum processor's ability to explore the solution space using interference effects that classical heuristics cannot replicate. Probability amplitudes for good solutions are amplified while poor solutions are suppressed. Whether QAOA achieves meaningful quantum advantage on NISQ hardware for practical problem sizes remains an active area of research, but early results on small instances are encouraging, and the algorithm's performance is expected to improve as hardware quality increases.
VQE addresses what may be the most natural application for quantum computers: simulating quantum systems. Molecules are inherently quantum mechanical, and simulating their behavior on classical computers requires approximations that become prohibitively expensive as molecule size grows. Accurately simulating the electronic structure of a caffeine molecule, for example, would require more classical bits than there are atoms in the observable universe.
VQE estimates the ground-state energy of a molecular system -- the lowest-energy configuration of its electrons. This is critical for drug design (predicting how a drug molecule binds to a protein target), materials science (designing battery cathode materials with specific properties), and catalyst development (finding molecules that lower energy barriers for chemical reactions).
The algorithm works through the variational principle of quantum mechanics: any trial quantum state will have an energy greater than or equal to the true ground-state energy. VQE prepares a parameterized quantum state (called an ansatz) on the quantum processor, measures the energy of that state, and then uses a classical optimizer to adjust the parameters to find the state with the lowest energy.
Like QAOA, VQE uses shallow circuits that are compatible with NISQ hardware. The quantum processor handles the exponentially complex task of representing and manipulating the molecular wavefunction, while the classical computer handles the optimization of circuit parameters. Current VQE demonstrations have accurately simulated small molecules like lithium hydride (LiH), beryllium hydride (BeH2), and water (H2O). Scaling to industrially relevant molecules will require both more qubits and lower error rates, but VQE establishes a clear path from current capability to future impact.
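The same variational structure can be shown in miniature. The sketch below uses a toy two-qubit Hamiltonian whose coefficients are purely illustrative (not any real molecule's electronic structure), an exact statevector in place of quantum hardware, and a dense one-parameter scan in place of a real classical optimizer:

```python
import numpy as np

# Pauli matrices and a toy two-qubit Hamiltonian (illustrative coefficients).
I2 = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.array([[1.0, 0.0], [0.0, -1.0]])
H = -np.kron(Z, Z) + 0.5 * np.kron(X, X)

def ansatz(theta):
    """Minimal entangling ansatz: RY(theta) on qubit 0, then CNOT(0 -> 1)."""
    ry = np.array([[np.cos(theta / 2), -np.sin(theta / 2)],
                   [np.sin(theta / 2),  np.cos(theta / 2)]])
    cnot = np.array([[1, 0, 0, 0], [0, 1, 0, 0],
                     [0, 0, 0, 1], [0, 0, 1, 0]], dtype=float)
    psi0 = np.array([1.0, 0.0, 0.0, 0.0])  # |00>
    return cnot @ np.kron(ry, I2) @ psi0

def energy(theta):
    """<psi(theta)|H|psi(theta)> -- the quantity a real VQE estimates by sampling."""
    psi = ansatz(theta)
    return float(psi @ H @ psi)

# "Classical optimizer": a dense 1-D parameter scan stands in for a real one.
thetas = np.linspace(-np.pi, np.pi, 2001)
vqe_energy = min(energy(t) for t in thetas)
exact_ground = np.linalg.eigvalsh(H).min()
```

The variational principle guarantees `vqe_energy` can never fall below `exact_ground`; closing the gap is the optimizer's job. Here the ansatz can represent the true ground state (an entangled superposition of |00> and |11>), so the scan recovers it essentially exactly.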
Theory and algorithms are important, but the practical question for most organizations is: how do I actually engage with quantum computing today? The answer lies in building a "quantum sandbox" -- a structured program for exploring quantum use cases, developing internal expertise, and positioning the organization for the fault-tolerant era.
The barrier to entry has never been lower. Every major cloud provider now offers quantum computing services, eliminating the need for organizations to purchase, operate, or maintain quantum hardware.
IBM Quantum provides access to a fleet of superconducting quantum processors through IBM Cloud, along with Qiskit, the most widely used open-source quantum software framework. IBM also offers a generous free tier and extensive educational resources.
Amazon Braket provides a unified interface to multiple quantum hardware providers -- IonQ (trapped ions), Rigetti (superconducting), and QuEra (neutral atoms) -- along with managed simulators. This multi-vendor approach lets organizations compare hardware modalities without committing to a single provider.
Google Cloud offers access to Google's quantum processors and the Cirq framework. Google's focus on achieving quantum error correction milestones means its hardware roadmap is particularly relevant for organizations planning long-term quantum strategies.
Azure Quantum integrates quantum computing from IonQ, Quantinuum, and others directly into the Microsoft Azure ecosystem, with support through Q# and the Azure Quantum Development Kit. The integration with existing Azure services makes it attractive for enterprises already invested in the Microsoft ecosystem.
Across industries, organizations are running quantum pilots that, while not yet delivering production-scale quantum advantage, are building critical capabilities and identifying promising use cases.
Logistics and supply chain: Companies like BMW, Airbus, and DHL have explored quantum optimization for vehicle routing, supply chain network design, and warehouse operations scheduling. The problems map naturally to QAOA: large numbers of variables, complex constraints, and objective functions where even small improvements yield significant cost savings. BMW's collaboration with quantum computing companies to optimize paint shop scheduling demonstrated that hybrid quantum-classical approaches could match or approach the quality of solutions from classical solvers on small problem instances.
Pharmaceutical and life sciences: Biogen, Roche, and several others have partnered with quantum computing firms to explore molecular simulation for drug discovery. The goal is to use VQE and related algorithms to screen drug candidates more accurately than classical density functional theory (DFT) methods allow. While current quantum hardware can only handle molecules too small for practical drug design, these pilots establish the computational pipelines, workflows, and domain expertise that will be needed when hardware scales.
Financial services: JPMorgan Chase, Goldman Sachs, and HSBC have been among the most active financial institutions in quantum computing. Use cases include portfolio optimization (finding optimal asset allocations under complex constraints), risk analysis (Monte Carlo simulations for derivative pricing), and fraud detection (identifying anomalous patterns in transaction networks). JPMorgan's research group has published extensively on quantum algorithms for financial applications.
Materials science and energy: Companies exploring next-generation battery chemistries (lithium-sulfur, solid-state) and catalyst design are natural candidates for quantum simulation. Accurately modeling the quantum chemistry of electrode materials and catalytic surfaces could accelerate the development of more efficient batteries and cleaner industrial processes.
A word of caution is warranted. The quantum computing space is rife with overpromising, and organizations must be discerning consumers, scrutinizing any claim of "quantum advantage" -- in particular, what classical baseline it was measured against and at what problem size -- before acting on it.
An underappreciated byproduct of quantum computing research is the development of "quantum-inspired" classical algorithms. These are classical algorithms that borrow mathematical frameworks from quantum computing -- tensor networks, variational methods, random sampling techniques -- to improve classical performance on optimization and simulation problems.
Companies like Microsoft, Amazon, and Toshiba offer quantum-inspired optimization solvers that run on classical hardware but incorporate ideas from quantum algorithm design. These solvers can provide immediate value today while serving as a bridge to future quantum implementations. The organizational investment in formulating problems in quantum-compatible frameworks is not wasted even if quantum hardware is not yet ready for production deployment.
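Simulated annealing, one of the classical heuristics mentioned earlier and itself inspired by a physical process, gives a feel for the sampling-based solvers this bridge builds on. A minimal sketch for Max-Cut (the graph, cooling schedule, and step count are illustrative choices):

```python
import math
import random

# Illustrative instance: Max-Cut on a 5-node cycle (optimal cut = 4 edges).
n = 5
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]

def cut_value(assignment):
    return sum(assignment[i] != assignment[j] for i, j in edges)

def anneal(steps=5000, t_start=2.0, t_end=0.05, seed=0):
    rng = random.Random(seed)
    assignment = [rng.randint(0, 1) for _ in range(n)]
    best = list(assignment)
    for step in range(steps):
        temp = t_start * (t_end / t_start) ** (step / steps)  # geometric cooling
        i = rng.randrange(n)
        old = cut_value(assignment)
        assignment[i] ^= 1                    # propose flipping one node
        delta = cut_value(assignment) - old
        # Accept improvements always; accept worsening moves with Boltzmann probability.
        if delta < 0 and rng.random() >= math.exp(delta / temp):
            assignment[i] ^= 1                # reject: undo the flip
        elif cut_value(assignment) > cut_value(best):
            best = list(assignment)
    return best, cut_value(best)

partition, best_cut = anneal()
```

The structure mirrors the QAOA loop -- propose candidate solutions, score them, and steer the search -- which is why problems formulated for one often transfer to the other with little rework.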
For organizations considering quantum engagement, a practical sandbox strategy combines the elements discussed above: cloud-based access to real hardware, small pilot projects on candidate use cases, disciplined evaluation of advantage claims, and quantum-inspired classical methods as a bridge.
The NISQ era is not a waiting room for fault-tolerant quantum computing -- it is an active laboratory where organizations can build the knowledge, relationships, and strategic frameworks that will determine their quantum readiness when the technology matures. Hybrid quantum-classical algorithms like QAOA and VQE are already producing meaningful results on small problem instances, and cloud platforms have made experimentation accessible to any organization willing to invest the effort.
The organizations that will lead in the quantum era are not those with the most qubits -- they are those that started building their quantum muscles earliest. In the next installment, we will confront the central engineering challenge standing between today's noisy machines and the fault-tolerant quantum computers that will unlock the technology's full potential: the monumental problem of quantum error correction.

Ryan previously served as a PCI Professional Forensic Investigator (PFI) of record for 3 of the top 10 largest data breaches in history. With over two decades of experience in cybersecurity, digital forensics, and executive leadership, he has served Fortune 500 companies and government agencies worldwide.
