Quantum Computing From First Principles
“The future is already here, it’s just not evenly distributed.” – William Gibson
Quantum physics has been around for over a century. The first quantum computer was conceived forty years ago. And yet today, few use them. Why?
I was looking for an overview on quantum computers, but I couldn’t find any information that combined how they work + why I should care + who’s working on them. So, I figured I’d synthesize the answers to these questions myself. My goal is to help the reader assess where they can make an impact with quantum computers.
How do quantum computers work?
“Nature isn’t classical, dammit, and if you want to make a simulation of Nature, you’d better make it quantum mechanical.” – Richard Feynman
Quantum computers are mysterious partly because people don’t understand the principles behind them, and partly because their behavior is counterintuitive. Fear not. We will start with the most fundamental principles of quantum mechanics, all without quantum notation, so you can assess how you’ll position yourself in the field. First, though, it will help to understand how classical computers work.
Classical computers, the machines you and I are using to read this right now, are just devices that manipulate things called bits. A bit is the most basic unit of information. Each bit is in one of two states:
Values 0 or 1
North or South spin
Low voltage (e.g. 0 V) or high voltage (e.g. 5 V)
To understand the basic building blocks, I will use the analogy of a coin that’s on the table. Like a bit, it’s either heads or tails.
To do useful work, computers combine bits with logic gates. These guys perform basic operations on bits, such as AND (both), OR (either), NOT (invert), etc. Physically, they’re built using transistors. This architecture is predictable, reliable, and deterministic. If you run the same program twice, you’ll get the same result.
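To make that determinism concrete, here is a minimal sketch of these gates as plain Python functions operating on bits:

```python
# Classical logic gates as functions on bits (0 or 1).
def AND(a, b): return a & b   # 1 only if both inputs are 1
def OR(a, b):  return a | b   # 1 if either input is 1
def NOT(a):    return a ^ 1   # invert the bit

# Deterministic: the same inputs always give the same outputs.
assert AND(1, 1) == 1 and AND(1, 0) == 0
assert OR(0, 0) == 0 and OR(1, 0) == 1
assert NOT(0) == 1 and NOT(1) == 0
```

Run it twice, run it a million times: the truth tables never change, which is exactly the predictability described above.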
Quantum computers are different because they use qubits. These guys are just the quantum versions of bits. Each qubit exists in a combination of both states:
Values 0 and 1
North and South spin
Low voltage (e.g. 0 V) and high voltage (e.g. 5 V)
A qubit is analogous to a coin that’s spinning in the air. You don’t know if it’s heads or tails yet. But this isn’t a regular coin; it’s a quantum coin. This means the coin isn’t switching between heads and tails. It is actually both states at once. This is called superposition.
The thing that describes all possible outcomes and their likelihoods is called a wavefunction. In the coin flip example, imagine pausing the coin midair. If you asked, “what percent of this is tails?”, the wavefunction would tell you (strictly speaking, the probability comes from squaring the amplitude; this will make more sense if you dig into the math behind quantum computing).
Now imagine you wanted the coin to land on heads. If you could swirl some air currents around the coin, you could push it slightly toward heads as it falls. In quantum mechanics this is called interference. With interference, we can guide the probability towards the right answer.
As we try to boost the outcome of heads and cancel the outcome of tails, we create a quantum algorithm. These guys operate on basic math: they can add up or cancel out. What are we adding and subtracting? The amplitudes of different quantum states. A good quantum algorithm will increase the probability of right answers (constructive interference) and cancel out the wrong answers (destructive interference).
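A rough sketch of these two ideas in plain Python, treating a qubit as a pair of amplitudes (a toy model, not a real quantum simulator):

```python
import math

# A qubit as two amplitudes; the probability of each outcome is the
# amplitude squared (the Born rule).
amp_heads, amp_tails = 1 / math.sqrt(2), 1 / math.sqrt(2)
p_heads, p_tails = amp_heads ** 2, amp_tails ** 2   # 0.5 and 0.5

# Interference: amplitudes add like signed numbers. Two paths to the
# same outcome can reinforce (same sign) or cancel (opposite signs).
path_a = 1 / math.sqrt(2)
path_b = -1 / math.sqrt(2)
combined = path_a + path_b    # destructive interference: cancels to 0
```

A good algorithm arranges the signs so paths to wrong answers look like `combined` above, while paths to right answers pile up.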
Now imagine throwing two spinning coins (AKA two qubits) in the air. In the classical world, you wouldn’t expect a correlation between the two. However, in quantum mechanics, their outcomes can be linked. This is called entanglement.
Entanglement lets quantum computers encode relationships between data. For example, if coin A lands heads, you can make coin B land heads too. Or you could invert it so that coin B lands tails every time.
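A toy model of this correlation in Python. The real math involves a joint wavefunction over both qubits; here we just sample measurement outcomes from an idealized, perfectly entangled pair, because the correlation is the point:

```python
import random

# Toy sample from an ideal entangled pair: each "toss" yields heads-heads
# (0,0) or tails-tails (1,1) with equal probability, never a mix.
def measure_entangled_pair():
    outcome = random.choice([0, 1])
    return outcome, outcome      # coin A and coin B are linked

shots = [measure_entangled_pair() for _ in range(1000)]
assert all(a == b for a, b in shots)   # the two coins always agree
```

Individually each coin looks random, but together they are perfectly correlated, which no pair of ordinary coins can reproduce.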
Now let's go back to the one coin (which is the same as a qubit). Imagine I toss it in the air. Then I ask you, “what’s the value?” You might think that if you wait until it falls on the table, you’ll know its value. However, that’s not how the quantum coin works. You don’t know whether it’s heads or tails until you actually look at the coin. This is called measurement.
To understand measurement, let’s replay this toss. This time, imagine adding a cloud that completely hides the coin we just flipped. During this time, the coin continues to spin. It’s only when you look that the spin collapses (ie. the cloud vanishes, and you get heads or tails). The cloud is important because it represents mathematical probabilities: it is the wavefunction from earlier. If I flip a quantum coin, what do you think the probability of heads/tails is? It’s not necessarily 50/50. That’s because of superposition: there’s a range of possible combinations.
Now, in a perfect world, you choose when the coin stops spinning. But quantum systems are fragile in reality. Factors like heat, noise, and vibrations can all impact the coin's trajectory. You may think of these forces as a storm. When the storm hits the coin before you look, it destroys all possibilities, and gives you a random answer. When this happens, we call it decoherence.
Quantum computers fight decoherence constantly. The goal is to keep the coin spinning long enough to complete the algorithm and measure it at the right moment. This requires manipulating environments to keep the coin(s) in high quality condition.
One of the factors that causes the coin to stop spinning deserves special attention: noise. It’s what drives the process of decoherence. Noise is any unwanted disturbance that introduces errors into the quantum system. In the coin analogy, imagine noise as a tornado that knocks the coin out of its spin.
Recap of the principles:
Superposition: qubit exists in a linear combination of basis states (ie. coin spinning in midair).
Interference: Adding or cancelling the probability amplitudes of different quantum states (ie. air currents steer the spin toward desirable outcomes).
Entanglement: qubits can exist in a joint state such that the measurement of one affects the outcome of the other (ie. two coins are linked so that knowing one tells you information about the other).
Measurement: observing a qubit collapses its state from a superposition into a definite basis state (ie. looking freezes the coin into heads or tails).
Decoherence: environment causes the qubit’s wavefunction to collapse prematurely into a classical state, destroying quantum behavior (ie. storm knocks the coin down too early).
Noise: cause of random errors in the quantum system (ie. wind currents that we didn’t create).
When would I use a quantum computer?
“We can only see a short distance ahead, but we can see plenty there that needs to be done” – Alan Turing
Humanity knows where the frontier of computing lies. The most important questions in science – how nature conducts her processes, what the universe's first moments were, how the mind works – are bottlenecked not by imagination, but by the limits of classical computation.
Theory says a quantum computer can do anything a classical computer can do. However, as of 2025, we aren’t even close to doing that. Therefore, as Turing said, we must focus on what we can see in front of us.
Let’s first think about what problems classical computers fail to solve. Some of these areas have progressed with quantum computers. However, keep in mind there’s more to be discovered.
Structured Math Problems:
Let’s say you’re given the number 91 and asked: “Which two numbers multiply together to make this?” You might quickly try: 2? Nope. 3? Nope. 7? Maybe. 91 ÷ 7 = 13! You can do this because 91 is a small number. But what if I gave you a number with a million digits? The guesses required grow exponentially as the number gets bigger.
That’s because classical computers try one possibility at a time, and they don’t know which guesses are more promising than others.
When you have to break numbers into prime pieces, you are solving a problem of factoring. It’s what modern encryption is based on: we can hide a message using a product of two big prime numbers, because reversing the factoring is slow.
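Here is what that one-guess-at-a-time search looks like as code, a brute-force trial-division sketch:

```python
def trial_division(n):
    """Brute-force factoring: try divisors one at a time, the way a
    classical computer would with no structure to exploit."""
    for d in range(2, int(n ** 0.5) + 1):
        if n % d == 0:
            return d, n // d
    return None  # no divisor found: n is prime

assert trial_division(91) == (7, 13)
```

For 91 the loop ends almost instantly. For the thousand-digit products used in real encryption, the same loop would outlive the universe.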
However, quantum computers can solve this because they don’t guess one-by-one. They use superposition to explore many guesses at once, and interference to cancel the wrong ones and amplify the right ones. That’s what Shor’s algorithm does.
Shor’s Algorithm:
It turns factoring into period finding: given f(x) = a^x mod N, find the period r, the smallest positive number such that f(x + r) = f(x). Once the period is found, it can be used to recover the primes. Superposition evaluates f over a huge range of inputs at once. Interference, as usual, amplifies correct wavefunctions and cancels wrong ones. The quantum Fourier transform decomposes the signal to discover the hidden frequency, and thus the period, in the wavefunction.
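The number-theory skeleton can be sketched classically in a few lines. Note that the period-finding loop below is brute force; that loop is exactly the step the quantum computer replaces:

```python
from math import gcd

# Classical illustration of the math behind Shor's algorithm, factoring
# N = 15 with base a = 7. Only the period-finding step is quantum in the
# real algorithm; here we find the period by trying exponents one by one.
def find_period(a, N):
    x, r = a % N, 1
    while x != 1:          # stop when a^r = 1 (mod N)
        x = (x * a) % N
        r += 1
    return r

N, a = 15, 7
r = find_period(a, N)               # 7^4 = 2401 = 1 (mod 15), so r = 4
p = gcd(pow(a, r // 2) - 1, N)      # gcd(48, 15) = 3
q = gcd(pow(a, r // 2) + 1, N)      # gcd(50, 15) = 5
assert r % 2 == 0 and p * q == N
```

Once r is known, two greatest-common-divisor computations, both cheap classically, hand back the prime factors 3 and 5.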
Optimization / Unstructured problems:
Let’s say I gave you a list of cities and asked you to find the shortest round-trip route that visits every city once and comes back to wherever you started. How would you do it?
This is called the traveling salesman problem. You can do it easily with 5 cities, but what if I told you to visit every city in the world (tens of thousands)?
Classical computers get overwhelmed here for the same reason again: they take too long to check all routes one by one, and they don’t know which route is likely to be better.
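The brute-force approach is easy to write and easy to break. A sketch with four made-up cities (the coordinates are arbitrary, purely for illustration):

```python
from itertools import permutations

# Brute-force traveling salesman: check every route. The number of
# routes grows factorially with the number of cities.
cities = {"A": (0, 0), "B": (1, 0), "C": (1, 1), "D": (0, 2)}

def route_length(route):
    # Close the loop by returning to the starting city.
    pts = [cities[c] for c in route] + [cities[route[0]]]
    return sum(((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
               for (x1, y1), (x2, y2) in zip(pts, pts[1:]))

start = "A"
rest = [c for c in cities if c != start]
best = min(permutations(rest), key=lambda r: route_length((start,) + r))
```

Four cities means 3! = 6 routes to check. Twenty cities already means over 10^17, and nothing in the problem tells the computer which routes to skip.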
Unstructured problems say, “Try each possibility. Which ones work?” In these kinds of problems, there’s no structure to exploit. There’s a space of possible answers that has no obvious shape, and you’re trying to figure out which result is best. The number of possible routes grows combinatorially.
Grover’s Algorithm:
Let’s say you’re looking for a single correct answer in an unsorted list of N items. A classical computer will, on average, check about half the list before finding it. A quantum computer requires only about √N steps (a quadratic speedup). This is Grover’s algorithm. How does it work?
You use qubits’ superposition to explore all possibilities. Then you have a mathematical function called an oracle. You can think of it as asking, “is this the right answer?” When the oracle says “yes,” the amplitude of that specific possibility flips sign. Then, as you continue with interference, this small change shifts the wavefunction’s balance so the probability moves toward the right answer. Repeat this enough times, and measurement gives the correct result with high probability. (If you want an approximate answer to an optimization problem rather than an exact match, Grover’s algorithm isn’t the right tool; algorithms like QAOA target that case.)
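Grover's iteration is simple enough to simulate classically for small N. This sketch tracks the full list of amplitudes directly, which is exactly the bookkeeping that doesn't scale on classical hardware:

```python
import math

# Statevector simulation of Grover's algorithm (a sketch, not hardware
# code). The oracle flips the sign of the marked item's amplitude; the
# diffusion step reflects every amplitude about the average.
N, marked = 16, 3
amps = [1 / math.sqrt(N)] * N                 # uniform superposition

for _ in range(round(math.pi / 4 * math.sqrt(N))):  # ~sqrt(N) rounds
    amps[marked] *= -1                        # oracle: tag the answer
    mean = sum(amps) / N
    amps = [2 * mean - a for a in amps]       # invert about the mean

prob = amps[marked] ** 2                      # probability of measuring 3
```

After three rounds (√16 scaled by π/4), measuring this state returns the marked item about 96% of the time, versus the 1-in-16 odds of a blind guess.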
Physical Simulations:
We’ve had quantum theory for 100 years. We had the blueprint for quantum computers in the 1980s. And yet, in 2025, we still don’t know how to simulate a single molecule accurately. If you want to simulate nature, you need to play by her rules. And as Feynman said, nature doesn’t run on classical physics.
Classical computers are excellent at modeling systems that are local (things only affect one another if they’re close together), causal (outputs depend only on past and current inputs, not future ones), reversible (systems can play backwards like a movie), and deterministic (the same inputs always produce the same output).
However, the quantum world doesn’t operate like that. Quantum systems are non-local: particles can influence one another across space. They’re also probabilistic: measuring them doesn’t give the same result each time. Finally, they’re entangled: measuring one particle can change the outcome of another (Einstein called this “spooky action at a distance”).
Every time a particle can be in two states, you have to double your memory. If you have two particles, you have 4 states. If you have ten particles, you have 1,024 states. This is not a coding problem. It's a physics bottleneck. In addition, classical computers process each possibility step by step, but nature doesn't. In quantum systems, all possibilities coexist. So to simulate this, you need to run every path in parallel, something classical computers cannot do.
On the other hand, a quantum computer with n qubits can represent a superposition over 2^n states, but reading out information (via measurement) doesn’t scale the same way. You need clever algorithms to extract value without exponential resource use. So, trying to simulate a quantum system on a classical computer is like trying to paint a 3D sculpture with a 2D brush. No matter how hard you work, you’re missing the shape.
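A back-of-the-envelope sketch of that memory wall, assuming the standard 16 bytes per complex amplitude (two 64-bit floats):

```python
# Memory needed to store a full quantum statevector on a classical
# machine: 2^n complex amplitudes at 16 bytes each.
def statevector_bytes(n_qubits):
    return (2 ** n_qubits) * 16

assert statevector_bytes(10) == 16_384            # ~16 KB: trivial
assert statevector_bytes(30) == 17_179_869_184    # ~16 GB: a workstation
# Around 50 qubits, the requirement exceeds any supercomputer's memory.
```

Each extra qubit doubles the bill, which is why classical simulation stalls in the mid-tens of qubits no matter how clever the code is.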
So, we recognize that quantum computers are ideal for simulating quantum systems. Now, what kinds of physical systems should we model?
Quantum Chemistry:
When we try to understand chemical systems – molecules, drugs, catalysts – with the principles of quantum mechanics, we’re entering the world of quantum chemistry. Progress is needed in this field because classical computers fail to accelerate drug discovery. Every delay in simulating these molecules means a potential treatment isn’t reaching the person who needs it.
One example of this is FeMoCo, a wild electronic structure. This molecule matters because it’s the active site of nitrogenase, an enzyme bacteria use to fix nitrogen from the air. The best classical systems struggle to estimate its ground state energy. But if we could understand and replicate this process in nature, we could create fertilizers without the energy-hungry Haber-Bosch process. This tiny molecule represents a bigger problem: food security and climate.
Another application of quantum chemistry is in designing antibiotics. People are dying from untreatable infections. However, it’s hard to design new antibiotics because the structure-activity relationship isn’t well understood. We need quantum modeling to predict binding affinity (the strength of interaction between molecules) and molecular folding (the shape molecules take to interact with other structures).
Condensed Matter:
What happens physically when particles start to talk? Condensed matter is the field that figures this out. It’s where properties like magnetism or superconductivity appear. Behavior here is non-linear, entangled, probabilistic. If you want to improve materials used in applications such as solar panels, maglev trains, or implants, you need to understand how many-body systems – electrons, atoms, molecules – behave inside matter.
In contrast, a quantum computer is already a condensed matter system. That means when you're simulating a magnetic material, a high temperature superconductor, or a topological insulator using a quantum computer, you're recreating the physics directly. In other words, quantum computers natively simulate entanglement, interference, and phase changes.
Every delay in simulating nature with quantum computers means we're still designing materials by trial and error. Quantum computers offer the ability to model the emergent behavior of reality itself, not just guess at it. We could be predicting new phases of matter, new superconductors, new states of energy. And we're not doing it, yet.
Particle Physics:
At the smallest scales of reality, particles act like ripples in invisible fields: photons, quarks, and electrons are not standalone entities. This is the world of quantum field theory. Here, we are trying to understand what matter fundamentally is. There is potential to understand the birth of the universe, to discover better forms of cancer therapy, and to develop new forms of energy production.
The state-of-the-art method we have today operates on classical supercomputers. Lattice gauge theory involves chopping space-time into a lattice and calculating how each piece of the universe affects the others. But we’re hitting a wall.
The problem is, we only have simple 2D/3D toy models, which take months to compute and are often static snapshots in imaginary time. This is like seeing a cartoon scene of the universe instead of the real thing. If you wanted to simulate a few dozen interacting particles over 4D space-time, you're far beyond the capabilities of classical supercomputers.
One big, unique reason for this is the sign problem. In classical Monte Carlo simulations, the statistical weights standing in for quantum probabilities can become negative or complex, and classical sampling methods break down because they were never built for such values. This makes key phenomena – real-time particle collisions, confined topological effects, baryogenesis – practically impossible to simulate with classical tools.
Since quantum computers follow the same quantum rules as these systems, they could help us study field interactions in real time, in full dimensionality, and without the sign problem. This would transform how we study QCD, the Standard Model, and the origin of matter – things that no classical method will ever reach.
Quantum Machine Learning:
Machine learning has been applied everywhere from biology to finance to language. It runs on linear algebra: matrices, vectors, and dot products, executed at scale by GPUs and TPUs. So it's tempting to ask, can quantum computers do this faster?
Quantum computers operate in Hilbert spaces. These are high-dimensional complex vector spaces where superposition and interference can process information in a way that’s different from classical computers. While this gives them native advantages in some forms of linear algebra, turning this mathematical edge into real-world machine learning gains is not as straightforward.
Classical data, quantum processing:
This is where you encode classical data (images, sensor readings, CSVs) into quantum states and pass it through quantum circuits to extract features or make predictions faster. However, data input is expensive for quantum computers, because you need to encode large, unstructured, classical data into quantum states. This requires a device with quantum gates that preserve probability, quantum memory to store data, and circuits that don’t lose coherence.
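As a sketch of why encoding is nontrivial, here is amplitude encoding, one common scheme chosen purely for illustration: n classical values become the amplitudes of a log2(n)-qubit state, which must be normalized before it is a valid quantum state at all:

```python
import math

# Amplitude encoding: pack a classical vector into the amplitudes of a
# quantum state. 8 values fit in log2(8) = 3 qubits, but the vector must
# be normalized, and actually preparing the state on hardware can itself
# require a deep circuit -- the input bottleneck described above.
data = [3.0, 1.0, 4.0, 1.0, 5.0, 9.0, 2.0, 6.0]
norm = math.sqrt(sum(x * x for x in data))
amplitudes = [x / norm for x in data]

qubits_needed = math.log2(len(data))       # 8 values -> 3 qubits
assert abs(sum(a * a for a in amplitudes) - 1) < 1e-9
```

The qubit count is appealingly small, but the catch is hidden in the state-preparation circuit, not in this arithmetic.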
Quantum data, quantum learning:
A better approach is to analyze data produced by quantum systems (sensors, simulations, experiments) using quantum algorithms. The data doesn’t need to be translated. This approach offers scope to discover quantum-native structures, such as topological data and symmetry-preserving embeddings. And we can create algorithms that use entanglement or interference to detect hidden correlations faster than any classical method could.
Hybrid models:
This approach uses quantum circuits with trainable parameters and a classical optimizer, offloading different parts of the work to each. Famous examples include quantum classifiers, quantum GANs, and quantum kernel methods. Currently, these hybrid models are limited by noise, circuit depth, and training instability, but they do work on today's NISQ devices.
Core problems with Quantum ML:
Many quantum algorithms assume access to QRAM, but these devices haven’t yet been built.
Quantum evolution is governed by unitary operations, which are linear and reversible. Measurement causes probabilistic collapse. In contrast, classical ML often uses irreversible or lossy operations (ex. ReLU), which don’t directly translate to quantum algorithms.
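The mismatch can be shown in a few lines: a unitary gate (here the Pauli-X, the quantum NOT, written as a plain amplitude swap) undoes itself, while ReLU throws information away:

```python
# Unitary gates are reversible: applying a gate and then its inverse
# recovers the input exactly.
def pauli_x(state):          # quantum NOT: swap the two amplitudes
    a0, a1 = state
    return (a1, a0)

state = (0.6, 0.8)
assert pauli_x(pauli_x(state)) == state      # X is its own inverse

# ReLU is lossy: every negative input collapses to 0, so the original
# value cannot be recovered afterwards.
relu = lambda x: max(0.0, x)
assert relu(-2.0) == relu(-7.0) == 0.0       # many inputs, one output
```

Because no reversible operation can map many inputs to one output, nonlinearities like ReLU have to be smuggled in through measurement or ancilla tricks rather than translated directly.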
Lastly, decoherence time limits the duration of a single circuit run. If you want to train a quantum computer, you need to ensure it doesn’t decohere for that same period.
What’s the State of the Art?
“New directions in science are launched by new tools much more often than by new concepts.” – Freeman Dyson
Classical computers were first theorized in the 1830s by Charles Babbage. It wasn’t until the 1940s that huge rooms were dedicated to computers for basic arithmetic. After WWII, the invention of the transistor made them smaller. This is when we see the first programming languages like FORTRAN and COBOL, the first operating systems, and commercial machines like the IBM 1401. Then came the 1970s advancement of microprocessors, enabling personal computers. Computing was democratized, and the internet age was on the rise. The late 1990s connected PCs to one another via the internet, paving the way for the cloud. Today, computers are everywhere: watches, cars, refrigerators.
This timeline matters because classical computers, too, once seemed like they could never reach the hands of the average person. The first real computers were fragile, expensive, and hard to program. With decades of engineering effort, classical computing overcame its bottlenecks: computers got smaller, became programmable, and scaled around the world. While the principles and applications of quantum mechanics are somewhat understood, building quantum computers is one of the hardest engineering challenges humanity faces.
We are in the NISQ era, where quantum computers have limited qubit counts and high noise rates. No current architecture has achieved broadly useful fault-tolerant quantum computing, though some have shown quantum supremacy on niche benchmarks. Different platforms are betting on tradeoffs in speed, noise resilience, and manufacturability.
Superconducting Qubits:
How they work: tiny electrical circuits cooled to near absolute zero. This temperature lets the circuits carry current with zero resistance, enabling stable quantum states.
Josephson Junction: insulating material placed between two superconductors. It allows current to tunnel in a way that supports quantum effects.
Microwaves: pulses used to manipulate current and phase differences in the circuit.
Why absolute zero? Atoms jiggle when they’re warm, which can lead to the destruction of quantum behavior (aka decoherence). Cooling reduces the thermal noise in the system. Jiggle = noise = decoherence.
Pros:
Production: Chip fabrication techniques inspired by semiconductors.
Fast: Gates operate at nanosecond speed.
Tested: most mature platform today.
Cons:
Environment: requires dilution refrigerators.
Two-level systems: defects or impurities in the solid material can create spurious quantum systems of their own, which couple to the qubit and introduce noise.
Noise: as solid matter, the circuits are susceptible to stray fields from nearby atoms.
Companies: IBM, Google Quantum, Rigetti (partners with Zapata Computing), Alibaba DAMO
Trapped Ion qubits:
How they work: individual ions held in place with electromagnetic fields in vacuum. Lasers manipulate the ions’ energy levels to perform computations.
Vacuum: isolates atoms from other particles, dramatically reducing noise.
Lasers: reduce kinetic energy, flip quantum states, and entangle / measure ions.
Pros:
Coherence: ions can maintain quantum states for many seconds.
Accuracy: single and two-qubit-gates have low error rate.
Control: lasers can address each atom individually.
Cons:
Speed: slower than superconducting qubits.
Lasers: control complexity grows with each added qubit.
Scalability: hard to manage many lasers without special techniques (shuttling ions, optical tweezers).
Companies: IonQ, Quantinuum, Alpine Quantum, Oxford Ionics
Photonic Qubits:
How they work: single particles of light (photons) represent qubits. Computation is performed by directing photons through a maze of mirrors, beam splitters, and phase shifters.
Optical components: used to implement quantum logic operations.
Why photons? They don’t interact with most kinds of matter (ie. less likely to decohere), and are widely used in classical communication networks.
Pros:
Temperature: works at room temperature.
Speed: photons move at the speed of light (ie. well-suited for long-distance quantum communication).
Scalability: integration into chips inspired by classical methods (optical fiber networks, PICs).
Cons:
Interaction: photons don’t interact with each other, making entanglement hard.
Precision: aligning optical components for quantum operations is sensitive.
Control: difficult due to the non-interacting nature of photons, making computation circuits harder to scale.
Companies: PsiQuantum, Xanadu
Silicon Spin Qubits:
How they work: trap single electrons inside quantum dots, created on a silicon chip. Each electron’s spin represents a qubit.
Why silicon? It’s the foundation of modern computing. Similar manufacturing techniques can be used for qubits.
Pros:
Accuracy: some operations reach 99.9%+ accuracy.
Scalability: Semiconductor industry can leverage CMOS infrastructure for qubit production.
Temperature: can operate at higher temperatures.
Cons:
Noise: solid-state silicon systems have nearby atoms with nuclear spins that can cause decoherence.
Valley splitting: energy difference between quantum states can lead to errors in operations.
Companies: Intel, Quantum Motion, Silicon Quantum Computing.
Topological Qubits:
How they work: rely on non-Abelian anyons, whose braiding statistics encode quantum information in a way that’s inherently protected from local noise.
Topological qubits: instead of using the state of a single particle, they rely on the structure of special particles called anyons.
Braid: encodes information in a way analogous to logical gates in classical computing.
Pros:
Built-in error correction: since information is encoded in the global properties of the system, they’re less susceptible to decoherence.
Simplicity: inherent robustness = less resource-intensive error correction mechanisms.
Cons:
Theoretical: Practical applications are still uncertain.
Creation: specific material and conditions required to craft quasiparticles.
Exotic physics: relies on tricky effects (fractional quantum Hall effect or Majorana fermions) which are hard to observe and control.
Companies: Microsoft
Neutral Atom qubits:
How they work: individual atoms (often rubidium or cesium) are not charged, making them less prone to electromagnetic noise. These are held in place by optical tweezers, and manipulated with lasers.
Optical tweezers: super-focused laser beams that trap and arrange atoms in a grid.
Rydberg state: lasers excite an atom to a high-energy state in which its electron orbits far from the nucleus, greatly extending its electric field. This blocks nearby atoms from being excited at the same time (the Rydberg blockade), which is used to create entanglement.
Pros:
Coherence: neutral atoms are identical by nature and aren’t easily disturbed.
Scalability: optical tweezers can trap hundreds of atoms in custom layouts.
Temperature: can operate at or near room temperature.
Cons:
Complexity: lasers are hard to set up for extreme precision.
Reproducibility: Rydberg-based entanglement is hard to perform reliably over many repetitions.
Companies: QuEra, ColdQuanta, Pasqal, Atom Computing