“The future is already here, it’s just not evenly distributed.” – William Gibson
Quantum physics has been around for over a century. The first quantum computer was conceived forty years ago. And yet today, few use them. Why?
I was looking for an overview of quantum computers, but I couldn’t find anything that combined how they work, why I should care, and who’s working on them. So I figured I’d synthesize the answers to these questions myself. My goal is to help the reader assess where they can make an impact with quantum computers.

How do quantum computers work?
“Nature isn’t classical, dammit, and if you want to make a simulation of Nature, you’d better make it quantum mechanical.” – Richard Feynman
Quantum computers are mysterious partly because people don’t understand the underlying principles behind them, but also partly because their behavior is counterintuitive. This leads to a variety of consequences, but fear not. We will start with the most fundamental principles of quantum mechanics, all without quantum notation, so you can assess how you’ll position yourself in the field. However, it will help to understand how classical computers work first.
Classical computers, the machines you and I are using to read this right now, are just machines that manipulate these things called bits. A bit is the most basic unit of information. Each bit is in one of two states:
Values 0 or 1
North or South spin
Low voltage (0v) or high voltage (5v)
To understand the basic building blocks, I will use the analogy of a coin that’s on the table. Like a bit, it’s either heads or tails.
To do useful work, computers combine bits with logic gates. These guys perform basic operations on bits, such as AND (both), OR (either), NOT (invert), etc. Physically, they’re built using transistors. This architecture is predictable, reliable, and deterministic. If you run the same program twice, you’ll get the same result.
Quantum computers are different because they use qubits. These guys are just the quantum versions of a bit. Each qubit is in combination of both states:
Values 0 and 1
North and South spin
Low (0v) and high voltage (5v)
A qubit is analogous to a coin that’s spinning in the air. You don’t know if it’s heads or tails yet. But this isn’t a regular coin, it’s a quantum coin. This doesn’t mean it’s 0 and 1 at the same time. Instead, it means the qubit exists in a delicate mixture of the two possibilities, with a certain probability amplitude for each: (1/√2)[heads] + (1/√2)[tails] in this case, since probabilities come from squaring amplitudes. This is called a superposition of quantum states.
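If you like seeing this in code, here’s a tiny numpy sketch of a qubit as a pair of amplitudes (my own illustration; real hardware obviously doesn’t store amplitudes like this):

```python
import numpy as np

# A single qubit as a vector of two amplitudes: [amplitude of heads/0, amplitude of tails/1].
# For an equal superposition each amplitude is 1/sqrt(2), not 1/2 --
# probabilities come from squaring amplitudes, and (1/sqrt(2))**2 = 1/2.
qubit = np.array([1 / np.sqrt(2), 1 / np.sqrt(2)])

probabilities = np.abs(qubit) ** 2            # Born rule: probability = |amplitude|^2
print(probabilities)                          # [0.5 0.5]
assert np.isclose(probabilities.sum(), 1.0)   # amplitudes must stay normalized
```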
Now imagine you wanted the coin to land on heads. If you could swirl some air currents around the coin, you could push it slightly toward heads as it falls. In quantum mechanics this is called interference. With interference, we can guide the probability towards the right answer.
As we try to boost the outcome of heads and cancel the outcome of tails, we create a quantum algorithm. These guys operate on basic math: they can add up or cancel out. What are we adding and subtracting? The amplitudes of different quantum states. A good quantum algorithm will increase the probability of right answers (constructive interference) and cancel out the wrong answers (destructive interference).
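Here’s the simplest interference demo I know, sketched in numpy: applying the Hadamard gate twice returns a qubit to where it started, because the amplitude paths leading to the other outcome cancel exactly.

```python
import numpy as np

# Hadamard gate: turns a definite state into an equal superposition.
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)

zero = np.array([1.0, 0.0])   # the |0> state (coin lying heads-up on the table)

once = H @ zero               # equal superposition: [0.707..., 0.707...]
twice = H @ once              # interference: the two paths to |1> cancel,
                              # the two paths to |0> add up

print(np.round(twice, 6))     # [1. 0.] -- back to |0> with certainty
```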
Now imagine throwing two spinning coins (AKA two qubits) in the air. In the classical world, you wouldn’t expect a correlation between the two. However, in quantum mechanics, their outcomes can be linked. This is called entanglement.
Entanglement lets quantum computers encode relationships between data. If coin A lands heads, then coin B will also land heads — or tails, depending on how you entangle them — but you can’t control either outcome. You’ll only see the correlation once both are measured.
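A numpy sketch of an entangled pair (simulating the math, not real qubits): the Bell state gives each “coin” a 50/50 chance on its own, yet the two outcomes always agree.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two-qubit state as 4 amplitudes, ordered |00>, |01>, |10>, |11>.
# The Bell state (|00> + |11>)/sqrt(2): each coin alone is 50/50,
# but the two outcomes are perfectly correlated.
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)

probs = np.abs(bell) ** 2
samples = rng.choice(4, size=1000, p=probs)   # simulated measurements

# Every sample is 0 (|00>) or 3 (|11>): the coins always land the same way.
assert set(samples) <= {0, 3}
```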
What does measurement mean? Let's go back to the one coin (which is the same as a qubit). Imagine I toss it in the air. Then I ask you, “what’s the value?” You might think that if you wait until it falls on the table, you’ll know its value. However, that’s not how the quantum coin works. You don’t know whether it’s heads or tails until you actually look at the coin. This is called measurement.
To understand measurement, let’s replay this toss. This time, imagine adding a cloud to completely hide the coin we just flipped. During this time, the coin continues to have a spin. In fact, it’s only when you look that the spin collapses (ie. the cloud vanishes, and you get information on the quantum state).
The cloud is important because it represents mathematical probabilities. This is called the wavefunction. If you asked, “how likely is tails?” the wavefunction would tell you (strictly, the probability comes from squaring the amplitude; this will make more sense if you are interested in the math behind quantum computing). [1]

In a perfect world, you decide exactly when to measure the qubit (ie. choosing when to catch a spinning coin). But in the real world, quantum systems are fragile. Heat, noise, and stray signals from the environment are like a storm hitting the coin mid-air. Instead of neatly catching it, the storm knocks the spin off course and scrambles the patterns that made it quantum. The interference effects vanish, and the system starts to behave like an ordinary classical mixture of possibilities. This is what we call decoherence.
Decoherence doesn’t tell us which of those possibilities will actually happen. It tells us how the system starts to look classical, with certain probabilities for each outcome, but it never picks one. In other words, it explains why quantum behavior disappears, not how definite outcomes emerge (this is the quantum measurement problem; it’s where the Many-Worlds Interpretation, AKA the Multiverse, comes from).

Quantum computers fight decoherence constantly. The goal is to keep the coin spinning long enough to complete the algorithm and measure it at the right moment. This requires engineering the environment to keep the coin(s) in pristine condition.
One of the factors that cause the coin to stop spinning deserves special attention: noise. It’s what drives the process of decoherence. Noise is any unwanted disturbance that introduces errors into the quantum system. In the coin analogy, imagine noise as a tornado that knocks the coin out of its spin.

Recap of the principles:
Superposition: qubit exists in a linear combination of basis states (ie. coin spinning in midair).
Interference: Adding or cancelling the probability amplitudes of different quantum states (ie. air currents steer the spin toward desirable outcomes).
Entanglement: qubits can exist in a joint state such that the measurement of one affects the outcome of the other (ie. two coins are linked so that knowing one tells you information about the other).
Measurement: observing a qubit collapses its state from a superposition into a definite basis state (ie. looking freezes the coin into heads or tails).
Decoherence: environment scrambles the delicate superposition, making it behave like a classical probability, not because it collapses, but because interference is lost (ie. like a storm knocking a spinning coin into unpredictable motion).
Noise: cause of random errors in the quantum system (ie. wind currents that we didn’t create).
When would I use a quantum computer?
“We can only see a short distance ahead, but we can see plenty there that needs to be done” – Alan Turing
Humanity knows where the frontier of computing lies. The most important questions in science – how nature conducts her processes, what the universe's first moments were, how the mind works – are bottlenecked not by imagination, but by the limits of classical computation.
Theory says a quantum computer can do anything a classical computer can do. However, as of 2025, we aren’t even close to doing that. Therefore, as Turing said, we must focus on what we can see in front of us.
Let’s first think about what problems classical computers fail to solve. Some of these areas have progressed with quantum computers. However, keep in mind there’s more to be discovered.
Structured Math Problems:
Let’s say you’re given the number 91 and asked: "Which two numbers multiply together to make this?" You might quickly try: 2? Nope. 3? Nope. 7? Maybe. 91 ÷ 7 = 13! You can do this because 91 is a small number. But what if I gave you a number with a million digits? The guesses required grow exponentially as the number you’re given gets bigger.
That’s because classical computers try one possibility at a time, and they don’t know which guesses are more promising than others.
When you have to break numbers into prime pieces, you are solving a problem of factoring. It’s what modern encryption is based on: we can hide a message using a product of two big prime numbers, because reversing the factoring is slow.
However, quantum computers can solve this because they don’t guess one-by-one. They use superposition to explore many guesses at once, and interference to cancel the wrongs / amplify the rights. That’s what Shor’s algorithm does.
Shor’s Algorithm:
It turns factoring into period finding: find the period r of f(x) = a^x mod N, then use r to recover the prime factors. Superposition evaluates f over many exponents at once. Interference, as usual, amplifies the correct answers and cancels the wrong ones. Finally, the quantum Fourier transform decomposes the resulting state to reveal the hidden period in the wavefunction.
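To see why period finding implies factoring, here’s a classical Python sketch with toy numbers. It brute-forces the period, which is exactly the step Shor’s algorithm accelerates with superposition and the quantum Fourier transform:

```python
from math import gcd

def classical_period(a, N):
    """Brute-force the period r of f(x) = a^x mod N.
    This loop is what Shor's algorithm replaces with a quantum speedup."""
    x, value = 1, a % N
    while value != 1:
        x += 1
        value = (value * a) % N
    return x

def factor_via_period(a, N):
    r = classical_period(a, N)
    if r % 2 == 1:
        return None                    # odd period: try a different a
    half = pow(a, r // 2, N)           # a^(r/2) mod N
    return gcd(half - 1, N), gcd(half + 1, N)

# N = 15, a = 7: 7^1=7, 7^2=4, 7^3=13, 7^4=1 (mod 15), so the period r = 4.
# gcd(7^2 - 1, 15) = 3 and gcd(7^2 + 1, 15) = 5 recover the prime factors.
print(factor_via_period(7, 15))        # (3, 5)
```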
Optimization / Unstructured problems:
Let’s say I gave you a list of cities. I ask you to go on a shortest round-trip route that visits every city once and comes back to wherever you started. How would you do it?
This is called the traveling salesman problem. You can do it easily with 5 cities, but what if I told you to visit every city in the world (tens of thousands)?
Classical computers get overwhelmed here for the same reason again: they take too long to check all routes one by one, and they don’t know which route is likely to be better.
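A quick brute-force sketch in Python makes the blow-up tangible (the distance matrix is made up for illustration):

```python
from itertools import permutations
from math import factorial, inf

# Brute-force TSP: check every ordering of the cities.
# The number of distinct round trips grows factorially with city count.
def shortest_tour(dist):
    n = len(dist)
    best = inf
    for perm in permutations(range(1, n)):        # fix city 0 as the start
        tour = (0,) + perm + (0,)
        length = sum(dist[a][b] for a, b in zip(tour, tour[1:]))
        best = min(best, length)
    return best

# A toy 4-city distance matrix (invented for this example).
dist = [[0, 2, 9, 10],
        [2, 0, 6, 4],
        [9, 6, 0, 3],
        [10, 4, 3, 0]]
print(shortest_tour(dist))   # 18 -- easy with 4 cities

# But the route count explodes: for just 25 cities there are 24! orderings.
print(factorial(24))         # ~6.2e23 routes to check
```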
Unstructured problems say, “Try each possibility. Which ones work?” In these kinds of problems, there’s no structure to exploit. The space of possible answers has no obvious shape, and you’re trying to figure out what the best result is. The number of possible routes grows combinatorially.

Grover’s Algorithm:
Let’s say you’re looking for a single correct answer out of a list of N components. If you use a classical computer, on average, it’ll take you about half the list before finding it. Now if you used a quantum computer, it would require only √N steps (quadratic speedup). This is Grover’s algorithm. How does it work?
You use qubits’ superposition to explore all possibilities. Then, you have a mathematical function called an oracle. You can think of it as asking the oracle, “is this the right answer?” When the oracle says “yes,” the amplitude of that specific possibility flips sign. Then, as interference continues, this small change shifts the wavefunction’s balance so the probability moves towards the right answer. Repeat this enough times, and measurement gives the correct result with high probability. (Grover’s algorithm provides a quadratic speedup, but for broader approximate optimization tasks, algorithms like QAOA are typically preferred.)
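Since Grover’s algorithm is just linear algebra on amplitudes, the steps above can be simulated with numpy. This is a sketch of the math, not a quantum program:

```python
import numpy as np

def grover(n_items, target, iterations):
    """Simulate Grover's search on a vector of amplitudes
    (no quantum hardware, just the algebra the algorithm performs)."""
    amps = np.full(n_items, 1 / np.sqrt(n_items))   # uniform superposition
    for _ in range(iterations):
        amps[target] *= -1                          # oracle: flip the target's sign
        amps = 2 * amps.mean() - amps               # diffusion: invert about the mean
    return np.abs(amps) ** 2                        # measurement probabilities

N = 16
probs = grover(N, target=11, iterations=3)   # ~ (pi/4) * sqrt(16) = 3 rounds
print(round(probs[11], 3))                   # ~0.961: the target now dominates
```

After just three iterations (roughly √N), the marked item carries over 96% of the probability, versus an average of N/2 = 8 classical checks.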
Physical Simulations:
We’ve had quantum theory for 100 years. We had the blueprint for quantum computers in the 1980s. And yet, in 2025, large, strongly correlated systems – where electron interactions are deeply entangled – remain out of reach. If you want to simulate nature, you need to play by her rules. And as Feynman said, nature doesn’t run on classical physics.
Classical computers are excellent at modeling systems that are local: things only affect one another if they’re close together; causal: outputs depend only on past and current inputs (not future ones); reversible: systems can play backwards like a movie; and deterministic: the same input always gives the same answer.
However, the quantum world doesn’t operate like that. Quantum systems are non-local: particles can influence one another across space. They’re also probabilistic: the output isn’t the same result each time. Finally, they’re entangled: measuring one particle can change the outcome of another (Einstein referred to this as “spooky action at a distance.”).
Every time a particle can be in two states, you have to double your memory. If you have two particles, you have 4 states. If you have ten particles, you have 1,024 states. This is not a coding problem. It's a physics bottleneck. In addition, classical computers process each possibility step by step, but nature doesn't. In quantum systems, all possibilities coexist. So to simulate this, you need to run every path in parallel, something classical computers cannot do.
On the other hand, a quantum computer with n qubits can represent a superposition over 2^n states, but reading out information (via measurement) doesn’t scale the same way. You need clever algorithms to extract value without exponential resource use. So, trying to simulate a quantum system on a classical computer is like trying to paint a 3D sculpture with a 2D brush. No matter how hard you work, you’re missing the shape.
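A back-of-the-envelope sketch of that 2^n wall, counting the memory a classical machine needs just to store the full statevector:

```python
# Memory needed to hold a full n-qubit statevector on a classical machine:
# 2^n complex amplitudes at 16 bytes each (complex128).
def statevector_bytes(n_qubits):
    return (2 ** n_qubits) * 16

for n in (10, 30, 50):
    print(n, "qubits:", statevector_bytes(n) / 2**30, "GiB")

# 10 qubits: ~0.000015 GiB -- trivial
# 30 qubits: 16 GiB        -- a well-equipped workstation
# 50 qubits: ~16.8 million GiB (~16 PiB) -- beyond any single machine's memory
```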
So, we recognize that quantum computers are ideal for simulating quantum systems. Now, what kinds of physical systems should we model?

Quantum Chemistry:
When we try to understand chemical systems – molecules, drugs, catalysts – with the principles of quantum mechanics, we’re entering the world of quantum chemistry. Progress is needed in this field because classical computers fail to accelerate drug discovery. Every delay in simulating these molecules means a potential treatment isn’t reaching the person who needs it.
One example of this is FeMoCo, a molecule with a wild electronic structure. It matters because it’s the active site of nitrogenase, an enzyme bacteria use to fix nitrogen from the air. The best classical systems struggle to estimate its ground state energy. But if we could understand and replicate this process in nature, we could create fertilizers without the energy-hungry Haber-Bosch process. This tiny molecule represents a bigger problem: food security and climate.
Another application of quantum chemistry is in designing antibiotics. People are dying from untreatable infections. However, it’s hard to design new antibiotics because the structure-activity relationship isn’t well understood. We need quantum modeling to predict binding affinity (the strength of interaction between molecules) and molecular folding (the shape molecules take to interact with other structures).

Condensed Matter:
What happens physically when particles start to talk? Condensed matter is the field that figures this out. It’s where properties like magnetism or superconductivity appear. Behavior here is non-linear, entangled, probabilistic. If you want to improve materials used in applications such as solar panels, maglev trains, or implants, you need to understand how many-body systems – electrons, atoms, molecules – behave inside matter.
In contrast, a quantum computer is already a condensed matter system. That means when you're simulating a magnetic material, a high temperature superconductor, or a topological insulator using a quantum computer, you're recreating the physics directly. In other words, quantum computers natively simulate entanglement, interference, and phase changes.
Every delay in simulating nature with quantum computers means we're still designing materials by trial and error. Quantum computers offer the ability to model the emergent behavior of reality itself, not just guess at it. We could be predicting new phases of matter, new superconductors, new states of energy. And we're not doing it, yet.

Particle Physics:
At the smallest scales of reality, particles act like ripples in invisible fields: photons and quarks are not standalone entities. This is the world of quantum field theory. Here, we are trying to understand matter itself. There is potential to understand the birth of the universe, to discover better forms of cancer therapy, and to develop new forms of energy production.
The state-of-the-art method we have today operates on classical supercomputers: lattice gauge theory, which involves chopping space-time into a lattice and calculating how each piece of the universe affects the others. But we’re hitting a wall.
The problem is, we only have simple 2D/3D toy models, which take months to compute and are often static snapshots in imaginary time. This is like seeing a cartoon scene of the universe instead of the real thing. If you wanted to simulate a few dozen interacting particles over 4D space-time, you're far beyond the capabilities of classical supercomputers.
One big, unique reason for this is the sign problem. In classical Monte Carlo simulations, the statistical weights derived from quantum amplitudes can become negative or complex. Classical computers break down because sampling methods aren't meant for such values. This makes key phenomena – real-time particle collisions, confined topological effects, baryogenesis – practically impossible to simulate with classical tools.
Since quantum computers follow the same quantum rules as the systems they model, they could also help us study field interactions in real time, in full dimensionality, and without the sign problem. This would transform how we study QCD, the Standard Model, and the origin of matter – things that no classical method will ever reach.
Quantum Machine Learning:
Machine learning has been applied everywhere from biology to finance to language. It runs on linear algebra: matrices, vectors, and dot products, executed at scale by GPUs and TPUs. So it's tempting to ask, can quantum computers do this faster?
Quantum computers operate in Hilbert spaces. These are high-dimensional complex vector spaces where superposition and interference can process information in a way that’s different from classical computers. While this gives them native advantages in some forms of linear algebra, turning this mathematical edge into real-world machine learning gains is not as straightforward.

Classical data, quantum processing:
This is where you encode classical data – images, sensor readings, CSVs – into quantum states and pass them through quantum circuits to better extract features or make predictions faster. However, data input is expensive for quantum computers. That's because you need to encode large, unstructured, classical data into quantum states. This requires a device with quantum gates that preserve probability, quantum memory to store data, and circuits that don’t lose coherence.

Quantum data, quantum learning:
A better approach is to analyze data produced by quantum systems – sensors, simulations, experiments – using quantum algorithms. The data doesn’t need to be translated. This approach offers scope to discover quantum-native structures, such as topological data and symmetry-preserving embeddings. And we can create algorithms that use entanglement or interference to detect hidden correlations faster than any classical method could.

Hybrid models:
This approach uses quantum circuits with trainable parameters and a classical optimizer to offload different parts of the work. Famous examples include quantum classifiers, quantum GANs, and quantum kernel methods. Currently, these hybrid models are limited by noise, circuit depth, and training instability, but they do work on today's NISQ devices.

Core problems with Quantum ML:
Many quantum algorithms assume access to QRAM, but these devices haven’t yet been built.
Quantum evolution is governed by unitary operations, which are linear and reversible. Measurement causes probabilistic collapse. In contrast, classical ML often uses irreversible or lossy operations (ex. ReLU), which don’t directly translate to quantum algorithms.
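A small numpy sketch of this contrast (my own illustration): a quantum gate like the Hadamard is unitary and therefore reversible, while ReLU throws information away.

```python
import numpy as np

# Quantum gates are unitary: U† U = I, so every operation can be undone.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
assert np.allclose(H.conj().T @ H, np.eye(2))   # unitary -> reversible

# ReLU is not: distinct inputs collapse to the same output,
# so the original value cannot be recovered afterwards.
relu = lambda x: max(0.0, x)
print(relu(-1.0), relu(-2.0))   # 0.0 0.0 -- information lost
```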
Lastly, decoherence time limits the duration of a single circuit run. If you want to train a quantum computer, you need to ensure it doesn’t decohere for that same period.

What are the State of the Art Quantum Computers?
“New directions in science are launched by new tools much more often than by new concepts.” — Freeman Dyson
Classical computers were first theorized in the 1830s by Charles Babbage. It wasn’t until the 1940s that huge rooms were dedicated to computers for basic arithmetic. Then, they were made smaller post WWII due to the invention of transistors. This is where we see the first programming languages like FORTRAN/COBOL, the operating systems, and commercial machines like the IBM 1401. Then there was the 1970s advancement of microprocessors, enabling personal computers. Computing was democratized, and the internet age was on the rise. The late 1990s led to connecting PCs to the cloud via the internet. Today, computers are everywhere: watches, cars, refrigerators.
The reason why this timeline is important is because classical computers once seemed impossible to be in the hands of the average person. The first real computers were fragile, expensive, and hard to program. With decades of engineering effort, classical computing overcame its bottlenecks. They got smaller, became programmable, and scaled around the world. While the principles and applications of quantum mechanics are somewhat understood, building quantum computers is one of the hardest engineering challenges humanity faces.
We are in the NISQ era, where quantum computers have limited qubit counts and high noise rates. No current architecture has achieved broadly useful fault-tolerant quantum computing, though some have shown quantum supremacy on niche benchmarks. Different platforms are betting on tradeoffs in speed, noise resilience, and manufacturability.
Superconducting Qubits:

How they work: tiny electrical circuits cooled to near absolute zero. This temperature allows circuits to carry current with zero resistance, enabling stable quantum states.
Josephson Junction: insulating material placed between two superconductors. It allows current to tunnel in a way that supports quantum effects.
Microwaves: pulses used to manipulate current and phase differences in the circuit.
Why absolute zero? Atoms jiggle when they’re warm, which can lead to the destruction of quantum behavior (aka decoherence). Cooling reduces the thermal noise in the system. Jiggle = noise = decoherence.
Pros:
Production: Chip fabrication techniques inspired by semiconductors.
Fast: Gates operate at nanosecond speed.
Tested: most mature platform today.
Cons:
Environment: requires dilution refrigerators.
Two-level systems: defects or impurities in the solid material can host stray quantum states that couple to the qubit and create noise.
Noise: solid form of matter is susceptible to other atoms’ fields.
Companies: IBM, Google Quantum, Rigetti (partners with Zapata Computing), Alibaba DAMO
Trapped Ion qubits:
How they work: individual ions held in place with electromagnetic fields in vacuum. Lasers manipulate the ions’ energy levels to perform computations.
Vacuum: isolates atoms from other particles, reducing noise a lot.
Lasers: reduce kinetic energy, flip quantum states, and entangle / measure ions.
Pros:
Coherence: ions can maintain quantum states for many seconds.
Accuracy: single and two-qubit-gates have low error rate.
Control: lasers can address each atom individually.
Cons:
Speed: slower than superconducting qubits.
Lasers: complexity increases per qubit.
Scalability: hard to manage many lasers without special techniques (shuttling ions, optical tweezers).
Companies: IonQ, Quantinuum, Alpine Quantum Technologies, Oxford Ionics
Photonic Qubits:
How they work: single particles of light (photons) represent qubits. Computation is performed by directing photons through a maze of mirrors, beam splitters, and phase shifters.
Optical components: used to implement quantum logic operations.
Why photons? Photons interact weakly with most kinds of matter. This gives them resistance to decoherence in fibers or free space, but also makes entangling them more technically challenging.
Pros:
Temperature: works at room temperature.
Speed: move at the speed of light (ie. well-suited for long-distance computing).
Scalability: integration into chips inspired by classical methods (optical fiber networks, PICs).
Cons:
Interaction: photons don’t interact with each other, making entanglement hard.
Precision: aligning optical components for quantum operations is sensitive.
Control: difficult due to the non-interacting nature of photons, making computation circuits harder to scale.
Companies: PsiQuantum, Xanadu
Silicon Spin Qubits:
How they work: trap single electrons inside quantum dots, created on a silicon chip. Each electron’s spin represents a qubit.
Why silicon? It’s the foundation of modern computing. Similar manufacturing techniques can be used for qubits.
Pros:
Accuracy: some operations reach 99.9%+ accuracy.
Scalability: Semiconductor industry can leverage CMOS infrastructure for qubit production.
Temperature: can operate at higher temperatures.
Cons:
Noise: solid-state silicon systems have nearby atoms with nuclear spins that can cause decoherence.
Valley splitting: energy difference between quantum states can lead to errors in operations.
Companies: Intel, Quantum Motion, Silicon Quantum Computing.
Topological Qubits:
How it works: rely on non-Abelian anyons, whose braiding statistics encode quantum information in a way that’s inherently protected from local noise.
Topological qubits: instead of using the state of a single particle, they rely on the structure of special particles called anyons.
Braid: encodes information in a way analogous to logical gates in classical computing.
Pros:
Built-in error correction: since information is encoded in the global properties of the system, they’re less susceptible to decoherence.
Simplicity: inherent robustness = less resource-intensive error correction mechanisms.
Cons:
Theoretical: Practical applications are still uncertain.
Creation: specific material and conditions required to craft quasiparticles.
Exotic physics: relies on tricky effects (fractional quantum Hall effect or Majorana fermions) which are hard to observe and control.
Companies: Microsoft
Neutral Atom qubits:
How they work: individual atoms (often rubidium or cesium) are not charged, making them less prone to electromagnetic noise. These are held in place by optical tweezers, and manipulated with lasers.
Optical tweezers: super-focused laser beams that trap and arrange atoms in a grid.
Rydberg state: lasers excite an atom to a high-energy (Rydberg) state, ballooning its electron cloud. This blockade prevents nearby atoms from being excited at the same time, which is used to create entanglement.
Pros:
Coherence: neutral atoms are identical by nature, and aren’t easily disturbed.
Scalability: optical tweezers can trap hundreds of atoms in custom layouts.
Temperature: atoms themselves are laser-cooled, but apparatus around them can operate at or near room temperature.
Cons:
Complexity: lasers are hard to set up for extreme precision.
Reproducibility: Rydberg entanglement is hard to perform repeatedly with high fidelity.
Companies: QuEra, ColdQuanta, Pasqal, Atom Computing
What Are the Engineering Challenges?
“Scientists investigate that which already is. Engineers create that which has never been.” — Theodore von Kármán
It’s clear that quantum computers aren’t just bigger, faster versions of classical ones. To build one, you need to recreate physics that we don’t encounter day-to-day (examples are absolute zero, vacuum chambers, microwave pulses). While companies vary on their approach, there are a few common engineering challenges all companies face.
Fidelity vs. scalability:
Fidelity: quantifies how reliable quantum computations are. High fidelity means quantum gate operations are performed accurately.
Scalability: increasing the number of qubits and operations to the point where quantum computers can actually tackle real-world problems. It’s the bridge between current platforms and the transformative power for applications discussed above.
Challenge:
As you scale the number of qubits, you increase the chance of error in your operation. More qubits = more noise sources.
Many platforms can’t scale qubits without scaling control systems. More control systems = more crosstalk (unwanted qubit interactions).
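A rough sketch of why fidelity and scale pull against each other, assuming errors simply compound per gate (a simplification; real error models are richer):

```python
# If each gate succeeds with probability f, a circuit of g gates
# succeeds with roughly f ** g -- errors compound multiplicatively.
def circuit_success(gate_fidelity, n_gates):
    return gate_fidelity ** n_gates

print(round(circuit_success(0.999, 100), 3))      # ~0.905: fine for small circuits
print(round(circuit_success(0.999, 10_000), 6))   # ~0.000045: useless at scale
```

Even an excellent 99.9% gate fidelity collapses over deep circuits, which is why error correction (next section) is unavoidable.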
Error correction:
Classical error correction:
A bit is either in state 0 or 1. However, bits are also fragile, because they are stored in physical systems, which are noisy. The solution we came up with is to send more than we need. This is called redundancy. You add backup bits, check digits, votes – if something goes wrong, you can catch it this way. However, every bit takes up space, time, and power, so the challenge is to balance how much extra data we send.

Quantum error correction:
Unlike classical bits, qubits cannot be copied (no cloning theorem), cannot be measured directly, and are not just flips between 0 to 1. So you have to protect information without looking at it + fix the damage without knowing what the original message was.
To protect one logical qubit, we encode it into many physical qubits. This way, information is distributed across a system. To catch errors, we measure helper qubits (ancilla) that are entangled with the data. These indirect checks reveal just enough to detect what went wrong. This process gives us a syndrome, which is basically a fingerprint of the error. From there, we can apply quantum gates to reverse the damage and restore the logical qubit to its intended state.
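Here’s a classical sketch of that bookkeeping, using the parity checks of the 3-qubit bit-flip code. In a real device, the parities are measured via entangled ancilla qubits without ever reading the data value; plain Python can only mimic the logic:

```python
# One logical bit is spread across three physical bits. Two parity checks
# (the "syndrome") locate a single flip without reading the data itself.
def syndrome(bits):
    return (bits[0] ^ bits[1], bits[1] ^ bits[2])

def correct(bits):
    s = syndrome(bits)
    flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(s)   # which bit flipped?
    if flip is not None:
        bits[flip] ^= 1
    return bits

encoded = [1, 1, 1]          # logical 1, encoded redundantly
encoded[2] ^= 1              # noise flips one physical bit -> [1, 1, 0]
print(syndrome(encoded))     # (0, 1): the checks point at bit 2
print(correct(encoded))      # [1, 1, 1]: restored without inspecting the value
```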
However, if your qubits are too noisy, the correction process can actually introduce more errors than it fixes. The point where the benefits of QEC outweigh the errors introduced by it is called the fault-tolerance threshold.
Tools and Education:
The moment when you can use a quantum computer to outperform any classical one on a given task – whether in accuracy, speed, or efficiency – is called quantum supremacy.
Right now, quantum computing is stuck in discovery mode. We understand the physics, and we’ve built some hardware. But most applications are still theoretical. The average person can't use them meaningfully. It's just like the early days of classical computing, a world reserved for researchers. We need a Gutenberg moment. How do we do this? We can look back at classical computers for a guide.
Classical computing began as a centralized technology, limited by bulky hardware like vacuum tubes, highly specialized operators, and institutional control by governments and universities. But over time, breakthroughs in physics (transistors, integrated circuits), engineering (microprocessors, standardized parts), and software (high-level languages, operating systems) made computers smaller, cheaper, and easier to use. The threshold to participate dropped also because knowledge itself became decentralized: manuals, classes, open-source code, and internet communities allowed anyone with curiosity to hack.
Quantum computing is tracing a similar arc, but we're still early in the journey. We've seen major breakthroughs in quantum physics with the development of topological qubits, superconducting circuits, trapped ions, but the systems still remain fragile and bulky, just like early mainframes. There's no robust, scalable modular hardware unit that can be mass produced and widely adopted. On the software side, the field still lacks stable abstractions, quantum operating systems, or accessible compilers that can be used by anyone from a 5 year old to a university student to my grandma.
Education is another bottleneck. Most quantum learning pathways are still locked behind heavy math and physics prerequisites, and culturally, knowledge is concentrated in corporate labs and PhD programs, with little room for tinkerers or open source experimentation. To surpass the revolution of classical computing, quantum needs breakthroughs not just in physics, but also in engineering, software, education, and community.
Who Am I, and Why Did I Write This Piece?
“The best way to predict the future is to create it” — attributed to Abraham Lincoln
I’m not a physics PhD. This piece began from curiosity that grew as I overheard friends discussing their work in quantum computing. I wanted to understand what the field actually is, why it matters, where progress is occurring. Then I realized that maybe others are wondering too. My intention with this piece is to help you get a sense of what the field looks like.
But something else happened as I wrote. My motives deepened. I thought about where the world would have been, had great minds like Turing and Einstein lived longer before their work ended. What could we have built if they had just a few more decades? I don't know, but I do know this. We don't have time to waste. Quantum computing will unlock things that we can't even name yet. The only way to find out is to understand more now.
My motives also stem from science fiction (Three Body Problem, 1984). If extraterrestrial species were to land on Earth in a couple of centuries, what technologies would you want to prepare? Imagine if only a few people understood how to develop pharmaceutical breakthroughs, energy technologies, and knowledge of the early universe. Whose hands would you want this technology in? [2]
I started this piece by critiquing how old quantum mechanics and quantum computing was. But when you look at the scope of civilization, you realize this field is still young. If you want to create the future of the quantum world, or if you are already involved within it, please reach out. I would love to find out about you and your research!
Footnotes:
[1] If you want to learn more, I’d start with Feynman’s lectures on Quantum Behavior
[2] I realize no optimistic sci fi comes to mind when thinking about a future involving Quantum Computers. Let me know if you have recommendations!