What's a quantum computer?
I keep seeing this term but I have no idea what it means.
And what does it actually mean in practice? For example, what can it actually do? It seems to only be used for algorithms and such, not personal computing. I assume I don't understand because I'm unfamiliar with quantum mechanics.
First you have to relax your definition of "computer". We think of computers as silicon chips, but that's just one way to make a computer. A computer is fundamentally us taking advantage of physical processes in such a way that the result of some series of those processes has a useful relationship with the input.
That's why you can make computers capable of adding with just flowing water and some boxes. Or make an entire CPU in Minecraft redstone.
Now, in practice, we've discovered that we can make extremely small and reliable logic gates with transistors and electrical signals, and this allows us to make very powerful general purpose computing units that take up little space.
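To make that concrete, here's a toy Python sketch (my own illustration, nothing to do with real hardware) that builds addition out of nothing but a NAND gate, which is the same trick the water computer and the redstone CPU are pulling:

```python
# Toy illustration: a full adder built only from NAND gates, chained into a
# ripple-carry adder. Anything that physically behaves like NAND can compute.
def NAND(a, b):
    return 0 if (a and b) else 1

def AND(a, b):
    return NAND(NAND(a, b), NAND(a, b))

def OR(a, b):
    return NAND(NAND(a, a), NAND(b, b))

def XOR(a, b):
    c = NAND(a, b)
    return NAND(NAND(a, c), NAND(b, c))

def full_adder(a, b, carry_in):
    partial = XOR(a, b)
    total = XOR(partial, carry_in)
    carry_out = OR(AND(a, b), AND(partial, carry_in))
    return total, carry_out

def add(x, y, bits=8):
    """Add two small integers one bit at a time, like a ripple-carry adder."""
    result, carry = 0, 0
    for i in range(bits):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result

print(add(23, 42))  # 65
```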
A quantum computer is a computer that takes advantage of quantum mechanics to produce useful results. That's special because quantum mechanics encodes more information than classical mechanics.
A normal bit can be in one of two states, usually written 0 and 1. A quantum bit, or qubit, is a linear combination of those two states: a|0> + b|1> is a valid qubit state for any complex numbers a and b (normalized so that |a|^2 + |b|^2 = 1), not just pure |0> or pure |1>. That is, a single qubit can already be in infinitely many distinct states (points in a two-dimensional complex vector space).
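If it helps, here's a rough numpy sketch of that idea (a toy simulation, not how real hardware works): a qubit is just a normalized pair of complex numbers, and reading it out gives 0 or 1 with probabilities set by those numbers.

```python
import numpy as np

def make_qubit(a, b):
    """A qubit state is a length-2 complex vector, normalized so |a|^2 + |b|^2 = 1."""
    v = np.array([a, b], dtype=complex)
    return v / np.linalg.norm(v)   # e.g. (3, 2) and (3000, 2000) describe the same state

def measure(qubit, shots=10_000):
    """Measuring gives 0 with probability |a|^2 and 1 with probability |b|^2."""
    probs = np.abs(qubit) ** 2
    return np.random.choice([0, 1], size=shots, p=probs)

q = make_qubit(3, 2)
samples = measure(q)
print(np.bincount(samples) / len(samples))   # roughly [0.69, 0.31], i.e. 9/13 and 4/13
```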
The appeal of quantum computers is that there are classes of problems which are too hard for us to compute classically. Some of those are intentionally hard, like some encryption algorithms. Because of how quantum mechanics works, hypothetically you could encode the input as a superposition of every possible input, run the calculation, and then you have every possible output.
All personal computing is algorithms, just with a pretty layer on top. How could you even calculate the correct size of a window without algorithms?
I like your explanation about computation and qubits, but there is one part that's a bit misleading.
This is technically true, but it's also not how quantum computers work. (The misconception is common enough that Scott Aaronson has a permanent banner on his blog imploring, "If you take nothing else from this blog: quantum computers won't solve hard problems instantly by just trying all solutions in parallel.") The issue is that, although a quantum computer can simultaneously simulate all possible solutions, you only get one of those results when you actually make the measurement ("collapse the wave function", so to speak). On average, this approach will do no better than a classical algorithm.
Rather, quantum algorithms exploit another property of quantum mechanics: entanglement. For some particular problems, it is possible to design a circuit such that the wrong solutions destructively interfere with each other (like two waves annihilating each other), which consequently makes it more likely that you measure the correct solution. However, quantum algorithms do not generally (ever?) guarantee that a solution is correct, only that it is correct with some probability. This probability can be increased by tuning the circuit or rerunning it multiple times.
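To give a flavour of what that interference looks like, here's a small numpy statevector sketch of Grover-style amplitude amplification. It's a toy with assumed numbers (10 qubits, index 123 declared the "correct" answer, the oracle reduced to a sign flip), but it shows the wrong answers cancelling out and the right one being boosted:

```python
import numpy as np

n, marked = 10, 123                 # assumptions: 10 qubits, item 123 is the "correct" one
N = 2 ** n
state = np.full(N, 1 / np.sqrt(N))  # start in a uniform superposition over all N inputs

iterations = int(np.pi / 4 * np.sqrt(N))   # optimal number of Grover iterations, ~25 here
for _ in range(iterations):
    state[marked] *= -1                    # oracle: flip the sign of the correct answer
    state = 2 * state.mean() - state       # diffusion: "inversion about the mean"

print(iterations, state[marked] ** 2)      # 25 iterations, probability of measuring 123 ≈ 0.999
```

Note that even after those iterations you still only *probably* measure the right answer, which is exactly the "correct with some probability" caveat above.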
Helpfully (for the usefulness of quantum computers), encryption algorithms generally have two major properties relevant here: the correct answer is astronomically hard to find classically, but a candidate answer is cheap to verify.
The fact that they're easy to verify means even if our quantum computer isn't guaranteed to output the correct result, we can just... try the output it gives us, and see if it solves the original equation. If it doesn't, keep running the quantum algorithm until you get a different result, and try that.
There's some optimization that you can explore for "how probable do we want to make the correct output from the quantum system" based on the relative expensiveness (in time or whatever other metric) of running the quantum algorithm vs checking the result... But ultimately my point is, it doesn't actually matter that we can't get a guarantee, at least in the context of encryption.
Even if we end up with a success probability of 1%, that means we only need to run the quantum algorithm about 100 times on average to solve the equation. Given that encryption key spaces tend to be 2^128 possibilities and larger, and the current best classical solution is "pick a random key and see if you guessed right", that means as long as running the quantum algorithm is faster than guessing
3402823669209384634633746074317682114
random keys (that's 2^128 divided by 100), it's the faster solution. So there's a lot of headroom for quantum algorithms.
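In rough Python terms, the whole "hard to find, easy to check" loop is just this (quantum_solve and verify are made-up stand-ins, and the 1% success rate is an assumed number, not a real figure):

```python
import random

SECRET = 777                           # the answer we're hypothetically trying to recover

def quantum_solve():
    """Stand-in for the quantum step: returns the right answer only 1% of the time."""
    return SECRET if random.random() < 0.01 else random.randrange(10**6)

def verify(candidate):
    """Stand-in for the cheap classical check (the 'easy to verify' direction)."""
    return candidate == SECRET

runs = 0
while True:
    runs += 1
    answer = quantum_solve()
    if verify(answer):
        break
print(f"found {answer} after {runs} runs")   # about 100 runs on average (geometric distribution)
```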
I'd like to ask more about the "personal computing" applications of quantum computing. Years ago when I was working part time at a university we had a lunch-and-learn where someone visiting the lab talked about quantum computers. I specifically remember the speaker telling everyone that quantum computers would be all we'd need some day within our lifetimes (so you could buy a "quantum-only" laptop within a few decades). What I'd heard is that they are well suited to certain types of algorithms but not something you'd want to program to run an operating system. So we may all have a quantum co-processor soon but we won't be throwing away our silicon wafers for them.
When I probed about this the speaker maintained their stance, but I wasn't really convinced. I didn't push any further because this was an Ivy League school and I was just halfway through a bachelor's degree. I'm still not sure if he was right.
All quantum computation can be simulated on a classical computer--it's just very slow. 30 qubits is roughly equivalent to one billion complex numbers (which can be stored in double precision in 16 GB of RAM or so--but just storing those numbers isn't very useful; I only mention this because it is fun to think about). If you need more than 30 qubits, though, things become exponentially more difficult.
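The arithmetic is easy to check: each amplitude is a complex double (16 bytes), and n qubits need 2^n amplitudes, so the memory doubles with every extra qubit.

```python
# Memory needed to store a full n-qubit statevector as complex doubles (16 bytes each).
for n in (30, 40, 50):
    amplitudes = 2 ** n
    gib = amplitudes * 16 / 2 ** 30
    print(f"{n} qubits: {amplitudes:,} amplitudes, {gib:,.0f} GiB")
# 30 qubits: ~1e9  amplitudes -> 16 GiB
# 40 qubits: ~1e12 amplitudes -> 16,384 GiB (16 TiB)
# 50 qubits: ~1e15 amplitudes -> 16,777,216 GiB (16 PiB)
```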
I have doubts about whether quantum hardware will ever be truly ubiquitous but if it becomes easier than classical computing then I'm sure we could write the same types of programs that we have today using qubits rather than bits as the underlying unit. The reason that probably won't happen is that some algorithms are faster with simple maths and others are faster with quantum maths.
But one thing to mention is that if quantum computers are somewhat ubiquitous--we'll probably all have quantum co-processors if only to have quantum encryption/communication.
So, follow-up question: are there other problems that quantum computers can solve besides breaking encryption? I read about how they can model how atoms interact; I didn't really understand that either, but I'm curious what else they can do.
I appreciate you asking this question! I was considering making a similar topic.
To add on to what you’ve said: I know that quantum computers pose a threat to encryption, but beyond that I’m lost. How and why? Also, are they still just theoretical or do they exist? Are these the future of personal computing, or will they remain highly specialized equipment like, say, the Large Hadron Collider?
Not a quantum expert, but I am a “conventional” computer engineer. Essentially, most encryption algorithms are “secure” because the amount of computation required to guess the right answer is functionally infinite. Like “all computers on earth, running for 1 million years” territory.
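A quick back-of-the-envelope with made-up but generous numbers (ten billion machines, each trying a billion keys per second) shows what "functionally infinite" means for a 128-bit key:

```python
# Rough estimate of brute-forcing a 128-bit key; the machine count and speed are assumptions.
keys = 2 ** 128
guesses_per_second = 10e9 * 1e9          # 1e10 machines x 1e9 keys/s each
years = keys / guesses_per_second / (3600 * 24 * 365)
print(f"{years:.1e} years")              # ~1.1e12 years, roughly 80x the age of the universe
```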
Quantum computers pose a threat to certain encryption algorithms, but not all. This is because the math used to obscure the right answer happens to be easy for quantum computers to unravel.
There have been known "quantum proof" encryption algorithms for decades; they just weren't used for one reason or another. These are quantum proof simply because the obscuring math happens to not be easily done by quantum computers. They end up in the same case of "you need infinite compute" to break the encryption.
These algorithms weren’t designed to be quantum proof. They just happened to use different math for the encryption.
Most systems should/have/will update to quantum proof algorithms in the near future, because it’s fairly easy to upgrade and the algorithms are just generally better at all aspects of encryption.
Post-quantum cryptography is not as easy as you describe it: only recently have standards bodies like NIST standardized it; the algorithms are significantly slower, use a lot more network bandwidth, and are worse understood/riskier.
Signal recently introduced post-quantum crypto, and if you read their very technical paper on it, you can tell it took a lot of effort.
Thanks for the article! That was an interesting read.
You’re right that it’s not as simple as I made it out to be. I read a bit more, and it seems like the additional difficulty is primarily with Asymmetric Encryption. My understanding was “just plop AES 256 in” but that is evidently not close to the whole picture. For Symmetric Encryption, it does sound reasonably easy to upgrade.
They do exist. I work on one every day as it is what my PhD is on.
And Shor's algorithm indeed is supposed to break a lot of public key encryption schemes, but currently we are far from actually doing this. This is also one that people like to bring up since it is easy to understand the impact of it, but probably is not the most important use.
Generally there are certain computational problems that a quantum computer is much much better at than a classical computer.
For me the most interesting thing is the original idea. I'll try my best to explain even though I'm a bit drunk rn. Way back when the idea of building a quantum computer became a thing, it was Richard Feynman who argued that it is impossible for classical (normal everyday) computers to efficiently simulate quantum mechanics. Naively one may think this is only some vague fundamental research thing, but the consequences of it in reality are quite far reaching. These days quantum mechanics pops up in everyday technology.
We use quantum mechanics to understand chemistry, materials science, semiconductors, certain branches of biology, etc. A quantum computer would make simulations that are currently out of reach for these fields suddenly possible, and major breakthroughs could be expected. Even applications for machine learning and AI are expected.
Currently the field is thinking more along the lines of using it similar to how researchers use supercomputing clusters. Not so much personal computers. So this would be useful for big research institutions and companies, less for individuals.
The way we build quantum computers now, it is unrealistic to expect one to be in your house (though maybe via the cloud?). But who knows. Nobody really knows yet what a utility-scale fault-tolerant quantum computer will look like. So ask me again in a few years.
They do exist (one of my closest friends programs them), but they’re somewhere around where classical computers were in the 1950s-60s. Big pieces of research equipment with teams of academics simultaneously working out how to build them and how to use the things they’ve just built.
I’ll never say never on them somehow finding their way into personal computing - science and tech seems to have a strong track record of finding uses that nobody expected for things as they advance - but realistically I’d be surprised to see it happen in our lifetime, and if anyone’s trying to pitch end user applications for these things in the next couple of decades they’re probably lying.
That said, it’s not quite at the LHC level either: that’s a global one-off with pure science as its goal, so I’d put quantum computers somewhere in the middle. Maybe something like an injection moulding machine, or a combine harvester - big, expensive, specialist equipment that does a particular thing for a particular job. Even if you or I could somehow spend six or seven figures on one, there’d be no reason to: I don’t need to mould 150,000 identical widgets, and my cheap 3D printer is better for making the one off bits I do need; I don’t have a field to harvest, and farm equipment is a terrible option for driving to the shops.
The kind of companies that would buy a room-sized IBM machine in 1962 may well have a reason to buy a quantum computer in 2032, but they’re unlikely to be something the rest of us really encounter any time soon. Also how in the hell does “plausible date a few years from now” get me to twenty thirty goddamn two?!
I think a good ELI5 answer is to think of it like traditional computers work by doing math in two dimensions (1s and 0s), while quantum computers do math in three dimensions.
Certain types of math problems are really hard to do when you're only in two dimensions but they're actually really easy to do if you look at them in three dimensions.
But it's also a lot of work just to do math in three dimensions because there are a lot more outside factors involved. Most current QC require a lot of error correction (or running the same problems multiple times to verify consistent results) and also need to run at extremely cold temperatures.
And, going back to the original metaphor, it's unlikely we'll ever use QC for common types of computing because even though QC algorithms are really good at certain types of math problems, they are much less efficient at a lot of really simple ones.
This is incorrect. That would be a ternary computer, which can easily be emulated by a traditional binary computer; there is nothing quantum or particularly interesting about those. Quantum computers work with qubits, which are bits that can be in a superposition of the states 0 and 1. But what makes them potentially extremely powerful is that one can entangle multiple qubits and, with some cleverness, perform what resembles parallel computation over all possible states at the same time (it's not exactly that; this is extremely oversimplified, and the actual gains are much smaller than true parallel computation would give).
I should have omitted the "(1s and 0s)" part of my description, as I believe it led you to misunderstand what I said. Moving from two dimensions to three dimensions does not describe a ternary system.

Well, you absolutely need to have at least a grasp of a couple of concepts from quantum mechanics, namely superposition and entanglement. 3blue1brown's videos on Grover's algorithm are great introductions to this, and if you want to dive deeper, the book https://en.wikipedia.org/wiki/Quantum_Computation_and_Quantum_Information is the canonical starting point.
On a less serious note, have you seen what they look like? All the quantum computers I’ve seen look like chandeliers!
The shape is probably driven by the need to keep them so cold. They're held just above absolute zero (−273 °C / −459 °F).
The chandelier is called a dilution refrigerator. The brass plates are what is being cooled and every plate below is colder than the one above it. At the very top, the fridge is room temperature. Hence why it is suspended from the ceiling.
The metal spaghetti, for lack of a better term, is coax cables. The actual quantum processor is on the very bottom. You can't really see it in the picture, but that's where the cables go.
Normally there are still some layers of metal that go around this whole contraption and then we pump it to a vacuum. So you can never really see the chandelier unless we are changing some part of the setup.
It indeed looks like this because of the cooling. The coldest part needs to be down due to gravity, and the temperature drops in stages. Combine that with a load of wires and one ends up with this thing.
Though the one in the picture is a particularly big one, most labs have smaller chandeliers. IIRC that is one of the Google setups.
That’s awesome, thank you. I thought the whole thing was the computer and they lowered into a freezer. The thing it goes in must be either all insulation or something like a reverse convection oven.
You're thinking in the right direction. The best insulating material in existence is no material. So in a way it's all insulation indeed.
There are typically a few things that can be around this chandelier, like a superconducting magnet as in an MRI, but for this the main thing is just a vacuum can.
So the chandelier is not lowered into anything, but we place a metal shell around it. This shell is only attached at the top and we make sure the can does not touch any other part, as that would ruin the insulation. Then we use a vacuum pump to get rid of all the air inside. This makes it very hard for heat to get to the lowest part of the fridge.
The actual cooling is already in there. This is one with way fewer cables, so you can actually see some of the essential parts. The thin things that loop are the coax cables. On the left you can see the parts used to cool the thing. This fridge has no quantum chip in it, but normally that would be below the lowest plate.
So if you were to see one in use in a lab, all you'd see is a metal can suspended from the ceiling, making noises that sound like they're from a sci-fi movie due to the gases being pumped around.
There are some great answers in here, and remember them, because when the hype comes they aren't going to mention any of it and they'll try to turn it into something else entirely like AI.
Good point... That might be the next tech Big Thing / bubble. The word "quantum" is hella marketable.
While you've gotten other answers on this, it's worth understanding that the nature of quantum computing means that it's not really ideal for traditional computer use.
Even if they were suddenly easier to make, it's very unlikely that the average user would have a quantum computer "chip" in their device, as for the vast majority of operations our current methods are not just good enough, but better.
Of course as others have said quantum computing gives you an insane amount of power for very specific problem types, and thus they're mostly only configured for that. This is reinforced by the expense and difficulty of creating one right now, but I'm not aware of any average user use case that would take advantage of quantum computing.
Traditional microprocessors operate on millions of microscopic logic gates that can either be open or shut (representing a boolean value.) This is why machine code consists of 1s and 0s, and why computing has evolved around being able to process 1s and 0s much faster, or in tandem (i.e. through multiple processor cores or dedicated hardware like GPUs.) A single boolean value, also known as a bit, can only hold two values, a 1 or a 0, but these can be combined to make larger values, (i.e. a byte which consists of eight bits and can represent 256 values between 0 and 255.)
Quantum processors are designed to exploit how matter behaves at atomic and subatomic scales (e.g. quantum tunnelling and superposition.) They use qubits, as opposed to traditional bits, which can represent a 1, a 0, or a superposition of both at the same time. In theory, this means quantum computers can store and process certain kinds of data exponentially faster than a traditional microprocessor. In practice, qubits are notoriously unstable, and a lot of work is being done by scientists and engineers across the world to make them more stable and actually practical for use. An example of this is the Majorana 1 chip that Microsoft announced a few months ago, where they claim to have created new exotic particles and entirely new states of matter.
Explaining the specifics of how quantum computing and quantum processors work beyond this is far beyond my understanding...
Quantum mechanics has a thing where quantum particles have a wavelike nature, with discrete wave modes that can interfere with each other.
Scientists have theorized that we could use these quantum effects to carry out computations by mapping the problems we want to solve onto quantum effects.
There are a lot of potential ways we could do this, so it's not like there is a single type of quantum computing. But most ideas involve taking advantage of interference to effectively apply algorithms to a wide range of potential inputs at the same time, by using superposition to map different inputs to different wave modes.
This is because the idea of superposition implies that each wave mode operates more or less independently of the others until you add them all up at the end. So if you map a different input to each mode, you can run them all through the same circuit at the same time without them changing each other's outcome, as long as you know how to extract the specific mode you want at the end.
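A tiny numpy sketch of that "every input at once" idea (toy statevector maths, nothing to do with real devices): a Hadamard gate on each of n qubits spreads the register evenly over all 2^n bitstrings, and anything you apply after that acts on all of those modes in the same pass.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate on a single qubit

n = 3
state = np.zeros(2 ** n)
state[0] = 1                 # start in |000>
U = H
for _ in range(n - 1):
    U = np.kron(U, H)        # Hadamard on every qubit
state = U @ state
print(np.round(state, 3))    # all 8 amplitudes equal to 1/sqrt(8) ≈ 0.354
```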
TLDR: instead of using normal bits, which are 1 or 0, they're using quantum bits, which can have a set probability of being a 1 or a 0 when you read them. This allows for some interesting algorithms. They're not useful for all problems, so even if one day we manage to make them compact and usable for home users, you'll never have a quantum PC - it's more likely that there will be a separate quantum module (QPU) in computers for specific operations just like there's a GPU for graphical operations.
If you want to go down the rabbit hole a little bit and learn more about the underlying maths, https://quantum.country is a great longread on the topic that explains it well without requiring any prior quantum physics/computing knowledge (although it does require a bit of simple knowledge of vectors and matrices).
If you want a more simplified explanation, this blog post is awesome, but it's unfortunately in Russian. You can use Google/Kagi/whatever translate on the page though, it seems to work pretty well on it (barring some of the jokes in there)
You're close, but not completely right. The probabilistic nature of measuring quantum states is a curse more than a blessing.
The speedup comes from the fact that it is both |0> and |1> at the same time, before you read it out. This in and of itself is not so useful, but it becomes useful once you have multiple of these qubits. Once you do operations where you do something to one qubit conditional on the state of another qubit, the mathematics becomes much richer than just probabilistic states. We call the result entanglement.
This is hard to simulate on a classical (meaning normal everyday digital) computer. And if you can make your calculation exploit this, it is sometimes possible to get a ridiculous speedup. To the point where a classical computer might take a billion years, but a sufficiently powerful quantum computer a few hours.
The probabilistic nature of the measurement however means you need to do the same calculation many times to gather statistics. But the improvement is still ridiculous despite this in some cases.
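Here's roughly what that looks like in a toy statevector simulation (my own heavily simplified numpy sketch): a Hadamard on one qubit, then a gate on the second qubit conditional on the first (a CNOT), gives the entangled Bell state (|00> + |11>)/sqrt(2). Sampling it many times is the "gather statistics" part, and you only ever see 00 or 11, never 01 or 10.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],                 # flip qubit 2 if qubit 1 is |1>
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.zeros(4)
state[0] = 1                                   # |00>
state = CNOT @ np.kron(H, I) @ state           # H on qubit 1, then the conditional flip
probs = np.abs(state) ** 2

shots = np.random.choice(["00", "01", "10", "11"], size=10_000, p=probs)
print({s: int((shots == s).sum()) for s in ["00", "01", "10", "11"]})
# roughly {'00': 5000, '01': 0, '10': 0, '11': 5000} -- the two qubits always agree
```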
For home computing: in principle a quantum computer can do all the same maths a classical computer can, just without any advantage. In practice, however, making a quantum computer that can rival even some of the weakest CPUs at regular computations seems ridiculous. So most will agree that they will never replace classical computers entirely.
Shrinking it to the point where it can be a co-processor to me seems unlikely as well. At least viewed through the lens of what it requires these days. The cryogenic setup alone is unfeasible to have at home, even if you shrink it somehow. That limits you to things that work at room temperature and then the options get very slim.
If people end up using quantum computers in their daily lives, I expect it to be via the cloud, with the actual quantum computer sitting in a datacenter somewhere. We are already experimenting with this a bit these days.