Google Says It Understands the Quantum Computer It Owns…a Little Better

| News

Google announced on Tuesday that it understands the quantum computer it co-owns with NASA. Well, a little better, at least. No one actually understands quantum computers yet, not even the folks who make them. In fact, there's some debate on whether existing quantum computers are even quantum computers, but that's a rabbit hole for another time.

D-Wave 2X Quantum Computer

What Google announced today is that quantum annealing works, that it's fast, and that it has proven this using the D-Wave 2X quantum annealer it's been working on with NASA. What is quantum annealing? On a "knowledge" level, I have no fracking idea, but on a practical level I can say that it's a different way of doing some specific kinds of math, and it has the potential to unlock new levels of computational power.

"We found that for problem instances involving nearly 1000 binary variables, quantum annealing significantly outperforms its classical counterpart, simulated annealing," Google Research wrote in a blog post. "It is more than 10^8 times faster than simulated annealing running on a single core."

Ten to the 8th power is a lot to us humans. Think a hundred million times faster. To be sure, current computing technologies allow multiple cores to be run in parallel, so it's not as if these tests showed that the D-Wave 2X quantum annealer is a hundred million times more powerful than today's computers.
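For readers wondering what the classical baseline in Google's comparison actually does, here is a minimal sketch of simulated annealing on a toy binary problem. The target string, cooling schedule, and parameters are illustrative inventions, not Google's benchmark; real instances involve hundreds of variables and rugged energy landscapes.

```python
import math
import random

def simulated_annealing(energy, n_bits, steps=20000, t_start=2.0, t_end=0.01, seed=0):
    """Minimize energy(bits) over binary strings by simulated annealing."""
    rng = random.Random(seed)
    state = [rng.randint(0, 1) for _ in range(n_bits)]
    e = energy(state)
    best_state, best_e = state[:], e
    for step in range(steps):
        # Exponentially cool the temperature from t_start toward t_end.
        t = t_start * (t_end / t_start) ** (step / steps)
        i = rng.randrange(n_bits)        # propose flipping one bit
        state[i] ^= 1
        e_new = energy(state)
        # Always accept downhill moves; accept uphill moves with Boltzmann probability,
        # which lets the search climb out of shallow local minima while t is high.
        if e_new <= e or rng.random() < math.exp((e - e_new) / t):
            e = e_new
            if e < best_e:
                best_state, best_e = state[:], e
        else:
            state[i] ^= 1                # reject: undo the flip
    return best_state, best_e

# Toy "energy landscape": count disagreements with a hidden target string.
target = [1, 0, 1, 1, 0, 0, 1, 0]
energy = lambda bits: sum(b != t for b, t in zip(bits, target))

solution, e = simulated_annealing(energy, len(target))
```

Quantum annealing attacks the same kind of problem, but where this sketch must thermally hop over energy barriers, the D-Wave hardware can tunnel through them, which is where the claimed speedup on tall, narrow barriers comes from.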

But it's still fast, and we are in early, early days of quantum computing. Early. To emphasize that even more, these tests Google has been working on are really tests to figure out what this computer can do and how it is doing it. And they still don't really know.

"We also compared the quantum hardware to another algorithm called Quantum Monte Carlo," Google Research said. "This is a method designed to emulate the behavior of quantum systems, but it runs on conventional processors. While the scaling with size between these two methods is comparable, they are again separated by a large factor sometimes as high as 10^8."

Google published a nifty graph showing some of its results. I have no idea what it means other than the quantum annealing was hella fast. (The caption is Google's).

Time to find the optimal solution with 99% probability for different problem sizes. We compare Simulated Annealing (SA), Quantum Monte Carlo (QMC) and D-Wave 2X. Shown are the 50, 75 and 85 percentiles over a set of 100 instances. We observed a speedup of many orders of magnitude for the D-Wave 2X quantum annealer for this optimization problem characterized by rugged energy landscapes. For such problems quantum tunneling is a useful computational resource to traverse tall and narrow energy barriers.



We’re at a point in time where more and more, the numbers we hear about have no relevance to us. Even 1 billion is such a large number that it is just a word to us. We don’t conceptualize it as a number. We just know the word.

If you think you know 1 billion, do you know roughly how long it takes for 1 billion seconds to pass? When you look into the answer you might see what I'm talking about. We can't keep track of that many individual things. This is why we don't panic about things like a country's debt ceiling and the challenges of deep space observation (let alone exploration).
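The commenter's question is easy to check with back-of-envelope arithmetic (assuming a 365.25-day year):

```python
# How long is 1 billion seconds?
SECONDS_PER_YEAR = 60 * 60 * 24 * 365.25   # Julian year: 31,557,600 seconds
years = 1_000_000_000 / SECONDS_PER_YEAR
print(f"1 billion seconds is about {years:.1f} years")  # about 31.7 years
```

Roughly 31.7 years, which is the point: a number we say in one breath spans a third of a lifetime when counted one second at a time.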

“This computer is hella fast” does pretty much sum up this article, but it is interesting to see some of the numbers (or words) on the screen to describe it.


I appreciate Bryan's willingness to discuss this without having to position himself as an expert. I'm no expert either. But I have inferred from reading about quantum computing and seeing some video of D-Wave's machine that we may be talking about a kind of computing that can resolve highly complex problems in a "timely" manner in an extremely fine-grained, "statistically significant" way, rather than arriving at an exact numeric solution as a binary machine might. Let's say a solution significant to something like 99exp-8th power would take 30 seconds, while finding the exact solution might take a hundred centuries. I have inferred this and stand ready to be corrected. The notion makes sense for analyzing geology, human contact networks, weather, etc., for which a statistically accurate result is sufficient.
