> Now, if you can map some useful computational question onto the original configuration of qubits that is answered by the ending position, you've got yourself a useful quantum computer. This is the hard part!
Yes, I agree. But this has not been demonstrated. What's being demonstrated (apparently) is that measuring a quote-unquote "quantum computer" doing whatever it does naturally is easier than simulating said quantum computer classically. Well, yeah. Duh.
That's the same thing as "rendering" a scene with an unbiased renderer vs. setting up that same scene in reality and using a camera. No one in their right mind would point to the camera and say they'd created a next-gen, heretofore impossibly fast unbiased rendering algorithm.
Technically, the digital camera is "computing" the same result—but no one would call it that, and IMO, the same is true of what is being discussed in the FAQ. It's literally NOT COMPUTATION, which brings us back to your line:
> Now, if you can map some useful computational question onto the original configuration of qubits that is answered by the ending position, you've got yourself a useful quantum computer. This is the hard part!
It's not only the hard part, it's the only part that matters. Until then, you have AT BEST a "quantum camera". Potentially useful, perhaps—but it's not a computer, or computation.
Anyway, thanks for responding. Much better than drive-by downvoters probably hoping to get PhDs in this stuff.
Also, my intent is not to belittle Google's engineering effort. In the same way that I wouldn't belittle Sony for making 24mm × 36mm backlit CMOS sensors. It's impressive! Good for them. But it's not computation, and it definitely doesn't establish some kind of "quantum computing supremacy" (since no meaningful computation is being done). When they stop handwaving about mapping actual computation problems to the scene they've set up and are measuring, then I'll get excited. Maybe it's doable, maybe not. But a quantum camera, AT BEST, is one step along the path...
I think your camera example is a false equivalence that makes this seem as if it's not a computation. The camera is not running the same algorithm as the renderer and so you're comparing different things.
The experiment used a classical computer to randomly generate a circuit C, told the quantum computer to execute it, and recorded the result. Then they repeated this, simulating each circuit in the most efficient way known on a classical computer. Finally they compared the distributions to verify that the quantum computer's results matched the correct classically computed results.
This proves that the quantum computer is able to generate that distribution faster than a classical computer and isn't just doing some other spurious process that happens to be faster.
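To make that verification loop concrete, here's a toy sketch of it in Python. This is my own illustration, not Google's actual pipeline: the 5-qubit scale is chosen so classical simulation is trivial, and a Haar-random unitary stands in for a random gate circuit. It computes the circuit's ideal output distribution classically, draws samples, and scores them with the linear cross-entropy benchmark (XEB) that Google used:

```python
# Toy sketch of the verification loop described above (my own
# illustration, not Google's pipeline). At 5 qubits the ideal output
# distribution is cheap to compute classically, so we can score any
# sampler against it with the linear cross-entropy benchmark (XEB).
import numpy as np

rng = np.random.default_rng(0)
n_qubits = 5
dim = 2 ** n_qubits

# Stand-in for "randomly generate a circuit C": a Haar-random unitary
# on the whole register (real circuits compose 1- and 2-qubit gates,
# but the output statistics are the same flavor).
z = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
q, r = np.linalg.qr(z)
u = q * (np.diag(r) / np.abs(np.diag(r)))  # fix QR phase ambiguity

# The "rendering": classically compute the ideal output distribution
# of the circuit applied to |00...0> (the first column of U).
ideal_p = np.abs(u[:, 0]) ** 2

# Stand-in for the quantum hardware: here we simply sample the ideal
# distribution; real hardware samples a noisy version of it.
samples = rng.choice(dim, size=100_000, p=ideal_p)

# Linear XEB fidelity: near 1 for a faithful sampler, near 0 for one
# that ignores the circuit and outputs uniform noise.
f_quantum = dim * ideal_p[samples].mean() - 1
f_uniform = dim * ideal_p[rng.integers(dim, size=100_000)].mean() - 1
print(f"XEB, faithful sampler: {f_quantum:.3f}")  # near 1
print(f"XEB, uniform sampler:  {f_uniform:.3f}")  # near 0
```

The XEB score is exactly the guard against a "spurious process": a device that isn't faithfully producing the circuit's distribution (say, one spitting out uniform noise) scores near 0, while a faithful sampler scores near 1.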
Yes, this random distribution isn't very useful and so this result is only really interesting as an experimental verification of theory. However, it is a computation that benefits from quantum speedup in real life!
Hopefully, algorithms will soon be found that generate more useful distributions (for the time being, sampling seems to be the only practically doable application of this type of QC). For example, Aaronson mentions that generating verifiably random bits is not much more difficult than the sampling done in this experiment, and that this could have an impact on a variety of cryptographic applications.
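For flavor, here's a very rough sketch of what the client side of that certified-randomness idea could look like. This is my own simplification, not Aaronson's actual protocol; the function name, the threshold, and the hash-based "extractor" are all illustrative:

```python
# Very rough sketch of the client side of a certified-randomness
# scheme (my simplification of the idea, not Aaronson's actual
# protocol). The threshold and the bare-hash "extractor" are
# illustrative placeholders.
import hashlib
import numpy as np

def certify_and_extract(samples, ideal_p, dim, threshold=0.5,
                        seed=b"fresh-nonce"):
    """Return extracted bits if the samples pass the XEB check.

    samples: bitstring indices returned by the (alleged) quantum sampler
    ideal_p: the ideal output probabilities, computed classically
    """
    f_xeb = dim * np.asarray(ideal_p)[samples].mean() - 1
    if f_xeb < threshold:
        raise ValueError("sampler failed the XEB fidelity check")
    # Toy distillation step: hash the raw samples under a fresh seed.
    # A real protocol would use a seeded randomness extractor with a
    # proven min-entropy bound, not a bare hash.
    raw = np.asarray(samples, dtype=np.uint32).tobytes()
    return hashlib.sha256(seed + raw).digest()
```

The missing pieces (fresh challenge circuits per request, a timing bound that rules out classical spoofing of the XEB score, and an extractor with a proven min-entropy guarantee) are what would make the output bits certifiably random; the sketch just shows where the verified samples would plug in.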
> The camera is not running the same algorithm as the renderer and so you're comparing different things.
[rewriting your words] The experiment used a classical computer to randomly generate a scene C, set that scene up in real life, and recorded the result. Then they repeated this but rendered the scene in the most optimal way a classical computer can. Finally they compared the results to verify that the scene in real life matched the correct classically computed results.
The scene + digital camera has the exact same role (and proof value) in my hypothetical experiment as the quantum computer + measurement device does in the Google experiment.
It's not the camera that "contains the circuit"; it's the scene and the camera together that compute the same values (exactly, as it turns out) as the classical rendering algorithm simulating the same phenomena.
Call it "camera computing", write a paper in Nature, win Turing award. The hard part with camera computing is the same: How to map a computation onto the generated scene so that the measurement device (camera) gets a meaningful result faster than it can be simulated in the computer.
----
Look, I've worked on things for a long time only to discover in the end that there's nothing there. It sucks, I get it. Move on, try something else. There's nothing here.