You might have misunderstood my comment (or Scott's writings?)
Classical computers are limited in what they can compute because the universe has limits. A 10,000-qubit quantum computer could factor fairly large numbers, and 10,000 qubits is pretty small: one day, hopefully, it will fit inside a small room.
To factor those same numbers using a classical computer, you'd have to build a computer the size of the entire known universe and run it until the heat death of the universe. Obviously this is not possible, even in principle.
Theoretically any QC computation can be simulated by a classical computer, but in our limited universe you quickly hit the wall where the classical computer becomes too big (bigger than the entire universe) and too slow (taking longer than the lifetime of the universe).
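To put a rough number on the "too big" part, here's a back-of-the-envelope sketch (my own toy arithmetic, assuming a naive state-vector simulation that stores one 16-byte complex amplitude per basis state; real simulators are cleverer, but the exponential scaling is the point):

    import math

    def log10_state_vector_bytes(n_qubits: int) -> float:
        """log10 of the memory needed to hold all 2**n amplitudes at 16 bytes each."""
        return n_qubits * math.log10(2) + math.log10(16)

    for n in (50, 300, 10_000):
        print(f"{n:6d} qubits -> ~10^{log10_state_vector_bytes(n):.0f} bytes")

    # There are only ~10^80 atoms in the observable universe, so even one byte
    # per atom runs out long before 300 qubits, let alone 10,000.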
I definitely reread your comment several times and kept stewing on the word "intractable", since I'd seen it used somewhere else to talk about NP-complete problems. I assumed that if I was misinterpreting what you were saying, it would hinge on that word.
Aside from factoring, what kinds of things do we think will meaningfully change if we get general QC? Cryptographers are already preparing for the post-quantum world. Who else needs to be preparing?
Everything I think I understand about QC suggests that a practical breakthrough will not cause any changes in society, other than the abandonment of RSA. Am I missing something?
Since you already seem to be familiar with integer factoring: isn't factoring large integers something that is "solvable by QC, and unsolvable by classical computers"?
Before this thread, I knew that Shor's algorithm and Grover's algorithm were two very important QC results. I knew that Shor's algorithm means that a QC would be able to decrypt anything that was ever encrypted with RSA. (ECC schemes are likely just as vulnerable, so cryptographers are looking at purely hash-based schemes: http://pqcrypto.org/hash.html)
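Worth noting in passing: the quantum heart of Shor's algorithm is fast period finding, and the rest is classical number theory. Here's a toy sketch of that classical half (my own illustration; the period is brute-forced here, which is exactly the step that's hopeless classically for large N):

    from math import gcd
    from random import randrange

    def order(a: int, n: int) -> int:
        """Smallest r > 0 with a**r == 1 (mod n), found by brute force."""
        r, x = 1, a % n
        while x != 1:
            x = (x * a) % n
            r += 1
        return r

    def shor_classical_part(n: int) -> int:
        """Return a nontrivial factor of an odd composite n (toy sizes only)."""
        while True:
            a = randrange(2, n)
            g = gcd(a, n)
            if g > 1:
                return g                      # lucky guess: a shares a factor with n
            r = order(a, n)                   # <-- the quantum computer's job
            if r % 2 == 0 and pow(a, r // 2, n) != n - 1:
                return gcd(pow(a, r // 2, n) - 1, n)

    print(shor_classical_part(15))            # 3 or 5
    print(shor_classical_part(21))            # 3 or 7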
What I learned this morning based on hints in this thread:
1) Grover's algorithm is a far more modest speedup than Shor's algorithm (quadratic rather than exponential)
2) Shor's algorithm only factors integers, but Grover's algorithm can more generally invert a function, which still threatens many currently used cryptographic functions (see the sketch below)
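To make "invert a function" concrete, here's a toy key-search sketch (my own example; the function is treated as a black box we can only evaluate). Classically you expect up to ~2^k evaluations; Grover's algorithm needs only ~2^(k/2) quantum queries, which is why it merely halves effective key lengths rather than destroying them.

    import hashlib

    def f(key: int) -> str:
        """Toy one-way function over a k-bit keyspace."""
        return hashlib.sha256(key.to_bytes(8, "big")).hexdigest()[:8]

    k = 20                         # 20-bit toy keyspace
    target = f(123_456)            # pretend we only know this output

    # Classical brute force: up to 2**k evaluations of f.
    found = next(x for x in range(2 ** k) if f(x) == target)
    print(f"preimage {found} found; worst case {2 ** k} classical evaluations")
    print(f"Grover's algorithm would need roughly {2 ** (k // 2)} quantum queries")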
So I'd guess that Grover's algorithm is the sort of thing people are talking about as useful in machine learning. There are probably other QC results that are worth getting excited about for people working with neural networks. The Google/Microsoft workshop this weekend has a number of sessions on quantum annealing, as well.
A big reason "unsolvable by classical computers" is such a silly way to phrase things is (paraphrasing Dr. Aaronson here) that simulated annealing techniques on classical computers are already producing such good results without QC. In the Shtetl-Optimized post just before this one, he posts a PowerPoint deck for a talk he gave at the same conference, and it is quite instructive (but still enough over my head that I'm sitting on HN asking dumb questions).
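For anyone who hasn't seen it, classical simulated annealing is only a few lines. Here's a minimal sketch on a made-up non-convex function (my own toy, not anything from the slides):

    import math
    import random

    def energy(x: float) -> float:
        """A bumpy 1-D landscape with many local minima."""
        return x * x + 10 * math.sin(3 * x) + 10

    def anneal(steps: int = 20_000, t0: float = 5.0) -> float:
        x = random.uniform(-10, 10)
        for i in range(steps):
            t = t0 * (1 - i / steps) + 1e-3       # simple linear cooling schedule
            candidate = x + random.gauss(0, 0.5)
            delta = energy(candidate) - energy(x)
            # Always accept downhill moves; accept uphill moves with
            # probability e^(-delta/T) so we can escape local minima.
            if delta < 0 or random.random() < math.exp(-delta / t):
                x = candidate
        return x

    best = anneal()
    print(f"found x = {best:.3f}, energy = {energy(best):.3f}")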
I mean, since the best-known quantum algorithm for factoring integers is asymptotically faster than the best-known classical algorithm for factoring integers, isn't there some definition of "large" for which this is no longer true?
(I'm assuming you mean "classical computers can factor large integers effectively", since the class of all problems solvable by a classical computer is exactly the same as the class of all problems solvable by a quantum computer)
> the class of all problems solvable by a classical computer is exactly the same as the class of all problems solvable by a quantum computer
I started this sub-thread because I'm pretty sure this statement is true. But the details are tricky.
> the best-known quantum algorithm for factoring integers is asymptotically faster than the best-known classical algorithm for factoring integers
The only reason that current commercial cryptography works is that classical computers can't factor large integers effectively. But a quantum speedup for factoring isn't particularly profound, since factoring is just a small part of computing.
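To put rough numbers on "can't factor large integers effectively" (my own back-of-the-envelope, all constants ignored, so only the growth rates mean anything): the best known classical algorithm, the general number field sieve, is sub-exponential in the bit length, while Shor's algorithm is polynomial.

    import math

    def log10_gnfs_ops(bits: int) -> float:
        """GNFS cost ~ exp((64/9)^(1/3) * (ln N)^(1/3) * (ln ln N)^(2/3))."""
        ln_n = bits * math.log(2)
        return ((64 / 9) ** (1 / 3) * ln_n ** (1 / 3)
                * math.log(ln_n) ** (2 / 3)) / math.log(10)

    def log10_shor_ops(bits: int) -> float:
        """Shor's algorithm costs roughly (log N)^3 quantum gates."""
        return 3 * math.log10(bits)

    for bits in (512, 1024, 2048, 4096):
        print(f"{bits:4d}-bit N: classical ~10^{log10_gnfs_ops(bits):.0f} ops, "
              f"Shor ~10^{log10_shor_ops(bits):.1f} gates")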
> > the class of all problems solvable by a classical computer is exactly the same as the class of all problems solvable by a quantum computer
> I started this sub-thread because I'm pretty sure this statement is true. But the details are tricky.
Actually, that is one of the first questions that was settled when we started looking into quantum computers. Our models of classical and quantum computation are both exactly Turing complete: any classical computer can simulate a quantum computer, and any quantum computer can simulate a classical one. A quantum computer's only advantage is speed, and we don't yet have proof that there is an exponential speedup that cannot be circumvented by finding a better classical algorithm.
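The simulation direction that surprises people is easy to see in code: a classical machine can just store all 2^n amplitudes and apply gates as linear updates. A bare-bones sketch (my own toy, Hadamard gate only; the catch is that the state list doubles with every qubit):

    import math

    def hadamard(state: list[complex], target: int) -> list[complex]:
        """Apply a Hadamard gate to qubit `target` of a full state vector."""
        h = 1 / math.sqrt(2)
        out = [0j] * len(state)
        for i, amp in enumerate(state):
            if amp == 0:
                continue
            i0, i1 = i & ~(1 << target), i | (1 << target)
            out[i0] += h * amp
            out[i1] += h * amp if (i >> target) & 1 == 0 else -h * amp
        return out

    n = 3
    state = [0j] * (2 ** n)
    state[0] = 1 + 0j                        # start in |000>
    for q in range(n):
        state = hadamard(state, q)           # uniform superposition over all 8 states
    print([round(abs(a) ** 2, 3) for a in state])   # eight probabilities of 0.125
    # The cost of this faithfulness: the state list has 2**n entries.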
Since asymmetric cryptography generally relies on an exponential barrier between the process of using the encryption and the process of breaking it, an exponential speedup is what we'd really need to completely break modern asymmetric cryptography.
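A toy version of that barrier (my own example, with comically small primes; real keys use ~1024-bit primes): the legitimate operation is a single modular exponentiation, which is polynomial in the key length, while the naive attack has to factor n, which scales exponentially in the bit length.

    import math

    p, q = 104_723, 104_729            # two small known primes (toy "RSA" modulus)
    n, e = p * q, 65_537
    message = 42

    ciphertext = pow(message, e, n)    # using the key: one cheap modular exponentiation

    def factor_by_trial_division(n: int) -> int:
        """~sqrt(n) = 2**(bits/2) divisions: exponential in the bit length of n."""
        for d in range(3, math.isqrt(n) + 1, 2):
            if n % d == 0:
                return d
        raise ValueError("n is prime")

    print("encrypt:", ciphertext)
    print("break:  ", factor_by_trial_division(n))   # already ~50,000 divisions here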
It might provide a rapid, general approach to non-convex optimization for neural nets.
And that changes everything, probably more than anything since (the iPhone|the internet|computers|penicillin|the industrial revolution|fire) depending on how optimistic you are. It'll change things a lot, anyway.
Imagine simulating a human brain. I'm not sure how massive a computer would need to be today to simulate all the neurons, but an efficient implementation could be made the size of... well... a brain.
I could see this causing massive changes in society: an artificially intelligent simulation of a super-smart human that can be tuned toward a specific problem area and made much more focused and efficient than a purely biological brain could ever be...
Well, simulating the chemistry in the brain would, I think, involve simulating some quantum mechanical effects, which a quantum computer might be better equipped to handle?
That's the argument I've heard, anyway.
I don't know how that works when the qubits are being used to represent other sorts of variables than bits, though?
Not proven. The best currently known algorithms for factoring on a classical computer take asymptotically longer than the best known quantum ones, but that doesn't necessarily mean it's a fundamental limit of the universe. More generally, we have no proof that BQP ≠ P.
edit: I was trying to be polite, but Scott Aaronson has spilled quite a lot of blog ink denouncing remarks like the parent post as utter nonsense.