Eliezer S. Yudkowsky <sentience@pobox.com> writes:
> Hal Finney wrote:
> > Quantum computing does extend the class of problems known to be solvable
> > in polynomial time, but it does not establish that P=NP using such
> > machines. Factoring has not been shown to be NP-complete.
>
> NP stands for Nondeterministic Polynomial. Given the right physical
> architecture (each eigenstate is a separate thread, just like reality itself),
> then each qubit could directly represent the magic-coin-flip bit of an NP algorithm.
Typically the QC is put into an initial state which is a superposition of all possible inputs, and then runs through the problem so that the resulting state is a similar superposition of all possible answers. In this sense the QC, taken as a whole over all branches/universes, would seem able to solve all NP problems at once. The problem is that the answer is in only one branch. If you just measure the output register, you get state function collapse, and the probability that you will see the desired answer is too small; for a search over n candidates it is only 1/n. So in practice the QC by itself is not effectively able to work as an NP problem solver.
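The point about measurement can be made concrete with a toy classical simulation. This is only a sketch; the state labels and numbers are illustrative, not from any real QC library:

```python
# Toy model: a uniform superposition over N basis states, exactly one
# of which ("marked") holds the desired answer.
N = 1 << 10          # 1024 candidate inputs
marked = 42          # index of the single correct answer (arbitrary)

amplitudes = [1 / N ** 0.5] * N   # equal amplitude on every branch

# Born rule: the chance of observing a given state on measurement is
# its amplitude squared.
p_success = amplitudes[marked] ** 2
print(p_success)     # 1/1024: measuring the raw superposition almost
                     # never reveals the answer
```

So although every answer is "in there", a naive measurement is no better than guessing.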
What has to be done is to maintain coherence and use some trick to amplify the probability of the right answer. Such tricks have been discovered for factoring and discrete logs (Shor's algorithm) and for unstructured search (Grover's algorithm). But they are complex and technical and do not apply to general NP problems.
> > Robin Hanson had an interesting article on the polymath list speculating
> > that quantum computing could theoretically speed up AI very dramatically.
> > This only works if AI can be expressed as a massive search problem,
> > consolidating many small problems into very large superproblems. Search
> > of n items for an optimum can be done in sqrt(n) time using QCs so it
> > is necessary to work on very large problems to get effective speedups.
>
> As far as I can tell, qubit-chunk QC doesn't help AI at all. If Hanson can
> figure out how to get sqrt(n) out of it, he's way ahead of me. I haven't
> finished the QCL language spec, but I can't visualize doing anything but very
> dense mathematical problems. Maybe some network-optimization problems could
> be encoded, but I can't see, say, quantum Deep Blue.
The sqrt(n) speedup is due to Grover's algorithm, and I have to admit that I was not able to fully understand Grover's paper. He repeatedly applies a certain transform, each time slightly amplifying the probability of finding the right answer when you read the register. You do this about sqrt(n) times, and then you can read the output register with a good probability of success.
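The repeated transform can be simulated classically for small n. A minimal sketch of the amplitude amplification step (an oracle sign-flip on the marked state, followed by "inversion about the mean"; the problem size and marked index here are made up):

```python
import math

N = 1 << 8                       # 256 candidate inputs
marked = 77                      # arbitrary "right answer"
amps = [1 / math.sqrt(N)] * N    # start in the uniform superposition

# Optimal iteration count is roughly (pi/4) * sqrt(N).
iterations = int(round(math.pi / 4 * math.sqrt(N)))
for _ in range(iterations):
    amps[marked] = -amps[marked]          # oracle: flip the marked sign
    mean = sum(amps) / N                  # diffusion: invert each
    amps = [2 * mean - a for a in amps]   # amplitude about the mean

p_success = amps[marked] ** 2
print(round(p_success, 3))   # close to 1 after ~sqrt(N) iterations
```

Each pass nudges amplitude from the wrong answers onto the right one, which is why sqrt(n) passes suffice where a classical search needs n.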
A popularized discussion of Grover's algorithm and its application to cryptography is at http://jya.com/qshake.htm. There was also an article in the August 7 issue of Science. BTW the new government standard cipher intended to replace DES, called the AES, requires support for 256-bit keys, apparently at least in part because of fear of QCs. It would still take on the order of 2^128 work for a QC to find such a key by Grover search, which is considered safe enough.
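The key-size arithmetic behind that last point is simple: Grover search over a k-bit keyspace takes on the order of sqrt(2^k) = 2^(k/2) evaluations, so a QC halves the effective key length. A sketch (the function name is just illustrative):

```python
def grover_work_bits(key_bits):
    """Effective security in bits of a k-bit key against Grover search:
    sqrt(2**k) evaluations = 2**(k/2), i.e. k/2 bits of work."""
    return key_bits // 2

print(grover_work_bits(128))   # 64: uncomfortably weak against a QC
print(grover_work_bits(256))   # 128: the 2^128 figure mentioned above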
Hal