Eliezer S. Yudkowsky, <email@example.com>, writes:
> A "qubit" is a quantum bit, the basic unit which can "fork" reality by being 0
> in one and 1 in the other. Thus a 32-qubit machine could compute in 4 billion
> separate branches simultaneously, at least in theory. Current proposed
> quantum-computing architectures have other constraints, which would limit them
> to certain simple manipulations of the qubits. It's still possible to factor
> large numbers, though. With quantum computing, P=NP.
Quantum computing does extend the class of problems known to be solvable in polynomial time, but it does not establish that P=NP: factoring becomes efficiently solvable on such machines, yet factoring has not been shown to be NP-complete.
Robin Hanson had an interesting article on the polymath list speculating that quantum computing could theoretically speed up AI very dramatically. This only works if AI can be expressed as a massive search problem, consolidating many small problems into very large superproblems. A quantum computer can search n items for an optimum in roughly sqrt(n) time (Grover's algorithm), so it is necessary to work on very large problems to get effective speedups.
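A minimal sketch of the arithmetic behind that point (no quantum simulation, just query counts, and the `speedup` helper is my own illustrative name): classical unstructured search takes about n queries in the worst case, while a Grover-style search takes about sqrt(n), so the advantage only becomes dramatic when n is very large.

```python
import math

def speedup(n):
    """Ratio of classical worst-case queries (n) to Grover-style
    query count (~sqrt(n)) for unstructured search over n items."""
    classical = n                 # examine every item in the worst case
    quantum = math.isqrt(n)       # ~sqrt(n) quantum queries
    return classical / quantum

# The advantage grows with problem size: 10x at n=100, 1000x at n=10^6.
for n in (100, 10**6, 10**12):
    print(n, speedup(n))
```

This is why consolidating many small searches into one superproblem matters: a million searches of 100 items each get only a 10x speedup apiece, while one search of 10^8 items gets a 10,000x speedup.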
Note, though, that a reduction from n to sqrt(n) does not cross the boundary between polynomial and super-polynomial complexity. If a search problem's tree grows exponentially, a quantum computer can look roughly twice as far down the tree, but it still faces an exponentially growing problem.
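To make the "twice as far down the tree" point concrete, here is a sketch assuming a uniform branching factor b for illustration: a tree of depth d has about b**d leaves, and sqrt(b**(2d)) equals b**d, so the square-root speedup buys depth 2d for the cost of depth d while the leaf count stays exponential in depth.

```python
# A tree with branching factor b and depth d has b**d leaves.
# sqrt speedup: searching to depth 2d quantumly costs about
# sqrt(b**(2*d)) = b**d queries -- the classical cost of depth d.
def leaves(b, d):
    return b ** d

b, d = 2, 20
classical_cost = leaves(b, d)        # queries to search depth d classically
quantum_depth = 2 * d                # depth reachable with the same budget
# Same budget, double the depth -- but the cost is still exponential in d.
assert leaves(b, quantum_depth) ** 0.5 == classical_cost
```

So the square root helps, but against a tree whose size is exponential in depth, it only shifts the constant in the exponent; it does not make the search polynomial.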