Given the known algorithm (Shor's) for factoring a large number into its prime factors on a quantum computer, I'm curious how big the numbers are that a 50-qubit QC would be able to handle. Anyone have any ideas?
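A rough back-of-the-envelope sketch in Python, assuming the commonly cited estimate that one circuit construction for Shor's algorithm (Beauregard's) factors an n-bit integer with about 2n + 3 logical qubits; other constructions trade qubit count against circuit depth, so treat the numbers as ballpark:

```python
# Assumption: ~2n + 3 logical qubits to factor an n-bit integer
# (Beauregard-style circuit; other variants differ).

def max_bits_factorable(num_qubits: int) -> int:
    """Largest bit-length n such that 2n + 3 <= num_qubits."""
    return (num_qubits - 3) // 2

n = max_bits_factorable(50)
print(n)            # largest factorable bit-length with 50 qubits -> 23
print(2 ** n - 1)   # largest such integer -> 8388607
```

So under that assumption, 50 ideal, error-free qubits would only reach numbers around 23 bits, i.e. into the low millions, and real noisy hardware would manage far less.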
@howhot, it's not really about analog computing. Remember, a qubit, when finally read out, collapses to either a 1 or a 0. It can only stay in superposition while you don't look at it, just like any quantum particle. This has profound effects during computation that are duplicated in neither digital nor analog computers, so the algorithms that make best use of this property can't be either digital or analog.
50 qubits isn't just 50 values: the register's state is described by 2^50 complex amplitudes, all evolving at once.
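For scale, a quick sketch of that state-space size (with the caveat that a measurement still returns only a single 50-bit outcome, so you can't simply read all 2^50 amplitudes back out):

```python
# An n-qubit register is described by 2**n complex amplitudes.
n = 50
dim = 2 ** n
print(dim)  # 1125899906842624, about 1.1e15 amplitudes for 50 qubits
```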
That's always been the point of quantum computing, @Whyde.
Found something interesting: a categorized listing of the currently known quantum algorithms: http://math.nist....tum/zoo/
Oh, and one other point: until there is hardware, most algorithms are speculative at best. We are in our infancy in terms of quantum computing algorithms. We need computer scientists who can do quantum physics, and there aren't any. Most CSs I know can barely do algebra; calculus? Forget it. They wouldn't know a calculus if it jumped up and bit them on the ass.
flashgordon
Nov 10, 2017