I've seen something like this before, like maybe a year ago.

This never seems to get explained properly in articles, and the popular version of it is actually a fallacy.

There is a detailed presentation from a university lecture, I think on YouTube, which explains this relationship, and it is a precise mathematical relationship. The computer cannot "try all possibilities simultaneously".

http://www.youtub...e=relmfu

and

http://www.youtub...e=relmfu

See the abstract.

It depends on the algorithm and application, but there is a finite, real limit both to how many calculations the computer can do and to how many calculations are required to solve a problem.

Techno1,

Yep, it's already been proven that the set of complexity classes for which quantum computers will outperform digital computers is not very large. We'll see great improvements in solving problems in critical cases, but the benefits won't be as widespread as most journalists casually like to claim.

It's so incredibly easy to misunderstand details of quantum mechanics, and I'm nothing close to a physicist, but I do study mathematics regularly and am fully aware of what's involved with interpreting the meaning of equations so that you can make intuitive breakthroughs or describe an abstract system.

Interpretation requires such extreme care, correctness, and clarity... I have no doubt there are still many crucial details about quantum mechanics that we are not interpreting correctly to this day.

My usual criticism is of physicists taking statistics and probability too literally. Look at quantum mechanics documentaries from just five years ago; they're absurd.

In cases where "N" is a very large number, the quantum computer will give huge returns for those classes of problems it actually can solve, but it certainly isn't like the stereotyped, mis-reported notion.

N vs sqrt(N)

is a big difference, but it's not infinite, and it's not solving all possibilities simultaneously.

That's:

N vs sqrt(N)
100 vs 10
1E6 vs 1E3
1E12 vs 1E6
1E18 vs 1E9
etc.

While these numbers are mind-numbingly incredible returns, they are far from what is commonly reported.
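Just to make that scaling concrete, here's a quick Python sketch (my own toy illustration, not from the lectures), assuming the usual Grover-style sqrt(N) bound for unstructured search:

import math

# Rough illustration only: a classical brute-force search over N items needs
# on the order of N checks, while a Grover-style quantum search needs on the
# order of sqrt(N) oracle queries.
for exponent in (2, 6, 12, 18):
    n = 10 ** exponent
    classical = n                       # ~N checks in the worst case
    quantum = math.ceil(math.sqrt(n))   # ~sqrt(N) Grover iterations
    print(f"N = 1E{exponent}: classical ~{classical:.0e}, quantum ~{quantum:.0e}")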

Some other classes of problems will not get that big of a return, like determining the winner of a game:

N^0.753 vs N^0.5

Which is, again, the square root of N on the right side.
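To put rough numbers on that gap (my own sketch, assuming the usual game-tree evaluation bounds of about N^0.753 leaf queries for the best classical randomized algorithm versus about N^0.5 for the quantum one, constants ignored):

import math

# Compare ~N^0.753 (classical randomized game-tree evaluation) with
# ~N^0.5 (quantum) for a few tree sizes.
for exponent in (6, 9, 12):
    n = 10 ** exponent
    classical = n ** 0.753
    quantum = math.sqrt(n)
    print(f"N = 1E{exponent}: classical ~{classical:.2e}, quantum ~{quantum:.2e} queries")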

This will probably make computers unbeatable in chess, but then again, on max difficulty they are nearly unbeatable already, and they already do the calculations in less than a second.

For searches and sorts, which are what most computing does, it will only be useful for certain classes of search and sort.

Say a classical algorithm takes M*N operations, reflecting an algorithm in two variables, such as "searching" a two-dimensional array; then the quantum algorithm would require a MINIMUM of sqrt(M*N) cycles... I think...

I think it's:

M*N vs sqrt(M*N)

Which, it turns out, doesn't even matter, since you can just substitute the total count M*N for N in the sqrt(N) rule, and sqrt(M*N) = sqrt(M)*sqrt(N) anyway.
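A tiny numerical check of that substitution (my own arithmetic sketch, nothing more):

import math

# Treat an M-by-N search space as one flat list of M*N entries:
# Grover-style scaling gives ~sqrt(M*N) queries, and sqrt(M*N) = sqrt(M)*sqrt(N),
# so it makes no difference whether you think of it as one variable or two.
m, n = 1_000, 1_000_000
flat = math.sqrt(m * n)
factored = math.sqrt(m) * math.sqrt(n)
print(f"classical ~{m * n:.0e}, quantum ~{flat:.2e} (same as {factored:.2e})")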

Also, real-world algorithms probably won't be as optimized as theoretical "pure math" algorithms, which means you may not actually get a result as low as "sqrt(N)".

First of all, as would be obvious, "Sqrt(N)" needs to be rounded up to the nearest whole number of clock cycles.

Second, the real algorithm may not be ideal. The best possible real-world algorithm may require "C*sqrt(N)" or "sqrt(C*N)" or "sqrt(N + C)" or "sqrt(N) + C" cycles, where "C" is a constant, probably 1 or some other number.

In all of those cases, the number of cycles will still be smaller than on a classical computer, just probably not as much smaller...

And above, I should clarify that "C" would always be greater than or equal to 1, and never less...
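For what it's worth, here's a toy comparison of those overhead variants (my own sketch with a made-up constant C, just to show they all stay far below N):

import math

n = 10 ** 12
c = 25.0                                        # hypothetical overhead constant, C >= 1
ideal = math.ceil(math.sqrt(n))                 # ideal sqrt(N), rounded up to whole cycles
scaled_outside = math.ceil(c * math.sqrt(n))    # C*sqrt(N)
scaled_inside = math.ceil(math.sqrt(c * n))     # sqrt(C*N)
print(f"classical ~{n:.0e}, sqrt(N) ~{ideal:.1e}, "
      f"C*sqrt(N) ~{scaled_outside:.1e}, sqrt(C*N) ~{scaled_inside:.1e}")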

There is a fundamental misunderstanding here. Quantum computing is not interesting because of how fast quantum computers can solve toy problems that can also be solved by classical computers. All the research on quantum computing is happening because it can be used to solve problems that are beyond the reach of classical computers.

Let me show this using Shor's Algorithm. This is a quantum algorithm for finding the factors of an integer N. The time required is proportional to the cube of log N. If it takes you 1 day to factor a thousand-digit number, it takes 8 days to factor a two-thousand-digit number. (More likely it takes 4 times as long, on a QC with twice as many q-bits. But I digress.)

How long does it take to factor a (hard) thousand-digit number using a classical computer? It hasn't been done, and it probably never will be. The best known classical algorithm (the number field sieve) takes time that grows far faster than any polynomial in log N, the size of the problem statement.
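A rough back-of-envelope in Python (my own sketch: Shor's cost is treated as simply (log N)^3, and the classical side uses the standard heuristic number-field-sieve estimate exp(c*(ln N)^(1/3)*(ln ln N)^(2/3)) with c = (64/9)^(1/3); absolute units are meaningless here, only the growth matters):

import math

def shor_cost(digits):
    # ~ (log N)^3, up to constant factors
    ln_n = digits * math.log(10)
    return ln_n ** 3

def gnfs_cost(digits):
    # Heuristic general number field sieve running time, up to constant factors
    ln_n = digits * math.log(10)
    c = (64 / 9) ** (1 / 3)
    return math.exp(c * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3))

for d in (1000, 2000):
    print(f"{d} digits: Shor ~{shor_cost(d):.2e}, NFS ~{gnfs_cost(d):.2e}")
print("Shor ratio, 2000 vs 1000 digits:", shor_cost(2000) / shor_cost(1000))  # ~8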

EAchus:

I am aware of that as well, but what you are talking about is only going to be used in cryptography and a few other fields.

Moreover, having the ability to crack such problems is more likely to favor HACKERS than legitimate purposes.

Can you imagine what the world would be like when hackers, terrorists, and rogue states could literally crack ANY computer code or communication protocol or encryption scheme? Absolutely no communication would be secure, and they would steal your bank account at will too...