"It's like being able to place one phone call that simultaneously tests all four numbers, but only goes through to the right one."

A non-deterministic state machine? I thought that wasn't possible.

I'm going to Best Buy today to shop for a quantum computer. I don't need an advanced computer, just one that runs simple algorithms. The article says this one runs simple algorithms, so it would be perfect for my needs. Does anyone know where I can buy these quantum chips in bulk? I'd like to get 30 of them for my first-grade class next year.

A non-deterministic state machine? I thought that wasn't possible.

According to most of the dominant interpretations of QM, the entire universe is a non-deterministic state machine. But Einstein disputed this ("God does not play dice with the universe!"), and the jury is still out. No one really knows yet what superposition really means, or what the quantum wavefunction collapse really is (is it a real physical event, or merely a change in how humans conceptualize the system?) The dominant interpretation is that the wavefunction collapses when it is "observed". But does the universe actually recognize a physical separation between the observer and the observed? And what is an observer, anyway?

There are many conflicting interpretations of QM, some of which are deterministic, and some of which are not. And you'd need a PhD to really understand how these interpretations differ. QM itself provides relatively little guidance -- QM is, at its core, merely a set of equations that tell you what possible outcomes an experiment might have, and the probability of each. What's actually happening inside the experiment is anybody's guess, and there's no shortage of guesses. ;-)

I think of QM as being sort of like a set of statistical equations that tell you how often a coin toss will give you heads or tails under different circumstances. These equations regard the coin toss as a "random" event, but of course it's not really random -- in theory you could predict the outcome of each toss if you knew the coin's starting position, velocity, spin, etc. So it's only random until you discover and understand the underlying dynamics. And this leads me to question whether true "randomness" really exists. Will we perhaps one day discover a deeper set of equations which can perfectly predict supposedly "random" quantum events? No one knows. But the dominant interpretation models the universe as being inherently non-deterministic (i.e. random). You can compute the probability of any given outcome (and the probability curve itself evolves deterministically over time), but that's as much determinism as you get. As I understand it. :-)
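The coin-toss analogy can be made concrete with a minimal sketch (plain Python, invented probabilities): QM's statistical equations hand you a predicted probability for each outcome, and observed frequencies approach that prediction over many runs while each individual toss stays unpredictable.

```python
import random

# Minimal sketch with made-up numbers: the Born rule says a state with
# amplitudes a and b yields outcome 0 with probability |a|^2. Here we
# just sample that distribution (assumed |a|^2 = 0.36) and compare the
# observed frequency against the predicted probability.
def measure(p0: float) -> int:
    """One 'quantum coin toss': 0 with probability p0, else 1."""
    return 0 if random.random() < p0 else 1

random.seed(1)
p0 = 0.36                      # assumed |a|^2, for illustration only
trials = 100_000
zeros = sum(measure(p0) == 0 for _ in range(trials))
print(f"observed P(0) = {zeros / trials:.3f}, predicted {p0}")
```

The individual outcomes are random; only the frequencies are pinned down.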

The Q-chip here is superconducting; I don't think you'll be able to pick one up at Best Buy, or handle it very effectively.

This Q-chip also is not exactly a nano-scale chip; its quantum dots are millions of atoms apiece. Someday we'll be able to put together integrated circuits of billions of these Q-chips that will really knock your socks off! And that billion-element chip can be one of millions! And that personal-computer-sized supercomputer can be one of billions!

Talk about the tip of the iceberg! Talk about the calm before the storm! This is astronomically beyond the comprehension of the standard human being, or even some mathematical genius!

Talk about the calm before the storm! This is astronomically beyond the comprehension of the standard human being

Yep, it would be really cool to have atom-sized computing elements! Eventually we might even use subatomic particles. And some say that even the fundamental particles are just the tips of an underlying spacetime "froth", sort of like whitecaps appearing on a lake. They say that 99.9999999% of the "empty space" between particles is composed of this froth. Can you imagine if we were able to craft that froth directly into some sort of computing array, without even involving anything that physicists of today would call a "particle"? It would be like hijacking whatever "machine" runs the universe itself, and making it run our own computations! Harnessing the raw computing power of the universe itself. The possibilities boggle the mind.

The multiverse lives!

Wonderful! I always thought of QM as creating x (depending on the complexity and count of the qubits) parallel universes, and then forcing it (or you, but it doesn't really matter) to be the one with the correct answer. This is a simplification, but it really helps to grasp the fact that, for example, the correct phone number is selected from the myriad of possibilities.

mmm...froth

Autobots, assemble!!!!!

Wonderful! I always thought of QM as creating x parallel universes

Yeah, I've always had a soft spot for the multiverse interpretation of QM, too. But it seems to have fallen into disfavor lately, what with Bohm and decoherence and such. Oh, well. String theory will probably blow them all away anyway, eh? ;-)

Will this quantum computer also top up my bank accounts? If yes, it will solve all of my problems right now and I will be able to tackle new ones.

Will there be a quantum computer in my lifetime capable of the output of today's most powerful supercomputer for less than $3000? The answer might seem like an obvious yes, but I think there's the possibility of unforeseen difficulties in bringing this to market, which creates an order-of-magnitude error band on when we'll get this.

Actually, this is very useful technology,
I am always forgetting people's phone numbers.

The modern computer started with the invention of a single lowly transistor (1947). This is the quantum equivalent. But I'll bet it doesn't take 60 years to reach the level we are at today with modern computers.

Quantum computing is one of the biggest hoaxes/crackpotteries in the history of science, on a par with the flat earth hypothesis.

Quantum Computing Crackpottery:
http://rebelscien...ery.html

What's a non-deterministic state machine?

Oh, well. String theory will probably blow them all away anyway, eh? ;-)

String theory is a dead end.

I've never been convinced on the alleged ability of a spintronic/quantum computer to perform "simultaneous" calculations.

The analogy of making a single phone call to guarantee the correct number among 4 random recipients is simply hogwash.

Basically, in logical "computer terms" this is like searching an un-sorted, un-referenced array (the 4 un-labelled phone numbers) for an un-identified value (the correct friend) and getting it right on the first try as anything other than a fluke.

This is simply logically impossible, as there is no mechanism presented to explain this behaviour.

The following section assumes a one-molecule transistor (qubit) for simplicity.

1) One can conceptualize how an electron might conceivably have two states at once, but there is no novel mechanism for detecting the presence of both states in a single comparison. If the spin of an electron is both up and down, it would seem to require at least two separate comparisons to prove that.

2) Even though a 3-state transistor (my take on "up", "down", and "both") could theoretically perform multiple tests in a single step for certain very cleverly designed data objects, in general this would not apply to everything, and would only work with some very inventive data-object and database design by the programmer. The quantum processor could never "magically" perform scores of calculations in a single step and output the results immediately the way quantum computing promoters often claim. For example, it simply cannot add 2+3=5 and 1+6=7 simultaneously on a one-processor, one-word machine, even if it is "quantum", as there is no way to know which "up," "down," or "both" applies to which data object from which algorithm.

3) Now in terms of data storage, power consumption, and processor time, theoretically, a quantum computer would be much better than a binary computer. This assumes a non-volatile RAM (spintronic/qubit transistors) and a non-volatile spintronic secondary storage(data stick w/ non-volatile qubit transistors.)

in binary
8 bits = 1 byte = 256 values (0-255)
32 bits = 4 bytes = 4294967296 values (0 upward)

in qubits
6 qubits = 729 values (0-728)
24 qubits = 282429536481 values (0 upward)
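The tallies above implicitly read each "qubit" as a three-valued cell (a trit), so n cells distinguish base^n values. A quick sketch checking the figures, under that assumption:

```python
# Assumption carried over from the post: each "qubit" is read as a
# three-valued cell (a trit), so n cells distinguish base**n values.
def distinct_values(base: int, cells: int) -> int:
    return base ** cells

assert distinct_values(2, 8) == 256             # 1 byte: values 0-255
assert distinct_values(2, 32) == 4294967296     # 32 bits
assert distinct_values(3, 6) == 729             # 6 trits: values 0-728
assert distinct_values(3, 24) == 282429536481   # 24 trits
print("all four figures check out")
```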


Applications:

Power:
Because far fewer transistors are needed to store the same size number, or to handle the same number of symbols, data sets, and command sets, you use far less power. A 24-qubit quantum processor uses only 3/4 as many transistors as a 32-bit binary processor, so right away the energy from "flipping switches" in the processor is cut by a quarter. That's not counting that qubits are non-volatile if based on "spintronics," and thus use less energy anyway... potentially hundredths, thousandths, or even less of the energy of an electronic computer. This means less power used, and less wear and tear on the components; i.e., motherboards, processors, and RAM never burn out or overheat.

Math:
Allows for handling absurdly large (or small) numbers in far fewer processor steps, because very large numbers take up far fewer transistors in the base 3 allowed by quantum transistors than in the base 2 of electronic transistors (see figures above). While this may not appear hugely significant for small numbers, astronomically large or small numbers are represented by "multi-word" data objects and require very many cycles of the processor to add, subtract, multiply, or divide a single number. When done with data that fits into fewer transistors, and therefore fewer "words", this saves a tremendous number of operations.

Example:
3^48 > 2^64
3^128 >>> 2^192*

* this saves 64 transistors and stores a far larger number, and can then be managed in far fewer operations of the processor since it involves fewer "words" and so on.

Basically, the higher the "base," the fewer the operations you need; equal or fewer, never more.
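The inequalities above, and the "equal or fewer, never more" claim, can be checked with exact integer arithmetic; `cells_needed` is a helper name invented here for the sketch:

```python
# Exact integer checks of the figures above.
assert 3**48 > 2**64
assert 3**128 > 2**192

def cells_needed(n_values: int, base: int) -> int:
    """Smallest k with base**k >= n_values (invented helper, exact)."""
    k, capacity = 0, 1
    while capacity < n_values:
        capacity *= base
        k += 1
    return k

# A higher base never needs more cells for the same range of values:
for n in (256, 4294967296, 2**64, 2**192):
    assert cells_needed(n, 3) <= cells_needed(n, 2)
print(cells_needed(2**64, 2), cells_needed(2**64, 3))   # 64 vs 41
```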


Text documents (including HTML and scripting languages):

It's a bit lengthy to explain, but I have concocted a potential data storage and compression algorithm which would work WONDERS in qubits and which is physically impossible to do in binary.

Imagine that "qubyte" above, which was only 6 qubits. Now in a text document, it can store almost 3 times as many possible symbols as a byte, and is yet 2 transistors smaller!

A typical algorithm in data compression is to make a single bit 0 or 1 which indicates whether the following byte is a "real text" character or a symbol for "compressed text" found in the symbol table. Well, since in base three the preceding qubit can have 3 possibilities, interpreted as 0, 1, or 2, we can literally have twice as large a symbol table (column 1 or column 2). Perhaps "column 1" holds "standard" compression items, such as common phonetics, words, and syllables ("Th", "The", "to", "an", "es", "ing", etc.). Basically, this becomes a "codebook" that is common to all computers. Meanwhile, some or all of column 2 might be "custom" compression symbols, such as words that are especially common or unique to the text in mind (proper names, rare scientific words or acronyms which are used repeatedly in the text, etc.). In reality, you would only need a few dozen such entries, as it would be better to use all of the possibilities for the most common words, prefixes, and suffixes in the relevant language.

Thus, uncompressed text documents would literally be at least 25% smaller in "qubytes" to begin with as compared to text in bits. Then the EXACT SAME basic data compression concept could allow most text documents to be compressed to perhaps as little as 10-20% or less in some cases.

In terms of data compression, the reason this is superior to the same basic concept in binary is this (hopefully, this is understandable to the reader):

1) 6 qubits fits on 25% fewer transistors than 1 byte, and yet has almost 3 times the possible symbols.

2) 1 qubit "flag" allows us to identify not only whether a symbol is compressed or whether it is plain text, but also, in the same step, tell the processor which table to use to decompress if it indeed is compressed. (In binary, since you need extra bits as part of the "flag", this would have more and more overhead to perform the same algorithm. Base 3 has literally half the overhead in this case.)
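A sketch of the trit-flag decoder described above; the table names and all table entries are invented here purely for illustration (flag 0 = literal, 1 = shared codebook, 2 = custom table):

```python
# Hypothetical decoder for the scheme above. Every token carries one
# leading trit: 0 = literal character, 1 = shared "codebook" entry
# (column 1), 2 = per-document entry (column 2). Tables are invented.
STANDARD = {0: "the", 1: "ing", 2: "tion"}   # column 1: common to all
CUSTOM = {0: "qubit", 1: "Grover"}           # column 2: per-document

def decode(tokens):
    """tokens: list of (flag_trit, value) pairs."""
    out = []
    for flag, value in tokens:
        if flag == 0:
            out.append(value)            # literal plain-text character
        elif flag == 1:
            out.append(STANDARD[value])  # shared symbol table
        else:
            out.append(CUSTOM[value])    # custom symbol table
    return "".join(out)

print(decode([(2, 0), (0, "s"), (0, " "), (1, 0)]))   # "qubits the"
```

The point being illustrated: one trit selects among three cases, where a binary flag would need extra bits to address a second table.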

Crap, that was long.

Anyway, there are certain physical benefits to a quantum computer, but I have never been convinced on the "do multiple calculations simultaneously" part. It just doesn't hold water.

Well, can this device actually perform the operation stated, or can't it?

The answer is "no, it can't. No computer ever could."

Just because a transistor can be in two states simultaneously does not mean it can actually store two separate data items simultaneously, nor does it mean it can be used to perform two separate tests or operations simultaneously (as stated, with a very few very specialized exceptions).

The classic notions of a quantum computer performing millions of calculations simultaneously, or solving all possible solutions to an equation simultaneously, or even "dialing all 4 numbers simultaneously and only the correct one answers," are all science fiction.*

There is no basis whatsoever for the claims made in that article, as there is no mechanism, real or imaginary, which could actually facilitate this with any degree of reliability.

* Even the programming language to attempt to handle any such claims is ridiculous.

variable initializations:

$N = 1;
$A = 0;
$M = 1 and 0;

So what is $N + $M? What is $A + $M?

What happens on a classical "if" statement which is checking "M"? Do you literally split "threads" and handle both possibilities as legitimate outcomes, even though this may become nonsensical and conflicting?

Anyway, even if superposition is "real", a "real" quantum computer using spintronic "qubit" processors and spintronic primary and secondary memory data storage cannot do the crap that you see in this article nor in science fiction.



It CAN DO the following things (mostly due to being able to store larger numbers or more symbols on the same number of transistors):

1) CAN: Perform the same mathematical operation POTENTIALLY in fewer steps, depending on how large or small the numbers are which you are working with.

For small, ordinary numbers that most computer software uses (small loops, few entries in a database, small-number math, etc.), this would be insignificant or even entirely non-beneficial.

2) CAN: Store the same text document in at least 25% less space while uncompressed, and possibly 90% less space while compressed. Almost anyone who uses the internet would benefit from this.

3) CAN: Reduce the size of all "one word" instruction sets by 25%, and also potentially reduce SOME multi-word instruction sets to fewer words than they currently are (highly data- and/or application-specific, but works much like the "very large or small numbers math" idea above).

Things it CANNOT do (because there is no logical mechanism to facilitate them):





A) CAN'T: Perform multiple operations on the same processor simultaneously. This is pure science fiction. To demonstrate why, just imagine the base-3 number system:

0 = 0
1 = 1
2 = "both"

Now imagine our quantum processor trying to do two math problems simultaneously, as science fiction (and crackpottery) claims. To keep this very simple, we consider the operations:

1 plus 2 = 3
3 plus 4 = 7

For simplicity, I'll just look at the first 4 bits.

usual binary (p is short for the "plus" sign):

equation A: 0001 p 0010 = 0011
equation B: 0011 p 0100 = 0111



The problem with the "superposition" interpretation appears here.

"quantum" (assuming binary, to try to demonstrate the paradox of alleged simultaneous operations):

The problem comes in because: is zero really zero?

0 plus 0 = 0

but in binary, 1 plus 1 = 10

Is the "2" ("both" state) that way because equation A has a 1 and B has a 0, or is it because A has a 0 and B has a 1? It makes a WORLD of difference, and it is not possible to detect which is the case without additional operations (in fact, more operations than two separate simple binary additions).



To further illustrate this paradox and its absurdities (Q tells us which qubit we are looking at, A and B are the data results of the respective equations, "2" represents "both", and C is the value the processor outputs):

Q 123456
A 000011
B 000111
C 000211

Now the problem is, C cannot actually tell us whether A's 4th qubit is 0 or 1, only that between A and B both 0 and 1 are represented. Thus the processor has a wrong result for BOTH numbers 50% of the time (actually, it has a wrong result for both numbers 63 out of 64 times, which makes it worse than guessing).
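The information loss being described can be sketched in a few lines of plain Python; the merge rule below just formalizes the 0/1/"2" table above:

```python
# Sketch of the argument above: collapsing two bit-strings into one
# 0/1/"both" string is not invertible. Agreeing positions keep their
# bit; disagreeing positions become "2".
def merge(a: str, b: str) -> str:
    return "".join(x if x == y else "2" for x, y in zip(a, b))

# The example from the table: A=000011, B=000111 merge to 000211 ...
assert merge("000011", "000111") == "000211"
# ... but the swapped pair merges to the same string, so the original
# A and B cannot be recovered from C alone.
assert merge("000111", "000011") == "000211"
print("merge is lossy: two distinct input pairs -> one output")
```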





B) CAN'T: Search an un-sorted, non-referenced array in one step with a guaranteed correct outcome, in spite of what this article implies with the "phone number" example. That is just nonsensical hogwash. This would not be possible EVEN IF "A" above were possible. This would not be possible even if the uncertainty principle were not true. And with the uncertainty principle, even if you COULD perform this operation successfully just one time, you would have destroyed all four data objects (phone numbers) in the process, which is bad for future use of any of the data in question.

Didn't D-Wave already design, construct and make a quantum processor and then a quantum computer and show a demonstration of it a couple of years ago?


To the best of my knowledge, what D-Wave did was build a regular analog computer using the coherent state of electrons in a superconductor, which behave classically even though they can only exist due to quantum mechanical effects. So they couldn't implement any true quantum algorithms, but they marketed it as "quantum computing" to get press. This sounds like the real deal however.


This article doesn't even explain what its alleged "algorithms" were. What sort of algorithm could actually be run on a 2-qubit machine that does something complicated enough to test any of the science (fiction) quantum computing concepts I've addressed? In the base-3 interpretation, a 2-qubit processor can only handle numbers 0 through 8 as a single "word", so if by "algorithm" they mean adding 3 plus 5 equals 8, that is very unimpressive.

If they mean sorting a randomized array of 100 terms in verifiably fewer operations than is physically possible with a 2 bit binary computer, now that would be an "algorithm," but I doubt anything that useful was actually done here, else they would have presented it.


Are they actually claiming that this 2 qubit processor has already run an "algorithm" which someone invented to search an un-sorted array of 4 terms, making only one comparison, and getting it right every time on the first try?

Let's see the algorithm, and let's actually see some sort of physical proof of this absurd claim.

Quantum Conundrum, I agree with your basic argument. Quantum computing is crackpottery any way you look at it. In my opinion, this is just one more announcement designed to keep the money coming in. They've got nothing that is worth anything, simply because it's all based on BS.

"Quantum computing is one of the biggest hoaxes/crackpotteries in the history of science, on a par with the flat earth hypothesis.

Quantum Computing Crackpottery:
http://rebelscien...ery.html"

Wow, one could give up after seeing somebody who's so far down their path of thought like this . . . trying to explain anything to anybody about science. I mean, this guy sounds like the people who wouldn't look through Galileo's telescope, asking "why doesn't the air get left behind?", "why doesn't the earth fall apart?", "why do I stay firmly planted on the solid earth if it is moving like this?"

I've tried to explain things like this to "CRN/FORESIGHT" people . . . that the only way for humanity to grow up and get past . . . the past . . . is to be free; to be able to get together with those who see and understand things, and to be able to get away from those who don't and will not, because they are wrapped up in their world; I mean I'm not even talking about fear mongers here; I'm just talking about people who think but cannot, and will always refuse to, think about new ideas. But no, we've got to bind everybody up here on earth because damn if some humans see that 1) humans are the technologically dependent species, and 2) we are about becoming transhuman. But Eric Drexler, Chris Phoenix, Bill Joy, Mike Treder, Ralph Merkle, and all the rest I can only suppose, since nobody criticizes them!

Like I've said before: you don't want to think about things scientifically, I don't want to hear your problems; just shut up; and when push comes to shove in the future (and it will, with all those irrationalists out there, to the tune of practically six billion people all socially bound up with anti-science as a social grace . . . oh yes, push will come to shove) . . . I don't want to hear about it.

People argue (I suspect; nobody actually talks to me; they just kind of make quick remarks so that nothing comes back their way) that I don't learn the real mathematics, and/or that I spend too much time with E.T. Bell's "The Development of Mathematics", Morris Kline's "Mathematics: The Loss of Certainty"/"Mathematics in Western Culture", and Jacob Bronowski's works (I'd recommend "The Origin of Knowledge and Imagination", "Science and Human Values", "Magic, Science, and Civilization", "The Western Intellectual Tradition", and some of his collected works like "The Visionary Eye" and "A Sense of the Future", and not "The Ascent of Man") to get to the real stuff. Well, I agree I need to read the more technical stuff; but without knowing the philosophy and history behind all that stuff, I don't know the humanity . . . both the nature and origins of the technical stuff, and the philosophy of those behind it, and the philosophy behind those that disagree or just flat out don't know or come from the mathematical tradition. But I'm still young and healthy, and I'm not that bad mathematically or that far from learning and doing; and since I've read all that stuff above, I can see the big picture; I can see where to go.

I know my family had the book of the author who came up with the famous quote "those who don't learn their history are doomed to repeat it", but it seems quite clearly to not be around anymore; it got lost, grew wings and flew away apparently. Bottom line here is that what happened to Greek mathematical science due to Plato's restrictions of no experiment and compass-and-straightedge-only will happen again. When the future comes around, we'll be living in a kind of Logan's Run; and because the technology is so powerful and incomprehensible, we'll never leave the earth; and because no knowledge is the final knowledge (the CRN/Foresight guys/gals hate Gödel's theorems), problems will ensue both social/psychologically and 'technologically.'

String theory is a dead end.

Any theory dealing with such high energies can't in the foreseeable future be tested, and so will always be accused by some of just being 'philosophy.'

This is just a bad excuse for a failure.

Low-energy physics has to be derivable from high-energy physics, just as Newton's laws are derivable from relativistic laws. There are plenty of low-energy phenomena which need to be explained by a proper theory of everything (which string theory aspires to be) and which should therefore allow for its testing: things like the origin of mass, why masses and mixing angles are what they are, where the fine-structure constant comes from, why the Koide formula holds, why there are 3 generations of particles, where gauge symmetries come from, how entanglement works, what causes decoherence, how many forces there are, what constitutes dark matter and dark energy, what the nature of neutrinos is, whether gravitons really exist and in what form, and so on.

And as for theories which are genuinely beyond experimental testing, they should never be funded with public money. Every successful human model has been based on experimental evidence; without such evidence to guide the way, it's pretty much guaranteed that the model will be wrong and the funding which went into it wasted. Therefore development of such theories has to be postponed until the time when proper experiments are within reach.

String theory is a cancer on the body of theoretical physics. It's a manifestation of a more general problem concerning science: increasing the number of scientists is not always a good thing. There's only so many bright and honest people with a genuine passion for a particular field. Lowering the barrier to entry to raise the number of scientists means allowing mediocre ones into the field, which results in lower overall quality and dilution of both talent and knowledge. Mediocre scientists have to produce publications, but being unable to tackle the real problems of their fields, they invent replacement problems: string theory, the multiverse, extra dimensions, the first 10^-40 seconds after the big bang, and other such speculative nonsense. Normally such "research" should not pass peer review, but with mediocre scientists far outnumbering the real ones, it not only passes, it even becomes a hot and trendy area of "research"! What's more, this crap is then packed up in hype and sold to an ignorant public. A whole industry built on pseudoscience.

Flashgordon:

I would like few things more than for quantum computing as portrayed in this article and in science fiction to be true. However, it simply is neither plausible nor possible in this universe, nor any universe which has any degree of causality whatsoever.

I have attempted (admittedly poorly,) to show above the physical and logical reasons why no processor could actually perform multiple calculations simultaneously, even if the principle of superposition is true. The most basic reason is that the input and output data from one calculation would necessarily corrupt the input and output data of the other simultaneous calculation, and as I demonstrated, there is no way to recover even one of the results, much less both of them.

Just think about this for a moment, can anyone reading this actually concoct a procedure or a logical algorithm which a quantum computer could execute to supposedly solve an equation for all roots in a single step, or make a single comparison and guarantee the correct result while searching an array?

I don't claim to know the mathematics; but I know there are mathematicians who have worked it out; I've seen papers at that formerly-Los-Alamos preprint article website (now at Cornell).

The link at the bottom of the article leads to a better description (the abstract of the paper, I guess), which says that "Here we demonstrate a two-qubit superconducting processor and the implementation of the Grover search and Deutsch-Jozsa quantum algorithms". Wikipedia says Grover's algorithm allows "searching an unsorted database with N entries in O(N^(1/2)) time and using O(log N) storage space" as compared to linear ( O(N) ) time for a classical algorithm. (Note: the Wikipedia page on Grover's algorithm has a link to the other algorithm mentioned.)
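For the curious, the N = 4 case of Grover's search (the two-qubit size reported here) can be sketched as a toy statevector simulation in plain Python; the marked index below is an arbitrary choice for illustration, and no claim is made about the hardware itself:

```python
from math import sqrt

# Toy statevector sketch of Grover's search over N = 4 entries (two
# qubits). Start in the uniform superposition, then apply one Grover
# iteration: an oracle phase-flip on the marked entry, followed by
# "inversion about the mean" (the diffusion step).
N, marked = 4, 2
state = [1 / sqrt(N)] * N          # uniform superposition

state[marked] *= -1                # oracle: flip the marked amplitude
mean = sum(state) / N
state = [2 * mean - amp for amp in state]   # inversion about the mean

probs = [amp * amp for amp in state]
print(probs)   # [0.0, 0.0, 1.0, 0.0] -- all probability on `marked`
```

For N = 4, one iteration succeeds with certainty, which is the special case behind the article's "one phone call" analogy; for larger N, roughly (pi/4)*sqrt(N) iterations bring the success probability merely close to 1.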

Ok, even the wiki article says that the Deutsch-Jozsa algorithm is of "little practical use".

Also, the Grover algorithm is still probabilistic and carries a percentage chance of error! This is therefore absurd, and amounts to rolling a set of fixed dice. You may have an increased chance of getting the right number, but you still have a chance of getting the wrong number too, which is unacceptable for real-world applications.

To me, the efficiency of an algorithm to solve the "worst case scenario" search or sort must be defined by the first iteration that is guaranteed to always have the right result, NOT the first iteration that "probably" has the right result.

Grover's algorithm is probabilistic in the sense that it gives the correct answer with high probability. The probability of failure can be decreased by repeating the algorithm



That is simply an absurdly unacceptable standard, because in reality a "worst case scenario" in Grover's algorithm actually ends up requiring more iterations than even a linear search, because you have to check and re-check the results to be sure you haven't made a mistake; and even then, since future checks also have a probability of being wrong, your results can never be "guaranteed correct". Which is functionally useless.


In a binary search on a regular computer, the WORST case scenario when searching an array of 2^N objects requires only N iterations.
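That classical baseline is easy to confirm with a sketch; the probe count below tallies loop iterations, which for 2^n elements comes out to n + 1 in the worst case (the post's "N iterations" is the same idea, give or take one):

```python
# Confirming the classical baseline: binary search over a sorted array
# of 2**n elements needs at most n + 1 probes.
def binary_search_probes(arr, target):
    """Return (index or None, number of probes)."""
    lo, hi, probes = 0, len(arr) - 1, 0
    while lo <= hi:
        mid = (lo + hi) // 2
        probes += 1
        if arr[mid] == target:
            return mid, probes
        if arr[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return None, probes

arr = list(range(2 ** 10))    # 1024 sorted entries
worst = max(binary_search_probes(arr, t)[1] for t in arr)
print(worst)                  # 11 probes in the worst case for 2**10 entries
```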

Obviously, the best case scenario possible for any algorithm would be finding the right result always on the first iteration, but I still do not believe that has been proven possible for anything other than an array of one element, unless you already know the correct answer ahead of time (but then why search; simply de-reference and move on...).


But the worst case in the Grover algorithm (which was glossed over and ignored,) is actually an infinite loop of always getting a different answer on each subsequent check, which CAN and WILL happen from time to time, which makes it useless.

Any probability less than unity is functionally useless in an algorithm.

This technology is going to take over the world. I'm always forgetting telephone numbers. Just the other day I lost someones' phone number. Just imagine what the world would be like with one of these gadgets!

This technology is going to take over the world. I'm always forgetting telephone numbers. Just the other day I lost someones' phone number. Just imagine what the world would be like with one of these gadgets!


It's science fiction.


Ok, even the wiki article says that the Deutsch-Jozsa algorithm is of "little practical use".
However, it always produces the correct answer. The Wikipedia article says, "For a conventional deterministic algorithm where n is the number of bits/qubits, 2^(n-1) + 1 evaluations of f will be required in the worst case," while "The Deutsch-Jozsa quantum algorithm produces an answer that is always correct with a single evaluation of f."

"little practical use" - this is rubbish. I am always forgetting telephone numbers, and a device that can correctly get 1 out of 4 telephone numbers would be a dream come true for me. I am sure that I am not the only person who feels this way?

Doesn't quantum uncertainty require/produce a non-deterministic universe?

Doesn't quantum uncertainty require/produce a non-deterministic universe?
Quantum uncertainty requires unpredictability; so, depending on one's definition of determinism, it may require that the universe be non-deterministic.

However, the probability amplitudes of any given processes in the universe (as far as I know) change deterministically/predictably. So, given a large enough set of occurrences of a process, the proportion of each possible outcome can be predicted very precisely (as with most statistical observations, the more instances the smaller the uncertainty in the proportions.) So, uncertainty produces unpredictability on small scales, but does not preclude a large measure of determinism on large scales.
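The large-numbers point can be illustrated with a quick sampling sketch (plain Python; the probability 0.25 for some two-outcome process is assumed purely for illustration):

```python
import random

# Sketch: individual outcomes are unpredictable, but the observed
# proportion tightens around the predicted probability as the number
# of instances grows (assumed p = 0.25 for illustration).
random.seed(7)
p = 0.25
for n in (100, 10_000, 1_000_000):
    hits = sum(random.random() < p for _ in range(n))
    print(f"n={n:>9}  deviation from p: {abs(hits / n - p):.4f}")
```

Small samples wander; the million-sample proportion sits very close to the predicted value, which is the "large measure of determinism on large scales."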

This device is the greatest thing since Alexander Graham Bell invented the telephone. Surely?

KBK is right - the ability to remember four telephone numbers is the way ahead; this is the future.

I agree - this new ability to remember one-out-of-four telephone numbers is indeed quite astounding.