Some time ago I asked Love how computers really work. What makes them go? What makes them do what I want them to (and quite often, what I don't want them to)? He gave me a decent explanation (well, his explanation was probably awesome, but my comprehension of it was decent), part of which was about the binary system. As I understood it, it basically means the computer's "language" uses on and off switches, which in long series of codes mean different things. It is handy because the simplest way to use and understand the information from a switch is by having it either turned on or turned off. Either it gives a current or it doesn't. Quite the black and white world a computer lives in.
I didn't think this sounded practical though, or rather, wouldn't it be even more useful if a switch didn't just give an on or off response, but maybe also a lot of in-betweens? Imagine a person who answers everything with only yes and no, and then suddenly adds the options "perhaps" or "a little" or "a lot". It would add a lot of possibilities to their communication. I thought this would work for a computer too. Wouldn't it be able to store more information with the same set of switches if the switches could use in-between values?
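To give a feel for the on/off idea, here's a tiny sketch (in Python, which has nothing to do with how a real computer's switches are built, it just counts the combinations): each switch is a 0 or a 1, and a row of them spells out a number.

```python
from itertools import product

# Three on/off switches (bits).
switches = 3

# Every possible pattern of 3 switches: (0,0,0), (0,0,1), ..., (1,1,1)
patterns = list(product([0, 1], repeat=switches))
print(len(patterns))  # 2**3 = 8 distinct "messages"

# Reading one pattern as a number, the way binary works:
bits = (1, 0, 1)
value = int("".join(str(b) for b in bits), 2)
print(value)  # 5
```

So n plain switches give you 2 to the power of n different codes, and that doubling is exactly what you'd hope to beat if a switch could hold something richer than just on or off.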
After finding an article about quantum computers over at good old Sciencedaily.com, it seems scientists have had the same thought and have been looking into how to use atoms, or rather electrons, as bits, or "qubits" as they call them. A bit is a computer's unit of information, and what they've been trying to do is make electrons work as these information units.
The problem with electrons is that they don't behave the way we want them to. A lot of other things in the world will push an electron to behave in ways we can't control. The whole point of using them as information units is to make their behavior predictable and therefore controllable. But since electrons are very small, controlling one usually means controlling a whole big bunch of electrons, which apparently isn't what the scientists want.
But now they've come a step closer to making a single electron behave, over at Princeton University. They do this by trapping electrons in "microscopic corrals", which puts the electron in a quantum state. In this quantum state, not only does the electron act the way it's "ordered" to, without disturbing other electrons; most importantly, it can behave in a "not entirely off" and "not entirely on" kind of way. It can actually be in between.
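From what I've read since (and I'm no physicist, so take this as a simplified sketch rather than the real math), the "in-between" state is usually described with two numbers called amplitudes: one for "off" and one for "on". Squaring each amplitude gives the chance of finding the electron in that state when you look, and those chances always add up to 1.

```python
import math

# A simplified qubit: not 0 OR 1, but two amplitudes at once.
alpha = 1 / math.sqrt(2)  # amplitude for "off" (0)
beta = 1 / math.sqrt(2)   # amplitude for "on" (1)

# Squared amplitudes are the probabilities of reading each answer.
p_off = alpha ** 2
p_on = beta ** 2
print(round(p_off, 2), round(p_on, 2))  # 0.5 0.5 - a perfect in-between
print(round(p_off + p_on, 10))          # 1.0 - probabilities sum to one
```

With these two amplitudes equal, the qubit is as "in between" as it gets: a coin that is genuinely both heads and tails until you check.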
Having a single information bit give the same information as 100 do today would make the computer as we know it even smaller, and maybe faster (I don't really know how this quantum stuff works, so I'm not promising anything). I hope we get to see this in computers soon!