Quantum data defies an easy explanation, but approaching it with an open mind and robust imagination helps pave the way toward understanding.
“I think I can safely say that nobody understands quantum mechanics.”
– Richard Feynman, The Character of Physical Law (MIT Press: Cambridge, Massachusetts, 1995), p. 129.
Here’s the basic problem with understanding anything in the realm of quantum mechanics: None of it, none at all, is anything like what we experience on a day-to-day basis. The normal world we interact with (the “classical” world) is a world our brains evolved to deal with and understand, a world where things either happen or don’t, a world where opposites can repel or attract instead of being the same thing at the same time.
The problem of writing about quantum computing is how to bridge this gap in understanding with words. Unfortunately, it’s a gap that can only really be described with advanced mathematics, so some descriptions end up being pretty bad. This comic takes a humorous – and quite apt – look at that difficulty.
I’m going to hand-wave some of the more difficult parts, but trust that you, the reader, will take my word for it when I describe some of the weirder bits of the quantum world without dumbing it down for you. You’re smart enough to get it. I promise.
To understand quantum data, let’s start with classical data. We’re all pretty familiar with the bit – it’s the basic building block of digital information, representing a single 1 or 0. Eight of them make a byte, 1024 bytes make a kilobyte, etc. Bits are powerful not just because they once mapped nicely onto vacuum-tube computers, with 1 representing “electric current on” and 0 representing “electric current off”, but because of how much data they can represent in a relatively small amount of space.
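The bit-and-byte arithmetic above is easy to check for yourself; here’s a small Python sketch using just the numbers from this paragraph:

```python
# A bit holds a single 0 or 1; eight bits make a byte, 1024 bytes a kilobyte.
bits_per_byte = 8
bytes_per_kilobyte = 1024
print(bits_per_byte * bytes_per_kilobyte)  # 8192 bits in one kilobyte

# n bits can distinguish 2**n different values:
print(2 ** 8)  # 256 -- one byte can represent 256 distinct values
```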
Many schoolchildren learn about exponential numbers starting with an example like the following: Imagine that someone offers to do a job for a month, with their pay being one penny on the first day, two on the second, four on the third, eight on the fourth, etc. How much will that person be paid on the 30th day of the month? The answer is meant to be surprising: 2²⁹ pennies, or approximately US$5.4 million. Bits work the same way; 29 bits can express numbers up to approximately 540 million. You’d only need about 266 bits to count every atom in the observable universe (roughly 10⁸⁰ of them), and you’d still have space left over.
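The penny arithmetic above takes one line of Python to verify:

```python
# Pay doubles daily starting at one penny, so day n pays 2**(n - 1) pennies.
day_30_pennies = 2 ** (30 - 1)
print(day_30_pennies)        # 536870912 pennies on day 30
print(day_30_pennies / 100)  # 5368709.12 dollars -- about US$5.4 million

# That same power of two is the count of values 29 bits can express:
print(2 ** 29)  # 536870912, roughly 540 million
```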
Qubits (pronounced like the letter “Q” followed by the word “bit”) are the basic building block of quantum information. A qubit doesn’t represent a 1 or a 0; it also doesn’t represent “both 1 and 0”, and it doesn’t represent “1 or 0 at the same time”. Loosely speaking, it carries a 1 “amplitude” and a 0 “amplitude”, and when the qubit is measured, those amplitudes are turned into probabilities; when entangled with other qubits, those probabilities can translate into a lot of information. Without diving into how or why, the upshot here is that, just as bits can represent exponentially many numbers, qubits can represent exponentially many bits. Twenty-nine bits can represent about 540 million numbers, but it takes about 540 million bits to describe the state of 29 qubits. That translates into so many different numbers, my computer literally can’t estimate it (I just tried).
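As a toy illustration (a back-of-the-envelope sketch, not a real quantum simulator), a single qubit’s state can be written as a pair of complex amplitudes whose squared magnitudes give the measurement probabilities:

```python
import math

# A qubit state "alpha|0> + beta|1>" with |alpha|^2 + |beta|^2 = 1.
# Measuring yields 0 with probability |alpha|^2 and 1 with |beta|^2.
alpha = complex(1 / math.sqrt(2), 0)  # an equal superposition
beta = complex(1 / math.sqrt(2), 0)

p0 = abs(alpha) ** 2
p1 = abs(beta) ** 2
print(round(p0, 2), round(p1, 2))  # 0.5 0.5

# Describing n qubits classically takes 2**n amplitudes, which is why
# simulating even 29 qubits already means tracking ~540 million numbers:
print(2 ** 29)  # 536870912
```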
That’s just one way that quantum computing can influence how we do machine learning – machine learning requires processing a lot of data, and quantum-encoded data could take up a lot less space and a lot less time to process.