“Of all the theories proposed in this century,” physicist Michio Kaku wrote in Hyperspace, “the silliest is quantum theory. In fact, some say that the only thing that quantum theory has going for it is that it is unquestionably correct.”
Last month, our client Accurate Append, an email and phone contact data quality vendor, blogged about a big space data conference and described promising developments in the use of data analytics to measure greenhouse gases from satellite imagery, to identify organic molecules and methane on other planets and moons (critical to the search for the origins of life), and more.
But how about a deeper dive into something even more complex? The particles (like photons and electrons) that make up the substance of the universe behave in really strange ways if we look at them closely. They have a “reality” very different from classical descriptions of matter as stable and consistent. Understanding that strange behavior—and then even harnessing it, or flowing along with it—is the challenge of applying quantum theory, and this has world-shattering implications for big data and artificial intelligence, to say the least.
It really depends on who you ask, of course, whether this is a good thing. Shouldn’t we be able to break codes used by criminals or terrorists? We may be heading into a brave new world where security and insecurity co-exist along with the on, off, and on-off of quantum states. An expert in “ethical hacking” told Vice back in 2014 that “the speed of quantum computers will allow them to quickly get through current encryption techniques such as public-key cryptography (PK) and even break the Data Encryption Standard (DES) put forward by the US Government.”
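To see why public-key cryptography in particular is in the crosshairs, remember that RSA-style keys rest on the difficulty of splitting a large number back into its two prime factors. The toy Python sketch below is our own illustration, not anyone’s production code: it factors a tiny textbook modulus by brute-force trial division, the kind of search that becomes hopeless at real key sizes but that Shor’s algorithm on a sufficiently large quantum computer could handle in polynomial time.

```python
# Toy illustration: an RSA-style public key hides two primes inside their product N.
# Recovering them classically (trial division here) scales terribly with the size
# of N; Shor's algorithm on a quantum computer would do it in polynomial time,
# which is why the quote above singles out public-key cryptography.
def factor(n):
    """Return a factor pair of n found by brute-force trial division."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return n, 1  # n itself is prime

print(factor(3233))  # (53, 61): the classic textbook toy RSA modulus
# Real RSA moduli are 2048+ bits; a loop like this would run longer than the age
# of the universe, which is the gap quantum factoring threatens to close.
```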
In the most oversimplified of nutshells, quantum computing goes beyond the binary on/off states computer bits normally operate under, adding a third possibility: a superposition that is, in a sense, both on and off at once. The main consequence of this third state is that quantum computers can work “thousands of times faster” than non-quantum computers on certain problems, beyond anything that could otherwise be imagined. That speed also adds to the security of quantum data. Experts call it un-hackable, which is a pretty audacious claim.

Some of the basic everyday miracles of quantum physics also make their way into quantum computing, like “entangled” particles that change together even when they are far apart. This provides a way of essentially “teleporting” data: transferring it without fear of interception, because there is no signal to intercept or steal. Chinese scientists seem to have taken the lead on this un-hackable quantum data plan. To put it in even simpler terms, the data being shared is, in a sense, the same data. It’s like existing at two distinct points in the universe simultaneously, but only as one unit. More precisely, you’ve created a pair “of photons that mirror one another.” This indeterminacy leads to the possibility that many of the “laws of science” we take for granted are just arrangements of things in relation to other things. Gravity itself, and many of the behaviors of space and time, might actually be “error-correcting codes” operating at the quantum level.
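To make that “pair of photons that mirror one another” a little more concrete, here is a minimal state-vector sketch in plain Python and NumPy. It is an illustration of the underlying math, not code for any particular quantum machine: it builds the textbook entangled (Bell) state of two qubits and shows that measuring them can only ever give matching answers, 00 or 11, never 01 or 10.

```python
import numpy as np

# Two-qubit state vector, basis order |00>, |01>, |10>, |11>
state = np.zeros(4, dtype=complex)
state[0] = 1.0  # start in |00>

# Hadamard on the first qubit: puts it into an equal superposition of 0 and 1
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I = np.eye(2)
state = np.kron(H, I) @ state

# CNOT (first qubit controls the second): this is what entangles the pair
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])
state = CNOT @ state

# Measurement probabilities: only |00> and |11> survive, each with p = 0.5
probs = np.abs(state) ** 2
labels = ["00", "01", "10", "11"]
print(dict(zip(labels, probs.round(3).tolist())))  # {'00': 0.5, '01': 0.0, '10': 0.0, '11': 0.5}

# Simulated measurements: the two qubits always agree, never '01' or '10'
print(np.random.choice(labels, size=10, p=probs))
```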
Qubits, the nonbinary computer bits we’ve been talking about, can be made from superconductors that maintain a quantum state. This requires extremely cold temperatures, close to absolute zero and colder than the vacuum of space. Underlying these miraculous evolutionary steps is quantum theory’s embrace of “imprecision” in a computing world that has mostly relied on precision and predictability. This makes quantum theory natural kin to artificial intelligence, since AI aspires to teach computers how to “handle” and process imprecision.
In some ways, embracing imprecision in computing technology echoes philosophers’ rejection of binarism in the 19th and 20th centuries. Georg Wilhelm Friedrich Hegel, for example, developed the dialectic in the early 19th century to do justice, as many of his interpreters have put it, to the reality of the half-truth: the idea that things may be in a state of development where they are neither one thing nor another, and in some sense both at once. In a very different way, the Danish theologian Søren Kierkegaard rejected absolutes and embraced absurdity, a kind of simultaneous being-and-non-being. Werner Heisenberg, one of the founders of quantum theory, sounded more like a philosopher than a scientist when he wrote, “[T]he atoms or elementary particles themselves are not real; they form a world of potentialities or possibilities rather than one of things or facts.”
The implications for big data are immeasurable, because quantum computing is to non-quantum computing what the speed of light is to the speed of sound. “Quantum computers,” says Science Alert, “perform calculations based on the probability of an object’s state before it is measured – instead of just 1s or 0s – which means they have the potential to process exponentially more data compared to classical computers.” All of this comes together in Lesa Moné’s post on quantum computing and big data analytics. With quantum computers, Moné writes, the complex calculations such analytics call for will be performed in seconds, and we are talking about calculations that currently take years to solve (sometimes measured in the thousands of years). Quantum calculations will change the very nature of how we view the interaction of time and information processing. It’s something on par with the discovery of radio waves, but given that we’ll be crunching years into seconds, the social impact may be much, much larger.
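The “exponentially more data” claim is easy to make concrete with back-of-the-envelope arithmetic: fully describing the state of n qubits classically takes 2^n complex amplitudes, so the bookkeeping explodes long before the qubit counts get impressive. The figures below are simple arithmetic, not benchmarks of any real machine.

```python
# Back-of-the-envelope: a full classical description of n qubits needs 2**n
# complex amplitudes, which is where the "exponentially more data" claim comes from.
for n in (10, 50, 100, 300):
    amplitudes = 2 ** n
    print(f"{n:>3} qubits -> {amplitudes:.3e} amplitudes")

# For scale, storing just 50 qubits' worth of amplitudes as 16-byte complex
# numbers would already take 16 * 2**50 bytes, roughly 18 petabytes of memory.
```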