Quantum computing: 4 things CIOs should know

Should quantum computing be on an IT leader's radar, or is it simply the stuff of research labs right now? If you're charting IT strategy, here's what you need to know

You’ve probably heard of quantum computing. But it may be less obvious whether it’s something you should care about in a practical sense or whether it’s best left to researchers in labs for now. If you’re charting IT strategy for your organization, the answer is a little bit of both.

What is quantum computing?

Let’s start with a very brief introduction to what quantum computing is.

Classical computing is built around the idea of binary bits and Boolean logic. A bit can be physically represented as a switch with a value of 0 (off) or 1 (on). A large number of these switches are connected together using Boolean logic gates (and, or, xor, and so forth) to perform all the complex operations of a modern microprocessor. Classical computing lends itself to a wide range of applications for which there are efficient algorithms even for very large problem sizes.
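To make that concrete, here's a small sketch in Python (used for illustration only) of how a couple of Boolean gates combine into something useful: a half-adder, the building block of binary arithmetic.

```python
# A half-adder built from two Boolean gates: XOR produces the sum bit,
# AND produces the carry bit. Chaining many such gates is how classical
# processors build arithmetic out of simple 0/1 switches.

def half_adder(a: int, b: int) -> tuple[int, int]:
    """Add two bits; return (sum_bit, carry_bit)."""
    return a ^ b, a & b

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> sum={s}, carry={c}")
```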

Quantum computing helps solve some problems that classical algorithms can't tackle in a reasonable time.

However, there are other types of problems. One is simulating complex physical systems such as molecules. Another is performing mathematical operations such as factoring large numbers into their prime factors, which is important for cryptography, and for which efficient classical algorithms don’t exist. The running time to solve such problems can increase very rapidly, by an exponential or factorial factor, as the problem size grows, such as the size of the number being factored. As a result, there are some problems we know the steps to solve; we just can’t do so in a reasonable time.
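To see how quickly that work grows, here's a hedged Python sketch: trial-division factoring has to try on the order of 2^(bits/2) candidate divisors in the worst case, so every extra bit of input multiplies the work.

```python
# Trial division needs roughly sqrt(N), about 2**(bits/2), candidate
# divisors in the worst case, so runtime grows exponentially with the
# number of bits in the input.

def trial_division_steps(n: int) -> tuple[int, int]:
    """Return (smallest factor of n, number of divisions tried)."""
    steps = 0
    d = 2
    while d * d <= n:
        steps += 1
        if n % d == 0:
            return d, steps
        d += 1
    return n, steps  # n itself is prime

# Worst case for trial division is a prime; these are primes just
# below 2**16, 2**24, and 2**32 respectively.
for bits, p in ((16, 65521), (24, 16777213), (32, 4294967291)):
    _, steps = trial_division_steps(p)
    print(f"{bits}-bit prime: {steps} divisions")
```

Each extra 8 bits multiplies the division count by roughly 16, which is why classical factoring becomes hopeless long before numbers reach cryptographic sizes.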

[ Read also: Quantum computing and security: 5 looming questions. ]

Meet the qubit

By contrast, quantum computing is built around the concept of qubits – quantum bits. Three quantum mechanical properties — superposition, entanglement, and interference — are used in quantum computing to manipulate the state of a qubit. Quantum gates act on one or more physical qubits and can be combined into larger circuits in a manner analogous to classical computing. Without going into all the hairy details, suffice it to say that quantum algorithms can be much faster than classical ones for certain classes of problems, such as those mentioned above.
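For the curious, two of those properties, superposition and interference, can be illustrated with a toy state-vector simulation in plain NumPy (a sketch of the math, not a real quantum device):

```python
import numpy as np

# Toy state-vector model: a qubit is a 2-component complex vector,
# and a quantum gate is a 2x2 unitary matrix.
ket0 = np.array([1, 0], dtype=complex)                        # the |0> state
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard gate

superposed = H @ ket0            # equal superposition of |0> and |1>
probs = np.abs(superposed) ** 2
print(probs)                     # ~[0.5, 0.5]: 50/50 measurement odds

# Interference: applying H again makes the |1> amplitudes cancel,
# returning the qubit deterministically to |0>.
back = H @ superposed
print(np.abs(back) ** 2)         # ~[1.0, 0.0]
```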

With that background, what should you know about where quantum computing is and where it’s going?

1. Quantum will not replace classical computing

In a manner analogous to other types of specialty computing parts, such as GPUs and FPGAs, quantum computers should be thought of as devices that augment conventional general-purpose computers rather than replacing them. With this model, a core application executes on a traditional system — which also handles data storage and other infrastructure-related tasks — while the quantum computer handles only the subset of the overall task that’s best suited to its particular strengths.

Conceptually, this is similar to how GPUs are very fast at performing the large matrix calculations that are at the heart of many types of machine learning, while conventional CPUs are still needed to perform all the associated computations and data movement that are needed to set up the core calculations and report the results.
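That division of labor can be sketched in a few lines of Python. This is purely illustrative: `submit_to_quantum_service` is a made-up placeholder stubbed with a local function, not any real provider's API, though real cloud quantum services expose similar job-based interfaces.

```python
# Hypothetical sketch of the hybrid model: a classical program drives
# the workflow and hands only one subproblem to a quantum backend.

def submit_to_quantum_service(circuit_params):
    """Stand-in for a network call to a cloud quantum backend."""
    # Pretend the backend returned a measured expectation value.
    return sum(circuit_params) / len(circuit_params)

def hybrid_workflow(data):
    # Classical side: preprocessing, data handling, orchestration.
    params = [x * 0.1 for x in data]
    # Quantum side: only the kernel suited to the hardware's strengths.
    result = submit_to_quantum_service(params)
    # Classical side again: post-processing and reporting.
    return round(result, 3)

print(hybrid_workflow([1, 2, 3]))  # 0.2
```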

There is one important distinction between quantum computing and most of the coprocessors that we’re familiar with today. Whereas specialty processors are often located in the same rack or even on the same board as conventional CPUs, quantum computers will be accessed over a network as a cloud service for the foreseeable future.

2. Quantum computing will be in a cloud

The cloud model is needed partly because current quantum computers require very specialized hardware and supporting infrastructure. For example, one approach to quantum computing uses superconductivity to create and maintain a quantum state; doing so requires keeping qubits at a temperature near absolute zero using a dilution refrigerator.

Quantum computing also isn’t yet at the point where truly mass-produced systems even exist; different approaches are still being tested out. In addition to the aforementioned superconducting method, quantum annealing and trapped ions are two other approaches being investigated by various companies.

For these and other reasons, end users should expect to access quantum computing as a service over a network, much as early classical computers were often accessed through timesharing services, rather than installing their own quantum computers on-premises.

3. It's early, but quantum is real

Given that it’s such early days, does it even make sense for most organizations to investigate quantum computing?

While some research is fairly secretive, there's ready access to quantum computing resources from some vendors.

For most, it probably shouldn’t be a priority at this point. However, organizations that view technology as an important differentiator and enabler, especially those in businesses like drug discovery, should at least keep abreast of developments. While some of the research into quantum computing is fairly secretive, there’s ready access to quantum resources from some vendors.

For example, Qiskit is an open source framework for quantum computing that provides tools for creating and manipulating quantum programs and running them on prototype quantum devices on IBM Q Experience or on simulators on a local computer.
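Qiskit itself requires an install, but the kind of two-qubit "hello world" circuit typically built with it (a Hadamard gate followed by a CNOT, producing an entangled Bell state) can be sketched dependency-free with NumPy:

```python
import numpy as np

# Sketch of a Bell-state circuit: H on the first qubit, then CNOT.
# The result is the entangled state (|00> + |11>) / sqrt(2).
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I2 = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.zeros(4)
state[0] = 1                            # start in |00>
state = CNOT @ np.kron(H, I2) @ state   # apply H to qubit 0, then CNOT
print(np.round(state ** 2, 3))          # ~[0.5, 0, 0, 0.5]
```

Measuring this state yields 00 or 11 with equal probability and never 01 or 10, which is the entanglement a framework like Qiskit lets you create on real hardware or simulators.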

Quantum computers aren’t yet at the point where they can solve meaningful problems faster than classical computers. But that will likely start to change later this decade. Forward-looking organizations can benefit from monitoring developments and even taking the opportunity to start experimenting to better understand what quantum computing can bring to their business.

4. Keep your eye on quantum-resistant cryptography

The potential of quantum computing to break existing cryptography schemes has been the subject of many breathless headlines. This is because there’s a quantum algorithm (Shor’s algorithm, published in 1994) for factoring integers that is almost exponentially faster than known classical methods. And it’s the difficulty, as in the time required, of factoring large integers that are the product of two prime numbers that’s at the heart of much modern public-key cryptography. It’s easy to multiply two primes together, even very large ones. It’s much harder to reverse the process.
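That asymmetry is easy to demonstrate, here with deliberately small primes so the slow direction still finishes:

```python
import math

# The asymmetry behind RSA-style cryptography: multiplying two primes
# is a single instant operation, while recovering them from the
# product takes roughly sqrt(n) divisions.
p, q = 999983, 1000003    # two primes (tiny by cryptographic standards)
n = p * q                 # easy direction: one multiplication

def factor(n: int) -> tuple[int, int]:
    """Recover the two prime factors by trial division (hard direction)."""
    for d in range(2, math.isqrt(n) + 1):
        if n % d == 0:
            return d, n // d
    raise ValueError("n is prime")

print(factor(n))  # (999983, 1000003), after nearly a million divisions
```

Real RSA moduli are hundreds of digits long; scale this up and the classical "reverse the process" step takes longer than the age of the universe, which is exactly the step Shor's algorithm attacks.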

Before you start panicking, it’s worth pointing out that the general consensus is that quantum computers won’t be big enough (that is, won’t have enough qubits) for this to become a problem for maybe another 15 years, depending upon your assumptions. And “post-quantum” or “quantum-proof” encryption protocols are already being standardized by the National Institute of Standards and Technology (NIST). So nothing to worry about, right?

Not quite, as Bob Sutor, VP of IBM Quantum Ecosystem Development at IBM Research writes in “Dancing with Qubits:” “For financial institutions, it can take ten years or more to implement new security technology.” He adds, “Of greater immediate importance is your data. Will it be a problem if someone can crack your database security in 15, 30, or 50 years? For most organizations, the answer is a loud YES. Start looking at hardware and software encryption support for your data using the new post-quantum security standards now.”

Be prepared

That nicely sums up the situation for most organizations with respect to quantum computing. Quantum doesn’t demand an all-hands-on-deck initiative at this point. But it’s not science fiction, and it’s not something that’s just of academic research interest either. It’s early-but-real technology that is expected to have real impact in a tomorrow that may be coming faster than you realize.

[ How will emerging technologies affect the CIO role going forward? Read CIO role: 3 areas to focus on in 2020. ]

Gordon Haff is Technology Evangelist at Red Hat where he works on product strategy, writes about trends and technologies, and is a frequent speaker at customer and industry events on topics including DevOps, IoT, cloud computing, containers, and next-generation application architectures.