The challenges of quantum computing

Introduction

The quantum computer holds the promise of instantly finding the solution that best fits a problem, where a classical computer must evaluate all the possibilities one by one before finding the right one. It is on this idea that IT players, then research labs and now states have launched the race to design this incredible machine. Among the competitors, let us mention IBM, currently the most advanced, and France, with the most proactive national policy.

Just think: if it works as predicted, the quantum computer will deliver in real time the best military strategy for a given context, the formula for the best vaccine against a virus, the optimal distribution of a production capacity, the key to getting around any problem at all.

Embarking on such a race is not just about being the first to take advantage of quantum computing. It is also a means of protecting oneself against the attacks an opponent could mount once equipped with such a weapon. Because yes, one of the first uses of the quantum computer, and one of the simplest to implement, will be to break the encryption of private messages. With such a starting point, you do not even have to work out how to catch up – industrially, economically, militarily – you just have to read the secrets of the best-positioned actor and get ahead of their strategy.

The problem: the quantum computer is far from being built. In principle, it involves replacing the transistors of a conventional computer – which either transmit information or not, depending on their state – with particles that have quantum properties. These are properties of matter at the subatomic scale: a particle carries several pieces of information at once (superposition), each with a probability that fluctuates according to external circumstances; the states of several particles can be correlated at a distance (entanglement), which connects the steps of an algorithm; and the superposition stops (decoherence) when the information is read or when the environment disturbs it.
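
To make the vocabulary concrete, here is a minimal sketch in plain Python and NumPy – deliberately vendor-neutral, since real toolkits wrap exactly this kind of linear algebra – of two qubits put into superposition, entangled, then read:

```python
import numpy as np

# Two-qubit state vector, starting in |00>. The entries are probability
# amplitudes for the basis states |00>, |01>, |10>, |11>.
state = np.array([1, 0, 0, 0], dtype=complex)

# A Hadamard gate on qubit 0 creates superposition: (|0> + |1>) / sqrt(2).
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I = np.eye(2)
state = np.kron(H, I) @ state

# A CNOT gate entangles the qubits: qubit 1 flips whenever qubit 0 is 1.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])
state = CNOT @ state

# Reading collapses the superposition: only 00 or 11 can come out,
# each with probability 0.5 (a so-called Bell state).
print(np.abs(state) ** 2)  # [0.5, 0., 0., 0.5]
```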

Building a quantum computer

Entangling particles, putting them in a superposed state and freezing their most salient information at the desired moment is the work of IBM, Honeywell and, to a lesser extent, Google, Microsoft, Intel, as well as a myriad of European, American and Russian laboratories. IBM cryogenizes its qubits in a tank of liquid helium. Honeywell bombards ytterbium ions with lasers. The others are experimenting with superconducting circuits, trapped ions, nuclear magnetic resonance, or optical lattices.

Nothing works yet. The particles that must be held in a superposed state – called qubits, for quantum bits – break their entanglement or fall into decoherence before the end of the experiment. Not systematically. But the more of them you put together, the higher the error rate. That is a problem: the number of qubits and the number of steps a process can sustain go hand in hand with the complexity of the algorithms that can be executed.
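
To see why scale is so punishing, a back-of-the-envelope calculation helps. Assuming, purely for illustration, a uniform and independent error rate per qubit per step (real error models are more complicated):

```python
# If each qubit survives each step with probability (1 - p), the chance
# that a whole run finishes cleanly shrinks exponentially with both the
# qubit count and the circuit depth.
def survival_probability(n_qubits: int, n_steps: int, p_error: float) -> float:
    return (1 - p_error) ** (n_qubits * n_steps)

# With a 0.1% error rate per qubit per step:
print(survival_probability(5, 10, 0.001))     # ~0.95: usable
print(survival_probability(100, 1000, 0.001)) # ~4e-44: hopeless
```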

To date, the embryonic quantum computers that one player or another has managed to run to completion have served only to execute simple algorithms. So simple that a basic PC would have handled them just as quickly. Reducing the error rate, multiplying the qubits in the system, making the experiment last for as many cycles as an industrial, military or scientific algorithm requires: that is the problem. Until it is solved, conventional computers will remain the safe bet. Everyone gropes forward, estimating dates for the next leap in chips.

IBM and Honeywell have agreed on one point. When they unveil a new quantum computer prototype, they express its power as the number of qubits they managed to pack into it. But after several months of experimentation, potentially more than a year, they state the strength of that prototype only in terms of "quantum volume". This metric corresponds, roughly, to the size of the largest circuit – as many steps deep as it is qubits wide – that the machine can execute reliably before errors take over.
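
As a sketch of how the metric works – following IBM's published definition, where a machine "passes" a circuit size when its outputs clear a statistical threshold often enough – the reported figure is two to the power of the largest square circuit that passes. The test harness below is a hypothetical stand-in:

```python
def quantum_volume(passes_square_circuit) -> int:
    # passes_square_circuit(n) is assumed to return True when the machine
    # reliably runs random circuits n qubits wide and n steps deep.
    n = 1
    while passes_square_circuit(n + 1):
        n += 1
    return 2 ** n  # e.g. passing 6-by-6 circuits gives a quantum volume of 64
```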

Programming the quantum computer

The second challenge is to program those famous circumstances that make the probability of a piece of information swing one way or the other. At the current stage of progress in quantum computing, developers are at a loss.

It is a bit as if they had to throw away all the languages and all the development environments they have been using for decades and relearn how to program in a rudimentary way, in assembly, as in the 1980s. It is no longer even a matter of pushing information into a box, incrementing it, comparing it. You have to do it with mathematical probabilities.
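
A sketch of what that means in practice: where classical assembly loads a value into a register, the quantum equivalent rotates probability amplitudes. In plain NumPy again, for illustration:

```python
import numpy as np

# A single-qubit rotation gate: it steers amplitude between |0> and |1>.
def ry(theta):
    return np.array([[np.cos(theta / 2), -np.sin(theta / 2)],
                     [np.sin(theta / 2),  np.cos(theta / 2)]])

qubit = np.array([1.0, 0.0])  # starts as a definite 0

# Classical code would write "x = 1". Quantum code instead tunes how
# likely the qubit is to read 1: here, a 30% chance.
qubit = ry(2 * np.arcsin(np.sqrt(0.3))) @ qubit

print(np.abs(qubit) ** 2)  # [0.7, 0.3] - probabilities, not values
```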

In this context, no company today knows how to use a quantum computer.

Alongside the design of the machine itself, an R&D effort dedicated to the creation of quantum development tools has therefore been set up. This activity is in itself a new technological race. The first phase of this competition has more or less been reached: provide a simulated quantum computer that, even if it lacks the power, at least serves to execute the code and to validate that we have understood how to write it – and, from there, to build up libraries of functions step by step.

The American players IBM, AWS, Google and Microsoft have all chosen to offer such a quantum simulator, usable from their respective clouds. IBM's is said to have the advantage of occasionally executing the code on one of its true quantum computer prototypes.
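
For illustration, this is roughly what such code looks like in Qiskit, IBM's open-source toolkit (the exact API varies between versions; this sketch assumes the qiskit-aer simulator package is installed):

```python
from qiskit import QuantumCircuit
from qiskit_aer import AerSimulator

# The same entangling circuit as before, written for the kind of
# simulator the cloud providers expose.
circuit = QuantumCircuit(2)
circuit.h(0)      # superposition on qubit 0
circuit.cx(0, 1)  # entangle qubit 1 with qubit 0
circuit.measure_all()

# Locally this runs on a classical simulator; on IBM's cloud, the same
# circuit can occasionally be routed to a real quantum prototype.
result = AerSimulator().run(circuit, shots=1000).result()
print(result.get_counts())  # roughly {'00': 500, '11': 500}
```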

In Europe, it is above all the French company Atos – a major supplier of supercomputers and of development consulting to CAC 40 companies – that is doing well. Its QLM (Quantum Learning Machine) simulator is a turnkey server cluster that can be installed behind closed doors in the data centers of large private companies or research institutes and, above all, is already programmable with high-level languages.

The strategy around Atos is simple. First, sell QLM to supercomputer customers as if it were an acceleration module. Then take advantage of the fact that Atos is French to make it the locomotive of the quantum sovereignty campaign launched by President Emmanuel Macron.

Networks of quantum and classical computers

The latest challenge: hybridization between quantum and classical computers. At the beginning of 2022, the various suppliers and customers who had embarked on the adventure realized that it would be more realistic to prepare the data on the supercomputers they already master, and to reserve quantum computation for only the few most critical operations.
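
A toy sketch of that division of labor, with the quantum step reduced to a simulated one-qubit evaluation (real hybrid schemes such as VQE or QAOA follow this same loop shape):

```python
import numpy as np

# Stand-in for the critical quantum operation: the expectation value
# of a one-qubit circuit RY(theta)|0>, measured in the Z basis.
def quantum_evaluate(theta: float) -> float:
    return np.cos(theta)

# The classical side proposes a parameter, the quantum side evaluates
# it, and the classical side adjusts - the hybrid loop in miniature.
theta, step = 0.1, 0.1
for _ in range(100):
    gradient = (quantum_evaluate(theta + step)
                - quantum_evaluate(theta - step)) / (2 * step)
    theta -= 0.5 * gradient

print(theta, quantum_evaluate(theta))  # converges toward pi, value -1
```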

An intermediate solution that brings its own set of problems, because quantum players now realize that the operating systems and physical networks needed to bridge a conventional processor and a quantum processor are missing. Existing communication protocols produce too much background noise. Current network cables bring in too much heat. Using either one only further increases the number of errors in the experiments.

Other research laboratories are working to solve this problem. CERN is developing sensors, time-synchronization protocols and a global theory of quantum information. Chalmers University in Sweden is working on a thermometer that will act as a metronome, triggering communication only at moments when the right conditions are met. Finally, Intel engineers are developing a network chip able to operate from inside the cryogenic tank of the quantum processor.

Implicitly, observers wonder how so many different options, testing so many opposing directions, will ultimately converge on a standard for quantum computers, development tools and hybrid networks. Getting everyone to agree may well be the next big project.
