Quantum computing, poised to revolutionize computing as we know it by exponentially increasing the power of machines, has made notable advances in recent years. The latest comes from IBM: this Tuesday, at its own event, the US company will present Eagle, its 127-qubit processor. Its power is double that of Zuchongzhi, developed by engineers at the University of Science and Technology of China and Tsinghua University in Beijing, which until now was the most advanced. According to its creators, who published their results in the journal *Science*, Zuchongzhi solved in about three minutes a random-number-generation problem that would have taken the most powerful classical supercomputers on the planet 600 million years.

IBM’s new processor shatters that record. “Eagle is a milestone because it breaks the 100-qubit barrier. It has reached the point where its computing power can no longer be simulated with classical processors,” says Zaira Nazario, technical lead for Quantum Computing Theory and Applications at the company, by video call. According to IBM, the number of classical bits needed to match the computing power of the 127-qubit processor exceeds the total number of atoms in the more than 7.5 billion people alive today.

The advance is important, but we are still far from quantum computers taking computing to an unknown level. For that, their power will need to be on the order of one million qubits. “The arrival of the Eagle processor is an important step toward the day when quantum computers can surpass classical computers at meaningful levels,” says the Spanish Darío Gil, vice president of IBM and Director of Research, in a statement. The company plans to have a new 433-qubit processor ready next year and, by 2023, another with 1,121 qubits.

IBM and Google are leading the race to produce the first quantum computer for commercial use, a competition in which other companies such as Microsoft and Intel also take part. That is at the business level; geopolitically, the game is played between the US and China, with Europe as an observer. Continuing the sporting metaphor, the US may score a goal, but the odds favor it losing the match. The R&D investment figures are stubborn, and China’s outlay is unrivaled: between 2017 and 2020 it put some $10 billion into the quantum computing programs of its research centers. The US plans to spend $1.2 billion through 2023, while the EU will invest €1 billion through 2026.

### Theoretical physics turned into technology

As the name suggests, quantum computing harnesses the quantum nature of matter at subatomic scales to offer the possibility of vastly greater computing power. Conventional computers work with a binary system: the digits 0 and 1 (hence the term digital). Those 0s and 1s, the bits, are realized in the physical world as tiny electrical currents in transistors. A modern, state-of-the-art chip contains billions of transistors, capable of performing complex operations in seconds. But no matter how far miniaturization advances, there will come a point when no more transistors can fit on a single chip.

Quantum computing breaks through these physical barriers with a proposal that defies intuition: instead of transistors that generate 0 or 1 states, it uses so-called quantum bits, or qubits, which can be 0 or 1 and also in a superposition of both states. This superposition, together with other properties such as quantum entanglement, is what enables exponentially greater computational capacity: the number of states the machine can work with grows as 2 raised to n. With 2 qubits you can represent 4 states at once; with 10, 1,024, and so on.
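That exponential growth is also why classical machines struggle to simulate quantum processors past roughly 100 qubits: simulating an n-qubit register classically means tracking 2^n complex amplitudes. A minimal toy sketch in plain Python (an illustration only, not IBM's software) of a register placed in an equal superposition:

```python
def uniform_superposition(n):
    """Classically simulate n qubits in an equal 0/1 superposition.

    The state is a list of 2**n amplitudes, one per basis state --
    the memory cost that explodes exponentially with qubit count.
    """
    dim = 2 ** n                 # 2 qubits -> 4 amplitudes; 10 -> 1,024
    amp = (1.0 / dim) ** 0.5     # equal weight on every basis state
    return [amp] * dim

state = uniform_superposition(10)
print(len(state))                # 1,024 amplitudes tracked at once
```

At 127 qubits, the same list would need 2^127 entries, far beyond any classical memory, which is the sense in which Eagle can no longer be simulated conventionally.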

Developing the infrastructure needed to host and exploit qubits is extremely complex. Implementations rely on microwaves, ion traps, or superconducting rings. Engineers have had to solve problems such as processor cooling (qubits must operate at temperatures close to absolute zero, −273 °C) and total isolation from the environment, since any interaction (such as noise) can destabilize them.

It is difficult to know how far these new computers will go as they continue to be refined. For now, they are expected to significantly advance research into new materials, drug development, and the exploration of the universe, and to solve problems related to machine learning, the most promising artificial intelligence technique of the moment.

The cryptography used today will be exposed once quantum computing reaches a certain stage of maturity. “If you create a revolutionary technology, you also have a responsibility to mitigate the risks it brings,” says Nazario. “In this case, other cryptographic mechanisms have been developed that quantum computing cannot break. Institutions that want to keep their data safe for decades should adopt these methods now.”
