Friday, May 18, 2018

The Present and Future of Quantum Computing

Google, Intel, IBM, Alibaba, and many other established companies and startups alike are racing to see who can overcome the limitations that nature itself imposes on current semiconductor fabrication technology. Their hope is to enter the quantum realm, where the laws of physics don't work the way we're used to, and to develop the next generation of supercomputers: quantum computers.

The End of Moore's Law

Ever since Gordon Moore, the co-founder of Fairchild Semiconductor and Intel, observed in 1965 that the number of components per integrated circuit doubles roughly every two years, the advancements in digital electronics have been following a predictable pattern.

In 2010, the International Technology Roadmap for Semiconductors predicted a slowdown of the trend, and Moore himself foresaw the end of an era when he stated that Moore's law would be dead in the next decade or so.

"Now we're getting to the point where it's more and more difficult, and some of the laws are quite fundamental", said Moore in an interview in 2015. "We're very close to the atomic limitation now. We take advantage of all the speed we can get, but the velocity of light limits performance. These are fundamentals I don't see how we [will] ever get around. And in the next couple of generations, we're right up against them."

Currently, most computer chips are being manufactured using the 14-nanometer lithography process. For comparison, the human immunodeficiency virus (HIV) is roughly spherical with a diameter of about 120 nm. The next step is the 10-nanometer lithography process. At this size, the wires inside integrated circuits become so small that the barrier layer takes up most of the interconnect, leaving less space for the copper itself. 

While approaching the size of an atom is extremely difficult, getting past it is another story entirely, which is where quantum computing comes in.

The Beginning of the Quantum Era

"The exciting thing about quantum computers is that they work fundamentally differently from today's computers", explains Terry Hickey, IBM Global Business Services Business leader for Cognitive & Analytics.

In traditional computers, bits are the smallest units of information. Each bit can have one of two possible values: a one or a zero. As such, a bit roughly corresponds to a switch with one on position and one off position.

"In contrast, a quantum computer makes use of quantum bits, or qubits, where each can represent a one, a zero, or both at once, which is known as superposition. This property, along with other quantum effects, enables quantum computers to perform certain calculations vastly faster than classical computers", says Hickey.
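
To make superposition concrete, here is a minimal sketch in Python with NumPy (not tied to any particular quantum SDK) that represents a single qubit as a two-component state vector and reads off the measurement probabilities:

```python
import numpy as np

# Computational basis states |0> and |1> as state vectors.
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# An equal superposition of 0 and 1 (what a Hadamard gate makes from |0>):
# |psi> = (|0> + |1>) / sqrt(2)
psi = (ket0 + ket1) / np.sqrt(2)

# Measurement probabilities are the squared magnitudes of the amplitudes.
p0 = abs(psi[0]) ** 2  # probability of reading out 0
p1 = abs(psi[1]) ** 2  # probability of reading out 1
print(f"P(0) = {p0:.2f}, P(1) = {p1:.2f}")  # 0.50 each
```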

A qubit can be any two-level quantum system, such as the polarization of a photon, the spin of an electron, or the nuclear spin of an atom. "We can also prepare qubits in a quantum superposition of 0 and 1 and create nontrivial correlated states of a number of qubits, so-called 'entangled states'", says Alexey Fedorov, a physicist at the Moscow Institute of Physics and Technology.

Entangled qubits share a close connection: measuring one of them instantly determines the state of the other, no matter how far apart they are. This quantum phenomenon makes it possible to deduce the properties of all entangled qubits by measuring just one of them, allowing quantum computers to store massive amounts of information using less energy than traditional computers.
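
The same toy representation extends to entanglement. In the sketch below (plain NumPy again, purely illustrative), a Hadamard gate on the first qubit followed by a CNOT produces the Bell state (|00> + |11>)/sqrt(2), in which the two qubits always give matching measurement results:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard gate
I = np.eye(2, dtype=complex)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)  # flips qubit 2 iff qubit 1 is 1

state = np.kron([1, 0], [1, 0]).astype(complex)  # both qubits start in |0>
state = CNOT @ (np.kron(H, I) @ state)           # Bell state (|00> + |11>)/sqrt(2)

# Outcome probabilities over |00>, |01>, |10>, |11>:
for label, amp in zip(["00", "01", "10", "11"], state):
    print(f"P({label}) = {abs(amp) ** 2:.2f}")
# Only 00 and 11 ever appear: measuring one qubit fixes the other.
```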

Quantum Supremacy

Quantum computers are exciting and promising, but they're still outperformed by machines that rely on the same manufacturing process that has fueled the information age - so far. We're rapidly approaching a point called quantum supremacy: the moment when quantum machines can perform calculations that no existing classical computer can match.

"Hopes of reaching quantum supremacy have been dashed before. For some time, researchers thought that a 49-qubit machine would be enough, but last year researchers at IBM were able to simulate a 49-qubit quantum system on a conventional computer", write Martin Giles and Will Knight.

Intel unveiled its 49-qubit quantum computer early in 2018, at the International Consumer Electronics Show (CES), falling just one qubit behind IBM's 50-qubit prototype quantum computer. In March 2018, however, Google took the quantum crown when the company introduced its new quantum processor, codenamed Bristlecone, with 72 qubits.

"We are cautiously optimistic that quantum supremacy can be achieved with Bristlecone, and feel that learning to build and operate devices at this level of performance is an exciting challenge! We look forward to sharing the results and allowing collaborators to run experiments in the future", posted Julian Kelly, Research Scientist at Google's Quantum AI Lab.

"Operating a device such as Bristlecone at low system error requires harmony between a full stack of technology ranging from software and control electronics to the processor itself. Getting this right requires careful systems engineering over several iterations."

However, even Google's quantum workhorse has nowhere near as many qubits as the best D-Wave systems. The Vancouver-based company sells a quantum computer chip, called the D-Wave 2000Q, with 2,048 physical qubits.

"With 2000 qubits and new control features, the new system can solve larger problems than was previously possible, with faster performance, providing a big step toward production applications in optimization, cybersecurity, machine learning, and sampling", stated D-Wave in an official announcement.
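
D-Wave's machines are quantum annealers rather than general-purpose gate-model computers: they search for low-energy assignments of binary variables in a quadratic objective, a so-called QUBO problem. The toy Python sketch below brute-forces a three-variable QUBO to show the kind of problem being posed; the Q matrix here is made up for illustration, and a real annealer would sample low-energy solutions for thousands of variables:

```python
from itertools import product

# Toy QUBO: minimize sum of Q[i, j] * x[i] * x[j] over binary x.
# Diagonal entries are per-variable biases; off-diagonals couple pairs.
Q = {(0, 0): -1, (1, 1): -1, (2, 2): -1,  # biases favor turning each bit on
     (0, 1): 2, (1, 2): 2}                # couplings penalize adjacent bits both on

def energy(x):
    return sum(coeff * x[i] * x[j] for (i, j), coeff in Q.items())

# Brute force is fine for 3 variables (8 candidates); annealers target thousands.
best = min(product([0, 1], repeat=3), key=energy)
print(best, energy(best))  # (1, 0, 1) with energy -2
```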

The first customer for the D-Wave 2000Q quantum computer chip was Temporal Defense Systems Inc. (TDS), a cutting-edge cybersecurity firm. Many researchers, however, believe that D-Wave's systems are not true quantum computers because the underlying ideas for the D-Wave approach arose from experimental results in condensed matter physics, not from the conventional quantum information field.

Use Cases for Quantum Computing

A research report from Morgan Stanley predicts that quantum computing will double the high-end computing market from 5 billion USD to 10 billion USD. "Quantum computing is at an inflection point, moving from fundamental theoretical research to an engineering development phase, including commercial experiments", the report states.

According to Morgan Stanley, the traditional transistor won't go to Silicon Heaven any time soon because quantum computing is not suited to all compute tasks. The vast majority of devices, including smartphones, personal computers, and web servers, will continue to run on current technology, but a small subset of high-end compute platforms may start transitioning to quantum technology around 2025.

"The classical computer is a large calculator that is very good at calculus and step-by-step analytics, whereas the quantum computer looks at solving problems from a higher point of view. Quantum computing does not make the classic computer irrelevant; smartphones and laptops will still use transistors for the foreseeable future, and the transition might take several years."

Even though quantum computers are not yet ready for primetime, we can already identify several use cases for quantum computing, including those described below. 

Cryptography

Today's cryptography is extremely secure against brute-force attacks executed using traditional computers. It would take even the most powerful supercomputer in the world so absurdly long to break a symmetric 256-bit key that our sun would burn out long before anyone succeeded.
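
The arithmetic behind that claim is easy to check. Assuming a hypothetical machine that tests a very generous 10^18 keys per second, a back-of-the-envelope Python calculation compares the expected search time against the roughly five billion years our sun has left:

```python
keyspace = 2 ** 256        # number of possible 256-bit keys
rate = 10 ** 18            # assumed keys tested per second (very generous)
seconds_per_year = 60 * 60 * 24 * 365

# On average, brute force covers half the keyspace before finding the key.
years = keyspace / 2 / rate / seconds_per_year
sun_years_left = 5e9       # rough remaining lifetime of the sun, in years

print(f"Expected search time: {years:.1e} years")  # ~1.8e51 years
print(f"That is ~{years / sun_years_left:.0e} times the sun's remaining lifetime")
```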

But quantum computers operate on different principles than traditional computers, and they are especially good at solving the particular mathematical problems that underlie modern cryptography.
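
The canonical example is Shor's algorithm, which factors the large integers behind RSA by reducing factoring to period finding. Only the period-finding step needs a quantum computer; the rest is classical. The sketch below shows that classical wrapper in Python, with a brute-force loop standing in for the quantum subroutine:

```python
from math import gcd

def find_period(a, n):
    """Smallest r with a**r % n == 1. This is the step a quantum
    computer performs efficiently; we brute-force it for illustration."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical_part(n, a):
    """Classical pre- and post-processing of Shor's algorithm for base a."""
    if gcd(a, n) != 1:
        return gcd(a, n)   # lucky guess: a already shares a factor with n
    r = find_period(a, n)
    if r % 2 == 1:
        return None        # odd period: retry with a different base
    f = gcd(pow(a, r // 2) - 1, n)
    return f if 1 < f < n else None

print(shor_classical_part(15, 7))  # 3, since 15 = 3 * 5
```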

"For public key cryptography, the damage from quantum computers will be catastrophic", said Lily Chen, mathematician and leader of the National Institute of Standards and Technology's Cryptographic Technology Group, in a session at the American Association for the Advancement of Science's 2018 annual meeting in Austin, Texas.

One possible approach to securing cryptography against quantum computers involves the use of quantum-based cryptographic systems. As such, the very technology that could render today's cryptography obsolete could also give us the cryptography of tomorrow.
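
The best-known quantum-based scheme is quantum key distribution, such as the BB84 protocol, in which two parties encode random bits in randomly chosen photon polarization bases and keep only the positions where their basis choices happened to match. A simplified simulation (ideal channel, no eavesdropper; purely illustrative) looks like this:

```python
import random

def bb84_sifted_key(n_photons=32, seed=1):
    rng = random.Random(seed)
    # Alice picks a random bit and a random basis (0 = rectilinear, 1 = diagonal).
    alice_bits = [rng.randint(0, 1) for _ in range(n_photons)]
    alice_bases = [rng.randint(0, 1) for _ in range(n_photons)]
    # Bob measures each incoming photon in a random basis of his own.
    bob_bases = [rng.randint(0, 1) for _ in range(n_photons)]
    # When the bases match, Bob recovers Alice's bit; otherwise his result is random.
    bob_bits = [bit if ab == bb else rng.randint(0, 1)
                for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
    # They publicly compare bases (never bits) and keep the matching positions.
    return [bit for bit, ab, bb in zip(bob_bits, alice_bases, bob_bases)
            if ab == bb]

print(bb84_sifted_key())  # roughly half the photons survive sifting
```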

Simulations

In September 2017, Science, the peer-reviewed academic journal of the American Association for the Advancement of Science, reported that a quantum computer had simulated the behavior of beryllium hydride. While beryllium hydride is a relatively simple molecule, its simulation is seen by physicists and chemists as a step toward a new way to discover drugs and materials.

"Researchers are also excited about the prospect of using quantum computers to model complicated chemical reactions, a task that conventional supercomputers aren't very good at all", write Abigail Beall and Matt Reynolds in their Wired article. "Eventually, researchers hope they'll be able to use quantum simulations to design entirely new molecules for use in medicine."

Simulations run on quantum computers could also make existing treatments, such as radiation therapy, more effective. The current software and hardware used by medical dosimetrists to calculate the best possible radiation dose cannot guarantee optimal results because there are far too many variables to take into account. Quantum computers of the future could, however, do just that and make this and many other existing treatments far more effective.

Artificial Intelligence

"The promise is that quantum computers will allow for quick analysis and integration of our enormous data sets which will improve and transform our machine learning and artificial intelligence capabilities", writes internationally best-selling author and keynote speaker Bernard Marr.

According to Seth Lloyd, a physicist at the Massachusetts Institute of Technology and a quantum-computing pioneer, 60 qubits should be enough to encode an amount of data equivalent to that produced by humanity in a year.
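
The intuition behind that figure is that writing down an n-qubit state classically takes 2^n complex amplitudes, so the description grows exponentially with every added qubit:

```python
# A 60-qubit state vector has 2**60 complex amplitudes. Stored as
# complex128 values (16 bytes each), one copy is exabyte-scale.
amplitudes = 2 ** 60
bytes_needed = amplitudes * 16

print(f"{amplitudes:.3e} amplitudes")  # ~1.153e+18
print(f"~{bytes_needed / 1e18:.0f} exabytes for the full state vector")
```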

The ability to store and process massive amounts of data makes quantum computers a natural fit for machine-learning techniques, which get computers to act without being explicitly programmed.

Conclusion

There are still many challenges for quantum computing, but researchers across multiple fields have achieved great successes and overcome obstacles that would have seemed insurmountable not long ago. We don't yet know exactly what advancements quantum computing will enable. It seems, however, that it could be a major driving element of a fourth industrial revolution and usher in an era of rapid productivity growth.

Sources