If a quantum computer is sufficiently imperfect, classical computers can efficiently simulate its behavior. Quantum simulators running on traditional computers can reproduce the execution of quantum algorithms on a quantum system.
Physicists have developed a method for simulating the quantum approximate optimization algorithm on a standard computer. Instead of using advanced quantum processors, the new approach employs a classical machine-learning algorithm that closely resembles the behavior of near-term quantum computers.
In a paper published in npj Quantum Information, Giuseppe Carleo, a professor at EPFL, and Matija Medvidović, a graduate student at Columbia University and the Flatiron Institute in New York, have found a way to run a complex quantum computing algorithm on traditional computers rather than quantum ones.
The “quantum software” in question is known as the Quantum Approximate Optimization Algorithm (QAOA), and it is used to solve classical optimization problems in mathematics; it is essentially a method of selecting the best solution to a problem from a set of possible solutions. “There is a lot of interest in understanding what problems a quantum computer can solve efficiently, and QAOA is one of the more prominent candidates,” Carleo says.
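To make the kind of problem concrete, here is a minimal sketch of MaxCut, a classical combinatorial optimization problem frequently used as a QAOA benchmark (the specific graph below is an illustrative assumption, not an example from the paper). Each vertex is assigned one of two labels, and the goal is to maximize the number of edges whose endpoints receive different labels:

```python
from itertools import product

# Illustrative toy graph (assumed example): a 4-cycle plus one diagonal.
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
n = 4  # number of vertices

def cut_value(assignment):
    # Count edges whose endpoints land on opposite sides of the cut.
    return sum(assignment[u] != assignment[v] for u, v in edges)

# Classical brute force: enumerate all 2^n assignments. This is feasible
# only for tiny n; QAOA aims to find good cuts without exhaustive search.
best = max(product([0, 1], repeat=n), key=cut_value)
print(best, cut_value(best))
```

For this toy graph, alternating labels around the cycle cut all four cycle edges, giving a maximum cut value of 4. Brute force scales exponentially with the number of vertices, which is why heuristics like QAOA are of interest.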
Ultimately, QAOA is intended to aid the quest for the famed “quantum speedup,” the predicted gain in processing speed that quantum computers offer over conventional ones. Understandably, QAOA has a number of supporters, including Google, which has set its sights on quantum technologies and computing in the near future: in 2019, it built Sycamore, a 53-qubit quantum processor, and used it to complete in 200 seconds a task that a state-of-the-art classical supercomputer was estimated to need around 10,000 years to finish.
“But the ‘quantum speedup’ barrier is anything but rigid; it is being continuously reshaped by new research, thanks also to progress in the development of more efficient classical algorithms,” Carleo says.
In their study, Carleo and Medvidović address a key open question in the field: can algorithms running on current and near-term quantum computers offer a significant advantage over classical algorithms for tasks of practical interest? “In order to answer that question, we must first understand the limitations of classical computing in simulating quantum systems,” Carleo says. This is especially important because the current generation of quantum processors operates in an error-prone regime when running quantum “software,” limiting them to algorithms of limited complexity.
Classical computers also retain distinct advantages that quantum computers will find hard to match. The ability to store data reliably, for example, remains unique to classical machines, because a quantum computer’s memory persists for only a few hundred microseconds at most.
Using conventional computers, the two researchers devised a method that can approximately simulate the behavior of a subclass of algorithms known as variational quantum algorithms, which are methods for determining the lowest energy state, or “ground state,” of a quantum system. QAOA belongs to this family, which researchers consider among the most promising candidates for “quantum advantage” on near-term quantum computers.
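The variational principle behind this family of algorithms can be sketched in a few lines: a parameterized trial state is tuned so that its energy expectation value is as low as possible, and that minimum approximates the ground-state energy. The toy 2x2 “Hamiltonian” and single-parameter trial state below are illustrative assumptions, not the system studied in the paper:

```python
import math

# Assumed toy Hamiltonian (a symmetric 2x2 matrix) for illustration only.
H = [[1.0, 0.5],
     [0.5, -1.0]]

def energy(theta):
    # Single-parameter trial state |psi(theta)> = (cos theta, sin theta);
    # returns the energy expectation value <psi|H|psi>.
    psi = [math.cos(theta), math.sin(theta)]
    return sum(psi[i] * H[i][j] * psi[j] for i in range(2) for j in range(2))

# Crude classical optimization: scan the one variational parameter.
thetas = [k * math.pi / 1000 for k in range(1000)]
e_min = min(energy(t) for t in thetas)
print(round(e_min, 3))  # close to the exact ground energy -sqrt(1.25)
```

Real variational quantum algorithms replace the grid scan with gradient-based optimization and the two-component vector with a quantum state over many qubits, but the structure of the loop is the same: evaluate the energy of a parameterized state, then adjust the parameters to lower it.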
The method is based on the idea that modern machine-learning tools, such as those used to master complex games like Go, can also be used to learn and emulate the inner workings of a quantum computer. The key tool for these simulations is Neural Network Quantum States, an artificial neural network that Carleo developed with Matthias Troyer in 2016 and that has now been used to simulate QAOA for the first time. The findings set a new benchmark for the future development of quantum hardware.
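As a rough illustration of what a neural network quantum state looks like, here is a minimal sketch in the restricted-Boltzmann-machine form introduced by Carleo and Troyer: a small network with visible biases, hidden biases, and weights assigns an (unnormalized) amplitude to every spin configuration. The network sizes and random parameter values below are placeholders, not trained values from the paper:

```python
import math
import random

random.seed(0)
n_visible, n_hidden = 3, 2  # assumed toy sizes

# Randomly initialized (untrained) network parameters, for illustration.
a = [random.uniform(-0.1, 0.1) for _ in range(n_visible)]   # visible biases
b = [random.uniform(-0.1, 0.1) for _ in range(n_hidden)]    # hidden biases
W = [[random.uniform(-0.1, 0.1) for _ in range(n_visible)]
     for _ in range(n_hidden)]                               # weights

def amplitude(s):
    """Unnormalized RBM wavefunction psi(s) for spins s in {-1, +1}^n."""
    visible = math.exp(sum(a[i] * s[i] for i in range(n_visible)))
    hidden = 1.0
    for j in range(n_hidden):
        theta = b[j] + sum(W[j][i] * s[i] for i in range(n_visible))
        hidden *= 2 * math.cosh(theta)
    return visible * hidden

print(amplitude((1, -1, 1)))
```

In an actual simulation, the parameters are optimized (by a variational procedure like the one above) so that the network's amplitudes track the state the quantum circuit would produce; the network's compactness is what makes the classical simulation tractable.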
“Our work demonstrates that the QAOA you can run on current and near-term quantum computers can also be simulated with high accuracy on a classical computer,” Carleo says. “This does not imply that all useful quantum algorithms that can be run on near-term quantum processors can be classically emulated. Indeed, we hope that our approach will serve as a model for developing new quantum algorithms that are both useful and difficult to simulate on classical computers.”