Quantum, Classical Computing Combine to Tackle Tough Optimization Problems

Staff Report

Tuesday, April 26, 2022

A research team led by the Georgia Tech Research Institute (GTRI) was recently selected for second-phase funding of a $9.2 million project aimed at demonstrating a hybrid computing system that will combine the advantages of classical computing with those of quantum computing to tackle some of the world’s most difficult optimization problems.
 
Over the next two years, the team plans to use several hundred quantum bits (qubits) made of trapped ions to put the unique capabilities of quantum computing systems to work on these challenges. The team, which also includes researchers from Georgia Tech’s School of Industrial and Systems Engineering, the National Institute of Standards and Technology (NIST), and Oak Ridge National Laboratory, has already demonstrated key elements of the system using a 10-qubit ion chain.
 
“The implications of a quantum solution to this optimization challenge could be dramatic,” said Creston Herold, a GTRI senior research scientist who is principal investigator for the program, which is known as Optimization with Trapped Ion Qubits (OPTIQ). “Previously intractable problems could be solvable, and computation time could be reduced from days to hours or minutes. That could allow optimization to be applied to many more tasks, improving operational efficiency and saving time, money, and energy.”
 
The research is supported by the Defense Advanced Research Projects Agency (DARPA) as part of its Optimization with Noisy Intermediate-Scale Quantum Devices (ONISQ) program. Specifically, the GTRI-led team will use the Quantum Approximate Optimization Algorithm (QAOA) to tackle the difficult combinatorial problem known as Max-Cut, along with related optimization problems.
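
Max-Cut asks how to split a graph's vertices into two groups so that as many edges as possible run between the groups, and QAOA approaches it by alternating a cost-dependent phase layer with a mixing layer, then tuning the two layers' angles. As a rough illustration only, and not a description of the GTRI system, the sketch below classically simulates a depth-1 QAOA circuit for Max-Cut on a small example graph with NumPy; the graph, angle ranges, and grid search are all illustrative assumptions.

```python
# Minimal sketch of depth-1 QAOA for Max-Cut, simulated classically with NumPy.
# The graph, angles, and grid search are illustrative assumptions only.
import numpy as np

# Example graph (assumed for illustration): 5 vertices, 6 edges.
n = 5
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0), (0, 2)]

# Cut value of every computational basis state (length-n bitstring):
# an edge contributes 1 when its endpoints land on opposite sides.
states = np.arange(2 ** n)
bits = (states[:, None] >> np.arange(n)) & 1          # shape (2^n, n)
cut = np.zeros(2 ** n)
for i, j in edges:
    cut += bits[:, i] != bits[:, j]

def qaoa_expectation(gamma, beta):
    """<C> in the depth-1 QAOA state exp(-i*beta*B) exp(-i*gamma*C) |+...+>."""
    psi = np.full(2 ** n, 1 / np.sqrt(2 ** n), dtype=complex)  # uniform superposition
    psi *= np.exp(-1j * gamma * cut)                            # cost (phase) layer
    # Mixer layer: exp(-i*beta*X) applied to each qubit independently.
    c, s = np.cos(beta), -1j * np.sin(beta)
    for q in range(n):
        psi = psi.reshape(2 ** (n - q - 1), 2, 2 ** q)
        a0, a1 = psi[:, 0, :].copy(), psi[:, 1, :].copy()
        psi[:, 0, :] = c * a0 + s * a1
        psi[:, 1, :] = c * a1 + s * a0
        psi = psi.reshape(2 ** n)
    return float(np.abs(psi) ** 2 @ cut)

# Coarse grid search over the two variational angles.
best = max((qaoa_expectation(g, b), g, b)
           for g in np.linspace(0, 2 * np.pi, 60)
           for b in np.linspace(0, np.pi, 40))
print(f"best <C> = {best[0]:.3f} at gamma = {best[1]:.3f}, beta = {best[2]:.3f}")
print(f"true max cut (brute force) = {int(cut.max())}")
```

On actual hardware, the phase and mixer layers would be realized with native trapped-ion gates, and the angles would be tuned by a classical optimizer in a feedback loop; the brute-force grid search above simply stands in for that classical outer loop on a problem small enough to check exhaustively.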