The world’s biggest machine, the Large Hadron Collider, was built to help answer some of the most important questions in physics. To do that, the scientists behind the particle collider have to be able to process and understand the massive amounts of data from the machine. They want to be able to tell whether certain particles are produced in high-energy collisions of particles traveling at nearly the speed of light.
The LHC can produce over a petabyte of data per second from one billion particle collisions, requiring about one million processor cores spread out around the world to analyze and understand what would otherwise be chaos. What does all that data mean?
This is one of the most staggering problems facing Jennifer Glick, an IBM researcher whose job is to find big problems that can benefit from quantum computing and then either solve them with existing quantum algorithms or create new ones for the purpose.
Quantum computing promises enormous advances in processing power over classical computing for certain problems that are intractably large or time-consuming for classical computers—the kind of problems Glick looks for. A quantum computer’s strength can be credited to the superposition and entanglement of quantum bits, or qubits, which offer an exponentially large computational space. For example, 50 perfect qubits can represent over a quadrillion states to explore.
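The exponential scaling behind that figure is easy to verify: each qubit doubles the number of basis states a quantum register can occupy in superposition, so n qubits span 2^n states. A quick sketch in Python confirms the quadrillion-state claim for 50 qubits:

```python
# Each additional qubit doubles the number of basis states available,
# so an n-qubit register spans 2**n computational basis states.
def n_states(n_qubits: int) -> int:
    """Size of the computational space for a register of n qubits."""
    return 2 ** n_qubits

# 50 qubits: 1,125,899,906,842,624 states -- over a quadrillion (10**15)
print(f"{n_states(50):,}")
```

Representing that space is not the same as searching it for free, of course; the art of quantum algorithm design is in exploiting interference across those states.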
Still, it’s a technology in its very early days. In two years at IBM, Glick has helped lead an effort to create partnerships that bring quantum technology into the real world. She spends a lot of her time hunting for problems and then developing and demonstrating ways in which a quantum computer could solve them faster than a classical one.
“What we’re looking at for the Large Hadron Collider is to use a quantum algorithm to predict whether or not a certain particle was produced,” she says. “Was that the particle I think was produced or not?”
In 2019, Glick and her colleagues tackled another big but more workaday problem with the banking giant Barclays. The challenge was managing the quadrillions of dollars processed each year in securities transaction settlements. These occur, for instance, when a financial institution buys shares, bonds, or derivatives. Clearinghouses must run complex optimization algorithms on the transactions to settle as many of them as possible within technical and legal constraints.
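To see why this is hard for classical computers, consider a heavily simplified toy version of the settlement problem (not Barclays’ or IBM’s actual formulation): each transaction moves cash from a buyer to a seller, a subset of transactions is feasible only if no account’s net balance goes negative, and the goal is to settle as many transactions as possible. The number of candidate subsets grows exponentially with the number of transactions:

```python
from itertools import combinations

def feasible(txns, balances):
    """Check that settling this subset leaves every account non-negative.

    Toy model: each transaction is (buyer, seller, amount) in cash only;
    real settlement also nets securities legs, which are omitted here.
    """
    net = dict(balances)
    for buyer, seller, amount in txns:
        net[buyer] -= amount
        net[seller] += amount
    return all(v >= 0 for v in net.values())

def best_settlement(txns, balances):
    """Brute force: the largest feasible subset of transactions.

    Exponential in len(txns) -- the kind of bottleneck a quantum
    optimization approach might target on realistic problem sizes.
    """
    for size in range(len(txns), -1, -1):
        for subset in combinations(txns, size):
            if feasible(subset, balances):
                return list(subset)
    return []

# Hypothetical example data: account C cannot cover its outgoing payment,
# so not every transaction can settle together.
balances = {"A": 100, "B": 50, "C": 0}
txns = [("A", "B", 80), ("B", "C", 200), ("C", "A", 60)]
settled = best_settlement(txns, balances)
```

Real clearinghouses handle vastly more transactions and constraints than this brute-force sketch can, which is why the problem is cast as a large-scale optimization in the first place.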
The results of the team’s research indicate that quantum technology could make this process more efficient, speeding up the time between trade and settlement. “When someone gives you an industry or business problem, there’s a lot of complications to start out with. It’s a very complex, gnarly problem,” Glick says. “Part of it is breaking it down into simpler pieces to be able to identify where the bottlenecks are with respect to classical computing methods that are being used today. And can any of those bottlenecks be removed by a quantum approach?”