Machine learning and computation frameworks

Quantum computers exploit the quantum nature of matter at the subatomic scale to achieve, for certain classes of problems, dramatic computational speedups over classical, transistor-based computing machines. Machine learning, meanwhile, is a rapidly growing field of computer science in which computers learn from data using statistical techniques; such algorithms have had a huge impact on pattern matching and classification.

Applications at Fermilab include classifying different types of neutrino interactions in detectors, determining a particle's path from a string of hits in a tracking chamber, and separating stars from galaxies in astrophysics sky surveys. Research efforts to build quantum computers are also poised to pay off in time to help solve the massive data processing and analysis problems expected to arise in the middle of the next decade, when the Deep Underground Neutrino Experiment and the High-Luminosity LHC begin operating.

Fermilab scientists are studying algorithms that use quantum associative memory and graph networks to solve particle tracking problems for the High-Luminosity Large Hadron Collider. They are also studying ways to use quantum machine learning algorithms for anomaly detection in the trigger systems of high-energy physics (HEP) experiments. The data volume at a particle physics experiment is often overwhelming, and even with the best technology, scientists can record only a small fraction of it. The algorithms that select which events to save for later analysis are therefore crucial.
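To make the graph-network idea concrete, below is a minimal, purely classical sketch of the approach: detector hits become graph nodes, candidate connections between hits on adjacent layers become edges, and a small neural network scores each edge as track-like or not. The synthetic data, network shape, and thresholds here are invented for illustration and are not Fermilab's actual algorithm; the quantum variants under study would replace parts of a pipeline like this with quantum subroutines.

```python
# Illustrative sketch only: a tiny classical "edge classifier" in the spirit
# of graph-network tracking. Hits on adjacent detector layers are paired into
# candidate edges; an MLP scores whether each edge belongs to a real track.
# The data, architecture, and labels are invented for this example.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Synthetic hits: two straight tracks crossing 4 layers, plus noise hits.
# Each hit is (layer_radius, azimuthal_angle_phi).
layers = torch.tensor([1.0, 2.0, 3.0, 4.0])
track_slopes = torch.tensor([0.10, -0.05])   # phi grows linearly with radius
hits = []
for r in layers:
    for m in track_slopes:
        hits.append((r.item(), (m * r).item()))          # true track hits
    hits.append((r.item(), float(torch.rand(()) * 2 - 1)))  # one noise hit
hits = torch.tensor(hits)

# Candidate edges: every hit pair on adjacent layers.
edges, labels = [], []
for i in range(len(hits)):
    for j in range(len(hits)):
        if abs(hits[j, 0] - hits[i, 0] - 1.0) < 1e-6:    # adjacent layers only
            edges.append(torch.cat([hits[i], hits[j]]))
            # Label the edge "true" if both hits lie on the same track.
            same = any(
                abs(hits[i, 1] - m * hits[i, 0]) < 1e-4 and
                abs(hits[j, 1] - m * hits[j, 0]) < 1e-4
                for m in track_slopes
            )
            labels.append(float(same))
X = torch.stack(edges)
y = torch.tensor(labels).unsqueeze(1)

# Tiny MLP edge scorer: 4 input features -> probability the edge is track-like.
model = nn.Sequential(nn.Linear(4, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())
opt = torch.optim.Adam(model.parameters(), lr=0.05)
loss_fn = nn.BCELoss()

for step in range(200):
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()

print(f"final loss: {loss.item():.3f}")
print("edge scores (1 = track-like):", model(X).detach().squeeze().round())
```

Chaining the highest-scoring edges from layer to layer then yields full track candidates; it is the combinatorial explosion of this edge list at High-Luminosity LHC data rates that motivates exploring quantum alternatives.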

Of course, increased data volumes will also require better data access methods, so researchers are exploring the use of quantum computers for very fast data indexing and recall.
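One concrete reason for that interest is Grover's search, a quantum algorithm that locates a marked record among N unsorted entries in roughly √N steps rather than the ~N/2 probes expected classically. The following is a small NumPy state-vector simulation of the algorithm, not code for a real quantum device; the database size and marked index are arbitrary choices made for this illustration, not parameters from any Fermilab system.

```python
# State-vector simulation of Grover's search: find one marked index among
# N = 2**n items in ~(pi/4)*sqrt(N) iterations, versus ~N/2 classical probes.
# The marked index and qubit count are arbitrary illustrative choices.
import numpy as np

n = 4                      # number of qubits
N = 2 ** n                 # database size (16 entries)
marked = 11                # index of the record we want to "recall"

# Start in the uniform superposition over all N indices.
state = np.full(N, 1 / np.sqrt(N))

# Oracle: flip the sign of the marked index's amplitude.
oracle = np.ones(N)
oracle[marked] = -1

iterations = int(np.floor(np.pi / 4 * np.sqrt(N)))
for _ in range(iterations):
    state = oracle * state               # phase-flip the marked item
    state = 2 * state.mean() - state     # inversion about the mean (diffusion)

probs = state ** 2
print(f"after {iterations} Grover iterations:")
print(f"  P(marked index {marked}) = {probs[marked]:.3f}")
print(f"  best guess = {np.argmax(probs)}")
```

For this 16-entry example, three Grover iterations concentrate about 96 percent of the probability on the marked index, whereas an unindexed classical lookup would expect to examine eight entries.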

Caltech is the lead institution on this initiative. The Massachusetts Institute of Technology and the University of Southern California are participating institutions.