Experiments at the Large Hadron Collider (LHC), the world’s largest particle accelerator at the European particle physics lab CERN, produce about a million gigabytes of data every second. Even after reduction and compression, the data amassed in just one hour is similar to the data volume Facebook collects in an entire year – too much to store and analyze.

Luckily, particle physicists don’t have to deal with all of that data all by themselves. They partner with a form of artificial intelligence called machine learning that learns how to do complex analyses on its own.

A group of researchers, including scientists at the Department of Energy’s SLAC National Accelerator Laboratory and Fermi National Accelerator Laboratory, summarize current applications and future prospects of machine learning in particle physics in a paper published today in Nature.

“Compared to a traditional computer algorithm that we design to do a specific analysis, we design a machine learning algorithm to figure out for itself how to do various analyses, potentially saving us countless hours of design and analysis work,” says co-author Alexander Radovic from the College of William & Mary, who works on the NOvA neutrino experiment.

To handle the gigantic data volumes produced in modern experiments like the ones at the LHC, researchers apply what they call “triggers” – dedicated hardware and software that decide in real time which data to keep for analysis and which data to toss out.

In LHCb, an experiment that could shed light on why there is so much more matter than antimatter in the universe, machine learning algorithms make at least 70 percent of these decisions, says LHCb scientist Mike Williams from the Massachusetts Institute of Technology, one of the authors of the Nature summary. “Machine learning plays a role in almost all data aspects of the experiment, from triggers to the analysis of the remaining data,” he says.

Machine learning has also proven extremely successful in the area of analysis. The gigantic ATLAS and CMS detectors at the LHC, which enabled the discovery of the Higgs boson, each have millions of sensing elements whose signals need to be put together to obtain meaningful results. “These signals make up a complex data space,” says Michael Kagan from SLAC, who works on ATLAS and was also an author on the Nature review.
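The trigger idea mentioned in the article – deciding in real time which events to keep and which to discard – can be illustrated with a minimal sketch. Everything here is hypothetical: the `trigger` function, the `toy_classifier`, and the event format are invented for illustration and bear no resemblance to actual LHC trigger software, which runs on dedicated hardware and far more sophisticated models.

```python
# Hypothetical sketch of a software trigger: score each incoming event
# with a (pre-trained) classifier and keep only events whose score
# clears a threshold. Discarded events are lost for good, which is why
# the selection must run in real time.

def trigger(events, classify, threshold=0.9):
    """Keep only events whose classifier score exceeds the threshold."""
    return [event for event in events if classify(event) > threshold]

def toy_classifier(event):
    """Stand-in for a trained model: a toy score based on total energy."""
    return min(sum(event["energies"]) / 100.0, 1.0)

events = [
    {"id": 1, "energies": [5.0, 3.2]},    # low-energy event, discarded
    {"id": 2, "energies": [60.0, 55.0]},  # high-energy event, kept
]
kept = trigger(events, toy_classifier)
print([event["id"] for event in kept])  # [2]
```

In a real experiment the classifier would be a model trained offline on simulated collisions, and the threshold would be tuned so the kept data rate fits the available storage bandwidth.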