Micron: Using HPC to Solve World’s Largest Challenges

To assist with high-speed data collection, CERN CMS will use Micron CZ120 memory expansion modules, based on the CXL standard, to improve the ingestion and data-processing chain for L1 Scouting.

By Jason Adlard, senior manager, business development, compute and networking business unit, Micron Technology, Inc.

CERN, the European Organization for Nuclear Research, is the world’s most renowned particle physics research facility. It operates the Large Hadron Collider (LHC), which is not only the largest particle accelerator in the world but also the largest machine ever built.

The LHC recreates conditions similar to those just after the Big Bang by firing trillions of particles at each other at nearly the speed of light and generating up to 1 billion collisions each second. Sensors record what happens when these particles collide. These collisions can create an abundance of new particles – the building blocks of all matter – and generate massive volumes of raw data that can be challenging to capture, store and analyze.

CERN and its research teams are committed to using advanced technologies, including AI systems, to address these challenges. This is where Micron comes in as the first and, to date, only memory company collaborating with CERN as a member of its exclusive industry partner platform, openlab. Through this collaboration, Micron engineers are leveraging our high-performance memory innovations to assist CERN scientists in their most challenging experiments.

One such experiment is the Compact Muon Solenoid (CMS), a general-purpose detector at the LHC. Its research program ranges from studying the Standard Model of particle physics (including the discovery of the Higgs boson) to searching for extra dimensions and dark matter. The CMS detector is built around a huge solenoid magnet that generates a field of 4 tesla, about 100,000x the magnetic field of the Earth. As of May 2022, CMS was one of the largest international scientific collaborations in history, involving about 5,500 particle physicists, engineers, technicians, students and support staff from 241 institutes in 54 countries.

The CMS is planning an extensive upgrade of the detector for operation at the High-Luminosity LHC starting in 2029. It will use more advanced event-selection algorithms, including ML inference, and introduce a novel data-collection system known as Level-1 (L1) Scouting as part of the L1 trigger. This system collects and stores the reconstructed particle information, providing vast amounts of data for detector diagnostics, luminosity measurements and the study of otherwise inaccessible physics signatures.

The High-Luminosity LHC (HL-LHC) will be an extensive upgrade of the LHC. It is expected to increase the number of collisions by a factor of …, where luminosity refers to … and is an indicator of the collider’s performance.
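
For background, a standard physics definition (general background, not taken from this article): the instantaneous luminosity L is what converts a process’s cross-section σ into an event rate, and its integral over a running period sets the expected number of events, so higher luminosity means more collisions per second and more data to capture:

\[
  \frac{dN}{dt} = L\,\sigma,
  \qquad
  N_{\text{events}} = \sigma \int L\,\mathrm{d}t .
\]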

Two Micron CZ120 memory expansion modules

To assist with high-speed data collection, CERN CMS will use Micron CZ120 memory expansion modules, based on the CXL standard, to improve the ingestion and data-processing chain for L1 Scouting. The performance advantages of CZ120 memory expansion modules – in terms of capacity, bandwidth and flexibility – will provide CMS with coherent, seamless access to buffered data from multiple processors and compute accelerators, as well as low-latency, short-term storage for both raw and processed data at scale.
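
On Linux hosts, CXL Type 3 memory expanders such as the CZ120 are typically exposed as additional, CPU-less NUMA nodes, so NUMA-aware allocation is one way applications can place staging buffers on that extra capacity. The sketch below is illustrative only and is not CERN’s L1 Scouting code; the node number (CXL_NODE) and the buffer size are assumed values.

/* Minimal sketch: staging a buffer on a CXL-backed NUMA node with libnuma.
 * Assumptions (not from the article): the CXL expander appears as NUMA node 2,
 * and BUFFER_BYTES is an arbitrary illustrative size.
 * Build with: gcc cxl_buffer.c -lnuma -o cxl_buffer
 */
#include <numa.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define CXL_NODE     2               /* assumed NUMA node ID of the CXL expander */
#define BUFFER_BYTES (1UL << 30)     /* 1 GiB staging buffer (illustrative) */

int main(void)
{
    if (numa_available() < 0) {
        fprintf(stderr, "NUMA is not available on this system\n");
        return EXIT_FAILURE;
    }
    if (CXL_NODE > numa_max_node()) {
        fprintf(stderr, "NUMA node %d not present (max node is %d)\n",
                CXL_NODE, numa_max_node());
        return EXIT_FAILURE;
    }

    /* Allocate pages bound to the CXL-backed node, so event fragments can be
     * buffered there instead of consuming local DRAM. */
    unsigned char *buf = numa_alloc_onnode(BUFFER_BYTES, CXL_NODE);
    if (buf == NULL) {
        fprintf(stderr, "allocation on node %d failed\n", CXL_NODE);
        return EXIT_FAILURE;
    }

    memset(buf, 0, BUFFER_BYTES);    /* touch pages to fault them in */
    printf("staged %lu bytes on NUMA node %d\n", BUFFER_BYTES, CXL_NODE);

    numa_free(buf, BUFFER_BYTES);
    return EXIT_SUCCESS;
}

In practice, higher-level frameworks or interleaving policies may manage CXL placement instead of explicit per-buffer binding; this sketch only shows the basic mechanism.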

This project continues a previous collaboration between CERN and Micron. The first project, successfully operated between 2019 and 2022, focused on accelerated ML inference for triggering and data acquisition. With our new project, Micron will support CERN with memory solutions that allow faster processing of data and shorten the time to insight.
