Worldwide LHC Computing Grid

The Worldwide LHC Computing Grid (WLCG), formerly (until 2006)[1] the LHC Computing Grid (LCG), is an international collaborative project that consists of a grid-based computer network infrastructure incorporating over 170 computing centers in 36 countries, as of 2012. It was designed by CERN to handle the prodigious volume of data produced by Large Hadron Collider (LHC) experiments.[2][3]

By 2012 data from over 300 trillion (3×10¹⁴) LHC proton-proton collisions had been analyzed,[4] LHC collision data was being produced at approximately 25 petabytes per year, and the LHC Computing Grid had become the world's largest computing grid, comprising over 170 computing facilities in a worldwide network across 36 countries.[4][5][6]

Background

The Large Hadron Collider at CERN was designed to test for the existence of the Higgs boson, a long-predicted particle that physicists had sought for over 40 years. A very powerful particle accelerator was needed, because Higgs bosons might not be seen in lower-energy experiments and because vast numbers of collisions would need to be studied. Such a collider would also produce unprecedented quantities of collision data requiring analysis. Advanced computing facilities were therefore needed to process the data.

Description

A design report was published in 2005.[7] The grid was announced to be ready for data on 3 October 2008.[8] A popular 2008 press article predicted that "the internet could soon be made obsolete" by its technology.[9] CERN had to publish its own articles to clear up the resulting confusion.[10] The grid incorporates both private fiber-optic cable links and existing high-speed portions of the public Internet. At the end of 2010, the Grid consisted of some 200,000 processing cores and 150 petabytes of disk space, distributed across 34 countries.[11]

The data stream from the detectors provides approximately 300 GB/s of data, which after filtering for "interesting events" results in a "raw data" stream of about 300 MB/s. The CERN computer center, considered "Tier 0" of the LHC Computing Grid, has a dedicated 10 Gbit/s connection to the counting room.
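The rates quoted above can be sanity-checked with a back-of-envelope calculation. The script below is an illustrative sketch using only the figures from the text; the arithmetic itself is not part of the source.

```python
# Back-of-envelope check of the quoted LHC data rates.
detector_rate_gb_s = 300   # ~300 GB/s from the detectors (from the text)
raw_rate_mb_s = 300        # ~300 MB/s after event filtering (from the text)

# Reduction factor achieved by filtering for "interesting events":
# 300 GB/s -> 300 MB/s is a factor of about 1000.
reduction = (detector_rate_gb_s * 1000) / raw_rate_mb_s
print(f"filter reduction: ~{reduction:.0f}x")

# Raw data accumulated per day at the filtered rate of 300 MB/s.
seconds_per_day = 86_400
raw_tb_per_day = raw_rate_mb_s * seconds_per_day / 1_000_000  # MB -> TB
print(f"raw data per day: ~{raw_tb_per_day:.1f} TB")
```

The result, roughly 26 TB of raw data per day, is consistent with the 27 TB/day figure cited for the project.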

The project was expected to generate 27 TB of raw data per day, plus 10 TB of "event summary data", which represents the output of calculations done by the CPU farm at the CERN data center. This data is sent out from CERN to eleven Tier 1 academic institutions in Europe, Asia, and North America via dedicated 10 Gbit/s links. This is called the LHC Optical Private Network.[12] More than 150 Tier 2 institutions are connected to the Tier 1 institutions by general-purpose national research and education networks.[13] The data produced by the LHC on all of its distributed computing grid is expected to add up to 10–15 PB of data each year.[14] In total, the four main detectors at the LHC produced 13 petabytes of data in 2010.[11]
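The daily figures above can be scaled to check the quoted annual volume. This is an illustrative sketch that, for simplicity, assumes year-round data-taking; the daily figures come from the text.

```python
# Scaling the quoted daily volumes to a yearly total (illustrative sketch).
raw_tb_per_day = 27   # raw data per day (from the text)
esd_tb_per_day = 10   # "event summary data" per day (from the text)

days_per_year = 365   # simplifying assumption: continuous running
total_pb_per_year = (raw_tb_per_day + esd_tb_per_day) * days_per_year / 1000  # TB -> PB
print(f"~{total_pb_per_year:.1f} PB/year")
```

The result, about 13.5 PB/year, falls within the 10–15 PB annual range cited for the grid.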

The Tier 1 institutions receive specific subsets of the raw data, for which they serve as a backup repository for CERN. They also perform reprocessing when recalibration is necessary.[13] The primary configuration for the computers used in the grid is based on Scientific Linux.

Distributed computing resources for analysis by end-user physicists are provided by the Open Science Grid, Enabling Grids for E-sciencE,[13] and LHC@home projects.

See also

  • Openlab (CERN)


References
