News

Powerful Grid Set To Handle Collider Data

A high-performance computing network, the Worldwide LHC Computing Grid (WLCG), will be formally launched on Friday as part of an international scientific collaboration in particle physics, including experiments that recreate conditions moments after the Big Bang.

The computer grid will process an expected 15 million gigabytes of data generated annually. Researchers at CERN, the European Organization for Nuclear Research, will measure high-energy proton collisions produced in the Large Hadron Collider (LHC), a ring-shaped underground particle accelerator located near Geneva.

The WLCG, comprising more than 140 computer centers in 33 countries, will use purpose-built technologies for the project.

"[It will] combine processing power and storage capacity to create a massively powerful and geographically distributed supercomputing resource for physics and a growing number of other disciplines," wrote Ian Bird, the WLCG's project manager, in a letter introducing the organization.

The vast amount of data generated by the LHC -- equivalent to filling about 3 million DVDs every year or a stack of CDs more than 20 kilometers high -- can only be handled by a grid of connected computers sharing processing power and storage and then further distributing the information to individual users.
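The article's media comparisons can be sanity-checked with a few lines of arithmetic. This sketch assumes nominal media capacities of 4.7 GB per single-layer DVD and 700 MB per CD, and a CD thickness of roughly 1.2 mm; none of those figures appear in the article itself.

```python
# Sanity check of the data-volume comparisons above, assuming nominal
# media capacities (4.7 GB per DVD, 700 MB per CD) and a CD thickness
# of about 1.2 mm. These constants are assumptions, not quoted figures.

ANNUAL_DATA_GB = 15_000_000  # 15 million gigabytes per year

DVD_CAPACITY_GB = 4.7
dvds_per_year = ANNUAL_DATA_GB / DVD_CAPACITY_GB  # roughly 3.2 million DVDs

CD_CAPACITY_GB = 0.7
CD_THICKNESS_M = 0.0012  # ~1.2 mm per disc
cds_per_year = ANNUAL_DATA_GB / CD_CAPACITY_GB
stack_height_km = cds_per_year * CD_THICKNESS_M / 1000  # well over 20 km

print(f"{dvds_per_year / 1e6:.1f} million DVDs per year")
print(f"CD stack about {stack_height_km:.0f} km high")
```

Under these assumptions, both of the article's comparisons hold: about 3.2 million DVDs a year, and a CD stack in the neighborhood of 25 km.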

The WLCG organizes its resources into three tiers. Tier 0 is the CERN Computing Center, a central hub that provides less than 20 percent of the grid's total computing capacity. Tier 1 comprises sites in Canada, France, Germany, Italy, the Netherlands, the Nordic countries, Spain, Taipei and the United Kingdom, plus two sites in the United States. Tier 2 consists of around 140 sites, grouped into some 60 "federations" across 33 countries, which together will provide about 50 percent of the capacity needed to process the LHC data. The Tier 2 sites will in turn feed data to physics institutes throughout the world, serving both research groups and individual scientists.
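The tier figures above imply a rough capacity split. This short sketch takes the Tier 0 and Tier 2 shares from the article (treating "less than 20 percent" as an upper bound) and derives the Tier 1 share as the remainder; that Tier 1 figure is an inference, not a number the article states.

```python
# Rough model of the WLCG capacity split described above. The Tier 0
# and Tier 2 shares come from the article; the Tier 1 share is simply
# the remainder and is an inference, not a quoted figure.

TIER0_SHARE = 0.20  # "less than 20 percent" at CERN (upper bound)
TIER2_SHARE = 0.50  # "around 50 percent" across ~140 Tier 2 sites

tier1_share = 1.0 - TIER0_SHARE - TIER2_SHARE  # ~30 percent implied
print(f"Implied Tier 1 share: about {tier1_share:.0%}")
```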

Conceived nine years ago, the Worldwide LHC Computing Grid now makes that "massively powerful and geographically distributed supercomputing resource" a reality, Bird concluded.

About the Author

Jim Barthold is a freelance writer based in Delanco, N.J., covering a variety of technology subjects.
