Powerful Grid Set To Handle Collider Data

A high-performance computing network called the Worldwide LHC Computing Grid (WLCG) will be formally launched on Friday as part of an international scientific collaboration investigating particle physics, including calculations that probe conditions shortly after the "Big Bang."

The computer grid will process an expected 15 million gigabytes of data generated annually. Researchers at CERN, the European Organization for Nuclear Research, will measure high-energy proton collisions produced in the Large Hadron Collider (LHC), a ring-shaped underground particle accelerator located near Geneva.

The WLCG, comprising more than 140 computer centers in 33 countries, will use purpose-built technologies for the project.

"[It will] combine processing power and storage capacity to create a massively powerful and geographically distributed supercomputing resource for physics and a growing number of other disciplines," according to Ian Bird, project manager of the WLCG, in a letter introducing the organization.

The vast amount of data generated by the LHC -- equivalent to filling about 3 million DVDs every year or a stack of CDs more than 20 kilometers high -- can only be handled by a grid of connected computers sharing processing power and storage and then further distributing the information to individual users.
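The article's media comparisons are easy to check with back-of-envelope arithmetic. The sketch below assumes standard media figures not stated in the article (a 4.7 GB single-layer DVD, a 700 MB CD, a 1.2 mm CD thickness):

```python
# Back-of-envelope check of the article's data-volume comparisons.
# Media figures below are assumptions, not from the article.
ANNUAL_DATA_GB = 15_000_000   # 15 million gigabytes per year
DVD_GB = 4.7                  # single-layer DVD capacity
CD_GB = 0.7                   # 700 MB CD capacity
CD_THICKNESS_MM = 1.2         # thickness of one CD

dvds_per_year = ANNUAL_DATA_GB / DVD_GB
cds_per_year = ANNUAL_DATA_GB / CD_GB
cd_stack_km = cds_per_year * CD_THICKNESS_MM / 1_000_000  # mm -> km

print(f"DVDs per year:   {dvds_per_year:,.0f}")   # roughly 3.2 million
print(f"CD stack height: {cd_stack_km:.1f} km")   # roughly 25.7 km
```

Both figures land comfortably within the article's "about 3 million DVDs" and "more than 20 kilometers" descriptions.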

The WLCG distributes its resources through three tiers located in 33 countries. Tier 0 is the CERN Computing Center, a central hub that provides less than 20 percent of the total computing capacity. Tier 1 includes sites in Canada, France, Germany, Italy, the Netherlands, the Nordic countries, Spain, Taipei and the United Kingdom, plus two sites in the United States. Tier 2 comprises around 140 sites, grouped into 60 "federations" in 33 countries, that will provide roughly 50 percent of the capacity needed to process the LHC data. The Tier 2 sites will feed their data to physics institutes throughout the world, serving both collaborating scientists and individual users.
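The quoted shares imply how the remaining capacity splits across the middle tier. A minimal sketch, treating the article's "less than 20 percent" for Tier 0 as an upper bound and inferring the Tier 1 share as the remainder (an assumption, since the article does not state it):

```python
# Illustrative capacity split across WLCG tiers, from the shares
# quoted in the article. The Tier 1 figure is inferred as the
# remainder and is an assumption, not an article figure.
TIER0_SHARE = 0.20  # CERN Computing Center (upper bound quoted)
TIER2_SHARE = 0.50  # ~140 sites in 60 federations
TIER1_SHARE = 1.0 - TIER0_SHARE - TIER2_SHARE  # national/regional sites

for name, share in [("Tier 0", TIER0_SHARE),
                    ("Tier 1", TIER1_SHARE),
                    ("Tier 2", TIER2_SHARE)]:
    print(f"{name}: about {share:.0%} of total capacity")
```

Under these assumptions, the eleven Tier 1 sites would carry roughly 30 percent of the total capacity between them.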

The Worldwide LHC Computing Grid was conceived nine years ago, Bird noted, and is now ready to serve physics and a growing number of other disciplines.

About the Author

Jim Barthold is a freelance writer based in Delanco, N.J. covering a variety of technology subjects.
