A Step Towards a Computing Grid for the LHC Experiments: ATLAS Data Challenge 1


  • R. Sturrock
  • Paula Eerola
  • Balazs Konya
  • Oxana Smirnova

Summary, in English

The ATLAS Collaboration at CERN is preparing for data taking and analysis at the LHC, which will start in 2007. To that end, a series of Data Challenges was started in 2002, whose goals are to validate the Computing Model, the complete software suite, and the data model, and to ensure the correctness of the technical choices to be made for the final offline computing environment. A major feature of the first Data Challenge (DC1) was the preparation and deployment of the software required for the production of large event samples as a worldwide distributed activity.

It should be noted that it was not an option to “run the complete production at CERN” even if we had wanted to; the resources were not available at CERN to carry out the production on a reasonable time-scale. The great challenge of organising and then carrying out this large-scale production at a significant number of sites around the world had therefore to be faced. However, the benefits of this are manifold: apart from realising the required computing resources, this exercise created worldwide momentum for ATLAS computing as a whole.

This report describes in detail the main steps carried out in DC1 and what has been learned from them as a step towards a computing Grid for the LHC experiments.


  • Particle and nuclear physics

Publishing year


Document type

Journal article

  • Subatomic Physics


  • ATLAS
  • Computing Grid
  • Data Challenge
  • LHC