Title of article :
CMS distributed data analysis challenges
Author/Authors :
Grandi, C.
Issue Information :
Serial issue, year 2004
Pages :
7
From page :
87
To page :
93
Abstract :
In Spring 2004, CMS will undertake a 100-TeraByte-scale Data Challenge (DC04) as part of a series of challenges in preparation for running at CERN's Large Hadron Collider. Over one month, DC04 must demonstrate the ability of the computing and software systems to cope with a sustained event data-taking rate of 25 Hz, for a total of 50 million events. The emphasis of DC04 is on the validation of the first-pass reconstruction and storage systems at CERN and the streaming of events to a distributed system of Tier-1 and Tier-2 sites worldwide, where typical analysis tasks will be performed. It is expected that the LHC Computing Grid project will provide a set of grid services suitable for use in a real production environment as part of this data challenge. The results of this challenge will be used to define the CMS software and computing systems in their Technical Design Report.
Journal title :
Nuclear Instruments and Methods in Physics Research Section A
Serial Year :
2004
Record number :
2202618