DocumentCode :
2622929
Title :
Distributed computing and data analysis in the CMS Experiment
Author :
Kreuzer, P. ; Spiga, D.
Author_Institution :
RWTH Aachen IIIA (Germany)
fYear :
2008
fDate :
19-25 Oct. 2008
Firstpage :
1979
Lastpage :
1983
Abstract :
The CMS Experiment at the Large Hadron Collider (LHC) at CERN, Geneva, is expected to start taking data during summer 2008. The CMS Computing, Software and Analysis projects will need to meet the expected performance in terms of data archiving, calibration and reconstruction at the host laboratory, as well as data transfer to the many computing centres located around the world, where further archiving and re-processing will take place. Hundreds of physicists will then expect to find the necessary infrastructure to easily access and start analysing the long-awaited LHC data. In recent years, CMS has conducted a series of Computing, Software and Analysis challenges to demonstrate the functionality, scalability and usability of the relevant components and infrastructure. These challenges have been designed to validate the CMS distributed computing model [1] and to run operations in quasi-real data-taking conditions. We will present the CMS readiness in terms of data archiving, offline processing, data transfer and data analysis. We will focus in particular on the metrics achieved during 2008 and, where available, on first data-taking experience.
Keywords :
Calibration; Collision mitigation; Data analysis; Distributed computing; Laboratories; Large Hadron Collider; Performance analysis; Scalability; Software performance; Usability;
fLanguage :
English
Publisher :
IEEE
Conference_Title :
Nuclear Science Symposium Conference Record, 2008. NSS '08. IEEE
Conference_Location :
Dresden, Germany
ISSN :
1095-7863
Print_ISBN :
978-1-4244-2714-7
Type :
conf
DOI :
10.1109/NSSMIC.2008.4774773
Filename :
4774773