Provided by: University of Economics, Prague
Date Added: Mar 2008
The start of data taking at the Large Hadron Collider (LHC) will herald a new era in data volumes and distributed processing in particle physics. Data volumes of hundreds of terabytes will be shipped to Tier-2 centers for analysis by the LHC experiments using the Worldwide LHC Computing Grid (WLCG). In many countries, Tier-2 resources are distributed across a number of institutes, e.g., the geographically spread Tier-2s of GridPP in the UK. This presents a number of challenges for experiments seeking to use these centers efficiently, as CPU and storage resources may be sub-divided and exposed in smaller units than the experiments would ideally want to work with.