Here is how the problem goes: the CMS experiment at CERN generates a LOT of data per second. How much? CMS is essentially a 75-megapixel camera that takes snapshots of the particle collisions inside the detector. Though 75 megapixels may not sound like much, you will be amazed to hear that this camera can take up to 40 million snapshots per second! It tries to catch the best moments of the roughly one billion collisions that happen every second. The raw data generated amounts to about 1 petabyte per second: that is 10^15 bytes, or 1,000 terabytes, every second.
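A quick back-of-the-envelope check of those figures (the per-snapshot size below is not an official number, just what falls out of dividing the quoted totals):

```python
# Illustrative arithmetic using the rates quoted in the text.
snapshots_per_sec = 40_000_000   # 40 million snapshots per second
total_rate_bytes = 10**15        # ~1 petabyte per second

# Average data per snapshot implied by those two numbers (an
# inferred figure, not an official CMS specification).
bytes_per_snapshot = total_rate_bytes // snapshots_per_sec
print(bytes_per_snapshot // 10**6, "MB per snapshot")   # 25 MB
print(total_rate_bytes // 10**12, "TB per second")      # 1000 TB
```

So each snapshot carries on the order of tens of megabytes, and a full second of raw output would fill a thousand terabyte-sized drives.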
Storing all that data would literally mean filling 1,000 hard drives of 1 TB capacity every second, which is practically impossible. So what do they do?
The trick is to filter the data, and this filtering is done in two stages. The first stage is a hardware filter called "the trigger", which cuts the 40 million snapshots/sec down to some 100,000 snapshots/sec. The trigger forms an overall "idea" of what happened inside the collider and decides which of the collisions are interesting.
The second stage is software-based and runs on a computer farm called "the farm". It further reduces the rate from 100,000 snapshots/sec down to about 100 to 300 snapshots/sec.
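The two-stage cascade can be sketched as a pair of filters applied in sequence. This is a deliberately simplified model: the real trigger makes fast physics-based decisions in hardware, while here each stage just keeps every k-th snapshot so that the output rates match the numbers quoted above.

```python
def hardware_trigger(snapshots, keep_ratio=100_000 / 40_000_000):
    # Stage 1, "the trigger": 40 million/sec -> ~100,000/sec.
    # Stand-in for the hardware decision: keep every k-th snapshot.
    step = round(1 / keep_ratio)          # k = 400
    return snapshots[::step]

def software_filter(snapshots, keep_ratio=200 / 100_000):
    # Stage 2, "the farm": ~100,000/sec -> ~200/sec.
    step = round(1 / keep_ratio)          # k = 500
    return snapshots[::step]

# One second's worth of snapshot IDs (a range, so nothing huge
# is materialised in memory).
one_second = range(40_000_000)
after_trigger = hardware_trigger(one_second)
after_farm = software_filter(after_trigger)
print(len(after_trigger), len(after_farm))  # 100000 200
```

The point of the cascade is economics: the cheap, fast stage throws away 99.75% of the data before the expensive software stage ever sees it.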
These snapshots are then distributed for further analysis to a global computing grid called "the grid".