Using the Large Hadron Collider, CERN recently discovered a particle consistent with the Higgs boson, though preliminary results are far from conclusive. ITNews sat down with David Foster, CERN's deputy head of IT, to discuss what it takes to process the massive amounts of data associated with such experiments and how the organization is planning for the future. With each collision, raw data is filtered through thousands of machines at CERN's data center at a mind-boggling petabyte (one million gigabytes) per second, then instantly distributed to additional facilities for further processing via grid technology. Of course, the intricacies of the organization's data infrastructure are far more complicated. Fortunately, CERN's IT department is led by someone who fully understands that information technology means something entirely different when you're talking about the "God particle."
The Large Hadron Collider and processing data at a million gigabytes per second
ITNews recently sat down with CERN's deputy head of IT to discuss what it takes to process data at a petabyte per second.