Tag Archives: LHC

Alarming news from the LHC

Despite the successful beam circulation in the LHC with no apparent ill effects, it has been revealed that the LHC only avoided destroying the Earth because the planet had already been destroyed:

It is our duty to inform you that as of 7:35:05am UTC on September 10, 2008, the Earth has been destroyed.

The destruction of Earth was first reported by Mr Jonathan Barber of Wisconsin, United States, who spotted that his home-made seismic Earth Detector had ceased to give readings at around 8:00am (2am local time). Several other amateur geocide spotters noticed this at the same time but Mr. Barber was the first to place a telephone call to the IEDAB's Geocide Hotline (+44 115 09Ω 4127, ask for Other Dave) at which point IEDAB officials performed an emergency check of their own instrumentation and verified Mr. Barber's report, as well as fixing the exact time of geocide.

Evidence is still being collated, but preliminary results suggest that the Earth was destroyed pre-emptively by scientists at the Large Hadron Collider at CERN, Geneva, Switzerland, before the commencement of their experiments to locate the Higgs Boson, as a precautionary measure to ensure that the experiment itself could not result in the destruction of the Earth.

Flood of data from the LHC

CERN Computing Center

If all goes well with the Large Hadron Collider this week, it will finally get a beam to go around a full circle, almost a month after the first beam injection. That will be quite exciting from a physics standpoint, although it will be much more exciting when they manage head-on collisions between two beams a couple of months later. The LHC is also very impressive in terms of its supporting computing infrastructure.

The LHC is going to generate an incredible number of collision events, far too many to handle in a single computing center, even a center with more than 100,000 computers. This means they need a computing infrastructure distributed all over the world that can handle the flood of data coming out of the collider. With about one DVD's worth of data being generated every five seconds, the data is first received by CERN's computing center, which then distributes it to 11 computing sites in Europe, North America, and Asia. These sites in turn give scientists access to the collision data on their own computers, which do the actual CPU-intensive work of analyzing it for new discoveries.
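To put that in perspective, here is a quick back-of-the-envelope sketch in Python. The numbers are my own assumptions (a single-layer DVD of roughly 4.7 GB, and continuous running, which the accelerator of course doesn't actually do), not official CERN figures:

```python
# Rough estimate of the data rate implied by "one DVD every five seconds".
# Assumptions: single-layer DVD ~4.7 GB, data arriving around the clock.

DVD_GB = 4.7          # approximate capacity of a single-layer DVD, in GB
SECONDS_PER_DVD = 5   # one DVD's worth of data every five seconds

rate_gb_per_s = DVD_GB / SECONDS_PER_DVD
seconds_per_year = 365 * 24 * 3600

print(f"Sustained rate: ~{rate_gb_per_s:.2f} GB/s")
print(f"Per day:  ~{rate_gb_per_s * 86400 / 1000:.0f} TB")
print(f"Per year: ~{rate_gb_per_s * seconds_per_year / 1e6:.1f} PB")
```

Even with these simplistic assumptions, that works out to roughly a gigabyte per second and tens of petabytes per year, which is why no single site is expected to cope with it alone.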

And despite the incredible amount of data that comes out of the collider, it's mind-boggling that this is just a tiny fraction of what is originally produced in it. Of the roughly 40 million collision events occurring every second, a lot of work goes into filtering out the "boring" and "well-known" events that our current theories of physics can already explain quite well, so that data for "only" about 100 potentially interesting events per second actually comes out of the collider. Without such filters, even all the computing power and network bandwidth in the world would not be able to handle the data.
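For the curious, here is the same kind of rough arithmetic for that filtering step, again using the approximate figures quoted above rather than anything official:

```python
# How aggressive is the event filtering described above?
# Assumptions: ~40 million collision events per second produced,
# ~100 potentially interesting events per second kept.

events_in_per_s = 40_000_000
events_kept_per_s = 100

reduction_factor = events_in_per_s / events_kept_per_s
kept_fraction = events_kept_per_s / events_in_per_s

print(f"Reduction factor: 1 in {reduction_factor:,.0f}")
print(f"Fraction kept:    {kept_fraction:.6%}")
```

In other words, only about one event in every 400,000 survives the filters, and everything downstream of the collider only ever sees that sliver.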

Overexcited about beam circulation

A lot of science web sites seem to be excited about plans for the initial beam circulation in the Large Hadron Collider on September 10. While it's a great step toward getting the particle accelerator online, I'm a bit less excited about it than others. Personally, I'm waiting for the start of actual collisions between opposing beams a month or two later, which is when we'll start to get some real scientific data.

I wonder if this makes me a dour wet blanket?