Inside a 17-mile tunnel deep beneath the French-Swiss border, physicists hope to detect evidence of extra dimensions, invisible "dark matter" and an elusive particle called the Higgs boson.
Success in this $10 billion endeavor would revolutionize our understanding of the universe.
But even the massive computing power at the European Organization for Nuclear Research can't sift through all the data that will pour in when its particle-smashing experiment begins on Wednesday.
So the Geneva-based lab, known by its old French acronym CERN, devised a way of sharing the burden among dozens of leading computing centers around the world.
The result is the "LHC Grid," a global network of 60,000 computers that will analyze what happens when protons are hurled at each other inside CERN's Large Hadron Collider.
"This is the next step after the Web," says David Colling, a scientist at Britain's Imperial College, which is contributing to the Grid. "Except that unlike the Web, you're sharing computing power and not files."
That computing power is needed if scientists are to find what they are looking for among the mountains of data produced when four giant detectors, 10 times more accurate than any previous instruments, begin measuring activity at the subatomic level.
"You can think of each experiment as a giant digital camera with around 150 million pixels taking snapshots 600 million times a second," explains CERN's Ian Bird, who leads the Grid project.
Sophisticated filters discard all but the most interesting data, still leaving some 15 petabytes to be analyzed each year. That's enough to fill two million DVDs.
The data are sent via high-speed lines to 11 top research institutions in Europe, North America and Asia, and from there to a wider network of some 150 research facilities around the world where they can be scrutinized by thousands of researchers.
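The tiered fan-out described above can be sketched in a few lines of code. The sketch below is purely illustrative, assuming hypothetical center and site names and a simple round-robin split; it is not CERN's actual topology or software.

```python
# Illustrative sketch of the LHC Grid's tiered data distribution:
# CERN (Tier 0) streams datasets to a handful of major centers (Tier 1),
# which replicate them to many analysis sites (Tier 2). All names and
# numbers here are hypothetical, for illustration only.

TIER1_CENTERS = ["ral-uk", "fnal-us", "in2p3-fr"]     # 11 in reality
TIER2_SITES = {                                       # ~150 in reality
    "ral-uk": ["imperial", "glasgow"],
    "fnal-us": ["wisconsin", "caltech"],
    "in2p3-fr": ["lyon", "marseille"],
}

def distribute(datasets):
    """Round-robin raw datasets over Tier-1 centers, then replicate
    each center's share to all of its Tier-2 analysis sites."""
    tier1_share = {c: [] for c in TIER1_CENTERS}
    for i, ds in enumerate(datasets):
        tier1_share[TIER1_CENTERS[i % len(TIER1_CENTERS)]].append(ds)
    tier2_copies = {
        site: list(share)
        for center, share in tier1_share.items()
        for site in TIER2_SITES[center]
    }
    return tier1_share, tier2_copies

t1, t2 = distribute([f"run-{n:03d}" for n in range(6)])
print(t1["ral-uk"])    # datasets held at one Tier-1 center
print(t2["imperial"])  # the same data replicated to a Tier-2 site
```

The design point is that no single site holds everything: each Tier-1 center stores a slice of the data, and the Tier-2 sites where researchers actually run analyses pull copies from their regional center rather than from CERN itself.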
"The LHC experiment would not be possible without this infrastructure, that's why particle physicists have really driven the Grid," says Colling.
Building a new computer center at CERN would have been impractical and costly, so scientists proposed a distributed network that makes use of each country's own research facilities, ensures they all have equal access and gives them a chance to share in the glory of any discovery.
Already the experience of collaborating on such a large computing project has proved invaluable, says Ruth Pordes, executive director of the Open Science Grid at Fermilab in Chicago. The U.S. government-funded project is among the major contributors to the LHC Grid.
"We are doing things that are at the boundaries of science," says Pordes. "But the technologies, the methods and the results will be picked up by industry."
Scientists expect grid computing to become more widely used in the future for research ranging from new drugs to more effective nuclear power. Eventually, consumers will start seeing it used in daily life to regulate traffic, predict the weather or even boost a flagging economy.
"In credit risk, the amount of money you can lend out is directly proportional to how many calculations you can do to quantify your risk," notes Imperial's David Colling.
So even if the LHC experiment doesn't yield answers to the cosmic questions posed by physicists at CERN, historians may one day see it as a key step in the development of networked computing.
It wouldn't be the first time that has happened at CERN. In 1990 a young British researcher there created a computer-based system for sharing information with colleagues around the world.
He called it the World Wide Web.