Faster Computers Accelerate Pace of Discovery
Dec 3, 2007 | Washington Post
By Christopher Lee
Washington Post Staff Writer
Monday, December 3, 2007; A07

Sometime next year, developers will boot up the next generation of supercomputers, machines whose vast increases in processing power will accelerate the transformation of the scientific method, experts say.

The first "petascale" supercomputer will be capable of 1,000 trillion calculations per second. That is about twice as powerful as today's dominant model, BlueGene/L, a basketball-court-size beast at the Energy Department's Lawrence Livermore National Laboratory in California that performs at a peak of 596 trillion calculations per second.

The computing muscle of the new petascale machines will be akin to that of more than 100,000 desktop computers combined, experts say. A computation that would take a lifetime for a home PC, and that can be completed in about five hours on today's supercomputers, will be doable in as little as two hours.

"The difficulty in building the machines is tremendous, and the amount of power these machines require is pretty mind-boggling," said Mark Seager, assistant department head for advanced computing technology at Lawrence Livermore. "But the scientific results that we can get out of them are also mind-boggling and worth every penny and every megawatt it takes to build them."

A leading candidate to become the first petascale machine, the "Roadrunner" supercomputer being developed by IBM in partnership with the Energy Department's Los Alamos National Laboratory, will require about 4 megawatts of power -- enough to illuminate 10,000 light bulbs, said John Hopson, program director for advanced simulation and computing at Los Alamos in New Mexico.

But scientists say Roadrunner and its cousins will make possible dramatically improved computer simulations. That will help shed new light on subjects such as climate change, geology, new drug development, dark matter and other secrets of the universe, as well as other fields in which direct experimental observation is time-consuming, costly, dangerous or impossible.

In fact, supercomputers and their simulations are becoming so powerful that they have essentially introduced a new step into the time-honored scientific method, which moves from theory to hypothesis to experimental confirmation, some experts contend.

"They are a tool that really helps stimulate the imagination of scientists and engineers in ways that previously weren't possible," said David Turek, vice president of supercomputing at IBM. "You had theory and hypothesis and experimentation. Well, now scientists are admitting that computation is an important part of this, as well."

"Nature is the final arbiter of truth," said Seager, the Lawrence Livermore computer scientist, but "rather than doing experiments, a lot of times now we're actually simulating those experiments and getting the data that way. We can now do as much scientific discovery with computational science as we could do before with observational science or theoretical science."

A particularly fruitful area of computer modeling has been the study of global climate change. Ten years ago, experts agreed that humans probably were contributing to global warming. Now, in part because of a 10,000-fold increase in computing power and better accuracy in climate simulations, scientists are sure of it. One result is that computer climate models can now simulate atmospheric and oceanic conditions and, crucially, how changes in each affect the other, experts said.
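The arithmetic behind those speed comparisons is worth spelling out. Below is a rough back-of-envelope sketch in Python; the petascale and BlueGene/L rates are from the article, while the roughly 10 billion calculations per second assumed for a 2007 desktop and the perfect-scaling assumption are mine, chosen to line up with the "100,000 desktops" comparison.

```python
# Back-of-envelope check of the article's figures.
PETASCALE_FLOPS = 1_000e12    # 1,000 trillion calculations/second (article)
BLUEGENE_L_FLOPS = 596e12     # BlueGene/L peak rate (article)
DESKTOP_FLOPS = 10e9          # assumed ~10 billion calc/s for a 2007 PC

# How many desktops equal one petascale machine?
desktops = PETASCALE_FLOPS / DESKTOP_FLOPS
print(f"Desktop equivalents: {desktops:,.0f}")            # 100,000

# A job that takes ~5 hours at BlueGene/L's peak, rerun at 1 petaflop
# assuming perfect scaling:
hours_today = 5.0
hours_petascale = hours_today * BLUEGENE_L_FLOPS / PETASCALE_FLOPS
print(f"On a petascale machine: ~{hours_petascale:.1f} hours")  # ~3.0

# The same job on a single desktop, in years of nonstop computing:
years_on_pc = hours_today * BLUEGENE_L_FLOPS / DESKTOP_FLOPS / (24 * 365)
print(f"On one desktop: ~{years_on_pc:,.0f} years")        # ~34 years
```

At a nominal 1 petaflop the five-hour job drops to about three hours, and the desktop version runs to a few decades, roughly the article's "lifetime." The "as little as two hours" figure presumably assumes a machine that clears the petaflop threshold with some headroom.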
Now the worry is not that computing power is inadequate but that the aging of NASA's weather satellites will lead to a shortage of input data before long, Seager and others said.

Petascale computers also will make it possible to predict, say, the effect of an earthquake on every building in downtown Los Angeles, experts said. Current models cannot yield predictions for areas smaller than a square mile or two. The increased detail could help shape building codes and be a valuable tool in evacuation planning and disaster preparedness.

Computer simulations also help assess the reliability, safety, security and performance of weapons in the U.S. nuclear stockpile, years removed from any real-life nuclear tests.

"Nuclear weapons are the quintessential example of something you can't really test anymore, so a lot of it has to be done computationally," said Hopson, the Los Alamos scientist.

Other potential uses of petascale computers include better simulations of what happens when stars explode into supernovas and die, and new and more refined analyses of experimental drugs, their effects on disease and their interactions with other medications, experts said. Still another is the modeling of the bird flu virus and how it might evolve to become more communicable and lethal -- knowledge that could help scientists develop a vaccine in time to use it and inform public health planning.

Petascale computers are also expected to lead to more potent models for Wall Street to calculate risk and predict the fate of financial instruments, as well as more advanced digital prototypes of automobiles and jet aircraft, further reducing the need for physical mock-ups.

The remarkable advances in computing power of recent decades are frequently attributed to the tenet known as Moore's Law, named for Intel co-founder Gordon E. Moore, which says that progress in building chips doubles the power of microprocessors about every 18 months. But that alone does not explain the leaps in supercomputing, scientists said.

Today's supercomputers rely not only on better "compute nodes" (made up of faster chips and more memory), but also on scientists' ability to "gang" hundreds of thousands of those nodes together in a single machine and to devise better ways of having them communicate with one another and divide up the work of complex problem solving.

"If you ran today's code on yesterday's computers, they would be much faster," said Raymond Bair, director of the Argonne Leadership Computing Facility at the Energy Department's Argonne National Laboratory near Chicago. "People have figured out how to solve the problems faster."

Even before a petascale computer is a reality, scientists are anticipating the next big milestone: the exascale machine, a thousand times more powerful still and capable of 1 million trillion calculations per second. But they'll have to wait. That one isn't expected until about 2018.
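A quick calculation shows why Moore's Law alone falls short, as the scientists say. This sketch is mine, not from the article: it takes the 18-month doubling period quoted above and the roughly ten-year gap between the first petascale machine (expected "sometime next year," so about 2008) and the projected exascale machine in 2018.

```python
# Does Moore's Law alone explain a 1,000x jump in ten years?
DOUBLING_MONTHS = 18       # Moore's Law doubling period, per the article
SPAN_MONTHS = 10 * 12      # petascale (~2008) to exascale (~2018)

chip_gain = 2 ** (SPAN_MONTHS / DOUBLING_MONTHS)
target_gain = 1_000        # exaflop / petaflop

print(f"Gain from faster chips alone: ~{chip_gain:,.0f}x")   # ~102x
print(f"Gain needed for exascale: {target_gain:,}x")
print(f"Left over for parallelism and algorithms: "
      f"~{target_gain / chip_gain:.0f}x")                     # ~10x
```

Faster chips alone would deliver only about a hundredfold gain over that decade; the remaining factor of ten or so has to come from ganging more nodes together and from the better algorithms and communication schemes Bair describes.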