Marianne Lavelle For National Geographic News
October 29, 2012
The power of Titan, a supercomputer at the Oak Ridge National Laboratory in Tennessee, is akin to each of the world's 7 billion people being able to carry out 3 million calculations per second. Photograph courtesy Charles Brooks, Oak Ridge National Laboratory
Update November 12: Titan named world's fastest supercomputer.
In a breakthrough that harnesses video-game technology for solving science's most complex mysteries, the U.S. government's new Titan machine was named the world's fastest supercomputer. Deployed just two weeks ago, Titan is the fastest, most powerful, and most energy-efficient of a new generation of supercomputers that breach the bounds of "central processing unit" computing.
The announcement in Salt Lake City marks a return to the top of the closely watched, semiannual TOP500 list for the U.S. Department of Energy's (DOE) Oak Ridge National Laboratory in Tennessee. Its previous supercomputer, Jaguar, led the world for a year before being overtaken in 2010 by a Chinese system, which later was supplanted by a machine in Japan. Another U.S. government machine, Sequoia, had topped the list since June. Titan's performance on the TOP500's benchmark test was 17.59 petaflops, about 17,590 trillion calculations each second, edging out Sequoia, with a speed of 16.33 petaflops.
Think of Titan's power as akin to each of the world's 7 billion people solving 3 million math problems per second. Put another way, it would take 1,000 people, each working at a rate of one calculation per second, more than 500,000 years to complete the number of calculations that Titan can process in a single second.
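Both comparisons can be checked with back-of-the-envelope arithmetic, taking one petaflop as a quadrillion (10^15) calculations per second and a year as about 31.5 million seconds:

\[
7 \times 10^{9}\ \text{people} \times 3 \times 10^{6}\ \tfrac{\text{calculations}}{\text{second}} = 2.1 \times 10^{16}\ \tfrac{\text{calculations}}{\text{second}} \approx 21\ \text{petaflops},
\]

\[
\frac{17.59 \times 10^{15}\ \text{calculations}}{1{,}000\ \text{calculations per second}} \approx 1.76 \times 10^{13}\ \text{seconds} \approx 560{,}000\ \text{years}.
\]

The first figure lands a bit above Titan's 17.59-petaflop benchmark result because it is a rounded, order-of-scale comparison rather than an exact one.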
But Titan's signature achievement is how little energy it burns while blazing through those computations.
Titan's predecessor supercomputer at Oak Ridge, the 2.3-petaflop Jaguar machine, drew 7 megawatts (MW) of electricity, enough to power a small town. Titan needs just about 30 percent more electricity, 9 MW, while delivering ninefold greater computing power.
"We're able to achieve an order of magnitude increase in our scientific computing capabilities, which is what we need for our challenges, but to do so at essentially the same energy budget," says Jack Wells, director of science at the Oak Ridge Leadership Computing Facility. "Titan puts us on a different curve with respect to the energy consumption for increased computing power."
Video-Gaming Efficiency
Titan's energy-saving secret is a "hybrid" architecture that boosts the power of central processing units (CPUs) by marrying them to high-performance, energy-efficient graphics processing units (GPUs)—the technology that propels and animates today's most popular video games. A few dozen supercomputers around the world have used GPU and CPU processing in tandem since the first hybrid machine, the one-petaflop Roadrunner, debuted at Los Alamos National Laboratory in New Mexico in 2008. Titan is by far the largest.
To update pixels rapidly enough to bring angry birds, soldiers, and athletes to life on game consoles and handheld devices, GPUs have to handle large amounts of data at the same time, in parallel fashion. "This is exactly what we need for the future in order to enable progress and manage the energy [in supercomputing]," Wells says. If Titan had relied only on CPUs, which are optimized to do just one task at a time rapidly and flexibly (serial processing), Oak Ridge estimates the electricity requirements would have been about 30 MW, or more than three times greater than the system now demands.
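Titan's real application codes are vastly more elaborate, but a minimal sketch in CUDA, NVIDIA's programming platform for its GPUs, illustrates the difference in style. The serial CPU version walks through an array one element at a time; the data-parallel GPU version assigns each element to its own lightweight thread, thousands of which run at once. The vector-addition task and the names here are purely illustrative, not anything that runs on Titan itself.

#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// Serial (CPU) style: one core visits each element in turn.
void add_cpu(const float *a, const float *b, float *c, int n) {
    for (int i = 0; i < n; ++i)
        c[i] = a[i] + b[i];
}

// Data-parallel (GPU) style: each thread handles exactly one element,
// and thousands of threads execute simultaneously.
__global__ void add_gpu(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)  // guard threads that fall past the end of the array
        c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;  // about a million elements
    const size_t bytes = n * sizeof(float);

    // Set up input data on the host (CPU) side.
    float *ha = (float *)malloc(bytes);
    float *hb = (float *)malloc(bytes);
    float *hc = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    // Copy the inputs into the GPU's own memory.
    float *da, *db, *dc;
    cudaMalloc(&da, bytes);
    cudaMalloc(&db, bytes);
    cudaMalloc(&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements.
    add_gpu<<<(n + 255) / 256, 256>>>(da, db, dc, n);

    // Copy the result back and spot-check it.
    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);
    printf("c[0] = %f\n", hc[0]);  // expect 3.0

    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(ha); free(hb); free(hc);
    return 0;
}

On a real hybrid node, the CPU orchestrates the work and handles the serial parts of a program while the GPU churns through the parallel bulk; the kernel launch and memory copies above are the moving parts that make that division of labor possible.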
Titan's approach is not the only path to energy-efficient supercomputing. IBM's Sequoia BlueGene/Q supercomputer at the U.S. Department of Energy's Lawrence Livermore National Laboratory in California, now No. 2 on the TOP500 list, belongs to a family of supercomputers that have been leaders in low-power design. Sequoia achieves energy efficiency similar to Titan's (it draws 8 MW and peaks at 20 petaflops) through a design that uses many small, low-power embedded chips connected through specialized networks inside the system. Four of the current ten fastest supercomputers are BlueGene/Q machines, but the design does not use widely available commodity processors.
But Oak Ridge and its machine designer, Seattle-based Cray, have built Titan with processors made by the same companies that make the processors in consumer personal computing and gaming products. The upgrade from the Jaguar system to the Titan Cray XK7 system, which cost about $100 million, relies on AMD Opteron CPUs (299,008 CPU cores in all) and NVIDIA Tesla GPUs. It's an approach that has allowed Oak Ridge to take advantage of advances in the broader information technology market—including the highly efficient processing needed for video games—to drive energy efficiency.
"There's an economic model here that really enables this to work," says Steve Scott, chief technology officer for NVIDIA, based in Santa Clara, California. "The high-performance computing industry has great demand but it's not a very large market. But we're able to leverage this very broad consumer technology and use that to enable power-efficiency breakthroughs and make this high-performance computational tool possible. (See "Supercomputing Power Could Pave the Way to Energy-Efficient Engines")
"So when you go out and download and play the latest video game," Scott says, "you actually are helping to advance science."
From Motors to Skin
Because Titan marks an achievement in energy efficiency, it is perhaps appropriate that one of its primary uses will be to advance science on the future of energy. Titan will be put to work on research into systems for more fuel-efficient automobiles, for safer nuclear power reactors with improved power output, and on advanced magnets that could drive future electric motors and generators. It also will be used in research to model more accurately the impact of climate change.
These projects were among 61 science and engineering projects awarded time on Titan and another U.S. supercomputer at Argonne National Laboratory outside of Chicago, the DOE announced today. Scientists in fields from molecular biology to materials science vie for time on the machine at Oak Ridge and other U.S. government facilities, in a competitive process in which projects are picked for "high potential for accelerating discovery and innovation." The deployment of Titan makes it the largest open science supercomputer in operation in the world today. (In contrast, Sequoia is dedicated to classified work on maintenance of the U.S. government's nuclear weapons stockpile.)
Although researchers use supercomputers to model the staggeringly complex interactions of natural and man-made systems, some of the applications of these systems are commonplace, even mundane. The giant consumer products company Procter & Gamble has its own supercomputer (often ranked in the TOP500, though not in the top ten) to tackle such problems as how to make strong paper towels that tear easily at the perforation, how to make billions of diapers at blinding speed, and how to engineer containers that open easily but don't leak.
"Last year, we did over 50,000 calculations on plastic bottles," says Tom Lange, director of modeling and simulation corporate research and development for Procter & Gamble.
Now, P&G researchers, working in partnership with scientists from Temple University in Philadelphia, have been awarded time on Titan because they are tackling a project deemed of broad interest and stunning complexity. Their work will aim to develop the first molecular-based model for understanding how lotions or drugs are delivered through the skin.
Michael Klein, director of Temple's Institute for Computational Molecular Science, explains that only in recent years has it been understood that beneath the outermost layer, human skin is made of a complex matrix of lipids: cholesterol, so-called free fatty acids, and long-chain lipids known as ceramides. "The structure of this matrix gives skin all these beautiful properties" of flexibility, resilience, water resistance, and the like, he says. "Understanding how these processes work is of great interest to consumer products companies."
Because the research is conducted on the big government machine, it will be published and shared with the scientific community at large, so its potential to advance medical science carries a broad public benefit.
It's just one example of the surprisingly wide reach of supercomputing work. "The scope is as broad as science and engineering is broad," says Wells. "It is not so much about having the leading supercomputer on the TOP500 list. That's significant, but it's not really what we're focused on. We're focused on the science and engineering applications. It's about clean energy, it's about clean air. It's about a sustainable future. We offer our resources to companies big and small to come work with us to take a look into the future."
This story is part of a special series that explores energy issues. For more, visit The Great Energy Challenge.