
Sunday, July 22, 2012

IBM's Sequoia supercomputer headed the latest Top500 list of the world's fastest supercomputers

            


IBM Sequoia supercomputer


After winning top honors as the fastest supercomputer in the world -- for now, anyway -- IBM's Sequoia is getting ready to go to work for the Lawrence Livermore National Laboratory. Its duty: help manage the United States' supply of nuclear weapons. "Much of the modeling for defense, aerospace, medical research and anticipating natural disasters is done on supercomputers," said analyst Rob Enderle.



IBM's Sequoia supercomputer headed the latest Top500 list of the world's fastest supercomputers. The twice-yearly roster was released at the 2012 International Supercomputing Conference in Hamburg, Germany, in June.



An exclusive photo of the IBM Sequoia development team


Delivered to the Lawrence Livermore National Laboratory (LLNL) in 2011, Sequoia is scheduled for operational deployment this year. It will be used to manage the United States' nuclear weapons stockpile by enabling uncertainty quantification calculations to help extend the life of aging weapons systems. These calculations predict the likely range of outcomes when some of the underlying variables remain uncertain.
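
In practice, uncertainty quantification of this kind is usually done by running large ensembles of simulations: the uncertain inputs are drawn from assumed probability distributions, the model is evaluated for each draw, and the spread of predicted outcomes is summarized. Here is a minimal Python sketch of that Monte Carlo idea. To be clear, the surrogate model, its coefficients and the input distributions below are invented purely for illustration; this is not LLNL's weapons code or workflow.

import random
import statistics

def component_lifetime(stress, temperature):
    # Hypothetical surrogate model: predicted remaining life (in years) of a
    # component from two uncertain inputs. The coefficients are invented.
    return 40.0 - 0.8 * stress - 0.05 * (temperature - 20.0)

def monte_carlo_uq(n_samples=100_000, seed=42):
    # Draw the uncertain inputs from assumed distributions, evaluate the
    # model for each draw, and summarize the spread of outcomes.
    rng = random.Random(seed)
    outcomes = []
    for _ in range(n_samples):
        stress = rng.gauss(10.0, 2.0)          # assumed mean and spread
        temperature = rng.uniform(15.0, 35.0)  # assumed operating range
        outcomes.append(component_lifetime(stress, temperature))
    outcomes.sort()
    return {
        "mean": statistics.mean(outcomes),
        "stdev": statistics.stdev(outcomes),
        "worst_5_percent": outcomes[int(0.05 * n_samples)],
    }

print(monte_carlo_uq())

The actual calculations at LLNL naturally involve far more sophisticated physics and run at a vastly larger scale, which is why a petaflop-class machine is needed for the job.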

Sequoia will be restricted to weapons management and will be moved onto its own classified network. National Nuclear Security Administration spokesperson Courtney Greenwald was not immediately available to provide further details.


The Genes Have It

LLNL will get another IBM supercomputer, the Vulcan, which will be made available to researchers. IBM is collaborating with LLNL to build this 24-rack machine, which belongs to the same Blue Gene/Q family as Sequoia.

Blue Gene is an IBM project aimed at creating supercomputers with operating speeds in the petaflops range with low power consumption. It has created three generations of supercomputers, the Blue Gene/L, Blue Gene/P and Blue Gene/Q.

All Blue Gene/Q systems are based on IBM's 1.6 GHz PowerPC processor, Michael Rosenfield, director of Deep Computing Systems at IBM Research, told TechNewsWorld. This is a massively multicore-capable, multithreaded 64-bit Power Architecture processor core that IBM built to the Power ISA v2.06 specification.
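
As a rough sanity check on that "petaflops range" goal, peak performance can be estimated from the clock rate, the number of floating-point operations each core can issue per cycle, and the core count. The short Python sketch below uses commonly cited Blue Gene/Q figures (a 1.6 GHz clock, 8 double-precision flops per core per cycle from 4 fused multiply-adds) together with the roughly 1.57 million Sequoia cores quoted later in this article. Treat it as a back-of-the-envelope estimate, not an official IBM specification.

# Back-of-the-envelope peak-performance estimate for Sequoia.
# The per-core figures are commonly cited Blue Gene/Q numbers, not taken
# from this article, so treat the result as an estimate.
cores = 1_572_864        # roughly 1.57 million compute cores
clock_hz = 1.6e9         # 1.6 GHz clock
flops_per_cycle = 8      # 4 fused multiply-adds per core per cycle

peak_flops = cores * clock_hz * flops_per_cycle
print(f"Estimated peak: {peak_flops / 1e15:.1f} petaflops")  # about 20.1 petaflops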

Why the PowerPC?

PowerPC processors were used in Apple's PowerMac, iBook, eMac and PowerBook computers, as well as in its iMacs, Mac minis and Xserve servers. PowerPC-based chips also powered the TiVo Series1 DVR and the Sony PlayStation 3.

Although 84 percent of the supercomputers on the Top500 list use the less power-hungry and lower-cost Intel and AMD Opteron processors, IBM's using the PowerPC because "this is the technology IBM owns and knows the most about so this is what they will be most competitive with," Rob Enderle, principal analyst at the Enderle Group, told TechNewsWorld.

Supercomputers "are pretty much custom efforts that are designed to do massive levels of processing, so it's often more about how the system is optimized than the processors it uses," Enderle continued. "Some of the more recent ones have started to use GPUs, for instance."

Supercomputers in Research

"Much of the modeling for defense, aerospace, medical research and anticipating natural disasters is done on supercomputers," Enderle said. "Without them, we would be effectively out of a number of markets and largely be incapable of truly anticipating and properly planning for those moments when Mother Nature has a hissy fit."

The Argonne National Laboratory already hosts Intrepid, a Blue Gene/P supercomputer. It's getting a 10-petaflop Blue Gene/Q system named "Mira" that will give researchers worldwide access via blocks of compute time awarded through the U.S. Department of Energy's INCITE program, IBM's Rosenfield said.

IBM and LLNL will use the Vulcan supercomputer in a high-performance computing (HPC) collaboration called "Deep Computing Solutions." Computer and domain science experts from both organizations will help devise HPC solutions to accelerate the development of new technologies, products and services in several areas, including green energy, applied energy, biology, materials science, manufacturing and data management.

Linux Rules in Blue Genes

Blue Gene/Q systems have two types of nodes: compute nodes and input/output (I/O) nodes. Between 32 and 128 compute nodes are linked to an I/O node, IBM's Rosenfield said.
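
To get a feel for the scale, that compute-to-I/O ratio translates into a comparatively small number of I/O nodes even on a machine the size of Sequoia. The Python sketch below assumes 16 cores per Blue Gene/Q compute node (a commonly cited figure, not taken from this article) together with the 1.57 million cores Rosenfield mentions a few paragraphs down; the configuration LLNL actually deployed may differ.

# Rough estimate of Sequoia's node counts from the figures in this article.
# Assumes 16 cores per Blue Gene/Q compute node (commonly cited figure);
# the configuration LLNL actually deployed may differ.
total_cores = 1_572_864
cores_per_node = 16
compute_nodes = total_cores // cores_per_node   # 98,304 compute nodes

for ratio in (32, 128):                         # the range quoted above
    io_nodes = compute_nodes // ratio
    print(f"{ratio} compute nodes per I/O node -> {io_nodes:,} I/O nodes")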

The I/O nodes run Red Hat Enterprise Linux. "Linux has the full set of input/output capabilities and supports all system calls, so we use it on the I/O nodes," Rosenfield explained.

The compute nodes run an open source, highly scalable and reliable operating system called the "Compute Node Kernel."

"In Sequoia, we have 1.57 million compute cores running concurrently," Rosenfield stated. "We need a highly efficient operating system that scales with the size of the system. CNK has proven to scale in the sense that, if a new application is slowing down while scaling up, we know it is not the CNK operating system slowing it down."

Further, "Linux is huge and highly complex," Rosenfield pointed out. "To replicate Linux 1.57 million times will not meet the stringent reliability requirements."

Other Supercomputer Solutions

IBM supercomputers make up nearly 43 percent of the latest Top500 list, followed by HP with more than 27 percent. However, other supercomputers are also making their mark in research.

"We had Blue Genes many years ago, but they have been decommissioned," Jan Sverina, director of media relations at the supercomputer center of the University of California at San Diego (UCSD), told TechNewsWorld. "It's from Appro, and we call it 'Gordon,' for Flash Gordon."

Appro supercomputers make up more than 3 percent of the Top500 list.

UCSD's Gordon uses Flash-based memory, which is "10 to 100 times faster than most systems," Zverina said. "At the end of the day, supercomputers are a tool and, if you can do data-intensive analysis over a much shorter period and with much larger datasets, you're immediately increasing the productivity of the researchers."

That's the same goal the Blue Gene project is aiming for.
