Applying Some Grunt: Supercomputers

In the search for extraterrestrials, SETI recently commissioned its first radio telescope of its own, the Allen Telescope Array, in the mountains of Northern California. Now SETI can scan and analyse in real time, thanks to backers such as Paul Allen and Gordon Moore. Rather than one big antenna, the Allen Array consists of 350 smaller antennas, uses off-the-shelf parts and costs US$35 million instead of US$200 million. Some of the radio dishes are already working, but the full array of six-metre dishes will be completed sometime in 2008 and will be the equivalent of a single 100-metre dish.

SETI started in the 1960s, but the organisation has been operating on borrowed time, as it were. You see, the search has never had its own facilities to scan the skies for signs of life. Instead, searchers had to borrow telescope time, collect as much data as they could and take it away to be analysed. And since government funding was discontinued in 1993, the SETI search has splintered in two. One group is undertaking a whole-of-sky mapping exercise, while the other is carrying out a targeted search of likely candidate stars.
Currently, much of the huge data processing load is shared amongst some half a million volunteer PCs, at a combined rate of more than a trillion flops. It is the spare computing power those volunteers donate that carries the massive SETI@Home processing task.

SETI@Home was not the first distributed processing project, but it may be the best known. There are any number of tasks hungry for spare cycles – folding proteins, searching for planets, fighting AIDS. Increasingly, SETI@Home relies on BOINC (the Berkeley Open Infrastructure for Network Computing), a software platform designed for distributed computing on volunteered computer resources. Another way to get access to a lot of processing power is to own your own supercomputer. Just in time for the official announcement of the world’s supercomputer list at the 20th International Supercomputer Conference in late June, IBM announced it had commissioned a new BlueGene. The world’s most powerful privately owned supercomputer, the Watson Blue Gene (BGW), is capable of a processing speed of 91.29 Teraflops – just a shade less than the power being applied by SETI@Home.
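The idea behind BOINC-style volunteer computing can be sketched in a few lines: a server chops a big dataset into independent work units, volunteers crunch them with spare cycles, and the results are merged back together. This is a toy illustration only – the function names are hypothetical and the real BOINC platform is a far more elaborate client/server system.

```python
def split_into_work_units(data, unit_size):
    """Chop a large dataset into independent chunks (hypothetical helper)."""
    return [data[i:i + unit_size] for i in range(0, len(data), unit_size)]

def volunteer_process(unit):
    """Stand-in for the analysis one volunteer PC performs on one chunk."""
    return sum(unit)  # e.g. a simple reduction over the samples

def merge(results):
    """Combine per-unit results back into one answer on the server."""
    return sum(results)

signal = list(range(1000))                        # pretend telescope samples
units = split_into_work_units(signal, 100)        # 10 independent work units
results = [volunteer_process(u) for u in units]   # done by many PCs in reality
print(merge(results))                             # same answer as doing it all at once
```

Because the work units are independent, adding more volunteers scales the throughput almost linearly – which is why half a million PCs can take on a load that would otherwise need a supercomputer.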
The system joins its sister machine, BlueGene/L, at Lawrence Livermore National Laboratory. BlueGene/L is about to be upgraded to a healthy 360 Teraflops once the full 64-rack system (130,000 PowerPC CPUs) is completed this year.
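A quick back-of-the-envelope check on those figures: if the full machine is 64 racks of 1,024 dual-CPU nodes (the precise node count is an assumption here; the article rounds it to 130,000 CPUs), the quoted 360 Teraflops works out to a few GFLOPS per PowerPC chip.

```python
# Rough per-CPU throughput implied by the BlueGene/L figures quoted above.
total_flops = 360e12        # 360 Teraflops for the full 64-rack system
cpus = 64 * 1024 * 2        # 64 racks x 1,024 nodes x 2 PowerPC CPUs (assumed layout)
per_cpu = total_flops / cpus
print(f"{per_cpu / 1e9:.2f} GFLOPS per CPU")  # roughly 2.75 GFLOPS each
```

The individual chips are unremarkable; it is the sheer number of them, and the interconnect between them, that makes the machine fast.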

With the loss of high-profile customer Apple Computer, IBM was probably glad of an opportunity to tell the world how great the PowerPC architecture is. The installation of a four-rack system at Switzerland’s EPFL coincided with the Apple announcement. It is hardly worth a mention at a speed of only 22.8 Teraflops, but one of the projects planned for the new machine boggles the mind – or de-boggles it. The Blue Brain Project plans to model the circuitry of the human neocortex. Ultimately the scientists hope to build an accurate, computer-based model of the entire brain. It might take some time, but it follows Sony’s patent for an ultrasound device that transmits data directly into the brain, allowing you to see, hear, smell and taste movies, video games or whatever. Sony claimed the patent is “based on an inspiration that this may someday be the direction that technology will take us”. Perhaps it might, you never know.