Supercomputing

Supercomputing, There's an App For That 66

aarondubrow writes "Researchers at MIT have created an experimental system for smart phones that allows engineers to leverage the power of supercomputers for instant computation and analysis. The team performed a series of expensive high-fidelity simulations on the Ranger supercomputer to generate a small 'reduced model,' which was transferred to a Google Android smart phone. They were then able to solve engineering and fluid-flow problems on the phone and visualize the results interactively. The project demonstrated the potential of reduced-order methods to perform real-time, reliable simulations of complicated problems on handheld devices."
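The reduced-model idea — run the expensive simulations offline, compress the solutions into a small basis, then solve a tiny projected system on the device — can be sketched in a few lines. The parametrized system, sizes, and parameter range below are hypothetical stand-ins for illustration, not the MIT team's actual model:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200  # full-model size (stand-in for an expensive simulation)

# A hypothetical parametrized linear system A(mu) x = b
A0 = (np.diag(np.full(N, 2.0))
      + np.diag(np.full(N - 1, -1.0), 1)
      + np.diag(np.full(N - 1, -1.0), -1))
A1 = np.diag(np.linspace(0.5, 1.5, N))
b = rng.standard_normal(N)

def solve_full(mu):
    """Expensive 'offline' solve of the full N x N system."""
    return np.linalg.solve(A0 + mu * A1, b)

# Offline (on the supercomputer): collect snapshots, extract a reduced basis
snapshots = np.column_stack([solve_full(mu) for mu in np.linspace(0.1, 2.0, 8)])
V, _, _ = np.linalg.svd(snapshots, full_matrices=False)
V = V[:, :6]  # keep the 6 dominant modes

def solve_reduced(mu):
    """Cheap 'online' solve: Galerkin projection onto a 6 x 6 system."""
    Ar = V.T @ (A0 + mu * A1) @ V
    y = np.linalg.solve(Ar, V.T @ b)
    return V @ y

mu_test = 1.3  # a parameter value not in the snapshot set
err = (np.linalg.norm(solve_full(mu_test) - solve_reduced(mu_test))
       / np.linalg.norm(solve_full(mu_test)))
print(f"relative error of reduced model: {err:.2e}")
```

The online step only ever touches a 6x6 system, which is why the interactive part fits comfortably on a phone.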
Supercomputing

IBM Supercomputer Cooled With Hot Water 89

1sockchuck writes "IBM has deployed an innovative supercomputer cooled by hot water in a Zurich computer lab. The Aquasar supercomputer employs a chip-level liquid cooling system that can use water at temperatures as high as 60 degrees C (140 degrees F), and as a result consumes up to 40 percent less energy than a comparable system using room-level air-cooling. The system also uses waste heat to provide warmth to buildings, reducing Aquasar's carbon footprint even further."
Supercomputing

'Telecommuting' In Formula 1 90

flewp writes "This New York Times article on Formula 1 racing gives some insight into the workings of one of the most high-tech sports on the planet — consider that a few years ago, Sauber's supercomputer ranked toward the top of all the supercomputers in Europe. The teams bring to each race dozens of mechanics, support personnel, etc.; but back at their home bases, perhaps thousands of miles away, countless more engineers work (with the help of gobs of computing power) to give each team that extra edge."
Power

Renewable Energy To Power Aussie SKA 48

schliz writes "New solar and geothermal energy facilities are being built in Australia to provide sustainable energy for the region's Square Kilometre Array (SKA) bid. The Australian Government yesterday announced A$47.3m in funding for a full-scale, hybrid solar and diesel plant for the Australian SKA Pathfinder (ASKAP) at the Murchison Radio-astronomy Observatory, and geothermal energy facilities for the Pawsey High-Performance Computing Centre, where data from SKA radio telescopes would be processed. ASKAP is part of the Australasian bid to host the $2.5 billion SKA, which involves 20 countries and will investigate galaxy evolution, dark matter, and the existence of life. IBM expects the whole of the SKA to produce an exabyte of data per day."
Supercomputing

Mobile Phones vs. Supercomputers of the Past 247

An anonymous reader writes "The recently published Top 500 list of the world's fastest supercomputers is based on the Linpack benchmark developed decades ago by Jack Dongarra. This same test has been ported to Android mobile phones, which means that we can compare the performance of our phones against that of the supercomputers of the past. For example, a tweaked Motorola Droid can hit 52 Mflop/s, which is more than 15 times faster than the CPUs used in the 1979 Cray-1." But even today's most powerful cellphones don't come with an integrated bench.
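Linpack's nominal operation count for solving a dense n×n system is 2n³/3 + 2n², so a rough Linpack-style Mflop/s figure can be estimated on any machine with a timed solve. This is a sketch of the measurement idea, not the ported Android benchmark itself:

```python
import time
import numpy as np

# Linpack-style measurement: time a dense solve, divide the nominal flop count
n = 1000
rng = np.random.default_rng(42)
A = rng.random((n, n))
b = rng.random(n)

t0 = time.perf_counter()
x = np.linalg.solve(A, b)   # LU factorization + triangular solves
dt = time.perf_counter() - t0

flops = 2 / 3 * n**3 + 2 * n**2   # Linpack's nominal operation count
print(f"~{flops / dt / 1e6:.0f} Mflop/s")
```

At the blurb's numbers, 52 Mflop/s divided by the "more than 15x" speedup implies a Cray-1 rate of roughly 3.4 Mflop/s, consistent with the small-n Linpack figures historically published for that machine.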
Supercomputing

NSF Gives Supercomputer Time For 3-D Model of Spill 102

CWmike writes "Scientists have embarked on a crash effort to use one of the world's largest supercomputers to create 3-D models to simulate how BP's massive Gulf of Mexico oil spill will affect coastal areas. Acting within 24 hours of receiving a request from researchers, the National Science Foundation late last week made an emergency allocation of 1 million compute hours on a supercomputer at the Texas Advanced Computing Center to study how BP's gusher will affect coastlines. The computer model they are working on 'has the potential to advise and undergird many emergency management decisions that may be made along the way, particularly if a hurricane comes through the area,' said Rick Luettich, a professor of marine sciences and head of the Institute of Marine Sciences at the University of North Carolina in Chapel Hill, who is one of the researchers on this project. Meanwhile, geographic information systems vendor ESRI has added a social spin to GIS mapping of the BP oil spill."
Supercomputing

Slimming Down a Supercomputer 64

1sockchuck writes "Happy Feet animator Dr. D Studios has packed a large amount of supercomputing power into a smaller package in its new render farm in Sydney, Australia. The digital production shop has consolidated the 150 blade chassis used in the 2006 dancing penguin feature into just 24 chassis, entirely housed in a hot-aisle containment pod. The Dr. D render farm has moved from its previous home at Equinix to the E3 Pegasus data center in Sydney. ITNews has a video and photos of the E3 facility."
Hardware

Startup's Submerged Servers Could Cut Cooling Costs 147

1sockchuck writes "Are data center operators ready to abandon hot and cold aisles and submerge their servers? An Austin startup says its liquid cooling enclosure can cool high-density server installations for a fraction of the cost of air cooling in traditional data centers. Submersion cooling using mineral oil isn't new, dating back to the use of Fluorinert in the Cray 2. The new startup, Green Revolution Cooling, says its first installation will be at the Texas Advanced Computing Center (also home to the Ranger supercomputer). The company launched at SC09 along with a competing liquid cooling play, the Iceotope cooling bags."
Earth

IBM Claims Breakthrough Energy-Efficient Algorithm 231

jitendraharlalka sends news of a claimed algorithmic breakthrough by IBM, though from the scant technical detail provided it's hard to tell exactly how important the development might be. IBM apparently presented its results yesterday at the Society for Industrial and Applied Mathematics conference in Seattle. The breathless press release begins: "IBM Research today unveiled a breakthrough method based on a mathematical algorithm that reduces the computational complexity, costs, and energy usage for analyzing the quality of massive amounts of data by two orders of magnitude. This new method will greatly help enterprises extract and use the data more quickly and efficiently to develop more accurate and predictive models. In a record-breaking experiment, IBM researchers used the fourth most powerful supercomputer in the world... to validate nine terabytes of data... in less than 20 minutes, without compromising accuracy. Ordinarily, using the same system, this would take more than a day. Additionally, the process used just one percent of the energy that would typically be required."
Operating Systems

Virtualizing a Supercomputer 57

bridges writes "The V3VEE project has announced the release of version 1.2 of the Palacios virtual machine monitor following the successful testing of Palacios on 4096 nodes of the Sandia Red Storm supercomputer, the 17th-fastest in the world. The added overhead of virtualization is often a show-stopper, but the researchers observed less than 5% overhead for two real, communication-intensive applications running in a virtual machine on Red Storm. Palacios 1.2 supports virtualization of both desktop x86 hardware and Cray XT supercomputers using either AMD SVM or Intel VT hardware virtualization extensions, and is an active open source OS research platform supporting projects at multiple institutions. Palacios is being jointly developed by researchers at Northwestern University, the University of New Mexico, and Sandia National Labs." The ACM's writeup has more details of the work at Sandia.
Math

New Pi Computation Record Using a Desktop PC 204

hint3 writes "Fabrice Bellard has calculated Pi to about 2.7 trillion decimal digits, besting the previous record by over 120 billion digits. While the improvement may seem small, it is an outstanding achievement because only a single desktop PC, costing less than $3,000, was used — instead of a multi-million dollar supercomputer as in the previous records."
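Bellard's record was computed with the Chudnovsky series evaluated by binary splitting; at desktop (or toy) scale the same approach fits in a short script. This is a minimal sketch using Python's arbitrary-precision `decimal` module — each series term contributes roughly 14 digits:

```python
from decimal import Decimal, getcontext

def binary_split(a, b):
    """Combine Chudnovsky terms a..b-1 into integers P, Q, R."""
    if b == a + 1:
        P = -(6 * a - 5) * (2 * a - 1) * (6 * a - 1)
        Q = 10939058860032000 * a**3          # 640320^3 / 24
        R = P * (545140134 * a + 13591409)
    else:                                      # split, recurse, recombine
        m = (a + b) // 2
        Pam, Qam, Ram = binary_split(a, m)
        Pmb, Qmb, Rmb = binary_split(m, b)
        P = Pam * Pmb
        Q = Qam * Qmb
        R = Qmb * Ram + Pam * Rmb
    return P, Q, R

def chudnovsky_pi(digits):
    getcontext().prec = digits + 10
    terms = digits // 14 + 2                   # ~14.18 digits per term
    _, Q, R = binary_split(1, terms)
    return (426880 * Decimal(10005).sqrt() * Q) / (13591409 * Q + R)

print(chudnovsky_pi(50))
```

Binary splitting keeps all the heavy arithmetic in integers and turns the sum into large multiplications, which is what makes trillion-digit runs feasible; the record computation adds disk-based arithmetic and far more careful memory management than this sketch.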
Supercomputing

FASTRA II Puts 13 GPUs In a Desktop Supercomputer 127

An anonymous reader writes "Last year tomography researchers of the ASTRA group at the University of Antwerp developed a desktop supercomputer with four NVIDIA GeForce 9800 GX2 graphics cards. The performance of the FASTRA GPGPU system was amazing; it was slightly faster than the university's 512-core supercomputer and cost less than 4000EUR. Today the researchers announce FASTRA II, a new 6000EUR GPGPU computing beast with six dual-GPU NVIDIA GeForce GTX 295 graphics cards and one GeForce GTX 275. The development of the new system was more complicated and there are still some stability issues, but tests reveal the 13 GPUs deliver 3.75x more performance than the old system. For the tomography reconstruction calculations these researchers need to do, the compact FASTRA II is four times faster than the university's supercomputer cluster, while being roughly 300 times more energy efficient."
Supercomputing

A Skeptical Reaction To IBM's Cat Brain Simulation Claims 198

kreyszig writes "The recent story of a cat brain simulation from IBM had me wondering if this was really possible as described. Now a senior researcher in the same field has publicly denounced IBM's claims." More optimistically, dontmakemethink points out an "astounding article about new 'Neurogrid' computer chips which offer brain-like computing with extremely low power consumption. In a simulation of 55 million neurons on a traditional supercomputer, 320,000 watts of power was required, while a 1-million neuron Neurogrid chip array is expected to consume less than one watt."
Supercomputing

Australia's CSIRO To Launch CPU-GPU Supercomputer 82

bennyboy64 contributes this excerpt from CRN Australia: "The CSIRO will this week launch a new supercomputer which uses a cluster of GPUs [pictures] to gain a processing capacity that competes with supercomputers over twice its size. The supercomputer is one of the world's first to combine traditional CPUs with the more powerful GPUs. It features 100 Intel Xeon CPU chips and 50 Tesla GPU chips, connected to an 80-terabyte Hitachi Data Systems network-attached storage unit. CSIRO science applications have already seen 10-100x speedups on NVIDIA GPUs."
IBM

IBM Takes a (Feline) Step Toward Thinking Machines 428

bth writes "A computer with the power of a human brain is not yet near. But this week researchers from IBM Corp. are reporting that they've simulated a cat's cerebral cortex, the thinking part of the brain, using a massive supercomputer. The computer has 147,456 processors (most modern PCs have just one or two processors) and 144 terabytes of main memory — 100,000 times as much as your computer has."
Supercomputing

100 Million-Core Supercomputers Coming By 2018 286

CWmike writes "As amazing as today's supercomputing systems are, they remain primitive, and current designs soak up too much power, space, and money. And as big as they are today, supercomputers aren't big enough — a key topic for some of the estimated 11,000 people now gathering in Portland, Ore., for the 22nd annual supercomputing conference, SC09, will be the next performance goal: an exascale system. Today, supercomputers are well short of an exascale. The world's fastest system, at Oak Ridge National Laboratory according to the just-released Top500 list, is a Cray XT5 with 224,256 processing cores from six-core Opteron chips made by Advanced Micro Devices Inc. (AMD). The Jaguar is capable of a peak performance of 2.3 petaflops. But Jaguar's record is just a blip, a fleeting benchmark. The US Department of Energy has already begun holding workshops on building a system that's 1,000 times more powerful — an exascale system, said Buddy Bland, project director at the Oak Ridge Leadership Computing Facility that includes Jaguar. Exascale systems will be needed for high-resolution climate models, bioenergy products, and smart-grid development, as well as fusion energy design — the latter now under way in France at the International Thermonuclear Experimental Reactor, which the US is co-developing. Exascale systems are expected to arrive in 2018 — in line with Moore's Law — which helps to explain the roughly 10-year development period. But the problems involved in reaching exaflop scale go well beyond Moore's Law."
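Back-of-envelope arithmetic shows why 2018 lines up with historical growth: going from Jaguar's 2.3 petaflops to an exaflop in nine years needs roughly one performance doubling per year, close to the doubling rate the Top500 list has historically exhibited. The figures below come from the story; the doubling-time framing is this sketch's own assumption:

```python
import math

jaguar_pflops = 2.3      # Jaguar peak performance, 2009 (from the story)
target_pflops = 1000.0   # one exaflop
years = 2018 - 2009

factor = target_pflops / jaguar_pflops   # required overall growth, ~435x
doublings = math.log2(factor)            # ~8.8 doublings
months_per_doubling = 12 * years / doublings

print(f"{factor:.0f}x growth = {doublings:.1f} doublings, "
      f"one every {months_per_doubling:.0f} months")
```

A doubling every twelve-odd months is faster than the classic 18-24 month Moore's Law cadence for transistor density, which is one reason exascale is expected to come from more cores and more parallelism rather than faster individual processors.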
Supercomputing

Asus Releases Desktop-Sized Supercomputer 260

angry tapir writes "Asustek has unveiled its first supercomputer, the desktop-computer-sized ESC 1000, which uses Nvidia graphics processors to attain speeds up to 1.1 teraflops. Asus's ESC 1000 comes with a 3.33GHz Intel LGA1366 Xeon W3580 microprocessor designed for servers, along with 960 graphics processing cores from Nvidia inside three Tesla C1060 Computing Processors and one Quadro FX5800."
Silicon Graphics

SGI Rolls Out "Personal Supercomputers" 303

CWmike writes "They aren't selling personal supercomputers at Best Buy just yet. But that day probably isn't too far off, as the costs continue to fall and supercomputers become easier to use. Silicon Graphics International on Monday released its first so-called personal supercomputer. The new Octane III system is priced from $7,995 with one Xeon 5500 processor. The system can be expanded to an 80-core system with a capacity of up to 960GB of memory. This new supercomputer's peak performance of about 726 GFLOPS won't put it on the Top 500 supercomputer list, but that's not the point of the machine, SGI says. A key feature instead is the system's ease of use."
Data Storage

US Supercomputer Uses Flash Storage Drives 72

angry tapir writes "The San Diego Supercomputer Center has built a high-performance computer with solid-state drives, which the center says could help solve science problems faster than systems with traditional hard drives. The flash drives will provide faster data throughput, which should help the supercomputer analyze data an 'order of magnitude faster' than hard drive-based supercomputers, according to Allan Snavely, associate director at SDSC. SDSC intends to use the HPC system — called Dash — to develop new cures for diseases and to understand the development of Earth."
