
NASA's Hurricane Model Resolution Increases Nearly 10-Fold Since Katrina 89

zdburke writes: Thanks to improvements in satellites and on-the-ground computing power, NASA's ability to model hurricane data has come a long way in the ten years since Katrina devastated New Orleans. Their blog notes, "Today's models have up to ten times the resolution of those used during Hurricane Katrina and allow for a more accurate look inside the hurricane. Imagine going from video game figures made of large chunky blocks to detailed human characters that visibly show beads of sweat on their forehead." Gizmodo covered the post too and added some technical details, noting that "the supercomputer has more than 45,000 processor cores and runs at 1.995 petaflops."
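The blocky-to-detailed analogy maps directly onto grid spacing: refining horizontal resolution tenfold multiplies the number of grid cells a hundredfold over the same area. A back-of-the-envelope sketch (the domain size and spacings are illustrative, not NASA's actual model configuration):

```python
# Rough illustration of why "10x the resolution" is a big deal:
# shrinking the grid spacing by a factor k multiplies the number of
# horizontal cells by k^2 over the same area.

def horizontal_cells(domain_km, spacing_km):
    """Cells needed to tile a square domain at a given grid spacing."""
    per_side = domain_km / spacing_km
    return per_side ** 2

coarse = horizontal_cells(2000, 10)  # hypothetical 10 km Katrina-era grid
fine = horizontal_cells(2000, 1)     # the same domain at 10x finer spacing

print(fine / coarse)  # 100.0 -- 10x the resolution, 100x the cells
```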

China To Impose Export Control On High Tech Drones and Supercomputers 67

hackingbear writes: Following similar hi-tech export restriction policies in the U.S. (or perhaps in response to the U.S. ban on China), China will impose export controls on some drones and high-performance computers starting on August 15th, according to an announcement published on Friday by China's Ministry of Commerce and the General Administration of Customs. The controls cover (official documents in Chinese) drones that can take off in wind speeds exceeding 46.4 km/hour or fly continuously for over an hour, as well as electronic components specifically designed or modified for supercomputers faster than 8 petaflops. Companies must acquire specific permits before exporting such items. Drones and supercomputers are two areas where China is the leader or among the top players. China is using its rapidly expanding defense budget to make impressive advances in (military) drone technology, prompting some to worry that the United States' global dominance in the market could soon be challenged. The tightening of regulations comes two weeks after an incident in disputed Kashmir in which the Pakistani army claimed to have shot down an Indian "spy drone," reportedly Chinese-made. China's 33-petaflop Tianhe-2, currently the fastest supercomputer in the world, still uses Intel Xeon processors but relies on a home-grown interconnect, arguably the most important component of a modern supercomputer.

Obama's New Executive Order Says the US Must Build an Exascale Supercomputer 223

Jason Koebler writes: President Obama has signed an executive order authorizing a new supercomputing research initiative with the goal of creating the fastest supercomputers ever devised. The National Strategic Computing Initiative, or NSCI, will attempt to build the first ever exascale computer, 30 times faster than today's fastest supercomputer. Motherboard reports: "The initiative will primarily be a partnership between the Department of Energy, Department of Defense, and National Science Foundation, which will be designing supercomputers primarily for use by NASA, the FBI, the National Institutes of Health, the Department of Homeland Security, and NOAA. Each of those agencies will be allowed to provide input during the early stages of the development of these new computers."
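The "30 times faster" figure checks out against the digest's own numbers: an exaflop is 10^18 floating-point operations per second, and the current record holder, Tianhe-2, runs at roughly 33 petaflops (the figure cited in the China export-control story above). A quick sanity check:

```python
# Exascale vs. today's fastest machine, using the figures quoted
# elsewhere on this page (33 petaflops for Tianhe-2).
exaflops = 1e18    # 1 exaflop in FLOPS
tianhe2 = 33e15    # ~33 petaflops in FLOPS

print(round(exaflops / tianhe2))  # 30 -- matches the claim
```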

19-Year-Old's Supercomputer Chip Startup Gets DARPA Contract, Funding 150

An anonymous reader writes: 19-year-old Thomas Sohmers, who launched his own supercomputer chip startup back in March, has won a DARPA contract and funding for his company. Rex Computing is currently finishing up the architecture of its final verified RTL, which is expected to be completed by the end of the year. The new Neo chips will be sampled next year, before moving into full production in mid-2017. The Platform reports: "In addition to the young company’s first round of financing, Rex Computing has also secured close to $100,000 in DARPA funds. The full description can be found midway down this DARPA document under 'Programming New Computers,' and has, according to Sohmers, been instrumental as they start down the verification and early tape out process for the Neo chips. The funding is designed to target the automatic scratch pad memory tools, which, according to Sohmers, are the 'difficult part and where this approach might succeed where others have failed is the static compilation analysis technology at runtime.'"

Cray To Build Australia's Fastest Supercomputer 54

Bismillah writes: US supercomputer vendor Cray has scored the contract to build the Australian Bureau of Meteorology's new system, said to be capable of 1.6 petaFLOPS and with an upgrade option in three years' time to hit 5 petaFLOPS. From the iTnews story: "The increase in capacity will allow the BoM to deal with growth in the 1TB of data it collects every day, which it expects to increase by 30 percent every 18 months to two years. It will also allow the agency to collect new areas of information it previously lacked the capacity for. 'The new observation platforms that are coming online are bringing quite a lot more data,' supercomputer program director Tim Pugh told iTnews."

Preserving Radio Silence At the Square Kilometer Array 27

johnslater writes: The Guardian has a story on the radio silence requirements at the Square Kilometer Array in Australia. The RF requirements for the SKA are far more stringent than at the US National Radio Quiet Zone at Green Bank, to such an extent that the specialized supercomputers that control the array sit in specially shielded data centers, and the as-yet-unbuilt supercomputer that will process the data will be located hundreds of miles away in Perth. To quote Dr John Morgan in the article: "You can guarantee that the thing that SKA will be remembered for ... is going to be the thing you have not thought of. It's the unknown unknown."

Supercomputing Cluster Immersed In Oil Yields Extreme Efficiency 67

1sockchuck writes: A new supercomputing cluster immersed in tanks of dielectric fluid has posted extreme efficiency ratings. The Vienna Scientific Cluster 3 combines several efficiency techniques to create a system that is stingy in its use of power, cooling and water. VSC3 recorded a PUE (Power Usage Effectiveness) of 1.02, putting it in the realm of data centers run by Google and Facebook. The system avoids the use of chillers and air handlers, and doesn't require any water to cool the fluid in the cooling tanks. Limiting use of water is a growing priority for data center operators, as cooling towers can consume large volumes of water. The VSC3 system packs 600 teraflops of computing power into 1,000 square feet of floor space.
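For context, PUE is simply total facility power divided by the power reaching the IT equipment, so a PUE of 1.02 means only about 2 percent overhead for cooling and power distribution. A minimal sketch (the kilowatt figures are made up, not VSC3's actual measurements):

```python
def pue(total_facility_kw, it_equipment_kw):
    """Power Usage Effectiveness: total facility power / IT power.

    1.0 would mean zero overhead for cooling, lighting, and
    power distribution; typical data centers run well above that.
    """
    return total_facility_kw / it_equipment_kw

# A hypothetical immersion-cooled cluster: 540 kW of IT load with
# only ~11 kW of overhead lands near VSC3's reported figure.
print(round(pue(551.0, 540.0), 2))  # 1.02
```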

Baidu Forced To Withdraw Last Month's ImageNet Test Results 94

elwinc writes: Back in mid-May, Baidu, a computer research and services organization in Mainland China, announced impressive results on the ImageNet "Large Scale Visual Recognition Challenge," besting results posted by Google and Microsoft. Turns out, Baidu gamed the system, creating 30 accounts and running far more than the 2 tests per week allowed in the contest. Having been caught cheating, Baidu has been banned for a year from the challenge. I believe all competitors are using variations on convolutional neural networks, a.k.a. deep networks. Running the test dozens of times per week could let a competitor tune parameters to the particular test set, producing results that might not generalize to other problems. All of which makes it quite ironic that a Baidu scientist crowed "Our company is now leading the race in computer intelligence!"
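The mechanism behind the concern is ordinary test-set overfitting: each evaluation is a noisy estimate of true accuracy, and reporting the best of many draws systematically overstates it. A toy simulation (the accuracy and noise figures are invented, not Baidu's):

```python
import random

random.seed(0)
TRUE_ACC = 0.945   # the model's actual accuracy (invented)
NOISE = 0.004      # per-evaluation sampling noise (invented)

def best_of(n):
    """Best score observed across n noisy evaluations of one model."""
    return max(TRUE_ACC + random.gauss(0, NOISE) for _ in range(n))

# Average inflation of a "best of 200 submissions" score over many trials:
trials = 1000
inflation = sum(best_of(200) - TRUE_ACC for _ in range(trials)) / trials
print(round(inflation, 3))  # typically about a full point of "free" accuracy
```

The point is that the inflation is pure selection effect: the model never improved, only the reporting did.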

Baidu's Supercomputer Beats Google At Image Recognition 115

catchblue22 writes: Using the ImageNet object classification benchmark, Baidu’s Minwa supercomputer scanned more than 1 million images, taught itself to sort them into about 1,000 categories, and achieved an image-identification error rate of just 4.58 percent, beating humans, Microsoft and Google. Google's system scored 95.2 percent accuracy and Microsoft's 95.06 percent, Baidu said. “Our company is now leading the race in computer intelligence,” said Ren Wu, a Baidu scientist working on the project. “I think this is the fastest supercomputer dedicated to deep learning,” he said. “We have great power in our hands—much greater than our competitors.”

Nuclear Fusion Simulator Among Software Picked For US's Summit Supercomputer 57

An anonymous reader writes: Today, The Register learned of 13 science projects approved by boffins at the US Department of Energy to run on the 300-petaFLOPS Summit. These software packages, selected for the Center for Accelerated Application Readiness (CAAR) program, will be ported to the massively parallel machine and are expected to make full use of the supercomputer's architecture. They range from astrophysics, biophysics, chemistry, and climate modeling to combustion engineering, materials science, nuclear physics, plasma physics, and seismology.

US Blocks Intel From Selling Xeon Chips To Chinese Supercomputer Projects 229

itwbennett writes: U.S. government agencies have stopped Intel from selling microprocessors for China's supercomputers, apparently reflecting concern about their use in nuclear tests. In February, four supercomputing institutions in China were placed on a U.S. government list that effectively bans them from receiving certain U.S. exports. The institutions were involved in building Tianhe-2 and Tianhe-1A, both of which have allegedly been used for 'nuclear explosive activities,' according to a notice (PDF) posted by the U.S. Department of Commerce. Intel has been selling its Xeon chips to Chinese supercomputers for years, so the ban represents a blow to its business.

US Pens $200 Million Deal For Massive Nuclear Security-Focused Supercomputer 74

An anonymous reader writes: For the first time in over twenty years of supercomputing history, a chipmaker [Intel] has been awarded the contract to build a leading-edge national computing resource. This machine, expected to reach a peak performance of 180 petaflops, will provide massive compute power to Argonne National Laboratory, which will receive the HPC gear in 2018. Supercomputer maker Cray, which itself has had a remarkable couple of years contract-wise in government and commercial spheres, will be the integrator and manufacturer of the "Aurora" super, a next-generation variant of its "Shasta" supercomputer line. The new $200 million supercomputer is set to be installed at Argonne's Leadership Computing Facility in 2018, rounding out a trio of systems aimed at bolstering nuclear security initiatives as well as pushing the performance of key technical computing applications valued by the Department of Energy and other agencies.

NVIDIA Announces SHIELD Game Console 116

MojoKid writes: NVIDIA held an event in San Francisco last night at GDC, where the company unveiled a new Android TV streamer, game console, and supercomputer (as NVIDIA's Jen-Hsun Huang calls it), all wrapped up in a single, ultra-slim device called NVIDIA SHIELD. The SHIELD console is powered by the NVIDIA Tegra X1 SoC with 3GB of RAM, 16GB of storage, Gig-E and 802.11ac 2x2 MIMO WiFi. It's also 4K Ultra-HD ready, with 4K playback and capture at up to 60 fps (VP9, H.265, H.264) and full hardware encode/decode. The company claims the console provides twice the performance of an Xbox 360. NVIDIA demoed the device with Android TV, streaming music and HD movies and browsing social media. The device can stream games from a GeForce-powered PC to your television or from NVIDIA's GRID cloud gaming service, just like previous NVIDIA SHIELD devices. Native Android games will also run on the SHIELD console. NVIDIA's plan is to offer a wide array of native Android titles in the SHIELD store, as well as leverage the company's relationships with game developers to bring top titles to GRID. The device was shown playing Gearbox's Borderlands: The Pre-Sequel, Doom 3 BFG Edition, Metal Gear Solid V, the Unreal Engine 4 Infiltrator demo and yes, even Crysis 3.

NSF Commits $16M To Build Cloud-Based and Data-Intensive Supercomputers 29

aarondubrow writes: As supercomputing becomes central to the work and progress of researchers in all fields, new kinds of computing resources and more inclusive modes of interaction are required. The National Science Foundation announced $16M in awards to support two new supercomputing acquisitions for the open science community. The systems — "Bridges" at the Pittsburgh Supercomputing Center and "Jetstream," co-located at the Indiana University Pervasive Technology Institute and The University of Texas at Austin's Texas Advanced Computing Center — respond to the needs of the scientific computing community for more high-end, large-scale computing resources while helping to create a more inclusive computing environment for science and engineering. Reader 1sockchuck adds this article about why funding for the development of supercomputers is more important than ever: America's high-performance computing (HPC) community faces funding challenges and growing competition from China and other countries. At last week's SC14 conference, leading researchers focused on outlining the societal benefits of their work, and how it touches the daily lives of Americans. "When we talk at these conferences, we tend to talk to ourselves," said Wilf Pinfold, director of research and advanced technology development at Intel Federal. "We don't do a good job communicating the importance of what we do to a broader community." Why the focus on messaging? Funding for American supercomputing has been driven by the U.S. government, which is in a transition with implications for HPC funding. As ComputerWorld notes, climate change skeptic Ted Cruz is rumored to be in line to chair a Senate committee that oversees NASA and the NSF.

Alva Noe: Don't Worry About the Singularity, We Can't Even Copy an Amoeba 455

An anonymous reader writes: Alva Noe, writer and professor of philosophy at the University of California, Berkeley, isn't worried that we will soon be under the rule of shiny metal overlords. He says that currently we can't produce "machines that exhibit the agency and awareness of an amoeba." He writes at NPR: "One reason I'm not worried about the possibility that we will soon make machines that are smarter than us is that we haven't managed to make machines until now that are smart at all. Artificial intelligence isn't synthetic intelligence: It's pseudo-intelligence. This really ought to be obvious. Clocks may keep time, but they don't know what time it is. And strictly speaking, it is we who use them to tell time. But the same is true of Watson, the IBM supercomputer that supposedly played Jeopardy! and dominated the human competition. Watson answered no questions. It participated in no competition. It didn't do anything. All the doing was on our side. We played Jeopardy! with Watson. We used 'it' the way we use clocks."

Does Being First Still Matter In America? 247

dcblogs writes: At the supercomputing conference SC14 this week, a U.S. Dept. of Energy official said the government has set a goal of 2023 as its delivery date for an exascale system. It may be taking a risky path with that amount of lead time because of increasing international competition. There was a time when the U.S. didn't settle for second place. President John F. Kennedy delivered his famous "we choose to go to the moon" speech in 1962, and seven years later a man walked on the moon. The U.S. exascale goal is nine years away. China, Europe and Japan all have major exascale efforts, and the U.S. government has already fallen behind on supercomputing. The European forecast of Hurricane Sandy in 2012 was so far ahead of U.S. models in predicting the storm's path that the National Oceanic and Atmospheric Administration was called before Congress to explain how it happened. It was told by a U.S. official that NOAA wasn't keeping up in computational capability. It's still not keeping up. Cliff Mass, a professor of meteorology at the University of Washington, wrote on his blog last month that the U.S. is "rapidly falling behind leading weather prediction centers around the world" because it has yet to catch up with Europe in computational capability. That criticism followed the recent $128 million purchase of a Cray supercomputer by the U.K.'s Met Office, its meteorological agency.

US DOE Sets Sights On 300 Petaflop Supercomputer 127

dcblogs writes: U.S. officials Friday announced plans to spend $325 million on two new supercomputers, one of which may eventually be built to support speeds of up to 300 petaflops. The U.S. Department of Energy, the major funder of supercomputers used for scientific research, wants to have the two systems – each with a base speed of 150 petaflops – possibly running by 2017. Going beyond the base speed to reach 300 petaflops will take additional government approvals. If the world stands still, the U.S. may conceivably regain the lead in supercomputing speed from China with these new systems. How adequate this planned investment will look three years from now is a question. Lawmakers weren't reading from the same script as U.S. Energy Secretary Ernest Moniz when it came to assessing the U.S.'s place in the supercomputing world. Moniz said the awards "will ensure the United States retains global leadership in supercomputing." But Rep. Chuck Fleischmann (R-Tenn.) put U.S. leadership in the past tense. "Supercomputing is one of those things that we can step up and lead the world again," he said.

Interviews: Ask CMI Director Alex King About Rare Earth Mineral Supplies 62

The modern electronics industry relies on inputs and supply chains, both material and technological, and none of them are easy to bypass. These include, besides expertise and manufacturing facilities, the actual materials that go into electronic components. Some of them are as common as silicon; rare earth minerals, not so much. One story linked from Slashdot a few years back predicted that then-known supplies would be exhausted by 2017, though such predictions of scarcity are notoriously hard to get right, as people (and prices) adjust to changes in supply. There's no denying that there's been a crunch on rare earths, though, over the last several years. The minerals themselves aren't necessarily rare in an absolute sense, but they're expensive to extract. The most economically viable deposits are found in China, and rising prices for them as exports to the U.S., the EU, and Japan have raised political hackles. At the same time, those rising prices have spurred exploration and reexamination of known deposits off the coast of Japan, in the midwestern U.S., and elsewhere.

Alex King is director of the Critical Materials Institute, a part of the U.S. Department of Energy's Ames Laboratory. CMI is heavily involved in making rare earth minerals slightly less rare by means of supercomputer analysis; researchers there are approaching the ongoing crunch by looking both for substitute materials for things like gallium, indium, and tantalum, and easier ways of separating out the individual rare earths (a difficult process). One team there is working with "ligands – molecules that attach with a specific rare-earth – that allow metallurgists to extract elements with minimal contamination from surrounding minerals" to simplify the extraction process. We'll be talking with King soon; what questions would you like to see posed? (This 18-minute TED talk from King is worth watching first, as is this Q&A.)

16-Petaflops, £97m Cray To Replace IBM At UK Meteorological Office 125

Memetic writes: The UK weather forecasting service is replacing its IBM supercomputer with a Cray XC40 containing 17 petabytes of storage and capable of 16 petaFLOPS. This is Cray's biggest contract outside the U.S. With 480,000 CPU cores, it should be 13 times faster than the current system. It will weigh 140 tons. The aim is to enable more accurate modeling of the unstable UK weather, with UK-wide forecasts at a resolution of 1.5 km run hourly, rather than every three hours as currently happens. (Here's a similar system from the U.S.)

Supercomputing Upgrade Produces High-Resolution Storm Forecasts 77

dcblogs writes: A supercomputer upgrade is paying off for the U.S. National Weather Service, with new high-resolution models that will offer better insight into severe weather. The National Oceanic and Atmospheric Administration, which runs the weather service, put into production two new IBM supercomputers, each 213 teraflops, running Linux on Intel processors. These replaced four-year-old, 74-teraflop systems. More computing power means the models can run more detailed mathematics and increase the resolution of the maps from 8 miles to 2 miles.
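Going from 8-mile to 2-mile resolution is more expensive than it sounds. A standard scaling argument (a rule of thumb, not NOAA's actual accounting) says refining grid spacing by a factor k multiplies horizontal cells by k squared and, because the timestep must usually shrink by roughly k as well (the CFL stability condition), total compute by roughly k cubed:

```python
# Cost of refining a weather model's grid from 8-mile to 2-mile spacing.
# Cells grow quadratically with the refinement factor; total compute
# grows roughly cubically once the shorter timestep is included.
k = 8 / 2   # refinement factor

print(int(k ** 2))  # 16 -- times more horizontal grid cells
print(int(k ** 3))  # 64 -- rough growth in compute per forecast run
```

This is why resolution gains lag far behind raw FLOPS gains: the 74-to-213-teraflop upgrade buys roughly 3x the compute, and the rest of the resolution jump comes from model and efficiency improvements.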