Earth

How The Advance Weather Forecast Got Good (npr.org) 80

NPR notes today's "supercomputer-driven" weather modelling can crunch huge amounts of data to accurately forecast the weather a week in advance -- pointing out that "a six-day weather forecast today is as good as a two-day forecast was in the 1970s."

Here are some highlights from their interview with Andrew Blum, author of The Weather Machine: A Journey Inside the Forecast: One of the things that's happened as the scale in the system has shifted to the computers is that it's no longer bound by past experience. It's no longer, the meteorologists say, "Well, this happened in the past, we can expect it to happen again." We're more ready for these new extremes because we're not held down by past expectations...

The models are really a kind of ongoing concern. ... They run ahead in time, and then every six hours or every 12 hours, they compare their own forecast with the latest observations. And so the models in reality are ... sort of dancing together, where the model makes a forecast and it's corrected slightly by the observations that are coming in...
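The cycle Blum describes -- run the model ahead in time, then correct it slightly with the latest observations -- can be sketched in a few lines (a toy illustration of the idea only, not any real assimilation scheme; the gain, drift, and temperature values are all invented):

```python
# Toy sketch of a forecast/observation cycle. Every "six hours" the
# model's forecast is nudged toward an incoming observation; the blend
# weight ("gain") is a crude stand-in for real data assimilation.

def step_model(state, drift=0.0):
    """Advance the toy 'forecast' one time step."""
    return state + drift

def assimilate(forecast, observation, gain=0.3):
    """Nudge the forecast toward the incoming observation."""
    return forecast + gain * (observation - forecast)

truth = 20.0   # the real temperature, unknown to the model
state = 15.0   # the model starts off biased
for cycle in range(10):
    for _ in range(6):                  # run ahead ~6 hours
        state = step_model(state)
    obs = truth                         # a (perfect) observation arrives
    state = assimilate(state, obs)      # correct the forecast slightly

print(round(state, 2))
```

Each cycle removes a fixed fraction of the forecast error, which is why the models and the observations appear to dance together rather than drift apart.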

It's definitely run by individual nations -- but individual nations with their systems tied together... It's a 150-year-old system of governments collaborating with each other as a global public good... The positive example from last month was with Cyclone Fani in India. And this was a very similar storm to one 20 years ago, in which tens of thousands of people had died. This time around, the forecast came far enough in advance and with enough confidence that the Indian government was able to move a million people out of the way.

Science

How To Evaluate Computers That Don't Quite Exist (sciencemag.org) 27

sciencehabit writes: To gauge the performance of a supercomputer, computer scientists turn to a standard tool: a set of algorithms called LINPACK that tests how fast the machine solves problems with huge numbers of variables. For quantum computers, which might one day solve certain problems that overwhelm conventional computers, no such benchmarking standard exists. One reason is that the computers, which aim to harness the laws of quantum mechanics to accelerate certain computations, are still rudimentary, with radically different designs contending. In some, the quantum bits, or qubits, needed for computation are embodied in the spin of strings of trapped ions, whereas others rely on patches of superconducting metal resonating with microwaves. Comparing the embryonic architectures "is sort of like visiting a nursery school to decide which of the toddlers will become basketball stars," says Scott Aaronson, a computer scientist at the University of Texas in Austin.
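LINPACK's core idea is simple to demonstrate: time a dense linear solve and divide the known operation count by the elapsed time. The sketch below is nothing like the real HPL benchmark in scale or care -- it is a pure-Python illustration of how the FLOP/s metric is derived:

```python
import random
import time

# LINPACK-flavored sketch: solve Ax = b by Gaussian elimination with
# partial pivoting, then convert elapsed time to FLOP/s using the
# ~(2/3)n^3 operation count for the factorization.

def solve_dense(a, b):
    """Solve Ax = b on copies of a and b; returns x."""
    n = len(b)
    a = [row[:] for row in a]
    b = b[:]
    for k in range(n):
        p = max(range(k, n), key=lambda i: abs(a[i][k]))  # pivot row
        a[k], a[p] = a[p], a[k]
        b[k], b[p] = b[p], b[k]
        for i in range(k + 1, n):
            m = a[i][k] / a[k][k]
            for j in range(k, n):
                a[i][j] -= m * a[k][j]
            b[i] -= m * b[k]
    x = [0.0] * n
    for i in reversed(range(n)):                          # back-substitute
        s = sum(a[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (b[i] - s) / a[i][i]
    return x

n = 120
random.seed(0)
a = [[random.random() for _ in range(n)] for _ in range(n)]
b = [random.random() for _ in range(n)]
t0 = time.perf_counter()
x = solve_dense(a, b)
elapsed = time.perf_counter() - t0
flops = (2 / 3) * n ** 3 / elapsed
print(f"~{flops:.2e} FLOP/s on a {n}x{n} system")
```

No comparably standard "did it work, and how fast" yardstick yet exists for quantum hardware, which is the gap the article describes.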

Yet researchers are making some of their first attempts to take the measure of quantum computers. Last week, Margaret Martonosi, a computer scientist at Princeton University, and colleagues presented a head-to-head comparison of quantum computers from IBM, Rigetti Computing in Berkeley, California, and the University of Maryland (UMD) in College Park. The UMD machine, which uses trapped ions, ran a majority of 12 test algorithms more accurately than the two superconducting machines, the team reported at the International Symposium on Computer Architecture in Phoenix. Christopher Monroe, a UMD physicist and founder of the company IonQ, predicts such comparisons will become the standard. "These toy algorithms give you a simple answer -- did it work or not?" But even Martonosi warns against making too much of the tests. In fact, the analysis underscores how hard it is to compare quantum computers -- which leaves room for designers to choose metrics that put their machines in a favorable light.

United States

US Blacklists More Chinese Tech Companies Over National Security Concerns (nytimes.com) 82

The Trump administration added five Chinese entities to a United States blacklist on Friday, further restricting China's access to American technology and stoking already high tensions as President Trump and President Xi Jinping of China prepare to meet in Japan next week. From a report: The Commerce Department announced that it would add four Chinese companies and one Chinese institute to an "entity list," saying they posed risks to American national security or foreign policy interests [Editor's note: the link may be paywalled; alternative source]. The move essentially bars the entities, which include one of China's leading supercomputer makers, Sugon, and a number of its subsidiaries set up to design microchips, from buying American technology and components without a waiver from the United States government.

The move could all but cripple these Chinese businesses, which rely on American chips and other technology to manufacture advanced electronics. Those added to the entity list also include Higon, Chengdu Haiguang Integrated Circuit, Chengdu Haiguang Microelectronics Technology, and Wuxi Jiangnan Institute of Computing Technology, which lead China's development of high performance computing, some of which is used in military applications like simulating nuclear explosions, the Commerce Department said.
Each of the aforementioned companies does business under a variety of other names.
Math

How a Professor Beat Roulette, Crediting a Non-Existent Supercomputer (thehustle.co) 156

I loved this story. The Hustle remembers how in 1964 a world-renowned medical professor found a way to beat roulette wheels, kicking off a five-year winning streak in which he amassed $1,250,000 ($8,000,000 today). He noticed that at the end of each night, casinos would replace cards and dice with fresh sets -- but the expensive roulette wheels went untouched and often stayed in service for decades before being replaced. Like any other machine, these wheels acquired wear and tear. Jarecki began to suspect that tiny defects -- chips, dents, scratches, unlevel surfaces -- might cause certain wheels to land on certain numbers more frequently than randomocity prescribed. The doctor spent weekends commuting between the operating table and the roulette table, manually recording thousands upon thousands of spins, and analyzing the data for statistical abnormalities. "I [experimented] until I had a rough outline of a system based on the previous winning numbers," he told the Sydney Morning Herald in 1969. "If numbers 1, 2, and 3 won the last 3 rounds, [I could determine] what was most likely to win the next 3...."
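The kind of statistical abnormality Jarecki hunted for can be illustrated with a chi-square test against a fair 37-pocket European wheel. This is a modern reconstruction of the idea, not his actual method, and the 2% bias and spin counts below are invented:

```python
import random

# Tally simulated spins of a "worn" wheel that favors one pocket, then
# compute a chi-square statistic against the uniform expectation.

def chi_square(counts, spins, pockets=37):
    expected = spins / pockets
    return sum((c - expected) ** 2 / expected for c in counts)

random.seed(42)
spins = 20000                  # roughly a month of clocking, per the story
pockets = 37
worn = 7                       # pretend pocket 7 has a tiny defect

counts = [0] * pockets
for _ in range(spins):
    if random.random() < 0.02:          # 2% of spins land on the worn pocket
        counts[worn] += 1
    else:
        counts[random.randrange(pockets)] += 1

stat = chi_square(counts, spins)
# With 36 degrees of freedom, values far above ~51 (p = 0.05) flag bias.
print(f"chi-square = {stat:.1f}, most frequent pocket = {counts.index(max(counts))}")
```

Even a 2% tilt stands out sharply once tens of thousands of spins are recorded -- which is why the clockers' tedious logging was the heart of the scheme.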

With his wife, Carol, he scouted dozens of wheels at casinos around Europe, from Monte Carlo (Monaco), to Divonne-les-Bains (France), to Baden-Baden (Germany). The pair recruited a team of 8 "clockers" who posted up at these venues, sometimes recording as many as 20,000 spins over a month-long period. Then, in 1964, he made his first strike. After establishing which wheels were biased, he secured a £25,000 loan from a Swiss financier and spent 6 months methodically executing his strategy. By the end of the run, he'd netted £625,000 (roughly $6,700,000 today).

Jarecki's victories made headlines in newspapers all over the world, from Kansas to Australia. Everyone wanted his "secret" -- but he knew that if he wanted to replicate the feat, he'd have to conceal his true methodology. So, he concocted a "fanciful tale" for the press: He tallied roulette outcomes daily, then fed the information into an Atlas supercomputer, which told him which numbers to pick. At the time, wrote gambling historian Russell Barnhart in Beating the Wheel, "Computers were looked upon as creatures from outer space... Few persons, including casino managers, were vocationally qualified to distinguish myth from reality." Hiding behind this technological ruse, Jarecki continued to keep tabs on biased tables -- and prepare for his next big move...

In the decades following Jarecki's dominance, casinos invested heavily in monitoring their roulette tables for defects and building wheels less prone to bias. Today, most wheels have gone digital, run by algorithms programmed to favor the house.

Businesses

Hewlett Packard Enterprise To Acquire Supercomputer Maker Cray for $1.3 Billion (anandtech.com) 101

Hewlett Packard Enterprise will be buying the supercomputer maker Cray for roughly $1.3 billion, the companies said this morning. HPE intends to use Cray's knowledge and technology to bolster its own supercomputing and high-performance computing portfolio, and when the deal closes, HPE will become the world leader in supercomputing technology. From a report: Cray of course needs no introduction. The current leader in the supercomputing field and founder of supercomputing as we know it, Cray has been a part of the supercomputing landscape since the 1970s. Starting at the time with fully custom systems, in more recent years Cray has morphed into an integrator and scale-out specialist, combining processors from the likes of Intel, AMD, and NVIDIA into supercomputers, and applying their own software, I/O, and interconnect technologies. The timing of the acquisition announcement closely follows other major news from Cray: the company just landed a $600 million US Department of Energy contract to supply the Frontier supercomputer to Oak Ridge National Laboratory in 2021. Frontier is one of two exascale supercomputers Cray is involved in -- the other being the 2021 Aurora system, where Cray is a subcontractor -- and in fact Cray is involved in the only two exascale systems ordered by the US Government thus far. So in both a historical and modern context, Cray was and is one of the biggest players in the supercomputing market.
The Courts

Who To Sue When a Robot Loses Your Fortune (bloomberg.com) 201

An anonymous reader shares a report: It all started over lunch at a Dubai restaurant on March 19, 2017. It was the first time 45-year-old Li met Costa, the 49-year-old Italian often known by peers in the industry as "Captain Magic." During their meal, Costa described a robot hedge fund that his company, London-based Tyndaris Investments, would soon offer to manage money entirely using AI, or artificial intelligence. Developed by Austria-based AI company 42.cx, the supercomputer named K1 would comb through online sources like real-time news and social media to gauge investor sentiment and make predictions on US stock futures. It would then send instructions to a broker to execute trades, adjusting its strategy over time based on what it had learned.

The idea of a fully automated money manager instantly appealed to Li. He met Costa for dinner three days later, saying in an email beforehand that the AI fund "is exactly my kind of thing." Over the following months, Costa shared simulations with Li showing K1 making double-digit returns, although the two now dispute the thoroughness of the back-testing. Li eventually let K1 manage $2.5bn -- $250m of his own cash and the rest leverage from Citigroup. The plan was to double that over time. But Li's affection for K1 waned almost as soon as the computer started trading in late 2017. By February 2018, it was regularly losing money, including over $20m in a single day -- Feb. 14 -- due to a stop-loss order Li's lawyers argue wouldn't have been triggered if K1 were as sophisticated as Costa led him to believe.
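A stop-loss order itself is simple mechanics -- the dispute is over when it should fire. A minimal sketch of a stop-loss on a long position, with purely illustrative numbers:

```python
# Sketch of a stop-loss order: if the position's price falls to the
# stop level, it is sold automatically, locking in the loss.

def run_stop_loss(prices, stop_price):
    """Return (exit_price, triggered) for a long position."""
    for price in prices:
        if price <= stop_price:
            return price, True      # order fires, position is closed
    return prices[-1], False        # never triggered; still holding

# A volatile day: the price dips through the stop, then recovers.
day = [100.0, 98.5, 96.0, 94.9, 97.0, 101.0]
exit_price, triggered = run_stop_loss(day, stop_price=95.0)
print(exit_price, triggered)
```

In this toy day the order sells into the dip at 94.9 and misses the recovery to 101 -- the shape of outcome Li's lawyers argue a more sophisticated system would have avoided.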

AMD

World's Fastest Supercomputer Coming To US in 2021 From Cray, AMD (cnet.com) 89

The "exascale" computing race is getting a new entrant called Frontier, a $600 million machine with Cray and AMD technology that could become the world's fastest when it arrives at Oak Ridge National Laboratory in 2021. From a report: Frontier should be able to perform 1.5 quintillion calculations per second, a level called 1.5 exaflops and enough to claim the performance crown, the Energy Department announced Tuesday. Its speed will be about 10 times faster than that of the current record holder on the Top500 supercomputer ranking, the IBM-built Summit machine, also at Oak Ridge, and should surpass a $500 million, 1-exaflops Cray-Intel supercomputer called Aurora to be built in 2021 at Argonne National Laboratory. There's no guarantee the US will win the race to exascale machines -- those that cross the 1-exaflop threshold -- because China, Japan and France each could have exascale machines in 2020. At stake is more than national bragging rights: It's also about the ability to perform cutting-edge research in areas like genomics, nuclear physics, cosmology, drug discovery, artificial intelligence and climate simulation.
IBM

IBM Halting Sales of Watson AI Tool For Drug Discovery Amid Sluggish Growth (statnews.com) 29

Citing lackluster financial performance, IBM is halting development and sales of a product that uses its Watson AI software to help pharmaceutical companies discover new drugs, news outlet Stat reported on Thursday, citing a person familiar with the company's internal decision-making. From the report: The decision to shut down sales of Watson for Drug Discovery marks the highest-profile retreat in the company's effort to apply artificial intelligence to various areas of health care. Last year, the company scaled back on the hospital side of its business, and it's struggled to develop a reliable tool to assist doctors in treating cancer patients. In a statement, an IBM spokesperson said, "We are focusing our resources within Watson Health to double down on the adjacent field of clinical development where we see an even greater market need for our data and AI capabilities."

Further reading: IBM Pitched Its Watson Supercomputer as a Revolution in Cancer Care. It's Nowhere Close (September 2017); IBM Watson Reportedly Recommended Cancer Treatments That Were 'Unsafe and Incorrect' (July 2018).
China

US Reveals Details of $500 Million Supercomputer (nytimes.com) 60

An anonymous reader quotes a report from The New York Times: The Department of Energy disclosed details on Monday of one of the most expensive computers being built: a $500 million machine based on Intel and Cray technology that may become crucial in a high-stakes technology race between the United States and China (Warning: source may be paywalled; alternative source). The supercomputer, called Aurora, is a retooling of a development effort first announced in 2015 and is scheduled to be delivered to the Argonne National Laboratory near Chicago in 2021. Lab officials predict it will be the first American machine to reach a milestone called "exascale" performance, surpassing a quintillion calculations per second. That's roughly seven times the speed rating of the most powerful system built to date, or 1,000 times faster than the first "petascale" systems that began arriving in 2008. Backers hope the new machines will let researchers create significantly more accurate simulations of phenomena such as drug responses, climate changes, the inner workings of combustion engines and solar panels.

Aurora, which far exceeds the $200 million price for Summit, represents a record government contract for Intel and a test of its continued leadership in supercomputers. The Silicon Valley giant's popular processors -- the calculating engine for nearly all personal computers and server systems -- power most such machines. But additional accelerator chips are considered essential to reach the very highest speeds, and its rival Nvidia has built a sizable business adapting chips first used with video games for use in supercomputers. The version of Aurora announced in 2015 was based on an Intel accelerator chip that the company later discontinued. A revised plan to seek more ambitious performance targets was announced two years later. Features discussed on Monday include unreleased Intel accelerator chips, a version of its standard Xeon processor, new memory and communications technology and a design that packages chips on top of each other to save space and power.

Math

Google Smashes the World Record For Calculating Digits of Pi (wired.co.uk) 132

Pi just got bigger. Google's Compute Engine has calculated the most digits of pi ever, setting a new world record. From a report: Emma Haruka Iwao, who works in high performance computing and programming language communities at Google, used infrastructure powered by Google Cloud to calculate 31.4 trillion digits of pi. The previous world record was set by Peter Trueb in 2016, who calculated the digits of pi to 22.4 trillion digits. This is the first time that a publicly available cloud software has been used for a pi calculation of this magnitude.

Iwao became fascinated by pi when she learned about it in math class at school. At university, one of her professors, Daisuke Takahashi, was the record holder for the most-calculated digits of pi using a supercomputer. Now, y-cruncher is the software of choice for pi enthusiasts. Created in 2009, y-cruncher is designed to compute mathematical constants like pi to trillions of digits. "You need a pretty big computer to break the world record," says Iwao. "But you can't just do this with a computer from a hardware store, so people have previously built custom machines." In September 2018, Iwao started to consider how the process of calculating even more digits of pi would work technically. One issue that came up quickly was the amount of data needed to carry out the calculations and store the results -- 170 terabytes, which couldn't easily be hosted on a single piece of hardware. Rather than building a whole new machine, Iwao used Google Cloud.
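y-cruncher computes pi from the Chudnovsky series, which yields roughly 14 new correct digits per term. A toy Python version of that series -- astronomically slower than y-cruncher, which adds binary splitting and FFT-based bignum multiplication -- looks like this:

```python
from decimal import Decimal, getcontext

# Chudnovsky series for pi: each term contributes ~14 digits, so a
# handful of terms already gives dozens of correct decimals.

def chudnovsky_pi(digits):
    getcontext().prec = digits + 10     # extra guard digits
    C = 426880 * Decimal(10005).sqrt()
    M, L, X, K = 1, 13591409, 1, 6
    S = Decimal(L)
    for k in range(1, digits // 14 + 2):
        M = M * (K ** 3 - 16 * K) // k ** 3
        L += 545140134
        X *= -262537412640768000
        S += Decimal(M * L) / X
        K += 12
    return str(C / S)[: digits + 2]     # "3." plus the requested digits

print(chudnovsky_pi(50))
# 3.14159265358979323846264338327950288419716939937510
```

The record-scale runs use the same series; the hard part, as the article notes, is the memory and I/O required once the working set reaches hundreds of terabytes.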

Iwao used 25 virtual machines to carry out those calculations. "But instead of clicking that virtual machine button 25 times, I automated it," she explains. "You can do it in a couple of minutes, but if you needed that many computers, it could take days just to get the next ones set up." Iwao ran y-cruncher on those 25 virtual machines, continuously, for 121 days.

ISS

Computer Servers 'Stranded' in Space (bbc.com) 89

A pair of Hewlett Packard Enterprise servers sent up to the International Space Station in August 2017 as an experiment have still not come back to Earth, three months after their intended return. From a report: Together they make up the Spaceborne Computer, a Linux system that has supercomputer processing power. They were sent up to see how durable they would be in space with minimal specialist treatment. After 530 days, they are still working. Their return flight was postponed after a Russian rocket failed in October 2018. HPE senior content architect Adrian Kasbergen said they may return in June 2019 if there is space on a flight but "right now they haven't got a ticket." The company is working with Nasa to be "computer-ready" for the first manned Mars flight, estimated to take place in about 2030. The company is also working with Elon Musk's SpaceX.
Earth

Extreme CO2 Levels Could Trigger Clouds 'Tipping Point' and 8C of Global Warming (carbonbrief.org) 254

If atmospheric CO2 levels exceed 1,200 parts per million (ppm), it could push the Earth's climate over a "tipping point", finds a new study. This would see clouds that shade large parts of the oceans start to break up. From a report: According to the new paper published in the journal Nature Geoscience, this could trigger a massive 8C rise in global average temperatures -- in addition to the warming from increased CO2. The only similar example of rapid warming at this magnitude in the Earth's recent history is the Paleocene-Eocene Thermal Maximum 55m years ago, when global temperatures increased by 5-8C and drove widespread extinction of species in both the oceans and on land.

However, scientists not involved in the research caution that the results are still speculative and that other complicating factors could influence if or when a tipping point is reached. The threshold identified by the researchers -- a 1,200ppm concentration of atmospheric CO2 -- is three times current CO2 concentrations. If fossil fuel use continues to rapidly expand over the remainder of the century, it is possible levels could get that high. The Representative Concentration Pathways 8.5 scenario (RCP8.5), a very high emissions scenario examined by climate scientists, has the Earth's atmosphere reaching around 1,100ppm by the year 2100. But this would require the world to massively expand coal use and eschew any climate mitigation over the rest of this century.
Further reading: A state-of-the-art supercomputer simulation indicates that a feedback loop between global warming and cloud loss can push Earth's climate past a disastrous tipping point in as little as a century.
AI

The World's Fastest Supercomputer Breaks an AI Record (wired.com) 66

Along America's west coast, the world's most valuable companies are racing to make artificial intelligence smarter. Google and Facebook have boasted of experiments using billions of photos and thousands of high-powered processors. But late last year, a project in eastern Tennessee quietly exceeded the scale of any corporate AI lab. It was run by the US government. From a report: The record-setting project involved the world's most powerful supercomputer, Summit, at Oak Ridge National Lab. The machine captured that crown in June last year, reclaiming the title for the US after five years of China topping the list. As part of a climate research project, the giant computer booted up a machine-learning experiment that ran faster than any before. Summit, which occupies an area equivalent to two tennis courts, used more than 27,000 powerful graphics processors in the project. It tapped their power to train deep-learning algorithms, the technology driving AI's frontier, chewing through the exercise at a rate of a billion billion operations per second, a pace known in supercomputing circles as an exaflop.

"Deep learning has never been scaled to such levels of performance before," says Prabhat, who leads a research group at the National Energy Research Scientific Computing Center at Lawrence Berkeley National Lab. His group collaborated with researchers at Summit's home base, Oak Ridge National Lab. Fittingly, the world's most powerful computer's AI workout was focused on one of the world's largest problems: climate change. Tech companies train algorithms to recognize faces or road signs; the government scientists trained theirs to detect weather patterns like cyclones in the copious output from climate simulations that spool out a century's worth of three-hour forecasts for Earth's atmosphere.

Education

A Supercomputer In a 19th Century Church Is 'World's Most Beautiful Data Center' (vice.com) 62

"Motherboard spoke to the Barcelona Supercomputing Center about how it outfitted a deconsecrated 19th century chapel to host the MareNostrum 4 -- the 25th most powerful supercomputer in the world," writes Slashdot reader dmoberhaus. From the report: Heralded as the "most beautiful data center in the world," the MareNostrum supercomputer came online in 2005, but was originally hosted in a different building at the university. Meaning "our sea" in Latin, the original MareNostrum was capable of performing 42.35 teraflops -- 42.35 trillion operations per second -- making it one of the most powerful supercomputers in Europe at the time. Yet the MareNostrum rightly became known for its aesthetics as much as its computing power. According to Gemma Maspoch, head of communications for Barcelona Supercomputing Center, which oversees the MareNostrum facility, the decision to place the computer in a giant glass box inside a chapel was ultimately for practical reasons.

"We were in need of hundreds of square meters without columns and the capacity to support 44.5 tons of weight," Maspoch told me in an email. "At the time there was not much available space at the university and the only room that satisfied our requirements was the Torre Girona chapel. We did not doubt it for a moment and we installed a supercomputer in it." According to Maspoch, the chapel required relatively few modifications to host the supercomputer, such as reinforcing the soil around the church so that it would hold the computer's weight and designing a glass box that would house the computer and help cool it.
The supercomputer has been beefed up over the years. Most recently, the fourth iteration came online in 2017 "with a peak computing capacity of 11 thousand trillion operations per second (11.15 petaflops)," reports Motherboard. "MareNostrum 4 is spread over 48 server racks comprising a total of 3,456 nodes. A node consists of two Intel chips, each of which has 24 processors."
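The configuration figures quoted above multiply out as follows:

```python
# Quick arithmetic on MareNostrum 4's reported configuration.
racks = 48
nodes = 3456
chips_per_node = 2
cores_per_chip = 24

total_cores = nodes * chips_per_node * cores_per_chip
print(nodes // racks, "nodes per rack,", total_cores, "cores in total")
```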
Cloud

Is Linux Taking Over The World? (networkworld.com) 243

"2019 just might be the Year of Linux -- the year in which Linux is fully recognized as the powerhouse it has become," writes Network World's "Unix dweeb." The fact is that most people today are using Linux without ever knowing it -- whether on their phones, online when using Google, Facebook, Twitter, GPS devices, and maybe even in their cars, or when using cloud storage for personal or business use. While the presence of Linux on all of these systems may go largely unnoticed by consumers, the role that Linux plays in this market is a sign of how critical it has become. Most IoT and embedded devices -- those small, limited functionality devices that require good security and a small footprint and fill so many niches in our technology-driven lives -- run some variety of Linux, and this isn't likely to change. Instead, we'll just be seeing more devices and a continued reliance on open source to drive them.

According to the Cloud Industry Forum, for the first time, businesses are spending more on cloud than on internal infrastructure. The cloud is taking over the role that data centers used to play, and it's largely Linux that's making the transition so advantageous. Even on Microsoft's Azure, the most popular operating system is Linux. In its first Voice of the Enterprise survey, 451 Research predicted that 60 percent of nearly 1,000 IT leaders surveyed plan to run the majority of their IT off premises by 2019. That equates to a lot of IT efforts relying on Linux. Gartner states that 80 percent of internally developed software is now either cloud-enabled or cloud-native.

The article also cites Linux's use in AI, data lakes, and in the Sierra supercomputer that monitors America's nuclear stockpile, concluding that "In its domination of IoT, cloud technology, supercomputing and AI, Linux is heading into 2019 with a lot of momentum."

And there's even a long list of upcoming Linux conferences...
China

US Overtakes China in Top Supercomputer List (bbc.com) 74

China has been pushed into third place on a list of the world's most powerful supercomputers. From a report: The latest list by Top 500, published twice a year, puts two US machines -- Summit and Sierra -- in the top two places. The US has five entries in the top 10, with other entries from Switzerland, Germany and Japan. However, overall China has 227 machines in the top 500, while the US has 109. Summit can process 200,000 trillion calculations per second. Both Summit and Sierra were built by the tech giant IBM. China's Sunway TaihuLight supercomputer, which this time last year was the world's most powerful machine, is now ranked at number three, while the country also has the fourth spot in the list.
Hardware

SpiNNaker Powers Up World's Largest Supercomputer That Emulates a Human Brain 164

The world's largest neuromorphic supercomputer, the Spiking Neural Network Architecture (SpiNNaker), was just switched on for the first time yesterday, boasting one million processor cores and the ability to perform 200 trillion actions per second. HotHardware reports: SpiNNaker has been twenty years and nearly $19.5 million in the making. The project was originally supported by the Engineering and Physical Sciences Research Council (EPSRC), but has been most recently funded by the European Human Brain Project. The supercomputer was designed and built by the University of Manchester's School of Computer Science. Construction began in 2006 and the supercomputer was finally turned on yesterday.

SpiNNaker is not the first supercomputer to incorporate one million processor cores, but it remains unique in that it is designed to mimic the human brain. Most computers send information from one point to another through a standard network. SpiNNaker sends small bits of information to thousands of points, similar to how neurons pass chemicals and electrical signals through the brain. SpiNNaker uses electronic circuits to imitate neurons. SpiNNaker has so far been used to mimic the processing of more isolated brain networks like the cortex. It has also been used to control SpOmnibot, a robot that processes visual information and navigates towards its targets.
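The neuron behavior SpiNNaker emulates in hardware can be sketched with a leaky integrate-and-fire model -- a standard textbook abstraction with made-up parameters here, not SpiNNaker's actual neuron equations:

```python
# Leaky integrate-and-fire neuron: it accumulates incoming signals,
# "leaks" charge each time step, and fires a spike to its neighbors
# when its potential crosses a threshold.

def simulate_lif(inputs, threshold=1.0, leak=0.9, weight=0.3):
    """Return the time steps at which the neuron spikes."""
    v = 0.0
    spikes = []
    for t, incoming in enumerate(inputs):
        v = v * leak + weight * incoming   # leak, then integrate input
        if v >= threshold:
            spikes.append(t)               # fire...
            v = 0.0                        # ...and reset
    return spikes

# A steady train of input spikes makes the neuron fire periodically.
spike_times = simulate_lif([1] * 20)
print(spike_times)
```

SpiNNaker's million cores each simulate many such neurons, exchanging spikes as tiny network packets rather than moving data point-to-point.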
The Internet

'You Can See Almost Everything.' Antarctica Just Became the Best-Mapped Continent on Earth (fortune.com) 110

Antarctica has become the best-mapped continent on Earth with a new high-resolution terrain map showing the ice-covered landmass in unprecedented detail. From a report: According to the scientists at Ohio State University and the University of Minnesota who created the imagery, Antarctica is now the best-mapped continent on Earth. The Reference Elevation Model of Antarctica (REMA) was constructed using hundreds of thousands of satellite images taken between 2009 and 2017, Earther reports. A supercomputer assembled the massive amounts of data, including the elevation of the land over time, and created REMA, an immensely detailed topographical map, with a file size over 150 terabytes. The new map has a resolution of 2 to 8 meters, compared to the usual 1,000 meters, says an Ohio State press release. According to The New York Times, the detail of this new map is the equivalent of being able to see down to a car, or smaller, when before you could only see the whole of Central Park. Scientists now know the elevation of every point of Antarctica, with an error margin of just a few feet.
Programming

Is Julia the Next Big Programming Language? MIT Thinks So, as Version 1.0 Lands (techrepublic.com) 386

Julia, the MIT-created programming language for developers "who want it all", hit its milestone 1.0 release this month -- with MIT highlighting its rapid adoption in the six short years since its launch. From a report: Released in 2012, Julia is designed to combine the speed of C with the usability of Python, the dynamism of Ruby, the mathematical prowess of MATLAB, and the statistical chops of R. "The release of Julia 1.0 signals that Julia is now ready to change the technical world by combining the high-level productivity and ease of use of Python and R with the lightning-fast speed of C++," says MIT professor Alan Edelman. The breadth of Julia's capabilities and ability to spread workloads across hundreds of thousands of processing cores have led to its use for everything from machine learning to large-scale supercomputer simulation. MIT says Julia is the only high-level dynamic programming language in the "petaflop club," having been used to simulate 188 million stars, galaxies, and other astronomical objects on Cori, the world's 10th-most powerful supercomputer. The simulation ran in just 14.6 minutes, using 650,000 Intel Knights Landing Xeon Phi cores to handle 1.5 petaflops (quadrillion floating-point operations per second).
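The petaflop-club figures quoted above can be sanity-checked with quick arithmetic (sketched in Python here, though the record run itself was of course Julia code):

```python
# Per-core and per-second rates implied by the reported Cori run.
flops = 1.5e15          # 1.5 petaflops sustained
cores = 650_000         # Knights Landing cores used
objects = 188e6         # stars, galaxies, and other objects
minutes = 14.6

print(f"{flops / cores / 1e9:.1f} GFLOP/s per core")
print(f"{objects / (minutes * 60):,.0f} objects per second")
```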
Education

University of Texas is Getting a $60 Million Supercomputer (cnet.com) 88

The University of Texas at Austin will soon be home to one of the most powerful supercomputers in the world. From a report: The National Science Foundation awarded a $60 million grant to the school's Texas Advanced Computing Center, UT Austin and NSF said Wednesday. The supercomputer, named Frontera, is set to become operational roughly a year from now in 2019, and will be "among the most powerful in the world," according to a statement. To be exact, it will be the fifth most powerful in the world, third most powerful in the US, and the most powerful at a university.
