Businesses

Hewlett Packard Enterprise To Acquire Supercomputer Maker Cray for $1.3 Billion (anandtech.com) 101

Hewlett Packard Enterprise will buy supercomputer maker Cray for roughly $1.3 billion, the companies said this morning. HPE intends to use Cray's knowledge and technology to bolster its own supercomputing and high-performance computing offerings, and when the deal closes, HPE will become the world leader in supercomputing technology. From a report: Cray of course needs no introduction. The current leader in the supercomputing field and a founder of supercomputing as we know it, Cray has been part of the supercomputing landscape since the 1970s. Starting out with fully custom systems, in more recent years Cray has morphed into an integrator and scale-out specialist, combining processors from the likes of Intel, AMD, and NVIDIA into supercomputers, and applying its own software, I/O, and interconnect technologies. The timing of the acquisition announcement closely follows other major news from Cray: the company just landed a $600 million US Department of Energy contract to supply the Frontier supercomputer to Oak Ridge National Laboratory in 2021. Frontier is one of two exascale supercomputers Cray is involved in -- on the other, the 2021 Aurora system, it is a subcontractor -- and in fact Cray is involved in the only two exascale systems ordered by the US government thus far. So in both a historical and a modern context, Cray was and is one of the biggest players in the supercomputing market.
The Courts

Who To Sue When a Robot Loses Your Fortune (bloomberg.com) 201

An anonymous reader shares a report: It all started over lunch at a Dubai restaurant on March 19, 2017. It was the first time 45-year-old Li met Costa, the 49-year-old Italian who's often known by peers in the industry as "Captain Magic." During their meal, Costa described a robot hedge fund his company, London-based Tyndaris Investments, would soon offer to manage money entirely using AI, or artificial intelligence. Developed by Austria-based AI company 42.cx, the supercomputer named K1 would comb through online sources like real-time news and social media to gauge investor sentiment and make predictions on US stock futures. It would then send instructions to a broker to execute trades, adjusting its strategy over time based on what it had learned.

The idea of a fully automated money manager inspired Li instantly. He met Costa for dinner three days later, saying in an email beforehand that the AI fund "is exactly my kind of thing." Over the following months, Costa shared simulations with Li showing K1 making double-digit returns, although the two now dispute the thoroughness of the back-testing. Li eventually let K1 manage $2.5bn -- $250m of his own cash and the rest leverage from Citigroup. The plan was to double that over time. But Li's affection for K1 waned almost as soon as the computer started trading in late 2017. By February 2018, it was regularly losing money, including over $20m in a single day -- Feb. 14 -- due to a stop-loss order Li's lawyers argue wouldn't have been triggered if K1 had been as sophisticated as Costa led him to believe.
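The mechanics of the stop-loss order at the center of the dispute can be sketched in a few lines. This is an illustrative model only -- not Tyndaris or K1 code -- and the function name, prices, and threshold are invented for the example:

```python
# Toy stop-loss check for a long position (illustrative; not K1's logic).
def check_stop_loss(entry_price: float, current_price: float,
                    stop_pct: float) -> bool:
    """Return True if the drawdown from entry has hit the stop threshold."""
    drawdown = (entry_price - current_price) / entry_price
    return drawdown >= stop_pct

# A 5% stop on a position entered at 100:
print(check_stop_loss(100.0, 94.0, 0.05))  # True: 6% drawdown, liquidate
print(check_stop_loss(100.0, 96.0, 0.05))  # False: 4% drawdown, hold
```

A more "sophisticated" system, as Li's lawyers frame it, would presumably condition the trigger on more than a single price threshold -- which is exactly the gap the lawsuit turns on.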

AMD

World's Fastest Supercomputer Coming To US in 2021 From Cray, AMD (cnet.com) 89

The "exascale" computing race is getting a new entrant called Frontier, a $600 million machine with Cray and AMD technology that could become the world's fastest when it arrives at Oak Ridge National Laboratory in 2021. From a report: Frontier should be able to perform 1.5 quintillion calculations per second, a level called 1.5 exaflops and enough to claim the performance crown, the Energy Department announced Tuesday. Its speed will be about 10 times faster than that of the current record holder on the Top500 supercomputer ranking, the IBM-built Summit machine, also at Oak Ridge, and should surpass a $500 million, 1-exaflops Cray-Intel supercomputer called Aurora to be built in 2021 at Argonne National Laboratory. There's no guarantee the US will win the race to exascale machines -- those that cross the 1-exaflop threshold -- because China, Japan and France each could have exascale machines in 2020. At stake is more than national bragging rights: It's also about the ability to perform cutting-edge research in areas like genomics, nuclear physics, cosmology, drug discovery, artificial intelligence and climate simulation.
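The performance ratios in the announcement are easy to sanity-check. The sketch below assumes Summit's 143.5-petaflop Linpack score from the then-current Top500 list (a figure not stated in the article):

```python
# Sanity-check the "10 times faster" and exascale claims.
PETA, EXA = 1e15, 1e18

frontier = 1.5 * EXA     # 1.5 exaflops, per the DOE announcement
aurora = 1.0 * EXA       # Aurora's 1-exaflop target
summit = 143.5 * PETA    # Summit's Top500 Linpack score (assumed here)

print(f"Frontier vs Summit: {frontier / summit:.1f}x")  # ~10x, as claimed
print(f"Frontier vs Aurora: {frontier / aurora:.1f}x")  # 1.5x
```

The "about 10 times faster" figure in the article holds up against Summit's benchmarked Linpack speed rather than its higher theoretical peak.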
IBM

IBM Halting Sales of Watson AI Tool For Drug Discovery Amid Sluggish Growth (statnews.com) 29

Citing lackluster financial performance, IBM is halting development and sales of a product that uses its Watson AI software to help pharmaceutical companies discover new drugs, news outlet Stat reported on Thursday, citing a person familiar with the company's internal decision-making. From the report: The decision to shut down sales of Watson for Drug Discovery marks the highest-profile retreat in the company's effort to apply artificial intelligence to various areas of health care. Last year, the company scaled back on the hospital side of its business, and it's struggled to develop a reliable tool to assist doctors in treating cancer patients. In a statement, an IBM spokesperson said, "We are focusing our resources within Watson Health to double down on the adjacent field of clinical development where we see an even greater market need for our data and AI capabilities."

Further reading: IBM Pitched Its Watson Supercomputer as a Revolution in Cancer Care. It's Nowhere Close (September 2017); IBM Watson Reportedly Recommended Cancer Treatments That Were 'Unsafe and Incorrect' (July 2018).
China

US Reveals Details of $500 Million Supercomputer (nytimes.com) 60

An anonymous reader quotes a report from The New York Times: The Department of Energy disclosed details on Monday of one of the most expensive computers being built: a $500 million machine based on Intel and Cray technology that may become crucial in a high-stakes technology race between the United States and China (Warning: source may be paywalled; alternative source). The supercomputer, called Aurora, is a retooling of a development effort first announced in 2015 and is scheduled to be delivered to the Argonne National Laboratory near Chicago in 2021. Lab officials predict it will be the first American machine to reach a milestone called "exascale" performance, surpassing a quintillion calculations per second. That's roughly seven times the speed rating of the most powerful system built to date, or 1,000 times faster than the first "petascale" systems that began arriving in 2008. Backers hope the new machines will let researchers create significantly more accurate simulations of phenomena such as drug responses, climate changes, the inner workings of combustion engines and solar panels.

Aurora, which far exceeds the $200 million price for Summit, represents a record government contract for Intel and a test of its continued leadership in supercomputers. The Silicon Valley giant's popular processors -- the calculating engine for nearly all personal computers and server systems -- power most such machines. But additional accelerator chips are considered essential to reach the very highest speeds, and its rival Nvidia has built a sizable business adapting chips first used with video games for use in supercomputers. The version of Aurora announced in 2015 was based on an Intel accelerator chip that the company later discontinued. A revised plan to seek more ambitious performance targets was announced two years later. Features discussed on Monday include unreleased Intel accelerator chips, a version of its standard Xeon processor, new memory and communications technology and a design that packages chips on top of each other to save space and power.

Math

Google Smashes the World Record For Calculating Digits of Pi (wired.co.uk) 132

Pi just got bigger. Google's Compute Engine has calculated the most digits of pi ever, setting a new world record. From a report: Emma Haruka Iwao, who works in high performance computing and programming language communities at Google, used infrastructure powered by Google Cloud to calculate 31.4 trillion digits of pi. The previous world record was set by Peter Trueb in 2016, who calculated pi to 22.4 trillion digits. This is the first time that publicly available cloud software has been used for a pi calculation of this magnitude.

Iwao became fascinated by pi when she learned about it in math class at school. At university, one of her professors, Daisuke Takahashi, held the record for the most digits of pi calculated using a supercomputer. Today, y-cruncher is the software of choice for pi enthusiasts. Created in 2009, y-cruncher is designed to compute mathematical constants like pi to trillions of digits. "You need a pretty big computer to break the world record," says Iwao. "But you can't just do this with a computer from a hardware store, so people have previously built custom machines." In September 2018, Iwao started to consider how the process of calculating even more digits of pi would work technically. One issue that came up quickly was the amount of data needed to carry out the calculations and store the results -- 170 terabytes, which wouldn't be easily hosted by a single piece of hardware. Rather than building a whole new machine, Iwao used Google Cloud.

Iwao used 25 virtual machines to carry out those calculations. "But instead of clicking that virtual machine button 25 times, I automated it," she explains. "You can do it in a couple of minutes, but if you needed that many computers, it could take days just to get the next ones set up." Iwao ran y-cruncher on those 25 virtual machines, continuously, for 121 days.
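For a sense of the math involved: modern pi records, including y-cruncher's, are based on the Chudnovsky series, which yields roughly 14 new digits per term. A toy Python version is below -- the real implementation uses binary splitting and is vastly faster, so this sketch is for illustration only:

```python
from decimal import Decimal, getcontext

# Toy Chudnovsky-series pi calculator (illustrative; not y-cruncher's code).
def pi_digits(n: int) -> str:
    """Return pi truncated to n digits after the decimal point."""
    getcontext().prec = n + 10                    # extra guard digits
    C = 426880 * Decimal(10005).sqrt()
    M, L, X, K = 1, 13591409, 1, 6
    S = Decimal(13591409)
    for k in range(1, n // 14 + 2):               # ~14 digits per term
        M = M * (K**3 - 16 * K) // k**3
        L += 545140134
        X *= -262537412640768000
        K += 12
        S += Decimal(M * L) / X
    return str(C / S)[: n + 2]

print(pi_digits(30))  # 3.141592653589793238462643383279
```

At 31.4 trillion digits, the same series needs over two trillion terms on arbitrary-precision numbers -- which is where the 170 terabytes and 25 machines come in.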

ISS

Computer Servers 'Stranded' in Space (bbc.com) 89

A pair of Hewlett Packard Enterprise servers sent up to the International Space Station in August 2017 as an experiment have still not come back to Earth, three months after their intended return. From a report: Together they make up the Spaceborne Computer, a Linux system that has supercomputer processing power. They were sent up to see how durable they would be in space with minimal specialist treatment. After 530 days, they are still working. Their return flight was postponed after a Russian rocket failed in October 2018. HPE senior content architect Adrian Kasbergen said they may return in June 2019 if there is space on a flight but "right now they haven't got a ticket." The company is working with Nasa to be "computer-ready" for the first manned Mars flight, estimated to take place in about 2030. The company is also working with Elon Musk's SpaceX.
Earth

Extreme CO2 Levels Could Trigger Clouds 'Tipping Point' and 8C of Global Warming (carbonbrief.org) 254

If atmospheric CO2 levels exceed 1,200 parts per million (ppm), it could push the Earth's climate over a "tipping point", finds a new study. This would see clouds that shade large parts of the oceans start to break up. From a report: According to the new paper published in the journal Nature Geoscience, this could trigger a massive 8C rise in global average temperatures -- in addition to the warming from increased CO2. The only similar example of rapid warming at this magnitude in the Earth's recent history is the Paleocene-Eocene Thermal Maximum 55m years ago, when global temperatures increased by 5-8C and drove widespread extinction of species in both the oceans and on land.

However, scientists not involved in the research caution that the results are still speculative and that other complicating factors could influence if or when a tipping point is reached. The threshold identified by the researchers -- a 1,200ppm concentration of atmospheric CO2 -- is three times current CO2 concentrations. If fossil fuel use continues to rapidly expand over the remainder of the century, it is possible levels could get that high. The Representative Concentration Pathways 8.5 scenario (RCP8.5), a very high emissions scenario examined by climate scientists, has the Earth's atmosphere reaching around 1,100ppm by the year 2100. But this would require the world to massively expand coal use and eschew any climate mitigation over the rest of this century.
Further reading: A state-of-the-art supercomputer simulation indicates that a feedback loop between global warming and cloud loss can push Earth's climate past a disastrous tipping point in as little as a century.
AI

The World's Fastest Supercomputer Breaks an AI Record (wired.com) 66

Along America's west coast, the world's most valuable companies are racing to make artificial intelligence smarter. Google and Facebook have boasted of experiments using billions of photos and thousands of high-powered processors. But late last year, a project in eastern Tennessee quietly exceeded the scale of any corporate AI lab. It was run by the US government. From a report: The record-setting project involved the world's most powerful supercomputer, Summit, at Oak Ridge National Lab. The machine captured that crown in June last year, reclaiming the title for the US after five years of China topping the list. As part of a climate research project, the giant computer booted up a machine-learning experiment that ran faster than any before. Summit, which occupies an area equivalent to two tennis courts, used more than 27,000 powerful graphics processors in the project. It tapped their power to train deep-learning algorithms, the technology driving AI's frontier, chewing through the exercise at a rate of a billion billion operations per second, a pace known in supercomputing circles as an exaflop.

"Deep learning has never been scaled to such levels of performance before," says Prabhat, who leads a research group at the National Energy Research Scientific Computing Center at Lawrence Berkeley National Lab. His group collaborated with researchers at Summit's home base, Oak Ridge National Lab. Fittingly, the world's most powerful computer's AI workout was focused on one of the world's largest problems: climate change. Tech companies train algorithms to recognize faces or road signs; the government scientists trained theirs to detect weather patterns like cyclones in the copious output from climate simulations that spool out a century's worth of three-hour forecasts for Earth's atmosphere.

Education

A Supercomputer In a 19th Century Church Is 'World's Most Beautiful Data Center' (vice.com) 62

"Motherboard spoke to the Barcelona Supercomputing Center about how it outfitted a deconsecrated 19th century chapel to host the MareNostrum 4 -- the 25th most powerful supercomputer in the world," writes Slashdot reader dmoberhaus. From the report: Heralded as the "most beautiful data center in the world," the MareNostrum supercomputer came online in 2005, but was originally hosted in a different building at the university. Meaning "our sea" in Latin, the original MareNostrum was capable of performing 42.35 teraflops -- 42.35 trillion operations per second -- making it one of the most powerful supercomputers in Europe at the time. Yet the MareNostrum rightly became known for its aesthetics as much as its computing power. According to Gemma Maspoch, head of communications for Barcelona Supercomputing Center, which oversees the MareNostrum facility, the decision to place the computer in a giant glass box inside a chapel was ultimately for practical reasons.

"We were in need of hundreds of square meters without columns and the capacity to support 44.5 tons of weight," Maspoch told me in an email. "At the time there was not much available space at the university and the only room that satisfied our requirements was the Torre Girona chapel. We did not doubt it for a moment and we installed a supercomputer in it." According to Maspoch, the chapel required relatively few modifications to host the supercomputer, such as reinforcing the soil around the church so that it would hold the computer's weight and designing a glass box that would house the computer and help cool it.
The supercomputer has been beefed up over the years. Most recently, the fourth iteration came online in 2017 "with a peak computing capacity of 11 thousand trillion operations per second (11.15 petaflops)," reports Motherboard. "MareNostrum 4 is spread over 48 server racks comprising a total of 3,456 nodes. A node consists of two Intel chips, each of which has 24 processors."
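The rack and node figures Motherboard quotes are internally consistent and give the machine's total core count directly:

```python
# MareNostrum 4 layout arithmetic, from the figures in the report.
racks, nodes = 48, 3456
chips_per_node, cores_per_chip = 2, 24

print(nodes // racks, "nodes per rack")          # 72 nodes per rack
total_cores = nodes * chips_per_node * cores_per_chip
print(total_cores, "processor cores in total")   # 165888 cores
```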
Cloud

Is Linux Taking Over The World? (networkworld.com) 243

"2019 just might be the Year of Linux -- the year in which Linux is fully recognized as the powerhouse it has become," writes Network World's "Unix dweeb." The fact is that most people today are using Linux without ever knowing it -- whether on their phones, online when using Google, Facebook, Twitter, GPS devices, and maybe even in their cars, or when using cloud storage for personal or business use. While the presence of Linux on all of these systems may go largely unnoticed by consumers, the role that Linux plays in this market is a sign of how critical it has become. Most IoT and embedded devices -- those small, limited functionality devices that require good security and a small footprint and fill so many niches in our technology-driven lives -- run some variety of Linux, and this isn't likely to change. Instead, we'll just be seeing more devices and a continued reliance on open source to drive them.

According to the Cloud Industry Forum, for the first time, businesses are spending more on cloud than on internal infrastructure. The cloud is taking over the role that data centers used to play, and it's largely Linux that's making the transition so advantageous. Even on Microsoft's Azure, the most popular operating system is Linux. In its first Voice of the Enterprise survey, 451 Research predicted that 60 percent of nearly 1,000 IT leaders surveyed plan to run the majority of their IT off premises by 2019. That equates to a lot of IT efforts relying on Linux. Gartner states that 80 percent of internally developed software is now either cloud-enabled or cloud-native.

The article also cites Linux's use in AI, data lakes, and in the Sierra supercomputer that monitors America's nuclear stockpile, concluding that "In its domination of IoT, cloud technology, supercomputing and AI, Linux is heading into 2019 with a lot of momentum."

And there's even a long list of upcoming Linux conferences...
China

US Overtakes China in Top Supercomputer List (bbc.com) 74

China has been pushed into third place on a list of the world's most powerful supercomputers. From a report: The latest list by Top 500, published twice a year, puts two US machines -- Summit and Sierra -- in the top two places. The US has five entries in the top 10, with other entries from Switzerland, Germany and Japan. However, overall China has 227 machines in the top 500, while the US has 109. Summit can process 200,000 trillion calculations per second. Both Summit and Sierra were built by the tech giant IBM. China's Sunway TaihuLight supercomputer, which this time last year was the world's most powerful machine, is now ranked at number three, while the country also has the fourth spot in the list.
Hardware

SpiNNaker Powers Up World's Largest Supercomputer That Emulates a Human Brain 164

The world's largest neuromorphic supercomputer, the Spiking Neural Network Architecture (SpiNNaker), was just switched on for the first time yesterday, boasting one million processor cores and the ability to perform 200 trillion actions per second. HotHardware reports: SpiNNaker has been twenty years and nearly $19.5 million in the making. The project was originally supported by the Engineering and Physical Sciences Research Council (EPSRC), but has been most recently funded by the European Human Brain Project. The supercomputer was designed and built by the University of Manchester's School of Computer Science. Construction began in 2006 and the supercomputer was finally turned on yesterday.

SpiNNaker is not the first supercomputer to incorporate one million processor cores, but it is still remarkable because it is designed to mimic the human brain. Most computers send information from one point to another through a standard network. SpiNNaker sends small bits of information to thousands of points, similar to how neurons pass chemicals and electrical signals through the brain. SpiNNaker uses electronic circuits to imitate neurons. SpiNNaker has so far been used to mimic the processing of more isolated brain networks like the cortex. It has also been used to control SpOmnibot, a robot that processes visual information and navigates towards its targets.
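The neuron models SpiNNaker simulates belong to the spiking family, of which the leaky integrate-and-fire neuron is the simplest member. The sketch below is a toy illustration of that model, not SpiNNaker's actual implementation; the parameters are invented:

```python
# Toy leaky integrate-and-fire neuron (illustrative; not SpiNNaker code).
def simulate_lif(inputs, leak=0.9, threshold=1.0):
    """Return the time steps at which the neuron fires a spike."""
    v, spikes = 0.0, []
    for t, current in enumerate(inputs):
        v = leak * v + current     # membrane potential leaks, then integrates
        if v >= threshold:         # fire and reset, like a biological neuron
            spikes.append(t)
            v = 0.0
    return spikes

print(simulate_lif([0.4] * 10))  # [2, 5, 8]: regular spiking under drive
```

A machine like SpiNNaker runs on the order of a billion such neurons in parallel, with each spike routed as a tiny packet to thousands of destination cores.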
The Internet

'You Can See Almost Everything.' Antarctica Just Became the Best-Mapped Continent on Earth (fortune.com) 110

Antarctica has become the best-mapped continent on Earth with a new high-resolution terrain map showing the ice-covered landmass in unprecedented detail. From a report: According to the scientists at Ohio State University and the University of Minnesota who created the imagery, Antarctica is now the best-mapped continent on Earth. The Reference Elevation Model of Antarctica (REMA) was constructed using hundreds of thousands of satellite images taken between 2009 and 2017, Earther reports. A supercomputer assembled the massive amounts of data, including the elevation of the land over time, and created REMA, an immensely detailed topographical map, with a file size over 150 terabytes. The new map has a resolution of 2 to 8 meters, compared to the usual 1,000 meters, says an Ohio State press release. According to The New York Times, the detail of this new map is the equivalent of being able to see down to a car, or smaller, when before you could only see the whole of Central Park. Scientists now know the elevation of every point of Antarctica, with an error margin of just a few feet.
Programming

Is Julia the Next Big Programming Language? MIT Thinks So, as Version 1.0 Lands (techrepublic.com) 386

Julia, the MIT-created programming language for developers "who want it all", hit its milestone 1.0 release this month -- with MIT highlighting its rapid adoption in the six short years since its launch. From a report: Released in 2012, Julia is designed to combine the speed of C with the usability of Python, the dynamism of Ruby, the mathematical prowess of MATLAB, and the statistical chops of R. "The release of Julia 1.0 signals that Julia is now ready to change the technical world by combining the high-level productivity and ease of use of Python and R with the lightning-fast speed of C++," says MIT professor Alan Edelman. The breadth of Julia's capabilities and ability to spread workloads across hundreds of thousands of processing cores have led to its use for everything from machine learning to large-scale supercomputer simulation. MIT says Julia is the only high-level dynamic programming language in the "petaflop club," having been used to simulate 188 million stars, galaxies, and other astronomical objects on Cori, the world's 10th-most powerful supercomputer. The simulation ran in just 14.6 minutes, using 650,000 Intel Knights Landing Xeon Phi cores to handle 1.5 petaflops (quadrillion floating-point operations per second).
Education

University of Texas is Getting a $60 Million Supercomputer (cnet.com) 88

The University of Texas at Austin will soon be home to one of the most powerful supercomputers in the world. From a report: The National Science Foundation awarded a $60 million grant to the school's Texas Advanced Computing Center, UT Austin and NSF said Wednesday. The supercomputer, named Frontera, is set to become operational roughly a year from now in 2019, and will be "among the most powerful in the world," according to a statement. To be exact, it will be the fifth most powerful in the world, third most powerful in the US, and the most powerful at a university.
AI

IBM Watson Reportedly Recommended Cancer Treatments That Were 'Unsafe and Incorrect' 103

An anonymous reader quotes a report from Gizmodo: Internal company documents from IBM show that medical experts working with the company's Watson supercomputer found "multiple examples of unsafe and incorrect treatment recommendations" when using the software, according to a report from Stat News. According to Stat, those documents provided strong criticism of the Watson for Oncology system, and stated that the "often inaccurate" suggestions made by the product bring up "serious questions about the process for building content and the underlying technology." One example in the documents is the case of a 65-year-old man diagnosed with lung cancer, who also seemed to have severe bleeding. Watson reportedly suggested the man be administered both chemotherapy and the drug "Bevacizumab." But the drug can lead to "severe or fatal hemorrhage," according to a warning on the medication, and therefore shouldn't be given to people with severe bleeding, as Stat points out. A Memorial Sloan Kettering (MSK) Cancer Center spokesperson told Stat that they believed this recommendation was not given to a real patient, and was just a part of system testing.

According to the report, the documents blame the training provided by IBM engineers and by doctors at MSK, which partnered with IBM in 2012 to train Watson to "think" more like a doctor. The documents state that -- instead of feeding real patient data into the software -- the doctors were reportedly feeding Watson data on hypothetical patients, or "synthetic" case data. This would mean it's possible that when other hospitals used the MSK-trained Watson for Oncology, doctors were receiving treatment recommendations guided by MSK doctors' treatment preferences, instead of an AI interpretation of actual patient data. And the results seem to be less than desirable for some doctors.
Communications

China's Quantum Radar Could Detect Stealth Planes, Missiles (popsci.com) 194

hackingbear shares a report from Popular Science: China Electronics Technology Group Corporation (CETC), China's foremost military electronics company, announced that its groundbreaking quantum radar has achieved the capability of tracking high-altitude objects, likely by increasing the coherence time of entangled photons. CETC envisions that its quantum radar will be used in the stratosphere to track objects in "the upper atmosphere and beyond" (including space). The quantum radar can identify the position, radar cross section, speed, and direction of a target, and even "observe" its composition, for example differentiating an actual nuclear warhead from inflatable decoys. [...] Importantly, attempts to spoof the quantum radar would be easily noticed, since any attempt to alter or duplicate the entangled photons would be detected by the radar. The news is an important illustration of a larger trend of Chinese advancement in the new, crucial area of quantum research. Other notable projects in China's quantum technology include the Micius satellite and advances by Alibaba and the University of Science and Technology of China, including a world record of entangling 18 photons (a quantum supercomputer would require about 50 entangled photons) -- such that China arguably leads the world in quantum technologies.
Japan

Japan's Fujitsu and RIKEN Have Dropped the SPARC Processor in Favor of an ARM Design Chip Scaled Up For Supercomputer Performance (ieee.org) 40

Japan's computer giant Fujitsu and RIKEN, the country's largest research institute, have begun field-testing a prototype CPU for a next-generation supercomputer they believe will take the country back to the leading position in global rankings of supercomputer might. From a report: The next-generation machine, dubbed the Post-K supercomputer, follows the two collaborators' development of the 8 petaflops K supercomputer that commenced operations for RIKEN in 2012, and which has since been upgraded to 11 petaflops in application processing speed. Now the aim is to "create the world's highest performing supercomputer," with "up to one hundred times the application execution performance of the K computer," Fujitsu declared in a press release on 21 June. The plan is to install the souped-up machine at the government-affiliated RIKEN around 2021. If the partners achieve those execution speeds, that would place the Post-K machine in exascale territory (one exaflops being a billion billion floating point operations a second). To do this, they have replaced the SPARC64 VIIIfx CPU powering the K computer with a CPU based on the Armv8-A architecture with the 512-bit SVE (Scalable Vector Extension) that's been enhanced for supercomputer use, and which both Fujitsu and RIKEN had a hand in developing. The new design runs on CPUs with 48 cores plus 2 assistant cores for the computational nodes, and with 48 cores plus 4 assistant cores for the I/O and computational nodes. The system structure uses 1 CPU per node, and 384 nodes make up one rack.
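The node and rack figures above translate directly into per-rack core counts, as a quick check:

```python
# Post-K per-rack arithmetic, from the figures in the report.
compute_cores, assist_cores = 48, 2   # computational-node CPU configuration
nodes_per_rack = 384                  # 1 CPU per node, 384 nodes per rack

print(nodes_per_rack * compute_cores, "compute cores per rack")        # 18432
print(nodes_per_rack * (compute_cores + assist_cores), "cores total")  # 19200
```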
Operating Systems

Finally, It's the Year of the Linux... Supercomputer (zdnet.com) 171

Beeftopia writes: From ZDNet: "The latest TOP500 Supercomputer list is out. What's not surprising is that Linux runs on every last one of the world's fastest supercomputers. Linux has dominated supercomputing for years. But Linux only took over supercomputing lock, stock, and barrel in November 2017. That was the first time all of the TOP500 machines were running Linux. Before that, IBM AIX, a Unix variant, was hanging on for dear life low on the list."

An interesting architectural note: "GPUs, not CPUs, now power most of supercomputers' speed."
