Google

Google Reportedly Attains 'Quantum Supremacy' (cnet.com) 93

New submitter Bioblaze shares a report from CNET: Google has reportedly built a quantum computer more powerful than the world's top supercomputers. A Google research paper was temporarily posted online this week, the Financial Times reported Friday; the paper said the quantum computer's processor performed a calculation in just over 3 minutes that would take 10,000 years on IBM's Summit, the world's most powerful commercial computer. Google researchers are throwing around the term "quantum supremacy" as a result, the FT said, because their computer can solve tasks that conventional computers can't. "To our knowledge, this experiment marks the first computation that can only be performed on a quantum processor," the research paper reportedly said.
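For scale, here's a quick back-of-the-envelope check in Python of the speedup those figures imply (reading "just over 3 minutes" as roughly 200 seconds, an assumption on our part):

```python
# Rough speedup implied by the reported figures: ~200 seconds on the quantum
# processor versus an estimated 10,000 years on the Summit supercomputer.
SECONDS_PER_YEAR = 365.25 * 24 * 3600

quantum_seconds = 200                          # "just over 3 minutes"
classical_seconds = 10_000 * SECONDS_PER_YEAR  # 10,000 years on Summit

print(f"Implied speedup: {classical_seconds / quantum_seconds:.2e}x")
# -> Implied speedup: 1.58e+09x, i.e. a billion-fold-plus gap
```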
Math

Two Mathematicians Solve Old Math Riddle, Possibly the Meaning of Life (livescience.com) 93

pgmrdlm shares a report from Live Science: In Douglas Adams' sci-fi series "The Hitchhiker's Guide to the Galaxy," a pair of programmers task the galaxy's largest supercomputer with answering the ultimate question of the meaning of life, the universe and everything. After 7.5 million years of processing, the computer reaches an answer: 42. Only then do the programmers realize that nobody knew the question the program was meant to answer. Now, in this week's most satisfying example of life reflecting art, a pair of mathematicians have used a global network of 500,000 computers to solve a centuries-old math puzzle that just happens to involve that most crucial number: 42.

The question, which goes back to at least 1955 and may have been pondered by Greek thinkers as early as the third century AD, asks, "How can you express every number between 1 and 100 as the sum of three cubes?" Or, put algebraically, how do you solve x^3 + y^3 + z^3 = k, where k equals any whole number from 1 to 100? This deceptively simple stumper is known as a Diophantine equation, named for the ancient mathematician Diophantus of Alexandria, who proposed a similar set of problems about 1,800 years ago. Modern mathematicians who revisited the puzzle in the 1950s quickly found solutions when k equals many of the smaller numbers, but a few particularly stubborn integers soon emerged. The two trickiest numbers, which still had outstanding solutions by the beginning of 2019, were 33 and -- you guessed it -- 42.
Using a computer algorithm to look for solutions to the Diophantine equation with x, y and z values that included every number between positive and negative 99 quadrillion, mathematician Andrew Booker, of the University of Bristol in England, found the solution to 33 after several weeks of computing time.

Since his search turned up no solutions for 42, Booker enlisted the help of Massachusetts Institute of Technology mathematician Andrew Sutherland, who helped him book some time with a worldwide computer network called Charity Engine. "Using this crowdsourced supercomputer and 1 million hours of processing time, Booker and Sutherland finally found an answer to the Diophantine equation where k equals 42," reports Live Science. The answer: (-80538738812075974)^3 + (80435758145817515)^3 + (12602123297335631)^3 = 42.
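The solution was staggeringly hard to find but is trivial to check; Python's arbitrary-precision integers verify it instantly:

```python
# Verifying the Booker-Sutherland sum-of-three-cubes identity for k = 42,
# using the exact values quoted above. Python integers have unlimited
# precision, so no special libraries are needed.
x = -80538738812075974
y = 80435758145817515
z = 12602123297335631

assert x**3 + y**3 + z**3 == 42
print("x^3 + y^3 + z^3 =", x**3 + y**3 + z**3)  # prints 42
```

Finding such a triple is the hard part: the search ranged over quadrillions of candidates, which is why a crowdsourced network and a million hours of compute were needed.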
Supercomputing

University of Texas Announces Fastest Academic Supercomputer In the World (utexas.edu) 31

On Tuesday the University of Texas at Austin launched the fastest supercomputer at any academic facility in the world.

The computer -- named "Frontera" -- is also the fifth most-powerful supercomputer on earth. Slashdot reader aarondubrow quotes their announcement: The Texas Advanced Computing Center (TACC) at The University of Texas is also home to Stampede2, the second fastest supercomputer at any American university. The launch of Frontera solidifies UT Austin among the world's academic leaders in this realm...

Joined by representatives from the National Science Foundation (NSF) -- which funded the system with a $60 million award -- UT Austin, and technology partners Dell Technologies, Intel, Mellanox Technologies, DataDirect Networks, NVIDIA, IBM, CoolIT and Green Revolution Cooling, TACC inaugurated a new era of academic supercomputing with a resource that will help the nation's top researchers explore science at the largest scale and make the next generation of discoveries.

"Scientific challenges demand computing and data at the largest and most complex scales possible. That's what Frontera is all about," said Jim Kurose, assistant director for Computer and Information Science and Engineering at NSF. "Frontera's leadership-class computing capability will support the most computationally challenging science applications that U.S. scientists are working on today."

Frontera has been supporting science applications since June and has already enabled more than three dozen teams to conduct research on a range of topics from black hole physics to climate modeling to drug design, employing simulation, data analysis, and artificial intelligence at a scale not previously possible.

Here are more technical details from the announcement about just how fast this supercomputer really is.
Open Source

Celebrating the 28th Anniversary of the Linux Kernel (androidauthority.com) 60

Exactly 28 years ago today, a 21-year-old student named Linus Torvalds made a fateful announcement on the Usenet newsgroup comp.os.minix.

i-Programmer commemorates today's anniversary with some interesting trivia: Back in 1991 the fledgling operating system didn't have a name, according to Joey Sneddon's 27 Interesting Facts about Linux:

Linux very nearly wasn't called Linux! Linus wanted to call his "hobby" project "Freax" (a combination of "free", "freak" and "Unix"). Thankfully, he was persuaded otherwise by the owner of the server hosting his early code, who happened to prefer the name "Linux" (a combination of "Linus" and "Unix").

One fact I had been unaware of is that the original version of Linux wasn't open source software. It was free, but it was distributed with a license forbidding commercial use or redistribution. With version 0.12, released in 1992, the GPL was adopted, making the code freely available.

Android Authority describes the rest of the revolution: Torvalds announced to the internet that he was working on a project he said was "just a hobby, won't be big and professional." Less than one month later, Torvalds released the Linux kernel to the public. The world hasn't been the same since...

To commemorate the nearly 30 years that Linux has been available, we compiled a shortlist of ways Linux has fundamentally changed our lives.

- Linux-based operating systems are the number-one choice for servers around the world... As of 2015, web analytics and market share company W3Cook estimated that as many as 96.4% of all servers ran Linux or one of its derivatives. No matter the exact number, it's safe to say that the kernel powers nearly the entire web...

- In Oct. 2003, a team of developers forked Android from Linux to run on digital cameras. Nearly 16 years later, it's the single most popular operating system in the world, running on more than 2 billion devices. Even Chrome OS, Android TV, and Wear OS are all forked from Linux. Google isn't the only one to do this either. Samsung's own in-house operating system, Tizen, is forked from Linux as well, and it's even backed by The Linux Foundation.

- Linux has even changed how we study the universe at large. For reasons similar to those that put Linux in cars and supercomputers, NASA uses it for most of the computers aboard the International Space Station. Astronauts use these computers to carry out research and perform tasks related to their assignments. But NASA isn't the only galaxy-studying organization using Linux. The privately owned SpaceX also uses Linux for many of its projects. In 2017, SpaceX sent a Linux-powered supercomputer developed by HP to space and, according to an AMA on Reddit, even the Dragon and Falcon 9 run Linux.

"Without it," the article concludes, "there would be no science or social human development, and we would all still be cave-people."
Security

Cray Is Building a Supercomputer To Manage the US' Nuclear Stockpile (engadget.com) 65

An anonymous reader quotes a report from Engadget: The U.S. Department of Energy (DOE) and National Nuclear Security Administration (NNSA) have announced they've signed a contract with Cray Computing for the NNSA's first exascale supercomputer, "El Capitan." El Capitan's job will be to perform essential functions for the Stockpile Stewardship Program, which supports U.S. national security missions in ensuring the safety, security and effectiveness of the nation's nuclear stockpile in the absence of underground testing. Developed as part of the second phase of the Collaboration of Oak Ridge, Argonne and Livermore (CORAL-2) procurement, the computer will be used to make critical assessments necessary for addressing evolving threats to national security and other issues such as non-proliferation and nuclear counterterrorism.

El Capitan will have a peak performance of more than 1.5 exaflops -- which is 1.5 quintillion calculations per second. It'll run applications 50 times faster than Lawrence Livermore National Laboratory's (LLNL) Sequoia system and 10 times faster than its Sierra system, which is currently the world's second most powerful supercomputer. It'll be four times more energy efficient than Sierra, too. The $600 million El Capitan is expected to go into production by late 2023.
"NNSA is modernizing the Nuclear Security Enterprise to face 21st century threats," said Lisa E Gordon-Hagerty, DOE undersecretary for nuclear security and NNSA administrator. "El Capitan will allow us to be more responsive, innovative and forward-thinking when it comes to maintaining a nuclear deterrent that is second-to-none in a rapidly-evolving threat environment."
Earth

How The Advance Weather Forecast Got Good (npr.org) 80

NPR notes today's "supercomputer-driven" weather modeling can crunch huge amounts of data to accurately forecast the weather a week in advance -- pointing out that "a six-day weather forecast today is as good as a two-day forecast was in the 1970s."

Here are some highlights from their interview with Andrew Blum, author of The Weather Machine: A Journey Inside the Forecast: One of the things that's happened as the scale in the system has shifted to the computers is that it's no longer bound by past experience. It's no longer, the meteorologists say, "Well, this happened in the past, we can expect it to happen again." We're more ready for these new extremes because we're not held down by past expectations...

The models are really a kind of ongoing concern. ... They run ahead in time, and then every six hours or every 12 hours, they compare their own forecast with the latest observations. And so the models in reality are ... sort of dancing together, where the model makes a forecast and it's corrected slightly by the observations that are coming in...
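The "dancing together" Blum describes is data assimilation. As a rough illustration only -- a crude "nudging" scheme, nothing like the variational methods operational centers actually use -- the six-hour correction cycle looks something like this in Python:

```python
# Toy data-assimilation loop: advance the model, then pull the forecast
# partway toward the latest observation each window. Purely illustrative;
# the dynamics, observations and blending weight are all made up.

def model_step(state: float) -> float:
    """Stand-in for a forecast model advancing the state by one window."""
    return state * 1.05

def assimilate(forecast: float, observation: float, weight: float = 0.3) -> float:
    """Blend forecast and observation; weight sets trust in the observation."""
    return (1 - weight) * forecast + weight * observation

state = 10.0
observations = [10.4, 10.9, 11.2, 11.8]  # hypothetical values, one per 6-hour window
for obs in observations:
    state = model_step(state)       # the model runs ahead in time
    state = assimilate(state, obs)  # then is corrected slightly by observations
    print(f"corrected state: {state:.2f}")
```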

It's definitely run by individual nations -- but individual nations with their systems tied together... It's a 150-year-old system of governments collaborating with each other as a global public good... The positive example from last month was with Cyclone Fani in India. And this was a very similar storm to one 20 years ago, in which tens of thousands of people had died. This time around, the forecast came far enough in advance and with enough confidence that the Indian government was able to move a million people out of the way.

Science

How To Evaluate Computers That Don't Quite Exist (sciencemag.org) 27

sciencehabit writes: To gauge the performance of a supercomputer, computer scientists turn to a standard tool: a set of algorithms called LINPACK that tests how fast the machine solves problems with huge numbers of variables. For quantum computers, which might one day solve certain problems that overwhelm conventional computers, no such benchmarking standard exists. One reason is that the computers, which aim to harness the laws of quantum mechanics to accelerate certain computations, are still rudimentary, with radically different designs contending. In some, the quantum bits, or qubits, needed for computation are embodied in the spin of strings of trapped ions, whereas others rely on patches of superconducting metal resonating with microwaves. Comparing the embryonic architectures "is sort of like visiting a nursery school to decide which of the toddlers will become basketball stars," says Scott Aaronson, a computer scientist at the University of Texas in Austin.

Yet researchers are making some of their first attempts to take the measure of quantum computers. Last week, Margaret Martonosi, a computer scientist at Princeton University, and colleagues presented a head-to-head comparison of quantum computers from IBM, Rigetti Computing in Berkeley, California, and the University of Maryland (UMD) in College Park. The UMD machine, which uses trapped ions, ran a majority of 12 test algorithms more accurately than the superconducting machines, the team reported at the International Symposium on Computer Architecture in Phoenix. Christopher Monroe, a UMD physicist and founder of the company IonQ, predicts such comparisons will become the standard. "These toy algorithms give you a simple answer -- did it work or not?" But even Martonosi warns against making too much of the tests. In fact, the analysis underscores how hard it is to compare quantum computers -- which leaves room for designers to choose metrics that put their machines in a favorable light.

United States

US Blacklists More Chinese Tech Companies Over National Security Concerns (nytimes.com) 82

The Trump administration added five Chinese entities to a United States blacklist on Friday, further restricting China's access to American technology and stoking already high tensions as President Trump and President Xi Jinping of China prepare to meet in Japan next week. From a report: The Commerce Department announced that it would add four Chinese companies and one Chinese institute to an "entity list," saying they posed risks to American national security or foreign policy interests [Editor's note: the link may be paywalled; alternative source]. The move essentially bars the entities, which include one of China's leading supercomputer makers, Sugon, and a number of its subsidiaries set up to design microchips, from buying American technology and components without a waiver from the United States government.

The move could all but cripple these Chinese businesses, which rely on American chips and other technology to manufacture advanced electronics. Those added to the entity list also include Higon, Chengdu Haiguang Integrated Circuit, Chengdu Haiguang Microelectronics Technology, and Wuxi Jiangnan Institute of Computing Technology, which lead China's development of high performance computing, some of which is used in military applications like simulating nuclear explosions, the Commerce Department said.
Each of the aforementioned companies does business under a variety of other names.
Math

How a Professor Beat Roulette, Crediting a Non-Existent Supercomputer (thehustle.co) 156

I loved this story. The Hustle remembers how in 1964 a world-renowned medical professor found a way to beat roulette wheels, kicking off a five-year winning streak in which he amassed $1,250,000 ($8,000,000 today). He noticed that at the end of each night, casinos would replace cards and dice with fresh sets -- but the expensive roulette wheels went untouched and often stayed in service for decades before being replaced. Like any other machine, these wheels acquired wear and tear. Jarecki began to suspect that tiny defects -- chips, dents, scratches, unlevel surfaces -- might cause certain wheels to land on certain numbers more frequently than randomocity prescribed. The doctor spent weekends commuting between the operating table and the roulette table, manually recording thousands upon thousands of spins, and analyzing the data for statistical abnormalities. "I [experimented] until I had a rough outline of a system based on the previous winning numbers," he told the Sydney Morning Herald in 1969. "If numbers 1, 2, and 3 won the last 3 rounds, [I could determine] what was most likely to win the next 3...."
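Jarecki's "statistical abnormalities" can be made concrete: with thousands of clocked spins, a chi-squared test flags wheels whose number frequencies stray too far from chance. A minimal sketch (assuming a 37-pocket European wheel and simulated spins rather than Jarecki's actual data; requires SciPy):

```python
# Chi-squared test for roulette-wheel bias on a 37-pocket European wheel.
# The spins below are simulated from a fair wheel, so the test should NOT
# flag bias; a real wheel with a worn pocket would yield a tiny p-value.
import random
from collections import Counter

from scipy.stats import chisquare

random.seed(1)
spins = [random.randrange(37) for _ in range(20_000)]  # stand-in for clocked spins
counts = Counter(spins)
observed = [counts.get(pocket, 0) for pocket in range(37)]

stat, p_value = chisquare(observed)  # null hypothesis: all pockets equally likely
print(f"chi2 = {stat:.1f}, p = {p_value:.3f}")
# A very small p (say, < 0.001) would mark a wheel worth betting on.
```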

With his wife, Carol, he scouted dozens of wheels at casinos around Europe, from Monte Carlo (Monaco), to Divonne-les-Bains (France), to Baden-Baden (Germany). The pair recruited a team of 8 "clockers" who posted up at these venues, sometimes recording as many as 20,000 spins over a month-long period. Then, in 1964, he made his first strike. After establishing which wheels were biased, he secured a £25,000 loan from a Swiss financier and spent 6 months methodically executing his strategy. By the end of the run, he'd netted £625,000 (roughly $6,700,000 today).

Jarecki's victories made headlines in newspapers all over the world, from Kansas to Australia. Everyone wanted his "secret" -- but he knew that if he wanted to replicate the feat, he'd have to conceal his true methodology. So, he concocted a "fanciful tale" for the press: He tallied roulette outcomes daily, then fed the information into an Atlas supercomputer, which told him which numbers to pick. At the time, wrote gambling historian Russell Barnhart in Beating the Wheel, "Computers were looked upon as creatures from outer space... Few persons, including casino managers, were vocationally qualified to distinguish myth from reality." Hiding behind this technological ruse, Jarecki continued to keep tabs on biased tables -- and prepare for his next big move...

In the decades following Jarecki's dominance, casinos invested heavily in monitoring their roulette tables for defects and building wheels less prone to bias. Today, most wheels have gone digital, run by algorithms programmed to favor the house.

Businesses

Hewlett Packard Enterprise To Acquire Supercomputer Maker Cray for $1.3 Billion (anandtech.com) 101

Hewlett Packard Enterprise will be buying the supercomputer maker Cray for roughly $1.3 billion, the companies said this morning. HPE intends to use Cray's knowledge and technology to bolster its own supercomputing and high-performance computing efforts; when the deal closes, it will become the world leader in supercomputing technology. From a report: Cray of course needs no introduction. The current leader in the supercomputing field and founder of supercomputing as we know it, Cray has been a part of the supercomputing landscape since the 1970s. Starting at the time with fully custom systems, in more recent years Cray has morphed into an integrator and scale-out specialist, combining processors from the likes of Intel, AMD, and NVIDIA into supercomputers, and applying its own software, I/O, and interconnect technologies. The timing of the acquisition announcement closely follows other major news from Cray: the company just landed a $600 million US Department of Energy contract to supply the Frontier supercomputer to Oak Ridge National Laboratory in 2021. Frontier is one of two exascale supercomputers Cray is involved in -- on the other, the 2021 Aurora system, Cray is a subcontractor -- and in fact Cray is involved in the only two exascale systems ordered by the US government thus far. So in both a historical and modern context, Cray was and is one of the biggest players in the supercomputing market.
The Courts

Who To Sue When a Robot Loses Your Fortune (bloomberg.com) 201

An anonymous reader shares a report: It all started over lunch at a Dubai restaurant on March 19, 2017. It was the first time 45-year-old Li met Costa, the 49-year-old Italian who's often known by peers in the industry as "Captain Magic." During their meal, Costa described a robot hedge fund that his company, London-based Tyndaris Investments, would soon offer to manage money entirely using AI, or artificial intelligence. Developed by Austria-based AI company 42.cx, the supercomputer named K1 would comb through online sources like real-time news and social media to gauge investor sentiment and make predictions on US stock futures. It would then send instructions to a broker to execute trades, adjusting its strategy over time based on what it had learned.

The idea of a fully automated money manager inspired Li instantly. He met Costa for dinner three days later, saying in an email beforehand that the AI fund "is exactly my kind of thing." Over the following months, Costa shared simulations with Li showing K1 making double-digit returns, although the two now dispute the thoroughness of the back-testing. Li eventually let K1 manage $2.5bn -- $250m of his own cash and the rest leverage from Citigroup. The plan was to double that over time. But Li's affection for K1 waned almost as soon as the computer started trading in late 2017. By February 2018, it was regularly losing money, including over $20m in a single day -- Feb. 14 -- due to a stop-loss order Li's lawyers argue wouldn't have been triggered if K1 were as sophisticated as Costa led him to believe.

AMD

World's Fastest Supercomputer Coming To US in 2021 From Cray, AMD (cnet.com) 89

The "exascale" computing race is getting a new entrant called Frontier, a $600 million machine with Cray and AMD technology that could become the world's fastest when it arrives at Oak Ridge National Laboratory in 2021. From a report: Frontier should be able to perform 1.5 quintillion calculations per second, a level called 1.5 exaflops and enough to claim the performance crown, the Energy Department announced Tuesday. Its speed will be about 10 times faster than that of the current record holder on the Top500 supercomputer ranking, the IBM-built Summit machine, also at Oak Ridge, and should surpass a $500 million, 1-exaflops Cray-Intel supercomputer called Aurora to be built in 2021 at Argonne National Laboratory. There's no guarantee the US will win the race to exascale machines -- those that cross the 1-exaflop threshold -- because China, Japan and France each could have exascale machines in 2020. At stake is more than national bragging rights: It's also about the ability to perform cutting-edge research in areas like genomics, nuclear physics, cosmology, drug discovery, artificial intelligence and climate simulation.
IBM

IBM Halting Sales of Watson AI Tool For Drug Discovery Amid Sluggish Growth (statnews.com) 29

Citing lackluster financial performance, IBM is halting development and sales of a product that uses its Watson AI software to help pharmaceutical companies discover new drugs, news outlet Stat reported on Thursday, citing a person familiar with the company's internal decision-making. From the report: The decision to shut down sales of Watson for Drug Discovery marks the highest-profile retreat in the company's effort to apply artificial intelligence to various areas of health care. Last year, the company scaled back on the hospital side of its business, and it's struggled to develop a reliable tool to assist doctors in treating cancer patients. In a statement, an IBM spokesperson said, "We are focusing our resources within Watson Health to double down on the adjacent field of clinical development where we see an even greater market need for our data and AI capabilities."

Further reading: IBM Pitched Its Watson Supercomputer as a Revolution in Cancer Care. It's Nowhere Close (September 2017); IBM Watson Reportedly Recommended Cancer Treatments That Were 'Unsafe and Incorrect' (July 2018).
China

US Reveals Details of $500 Million Supercomputer (nytimes.com) 60

An anonymous reader quotes a report from The New York Times: The Department of Energy disclosed details on Monday of one of the most expensive computers being built: a $500 million machine based on Intel and Cray technology that may become crucial in a high-stakes technology race between the United States and China (Warning: source may be paywalled; alternative source). The supercomputer, called Aurora, is a retooling of a development effort first announced in 2015 and is scheduled to be delivered to the Argonne National Laboratory near Chicago in 2021. Lab officials predict it will be the first American machine to reach a milestone called "exascale" performance, surpassing a quintillion calculations per second. That's roughly seven times the speed rating of the most powerful system built to date, or 1,000 times faster than the first "petascale" systems that began arriving in 2008. Backers hope the new machines will let researchers create significantly more accurate simulations of phenomena such as drug responses, climate changes, the inner workings of combustion engines and solar panels.

Aurora, which far exceeds the $200 million price for Summit, represents a record government contract for Intel and a test of its continued leadership in supercomputers. The Silicon Valley giant's popular processors -- the calculating engine for nearly all personal computers and server systems -- power most such machines. But additional accelerator chips are considered essential to reach the very highest speeds, and its rival Nvidia has built a sizable business adapting chips first used with video games for use in supercomputers. The version of Aurora announced in 2015 was based on an Intel accelerator chip that the company later discontinued. A revised plan to seek more ambitious performance targets was announced two years later. Features discussed on Monday include unreleased Intel accelerator chips, a version of its standard Xeon processor, new memory and communications technology and a design that packages chips on top of each other to save space and power.

Math

Google Smashes the World Record For Calculating Digits of Pi (wired.co.uk) 132

Pi just got bigger. Google's Compute Engine has calculated the most digits of pi ever, setting a new world record. From a report: Emma Haruka Iwao, who works in high performance computing and programming language communities at Google, used infrastructure powered by Google Cloud to calculate 31.4 trillion digits of pi. The previous world record was set by Peter Trueb in 2016, who calculated pi to 22.4 trillion digits. This is the first time that publicly available cloud software has been used for a pi calculation of this magnitude.

Iwao became fascinated by pi when she learned about it in math class at school. At university, one of her professors, Daisuke Takahashi, was the record holder for the most-calculated digits of pi using a supercomputer. Now, y-cruncher is the software of choice for pi enthusiasts. Created in 2009, y-cruncher is designed to compute mathematical constants like pi to trillions of digits. "You need a pretty big computer to break the world record," says Iwao. "But you can't just do this with a computer from a hardware store, so people have previously built custom machines." In September of 2018, Iwao started to consider how the process of calculating even more digits of pi would work technically. One issue that came up quickly was the amount of data necessary to carry out the calculations and store them -- 170 terabytes, which wouldn't be easily hosted by a single piece of hardware. Rather than building a whole new machine, Iwao used Google Cloud.

Iwao used 25 virtual machines to carry out those calculations. "But instead of clicking that virtual machine button 25 times, I automated it," she explains. "You can do it in a couple of minutes, but if you needed that many computers, it could take days just to get the next ones set up." Iwao ran y-cruncher on those 25 virtual machines, continuously, for 121 days.
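y-cruncher itself is purpose-built, but the flavor of the computation is easy to reproduce at toy scale with Python's mpmath library, which likewise computes constants from fast-converging series. A sketch (ten thousand digits on one machine, not 31.4 trillion across 25):

```python
# Toy-scale pi computation with mpmath; the record run differed in scale
# (and in tooling -- y-cruncher, 25 VMs, 121 days), not in basic idea.
from mpmath import mp

mp.dps = 10_000                  # working precision in decimal digits
pi_str = mp.nstr(mp.pi, 10_000)  # evaluate pi to that many digits
print(pi_str[:52])               # 3.14159265358979323846...
```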

ISS

Computer Servers 'Stranded' in Space (bbc.com) 89

A pair of Hewlett Packard Enterprise servers sent up to the International Space Station in August 2017 as an experiment have still not come back to Earth, three months after their intended return. From a report: Together they make up the Spaceborne Computer, a Linux system that has supercomputer processing power. They were sent up to see how durable they would be in space with minimal specialist treatment. After 530 days, they are still working. Their return flight was postponed after a Russian rocket failed in October 2018. HPE senior content architect Adrian Kasbergen said they may return in June 2019 if there is space on a flight, but "right now they haven't got a ticket." The company is working with NASA to be "computer-ready" for the first manned Mars flight, estimated to take place in about 2030. The company is also working with Elon Musk's SpaceX.
Earth

Extreme CO2 Levels Could Trigger Clouds 'Tipping Point' and 8C of Global Warming (carbonbrief.org) 254

If atmospheric CO2 levels exceed 1,200 parts per million (ppm), it could push the Earth's climate over a "tipping point", finds a new study. This would see clouds that shade large parts of the oceans start to break up. From a report: According to the new paper published in the journal Nature Geoscience, this could trigger a massive 8C rise in global average temperatures -- in addition to the warming from increased CO2. The only similar example of rapid warming of this magnitude in the Earth's recent history is the Paleocene-Eocene Thermal Maximum 55m years ago, when global temperatures increased by 5-8C, driving widespread extinction of species in both the oceans and on land.

However, scientists not involved in the research caution that the results are still speculative and that other complicating factors could influence if or when a tipping point is reached. The threshold identified by the researchers -- a 1,200ppm concentration of atmospheric CO2 -- is three times current CO2 concentrations. If fossil fuel use continues to rapidly expand over the remainder of the century, it is possible levels could get that high. The Representative Concentration Pathways 8.5 scenario (RCP8.5), a very high emissions scenario examined by climate scientists, has the Earth's atmosphere reaching around 1,100ppm by the year 2100. But this would require the world to massively expand coal use and eschew any climate mitigation over the rest of this century.
Further reading: A state-of-the-art supercomputer simulation indicates that a feedback loop between global warming and cloud loss can push Earth's climate past a disastrous tipping point in as little as a century.
AI

The World's Fastest Supercomputer Breaks an AI Record (wired.com) 66

Along America's west coast, the world's most valuable companies are racing to make artificial intelligence smarter. Google and Facebook have boasted of experiments using billions of photos and thousands of high-powered processors. But late last year, a project in eastern Tennessee quietly exceeded the scale of any corporate AI lab. It was run by the US government. From a report: The record-setting project involved the world's most powerful supercomputer, Summit, at Oak Ridge National Lab. The machine captured that crown in June last year, reclaiming the title for the US after five years of China topping the list. As part of a climate research project, the giant computer booted up a machine-learning experiment that ran faster than any before. Summit, which occupies an area equivalent to two tennis courts, used more than 27,000 powerful graphics processors in the project. It tapped their power to train deep-learning algorithms, the technology driving AI's frontier, chewing through the exercise at a rate of a billion billion operations per second, a pace known in supercomputing circles as an exaflop.

"Deep learning has never been scaled to such levels of performance before," says Prabhat, who leads a research group at the National Energy Research Scientific Computing Center at Lawrence Berkeley National Lab. His group collaborated with researchers at Summit's home base, Oak Ridge National Lab. Fittingly, the world's most powerful computer's AI workout was focused on one of the world's largest problems: climate change. Tech companies train algorithms to recognize faces or road signs; the government scientists trained theirs to detect weather patterns like cyclones in the copious output from climate simulations that spool out a century's worth of three-hour forecasts for Earth's atmosphere.

Education

A Supercomputer In a 19th Century Church Is 'World's Most Beautiful Data Center' (vice.com) 62

"Motherboard spoke to the Barcelona Supercomputing Center about how it outfitted a deconsecrated 19th century chapel to host the MareNostrum 4 -- the 25th most powerful supercomputer in the world," writes Slashdot reader dmoberhaus. From the report: Heralded as the "most beautiful data center in the world," the MareNostrum supercomputer came online in 2005, but was originally hosted in a different building at the university. Meaning "our sea" in Latin, the original MareNostrum was capable of performing 42.35 teraflops -- 42.35 trillion operations per second -- making it one of the most powerful supercomputers in Europe at the time. Yet the MareNostrum rightly became known for its aesthetics as much as its computing power. According to Gemma Maspoch, head of communications for Barcelona Supercomputing Center, which oversees the MareNostrum facility, the decision to place the computer in a giant glass box inside a chapel was ultimately for practical reasons.

"We were in need of hundreds of square meters without columns and the capacity to support 44.5 tons of weight," Maspoch told me in an email. "At the time there was not much available space at the university and the only room that satisfied our requirements was the Torre Girona chapel. We did not doubt it for a moment and we installed a supercomputer in it." According to Maspoch, the chapel required relatively few modifications to host the supercomputer, such as reinforcing the soil around the church so that it would hold the computer's weight and designing a glass box that would house the computer and help cool it.
The supercomputer has been beefed up over the years. Most recently, the fourth iteration came online in 2017 "with a peak computing capacity of 11 thousand trillion operations per second (11.15 petaflops)," reports Motherboard. "MareNostrum 4 is spread over 48 server racks comprising a total of 3,456 nodes. A node consists of two Intel chips, each of which has 24 processors."
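Those specs imply totals you can tally directly; a quick check in Python using only the figures quoted above:

```python
# Tallying MareNostrum 4's quoted specs.
racks = 48
nodes = 3_456
chips_per_node = 2
cores_per_chip = 24            # "24 processors" per Intel chip, per the article
peak_flops = 11.15e15          # 11.15 petaflops

total_cores = nodes * chips_per_node * cores_per_chip
print(f"nodes per rack: {nodes // racks}")                        # 72
print(f"total cores: {total_cores:,}")                            # 165,888
print(f"peak per core: {peak_flops / total_cores / 1e9:.1f} GF")  # ~67 gigaflops
```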
Cloud

Is Linux Taking Over The World? (networkworld.com) 243

"2019 just might be the Year of Linux -- the year in which Linux is fully recognized as the powerhouse it has become," writes Network World's "Unix dweeb." The fact is that most people today are using Linux without ever knowing it -- whether on their phones, online when using Google, Facebook, Twitter, GPS devices, and maybe even in their cars, or when using cloud storage for personal or business use. While the presence of Linux on all of these systems may go largely unnoticed by consumers, the role that Linux plays in this market is a sign of how critical it has become. Most IoT and embedded devices -- those small, limited functionality devices that require good security and a small footprint and fill so many niches in our technology-driven lives -- run some variety of Linux, and this isn't likely to change. Instead, we'll just be seeing more devices and a continued reliance on open source to drive them.

According to the Cloud Industry Forum, for the first time, businesses are spending more on cloud than on internal infrastructure. The cloud is taking over the role that data centers used to play, and it's largely Linux that's making the transition so advantageous. Even on Microsoft's Azure, the most popular operating system is Linux. In its first Voice of the Enterprise survey, 451 Research predicted that 60 percent of nearly 1,000 IT leaders surveyed plan to run the majority of their IT off premises by 2019. That equates to a lot of IT efforts relying on Linux. Gartner states that 80 percent of internally developed software is now either cloud-enabled or cloud-native.

The article also cites Linux's use in AI, data lakes, and in the Sierra supercomputer that monitors America's nuclear stockpile, concluding that "In its domination of IoT, cloud technology, supercomputing and AI, Linux is heading into 2019 with a lot of momentum."

And there's even a long list of upcoming Linux conferences...
