United Kingdom

UK To Spend $1.6 Billion on World's Best Climate Supercomputer (bloomberg.com) 126

The U.K. said it will spend 1.2 billion pounds ($1.6 billion) on developing the most powerful weather and climate supercomputer in the world. From a report: The program aims to improve weather and climate modeling by the government forecaster, the Met Office, Business Secretary Alok Sharma said in a statement Monday. The machine will replace the U.K.'s existing supercomputer, which is already one of the 50 most powerful in the world. "Come rain or shine, our significant investment for a new supercomputer will further speed up weather predictions, helping people be more prepared for weather disruption from planning travel journeys to deploying flood defenses," said Sharma, who will preside over the annual round of United Nations climate talks in Glasgow, Scotland, in November. With Britain hosting the year-end climate summit, Prime Minister Boris Johnson is seeking to showcase the U.K.'s leadership in both studying the climate and reducing global greenhouse gas emissions. His government plans to use data generated by the new computer to inform policy as it seeks to spearhead the fight against climate change.
Technology

Toshiba Touts Algorithm That's Faster Than a Supercomputer (bloomberg.com) 35

It's a tantalizing prospect for traders whose success often hinges on microseconds: a desktop PC algorithm that crunches market data faster than today's most advanced supercomputers. Japan's Toshiba says it has the technology to make such rapid-fire calculations a reality -- not quite quantum computing, but perhaps the next best thing. From a report: The claim is being met with a mix of intrigue and skepticism at financial firms in Tokyo and around the world. Toshiba's "Simulated Bifurcation Algorithm" is designed to harness the principles behind quantum computers without requiring the use of such machines, which currently have limited applications and can cost millions of dollars to build and keep at temperatures near absolute zero. Toshiba says its technology, which may also have uses outside finance, runs on PCs made from off-the-shelf components.

"You can just plug it into a server and run it at room temperature," Kosuke Tatsumura, a senior research scientist at Toshiba's Computer & Network Systems Laboratory, said in an interview. The Tokyo-based conglomerate, while best known for its consumer electronics and nuclear reactors, has long conducted research into advanced technologies. Toshiba has said it needs a partner to adopt the algorithm for real-world use, and financial firms have taken notice as they grapple for an edge in markets increasingly dominated by machines. Banks, brokerages and asset managers have all been experimenting with quantum computing, although viable applications are generally considered to be some time away.

Robotics

Scientists Use Stems Cells From Frogs To Build First Living Robots (theguardian.com) 37

Cy Guy writes: Having not learned the lessons of Jurassic Park and the Terminator, scientists from the University of Vermont and Tufts have created "reconfigurable organisms" using stem cells from frogs. But don't worry, the research was funded by the Department of Defense, so I'm sure nothing could possibly go wrong this time. "The robots, which are less than 1mm long, are designed by an 'evolutionary algorithm' that runs on a supercomputer," reports The Guardian. "The program starts by generating random 3D configurations of 500 to 1,000 skin and heart cells. Each design is then tested in a virtual environment, to see, for example, how far it moves when the heart cells are set beating. The best performers are used to spawn more designs, which themselves are then put through their paces."

"Because heart cells spontaneously contract and relax, they behave like miniature engines that drive the robots along until their energy reserves run out," the report adds. "The cells have enough fuel inside them for the robots to survive for a week to 10 days before keeling over."

The findings have been published in the Proceedings of the National Academy of Sciences.
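The evolutionary search The Guardian describes maps onto a textbook loop: generate random designs, score them in simulation, keep the best performers, and mutate them to spawn the next generation. Here is a minimal sketch of that loop; the fitness function is a hypothetical stand-in for the physics simulation the researchers actually ran on a supercomputer, and the layouts here are 1D rather than 3D.

```python
import random

CELL_TYPES = ["skin", "heart"]

def fitness(layout):
    # Hypothetical stand-in: in the real work, each design is scored by how
    # far it moves in a simulated environment when its heart cells beat.
    return sum(cell == "heart" for cell in layout) + random.random()

def mutate(layout, rate=0.02):
    # Randomly flip a small fraction of cells to a new type.
    return [random.choice(CELL_TYPES) if random.random() < rate else cell
            for cell in layout]

def evolve(pop_size=50, n_cells=500, generations=20, keep=10):
    # Start from random layouts of 500 skin/heart cells.
    population = [[random.choice(CELL_TYPES) for _ in range(n_cells)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(population, key=fitness, reverse=True)
        parents = ranked[:keep]                      # best performers survive
        children = [mutate(random.choice(parents))   # spawn mutated offspring
                    for _ in range(pop_size - len(parents))]
        population = parents + children
    return max(population, key=fitness)

best = evolve()
print(sum(c == "heart" for c in best), "heart cells in the best design")
```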
Television

The BBC's 1992 TV Show About VR, 3D TVs With Glasses, and Holographic 3D Screens (youtu.be) 54

dryriver writes: 27 years ago, the BBC's "Tomorrow's World" show broadcast this little gem of a program [currently available on YouTube]. After showing old Red-Cyan Anaglyph movies, Victorian Stereoscopes, lenticular-printed holograms and a monochrome laser hologram projected into a sheet of glass, the presenter shows off a stereoscopic 3D CRT computer display with active shutter glasses. The program then takes us to a laboratory at the Massachusetts Institute of Technology, where a supercomputer is feeding 3D wireframe graphics into the world's first glasses-free holographic 3D display prototype using a Tellurium Dioxide crystal. One of the researchers at the lab predicts that "years from now, advances in LCD technology may make this kind of display cheap enough to use in the home."

A presenter then shows a bulky plastic VR headset larger than an Oculus Rift and explains how VR will let you experience completely computer-generated worlds as if you are there. The presenter notes that 1992 VR headsets may be "too bulky" for the average user, and shows a mockup of much smaller VR glasses about the size of Magic Leap's AR glasses, noting that "these are already in development." What is astonishing about watching this 27-year-old TV broadcast is a) the realization that much of today's stereo 3D tech was already around in some form or another in the early 1990s; b) VR headsets took an incredibly long time to reach the consumer and are still too bulky; and c) almost three decades later, MIT's prototype holographic glasses-free 3D display technology never made its way into consumer hands or households.

Hardware

Russia Joins Race To Make Quantum Dreams a Reality (nature.com) 18

Russia has launched an effort to build a working quantum computer, in a bid to catch up to other countries in the race for practical quantum technologies. From a report: The government will inject around 50 billion roubles (US$790 million) over the next 5 years into basic and applied quantum research carried out at leading Russian laboratories, the country's deputy prime minister, Maxim Akimov, announced on 6 December at a technology forum in Sochi. The windfall is part of a 258-billion-rouble programme for research and development in digital technologies, which the Kremlin has deemed vital for modernizing and diversifying the Russian economy. "This is a real boost," says Aleksey Fedorov, a quantum physicist at the Russian Quantum Center (RQC), a private research facility in Skolkovo near Moscow. "If things work out as planned, this initiative will be a major step towards bringing Russian quantum science to a world-class standard."

[...] The race is on to create quantum computers that outperform classical machines in specific tasks. Prototypes developed by Google and IBM, headquartered in Mountain View, California, and Armonk, New York, respectively, are approaching the limit of classical computer simulation. In October, scientists at Google announced that a quantum processor working on a specific calculation had achieved such a quantum advantage. Russia is far from this milestone. "We're 5 to 10 years behind," says Fedorov. "But there's a lot of potential here, and we follow very closely what's happening abroad." Poor funding has excluded Russian quantum scientists from competing with Google, says Ilya Besedin, an engineer at the National University of Science and Technology in Moscow.

PlayStation (Games)

The Rise and Fall of the PlayStation Supercomputers (theverge.com) 50

"On the 25th anniversary of the original Sony PlayStation, The Verge shares the story of the PlayStation supercomputers," writes Slashdot reader jimminy_cricket. From the report: Dozens of PlayStation 3s sit in a refrigerated shipping container on the University of Massachusetts Dartmouth's campus, sucking up energy and investigating astrophysics. It's a popular stop for tours trying to sell the school to prospective first-year students and their parents, and it's one of the few living legacies of a weird science chapter in PlayStation's history. Those squat boxes, hulking on entertainment systems or dust-covered in the back of a closet, were once coveted by researchers who used the consoles to build supercomputers. With the racks of machines, the scientists were suddenly capable of contemplating the physics of black holes, processing drone footage, or winning cryptography contests. It only lasted a few years before tech moved on, becoming smaller and more efficient. But for that short moment, some of the most powerful computers in the world could be hacked together with code, wire, and gaming consoles. "The game consoles entered the supercomputing scene in 2002 when Sony released a kit called Linux for the PlayStation 2," reports The Verge. Craig Steffen, senior research scientist at the National Center for Supercomputing Applications, and his group hooked up between 60 and 70 PlayStation 2s, wrote some code, and built out a library.

"The PS3 entered the scene in late 2006 with powerful hardware and an easier way to load Linux onto the devices," the report adds. "Researchers would still need to link the systems together, but suddenly, it was possible for them to imagine linking together all of those devices into something that was a game-changer instead of just a proof-of-concept prototype."
Intel

Intel Unveils 7nm Ponte Vecchio GPU Architecture For Supercomputers and AI (hothardware.com) 28

MojoKid writes: Intel has unveiled its first discrete GPU solution that will hit the market in 2020, code-named Ponte Vecchio. Based on 7nm silicon manufacturing and a stacked chiplet design with Intel's Foveros tech, Ponte Vecchio will target HPC markets for supercomputers and AI training in the datacenter. According to HotHardware, Ponte Vecchio will employ a combination of both its Foveros 3D packaging and EMIB (Embedded Multi-die Interconnect Bridge) technologies, along with High Bandwidth Memory (HBM) and Compute Express Link (CXL), which will operate over the newly ratified PCIe 5.0 interface and serve as Ponte Vecchio's high-speed switch fabric connecting all GPU resources. Intel is billing Ponte Vecchio as its first exascale GPU, proving its mettle in the U.S. Department of Energy's (DOE) Aurora supercomputer. Aurora will employ a topology of six Ponte Vecchio GPUs and two Intel Xeon Scalable processors based on Intel's next generation Sapphire Rapids architecture, along with Optane DC Persistent Memory on a single blade. The new supercomputer is scheduled to arrive sometime in 2021.
United States

The World's Fastest Supercomputers Hit Higher Speeds Than Ever With Linux (zdnet.com) 124

An anonymous reader quotes a report from ZDNet: In the latest Top 500 supercomputer ratings, the average speed of these Linux-powered racers is now an astonishing 1.14 petaflops. The fastest of the fast machines haven't changed since the June 2019 Top 500 supercomputer list. Leading the way is Oak Ridge National Laboratory's Summit system, which holds top honors with an HPL result of 148.6 petaflops. This is an IBM-built supercomputer using Power9 CPUs and NVIDIA Tesla V100 GPUs. In a rather distant second place is another IBM machine: Lawrence Livermore National Laboratory's Sierra system. It uses the same chips, but it "only" hit a speed of 94.6 petaflops.

Close behind at No. 3 is the Sunway TaihuLight supercomputer, with an HPL mark of 93.0 petaflops. TaihuLight was developed by China's National Research Center of Parallel Computer Engineering and Technology (NRCPC) and is installed at the National Supercomputing Center in Wuxi. It is powered exclusively by Sunway's SW26010 processors. TaihuLight is followed by the Tianhe-2A (Milky Way-2A). This is a system developed by China's National University of Defense Technology (NUDT). It's deployed at the National Supercomputer Center in China. Powered by Intel Xeon CPUs and Matrix-2000 accelerators, it has a top speed of 61.4 petaflops. Coming in at No. 5 is the Dell-built Frontera, a Dell C6420 system powered by Intel Xeon Platinum processors. It speeds along at 23.5 petaflops and lives at the Texas Advanced Computing Center of the University of Texas. The most powerful new supercomputer on the list is Rensselaer Polytechnic Institute Center for Computational Innovations (CCI)'s AiMOS. It made the list in the 25th position with 8.0 petaflops. The IBM-built system, like Summit and Sierra, is powered by Power9 CPUs and NVIDIA V100 GPUs.
In closing, ZDNet's Steven J. Vaughan-Nichols writes: "Regardless of the hardware, all 500 of the world's fastest supercomputers have one thing in common: They all run Linux."
Advertising

Does Linux Have a Marketing Problem? (hackaday.com) 263

On Hackaday's hosting site Hackaday.io, an electrical engineer with a background in semiconductor physics argues that Linux's small market share is due to a lack of marketing: Not only does [Linux] have dominance when raw computing ability is needed, either in a supercomputer or a webserver, but it must have some ability to effectively work as a personal computer as well, otherwise Android wouldn't be so popular on smartphones and tablets. From there it follows that the only reason that Microsoft and Apple dominate the desktop world is because they have a marketing group behind their products, which provides customers with a comfortable customer service layer between themselves and the engineers and programmers at those companies, and also drowns out the message that Linux even exists in the personal computing realm...

Part of the problem too is that Linux and most of its associated software is free and open source. What is often a strength when it comes to the quality of software and its flexibility and customizability becomes a weakness when there's no revenue coming in to actually fund a marketing group that would be able to address this core communications issue between potential future users and the creators of the software. Canonical, Red Hat, SUSE and others have all had varying degrees of success, but this illustrates another problem: the splintered nature of open-source software causes a fragmenting not just in the software itself but in the resources behind it. Imagine if there were hundreds of different versions of macOS that all Apple users had to learn about and then decide which one was the best for their needs...

I have been using Linux exclusively since I ditched XP for 5.10 Breezy Badger and would love to live in a world where I'm not forced into the corporate hellscape of a Windows environment every day for no other reason than most people already know how to use Windows. With a cohesive marketing strategy, I think this could become a reality, but it won't happen through passionate essays on "free as in freedom" or the proper way to pronounce "GNU" or the benefits of using Gentoo instead of Arch. It'll only come if someone can unify all the splintered groups around a cohesive, simple message and market it to the public.

Google

Quantum Supremacy From Google? Not So Fast, Says IBM. (technologyreview.com) 80

IBM is disputing the much-vaunted claim that Google has hit a new milestone. From a report: A month ago, news broke that Google had reportedly achieved "quantum supremacy": it had gotten a quantum computer to run a calculation that would take a classical computer an unfeasibly long time. While the calculation itself -- essentially, a very specific technique for outputting random numbers -- is about as useful as the Wright brothers' 12-second first flight, it would be a milestone of similar significance, marking the dawn of an entirely new era of computing. But in a blog post published this week, IBM disputes Google's claim. The task that Google says might take the world's fastest classical supercomputer 10,000 years can actually, says IBM, be done in just days.

As John Preskill, the Caltech physicist who coined the term "quantum supremacy," wrote in an article for Quanta magazine, Google specifically chose a very narrow task that a quantum computer would be good at and a classical computer is bad at. "This quantum computation has very little structure, which makes it harder for the classical computer to keep up, but also means that the answer is not very informative," he wrote. Google's research paper hasn't been published, but a draft was leaked online last month. In it, researchers say they got a machine with 53 quantum bits, or qubits, to do the calculation in 200 seconds. They also estimated that it would take the world's most powerful supercomputer, the Summit machine at Oak Ridge National Laboratory, 10,000 years to repeat it with equal "fidelity," or the same level of uncertainty as the inherently uncertain quantum system.

Oracle

Oracle's New Supercomputer Has 1,060 Raspberry Pis (tomshardware.com) 71

An anonymous reader quotes Tom's Hardware: One Raspberry Pi can make a nice web server, but what happens if you put more than 1,000 of them together? At Oracle's OpenWorld convention on Monday, the company showed off a Raspberry Pi Supercomputer that combines 1,060 Raspberry Pis into one powerful cluster.

According to ServeTheHome, which first reported the story, the supercomputer features scores of racks with 21 Raspberry Pi 3 B+ boards each. To make everything run well together, the system runs on Oracle Autonomous Linux... Every unit connects to a single rebranded Supermicro 1U Xeon server, which functions as a central storage server for the whole supercomputer. The Oracle team also created custom, 3D printed brackets to help support all the Pis and connecting components...

ServeTheHome asked Oracle why it chose to create a cluster of Raspberry Pis instead of using a virtualized Arm server and one company rep said simply that "...a big cluster is cool."

Google

Google Reportedly Attains 'Quantum Supremacy' (cnet.com) 93

New submitter Bioblaze shares a report from CNET: Google has reportedly built a quantum computer more powerful than the world's top supercomputers. A Google research paper was temporarily posted online this week, the Financial Times reported Friday, and said the quantum computer's processor allowed a calculation to be performed in just over 3 minutes. That calculation would take 10,000 years on IBM's Summit, the world's most powerful commercial computer, Google reportedly said. Google researchers are throwing around the term "quantum supremacy" as a result, the FT said, because their computer can solve tasks that can't otherwise be solved. "To our knowledge, this experiment marks the first computation that can only be performed on a quantum processor," the research paper reportedly said.
Math

Two Mathematicians Solve Old Math Riddle, Possibly the Meaning of Life (livescience.com) 93

pgmrdlm shares a report from Live Science: In Douglas Adams' sci-fi series "The Hitchhiker's Guide to the Galaxy," a pair of programmers task the galaxy's largest supercomputer with answering the ultimate question of the meaning of life, the universe and everything. After 7.5 million years of processing, the computer reaches an answer: 42. Only then do the programmers realize that nobody knew the question the program was meant to answer. Now, in this week's most satisfying example of life reflecting art, a pair of mathematicians have used a global network of 500,000 computers to solve a centuries-old math puzzle that just happens to involve that most crucial number: 42.

The question, which goes back to at least 1955 and may have been pondered by Greek thinkers as early as the third century AD, asks, "How can you express every number between 1 and 100 as the sum of three cubes?" Or, put algebraically, how do you solve x^3 + y^3 + z^3 = k, where k equals any whole number from 1 to 100? This deceptively simple stumper is known as a Diophantine equation, named for the ancient mathematician Diophantus of Alexandria, who proposed a similar set of problems about 1,800 years ago. Modern mathematicians who revisited the puzzle in the 1950s quickly found solutions when k equals many of the smaller numbers, but a few particularly stubborn integers soon emerged. The two trickiest numbers, which still had outstanding solutions by the beginning of 2019, were 33 and -- you guessed it -- 42.
Using a computer algorithm to look for solutions to the Diophantine equation with x, y and z values that included every number between positive and negative 99 quadrillion, mathematician Andrew Booker, of the University of Bristol in England, found the solution to 33 after several weeks of computing time.

Since his search turned up no solutions for 42, Booker enlisted the help of Massachusetts Institute of Technology mathematician Andrew Sutherland, who helped him book some time with a worldwide computer network called Charity Engine. "Using this crowdsourced supercomputer and 1 million hours of processing time, Booker and Sutherland finally found an answer to the Diophantine equation where k equals 42," reports Live Science. The answer: (-80538738812075974)^3 + (80435758145817515)^3 + (12602123297335631)^3 = 42.
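For small k and modest search bounds, the sum-of-three-cubes problem can be attacked with a naive search. The sketch below only illustrates the problem statement; it would never reach 33 or 42, whose solutions involve 17-digit numbers and required the specialized algorithms and crowdsourced compute described above.

```python
def icbrt(n):
    """Integer cube root (exact floor/ceil adjustment), handling negative n."""
    if n < 0:
        return -icbrt(-n)
    r = round(n ** (1 / 3))
    while r ** 3 > n:
        r -= 1
    while (r + 1) ** 3 <= n:
        r += 1
    return r

def three_cubes(k, bound=500):
    """Brute-force search for x^3 + y^3 + z^3 = k with |x|, |y| <= bound."""
    for x in range(-bound, bound + 1):
        for y in range(x, bound + 1):
            z = icbrt(k - x ** 3 - y ** 3)
            if x ** 3 + y ** 3 + z ** 3 == k:
                return x, y, z
    return None

print(three_cubes(29))   # 29 has small solutions, e.g. 1^3 + 1^3 + 3^3
```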
Supercomputing

University of Texas Announces Fastest Academic Supercomputer In the World (utexas.edu) 31

On Tuesday the University of Texas at Austin launched the fastest supercomputer at any academic facility in the world.

The computer -- named "Frontera" -- is also the fifth most-powerful supercomputer on earth. Slashdot reader aarondubrow quotes their announcement: The Texas Advanced Computing Center (TACC) at The University of Texas is also home to Stampede2, the second fastest supercomputer at any American university. The launch of Frontera solidifies UT Austin among the world's academic leaders in this realm...

Joined by representatives from the National Science Foundation (NSF) -- which funded the system with a $60 million award -- UT Austin, and technology partners Dell Technologies, Intel, Mellanox Technologies, DataDirect Networks, NVIDIA, IBM, CoolIT and Green Revolution Cooling, TACC inaugurated a new era of academic supercomputing with a resource that will help the nation's top researchers explore science at the largest scale and make the next generation of discoveries.

"Scientific challenges demand computing and data at the largest and most complex scales possible. That's what Frontera is all about," said Jim Kurose, assistant director for Computer and Information Science and Engineering at NSF. "Frontera's leadership-class computing capability will support the most computationally challenging science applications that U.S. scientists are working on today."

Frontera has been supporting science applications since June and has already enabled more than three dozen teams to conduct research on a range of topics from black hole physics to climate modeling to drug design, employing simulation, data analysis, and artificial intelligence at a scale not previously possible.

Here are more technical details from the announcement about just how fast this supercomputer really is.
Open Source

Celebrating the 28th Anniversary of the Linux Kernel (androidauthority.com) 60

Exactly 28 years ago today, a 21-year-old student named Linus Torvalds made a fateful announcement on the Usenet newsgroup comp.os.minix.

i-Programmer commemorates today's anniversary with some interesting trivia: Back in 1991 the fledgling operating system didn't have a name, according to Joey Sneddon's 27 Interesting Facts about Linux:

Linux very nearly wasn't called Linux! Linus wanted to call his "hobby" project "FreaX" (a combination of "free", "freak" and "Unix"). Thankfully, he was persuaded otherwise by the owner of the server hosting his early code, who happened to prefer the name "Linux" (a combination of "Linus" and "Unix").

One fact I had been unaware of is that the original version of Linux wasn't open source software. It was free but was distributed with a license forbidding commercial use or redistribution. However, for version 0.12, released in 1992, the GPL was adopted, making the code freely available.

Android Authority describes the rest of the revolution: Torvalds announced to the internet that he was working on a project he said was "just a hobby, won't be big and professional." Less than one month later, Torvalds released the Linux kernel to the public. The world hasn't been the same since...

To commemorate the nearly 30 years that Linux has been available, we compiled a shortlist of ways Linux has fundamentally changed our lives.

- Linux-based operating systems are the number-one choice for servers around the world... As of 2015, web analytics and market share company W3Cook estimated that as many as 96.4% of all servers ran Linux or one of its derivatives. No matter the exact number, it's safe to say that the kernel powers nearly the entire web...

- In Oct. 2003, a team of developers forked Android from Linux to run on digital cameras. Nearly 16 years later, it's the single most popular operating system in the world, running on more than 2 billion devices. Even Chrome OS, Android TV, and Wear OS are all forked from Linux. Google isn't the only one to do this either. Samsung's own in-house operating system, Tizen, is forked from Linux as well, and it's even backed by The Linux Foundation.

- Linux has even changed how we study the universe at large. For reasons similar to why cars and supercomputers use Linux, NASA uses it for most of the computers aboard the International Space Station. Astronauts use these computers to carry out research and perform tasks related to their assignments. But NASA isn't the only galaxy-studying organization using Linux. The privately owned SpaceX also uses Linux for many of its projects. In 2017, SpaceX sent a Linux-powered supercomputer developed by HP to space and, according to an AMA on Reddit, even the Dragon and Falcon 9 run Linux.

"Without it," the article concludes, "there would be no science or social human development, and we would all still be cave-people."
Security

Cray Is Building a Supercomputer To Manage the US' Nuclear Stockpile (engadget.com) 65

An anonymous reader quotes a report from Engadget: The U.S. Department of Energy (DOE) and National Nuclear Security Administration (NNSA) have announced they've signed a contract with Cray Computing for the NNSA's first exascale supercomputer, "El Capitan." El Capitan's job will be to perform essential functions for the Stockpile Stewardship Program, which supports U.S. national security missions in ensuring the safety, security and effectiveness of the nation's nuclear stockpile in the absence of underground testing. Developed as part of the second phase of the Collaboration of Oak Ridge, Argonne and Livermore (CORAL-2) procurement, the computer will be used to make critical assessments necessary for addressing evolving threats to national security and other issues such as non-proliferation and nuclear counterterrorism.

El Capitan will have a peak performance of more than 1.5 exaflops -- which is 1.5 quintillion calculations per second. It'll run applications 50 times faster than Lawrence Livermore National Laboratory's (LLNL) Sequoia system and 10 times faster than its Sierra system, which is currently the world's second most powerful supercomputer. It'll be four times more energy efficient than Sierra, too. The $600 million El Capitan is expected to go into production by late 2023.
"NNSA is modernizing the Nuclear Security Enterprise to face 21st century threats," said Lisa E Gordon-Hagerty, DOE undersecretary for nuclear security and NNSA administrator. "El Capitan will allow us to be more responsive, innovative and forward-thinking when it comes to maintaining a nuclear deterrent that is second-to-none in a rapidly-evolving threat environment."
Earth

How The Advance Weather Forecast Got Good (npr.org) 80

NPR notes today's "supercomputer-driven" weather modelling can crunch huge amounts of data to accurately forecast the weather a week in advance -- pointing out that "a six-day weather forecast today is as good as a two-day forecast was in the 1970s."

Here are some highlights from their interview with Andrew Blum, author of The Weather Machine: A Journey Inside the Forecast: One of the things that's happened as the scale in the system has shifted to the computers is that it's no longer bound by past experience. It's no longer, the meteorologists say, "Well, this happened in the past, we can expect it to happen again." We're more ready for these new extremes because we're not held down by past expectations...

The models are really a kind of ongoing concern. ... They run ahead in time, and then every six hours or every 12 hours, they compare their own forecast with the latest observations. And so the models in reality are ... sort of dancing together, where the model makes a forecast and it's corrected slightly by the observations that are coming in...
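Blum's image of models and observations "dancing together" is essentially a description of data assimilation. As a toy illustration (not any forecasting center's actual scheme), here is a sketch of a forecast/correction cycle in which each model step is nudged toward the latest observation with a fixed gain.

```python
def assimilation_cycle(state, model_step, observations, gain=0.1):
    """Toy forecast/correction loop: run the model ahead, then nudge the
    forecast toward each incoming observation. The fixed `gain` is a crude
    stand-in for the weights a real data-assimilation system computes."""
    history = []
    for obs in observations:              # e.g. one observation per 6-hour window
        forecast = model_step(state)      # run the model ahead in time
        state = forecast + gain * (obs - forecast)  # correct with the observation
        history.append(state)
    return history

# Example with a trivial "model" that decays toward zero.
print(assimilation_cycle(10.0, lambda x: 0.9 * x, [8.5, 7.0, 6.2, 5.9]))
```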

It's definitely run by individual nations -- but individual nations with their systems tied together... It's a 150-year-old system of governments collaborating with each other as a global public good... The positive example from last month was with Cyclone Fani in India. And this was a very similar storm to one 20 years ago, in which tens of thousands of people had died. This time around, the forecast came far enough in advance and with enough confidence that the Indian government was able to move a million people out of the way.

Science

How To Evaluate Computers That Don't Quite Exist (sciencemag.org) 27

sciencehabit writes: To gauge the performance of a supercomputer, computer scientists turn to a standard tool: a set of algorithms called LINPACK that tests how fast the machine solves dense systems of linear equations with huge numbers of variables. For quantum computers, which might one day solve certain problems that overwhelm conventional computers, no such benchmarking standard exists. One reason is that the computers, which aim to harness the laws of quantum mechanics to accelerate certain computations, are still rudimentary, with radically different designs contending. In some, the quantum bits, or qubits, needed for computation are embodied in the spin of strings of trapped ions, whereas others rely on patches of superconducting metal resonating with microwaves. Comparing the embryonic architectures "is sort of like visiting a nursery school to decide which of the toddlers will become basketball stars," says Scott Aaronson, a computer scientist at the University of Texas at Austin.
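The LINPACK approach mentioned above boils down to timing how fast a machine solves a dense linear system and converting that into a floating-point operation rate. A toy version of the measurement (nothing like the heavily tuned HPL code the real rankings use) might look like this:

```python
import time
import numpy as np

def toy_linpack(n=2000):
    """Time the solution of a random dense n x n system A x = b and report an
    approximate flop rate, in the spirit of the LINPACK/HPL benchmark."""
    rng = np.random.default_rng(0)
    A = rng.standard_normal((n, n))
    b = rng.standard_normal(n)
    t0 = time.perf_counter()
    x = np.linalg.solve(A, b)              # LU factorization plus triangular solves
    elapsed = time.perf_counter() - t0
    flops = (2 / 3) * n ** 3 + 2 * n ** 2  # standard operation count for this solve
    print(f"n={n}: {elapsed:.3f} s, ~{flops / elapsed / 1e9:.2f} GFLOP/s")
    return x

toy_linpack()
```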

Yet researchers are making some of their first attempts to take the measure of quantum computers. Last week, Margaret Martonosi, a computer scientist at Princeton University, and colleagues presented a head-to-head comparison of quantum computers from IBM, Rigetti Computing in Berkeley, California, and the University of Maryland (UMD) in College Park. The UMD machine, which uses trapped ions, ran a majority of 12 test algorithms more accurately than the other machines, which use superconducting qubits, the team reported at the International Symposium on Computer Architecture in Phoenix. Christopher Monroe, a UMD physicist and founder of the company IonQ, predicts such comparisons will become the standard. "These toy algorithms give you a simple answer -- did it work or not?" But even Martonosi warns against making too much of the tests. In fact, the analysis underscores how hard it is to compare quantum computers -- which leaves room for designers to choose metrics that put their machines in a favorable light.

United States

US Blacklists More Chinese Tech Companies Over National Security Concerns (nytimes.com) 82

The Trump administration added five Chinese entities to a United States blacklist on Friday, further restricting China's access to American technology and stoking already high tensions as President Trump and President Xi Jinping of China prepare to meet in Japan next week. From a report: The Commerce Department announced that it would add four Chinese companies and one Chinese institute to an "entity list," saying they posed risks to American national security or foreign policy interests [Editor's note: the link may be paywalled; alternative source]. The move essentially bars the entities, which include one of China's leading supercomputer makers, Sugon, and a number of its subsidiaries set up to design microchips, from buying American technology and components without a waiver from the United States government.

The move could all but cripple these Chinese businesses, which rely on American chips and other technology to manufacture advanced electronics. Those added to the entity list also include Higon, Chengdu Haiguang Integrated Circuit, Chengdu Haiguang Microelectronics Technology, and Wuxi Jiangnan Institute of Computing Technology, which lead China's development of high performance computing, some of which is used in military applications like simulating nuclear explosions, the Commerce Department said.
Each of the aforementioned companies does business under a variety of other names.
Math

How a Professor Beat Roulette, Crediting a Non-Existent Supercomputer (thehustle.co) 156

I loved this story. The Hustle remembers how in 1964 a world-renowned medical professor, Richard Jarecki, found a way to beat roulette wheels, kicking off a five-year winning streak in which he amassed $1,250,000 ($8,000,000 today). He noticed that at the end of each night, casinos would replace cards and dice with fresh sets -- but the expensive roulette wheels went untouched and often stayed in service for decades before being replaced. Like any other machine, these wheels acquired wear and tear. Jarecki began to suspect that tiny defects -- chips, dents, scratches, unlevel surfaces -- might cause certain wheels to land on certain numbers more frequently than randomness prescribed. The doctor spent weekends commuting between the operating table and the roulette table, manually recording thousands upon thousands of spins, and analyzing the data for statistical abnormalities. "I [experimented] until I had a rough outline of a system based on the previous winning numbers," he told the Sydney Morning Herald in 1969. "If numbers 1, 2, and 3 won the last 3 rounds, [I could determine] what was most likely to win the next 3...."

With his wife, Carol, he scouted dozens of wheels at casinos around Europe, from Monte Carlo (Monaco), to Divonne-les-Bains (France), to Baden-Baden (Germany). The pair recruited a team of 8 "clockers" who posted up at these venues, sometimes recording as many as 20,000 spins over a month-long period. Then, in 1964, he made his first strike. After establishing which wheels were biased, he secured a £25,000 loan from a Swiss financier and spent 6 months candidly exacting his strategy. By the end of the run, he'd netted £625,000 (roughly $6,700,000 today).
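The article doesn't say exactly which statistics Jarecki ran on his clockers' data, but a standard way to flag a biased wheel from recorded spins is a chi-square goodness-of-fit test against the uniform distribution a fair wheel should produce. A minimal sketch, assuming a 37-pocket European wheel:

```python
from collections import Counter
from scipy.stats import chisquare

def wheel_bias_test(spins, pockets=37):
    """Chi-square test: do recorded winning numbers deviate from a fair wheel?
    `spins` is a list of winning pocket numbers (0-36 on a European wheel).
    A small p-value suggests the wheel may be biased."""
    counts = Counter(spins)
    observed = [counts.get(p, 0) for p in range(pockets)]
    expected = [len(spins) / pockets] * pockets
    statistic, p_value = chisquare(observed, f_exp=expected)
    return statistic, p_value
```

In practice one would need many thousands of spins per wheel, much as the clockers collected, before a deviation large enough to overcome the house edge could be detected with any confidence.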

Jarecki's victories made headlines in newspapers all over the world, from Kansas to Australia. Everyone wanted his "secret" -- but he knew that if he wanted to replicate the feat, he'd have to conceal his true methodology. So, he concocted a "fanciful tale" for the press: He tallied roulette outcomes daily, then fed the information into an Atlas supercomputer, which told him which numbers to pick. At the time, wrote gambling historian Russell Barnhart in Beating the Wheel, "Computers were looked upon as creatures from outer space... Few persons, including casino managers, were vocationally qualified to distinguish myth from reality." Hiding behind this technological ruse, Jarecki continued to keep tabs on biased tables -- and prepare for his next big move...

In the decades following Jarecki's dominance, casinos invested heavily in monitoring their roulette tables for defects and building wheels less prone to bias. Today, most wheels have gone digital, run by algorithms programmed to favor the house.
