Supercomputing

Nvidia CEO: Quantum Computers Won't Be Very Useful for Another 20 Years (pcmag.com) 48

Nvidia CEO Jensen Huang said quantum computers won't be very useful for another 20 years, sending stocks in the emerging sector plunging more than 40% for a total market value loss of over $8 billion. "If you kind of said 15 years for very useful quantum computers, that'd probably be on the early side. If you said 30, it's probably on the late side. But if you picked 20, I think a whole bunch of us would believe it," Huang said during a Q&A with analysts. PCMag reports: The field of quantum computing hasn't gotten nearly as much hype in the past few years as generative AI and the tech giants promoting it. Part of the reason quantum computers aren't currently that helpful is their error rates. Nord Quantique CEO Julien Lemyre previously told PCMag that quantum error correction is the future of the field, and his firm is working on a solution. The errors that qubits, the basic units of information in a quantum machine, currently make render quantum computers largely unhelpful. It's an essential hurdle to overcome -- but we don't currently know if or when quantum errors will be eliminated.

Chris Erven, CEO and co-founder of Kets Quantum, believes quantum computers will eventually pose a significant threat to cybersecurity. "China is making some of the largest investments in quantum computing, pumping in billions of dollars into research and development in the hope of being the first to create a large-scale, cryptographically relevant machine," Erven tells PCMag in a statement. "Although they may be a few years away from being fully operational, we know a quantum computer will be capable of breaking all traditional cyber defenses we currently use. So they, and others, are actively harvesting now, to decrypt later."
"The 15 to 20-year timeline seems very realistic," said Ivana Delevska, investment chief of Spear Invest, which holds Rigetti and IonQ shares in an actively managed ETF. "That is roughly what it took Nvidia to develop accelerated computing."
Supercomputing

Microsoft, Atom Computing Leap Ahead On the Quantum Frontier With Logical Qubits (geekwire.com) 18

An anonymous reader quotes a report from GeekWire: Microsoft and Atom Computing say they've reached a new milestone in their effort to build fault-tolerant quantum computers that can show an advantage over classical computers. Microsoft says it will start delivering the computers' quantum capabilities to customers by the end of 2025, with availability via the Azure cloud service as well as through on-premises hardware. "Together, we are co-designing and building what we believe will be the world's most powerful quantum machine," Jason Zander, executive vice president at Microsoft, said in a LinkedIn posting.

Like other players in the field, Microsoft's Azure Quantum team and Atom Computing aim to capitalize on the properties of quantum systems -- where quantum bits, also known as qubits, can process multiple values simultaneously. That's in contrast to classical systems, which process definite ones and zeros when executing algorithms. Microsoft has been working with Colorado-based Atom Computing on hardware that uses the nuclear spin properties of neutral ytterbium atoms to run quantum calculations. One of the big challenges is to create a system that can correct the errors that turn up during the calculations due to quantum noise. The solution typically involves knitting together "physical qubits" to produce an array of "logical qubits" that can correct themselves.
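To make the contrast concrete (standard textbook notation, not taken from the report): a single qubit's state is a weighted combination of both classical values, and a register of n qubits carries 2^n complex amplitudes at once.

\[ |\psi\rangle = \alpha|0\rangle + \beta|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1 \]

\[ |\Psi\rangle = \sum_{x \in \{0,1\}^n} c_x\,|x\rangle, \qquad \sum_x |c_x|^2 = 1 \]

A classical n-bit register, by contrast, holds exactly one of those 2^n values at any moment; exploiting the amplitudes without collapsing them is what quantum algorithms are designed to do.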

In a paper posted to the arXiv preprint server, members of the research team say they were able to connect 256 noisy neutral-atom qubits using Microsoft's qubit-virtualization system in such a way as to produce a system with 24 logical qubits. "This represents the highest number of entangled logical qubits on record," study co-author Krysta Svore, vice president of advanced quantum development for Microsoft Azure Quantum, said today in a blog posting. "Entanglement of the qubits is evidenced by their error rates being significantly below the 50% threshold for entanglement." Twenty of the system's logical qubits were used to perform successful computations based on the Bernstein-Vazirani algorithm, which is used as a benchmark for quantum calculations. "The logical qubits were able to produce a more accurate solution than the corresponding computation based on physical qubits," Svore said. "The ability to compute while detecting and correcting errors is a critical component to scaling to achieve scientific quantum advantage."
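For context on the benchmark: Bernstein-Vazirani hides an n-bit string s inside an oracle and recovers it with a single query, and the ideal output is deterministic, which makes error rates easy to read off. A minimal statevector simulation in plain NumPy (an illustrative sketch, independent of Microsoft's or Atom Computing's stack) shows the effect:

import numpy as np

def hadamard_all(state):
    # Apply a Hadamard gate to every qubit (fast Walsh-Hadamard transform).
    out = state.copy()
    h = 1
    while h < len(out):
        for i in range(0, len(out), 2 * h):
            for j in range(i, i + h):
                a, b = out[j], out[j + h]
                out[j], out[j + h] = a + b, a - b
        h *= 2
    return out / np.sqrt(len(out))

n, s = 4, 0b1011                      # 4 qubits, hidden string s = 1011
state = np.zeros(2 ** n)
state[0] = 1.0                        # start in |0000>
state = hadamard_all(state)           # uniform superposition
for x in range(2 ** n):               # oracle: phase (-1)^(s.x) via kickback
    state[x] *= (-1) ** bin(x & s).count("1")
state = hadamard_all(state)           # interference concentrates weight on |s>
print(format(int(np.argmax(np.abs(state))), "04b"))   # prints 1011

On a noiseless machine the final measurement yields s every time, so any spread in the measured outcomes directly reflects hardware error, which is why the result of running it on logical rather than physical qubits is a meaningful comparison.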

Supercomputing

'El Capitan' Ranked Most Powerful Supercomputer In the World (engadget.com) 44

Lawrence Livermore National Laboratory's "El Capitan" supercomputer is now ranked as the world's most powerful, posting a High-Performance Linpack (HPL) score of 1.742 exaflops on the latest Top500 list. Engadget reports: El Capitan is only the third "exascale" computer, meaning it can perform more than a quintillion calculations in a second. The other two, called Frontier and Aurora, claim the second and third place slots on the Top500 now. Unsurprisingly, all of these massive machines live within government research facilities: El Capitan is housed at Lawrence Livermore National Laboratory; Frontier is at Oak Ridge National Laboratory; Argonne National Laboratory claims Aurora. [Cray Computing] had a hand in all three systems.

El Capitan has more than 11 million combined CPU and GPU cores, which come from AMD Instinct MI300A APUs that pair 24-core 4th-gen EPYC processors, rated at 1.8GHz each, with integrated GPUs. It's also relatively efficient, as such systems go, squeezing out an estimated 58.89 gigaflops per watt. If you're wondering what El Capitan is built for, the answer is addressing nuclear stockpile safety, but it can also be used for nuclear counterterrorism.
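As a quick sanity check, the Top500 performance and efficiency figures together imply the machine's approximate power draw:

\[ \frac{1.742 \times 10^{18}\ \text{FLOP/s}}{58.89 \times 10^{9}\ \text{FLOP/s per watt}} \approx 2.96 \times 10^{7}\ \text{W} \approx 30\ \text{MW} \]

That is roughly the electricity demand of a small city, which is why flops-per-watt has become as closely watched a metric as raw flops.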

Supercomputing

With First Mechanical Qubit, Quantum Computing Goes Steampunk (science.org) 14

An anonymous reader quotes a report from Science Magazine: Qubits, the strange devices at the heart of a quantum computer that can be set to 0, 1, or both at once, could hardly be more different from the mechanical clockwork used in the earliest computers. Today, most quantum computers rely on qubits made out of tiny circuits of superconducting metal, individual ions, photons, or other things. But now, physicists have made a working qubit from a tiny, moving machine, an advance that echoes back to the early 20th century when the first computers employed mechanical switches. "For many years, people were thinking it would be impossible to make a qubit from a mechanical system," says Adrian Bachtold, a condensed matter physicist at the Institute of Photonic Sciences who was not involved in the work, published today in Science. Stephan Dürr, a quantum physicist at the Max Planck Institute for Quantum Optics, says the result "puts a new system on the map," which could be used in other experiments -- and perhaps to probe the interface of quantum mechanics and gravity. [...]

The new mechanical qubit is unlikely to run more mature competition off the field any time soon. Its fidelity -- a measure of how well experimenters can set the state they desire -- is just 60%, compared with greater than 99% for the best qubits. For that reason, "it's an advance in principle," Bachtold says. But Dürr notes that a mechanical qubit might serve as a supersensitive probe of forces, such as gravity, that don't affect other qubits. And the researchers at ETH Zurich hope to take their demonstration a step further by using two mechanical qubits to perform simple logical operations. "That's what Igor is working on now," [says Yiwen Chu, a physicist at ETH Zurich]. If they succeed, the physical switches of the very first computers will have made a tiny comeback.

Supercomputing

IBM Boosts the Amount of Computation You Can Get Done On Quantum Hardware (arstechnica.com) 30

An anonymous reader quotes a report from Ars Technica: There's a general consensus that we won't be able to consistently perform sophisticated quantum calculations without the development of error-corrected quantum computing, which is unlikely to arrive until the end of the decade. It's still an open question, however, whether we could perform limited but useful calculations at an earlier point. IBM is one of the companies that's betting the answer is yes, and on Wednesday, it announced a series of developments aimed at making that possible. On their own, none of the changes being announced are revolutionary. But collectively, changes across the hardware and software stacks have produced much more efficient and less error-prone operations. The net result is a system that supports the most complicated calculations yet on IBM's hardware, leaving the company optimistic that its users will find some calculations where quantum hardware provides an advantage. [...]

Wednesday's announcement was based on the introduction of the second version of its Heron processor, which has 133 qubits. That's a qubit count still beyond what can be simulated on classical computers, provided the processor operates with sufficiently low error rates. IBM VP Jay Gambetta told Ars that Revision 2 of Heron focused on getting rid of what are called TLS (two-level system) errors. "If you see this sort of defect, which can be a dipole or just some electronic structure that is caught on the surface, that is what we believe is limiting the coherence of our devices," Gambetta said. This happens because the defects can resonate at a frequency that interacts with a nearby qubit, causing the qubit to drop out of the quantum state needed to participate in calculations (called a loss of coherence). By making small adjustments to the frequencies at which the qubits operate, it's possible to avoid these problems. This can be done when the Heron chip is being calibrated, before it's opened for general use.
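A loose sketch of that calibration idea (hypothetical frequencies and function name; a real tune-up also weighs coherence data, gate speed, and collisions with neighboring qubits): park each qubit as far as possible from the known defect resonances.

import numpy as np

def pick_operating_freq(defect_freqs_ghz, band=(4.80, 5.20), grid=4001):
    # Toy calibration step: choose the frequency inside the tunable band
    # farthest from every known TLS defect resonance. Illustrative sketch
    # only, not IBM's calibration procedure.
    candidates = np.linspace(band[0], band[1], grid)
    defects = np.asarray(defect_freqs_ghz)
    # distance from each candidate frequency to its nearest defect
    nearest = np.abs(candidates[:, None] - defects[None, :]).min(axis=1)
    return candidates[np.argmax(nearest)]

# hypothetical defect spectrum observed near one qubit, in GHz
print(pick_operating_freq([4.95, 5.08, 5.17]))   # picks a frequency clear of all three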

Separately, the company has done a rewrite of the software that controls the system during operations. "After learning from the community, seeing how to run larger circuits, [we were able to] almost better define what it should be and rewrite the whole stack towards that," Gambetta said. The result is a dramatic speed-up. "Something that took 122 hours now is down to a couple of hours," he told Ars. Since people are paying for time on this hardware, that's good for customers now. However, it could also pay off in the longer run: some errors occur randomly, so less time spent on a calculation can mean fewer errors. Despite all those improvements, errors are still likely during any significant calculations. While it continues to work toward developing error-corrected qubits, IBM is focusing on what it calls error mitigation, which it first detailed last year. [...] The problem here is that using the function is computationally difficult, and the difficulty increases with the qubit count. So, while it's still easier to do error mitigation calculations than simulate the quantum computer's behavior on the same hardware, there's still the risk of it becoming computationally intractable. But IBM has taken the time to optimize that, too. "They've got algorithmic improvements, and the method that uses tensor methods [now] uses the GPU," Gambetta told Ars. "So I think it's a combination of both."
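IBM's published mitigation work has centered on techniques such as zero-noise extrapolation; as a hedged illustration of that general idea (with a made-up decay model standing in for real hardware runs, not IBM's implementation), the circuit is executed at deliberately amplified noise levels and the measured observable is extrapolated back to the zero-noise limit:

import numpy as np

def run_circuit(noise_scale, rng):
    # Stand-in for executing the circuit at an amplified noise level.
    # Pretend the true expectation value is 0.87 and noise damps it.
    return 0.87 * np.exp(-0.15 * noise_scale) + rng.normal(0.0, 0.005)

rng = np.random.default_rng(7)
scales = np.array([1.0, 1.5, 2.0, 3.0])          # noise amplification factors
values = np.array([run_circuit(s, rng) for s in scales])

# Fit log(signal) against noise scale, extrapolate to the zero-noise limit.
slope, intercept = np.polyfit(scales, np.log(values), 1)
print(f"mitigated estimate: {np.exp(intercept):.3f}  (true value 0.87)")

The extrapolation itself is cheap; what scales badly with qubit count is the classical modeling of the device's noise, which is consistent with the tensor-network-on-GPU optimization Gambetta describes above.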

Supercomputing

Google Identifies Low Noise 'Phase Transition' In Its Quantum Processor (arstechnica.com) 31

An anonymous reader quotes a report from Ars Technica: Back in 2019, Google made waves by claiming it had achieved what has been called "quantum supremacy" -- the ability of a quantum computer to perform operations that would take a wildly impractical amount of time to simulate on standard computing hardware. That claim proved to be controversial, in that the operations were little more than a benchmark that involved getting the quantum computer to behave like a quantum computer; separately, improved ideas about how to perform the simulation on a supercomputer significantly cut the time required.

But Google is back with a new exploration of the benchmark, described in a paper published in Nature on Wednesday. It uses the benchmark to identify what it calls a phase transition in the performance of its quantum processor, and to pin down conditions where the processor can operate with low noise. Taking advantage of that, the researchers again show that, even granting classical hardware every potential advantage, a supercomputer would take a dozen years to simulate the same work.
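For reference, the benchmark is random circuit sampling, scored by the linear cross-entropy fidelity (the standard definition from Google's 2019 work, paraphrased here):

\[ F_{\text{XEB}} = 2^{n}\,\big\langle P(x_i) \big\rangle_i - 1 \]

Here P(x_i) is the ideal probability of each bitstring x_i the n-qubit processor actually emitted, averaged over the samples. F_XEB tends to 1 for a noiseless device and 0 for one emitting uniform noise, which is what lets the metric expose the low-noise "phase" the paper describes.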

Supercomputing

IBM Opens Its Quantum-Computing Stack To Third Parties (arstechnica.com) 7

An anonymous reader quotes a report from Ars Technica, written by John Timmer: [P]art of the software stack that companies are developing to control their quantum hardware includes software that converts abstract representations of quantum algorithms into the series of commands needed to execute them. IBM's version of this software is called Qiskit (although it was made open source and has since been adopted by other companies). Recently, IBM made a couple of announcements regarding Qiskit, both benchmarking it in comparison to other software stacks and opening it up to third-party modules. [...] Right now, the company is supporting six third-party Qiskit functions that break down into two categories.

The first category can be used as stand-alone applications and is focused on providing solutions for users who have no expertise programming quantum computers: one function calculates the ground-state energy of molecules, and the other performs optimizations. The remainder are focused on letting users get more out of existing quantum hardware, which tends to be error prone. Some errors occur more often than others, whether due to quirks of individual hardware qubits or simply because some operations are more error prone than others. These can be handled in two ways. One is to design the circuit being executed so that it avoids the situations most likely to produce an error. The second is to examine the final state of the algorithm, assess whether errors likely occurred, and adjust the results to compensate. Third parties are providing software that can handle both.
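The second approach has a textbook instance in readout-error mitigation, sketched below with hypothetical single-qubit numbers (an illustration of the general idea, not any specific vendor's Qiskit function): calibrate how the device misreports known basis states, then invert that calibration on the real results.

import numpy as np

# Confusion matrix from calibration runs: column j holds the outcome
# frequencies observed when basis state |j> was deliberately prepared.
# Hypothetical single-qubit readout fidelities of 95% and 92%:
A = np.array([[0.95, 0.08],
              [0.05, 0.92]])

raw = np.array([0.60, 0.40])          # raw outcome frequencies from a real run
corrected = np.linalg.solve(A, raw)   # undo the calibrated readout errors
corrected = np.clip(corrected, 0.0, None)
corrected /= corrected.sum()          # renormalize to a valid distribution
print(corrected)                      # best estimate of the true outcome odds

The same inversion idea generalizes to many qubits, though the calibration matrix grows exponentially, so practical tools use sparse or tensor-structured approximations.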

One of those third parties is Q-CTRL, and we talked to its CEO, Michael Biercuk. "We build software that is really focused on everything from the lowest level of hardware manipulation, something that we call quantum firmware, up through compilation and strategies that help users map their problem onto what has to be executed on hardware," he told Ars. (Q-CTRL is also providing the optimization tool that's part of this Qiskit update.) "We're focused on suppressing errors everywhere that they can occur inside the processor," he continued. "That means the individual gate or logic operations, but it also means the execution of the circuit. There are some errors that only occur in the whole execution of a circuit as opposed to manipulating an individual quantum device." Biercuk said Q-CTRL's techniques are hardware agnostic and have been demonstrated on machines that use very different types of qubits, like trapped ions. While the sources of error on the different hardware may be distinct, the manifestations of those problems are often quite similar, making it easier for Q-CTRL's approach to work around the problems.

Those work-arounds include things like altering the properties of the microwave pulses that perform operations on IBM's hardware, and replacing the portion of Qiskit that converts an algorithm to a series of gate operations. The software will also perform operations that suppress errors that can occur when qubits are left idle during the circuit execution. As a result of all these differences, he claimed that using Q-CTRL's software allows the execution of more complex algorithms than are possible via Qiskit's default compilation and execution. "We've shown, for instance, optimization with all 156 qubits on [an IBM] system, and importantly -- I want to emphasize this word -- successful optimization," Biercuk told Ars. "What it means is you run it and you get the right answer, as opposed to I ran it and I kind of got close."

Supercomputing

As Quantum Computing Threats Loom, Microsoft Updates Its Core Crypto Library (arstechnica.com) 33

An anonymous reader quotes a report from Ars Technica: Microsoft has updated a key cryptographic library with two new encryption algorithms designed to withstand attacks from quantum computers. The updates were made last week to SymCrypt, a core cryptographic code library for handling cryptographic functions in Windows and Linux. The library, started in 2006, provides operations and algorithms developers can use to safely implement secure encryption, decryption, signing, verification, hashing, and key exchange in the apps they create. The library supports federal certification requirements for cryptographic modules used in some governmental environments. Despite the name, SymCrypt supports both symmetric and asymmetric algorithms. It's the main cryptographic library Microsoft uses in products and services including Azure, Microsoft 365, all supported versions of Windows, Azure Stack HCI, and Azure Linux. The library provides cryptographic security used in email security, cloud storage, web browsing, remote access, and device management. Microsoft documented the update in a post on Monday. The updates are the first steps in implementing a massive overhaul of encryption protocols that incorporate a new set of algorithms that aren't vulnerable to attacks from quantum computers. [...]

The first new algorithm Microsoft added to SymCrypt is called ML-KEM. Previously known as CRYSTALS-Kyber, ML-KEM is one of three post-quantum standards formalized last month by the National Institute of Standards and Technology (NIST). The KEM in the new name is short for key encapsulation. KEMs can be used by two parties to negotiate a shared secret over a public channel. Shared secrets generated by a KEM can then be used with symmetric-key cryptographic operations, which aren't vulnerable to Shor's algorithm when the keys are of a sufficient size. [...] The other algorithm added to SymCrypt is the NIST-recommended XMSS. Short for eXtended Merkle Signature Scheme, it's based on "stateful hash-based signature schemes." These algorithms are useful in very specific contexts such as firmware signing, but are not suitable for more general uses. Monday's post said Microsoft will add additional post-quantum algorithms to SymCrypt in the coming months. They are ML-DSA, a lattice-based digital signature scheme, previously called Dilithium, and SLH-DSA, a stateless hash-based signature scheme previously called SPHINCS+. Both became NIST standards last month and are formally referred to as FIPS 204 and FIPS 205.
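The KEM flow itself is simple to picture. The sketch below uses a deliberately insecure toy class purely to make the handshake shape executable; the class, its methods, and all names are hypothetical stand-ins, not SymCrypt's API or a real ML-KEM implementation:

import os
from hashlib import sha3_256

class ToyKEM:
    # Shape of a KEM (keygen/encaps/decaps) only. NOT secure and NOT
    # ML-KEM: the real scheme hides the secret in lattice math. This toy
    # exists purely so the handshake below runs end to end.
    def keygen(self):
        sk = os.urandom(32)
        pk = sha3_256(b"pk" + sk).digest()         # toy public key
        return pk, sk

    def encaps(self, pk):
        m = os.urandom(32)                         # fresh secret material
        ct = bytes(a ^ b for a, b in zip(m, pk))   # toy "encryption"
        return ct, sha3_256(m).digest()            # ciphertext, shared secret

    def decaps(self, sk, ct):
        pk = sha3_256(b"pk" + sk).digest()
        m = bytes(a ^ b for a, b in zip(ct, pk))
        return sha3_256(m).digest()

kem = ToyKEM()
pk, sk = kem.keygen()              # receiver publishes pk
ct, key_sender = kem.encaps(pk)    # sender sends ct over the open channel
key_receiver = kem.decaps(sk, ct)  # receiver recovers the same secret
assert key_sender == key_receiver  # both ends now share a symmetric key

The point of the pattern is the last line: once both parties hold the same secret, everything that follows is symmetric cryptography, which is not threatened by Shor's algorithm at sufficient key sizes.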
In Monday's post, Microsoft Principal Product Manager Lead Aabha Thipsay wrote: "PQC algorithms offer a promising solution for the future of cryptography, but they also come with some trade-offs. For example, these typically require larger key sizes, longer computation times, and more bandwidth than classical algorithms. Therefore, implementing PQC in real-world applications requires careful optimization and integration with existing systems and standards."
Supercomputing

After AI, Quantum Computing Eyes Its 'Sputnik' Moment (phys.org) 52

The founder of Cambridge-based Riverlane, Steve Brierley, predicts quantum computing will have its "Sputnik" breakthrough within years. "Quantum computing is not going to be just slightly better than the previous computer, it's going to be a huge step forward," he said. Phys.org reports: His company produces the world's first dedicated quantum decoder chip, which detects and corrects the errors currently holding the technology back. In a sign of confidence in Riverlane's work and the sector in general, the company announced on Tuesday that it had raised $75 million in Series C funding, typically the last round of venture capital financing prior to an initial public offering. "Over the next two to three years, we'll be able to get to systems that can support a million error-free operations," said Earl Campbell, vice president of quantum science at Riverlane. This is the threshold where a quantum computer should be able to perform certain tasks better than conventional computers, he added.

Quantum computers are "really good at simulating other quantum systems", explained Brierley, meaning they can simulate interactions between particles, atoms and molecules. This could open the door to revolutionary medicines and also promises huge efficiency improvements in how fertilizers are made, transforming an industry that today produces around two percent of global CO2 emissions. It also paves the way for much more efficient batteries, another crucial weapon in the fight against climate change. "I think most people are more familiar with exponential after COVID, so we know how quickly something that's exponential can spread," said Campbell, inside Riverlane's testing lab, a den of oscilloscopes and chipboards. [...]

While today's quantum computers can only perform around 1,000 operations before being overwhelmed by errors, the quality of the actual components has "got to the point where the physical qubits are good enough," said Brierley. "So this is a super exciting time. The challenge now is to scale up... and to add error correction into the systems," he added. Such progress, along with quantum computing's potential to crack all existing cryptography and create potent new materials, is spurring regulators into action. "There's definitely a scrambling to understand what's coming next in technology. It's really important that we learn the lessons from AI, to not be surprised by the technology and think early about what those implications are going to be," said Brierley. "I think there will ultimately be regulation around quantum computing, because it's such an important technology. And I think this is a technology where no government wants to come second."

China

China Is Getting Secretive About Its Supercomputers 28

For decades, American and Chinese scientists collaborated on supercomputers. But Chinese scientists have become more secretive as the U.S. has tried to hinder China's technological progress, and they have stopped participating altogether in a prominent international supercomputing forum. From a report: The withdrawal marked the end of an era and created a divide that Western scientists say will slow the development of AI and other technologies as countries pursue separate projects. The new secrecy also makes it harder for the U.S. government to answer a question it deems essential to national security: Does the U.S. or China have faster supercomputers? Some academics have taken it upon themselves to hunt for clues about China's supercomputing progress, scrutinizing research papers and cornering Chinese peers at conferences.

Supercomputers have become central to the U.S.-China technological Cold War because the country with the faster supercomputers can also hold an advantage in developing nuclear weapons and other military technology. "If the other guy can use a supercomputer to simulate and develop a fighter jet or weapon 20% or even 1% better than yours in terms of range, speed and accuracy, it's going to target you first, and then it's checkmate," said Jimmy Goodrich, a senior adviser for technology analysis at Rand, a think tank. The forum that China recently stopped participating in is called the Top500, which ranks the world's 500 fastest supercomputers. While the latest ranking, released in June, says the world's three fastest computers are in the U.S., the reality is probably different.
Supercomputing

$2.4 Million Texas Home Listing Boasts Built-In 5,786 sq ft Data Center (tomshardware.com) 34

A Zillow listing for a $2.4 million house in a Dallas suburb is grabbing attention for its 5,786-square-foot data center with immersion cooling tanks, massive server racks, and two separate power grids. Tom's Hardware reports: With a brick exterior, cute paving, and mini-McMansion arch stylings, the building certainly looks to be a residential home for the archetypal Texas family. Prospective home-buyers will thus be disappointed by the 0-bedroom, 1-bathroom setup, which reveals itself as a warehouse-feeling office from the first step inside, where visitors are met with a glass-shielded reception desk in a white-brick corridor. The "Crypto Collective" branding betrays the former life of the unit, which served admirably as a crypto mining base.

The purchase of the "upgraded turnkey Tier 2 Data Center" will include all of its cooling and power infrastructure. Three Engineered Fluids "SLICTanks," single-phase liquid immersion cooling tanks for use with dielectric coolant, will come with pumps and a 500kW dry cooler. The tanks currently hold at least 80 mining computers visible in the photos, though the SLICTanks can be configured to fit more machines. Also visible near the cooling array are a deep row of classic server racks and a staggering amount of networking equipment.

The listing advertises a host of potential uses for future customers, from "AI services, cloud hosting, traditional data center, servers or even Bitcoin Mining". Also packed into the 5,786 square feet of real estate are two separate power grids, five HVAC units, four levels of warehouse-style storage aisles, a lounge/office space, and a fully-paved backyard. In other good news, its future corporate residents will not have an HOA to deal with, and will be only 20 minutes outside the heart of Dallas, sitting just out of earshot of two major highways.

Hardware

Will Tesla Do a Phone? Yes, Says Morgan Stanley 170

Morgan Stanley, in a note -- seen by Slashdot -- sent to its clients on Wednesday: From our continuing discussions with automotive management teams and industry experts, the car is an extension of the phone. The phone is an extension of the car. The lines between car and phone are truly blurring.

For years, we have been writing about the potential for Tesla to expand into edge compute domains beyond the car, including last October where we described a mobile AI assistant as a 'heavy key.' Following Apple's WWDC, Tesla CEO Elon Musk re-ignited the topic by saying that making such a device is 'not out of the question.' As Mr. Musk continues to invest further into his own LLM/genAI efforts, such as 'Grok,' the potential strategic and user-experience overlap becomes more obvious.

From an automotive perspective, the topic of supercomputing at both the datacenter level and at the edge is highly relevant given that the incremental global unit sold is a car that can perform OTA updates of firmware, has a battery with a stored energy equivalent of approx. 2,000 iPhones, and carries a liquid-cooled inference supercomputer as standard kit. What if your phone could tap into your vehicle's compute power and battery supply to run AI applications?

Edge compute and AI have brought to light some of the challenges (battery life, thermal, latency, etc.) of marrying today's smartphones with ever more powerful AI-driven applications. Numerous media reports have discussed OpenAI potentially developing a consumer device specifically designed for AI.

The phone as a (heavy) car key? Any Tesla owner will tell you how they use their smartphone as their primary key to unlock their car, as well as to run other remote applications while they interact with their vehicle. The 'action button' on the iPhone 15 potentially takes this to a different level of convenience.
Supercomputing

UK Imposes Mysterious Ban On Quantum Computer Exports (newscientist.com) 19

Longtime Slashdot reader MattSparkes shares a report from NewScientist: Quantum computing experts are baffled by the UK government's new export restrictions on the exotic devices (source paywalled), saying they make little sense. [The UK government has set limits on the capabilities of quantum computers that can be exported -- starting with those above 34 qubits, and rising as long as error rates are also higher -- and has declined to explain these limits on the grounds of national security.] The legislation applies to both existing, small quantum computers that are of no practical use and larger computers that don't actually exist, so cannot be exported. Instead, there are fears the limits will restrict sales and add bureaucracy to a new and growing sector. For more context, here's an excerpt from an article published by The Telegraph in March: The technology has been added to a list, maintained by the Export Control Joint Unit, of "dual use" items that could have military uses; the unit scrutinizes sales of sensitive goods. A national quantum computer strategy published last year described the technology as being "critically important" for defense and national security and said the UK was in a "global race" to develop it. [...] The changes have been introduced as part of a broader update to export rules agreed by Western allies including the US and major European countries. Several nations with particular expertise on quantum computer technologies have added specific curbs, including France, which introduced rules at the start of this month.

Last year, industry body Quantum UK said British companies were concerned about the prospect of further export controls, and that they could even put off US companies seeking to relocate to the UK. Quantum computer exports previously required licenses only in specific cases, such as when they were likely to lead to military use. Oxford Instruments, which makes cooling systems for quantum computers, said last year that sales in China had been hit by increasing curbs. James Lindop of law firm Eversheds Sutherland said: "Semiconductor and quantum technologies -- two areas in which the UK already holds a world-leading position -- are increasingly perceived to be highly strategic and critical to UK national security. This will undoubtedly create an additional compliance burden for businesses active in the development and production of the targeted technologies."

Supercomputing

Linux Foundation Announces Launch of 'High Performance Software Foundation' (linuxfoundation.org) 4

This week the nonprofit Linux Foundation announced the launch of the High Performance Software Foundation, which "aims to build, promote, and advance a portable core software stack for high performance computing" (or HPC) by "increasing adoption, lowering barriers to contribution, and supporting development efforts."

It promises initiatives focused on "continuously built, turnkey software stacks," as well as other initiatives including architecture support and performance regression testing. Its first open source technical projects are:

- Spack: the HPC package manager.

- Kokkos: a performance-portable programming model for writing modern C++ applications in a hardware-agnostic way.

- Viskores (formerly VTK-m): a toolkit of scientific visualization algorithms for accelerator architectures.

- HPCToolkit: performance measurement and analysis tools for computers ranging from desktop systems to GPU-accelerated supercomputers.

- Apptainer: formerly known as Singularity, a Linux Foundation project providing a high performance, full featured HPC and computing optimized container subsystem.

- E4S: a curated, hardened distribution of scientific software packages.

As use of HPC becomes ubiquitous in scientific computing and digital engineering, and AI use cases multiply, more and more data centers deploy GPUs and other compute accelerators. The High Performance Software Foundation will provide a neutral space for pivotal projects in the high performance computing ecosystem, enabling industry, academia, and government entities to collaborate on the scientific software.

The High Performance Software Foundation benefits from strong support across the HPC landscape, including Premier Members Amazon Web Services (AWS), Hewlett Packard Enterprise, Lawrence Livermore National Laboratory, and Sandia National Laboratories; General Members AMD, Argonne National Laboratory, Intel, Kitware, Los Alamos National Laboratory, NVIDIA, and Oak Ridge National Laboratory; and Associate Members University of Maryland, University of Oregon, and Centre for Development of Advanced Computing.

In a statement, an AMD vice president said that by joining "we are using our collective hardware and software expertise to help develop a portable, open-source software stack for high-performance computing across industry, academia, and government." And an AWS executive said the high-performance computing community "has a long history of innovation being driven by open source projects. AWS is thrilled to join the High Performance Software Foundation to build on this work. In particular, AWS has been deeply involved in contributing upstream to Spack, and we're looking forward to working with the HPSF to sustain and accelerate the growth of key HPC projects so everyone can benefit."

The new foundation will "set up a technical advisory committee to manage working groups tackling a variety of HPC topics," according to the announcement, following a governance model based on the Cloud Native Computing Foundation.
Supercomputing

Intel Aurora Supercomputer Breaks Exascale Barrier 28

Josh Norem reports via ExtremeTech: At the recent International Supercomputing Conference (ISC 2024), Intel's newest Aurora supercomputer installed at Argonne National Laboratory raised a few eyebrows by finally surpassing the exascale barrier. Before this, only AMD's Frontier system had been able to achieve this level of performance. Intel also achieved what it says is the world's best performance for AI at 10.61 "AI exaflops." Intel reported the news on its blog, stating Aurora was now officially the fastest supercomputer for AI in the world. It shares the distinction with Argonne National Laboratory, which houses the system, and Hewlett Packard Enterprise (HPE), which built it; Intel says the system was at 87% functionality for the recent tests. In the all-important Linpack (HPL) test, the Aurora computer hit 1.012 exaflops, almost double the performance of its initial "partial run" in late 2023, when it hit just 585.34 petaflops. The company said then that it expected Aurora to cross the exascale barrier eventually, and now it has.

Intel says for the ISC 2024 tests, Aurora was operating with 9,234 nodes. The company notes it ranked second overall in LINPACK, meaning it's still unable to dethrone AMD's Frontier system, which is also an HPE supercomputer. AMD's Frontier was the first supercomputer to break the exascale barrier in June 2022. Frontier sits at around 1.2 exaflops in Linpack, so Intel is knocking on its door but still has a way to go before it can topple it. However, Intel says Aurora came in first in the mixed-precision HPL-MxP benchmark, reportedly highlighting its unparalleled AI performance. Intel's Aurora supercomputer uses the company's latest CPU and GPU hardware, with 21,248 Sapphire Rapids Xeon CPUs and 63,744 Ponte Vecchio GPUs. When it's fully operational later this year, Intel believes the system will eventually be capable of crossing the 2-exaflop barrier.
Supercomputing

Defense Think Tank MITRE To Build AI Supercomputer With Nvidia (washingtonpost.com) 44

An anonymous reader quotes a report from the Washington Post: A key supplier to the Pentagon and U.S. intelligence agencies is building a $20 million supercomputer with buzzy chipmaker Nvidia to speed deployment of artificial intelligence capabilities across the U.S. federal government, the MITRE think tank said Tuesday. MITRE, a federally funded, not-for-profit research organization that has supplied U.S. soldiers and spies with exotic technical products since the 1950s, says the project could improve everything from Medicare to taxes. "There's huge opportunities for AI to make government more efficient," said Charles Clancy, senior vice president of MITRE. "Government is inefficient, it's bureaucratic, it takes forever to get stuff done. ... That's the grand vision, is how do we do everything from making Medicare sustainable to filing your taxes easier?" [...] The MITRE supercomputer will be based in Ashburn, Va., and should be up and running late this year. [...]

Clancy said the planned supercomputer will run 256 Nvidia graphics processing units, or GPUs, at a cost of $20 million. This counts as a small supercomputer: The world's fastest supercomputer, Frontier in Tennessee, boasts 37,888 GPUs, and Meta is seeking to build one with 350,000 GPUs. But MITRE's computer will still eclipse Stanford's Natural Language Processing Group's 68 GPUs, and will be large enough to train large language models to perform AI tasks tailored for government agencies. Clancy said all federal agencies funding MITRE will be able to use this AI "sandbox." "AI is the tool that is solving a wide range of problems," Clancy said. "The U.S. military needs to figure out how to do command and control. We need to understand how cryptocurrency markets impact the traditional banking sector. ... Those are the sorts of problems we want to solve."

Supercomputing

Europe Plans To Build 100-Qubit Quantum Computer By 2026 (physicsworld.com) 27

An anonymous reader quotes a report published last week by Physics World: Researchers at the Dutch quantum institute QuTech in Delft have announced plans to build Europe's first 100-quantum bit (qubit) quantum computer. When complete in 2026, the device will be made publicly available, providing scientists with a tool for quantum calculations and simulations. The project is funded by the Dutch umbrella organization Quantum Delta NL via the European OpenSuperQPlus initiative, which has 28 partners from 10 countries. Part of the 10-year, 1-billion-euro European Quantum Flagship program, OpenSuperQPlus aims to build a 100-qubit superconducting quantum processor as a stepping stone to an eventual 1000-qubit European quantum computer.

Quantum Delta NL says the 100-qubit quantum computer will be made publicly available via a cloud platform as an extension of the existing platform Quantum Inspire that first came online in 2020. It currently includes a two-qubit processor of spin qubits in silicon, as well as a five-qubit processor based on superconducting qubits. Quantum Inspire is currently focused on training and education but the upgrade to 100 qubits is expected to allow research into quantum computing. Lead researcher from QuTech Leonardo DiCarlo believes the R&D cycle has "come full circle," where academic research first enabled spin-off companies to grow and now their products are being used to accelerate academic research.

Supercomputing

New Advances Promise Secure Quantum Computing At Home (phys.org) 27

Scientists from Oxford University Physics have achieved a breakthrough in cloud-based quantum computing that could allow it to be harnessed by millions of individuals and companies. The findings have been published in the journal Physical Review Letters. Phys.Org reports: In the new study, the researchers use an approach dubbed "blind quantum computing," which connects two totally separate quantum computing entities -- potentially an individual at home or in an office accessing a cloud server -- in a completely secure way. Importantly, their new methods could be scaled up to large quantum computations. "Using blind quantum computing, clients can access remote quantum computers to process confidential data with secret algorithms and even verify the results are correct, without revealing any useful information. Realizing this concept is a big step forward in both quantum computing and keeping our information safe online," said study lead Dr. Peter Drmota, of Oxford University Physics.

The researchers created a system comprising a fiber network link between a quantum computing server and a simple device detecting photons, or particles of light, at an independent computer remotely accessing its cloud services. This allows so-called blind quantum computing over a network. Every computation incurs a correction that must be applied to all subsequent ones, which requires real-time information to keep in step with the algorithm. The researchers used a unique combination of quantum memory and photons to achieve this. The results could ultimately lead to commercial development of devices to plug into laptops, to safeguard data when people are using quantum cloud computing services.
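That correction chain is the crux: in measurement-based schemes, each measurement outcome changes how the next measurement must be performed, so the client and server need a live feedback channel. A toy skeleton of the bookkeeping (illustrative only, not the Oxford protocol; the angles and the update rule are simplified stand-ins):

import random

angles = [0.30, 1.10, 0.70, 2.00]   # the client's algorithm, encoded as angles
correction = 0                       # pending correction bit
for step, phi in enumerate(angles):
    adapted = (-1) ** correction * phi     # adapt this step's measurement basis
    outcome = random.randint(0, 1)         # stand-in for the quantum measurement
    correction ^= outcome                  # outcome feeds every later step
    print(f"step {step}: measure at {adapted:+.2f}, outcome {outcome}")

In the blind variant, the client additionally hides each angle behind random offsets only it knows, so the server executes the loop without learning the algorithm or the data.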
"We have shown for the first time that quantum computing in the cloud can be accessed in a scalable, practical way which will also give people complete security and privacy of data, plus the ability to verify its authenticity," said Professor David Lucas, who co-heads the Oxford University Physics research team and is lead scientist at the UK Quantum Computing and Simulation Hub, led from Oxford University Physics.
Crime

Former Google Engineer Indicted For Stealing AI Secrets To Aid Chinese Companies 28

Linwei Ding, a former Google software engineer, has been indicted for stealing trade secrets related to AI to benefit two Chinese companies. He faces up to 10 years in prison and a $250,000 fine on each criminal count. Reuters reports: Ding's indictment was unveiled a little over a year after the Biden administration created an interagency Disruptive Technology Strike Force to help stop advanced technology from being acquired by countries such as China and Russia, or from otherwise threatening national security. "The Justice Department just will not tolerate the theft of our trade secrets and intelligence," U.S. Attorney General Merrick Garland said at a conference in San Francisco.

According to the indictment, Ding stole detailed information about the hardware infrastructure and software platform that lets Google's supercomputing data centers train large AI models through machine learning. The stolen information included details about chips and systems, and software that helps power a supercomputer "capable of executing at the cutting edge of machine learning and AI technology," the indictment said. Google designed some of the allegedly stolen chip blueprints to gain an edge over cloud computing rivals Amazon.com and Microsoft, which design their own, and to reduce its reliance on chips from Nvidia.

Hired by Google in 2019, Ding allegedly began his thefts three years later, while he was being courted to become chief technology officer for an early-stage Chinese tech company, and by May 2023 had uploaded more than 500 confidential files. The indictment said Ding founded his own technology company that month, and circulated a document to a chat group that said "We have experience with Google's ten-thousand-card computational power platform; we just need to replicate and upgrade it." Google became suspicious of Ding in December 2023 and took away his laptop on Jan. 4, 2024, the day before Ding planned to resign.
A Google spokesperson said: "We have strict safeguards to prevent the theft of our confidential commercial information and trade secrets. After an investigation, we found that this employee stole numerous documents, and we quickly referred the case to law enforcement."
Supercomputing

Investors Threw 50% Less Money At Quantum Last Year (theregister.com) 32

Dan Robinson reports via The Register: Quantum companies received 50 percent less venture capital funding last year as investors switched to generative AI or shied away from risky bets on Silicon Valley startups. Progress in quantum computing is being made, but practical applications of the technology are still likely years away. Investment in quantum technology reached a high of $2.2 billion in 2022, as confidence (or hype) grew in this emerging market, but that funding fell to about $1.2 billion last year, according to the latest State of Quantum report, produced by The Quantum Insider, with quantum computing company IQM, plus VCs OpenOcean and Lakestar. The picture is even starker in the US, where there was an 80 percent decline in venture capital for quantum, while the APAC region dropped by 17 percent, and EMEA grew slightly by three percent.

But the report denies that we have reached a "quantum winter" comparable with the "AI winter" periods of scarce funding and little progress. Instead, the quantum industry continues to progress towards useful quantum systems, just at a slower pace, and the decline in funding must be seen as part of broader venture capital trends, it insists. "Calendar year 2023 was an interesting year with regards to quantum," Heather West, research manager for Quantum Computing, Infrastructure Systems, Platforms, and Technology at IDC, told The Register. "With the increased interest in generative AI, we started to observe that some of the funding that was being invested into quantum was transferred to AI initiatives and companies. Generative AI was seen as the new disruptive technology which end users could use immediately to gain an advantage or value, whereas quantum, while expected to be a disruptive technology, is still very early in development," West added.

Gartner Research vice president Matthew Brisse agreed. "It's due to the slight shift of CIO priorities toward GenAI. If organizations were spending 10 innovation dollars on quantum, now they are spending five. Not abandoning it, but looking at GenAI to provide value sooner to the organization than quantum," he told us. Meanwhile, venture capitalists in America are fighting shy of risky bets on Silicon Valley startups and instead keeping their powder dry as they look to more established technology companies or else shore up their existing portfolio of investments, according to the Financial Times.
