Privacy

Hacker Steals 10 Petabytes of Data From China's Tianjin Supercomputer Center (cnn.com) 70

An anonymous reader quotes a report from CNN: A hacker has allegedly stolen a massive trove of sensitive data -- including highly classified defense documents and missile schematics -- from a state-run Chinese supercomputer in what could be the largest known data heist from China. The dataset, which allegedly contains more than 10 petabytes of sensitive information, is believed by experts to have been obtained from the National Supercomputing Center (NSCC) in Tianjin -- a centralized hub that provides infrastructure services for more than 6,000 clients across China, including advanced science and defense agencies.

Cyber experts who have spoken to the alleged hacker and reviewed samples of the stolen data posted online say the intruder appears to have gained entry to the massive computer with relative ease and siphoned out huge amounts of data over multiple months without being detected. An account calling itself FlamingChina posted a sample of the alleged dataset on an anonymous Telegram channel on February 6, claiming it contained "research across various fields including aerospace engineering, military research, bioinformatics, fusion simulation and more." The group alleges the information is linked to "top organizations" including the Aviation Industry Corporation of China, the Commercial Aircraft Corporation of China, and the National University of Defense Technology.

Cyber security experts who have reviewed the data say the group is offering a limited preview of the alleged dataset, for thousands of dollars, with full access priced at hundreds of thousands of dollars. Payment was requested in cryptocurrency. CNN cannot verify the origins of the alleged dataset and the claims made by FlamingChina, but spoke with multiple experts whose initial assessment of the leak indicated it was genuine. The alleged sample data appeared to include documents marked "secret" in Chinese, along with technical files, animated simulations and renderings of defense equipment including bombs and missiles.

Power

The UK Will Invest Billions to Build a Nuclear Fusion Industry (thetimes.com) 74

The UK's science minister is announcing details of a five-year, £2.5 billion investment in nuclear fusion, reports the Times of London, "including building one of the world's first prototype fusion power plants in Nottinghamshire and developing a UK sector projected to employ 10,000 people by 2030." Despite the potentially transformative impact of fusion, which in theory could provide limitless clean energy and create a £12 trillion global market, no country has managed to use this fledgling technology to generate usable electricity... [T]he UK is backing a spherical tokamak design... investing an initial £1.3 billion into a prototype fusion power plant called Step (Spherical Tokamak for Energy Production) on the site of a decommissioned coal-fired power station at West Burton in Nottinghamshire. Paul Methven, chief executive of the government-owned UK Industrial Fusion Solutions, which is delivering the Step project, said the aim is to get the reactor operating early in the 2040s. "It's quite an aggressive programme," he said. "We need to show that we can achieve genuine 'wall socket' energy — which has not been done before."

On Monday, [science minister] Vallance will also announce £180 million for a facility in Culham, Oxfordshire, to manufacture tritium fuel and £50 million for training 2,000 scientists and engineers in fusion-related disciplines. The government is also buying a £45 million fusion-dedicated AI supercomputer called Sunrise to model plasma physics. Scientists at the UK Atomic Energy Authority last year developed an AI model that can rapidly simulate how the ultra-hot fuel in a fusion power plant will behave, cutting calculations that previously took days down to seconds...

Vallance will also announce new support and collaboration for the many fusion, robotics, engineering and AI start-ups working in Britain, to develop a strong supply chain for a new fusion sector. One of those companies, Tokamak Energy, which spun out from the UK Atomic Energy Authority in 2009, has already built a smaller reactor that has informed the Step design. In March 2022, it became the first private organisation in the world to surpass 100 million degrees Celsius in its reactor.

The Courts

Former Google Engineer Found Guilty of Stealing AI Secrets For Chinese Firms (cbsnews.com) 34

Longtime Slashdot reader schwit1 shares a report from CBS News: A former Google engineer has been found guilty on multiple federal charges for stealing the tech giant's trade secrets on artificial intelligence to benefit Chinese companies he secretly worked for, federal prosecutors said. According to the U.S. Attorney's Office for the Northern District of California, a jury on Thursday convicted Linwei Ding on seven counts of economic espionage and seven counts of theft of trade secrets, following an 11-day trial. The 38-year-old, also known as Leon Ding, was hired by Google in 2019 and was a resident of Newark.

According to evidence presented at trial, Ding stole more than 2,000 pages of confidential information containing Google AI trade secrets between May 2022 and April 2023. He uploaded the information to his personal Google Cloud account. Around the same time, Ding secretly affiliated himself with two China-based technology companies. Around June 2022, prosecutors said Ding was in discussions to be the chief technology officer for an early-stage tech company. Several months later, he was in the process of founding his own AI and machine learning company in China, acting as the company's CEO. Prosecutors said Ding told investors that he could build an AI supercomputer by copying and modifying Google's technology.

In late 2023, prosecutors said Ding downloaded the trade secrets to his own personal computer before resigning from Google. According to the superseding indictment, Google uncovered the uploads after finding out that Ding presented himself as CEO of one of the companies during a Beijing investor conference. Around the same time, Ding told his manager he was leaving the company and booked a one-way flight to Beijing.

"Silicon Valley is at the forefront of artificial intelligence innovation, pioneering transformative work that drives economic growth and strengthens our national security. The jury delivered a clear message today that the theft of this valuable technology will not go unpunished," U.S. Attorney Craig Missakian said in a statement.

Science

Nature-Inspired Computers Are Shockingly Good At Math (phys.org) 32

An R&D lab under America's Energy Department announced this week that "Neuromorphic computers, inspired by the architecture of the human brain, are proving surprisingly adept at solving complex mathematical problems that underpin scientific and engineering challenges."

Phys.org publishes the announcement from Sandia National Lab: In a paper published in Nature Machine Intelligence, Sandia National Laboratories computational neuroscientists Brad Theilman and Brad Aimone describe a novel algorithm that enables neuromorphic hardware to tackle partial differential equations, or PDEs — the mathematical foundation for modeling phenomena such as fluid dynamics, electromagnetic fields and structural mechanics. The findings show that neuromorphic computing can not only handle these equations, but do so with remarkable efficiency. The work could pave the way for the world's first neuromorphic supercomputer, potentially revolutionizing energy-efficient computing for national security applications and beyond...

"We're just starting to have computational systems that can exhibit intelligent-like behavior. But they look nothing like the brain, and the amount of resources that they require is ridiculous, frankly," Theilman said. For decades, experts have believed that neuromorphic computers were best suited for tasks like recognizing patterns or accelerating artificial neural networks. These systems weren't expected to excel at solving rigorous mathematical problems like PDEs, which are typically tackled by traditional supercomputers. But for Aimone and Theilman, the results weren't surprising. The researchers believe the brain itself performs complex computations constantly, even if we don't consciously realize it. "Pick any sort of motor control task — like hitting a tennis ball or swinging a bat at a baseball," Aimone said. "These are very sophisticated computations. They are exascale-level problems that our brains are capable of doing very cheaply..."

Their research also raises intriguing questions about the nature of intelligence and computation. The algorithm developed by Theilman and Aimone retains strong similarities to the structure and dynamics of cortical networks in the brain. "We based our circuit on a relatively well-known model in the computational neuroscience world," Theilman said. "We've shown the model has a natural but non-obvious link to PDEs, and that link hasn't been made until now — 12 years after the model was introduced." The researchers believe that neuromorphic computing could help bridge the gap between neuroscience and applied mathematics, offering new insights into how the brain processes information. "Diseases of the brain could be diseases of computation," Aimone said. "But we don't have a solid grasp on how the brain performs computations yet." If their hunch is correct, neuromorphic computing could offer clues to better understand and treat neurological conditions like Alzheimer's and Parkinson's.

Supercomputing

Mexico Unveils Plans To Build Most Powerful Supercomputer In Latin America (apnews.com) 22

An anonymous reader quotes a report from the Associated Press: Mexico unveiled plans Wednesday to build what it claims will be Latin America's most powerful supercomputer -- a project the government says will help the country capitalize on the rapidly evolving uses of artificial intelligence and exponentially expand the country's computing capacity. Dubbed "Coatlicue" for the Mexica goddess considered the earth mother, the supercomputer would be seven times more powerful than the region's current leader in Brazil, said Jose Merino, head of the Telecommunications and Digital Transformation Agency.

President Claudia Sheinbaum said during her morning news briefing that the location for the project had not been decided yet, but construction will begin next year. "We're very excited," said Sheinbaum, an academic and climate scientist. "It is going to allow Mexico to fully get in on the use of artificial intelligence and the processing of data that today we don't have the capacity to do." Merino said that Mexico's most powerful supercomputer operates at 2.3 petaflops -- a unit of computing speed in which one petaflop equals one quadrillion operations per second. Coatlicue would have a capacity of 314 petaflops.
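
A quick back-of-envelope check on those figures (the Brazilian leader's capacity is only implied by the "seven times" claim, so that number is an inference, not from the report):

```python
# Capacities in petaflops; one petaflop = one quadrillion operations per second.
coatlicue = 314.0      # planned capacity of Coatlicue
mexico_current = 2.3   # Mexico's most powerful machine today

# Coatlicue would be roughly a 137x jump over Mexico's current best.
jump = coatlicue / mexico_current
print(f"~{jump:.0f}x Mexico's current capacity")

# If Coatlicue is "seven times" Brazil's current leader, that machine
# would sit near 45 petaflops (an implied figure, not stated in the report).
brazil_implied = coatlicue / 7
print(f"Implied Brazilian leader: ~{brazil_implied:.0f} petaflops")
```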

United States

US Department of Energy Forms $1 Billion Supercomputer and AI Partnership With AMD (reuters.com) 23

The U.S. has formed a $1 billion partnership with AMD to construct two supercomputers that will tackle large scientific problems ranging from nuclear power to cancer treatments to national security, said Energy Secretary Chris Wright and AMD CEO Lisa Su. From a report: The U.S. is building the two machines to ensure the country has enough supercomputers to run increasingly complex experiments that require harnessing enormous amounts of data-crunching capability. The machines can accelerate the process of making scientific discoveries in areas the U.S. is focused on.

Energy Secretary Wright said the systems would "supercharge" advances in nuclear power and fusion energy, technologies for defense and national security, and the development of drugs. Scientists and companies are trying to replicate fusion, the reaction that fuels the sun, by jamming light atoms in a plasma gas under intense heat and pressure to release massive amounts of energy. "We've made great progress, but plasmas are unstable, and we need to recreate the center of the sun on Earth," Wright told Reuters.

Google

Google's Quantum Computer Makes a Big Technical Leap (nytimes.com) 30

Google announced Wednesday that its quantum computer achieved the first verifiable quantum advantage, running a new algorithm 13,000 times faster than a top supercomputer. The algorithm, called Quantum Echoes, was published in the journal Nature. The results can be replicated on another quantum computer of similar quality, something Google had not demonstrated before. The quantum computer uses a chip called Willow, which was announced in December 2024. Hartmut Neven, head of Google's Quantum AI research lab, called the work a demonstration of the first algorithm with verifiable quantum advantage and a milestone on the software track.

Michel H. Devoret, who won this year's Nobel Prize in Physics and joined Google in 2023, said future quantum computers will run calculations impossible with classical algorithms. Google stopped short of claiming the work would have practical uses on its own. Instead, the company said Quantum Echoes demonstrated a technique that could be applied to other algorithms in drug discovery and materials science.

A second paper published Wednesday on arXiv showed how the method could be applied to nuclear magnetic resonance. The experiment involved a relatively small quantum system that fell short of full practical quantum advantage because it was not able to work faster than a traditional computer. Google exhaustively red-teamed the research, putting some researchers to work trying to disprove its own results.

Prineha Narang, a professor at UCLA, called the advance meaningful. The quantum computer tested two molecules, one with 15 atoms and another with 28 atoms. Results on the quantum computer matched traditional NMR and revealed information not usually available from NMR. Google's research competes against Microsoft, IBM, universities and efforts in China. The Chinese government has committed more than $15.2 billion to quantum research. Previous claims of quantum advantage have been met with skepticism.

United States

Steve Jobs Honored On New 2026 US Coin Celebrating Innovation (nerds.xyz) 79

BrianFagioli writes: The United States Mint is honoring Steve Jobs and Apple with a new coin for 2026. Part of the American Innovation $1 Coin Program, California's entry depicts a young Jobs seated before rolling northern California hills, accompanied by the words "Make Something Wonderful." The reflective design, created by Elana Hagler and sculpted by Phebe Hemphill, captures how Jobs's surroundings and vision shaped Apple's mission to make technology feel intuitive and human.

The 2026 series also celebrates Dr. Norman Borlaug for Iowa, the Cray-1 supercomputer for Wisconsin, and mobile refrigeration for Minnesota. The obverse of all coins features the Statue of Liberty and a special Liberty Bell mark commemorating the nation's Semiquincentennial. The Steve Jobs coin stands out as one of the few times the U.S. Mint has recognized a modern tech innovator, and some collectors are already calling it one of the most exciting releases in years.

Science

Nobel Prize in Physics Is Awarded for Work in Quantum Mechanics 18

The New York Times: John Clarke, Michel H. Devoret and John M. Martinis were awarded the Nobel Prize in Physics on Tuesday in Sweden for showing that two properties of quantum mechanics, the physical laws that rule the subatomic realm, could be observed on a system large enough to see with the naked eye. They will share a prize of 11 million Swedish kronor, or around $1.17 million.

"There is no advanced technology today that does not rely on quantum mechanics," Olle Eriksson, chairman of the Nobel Committee for Physics, said during the announcement of the award. The laureates' discoveries, he added, paved the way for technologies like the cellphone, cameras and fiber optic cables.

It also helped lay the groundwork for current attempts to build a quantum computer, a device that could compute and process information at speeds that would not be possible with a classical computer.

Martinis worked at Google from 2014 to 2020 to build a quantum computer and led the quantum supremacy experiment in 2019. Devoret is cited in Google's recent breakthrough where its Willow quantum chip solved a problem in five minutes that the world's most advanced supercomputer could never solve.

The three laureates conducted experiments with electrical circuits that demonstrated quantum mechanical tunneling and quantized energy levels in systems large enough to hold in the hand. Clarke is a professor at the University of California, Berkeley. Devoret joined his research group in the 1980s and is now at Yale University and UC Santa Barbara. Martinis also joined the group in the 1980s and is currently at UC Santa Barbara and co-founded Qolab, a startup developing utility-scale superconducting quantum computers.

Microsoft

Microsoft Announces $30 Billion Investment In AI Infrastructure, Operations In UK 22

Microsoft will invest $30 billion in the U.K. through 2028 to expand AI infrastructure and operations, including building the country's largest supercomputer with 23,000 GPUs in partnership with Nscale. CNBC reports: On a call with reporters on Tuesday, Microsoft President Brad Smith said his stance on the U.K. has warmed over the years. He previously criticized the country over its attempt in 2023 to block the tech giant's $69 billion acquisition of video game developer Activision Blizzard. The deal was cleared by the U.K.'s competition regulator later that year.

"I haven't always been optimistic every single day about the business climate in the U.K.," Smith said. However, he added, "I am very encouraged by the steps that the government has taken over the last few years." "Just a few years ago, this kind of investment would have been inconceivable because of the regulatory climate then and because there just wasn't the need or demand for this kind of large AI investment," Smith said.

Microsoft's announcement comes as President Donald Trump embarks on a state visit to Britain where he's expected to sign a new deal with U.K. Prime Minister Keir Starmer "to unlock investment and collaboration in AI, Quantum, and Nuclear technologies," the government said in a statement late Tuesday.

Supercomputing

Europe Hopes To Join Competitive AI Race With Supercomputer Jupiter (france24.com) 41

Europe on Friday inaugurated Jupiter, its first exascale supercomputer and the most powerful AI machine on the continent. Built in Germany with 24,000 Nvidia chips, the 500-million-euro system aims to close the AI gap with the US and China while also advancing climate modeling, neuroscience, and renewable energy research. France 24 reports: Based at Juelich Supercomputing Centre in western Germany, it is Europe's first "exascale" supercomputer -- meaning it will be able to perform at least one quintillion (or one billion billion) calculations per second. The United States already has three such computers, all operated by the Department of Energy. Jupiter is housed in a centre covering some 3,600 square meters (38,000 square feet) -- about half the size of a football pitch -- containing racks of processors, and packed with about 24,000 Nvidia chips, which are favored by the AI industry.

Half the 500 million euros ($580 million) to develop and run the system over the next few years comes from the European Union and the rest from Germany. Its vast computing power can be accessed by researchers across numerous fields as well as companies for purposes such as training AI models. "Jupiter is a leap forward in the performance of computing in Europe," Thomas Lippert, head of the Juelich centre, told AFP, adding that it was 20 times more powerful than any other computer in Germany. [...]

Yes, Jupiter will require on average around 11 megawatts of power, according to estimates -- equivalent to the energy used to power thousands of homes or a small industrial plant. But its operators insist that Jupiter is the most energy-efficient among the fastest computer systems in the world. It uses the latest, most energy-efficient hardware, has water-cooling systems and the waste heat that it generates will be used to heat nearby buildings, according to the Juelich centre.
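
To put the 11-megawatt figure in perspective, a rough sketch; the 1.2 kW average continuous household draw is an assumed round number for illustration, not from the article:

```python
avg_power_mw = 11.0   # Jupiter's estimated average draw, per the article
household_kw = 1.2    # assumed average continuous draw per home (illustrative)

homes_equivalent = avg_power_mw * 1000 / household_kw
annual_gwh = avg_power_mw * 24 * 365 / 1000

print(f"~{homes_equivalent:,.0f} homes")  # on the order of 9,000 homes
print(f"~{annual_gwh:.0f} GWh per year")  # under these assumptions
```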

Businesses

Co-Founder of xAI Departs the Company (techcrunch.com) 11

Igor Babuschkin, co-founder of xAI, has left the company to start Babuschkin Ventures, a VC firm focused on AI safety and humanity-advancing startups. TechCrunch reports: Babuschkin led engineering teams at xAI and helped build the startup into one of Silicon Valley's leading AI model developers just a few years after it was founded. "Today was my last day at xAI, the company that I helped start with Elon Musk in 2023," Babuschkin wrote in the post. "I still remember the day I first met Elon, we talked for hours about AI and what the future might hold. We both felt that a new AI company with a different kind of mission was needed."

Babuschkin is leaving xAI to launch his own venture capital firm, Babuschkin Ventures, which he says will support AI safety research and back startups that "advance humanity and unlock the mysteries of our universe." The xAI co-founder says he was inspired to start the firm after a dinner with Max Tegmark, the founder of the Future of Life Institute, in which they discussed how AI systems could be built safely to encourage the flourishing of future generations. In his post, Babuschkin says his parents immigrated to the U.S. from Russia in pursuit of a better life for their children.

Prior to co-founding xAI, Babuschkin was part of a research team at Google DeepMind that pioneered AlphaStar in 2019, a breakthrough AI system that could defeat top-ranked players at the video game StarCraft. Babuschkin also worked as a researcher at OpenAI in the years before it released ChatGPT. In his post, Babuschkin details some of the challenges he and Musk faced in building up xAI. He notes that industry veterans called xAI's goal of building its Memphis, Tennessee supercomputer in just three months "impossible." [...] Nevertheless, Babuschkin says he's already looking back fondly on his time at xAI, and "feels like a proud parent, driving away after sending their kid away to college." "I learned 2 priceless lessons from Elon: #1 be fearless in rolling up your sleeves to personally dig into technical problems, #2 have a maniacal sense of urgency," said Babuschkin.

Power

US DOE Taps Federal Sites For Fast-Track AI Datacenter, Energy Builds 11

The U.S. Department of Energy has greenlit four federal sites for private sector AI datacenters and nuclear-powered energy projects, aligning with Trump's directive to fast-track AI infrastructure using government land. "The four that have been finalized are the Idaho National Laboratory, Oak Ridge Reservation, Paducah Gaseous Diffusion Plant, and Savannah River Site," reports The Register. "These will now move forward to invite companies in the private sector to build AI datacenter projects plus any necessary energy sources to power them, including nuclear generation." The Register reports: "By leveraging DoE land assets for the deployment of AI and energy infrastructure, we are taking a bold step to accelerate the next Manhattan Project -- ensuring US AI and energy leadership," Energy Secretary Chris Wright said in a statement. Ironically -- or perhaps not -- Oak Ridge Reservation was established in the early 1940s as part of the original Manhattan Project to develop the first atomic bomb, and is home to the Oak Ridge National Laboratory (ORNL) that operates the Frontier exascale supercomputer, and the Y-12 National Security Complex which supports US nuclear weapons programs.

The other sites are also involved with either nuclear research or atomic weapons in one way or another, which may hint at the administration's intentions for how the datacenters should be powered. All four locations are positioned to host new bit barns as well as power generation to bolster grid reliability, strengthen national security, and reduce energy costs, Wright claimed. [...] In light of this tight time frame, the DoE says that partners may be selected by the end of the year. Details regarding project scope, eligibility requirements, and submission guidelines for each site are expected to be released in the coming months.

Supercomputing

Scientists Make 'Magic State' Breakthrough After 20 Years (livescience.com) 38

An anonymous reader quotes a report from Live Science: In a world first, scientists have demonstrated an enigmatic phenomenon in quantum computing that could pave the way for fault-tolerant machines that are far more powerful than any supercomputer. The process, called "magic state distillation," was first proposed 20 years ago, but its use in logical qubits has eluded scientists ever since. It has long been considered crucial for producing the high-quality resources, known as "magic states," needed to fulfill the full potential of quantum computers. [...] Now, however, scientists with QuEra say they have demonstrated magic state distillation in practice for the first time on logical qubits. They outlined their findings in a new study published July 14 in the journal Nature.

In the study, using the Gemini neutral-atom quantum computer, the scientists distilled five imperfect magic states into a single, cleaner magic state. They performed this separately on a Distance-3 and a Distance-5 logical qubit, demonstrating that it scales with the quality of the logical qubit. "A greater distance means better logical qubits. A Distance-2, for instance, means that you can detect an error but not correct it. Distance-3 means that you can detect and correct a single error. Distance-5 would mean that you can detect and correct up to two errors, and so on, and so on," [explained Yuval Boger, chief commercial officer at QuEra who was not personally involved in the research]. "So the greater the distance, the higher fidelity of the qubit is -- and we liken it to distilling crude oil into a jet fuel."
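
Boger's ladder follows the standard error-correction rule: a distance-d code can detect up to d - 1 errors and correct up to floor((d - 1) / 2) of them. A minimal sketch:

```python
def detectable(d: int) -> int:
    """A distance-d code can detect up to d - 1 errors."""
    return d - 1

def correctable(d: int) -> int:
    """...and can correct up to floor((d - 1) / 2) of them."""
    return (d - 1) // 2

for d in (2, 3, 5):
    print(f"Distance-{d}: detect {detectable(d)}, correct {correctable(d)}")
# Distance-2: detect 1, correct 0  -- detect but not correct, as Boger notes
# Distance-3: detect 2, correct 1
# Distance-5: detect 4, correct 2
```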

As a result of the distillation process, the fidelity of the final magic state exceeded that of any input. This proved that fault-tolerant magic state distillation worked in practice, the scientists said. This means that a quantum computer that uses both logical qubits and high-quality magic states to run non-Clifford gates is now possible. "We're seeing sort of a shift from a few years ago," Boger said. "The challenge was: can quantum computers be built at all? Then it was: can errors be detected and corrected? Us and Google and others have shown that, yes, that can be done. Now it's about: can we make these computers truly useful? And to make one computer truly useful, other than making them larger, you want them to be able to run programs that cannot be simulated on classical computers."

Earth

Microsoft Says Its Aurora AI Can Accurately Predict Air Quality, Typhoons (techcrunch.com) 28

An anonymous reader quotes a report from TechCrunch: One of Microsoft's latest AI models can accurately predict air quality, hurricanes, typhoons, and other weather-related phenomena, the company claims. In a paper published in the journal Nature and an accompanying blog post this week, Microsoft detailed Aurora, which the tech giant says can forecast atmospheric events with greater precision and speed than traditional meteorological approaches. Aurora, which has been trained on more than a million hours of data from satellites, radar and weather stations, simulations, and forecasts, can be fine-tuned with additional data to make predictions for particular weather events.

AI weather models are nothing new. Google DeepMind has released a handful over the past several years, including WeatherNext, which the lab claims beats some of the world's best forecasting systems. Microsoft is positioning Aurora as one of the field's top performers -- and a potential boon for labs studying weather science. In experiments, Aurora predicted Typhoon Doksuri's landfall in the Philippines four days in advance of the actual event, beating some expert predictions, Microsoft says. The model also bested the National Hurricane Center in forecasting five-day tropical cyclone tracks for the 2022-2023 season, and successfully predicted the 2022 Iraq sandstorm.

While Aurora required substantial computing infrastructure to train, Microsoft says the model is highly efficient to run. It generates forecasts in seconds compared to the hours traditional systems take using supercomputer hardware. Microsoft, which has made the source code and model weights publicly available, says that it's incorporating Aurora's AI modeling into its MSN Weather app via a specialized version of the model that produces hourly forecasts, including for clouds.

Cloud

The Stealthy Lab Cooking Up Amazon's Secret Sauce (msn.com) 8

Amazon's decade-old acquisition of Annapurna Labs has emerged as a pivotal element in its AI strategy, with the once-secretive Israeli chip design startup now powering AWS infrastructure. The $350 million deal, struck in 2015 after initial talks between Annapurna co-founder Nafea Bshara and Amazon executive James Hamilton, has equipped the tech giant with custom silicon capabilities critical to its cloud computing dominance.

Annapurna's chips, particularly the Trainium processor for AI model training and Graviton for general-purpose computing, now form the foundation of Amazon's AI infrastructure. The company is deploying hundreds of thousands of Trainium chips in its Project Rainier supercomputer being delivered to AI startup Anthropic this year. Amazon CEO Andy Jassy, who led AWS when the acquisition occurred, described it as "one of the most important moments" in AWS history.

United States

Nvidia To Make AI Supercomputers in US for First Time (nvidia.com) 37

Nvidia has announced plans to manufacture AI supercomputers entirely within the United States, commissioning over 1 million square feet of manufacturing space across Arizona and Texas. Production of Blackwell chips has begun at TSMC's Phoenix facilities, while supercomputer assembly will occur at new Foxconn and Wistron plants in Houston and Dallas respectively.

"The engines of the world's AI infrastructure are being built in the United States for the first time," said Jensen Huang, Nvidia's founder and CEO. "Adding American manufacturing helps us better meet the incredible and growing demand for AI chips and supercomputers, strengthens our supply chain and boosts our resiliency."

The company will deploy its own AI, robotics, and digital twin technologies in these facilities, using Nvidia Omniverse to create digital twins of factories and Isaac GR00T to build manufacturing automation robots. Nvidia projects an ambitious $500 billion in domestic AI infrastructure production over the next four years, with manufacturing expected to create hundreds of thousands of jobs.

AMD

New Supercomputing Record Set - Using AMD's Instinct GPUs (tomshardware.com) 23

"AMD processors were instrumental in achieving a new world record," reports Tom's Hardware, "during a recent Ansys Fluent computational fluid dynamics simulation run on the Frontier supercomputer at the Oak Ridge National Laboratory."

The article points out that Frontier was the fastest supercomputer in the world until it was beaten by Lawrence Livermore Lab's El Capitan — with both computers powered by AMD GPUs: According to a press release by Ansys, it ran a 2.2-billion-cell axial turbine simulation for Baker Hughes, an energy technology company, testing its next-generation gas turbines aimed at increasing efficiency. The simulation previously took 38.5 hours to complete on 3,700 CPU cores. By using 1,024 AMD Instinct MI250X accelerators paired with AMD EPYC CPUs in Frontier, the simulation time was slashed to 1.5 hours. This is more than 25 times faster, allowing the company to see the impact of the changes it makes on designs much more quickly...
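
The headline speedup follows directly from the reported runtimes:

```python
cpu_runtime_hours = 38.5  # reported runtime on 3,700 CPU cores
gpu_runtime_hours = 1.5   # reported runtime on 1,024 MI250X GPUs + EPYC CPUs

speedup = cpu_runtime_hours / gpu_runtime_hours
print(f"~{speedup:.1f}x faster")  # about 25.7x, matching "more than 25 times"
```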

Given those numbers, the Ansys Fluent CFD simulator apparently only used a fraction of the power available on Frontier. That means it has the potential to run even faster if it can utilize all the available accelerators on the supercomputer. It also shows that, despite Nvidia's market dominance in AI GPUs, AMD remains a formidable competitor, with its CPUs and GPUs serving as the brains of some of the fastest supercomputers on Earth.
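As a sanity check, the reported runtimes can be turned into a speedup factor with quick arithmetic (figures taken from the Ansys press release quoted above):

```python
# Runtimes reported for the Baker Hughes axial turbine simulation
cpu_hours = 38.5   # original run on 3,700 CPU cores
gpu_hours = 1.5    # run on 1,024 AMD Instinct MI250X accelerators + EPYC CPUs

speedup = cpu_hours / gpu_hours
print(f"Speedup: {speedup:.1f}x")  # ~25.7x, i.e. "more than 25 times faster"
```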

Open Source

Startup Claims Its Upcoming (RISC-V ISA) Zeus GPU is 10X Faster Than Nvidia's RTX 5090 (tomshardware.com) 69

"The number of discrete GPU developers from the U.S. and Western Europe shrank to three companies in 2025," notes Tom's Hardware, "from around 10 in 2000." (Nvidia, AMD, and Intel...) In recent years, no company — at least outside of China — has been bold enough to enter into competition with these three contenders, so the very emergence of Bolt Graphics seems like a breakthrough. However, Bolt's Zeus focuses chiefly on high-quality rendering for the film and scientific industries, as well as high-performance supercomputer simulations. If Zeus delivers on its promises, it could establish itself as a serious alternative for scientific computing, path tracing, and offline rendering. But without strong software support, it risks struggling against the dominant market leaders.
This week the Sunnyvale, California-based startup introduced its Zeus GPU platform, designed for gaming, rendering, and supercomputer simulations, according to the article. "The company says that its Zeus GPU not only supports features like upgradeable memory and built-in Ethernet interfaces, but it can also beat Nvidia's GeForce RTX 5090 by around 10 times in path tracing workloads, according to a slide published by technology news site ServeTheHome." There is one catch: Zeus can only beat the RTX 5090 in path tracing and FP64 compute workloads. It's not clear how well it will handle traditional rendering techniques, as that was less of a focus. In speaking with Bolt Graphics, Tom's Hardware confirmed that the card does support rasterization, but there was less emphasis on that aspect of the GPU, and it may struggle to compete with the best graphics cards when it comes to gaming. And when it comes to data center options like Nvidia's Blackwell B200, it's an entirely different matter.

Unlike GPUs from AMD, Intel, and Nvidia that rely on proprietary instruction set architectures, Bolt's Zeus relies on the open-source RISC-V ISA, according to the published slides. The Zeus core pairs an open-source out-of-order general-purpose RVA23 scalar core with FP64 ALUs and an RVV 1.0 (RISC-V Vector Extension Version 1.0) unit that can handle 8-bit, 16-bit, 32-bit, and 64-bit data types, plus Bolt's additional proprietary extensions designed to accelerate scientific workloads... Like many processors these days, Zeus relies on a multi-chiplet design... Unlike high-end GPUs that prioritize bandwidth, Bolt is evidently focusing on greater memory capacity to handle larger datasets for rendering and simulations. The built-in 400GbE and 800GbE ports, which enable faster data transfer across networked GPUs, also point to Zeus's data center focus.

High-quality rendering, real-time path tracing, and compute are key focus areas for Zeus. As a result, even the entry-level Zeus 1c26-32 offers significantly higher FP64 compute performance than Nvidia's GeForce RTX 5090 — up to 5 TFLOPS vs. 1.6 TFLOPS — and considerably higher path tracing performance: 77 Gigarays vs. 32 Gigarays. Zeus also features a larger on-chip cache than Nvidia's flagship — up to 128MB vs. 96MB — and lower power consumption of 120W vs. 575W, making it more efficient for simulations, path tracing, and offline rendering. However, the RTX 5090 dominates in AI workloads with its 105 FP16 TFLOPS and 1,637 INT8 TFLOPS compared to the 10 FP16 TFLOPS and 614 INT8 TFLOPS offered by a single-chiplet Zeus...
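Taking the article's figures at face value, a back-of-the-envelope comparison of FP64 throughput per watt illustrates the efficiency claim (a sketch from the quoted spec numbers, not a benchmark):

```python
# Spec figures as quoted in the article (entry-level Zeus 1c26-32 vs. RTX 5090)
zeus_fp64_tflops, zeus_watts = 5.0, 120
rtx_fp64_tflops, rtx_watts = 1.6, 575

zeus_eff = zeus_fp64_tflops / zeus_watts   # ~0.042 TFLOPS per watt
rtx_eff = rtx_fp64_tflops / rtx_watts      # ~0.003 TFLOPS per watt
print(f"Zeus FP64 efficiency advantage: {zeus_eff / rtx_eff:.0f}x")  # ~15x
```

Of course, the per-watt picture reverses for FP16/INT8 AI workloads, where the RTX 5090's throughput lead is far larger than its power draw.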

The article emphasizes that Zeus "is only running in simulation right now... Bolt Graphics says that the first developer kits will be available in late 2025, with full production set for late 2026."

Thanks to long-time Slashdot reader arvn for sharing the news.
Supercomputing

Supercomputer Draws Molecular Blueprint For Repairing Damaged DNA (phys.org) 10

Using the Summit supercomputer at the Department of Energy's Oak Ridge National Laboratory, researchers have modeled a key component of nucleotide excision repair (NER) called the pre-incision complex (PInC), which plays a crucial role in DNA damage repair. Their study, published in Nature Communications, provides new insights into how the PInC machinery orchestrates precise DNA excision, potentially leading to advancements in treating genetic disorders, preventing premature aging, and understanding conditions like xeroderma pigmentosum and Cockayne syndrome. Phys.Org reports: "Computationally, once you assemble the PInC, molecular dynamics simulations of the complex become relatively straightforward, especially on large supercomputers like Summit," [said lead investigator Ivaylo Ivanov, a chemistry professor at Georgia State University]. Nanoscale Molecular Dynamics, or NAMD, is a molecular dynamics code specifically designed for supercomputers and is used to simulate the movements and interactions of large biomolecular systems that contain millions of atoms. Using NAMD, the research team ran extensive simulations. The number-crunching power of the 200-petaflop Summit supercomputer -- capable of performing 200,000 trillion calculations per second -- was essential in unraveling the functional dynamics of the PInC complex on a timescale of microseconds. "The simulations showed us a lot about the complex nature of the PInC machinery. It showed us how these different components move together as modules and the subdivision of this complex into dynamic communities, which form the moving parts of this machine," Ivanov said.
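The performance figure quoted for Summit checks out with a quick unit conversion (peta = 10^15, trillion = 10^12):

```python
# 200 petaflops expressed in "trillion calculations per second"
PETA = 10**15
TRILLION = 10**12

summit_flops = 200 * PETA
print(summit_flops // TRILLION)  # 200000 trillion calculations per second
```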

The findings are significant in that mutations in XPF and XPG can lead to severe human genetic disorders. They include xeroderma pigmentosum, which is a condition that makes people more susceptible to skin cancer, and Cockayne syndrome, which can affect human growth and development, lead to impaired hearing and vision, and speed up the aging process. "Simulations allow us to zero in on these important regions because mutations that interfere with the function of the NER complex often occur at community interfaces, which are the most dynamic regions of the machine," Ivanov said. "Now we have a much better understanding of how and from where these disorders manifest."
