iPhone

Apple Unveils iPhone 16 Pro Featuring Bigger Screen, New Chip And Pro Recording Options (theverge.com) 37

Apple announced the iPhone 16 Pro lineup at Monday's product event. The company's new flagship smartphones have slightly bigger screens across both models: 6.3 inches on the iPhone 16 Pro and 6.9 inches on the iPhone 16 Pro Max (up from 6.1 inches and 6.7 inches, respectively, on the iPhone 15 Pro and 15 Pro Max). The Verge: The bodies of the phones are once again made from titanium. They come in four colors: black, white, natural, and a new "desert titanium." Apple also claims that the iPhone 16 Pro Max has "the best iPhone battery life ever." The iPhone 16 Pro lineup comes with the A18 Pro chip, with a 16-core Neural Engine that Apple says will offer "amazing performance" for Apple Intelligence features, including 15 percent faster performance than the iPhone 15 Pro. It also has improved graphics performance thanks to a 6-core GPU that's 20 percent faster than the iPhone 15 Pro's A17 Pro.

The iPhone 16 Pro starts at $999, whereas the iPhone 16 Pro Max starts at $1,199.

Hardware

EmuDeck Enters the Mini PC Market With Linux-Powered 'EmuDeck Machines' (overkill.wtf) 11

An anonymous reader quotes a report from overkill.wtf: The team behind popular emulation tool EmuDeck is today announcing something rather special: they've spent the first half of 2024 working on their very first hardware product, called the EmuDeck Machine, and it's due to arrive before the year is out. This EmuDeck Machine is an upcoming, crowdfunded, retro emulation mini PC running Bazzite, a Linux-based system similar to SteamOS. [...] This new EmuDeck Machine comes in two variants, the EM1 running an Intel N97 APU, and the EM2 -- based on an AMD Ryzen 8600G. While both machines are meant as emulation-first devices, the AMD-based variant can easily function as a console-like PC. This is also thanks to some custom work done by the team: "We've optimized the system for maximum power. The default configuration of an 8600G gets you 32 FPS in Cyberpunk; we've managed to reach 47 FPS with a completely stable system, or 60FPS if you use FSR."

Both machines will ship with a Gamesir Nova Lite controller and, naturally, EmuDeck preinstalled. The team has also preinstalled all available Decky plugins. But that's not all: if the campaign is successful, the EmuDeck team will also work on a docking station for the EM2 that will upgrade the graphics to an AMD Radeon 7600 desktop GPU. With it, games like Cyberpunk 2077 will be able to reach 160 FPS at 1080p, according to EmuDeck's measurements.
You can preorder the EmuDeck Machines via Indiegogo, starting at $322 and shipping in December.
Classic Games (Games)

Hydrogels Can Learn To Play Pong (arstechnica.com) 11

An anonymous reader quotes a report from Ars Technica: Pong will always hold a special place in the history of gaming as one of the earliest arcade video games. Introduced in 1972, it was a table tennis game featuring very simple graphics and gameplay. In fact, it's simple enough that even non-living materials known as hydrogels can "learn" to play the game by "remembering" previous patterns of electrical stimulation, according to a new paper published in the journal Cell Reports Physical Science. "Our research shows that even very simple materials can exhibit complex, adaptive behaviors typically associated with living systems or sophisticated AI," said co-author Yoshikatsu Hayashi, a biomedical engineer at the University of Reading in the UK. "This opens up exciting possibilities for developing new types of 'smart' materials that can learn and adapt to their environment." [...]

The experimental setup was fairly simple. The researchers hooked up electroactive hydrogels to a simulated virtual environment of a Pong game using a custom-built electrode array. The games would start with the ball traveling in a random direction. The hydrogels tracked the ball's position via electrical stimulation and tracked the paddle's position by measuring the distribution of ions in the hydrogels. As the games progressed, the researchers measured how often the hydrogel managed to hit the ball with the paddle. They found that, over time, the hydrogels' accuracy improved, hitting the ball more frequently for longer rallies. They reached their maximum potential for accuracy in about 20 minutes, compared to 10 minutes for DishBrain, the dish of lab-grown neurons that previously learned to play Pong. The authors attribute this to the ion movement essentially mapping out a "memory" of all motion over time, exhibiting what appears to be emergent memory functions within the material itself. Perhaps the next step will be to "teach" the hydrogels how to align the paddles in such a way that the rallies go on indefinitely.

Nintendo

Nintendo Completely Sat Out the Video Game Graphics Wars. It's Winning Anyway. (sherwood.news) 70

Manny Fidel, reporting for Sherwood News: When you're immersed in a game like "Cyberpunk 2077," it's easy to get lost in its realism. As you run around the crowded streets of Night City, you notice the reflections of the city lights and neon signs in the puddles when it rains. Even the complexion and texture of a character's skin are enamoring. At full power, the game, created by CD Projekt Red, is a graphical marvel. It's also a symbol of a decades-long arms race between the biggest video game companies to make things look as real as possible. And then there are Nintendo games.

Take 2022's "Pokemon Scarlet" and "Pokemon Violet" on the Nintendo Switch. Despite being the latest releases in a legendary franchise, their graphics could easily have come from games published 15 years ago. It's a perfect example of how, sometimes to the frustration of gamers, Nintendo seemingly refuses to step into the present day. None of its flagship games really compete with the rest of the industry's optical experiences. The graphics of games like "Red Dead Redemption 2," "Starfield," and "The Last of Us: Part II" are decades ahead of Nintendo's.

But here's the thing: Nintendo doesn't have to catch up, nor does it want to. "Pokemon Scarlet" and "Pokemon Violet" sold 10 million copies during their launch weekend alone. According to IGN, Nintendo is responsible for three of the top five bestselling video game consoles of all time. Its characters -- Mario and Luigi, Link and Zelda, Pikachu and Ash -- have defined and are constantly redefining the industry. Nintendo is a money machine. It's been raking in more than $10 billion in revenue (more than 1.6 trillion yen) annually for the past several years, and its profits have grown sharply, topping out at about $3.3 billion in the fiscal year ended March 2024. For comparison, in its latest fiscal year, Sony's gaming division generated $29.1 billion of revenue and an operating profit of nearly $2 billion. Nintendo posted $11.4 billion of revenue and an operating profit of $3.6 billion.

AI

Elliott Says Nvidia is in a 'Bubble' and AI is 'Overhyped' 73

Hedge fund Elliott Management has told investors that Nvidia is in a "bubble," and the AI technology driving the chipmaking giant's share price is "overhyped." From a report: The Florida-based firm, which manages about $70bn in assets, said in a recent letter to clients seen by the Financial Times that the megacap technology stocks, particularly Nvidia, were in "bubble land." [non-paywalled link] It added that it was "sceptical" that Big Tech companies would keep buying the chipmaker's graphics processing units in such high volumes, and that AI is "overhyped with many applications not ready for prime time."

[...] Many of AI's supposed uses are "never going to be cost-efficient, are never going to actually work right, will take up too much energy, or will prove to be untrustworthy," it said. Elliott, which was founded by billionaire Paul Singer in 1977, added in its client letter that, so far, AI had failed to deliver a promised huge uplift in productivity. "There are few real uses," it said, other than "summarising notes of meetings, generating reports and helping with computer coding." AI, it added, was in effect software that had so far not delivered "value commensurate with the hype."
Open Source

Nvidia's Open-Source Linux Kernel Driver Performing At Parity To Proprietary Driver (phoronix.com) 21

Nvidia's new R555 Linux driver series has significantly improved its open-source GPU kernel driver modules, achieving near parity with its proprietary drivers. Phoronix's Michael Larabel reports: The NVIDIA open-source kernel driver modules shipped by their driver installer and also available via their GitHub repository are in great shape. With the R555 series, the support and performance of their open-source kernel modules are basically at parity with their proprietary kernel drivers. [...] Across a range of different GPU-accelerated creator workloads, the performance of the open-source NVIDIA kernel modules matched that of the proprietary driver. No loss in performance going the open-source kernel driver route. Across various professional graphics workloads, both the NVIDIA RTX A2000 and A4000 graphics cards were also achieving the same performance whether on the open-source MIT/GPLv2 driver or using NVIDIA's classic proprietary driver.

Across all of the tests I carried out using the NVIDIA 555 stable series Linux driver, the open-source NVIDIA kernel modules were able to achieve the same performance as the classic proprietary driver. Also important is that there was no increased power use or other difference in power management when switching over to the open-source NVIDIA kernel modules.

It's great seeing how far the NVIDIA open-source kernel modules have evolved, and with the upcoming NVIDIA 560 Linux driver series they will be the default on supported GPUs. And moving forward with Blackwell and beyond, NVIDIA will enable new GPU support only in its open-source kernel drivers, leaving the proprietary kernel drivers to older hardware. Tests I have done using NVIDIA GeForce RTX 40 graphics cards with Linux gaming workloads between the MIT/GPL and proprietary kernel drivers have yielded similar (boring but good) results: the same performance being achieved with no loss going the open-source route.
You can view Phoronix's performance results in charts here, here, and here.
Graphics

Nvidia RTX 40-Series GPUs Hampered By Low-Quality Thermal Paste (pcgamer.com) 50

"Anyone who is into gaming knows your graphics card is under strain trying to display modern graphics," writes longtime Slashdot reader smooth wombat. "This results in increased power usage, which is then turned into heat. Keeping your card cool is a must to get the best performance possible."

"However, hardware tester Igor's Lab found that vendors for Nvidia RTX 40-series cards are using cheap, poorly applied thermal paste, which is leading to high temperatures and consequently, performance degradation over time. This penny-pinching has been confirmed by Nick Evanson at PC Gamer." From the report: I have four RTX 40-series cards in my office (RTX 4080 Super, 4070 Ti, and two 4070s) and all of them have quite high hotspots -- the highest temperature recorded by an individual thermal sensor in the die. In the case of the 4080 Super, it's around 11 C higher than the average temperature of the chip. I took it apart to apply some decent quality thermal paste and discovered a similar situation to that found by Igor's Lab. In the space of a few months, the factory-applied paste had separated and spread out, leaving just an oily film behind, and a few patches of the thermal compound itself. I checked the other cards and found that they were all in a similar state.

Igor's Lab examined the thermal paste used on a brand-new RTX 4080 and found it to be quite thin in nature, due to large quantities of cheap silicone oil being used, along with zinc oxide filler. There was lots of ground aluminium oxide (the material that provides the actual thermal transfer) but it was quite coarse, leading to the paste separating quite easily. Removing the factory-installed paste from another RTX 4080 graphics card, Igor's Lab applied a more appropriate amount of a high-quality paste and discovered that it lowered the hotspot temperature by nearly 30 C.

AMD

AMD Claims Its Top-Tier Ryzen AI Chip Is Faster Than Apple's M3 Pro 42

AMD has introduced its latest Ryzen AI chips, built on the new Zen 5 architecture, in an ambitious attempt to compete with Apple's dominant MacBook processors. During a recent two-day event in Los Angeles, the company made bold claims about outperforming Apple's M3 and M3 Pro chips in various tasks including multitasking, image processing, and gaming, though these assertions remain unverified due to limited demonstrations and benchmarks provided at the event, The Verge reports. The report adds: At that event, I heard AMD brag about beating the MacBook more than I've ever heard a company directly target a competitor before. AMD claimed its new Ryzen chip "exceeds the performance of what MacBook Air has to offer in multitasking, image processing, 3D rendering, and gaming"; "is 15 percent faster than the M3 Pro" in Cinebench; and is capable of powering up to four displays, "unlike the MacBook Air, which limits you to two displays only." While AMD touted significant improvements in CPU architecture, graphics performance, and AI capabilities, journalists present at the event were unable to fully test or validate these features, leaving many questions unanswered about the chips' real-world performance.

The company's reluctance or inability to showcase certain capabilities, particularly in gaming and AI applications, has raised eyebrows among industry observers, the report adds. The new Ryzen AI chips are scheduled to debut in Asus laptops on July 28th, marking a critical juncture for AMD in the fiercely competitive laptop processor market. As Apple's M-series chips and Qualcomm's Snapdragon processors continue to gain traction in the mobile computing space, the success or failure of AMD's latest offering could have far-reaching implications for the future of x86 architecture in laptops.
Linux

Linux Kernel 6.10 Released (omgubuntu.co.uk) 15

"The latest version of the Linux kernel adds an array of improvements," writes the blog OMG Ubuntu, " including a new memory sealing system call, a speed boost for AES-XTS encryption on Intel and AMD CPUs, and expanding Rust language support within the kernel to RISC-V." Plus, like in all kernel releases, there's a glut of groundwork to offer "initial support" for upcoming CPUs, GPUs, NPUs, Wi-Fi, and other hardware (that most of us don't use yet, but require Linux support to be in place for when devices that use them filter out)...

Linux 6.10 adds (after much gnashing) the mseal() system call to prevent changes being made to portions of the virtual address space. For now, this will mainly benefit Google Chrome, which plans to use it to harden its sandboxing. Work is underway by kernel contributors to allow other apps to benefit, though. Another initially controversial change that was merged is a new memory-allocation profiling subsystem. This helps developers fine-tune memory usage and more readily identify memory leaks. An explainer from LWN summarizes it well.
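
For a sense of how the new call is used, here is a minimal C sketch of mseal() (an illustration only: glibc does not yet provide a wrapper, so the x86-64 syscall number is defined by hand as an assumption). A mapping is created, sealed, and a later attempt to change its protection is expected to fail:

    #include <stdio.h>
    #include <string.h>
    #include <sys/mman.h>
    #include <sys/syscall.h>
    #include <unistd.h>

    #ifndef __NR_mseal
    #define __NR_mseal 462  /* assumed x86-64 syscall number in Linux 6.10 */
    #endif

    int main(void)
    {
        size_t len = 4096;
        char *p = mmap(NULL, len, PROT_READ | PROT_WRITE,
                       MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
        if (p == MAP_FAILED) { perror("mmap"); return 1; }
        strcpy(p, "sealed mapping");

        /* Seal the region: later mprotect/munmap/mremap calls should fail. */
        if (syscall(__NR_mseal, p, len, 0) != 0) { perror("mseal"); return 1; }

        if (mprotect(p, len, PROT_READ | PROT_WRITE | PROT_EXEC) != 0)
            perror("mprotect on sealed region (expected to fail)");
        return 0;
    }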

Elsewhere, Linux 6.10 offers encrypted interactions with trusted platform modules (TPM) in order to "make the kernel's use of the TPM reasonably robust in the face of external snooping and packet alteration attacks". The documentation for this feature explains: "for every in-kernel operation we use null primary salted HMAC to protect the integrity [and] we use parameter encryption to protect key sealing and parameter decryption to protect key unsealing and random number generation." Sticking with security, the Linux kernel's Landlock security module can now apply policies to ioctl() calls (Input/Output Control), restricting potential misuse and improving overall system security.
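
To make the Landlock change concrete, here is a minimal sketch of a process restricting its own ioctl() access to device files. It assumes the ABI v5 flag name from the Linux 6.10 Landlock documentation and headers new enough to define it; with an empty ruleset, no device ioctls are granted:

    #include <linux/landlock.h>
    #include <stdio.h>
    #include <sys/prctl.h>
    #include <sys/syscall.h>
    #include <unistd.h>

    int main(void)
    {
        struct landlock_ruleset_attr attr = {
            /* assumption: LANDLOCK_ACCESS_FS_IOCTL_DEV is the new ABI v5 right */
            .handled_access_fs = LANDLOCK_ACCESS_FS_IOCTL_DEV,
        };
        int ruleset_fd = syscall(SYS_landlock_create_ruleset, &attr, sizeof(attr), 0);
        if (ruleset_fd < 0) { perror("landlock_create_ruleset"); return 1; }

        /* No rules are added, so once the policy is enforced, ioctl() on
           device files is denied for this process and its children. */
        prctl(PR_SET_NO_NEW_PRIVS, 1, 0, 0, 0);
        if (syscall(SYS_landlock_restrict_self, ruleset_fd, 0) != 0) {
            perror("landlock_restrict_self");
            return 1;
        }
        close(ruleset_fd);
        puts("Landlock policy applied; device ioctls are now restricted.");
        return 0;
    }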

On the networking side there are significant performance improvements to zero-copy send operations using io_uring, and the newly added ability to "bundle" multiple buffers for send and receive operations also offers an uptick in performance...
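
As a rough illustration of what the zero-copy send path looks like from userspace (a sketch using liburing's io_uring_prep_send_zc() helper; the connected socket and buffer are assumed to exist already):

    #include <liburing.h>
    #include <stddef.h>

    /* Submit one zero-copy send on an already-connected socket and wait
       until the kernel signals that it has finished with the buffer. */
    int send_zerocopy(int sockfd, const void *buf, size_t len)
    {
        struct io_uring ring;
        if (io_uring_queue_init(8, &ring, 0) < 0)
            return -1;

        struct io_uring_sqe *sqe = io_uring_get_sqe(&ring);
        io_uring_prep_send_zc(sqe, sockfd, buf, len, 0, 0);
        io_uring_submit(&ring);

        int pending = 1;
        while (pending-- > 0) {
            struct io_uring_cqe *cqe;
            if (io_uring_wait_cqe(&ring, &cqe) < 0)
                break;
            /* The send completion carries IORING_CQE_F_MORE; a second CQE
               later notifies that the buffer may be reused or freed. */
            if (cqe->flags & IORING_CQE_F_MORE)
                pending++;
            io_uring_cqe_seen(&ring, cqe);
        }
        io_uring_queue_exit(&ring);
        return 0;
    }

Compile with -luring; the application-side API is unchanged, and the 6.10 improvements simply make this path faster.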

A couple of months ago Canonical announced Ubuntu support for the RISC-V Milk-V Mars single-board computer. Linux 6.10 mainlines support for the Milk-V Mars, which will make that effort a lot more viable (especially with the Ubuntu 24.10 kernel likely to be v6.10 or newer). Other RISC-V improvements abound in Linux 6.10, including support for the Rust language; boot image compression in BZ2, LZ4, LZMA, LZO, and Zstandard (instead of only Gzip); and newer AMD GPUs thanks to kernel-mode FPU support in RISC-V.

Phoronix has their own rundown of Linux 6.10, plus a list of some of the highlights, which includes:
  • The initial DRM Panic infrastructure
  • The new Panthor DRM driver for newer Arm Mali graphics
  • Better AMD ROCm/AMDKFD support for "small" Ryzen APUs and new additions for AMD Zen 5
  • AMD GPU display support on RISC-V hardware thanks to RISC-V kernel mode FPU
  • More Intel Xe2 graphics preparations
  • Better IO_uring zero-copy performance
  • Faster AES-XTS disk/file encryption with modern Intel and AMD CPUs
  • Continued online repair work for XFS
  • Steam Deck IMU support
  • TPM bus encryption and integrity protection

Graphics

Arm Announces an Open-Source Graphics Upscaler For Mobile Phones (theverge.com) 6

Arm is launching its Arm Accuracy Super Resolution (ASR) upscaler that "can make games look better, while lowering power consumption on your phone," according to The Verge. "It's also making the upscaling technology available to developers under an MIT open-source license." From the report: Arm based its technology on AMD's FidelityFX Super Resolution 2 (FSR 2), which uses temporal upscaling to make PC games look better and boost frame rates. Unlike spatial upscaling, which upscales an image based on a single frame, temporal upscaling involves using multiple frames to generate a higher-quality image.
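
As a generic illustration of that difference (a conceptual sketch of the temporal-accumulation idea behind FSR 2-style upscalers, not Arm's actual ASR code; names and the blend factor are illustrative): each output pixel blends the current frame's low-resolution sample with a history sample reprojected from earlier frames via motion vectors, so detail accumulates over time.

    typedef struct { float r, g, b; } Color;

    /* One output pixel per frame: lerp the reprojected history toward the
       new low-resolution sample. A small blend factor (e.g. ~0.1) keeps most
       of the accumulated history, which is what recovers detail over time;
       a spatial upscaler would use only current_sample. */
    static Color temporal_accumulate(Color current_sample, Color history_sample,
                                     float blend)
    {
        Color out = {
            history_sample.r + blend * (current_sample.r - history_sample.r),
            history_sample.g + blend * (current_sample.g - history_sample.g),
            history_sample.b + blend * (current_sample.b - history_sample.b),
        };
        return out;
    }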

You can see just how Arm ASR stacks up to AMD's FSR 2 and Qualcomm's GSR tech in [this chart] created by Arm. Arm claims ASR produced 53 percent higher frame rates than rendering at native resolution on a device with an Arm Immortalis-G720 GPU and 2800 x 1260 display, beating AMD FSR 2. It also tested ASR on a device using MediaTek's Dimensity 9300 chip and found that rendering at 540p and upscaling with ASR used much less power than running a game at native 1080p resolution.

Graphics

Affinity Tempts Adobe Users with 6-Month Free Trial of Creative Suite (theverge.com) 39

Serif, the design software developer behind Affinity, has introduced a six-month free trial for its creative suite, offering Affinity Photo, Designer, and Publisher on Mac, Windows PC, and iPad. This move, along with a 50% discount on perpetual licenses, aims to attract Adobe users and reassure them of Affinity's commitment to its one-time purchase pricing model despite its recent acquisition by Canva. The Verge reports: Affinity uses a one-time purchase pricing model that has earned it a loyal fanbase among creatives who are sick of paying for recurring subscriptions. Prices start at $69.99 for Affinity's individual desktop apps or $164.99 for the entire suite, with a separate deal currently offering customers 50 percent off all perpetual licenses.

This discount, alongside the six-month free trial, is potentially geared at soothing concerns that Affinity would change its pricing model after being acquired by Canva earlier this year. "We're saying 'try everything and pay nothing' because we understand making a change can be a big step, particularly for busy professionals," said Affinity CEO Ashley Hewson. "Anyone who takes the trial is under absolutely no obligation to buy."

China

Nvidia Forecasted To Make $12 Billion Selling GPUs In China (theregister.com) 4

Nvidia is expected to earn $12 billion from GPU sales to China in 2024, despite U.S. trade restrictions. Research firm SemiAnalysis says the GPU maker will ship over 1 million units of its new H20 model to the Chinese market, "with each one said to cost between $12,000 and $13,000," reports The Register. From the report: This figure is said by SemiAnalysis to be nearly double what Huawei is likely to sell of its rival accelerator, the Ascend 910B, as reported by The Financial Times. If accurate, this would seem to contradict earlier reports that Nvidia had moved to cut the price of its products for the China market. This was because buyers were said to be opting instead for domestically made kit for accelerating AI workloads. The H20 GPU is understood to be the top-performing model out of three Nvidia GPUs specially designed for the Chinese market to comply with rules introduced by the Biden administration last year that curb performance.

In contrast, Huawei's Ascend 910B is claimed to have performance on a par with that of Nvidia's A100 GPU. It is believed to be an in-house design manufactured by Chinese chipmaker SMIC using a 7nm process technology, unlike the older Ascend 910 product. If this forecast proves accurate, it will be a relief for Nvidia, which earlier disclosed that its sales in China delivered a "mid-single digit percentage" of revenue for its Q4 of FY2024, and was forecast to do the same in Q1 of FY 2025. In contrast, the Chinese market had made up between 20 and 25 percent of the company's revenue in recent years, until the export restrictions landed.

Games

Kien, the Most-Delayed Video Game in History, Released After 22 Years (theguardian.com) 24

An Italian video game, 22 years in the making, has finally hit the market, setting a record for the longest development time in gaming history. "Kien," an action platformer for Nintendo's Game Boy Advance, entered development in 2002 with a group of five inexperienced enthusiasts, The Guardian reports. Only one, Fabio Belsanti, saw the project through to completion. The game, inspired by 15th-century Tuscan manuscripts and early Japanese graphics, offers a challenging, nonlinear fantasy experience. It's now available on a translucent gray cartridge, complete with a printed manual -- a rarity in modern gaming. Belsanti's company, AgeOfGames, survived the delay by creating educational games. The recent boom in retro gaming finally made Kien's release feasible, he said.
Businesses

French Antitrust Regulators Preparing Nvidia Charges (reuters.com) 28

French antitrust regulators are preparing to charge Nvidia for allegedly anti-competitive practices, Reuters reported Monday, citing sources. From the report: The French so-called statement of objections, or charge sheet, would follow dawn raids in the graphics cards sector in September last year, which sources said targeted Nvidia. The world's largest maker of chips used both for artificial intelligence and for computer graphics has seen demand for its chips jump following the release of the generative AI application ChatGPT, triggering regulatory scrutiny on both sides of the Atlantic.
AI

Is AI's Demand for Energy Really 'Insatiable'? (arstechnica.com) 56

Bloomberg and The Washington Post "claim AI power usage is dire," writes Slashdot reader NoWayNoShapeNoForm. But Ars Technica "begs to disagree with those speculations."

From Ars Technica's article: The high-profile pieces lean heavily on recent projections from Goldman Sachs and the International Energy Agency (IEA) to cast AI's "insatiable" demand for energy as an almost apocalyptic threat to our power infrastructure. The Post piece even cites anonymous "some [people]" in reporting that "some worry whether there will be enough electricity to meet [the power demands] from any source." Digging into the best available numbers and projections, though, it's hard to see AI's current and near-future environmental impact in such a dire light... While the headline focus of both Bloomberg and The Washington Post's recent pieces is on artificial intelligence, the actual numbers and projections cited in both pieces overwhelmingly focus on the energy used by Internet "data centers" as a whole...

Bloomberg asks one source directly "why data centers were suddenly sucking up so much power" and gets back a blunt answer: "It's AI... It's 10 to 15 times the amount of electricity." Unfortunately for Bloomberg, that quote is followed almost immediately by a chart that heavily undercuts the AI alarmism. That chart shows worldwide data center energy usage growing at a remarkably steady pace from about 100 TWh in 2012 to around 350 TWh in 2024. The vast majority of that energy usage growth came before 2022, when the launch of tools like Dall-E and ChatGPT largely set off the industry's current mania for generative AI. If you squint at Bloomberg's graph, you can almost see the growth in energy usage slowing down a bit since that momentous year for generative AI.

Ars Technica first cites Dutch researcher Alex de Vries's estimate that in a few years the AI sector could use between 85 and 134 TWh of electricity per year. But another study estimated in 2018 that PC gaming already accounted for 75 TWh of electricity use per year, while "the IEA estimates crypto mining ate up 110 TWh of electricity in 2022." More to the point, de Vries' AI energy estimates are only a small fraction of the 620 to 1,050 TWh that data centers as a whole are projected to use by 2026, according to the IEA's recent report. The vast majority of all that data center power will still be going to more mundane Internet infrastructure that we all take for granted (and which is not nearly as sexy of a headline bogeyman as "AI").
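
Putting the two sets of figures quoted above side by side (a back-of-the-envelope calculation, nothing more):

    #include <stdio.h>

    int main(void)
    {
        /* de Vries' near-future AI estimate vs. the IEA's 2026 projection
           for data centers as a whole, both in TWh per year. */
        double ai_low = 85, ai_high = 134;
        double dc_low = 620, dc_high = 1050;

        printf("AI share of projected data-center electricity: %.0f%%-%.0f%%\n",
               100.0 * ai_low / dc_high, 100.0 * ai_high / dc_low);
        return 0;
    }

That works out to somewhere between roughly 8 and 22 percent of the projected data-center total, which is the "small fraction" the article is pointing at.
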
The future is also hard to predict, the article concludes. "If customers don't respond to the hype by actually spending significant money on generative AI at some point, the tech-marketing machine will largely move on, as it did very recently with the metaverse and NFTs..."
Unix

X Window System Turns 40 52

Ancient Slashdot reader ewhac writes: On June 19, 1984, Robert Scheifler announced on MIT's Project Athena mailing list a new graphical windowing system he'd put together. Having cribbed a fair bit of code from the existing windowing toolkit called W, Scheifler named his new system X, thus giving birth to the X Window System. Scheifler prophetically wrote at the time, "The code seems fairly solid at this point, although there are still some deficiencies to be fixed up."

The 1980s and 1990s saw tremendous activity in the development of graphical displays and user interfaces, and X was right in the middle of it all, alongside Apple, Sun, Xerox, Apollo, Silicon Graphics, NeXT, and many others. Despite the fierce, well-funded competition, and heated arguments about how many buttons a mouse should have, X managed to survive, due in large part to its Open Source licensing and its flexible design, allowing it to continue to work well even as graphical hardware rapidly advanced. As such, it was ported to dozens of platforms over the years (including a port to the Amiga computer by Dale Luck in the late 1980s). 40 years later, despite its warts, inconsistencies, age, and Wayland promising for the last ten years to be coming Real Soon Now, X remains the windowing system for UNIX-like platforms.
KDE

KDE Plasma 6.1 Released (kde.org) 42

"The KDE community announced the latest release of their popular desktop environment: Plasma 6.1," writes longtime Slashdot reader jrepin. From the announcement: While Plasma 6.0 was all about getting the migration to the underlying Qt 6 frameworks correct, Plasma 6.1 is where developers start implementing the features that will take you desktop to a new level. In this release, you will find features that go far beyond subtle changes to themes and tweaks to animations (although there is plenty of those too). Among some of the new features in this release you will find improved remote desktop support with a new built-in server, overhauled and streamlined desktop edit mode, restoration of open applications from the previous session on Wayland, synchronization of keyboard LED colors with the desktop accent color, making mouse cursor bigger and easier to find by shaking it, edge barriers (a sticky area for mouse cursor near the edges between screens), explicit sync support eliminates flickering and glitches for NVidia graphics card users on Wayland, and triple buffering support for smoother animations and screen rendering. The changelog for Plasma 6.1 is available here.
XBox (Games)

Upcoming Games Include More Xbox Sequels - and a Medieval 'Doom' (polygon.com) 32

Announced during Microsoft's Xbox Games Showcase, Doom: The Dark Ages is id Software's next foray back into hell. [Also available for PS5 and PC.] Doom: The Dark Ages is a medieval spin on the Doom franchise, taking the Doom Slayer back to the beginning. It's coming to Xbox Game Pass on day one, sometime in 2025.

Microsoft's first trailer for Doom: The Dark Ages shows the frenetic, precision gameplay we've come to expect from the franchise — there's a lot of blasting and shooting and a chainsaw. Oh, and the Doom Slayer can ride a dragon?

"Before he became a hero he was the super weapon of gods and kings," says the trailer (which showcases the game's crazy-good graphics...) The 2020 game Doom Eternal sold 3 million copies in its first month, according to Polygon, with its game director telling the site in 2021 that "our hero is somewhat timeless — I mean, literally, he's immortal. So we could tell all kinds of stories..."

Other upcoming Xbox games were revealed too. Engadget is excited about the reboot of the first-person shooter Perfect Dark (first released in 2000, but now set in the near future). There's also Gears of War: E-Day, Indiana Jones and the Great Circle, State of Decay 3, and Assassin's Creed Shadows, according to Xbox.com — plus "the announcement of three new Xbox Series X|S console options." [Engadget notes it's the first time Microsoft has offered a cheaper all-digital Xbox Series X with no disc drive.] "And on top of all that, we also brought the gameplay reveal of a brand-new Call of Duty game with Call of Duty: Black Ops 6."

Meanwhile, Friday's Summer Game Fest 2024 featured Star Wars Outlaws footage (which according to GamesRadar takes place between Empire Strikes Back and Return of the Jedi, featuring not just card games with Lando Calrissian but also Jabba the Hutt and a frozen Han Solo.) Engadget covered all the announcements from Game Fest, including the upcoming game Mixtape, which Engadget calls a "reality-bending adventure" with "a killer '80s soundtrack" about three cusp-of-adulthood teenagers who "Skate. Party. Avoid the law. Make out. Sneak out. Hang out..." for Xbox/PS5/PC.
Graphics

Nvidia Takes 88% of the GPU Market Share (xda-developers.com) 83

As reported by Jon Peddie Research, Nvidia now holds 88% of the GPU market after its market share jumped 8 percentage points in its most recent quarter. "This jump shaves 7% off of AMD's share, putting it down to 12% total," reports XDA Developers. "And if you're wondering where that extra 1% went, it came from all of Intel's market share, squashing it down to 0%." From the report: Dr. Jon Peddie, president of Jon Peddie Research, mentions how the GPU market hasn't really looked "normal" since the 2007 recession. Ever since then, everything from the crypto boom to COVID has messed with the usual patterns. Usually, the first quarter of a year shows a bit of a dip in GPU sales, but because of AI's influence, it may seem like that previous norm may be forever gone: "Therefore, one would expect Q2'24, a traditional quarter, to also be down. But, all the vendors are predicting a growth quarter, mostly driven by AI training systems in hyperscalers. Whereas AI trainers use a GPU, the demand for them can steal parts from the gaming segment. So, for Q2, we expect to see a flat to low gaming AIB result and another increase in AI trainer GPU shipments. The new normality is no normality."
Ubuntu

Canonical Launches Ubuntu Core 24 (ubuntu.com) 5

Canonical, the company behind Ubuntu, has released Ubuntu Core 24, a version of its operating system designed for edge devices and the Internet of Things (IoT). The new release comes with a 12-year Long Term Support commitment and features that enable secure, reliable, and efficient deployment of intelligent devices.

Ubuntu Core 24 introduces validation sets for custom image creation, offline remodelling for air-gapped environments, and new integrations for GPU operations and graphics support. It also offers device management integrations with Landscape and Microsoft Azure IoT Edge. The release is expected to benefit various industries, including automation, healthcare, and robotics, Canonical said.
