Graphics

Graphics Artists In China Push Back On AI and Its Averaging Effect (theverge.com) 33

Graphic artists in China are pushing back against AI image generators, which they say "profoundly shifts clients' perception of their work, specifically in terms of how much that work costs and how much time it takes to produce," reports The Verge. "Freelance artists or designers working in industries with clients that invest in stylized, eye-catching graphics, like advertising, are particularly at risk." From the report: Long before AI image generators became popular, graphic designers at major tech companies and in-house designers for large corporate clients were often instructed by managers to crib aesthetics from competitors or from social media, according to one employee at a major online shopping platform in China, who asked to remain anonymous for fear of retaliation from their employer. Where a human would need to understand and reverse engineer a distinctive style to recreate it, AI image generators simply create randomized mutations of it. Often, the results will look like obvious copies and include errors, but other graphic designers can then edit them into a final product.

"I think it'd be easier to replace me if I didn't embrace [AI]," the shopping platform employee says. Early on, as tools like Stable Diffusion and Midjourney became more popular, their colleagues who spoke English well were selected to study AI image generators to increase in-house expertise on how to write successful prompts and identify what types of tasks AI was useful for. Ultimately, it was useful for copying styles from popular artists that, in the past, would take more time to study. "I think it forces both designers and clients to rethink the value of designers," Jia says. "Is it just about producing a design? Or is it about consultation, creativity, strategy, direction, and aesthetic?" [...]

Across the board, though, artists and designers say that AI hype has negatively impacted clients' view of their work's value. Now, clients expect a graphic designer to produce work on a shorter timeframe and for less money, which also has its own averaging impact, lowering the ceiling for what designers can deliver. As clients lower budgets and squish timelines, the quality of the designers' output decreases. "There is now a significant misperception about the workload of designers," [says Erbing, a graphic designer in Beijing who has worked with several ad agencies and asked to be called by his nickname]. "Some clients think that since AI must have improved efficiency, they can halve their budget." But this perception runs contrary to what designers spend the majority of their time doing, which is not necessarily just making any image, Erbing says.

Software

Blender 5.0 Introducing HDR Support On Linux With Vulkan + Wayland (phoronix.com) 24

Michael Larabel writes via Phoronix: The upcoming Blender 5.0 3D modeling application is introducing High Dynamic Range (HDR) display support on Linux when making use of Wayland -- there is no X11 support for HDR -- and Vulkan graphics acceleration. HDR support for Blender 5.0 on Linux is currently considered experimental. Enabling HDR support on Linux for the Blender creator software requires having a High Dynamic Range display (of course), running on a Wayland desktop, enabling Vulkan API acceleration rather than OpenGL, and enabling the feature currently deemed experimental. Additional details can be found via this Blender DevTalk thread.
Windows

Games Run Faster On SteamOS Than Windows 11, Ars Testing Finds (arstechnica.com) 100

An anonymous reader quotes a report from Ars Technica: Nearly a decade ago, Ars testing found that Valve's "Steam Machines"-era version of SteamOS performed significantly worse than Windows when SteamOS's Linux game ports were tested on the same hardware as their Windows counterparts. Today, though, Ars testing on the Lenovo Legion Go S finds recent games generally run at higher frame rates on SteamOS 3.7 than on Windows 11. [...]

As you can see in the included charts, SteamOS showed noticeable frame rate improvements in four of the five games tested. Only Borderlands 3 showed comparable performance across both operating systems, with Windows eking out ever-so-slightly higher frame rates in that game's benchmarks. For the other four tested games, the stock Lenovo Windows drivers were sometimes significantly worse than those included with SteamOS. When playing Returnal at "High" graphics presets and 1920x1200 resolution, for instance, changing from Lenovo's Windows drivers to SteamOS meant the difference between a hard-to-take 18 FPS average and a downright decent 33 FPS average. Sideloading the updated Asus drivers showed a noticeable improvement in Windows performance across all tested games and even brought Homeworld 3's "Low" graphics benchmark test to practical parity with SteamOS. In all other cases, though, even these updated drivers resulted in benchmark frame rates anywhere from 8 percent to 36 percent lower than those same benchmarks on SteamOS.
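As a quick illustration of how "X percent lower" figures like these are derived, here is a minimal Python sketch using the Returnal numbers quoted above (18 FPS on the stock Windows drivers vs. 33 FPS on SteamOS); the formulas are standard relative-difference arithmetic, not anything specific to Ars' methodology:

```python
# Benchmark averages quoted in the article for Returnal
# at "High" presets, 1920x1200.
windows_fps = 18.0   # stock Lenovo Windows drivers
steamos_fps = 33.0   # SteamOS 3.7

# "X percent lower" is measured relative to the faster result...
deficit_pct = (steamos_fps - windows_fps) / steamos_fps * 100

# ...while the equivalent speedup is measured relative to the slower one.
speedup_pct = (steamos_fps - windows_fps) / windows_fps * 100

print(f"Windows is {deficit_pct:.0f}% lower; SteamOS is {speedup_pct:.0f}% faster")
```

Note that the same gap reads as roughly "45 percent lower" or "83 percent faster" depending on which side is used as the baseline, which is worth keeping in mind when comparing headline percentages across reviews.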

These results might seem a bit counterintuitive, considering that games running on SteamOS must go through the Proton translation layer, which maps each game's Windows API calls to their Linux equivalents. But Valve has put in consistent work over the years to make Proton as efficient and cross-compatible as possible, and its continued work on Linux's Mesa graphics drivers seems to be paying dividends for SteamOS graphics performance. Running SteamOS also means eliminating a lot of operating system overhead that the more generalist Windows uses by default. Microsoft seems aware of this issue for gamers and has recently announced that the upcoming "Xbox Experience for Handheld" will "minimize background activity and defer non-essential tasks" to allow for "more [and] higher framerates" in games.

Ubuntu

Ubuntu To Disable Intel Graphics Security Mitigations To Boost GPU Performance By Up To 20% (arstechnica.com) 15

Disabling Intel graphics security mitigations in the GPU compute stacks for OpenCL and Level Zero can yield a performance boost of up to 20%, prompting Ubuntu's Canonical and Intel to disable these mitigations in future Ubuntu packages. Phoronix's Michael Larabel reports: Intel does allow building its GPU compute stack without these mitigations by using the "NEO_DISABLE_MITIGATIONS" build option, and that is what Canonical is looking to set now for Ubuntu packages to avoid the significant performance impact. This work will likely all be addressed in time for Ubuntu 25.10. This NEO_DISABLE_MITIGATIONS option applies only to compiling the Intel Compute Runtime stack and doesn't impact the Linux kernel security mitigations or anything else outside of Intel's "NEO" GPU compute stack. Both Intel and Canonical are in agreement with this move, and it turns out that even Intel's GitHub binary packages for its Compute Runtime for OpenCL and Level Zero ship with the mitigations disabled due to the performance impact. This Ubuntu Launchpad bug report for the Intel Compute Runtime notes some of the key takeaways. There is also this PPA where Ubuntu developers are currently testing their Compute Runtime builds with NEO_DISABLE_MITIGATIONS set to disable the mitigations.
KDE

KDE Plasma 6.4 Released (kde.org) 29

Longtime Slashdot reader jrepin writes: Plasma is a popular desktop (and mobile) environment for GNU/Linux and other UNIX-like operating systems. Among other things, it also powers the desktop mode of the Steam Deck gaming handheld. The KDE community today announced the latest release: Plasma 6.4. This fresh new release improves on nearly every front, with progress being made in accessibility, color rendering, tablet support, window management, and more.

Plasma already offered virtual desktops and customizable tiles to help organize your windows and activities, and now it lets you choose a different configuration of tiles on each virtual desktop. The Wayland session brings some new accessibility features: you can now move the pointer using your keyboard's number pad keys, or use a three-finger touchpad pinch gesture to zoom in or out.

The Plasma file transfer notification now shows a speed graph, giving you a more visual idea of how fast the transfer is going and how long it will take to complete. When any application is in full-screen mode, Plasma will now enter Do Not Disturb mode and only show urgent notifications. When you exit full-screen mode, you'll see a summary of any notifications you missed.

Now, when an application tries to access the microphone and finds it muted, a notification will pop up. A new feature in the Application Launcher widget will place a green New! tag next to newly installed apps, so you can easily find where something you just installed lives in the menu.

The Display and Monitor page in System Settings comes with a brand new HDR calibration wizard. Support for Extended Dynamic Range (a different kind of HDR) and the P010 video color format has also been added. System Monitor now supports usage monitoring for AMD and Intel graphics cards -- it can even show the GPU usage on a per-process basis.

Spectacle, the built-in app for taking screenshots and screen recordings, has a much-improved design and more streamlined functionality. The background of the desktop or window now darkens when an authentication dialog shows up, helping you locate and focus on the window asking for your password.

There's a brand-new Animations page in System Settings that groups all the settings for purely visual animated effects into one place, making them easier to find and configure. Aurorae, a newly added SVG vector graphics theme engine, enhances KWin window decorations.

You can read more about these and many other features in the Plasma 6.4 announcement and complete changelog.

Nintendo

Nintendo Switch 2 Has Record-Breaking Launch, Selling Over 3 Million Units (barrons.com) 48

TweakTown writes that the Switch 2 "has reportedly beaten the record for the most-sold console within 24 hours and is on track to shatter the two-month record," selling over 3 million units and tripling the PlayStation 4's previous launch day sales.

So Nintendo's first console in 8 years becomes "one of the most successful hardware releases of all time," writes Barron's, raising hopes for the future: [2017's original Switch] ultimately sold more than 152 million units... Switch 2's big advantage is its backward compatibility, allowing it to play current-generation Switch games and giving gamers solace that their large investments in software are intact... Many older Switch games also play better on the Switch 2, taking advantage of the extra horsepower.
Bloomberg writes that its bigger screen and faster chip "live up to the hype": Despite the hype and a $150 increase over the launch price for the original, the second-generation system manages to impress with faster performance, improved graphics, more comfortable ergonomics and enough tweaks throughout to make this feel like a distinctly new machine... This time, it's capable of outputting 4K resolution and more impactful HDR video to your TV screen... It's a bigger, faster, more polished version of a wildly successful gadget.
The "buzzy launch drew long lines" at retailers like Walmart, Target, Best Buy, and GameStop, according to the article. (See the photos from AOL.com and USA Today.) "The era of spending hours waiting in line for the latest iPhone is long gone, but the debut of a new video game console is still a rare enough event that Nintendo fans didn't think twice about driving to retailers in the middle of the night to secure a Switch 2."

The Verge also opines that "the Switch 2's eShop is much better," calling it "way faster... with much less lag browsing through sections and loading up game pages."

Or, as Barron's puts it, "Ultimately, Nintendo is winning because it has a different strategy than its competition, the Sony PlayStation and Microsoft Xbox. Instead of trying to appeal to tech snobs like me, who are obsessed with graphics resolution and hardware statistics like teraflops, Nintendo focuses on joy and fun."
Hardware

Polish Engineer Creates Postage Stamp-Sized 1980s Atari Computer (arstechnica.com) 32

Ars Technica's Benj Edwards reports: In 1979, Atari released the Atari 400 and 800, groundbreaking home computers that included custom graphics and sound chips, four joystick ports, and the ability to run the most advanced home video games of their era. These machines, which retailed for $549 and $999, respectively, represented a leap in consumer-friendly personal computing, with their modular design and serial I/O bus that presaged USB. Now, 46 years later, a hobbyist has shrunk down the system hardware to a size that would have seemed like science fiction in the 1970s.

Polish engineer Piotr "Osa" Ostapowicz recently unveiled "Atarino," which may be the world's smallest 8-bit Atari computer re-creation, according to retro computing site Atariteca. The entire system -- processor, graphics chips, sound hardware, and memory controllers -- fits on a module measuring just 2x1.5 centimeters (about 0.79x0.59 inches), which is roughly the size of a postage stamp.

Ostapowicz's creation reimplements the classic Atari XL/XE architecture using modern FPGA (field-programmable gate array) technology. Unlike software emulators that simulate old hardware (and modern recreations that run them, like the Atari 400 Mini console) on a complete computer system of another architecture, Atarino reproduces the original Atari components faithfully at the logic level, allowing it to run vintage software while maintaining compatibility with original peripherals. [...] The project, which began over a decade ago and was first publicly demonstrated in December 2023, packs a 6502C processor, ANTIC and GTIA graphics chips, a POKEY sound chip, and memory controllers onto a single Lattice UP5K FPGA chip. Despite its tiny size, the system can run at clock speeds up to 31 MHz -- far faster than the original hardware's 1.79 MHz.
While the Atarino can run vintage software and work with the original peripherals, it brings several key improvements -- including a modernized 6502 core with added instructions, a more efficient memory architecture, enhanced video output via VGA and HDMI, extended graphics modes, refined sound chip emulation, modular hardware design, support for modern connectivity like Wi-Fi and Ethernet, and compatibility with contemporary development tools like CC65 and Visual Studio Code.

Ostapowicz "plans to release complete kits with documentation, inviting the retrocomputing community to experiment with the hardware," adds Edwards.
Open Source

SerenityOS Creator Is Building an Independent, Standards-First Browser Called 'Ladybird' (thenewstack.io) 40

A year ago, the original creator of SerenityOS posted that "for the past two years, I've been almost entirely focused on Ladybird, a new web browser that started as a simple HTML viewer for SerenityOS." So it became a stand-alone project that "aims to render the modern web with good performance, stability and security." And they're also building a new web engine.

"We are building a brand-new browser from scratch, backed by a non-profit..." says Ladybird's official web site, adding that they're driven "by a web standards first approach." They promise it will be truly independent, with "no code from other browsers" (and no "default search engine" deals).

"We are targeting Summer 2026 for a first Alpha version on Linux and macOS. This will be aimed at developers and early adopters." More from the Ladybird FAQ: We currently have 7 paid full-time engineers working on Ladybird. There is also a large community of volunteer contributors... The focus of the Ladybird project is to build a new browser engine from the ground up. We don't use code from Blink, WebKit, Gecko, or any other browser engine...

For historical reasons, the browser uses various libraries from the SerenityOS project, which has a strong culture of writing everything from scratch. Now that Ladybird has forked from SerenityOS, it is no longer bound by this culture, and we will be making use of 3rd party libraries for common functionality (e.g. image/audio/video formats, encryption, graphics, etc.). We are already using some of the same 3rd party libraries that other browsers use, but we will never adopt another browser engine instead of building our own...

We don't have anyone actively working on Windows support, and there are considerable changes required to make it work well outside a Unix-like environment. We would like to do Windows eventually, but it's not a priority at the moment.

"Ladybird's founder Andreas Kling has a solid background in WebKit-based C++ development with both Apple and Nokia," writes software developer/author David Eastman: "You are likely reading this on a browser that is slightly faster because of my work," he wrote on his blog's introduction page. After leaving Apple, clearly burnt out, Kling found himself in need of something to healthily occupy his time. He could have chosen to learn needlepoint, but instead he opted to build his own operating system, called Serenity. Ladybird is a web project spin-off from this, to which Kling now devotes his time...

[B]eyond the extensive open source politics, the main reason for supporting other independent browser projects is to maintain diverse alternatives — to prevent the web platform from being entirely captured by one company. This is where Ladybird comes in. It doesn't have any commercial foundation and it doesn't seem to be waiting to grab a commercial opportunity. It has a range of sponsors, some of which might be strategic (for example, Shopify), but most are goodwill or alignment-led. If you sponsor Ladybird, it will put your logo on its webpage and say thank you. That's it. This might seem uncontroversial, but other nonprofit organisations also give board seats to high-paying sponsors. Ladybird explicitly refuses to do this...

The Acid3 Browser test (which has nothing whatsoever to do with ACID compliance in databases) is an old method of checking compliance with web standards, but vendors can still check how their products do against a battery of tests. They check compliance for the DOM2, CSS3, HTML4 and the other standards that make sure that webpages work in a predictable way. If I point my Chrome browser on my MacBook to http://acid3.acidtests.org/, it gets 94/100. Safari does a bit better, getting to 97/100. Ladybird reportedly passes all 100 tests.

"All the code is hosted on GitHub," says the Ladybird home page. "Clone it, build it, and join our Discord if you want to collaborate on it!"
Operating Systems

Valve Adds SteamOS Support For Its Steam Deck Rivals (polygon.com) 24

Valve's SteamOS 3.7.8 update brings official support for AMD-powered handhelds like Lenovo's Legion Go and Asus' ROG Ally, along with a new "Steam OS Compatible" library tab and key bug fixes. Other features include a battery charge limit, updated graphics drivers, and a shift to Plasma 6.2.5. Polygon reports: Valve outlines two requirements for the third-party devices not explicitly named in the update to run SteamOS on the handheld: they must be AMD-powered and have an NVMe SSD. Specific instructions for installing the operating system have been updated and listed here.

Before this huge update, players had to use an alternative like Bazzite to achieve a similar SteamOS experience on their devices. The new update also piggybacks off of Valve expanding the Steam Deck Verified categorization system to "any device running SteamOS that's not a Steam Deck" in mid-May. Sweetening the deal further, a SteamOS-powered version of the Lenovo Legion Go S is scheduled to release on May 25.
You can learn more about SteamOS 3.7.8 here.
Graphics

Nvidia's RTX 5060 Review Debacle Should Be a Wake-Up Call (theverge.com) 67

Nvidia is facing backlash for allegedly manipulating the review process of its GeForce RTX 5060 GPU by withholding drivers, selectively granting early access to favorable reviewers, and pressuring media to present the card in a positive light. As The Verge's Sean Hollister writes, the debacle "should be a wake-up call for gamers and reviewers." Here's an excerpt from the report: Nvidia has gone too far. This week, the company reportedly attempted to delay, derail, and manipulate reviews of its $299 GeForce RTX 5060 graphics card, which would normally be its bestselling GPU of the generation. Nvidia has repeatedly and publicly said the budget 60-series cards are its most popular, and this year it reportedly tried to ensure it by withholding access and pressuring reviewers to paint them in the best light possible.

Nvidia might have wanted to prevent a repeat of 2022, when it launched this card's predecessor. Those reviews were harsh. The 4060 was called a "slap in the face to gamers" and a "wet fart of a GPU." I had guessed the 5060 was headed for the same fate after seeing how reviewers handled the 5080, which similarly showcased how little Nvidia's hardware has improved year over year and relies on software to make up the gaps. But Nvidia had other plans. Here are the tactics that Nvidia reportedly just used to throw us off the 5060's true scent, as individually described by GamersNexus, VideoCardz, Hardware Unboxed, GameStar.de, Digital Foundry, and more:

- Nvidia decided to launch its RTX 5060 on May 19th, when most reviewers would be at Computex in Taipei, Taiwan, rather than at their test beds at home.
- Even if reviewers already had a GPU in hand before then, Nvidia cut off most reviewers' ability to test the RTX 5060 before May 19th by refusing to provide drivers until the card went on sale. (Gaming GPUs don't really work without them.)
- And yet Nvidia allowed specific, cherry-picked reviewers to have early drivers anyhow if they agreed to a borderline unethical deal: they could only test five specific games, at 1080p resolution, with fixed graphics settings, against two weaker GPUs (the 3060 and 2060 Super) where the new card would be sure to win.
- In some cases, Nvidia threatened to withhold future access unless reviewers published apples-to-oranges benchmark charts showing how the RTX 5060's "fake frames" MFG tech can produce more frames than earlier GPUs without it.

Some reviewers apparently took Nvidia up on that proposition, leading to day-one "previews" where the charts looked positively stacked in the 5060's favor [...]. But the reality, according to reviews that have since hit the web, is that the RTX 5060 often fails to beat a four-year-old RTX 3060 Ti, frequently fails to beat a four-year-old 3070, and can sometimes get upstaged by Intel's cheaper $250 B580. And yet, the 5060's lackluster improvements are overshadowed by a juicier story: inexplicably, Nvidia decided to threaten GamersNexus' future access over its GPU coverage. Yes, the same GamersNexus that's developed a staunch reputation for defending consumers from predatory behavior, and just last month published a report on "GPU shrinkflation" that accused Nvidia of misleading marketing. Bad move! [...]

Nvidia is within its rights to withhold access, of course. Nvidia doesn't have to send out graphics cards or grant interviews. It'll only do it if it's good for business. But the unspoken covenant of product reviews is that the press, as a whole, gets a chance to warn the public if a movie, video game, or GPU is not worth their money. It works both ways: the media also gets the chance to warn that a product is so good you might want to line up in advance. That unspoken rule is what Nvidia is trampling here.

China

China's 7-Year Tech Independence Push Yields Major Gains in AI, Robotics and Semiconductors (msn.com) 84

China has achieved substantial technological advances across robotics, AI, and semiconductor manufacturing as part of a seven-year self-reliance campaign that has tripled the country's research and development spending to $500 billion annually.

Chinese robot manufacturers captured nearly half of their domestic market by 2023, up from a quarter of installations just years earlier, while AI startups now rival OpenAI and Google in capabilities. The progress extends to semiconductors, where Huawei released a high-end smartphone powered by what industry analysts believe was a locally-produced advanced processor, despite U.S. export controls targeting China's chip access.

Morgan Stanley projects China's self-sufficiency in graphics processing units will jump from 11% in 2021 to 82% by 2027. Chinese companies have been purchasing as many industrial robots as the rest of the world combined, enabling highly automated factories that can operate in darkness. In space technology, Chinese firms won five of 11 gold medals when U.S. think tanks ranked the world's best commercial satellite systems last year, compared to four for American companies.
Hardware

Nvidia Reportedly Raises GPU Prices by 10-15% (tomshardware.com) 63

An anonymous reader shares a report: A new report claims that Nvidia has recently raised the official prices of nearly all of its products to combat the impact of tariffs and surging manufacturing costs on its business, with gaming graphics cards receiving a 5 to 10% hike while AI GPUs see up to a 15% increase.

As reported by Digitimes Taiwan, Nvidia is facing "multiple crises," including a $5.5 billion hit to its quarterly earnings over export restrictions on AI chips, including a ban on sales of its H20 chips to China.

Digitimes reports that CEO Jensen Huang has been "shuttling back and forth" between the US and China to minimize the impact of tariffs, and that "in order to maintain stable profitability," Nvidia has reportedly recently raised official prices for almost all its products, allowing its partners to increase prices accordingly.

AI

Figma's Big AI Update Takes On Adobe, WordPress, and Canva 10

At its Config 2025 event on Wednesday, Figma unveiled four new AI-powered tools -- Sites, Make, Buzz, and Draw, positioning itself as a full-stack design platform to rival Adobe, WordPress, and Canva. These tools enable users to build websites, generate code, create marketing content, and design vector graphics without leaving the Figma ecosystem. The Verge reports: Figma's first solution is Figma Sites, a website builder that integrates with Figma Design and allows creators to turn their projects into live, functional sites. Figma Sites provides presets for layouts, blocks, templates, and interactions that aim to make building websites less complex and time-consuming. Additional components like custom animations can also be added either using existing code or by prompting Site's AI tool to generate new interaction codes via text descriptions, such as "animate the text to fall into place like a feather." Figma Sites is rolling out in beta for users with full seat access to Figma products. Figma says that AI code generation will be available "in the coming weeks," and that a CMS that allows designers to manage site content will be launched "later this year."

Figma Make is Figma's take on AI coding tools like Google's Gemini Code Assist and Microsoft's GitHub Copilot. The prompt-to-code Figma Make tool is powered by Anthropic's Claude 3.7 model and can build working prototypes and apps based on descriptions or existing designs, such as creating a functional music player that displays a disc that spins when new tracks are played. Specific elements of working design, like text formatting and font style, can be manually edited or adjusted using additional AI prompts. Make is rolling out in beta for full seat Figma users. Figma says it's "exploring integrations with third parties and design systems" for Figma Make and may apply the tool to other apps within its design platform.

Figma Buzz is a marketing-focused design app that's rolling out in beta to all users, and makes it easier for teams to publish brand content, similar to Canva's product design platform. The tool allows Figma designers to create brand-approved templates, styles, and assets that can be used by marketers to quickly assemble emails, social media posts, advertising, and more. Figma Buzz includes generative AI tools for making and editing images using text prompts, and can source information from spreadsheets to bulk create thousands of image assets at once.

Lastly, the Figma Draw vector design app is like a simplified version of Adobe Illustrator that creatives can use to make custom visuals without leaving the Figma platform. It includes a variety of brushes, texture effects, and vector editing tools to create or adjust scalable images and logos for product design projects. Figma Draw is generally available now for full seat users as a toggle in Figma Design, with some features accessible in Sites, Slides, and Buzz. It's not quite as expansive as Adobe's wider Creative Cloud ecosystem, but Figma Draw places the two companies in direct competition for the first time since Adobe killed its own XD product design platform. It also brings some new options to the creative software industry after Adobe failed to acquire Figma for $20 billion due to pressure from competition regulators.
Intel

Intel Says It's Rolling Out Laptop GPU Drivers With 10% To 25% Better Performance (arstechnica.com) 23

Ars Technica's Andrew Cunningham reports: Intel's oddball Core Ultra 200V laptop chips -- codenamed Lunar Lake -- will apparently be a one-off experiment, not to be replicated in future Intel laptop chips. They're Intel's only processors with memory integrated onto the CPU package; the only ones with a neural processing unit that meets Microsoft's Copilot+ performance requirements; and the only ones with Intel's best-performing integrated GPUs, the Intel Arc 130V and 140V.

Today, Intel announced some updates to its graphics driver that specifically benefit those integrated GPUs, welcome news for anyone who bought one and is trying to get by with it as an entry-level gaming system. Intel says that version 32.0.101.6734 of its graphics driver can speed up average frame rates in some games by around 10 percent, and can speed up "1 percent low FPS" (that is, the average frame rate during the slowest 1 percent of frames) by as much as 25 percent. This should, in theory, make games run better in general and ease some of the stuttering you notice when your game's performance dips down to that 1 percent level.
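The "1 percent low" metric mentioned above can be made concrete with a short Python sketch. Definitions vary slightly between benchmarking tools; this uses one common convention (the average frame rate over the slowest 1 percent of frames), and the frame-time numbers are made up for illustration, not taken from Intel's benchmarks:

```python
# Compute average FPS and "1% low" FPS from per-frame render times (milliseconds).
# Sample data is hypothetical: mostly smooth ~60 FPS frames plus a few bad stutters.
frame_times_ms = [16.7] * 95 + [50.0] * 5

# Average FPS across all frames.
fps_per_frame = [1000.0 / t for t in frame_times_ms]
avg_fps = sum(fps_per_frame) / len(fps_per_frame)

# "1% low": average frame rate over the slowest 1% of frames.
worst_first = sorted(frame_times_ms, reverse=True)
slowest_1pct = worst_first[: max(1, len(worst_first) // 100)]
low_1pct_fps = 1000.0 / (sum(slowest_1pct) / len(slowest_1pct))

print(f"average: {avg_fps:.1f} FPS, 1% low: {low_1pct_fps:.1f} FPS")
```

With this sample data the average looks healthy while the 1% low sits far below it, which is exactly the kind of intermittent stutter that an improvement to "1 percent low FPS" targets even when average frame rates barely move.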

Operating Systems

OpenBSD 7.7 Released (openbsd.org) 12

Longtime Slashdot reader me34point5 writes: OpenBSD quietly released the new version (7.7) of its "secure by default" operating system. This is the 58th release. Changes include improved hardware and VMM support, along with many kernel improvements. This release brings several specific improvements, including performance boosts on ARM64, Arm SVE support, AMD SEV virtualization enhancements, better low-memory handling on i386, and improved suspend/hibernate and SMP performance. It also updates graphics drivers with support for AMD Ryzen AI 300, Radeon RX 9070, and Intel Arrow Lake, along with expanded hardware support for MediaTek SoCs.

A full list of changes can be found here.
Education

Top Colleges Are Too Costly Even for Parents Making $300,000 (bloomberg.com) 87

Families earning $300,000 annually -- placing them among America's highest earners -- are increasingly finding themselves unable to afford elite college tuition without taking on substantial debt. Bloomberg's analysis of financial aid data from 50 selective colleges reveals households earning between $100,000 and $300,000 occupy a precarious middle ground: too affluent for meaningful aid but insufficiently wealthy to absorb annual costs approaching $100,000.

The squeeze begins around $150,000 income, where families typically contribute 20% ($30,000) annually toward tuition. At $270,000 income, expected contributions reach $61,000 per year. Most institutions eliminate financial aid entirely at approximately $400,000 income. Harvard, MIT, and the University of Pennsylvania recently expanded free tuition thresholds to $200,000, acknowledging this middle-class pressure. The changes take effect for 2025-26.
Ubuntu

Ubuntu 25.04 'Plucky Puffin' Arrives With Linux 6.14, GNOME 48, and ARM64 Desktop ISO (canonical.com) 51

Canonical today released Ubuntu 25.04 "Plucky Puffin," bringing significant upgrades to the non-LTS distribution including Linux kernel 6.14, GNOME 48 with triple buffering, and expanded hardware support.

For the first time, Ubuntu ships an official generic ARM64 desktop ISO targeting virtual machines and Snapdragon-based devices, with initial enablement for the Snapdragon X Elite platform. The release also adds full support for Intel Core Ultra Xe2 integrated graphics and "Battlemage" discrete GPUs, delivering improved ray tracing performance and hardware-accelerated video encoding.

Networking improvements include wpa-psk-sha256 Wi-Fi support and enhanced DNS resolution detection. The installer now better handles BitLocker-protected Windows partitions for dual-boot scenarios. Other notable changes include JPEG XL support by default, NVIDIA Dynamic Boost enabled on supported laptops, Papers replacing Evince as the default document viewer, and APT 3.0 becoming the standard package manager. Ubuntu 25.04 will receive nine months of support until January 2026.
Linux

Linus Torvalds Gently Criticizes Build-Slowing Testing Code Left in Linux 6.15-rc1 (phoronix.com) 25

"The big set of open-source graphics driver updates for Linux 6.15 have been merged," writes Phoronix, "but Linux creator Linus Torvalds isn't particularly happy with the pull request." The new "hdrtest" code is for the Intel Xe kernel driver and is around trying to help ensure the Direct Rendering Manager header files are self-contained and pass kernel-doc tests — basic maintenance checks on the included DRM header files to ensure they are all in good shape.
But Torvalds accused the code of not only slowing down the full-kernel builds, but also leaving behind "random" files for dependencies "that then make the source tree nasty," reports Tom's Hardware: While Torvalds was disturbed by the code that was impacting the latest Linux kernel, beginning his post with a "Grr," he remained precise in his objections to it. "I did the pull, resolved the (trivial) conflicts, but I notice that this ended up containing the disgusting 'hdrtest' crap that (a) slows down the build because it's done for a regular allmodconfig build rather than be some simple thing that you guys can run as needed (b) also leaves random 'hdrtest' turds around in the include directories," he wrote.

Torvalds went on to state that he had previously complained about this issue, and asked why the hdr testing is being done as a regular part of the build. Moreover, he highlighted that the resulting 'turds' were breaking filename completion. Torvalds underlined this point, and his disgust, by stating, "this thing needs to *die*." Offering advice to fellow Linux developers, Torvalds said, "If you want to do that hdrtest thing, do it as part of your *own* checks. Don't make everybody else see that disgusting thing...."

He then noted that he had decided to mark hdrtest as broken for now, to prevent its inclusion in regular builds.

As of Saturday, all of the DRM-Next code had made it into Linux 6.15 Git, notes Phoronix. "But Linus Torvalds is expecting all this 'hdrtest' mess to be cleaned up."
Science

A New Image File Format Efficiently Stores Invisible Light Data (arstechnica.com) 11

An anonymous reader quotes a report from Ars Technica: Imagine working with special cameras that capture light your eyes can't even see -- ultraviolet rays that cause sunburn, infrared heat signatures that reveal hidden writing, or specific wavelengths that plants use for photosynthesis. Or perhaps using a special camera designed to distinguish the subtle visible differences that make paint colors appear just right under specific lighting. Scientists and engineers do this every day, and they're drowning in the resulting data. A new compression format called Spectral JPEG XL might finally solve this growing problem in scientific visualization and computer graphics. Researchers Alban Fichet and Christoph Peters of Intel Corporation detailed the format in a recent paper published in the Journal of Computer Graphics Techniques (JCGT). It tackles a serious bottleneck for industries working with these specialized images. These spectral files can contain 30, 100, or more data points per pixel, causing file sizes to balloon into multi-gigabyte territory -- making them unwieldy to store and analyze.

[...] The current standard format for storing this kind of data, OpenEXR, wasn't designed with these massive spectral requirements in mind. Even with built-in lossless compression methods like ZIP, the files remain unwieldy for practical work as these methods struggle with the large number of spectral channels. Spectral JPEG XL utilizes a technique used with human-visible images, a math trick called a discrete cosine transform (DCT), to make these massive files smaller. Instead of storing the exact light intensity at every single wavelength (which creates huge files), it transforms this information into a different form. [...]
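A hedged sketch of the core idea, not the paper's actual codec: project each pixel's spectrum onto DCT basis functions and drop the high-frequency coefficients, which for smooth spectra carry almost no energy. Everything here (the 64-sample toy spectrum, the 4x truncation) is illustrative:

```python
import numpy as np

def dct_matrix(n):
    """Orthonormal DCT-II matrix: rows are cosine basis vectors, so the
    forward transform is m @ x and the exact inverse is m.T @ coeffs."""
    j = np.arange(n)[:, None]          # frequency index
    i = np.arange(n)[None, :]          # sample index
    m = np.cos(np.pi * (2 * i + 1) * j / (2 * n)) * np.sqrt(2.0 / n)
    m[0, :] /= np.sqrt(2.0)            # DC row rescaled for orthonormality
    return m

# Toy "spectrum": 64 samples of a smooth emission peak around 550 nm.
wavelengths = np.linspace(400.0, 700.0, 64)
spectrum = np.exp(-((wavelengths - 550.0) / 60.0) ** 2)

m = dct_matrix(64)
coeffs = m @ spectrum
kept = 16                              # keep the lowest 16 frequencies: 4x smaller
coeffs[kept:] = 0.0
reconstructed = m.T @ coeffs

rel_error = np.linalg.norm(reconstructed - spectrum) / np.linalg.norm(spectrum)
print(f"relative error after 4x truncation: {rel_error:.2e}")
```

Because real reflectance and emission spectra tend to vary smoothly across wavelength, most of the signal energy concentrates in the first few DCT coefficients, so aggressive truncation costs very little accuracy. The actual format layers JPEG XL's quantization and entropy coding on top of this kind of decorrelating transform.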

According to the researchers, the massive file sizes of spectral images have reportedly been a real barrier to adoption in industries that would benefit from their accuracy. Smaller files mean faster transfer times, reduced storage costs, and the ability to work with these images more interactively without specialized hardware. The results reported by the researchers seem impressive -- with their technique, spectral image files shrink by 10 to 60 times compared to standard OpenEXR lossless compression, bringing them down to sizes comparable to regular high-quality photos. They also preserve key OpenEXR features like metadata and high dynamic range support.
The report notes that broader adoption "hinges on the continued development and refinement of the software tools that handle JPEG XL encoding and decoding."

Some scientific applications may also see JPEG XL's lossy approach as a drawback. "Some researchers working with spectral data might readily accept the trade-off for the practical benefits of smaller files and faster processing," reports Ars. "Others handling particularly sensitive measurements might need to seek alternative methods of storage."
Graphics

Nvidia's GeForce RTX 5090 Laptop Graphics Benchmarks Revealed 30

MojoKid writes: Similar to Nvidia's recent desktop graphics launches, there are four initial GeForce RTX 50 series laptop GPUs coming to market, starting this month. At the top of the stack is the GeForce RTX 5090 laptop GPU, which is equipped with 10,496 CUDA cores and is paired with 24GB of memory. Boost clocks top out around 2,160MHz, and GPU power can range from 95 to 150 watts, depending on the particular laptop model. GeForce RTX 50 series GPUs for both laptops and desktops feature updated shader cores with support for neural shading, in addition to 4th gen ray tracing cores and 5th gen Tensor cores with support for DLSS 4. The GeForce RTX 50 series features a native PCIe gen 5 interface, in addition to support for DisplayPort 2.1b (up to UHBR20). These GPUs are also fed by the latest high speed GDDR7 memory, which offers efficiency benefits that are especially pertinent to laptop designs. Performance-wise, Nvidia's mobile GeForce RTX 5090 is the new king of the hill in gaming laptops, and it easily bests all other discrete mobile graphics options currently on the market.

Slashdot Top Deals