Windows

Lenovo Is Working On a Windows PC Gaming Handheld Called the 'Legion Go' (windowscentral.com) 17

According to Windows Central, Lenovo is working on a handheld gaming PC dubbed "Legion Go," featuring Windows 11 and AMD Ryzen chips. From the report: While details are scant right now, we understand this will sport AMD's new Phoenix processors, which the chip firm describes as ultra-thin processors for ultrabooks, focused on gaming, AI, and graphics. The fact that the Legion Go will sport Ryzen chips all but guarantees that this is a Windows PC gaming handheld, part of Lenovo's popular "Legion" gaming brand. As of writing, there's no information on exactly when this device could become available, or indeed whether it'll become available at all.

According to our information, the Legion Go could sport an 8-inch screen, making it larger than the ASUS ROG Ally or the Steam Deck, both of which have a 7-inch display. PC games and console games ported to PC are often designed for larger monitors or even TVs, and on smaller screens, UI elements can be difficult to see, especially if the game doesn't have a UI scaling option. A larger display could give the Legion Go a decent advantage over its competitors, provided it remains lightweight and balanced, which of course remains to be seen. AMD describes its Phoenix 7040-series chips as designed for powerful but elegant, ultra-thin ultrabook-style devices. They should lend themselves well to a device like the Legion Go, supporting 15W low-power states for lightweight games and maximized battery life, similar to the Steam Deck and ROG Ally. The Z1 Extreme in the ASUS ROG Ally can run at a TDP below 15W, however, which could give the ROG Ally some advantages there. There's every chance the Legion Go could have other configurations we're unaware of yet, though; we'll just have to wait and see.

Facebook

Meta and Qualcomm Team Up To Run Big AI Models on Phones (cnbc.com) 17

Qualcomm and Meta will enable the social networking company's new large language model, Llama 2, to run on Qualcomm chips on phones and PCs starting in 2024, the companies announced today. From a report: So far, LLMs have primarily run in large server farms, on Nvidia graphics processors, due to the technology's vast needs for computational power and data, boosting Nvidia stock, which is up more than 220% this year. But the AI boom has largely missed the companies that make leading-edge processors for phones and PCs, like Qualcomm. Its stock is up about 10% so far in 2023, trailing the NASDAQ's gain of 36%. The announcement on Tuesday suggests that Qualcomm wants to position its processors as well-suited for AI "on the edge," or on a device, instead of "in the cloud." If large language models can run on phones instead of in large data centers, it could push down the significant cost of running AI models, and could lead to better and faster voice assistants and other apps.

AI

Crypto Miner Hive Drops 'Blockchain' From Name in Pivot To AI (bloomberg.com) 19

The crypto-mining company formerly known as Hive Blockchain Technologies is pivoting to artificial intelligence and web3, and has changed its name accordingly. From a report: The Vancouver-based miner has dropped the "blockchain" marker and said that its new branding as Hive Digital Technologies is intended to reflect "its mission to drive advancements" in AI applications like ChatGPT, and to "support the new web3 ecosystem."

Hive intends to use its existing fleet of Nvidia graphics processing units "for computational tasks on a massive scale," according to a July 12 filing with the US Securities and Exchange Commission. The vast majority of crypto-mining companies are focused on Bitcoin and use specialized chips that are different from so-called GPUs. Hive is among a handful of companies that deploy GPUs at scale to mine Ether, the second largest cryptocurrency by market value. A recent set of changes on the Ethereum blockchain has meant that these GPUs are no longer necessary, which is a problem for the Ether miners who hold large stocks of them.

Transportation

Ford Gets $9.2 Billion To Help US Catch Up With China's EV Dominance (bloomberg.com) 82

The US government is providing a conditional $9.2 billion loan to Ford for the construction of three battery factories, the largest government backing for a US automaker since the 2009 financial crisis. "The enormous loan [...] marks a watershed moment for President Joe Biden's aggressive industrial policy meant to help American manufacturers catch up to China in green technologies," reports Bloomberg. From the report: The new factories that will eventually supply Ford's expansion into electric vehicles are already under construction in Kentucky and Tennessee through a joint venture called BlueOval SK, owned by the Michigan automaker and South Korean battery giant SK On Co. Ford plans to make as many as 2 million EVs by 2026, a huge increase from the roughly 132,000 it produced last year. The three-factory buildout by BlueOval plus an adjacent Ford EV assembly unit have an estimated price tag of $11.4 billion. BlueOval was previously awarded subsidies by both state governments. That means taxpayers would be providing low-interest financing for almost all of the cost.

Ford's cars and SUVs made with domestic batteries will also be eligible for billions of dollars in incentives embedded in the Inflation Reduction Act's $370 billion in clean-energy funding, part of the historic climate measure narrowly passed into law about a year ago. The US government will subsidize manufacturing of batteries, and buyers could qualify for additional tax rebates of up to $7,500 per vehicle.

The rush of incentives, government lending and private-sector investment has led to a manufacturing boom in the wake of the IRA. More than 100 battery and electric-vehicle production projects are announced or already under construction in the US, representing about $200 billion in total investments. "Not since the advent of the auto industry 100 years ago have we seen an investment like that," says Gary Silberg, KPMG's global automotive sector leader.

Businesses

Adobe's $20 Billion Figma Acquisition Likely To Face EU Investigation (gizmodo.com) 22

According to a report from The Financial Times, the European Union Commission is planning an in-depth investigation into Adobe's $20 billion purchase of Figma, the popular online graphics editing and interface design application. Gizmodo reports: Back in February, the EU Commission noted that it had received numerous requests to review the business deal. The international watchdog announced that it would need to clear the proposed merger, under the justification that it "threatens to significantly affect competition in the market for interactive product design and whiteboarding software."

Now, the Brussels-based Commission will open a phase II investigation, per the FT. Generally, anti-competition probes are handled at the phase I level, which accounts for 90% of all cases, according to EU internal data. In comparison, a phase II analysis takes more time and goes deeper. By the Commission's description, a phase II investigation "typically involves more extensive information gathering, including companies' internal documents, extensive economic data, more detailed questionnaires to market participants, and/or site visits." From the start of such a probe, the regulatory body has 90 days to make a decision.

The EU Commission would not directly confirm its plans to investigate the Adobe/Figma merger. In an email, spokesperson Marta Perez-Cejuela told Gizmodo, "this transaction has not been formally notified to the Commission." Such notification is a requirement before any investigation can move forward. Commission officials requested that Adobe submit an official notification in February. Despite the Commission's lack of formal announcement, an EU probe into the acquisition is expected. Already the U.S. Department of Justice and the United Kingdom's Competition and Markets Authority are looking into the digital design tool deal. The DOJ is reportedly preparing to file an antitrust suit blocking the merger, while the UK CMA is actively investigating the acquisition, with a first decision due by the end of June.

AMD

AMD Likely To Offer Details on AI Chip in Challenge To Nvidia (reuters.com) 18

Advanced Micro Devices on Tuesday is expected to reveal new details about an AI "superchip" that analysts believe will be a strong challenger to Nvidia, whose chips dominate the fast-growing artificial intelligence market. From a report: AMD Chief Executive Lisa Su will give a keynote address at an event in San Francisco on the company's strategy in the data center and AI markets. Analysts expect fresh details about a chip called the MI300, AMD's most advanced graphics processing unit, the category of chips that companies like OpenAI use to develop products such as ChatGPT. Nvidia dominates the AI computing market with 80% to 95% of market share, according to analysts.

Last month, Nvidia's market capitalization briefly touched $1 trillion after the company said it expected a jump in revenue after it secured new chip supplies to meet surging demand. Nvidia has few competitors working at a large scale. While Intel and several startups such as Cerebras Systems and SambaNova Systems have competing products, Nvidia's biggest sales threat so far is the internal chip efforts at Alphabet's Google and Amazon's cloud unit, both of which rent their custom chips to outside developers.

Toys

New Spider-Man Movie Features Lego Scene Made By 14-Year-Old (yahoo.com) 35

Isaac-Lew (Slashdot reader #623) writes: The Lego scene in "Spider-Man: Across The Spider-Verse" was animated by a 14-year-old high school student after the producers saw the trailer he made that was animated Lego-style.
The teenager had used his father's old computers to recreate the trailer "shot for shot to look as if it belonged in a Lego world," reports the New York Times: By that point, he had been honing his skills for several years making short computer-generated Lego videos. "My dad showed me this 3-D software called Blender and I instantly got hooked on it," he said. "I watched a lot of YouTube videos to teach myself certain stuff..."

[A]fter finding the movie's Toronto-based production designer, Patrick O'Keefe, on LinkedIn, and confirming that Sony Pictures Animation's offer was legitimate, Theodore Mutanga, a medical physicist, built his son a new computer and bought him a state-of-the-art graphics card so he could render his work much faster... Over several weeks, first during spring break and then after finishing his homework on school nights, the younger Mutanga worked on the Lego sequence... Christopher Miller [a director of "The Lego Movie" and one of the writer-producers of "Spider-Verse"] saw Mutanga's contribution to "Across the Spider-Verse" not only as a testament to the democratization of filmmaking, but also to the artist's perseverance: he dedicated intensive time and effort to animation, which is "not ever fast or easy to make," Miller said.

"'The Lego Movie' is inspired by people making films with Lego bricks at home," Phil Lord, Miller's fellow writer-producer, said by video. "That's what made us want to make the movie. Then the idea in 'Spider-Verse' is that a hero can come from anywhere. And here comes this heroic young person who's inspired by the movie that was inspired by people like him."

Desktops (Apple)

Apple's New Proton-like Tool Can Run Windows Games on a Mac (theverge.com) 50

If you're hoping to see more Windows games on Mac, those dreams might finally come true soon. From a report: Apple has dropped some big news for game developers at its annual Worldwide Developers Conference (WWDC) this week, making it far easier and quicker to port Windows games to Mac thanks to a Proton-like environment that can translate and run the latest DirectX 12 Windows games on macOS. Apple has created a new Game Porting Toolkit that's similar to the work Valve has done with Proton and the Steam Deck.

It's powered by source code from CrossOver, a Wine-based solution for running Windows games on macOS. Apple's tool will instantly translate Windows games to run on macOS, allowing developers to launch an unmodified version of a Windows game on a Mac and see how well it runs before fully porting a game. Mac gaming has long been a running joke in the PC gaming community, with ports of Resident Evil Village and No Man's Sky standing as rare recent exceptions to macOS gaming being largely ignored.

"The new Game Porting Toolkit provides an emulation environment to run your existing unmodified Windows game and you can use it to quickly understand the graphics feature usage and performance potential of your game when running on a Mac," explains Aiswariya Sreenivassan, an engineering project manager for GPUs and graphics at Apple, in a WWDC session earlier this week.

Apple

Apple Unveils M2 Ultra Processor (venturebeat.com) 94

Apple announced the M2 Ultra processor, a new chip for its Mac Studio workstation aimed at professional users. From a report: The chip has 134 billion transistors and 24 central processing unit (CPU) cores with 20% faster performance. It has up to 76 graphics processing unit (GPU) cores at up to 30% faster performance. Apple made the announcement at its WWDC event today on the Apple campus in Cupertino, California.

The chip will go into the Mac Studio and the new Mac Pro; the Mac Pro previously used Intel silicon. These are machines like those used by engineers to deliver Saturday Night Live or create blockbuster movies, said Jennifer Munn at Apple. Apple said this completes the transition to Apple silicon. Developers can build new versions of apps at warp speed, with up to 25% faster performance than in the past, Munn said. The 32-core neural engine is 40% faster at AI calculations. The chip supports 192 gigabytes of unified memory, which is 50% more than the M1 Ultra.

Desktops (Apple)

Apple Tests New High-End Macs With M2 Max and M2 Ultra Chips Ahead of WWDC (bloomberg.com) 16

Apple is testing a pair of new high-end Macs and their accompanying processors ahead of its Worldwide Developers Conference next week, suggesting that it's nearing the release of professional-focused desktop computers. From a report: The company is planning two new Mac models -- labeled internally as Mac 14,13 and Mac 14,14 -- that run the M2 Max processor announced in January and a yet-to-be-unveiled M2 Ultra chip. That second processor would replace the M1 Ultra model currently featured in the Mac Studio, a high-end desktop announced in March 2022. [...]

The first desktop computer in testing is running an M2 Max processor with eight high-performance cores -- components for the most demanding tasks -- as well as four efficiency cores and 30 graphics cores. Those are the same specifications featured in the MacBook Pro with the M2 Max. This particular machine also includes 96 gigabytes of memory and is running macOS 13.4, the version of the Mac operating system that was just released earlier this month. The second machine in testing has what is labeled as an M2 Ultra chip, which the company hasn't yet announced. That component, which sports 24 processing cores, doubles the performance of the M2 Max model. The chip includes 16 high-performance cores and eight efficiency cores, as well as 60 graphics cores. The company is testing it in configurations with 64 gigabytes, 128 gigabytes and 192 gigabytes of memory.

Microsoft

Microsoft Signs Deal for AI Computing Power With Nvidia-backed CoreWeave That Could Be Worth Billions (cnbc.com) 3

Microsoft's massive investment in OpenAI has put the company at the center of the artificial intelligence boom. But it's not the only place where the software giant is opening its wallet to meet the surging demand for AI-powered services. From a report: CNBC has learned from people with knowledge of the matter that Microsoft has agreed to spend potentially billions of dollars over multiple years on cloud-computing infrastructure from startup CoreWeave, which announced on Wednesday that it raised $200 million. That financing comes just over a month after the company attained a valuation of $2 billion. CoreWeave sells simplified access to Nvidia's graphics processing units, or GPUs, which are considered the best available on the market for running AI models.

Microsoft signed the CoreWeave deal earlier this year in order to ensure that OpenAI, which operates the viral ChatGPT chatbot, will have adequate computing power going forward, said one of the people, who asked not to be named due to confidentiality. OpenAI relies on Microsoft's Azure cloud infrastructure for its hefty compute needs.

Intel

Intel's Revival Plan Runs Into Trouble. 'We Had Some Serious Issues.' (wsj.com) 79

Rivals such as Nvidia have left Intel far behind. CEO Pat Gelsinger aims to reverse the firm's fortunes by vastly expanding its factories. From a report: Pat Gelsinger is keenly aware he must act fast to stop Intel from becoming yet another storied American technology company left in the dust by nimbler competitors. Over the past decade, rivals overtook Intel in making the most advanced chips, graphics-chip maker Nvidia leapfrogged Intel to become America's most valuable semiconductor company, and perennial also-ran AMD has been stealing market share. Intel, by contrast, has faced repeated delays introducing new chips and frustration from would-be customers. "We didn't get into this mud hole because everything was going great," said Gelsinger, who took over as CEO in 2021. "We had some serious issues in terms of leadership, people, methodology, et cetera that we needed to attack."

As he sees it, Intel's problems stem largely from how it botched a transition in how chips are made. Intel came to prominence by both designing circuits and making them in its own factories. Now, chip companies tend to specialize either in circuit design or manufacturing, and Intel hasn't been able to pick up much business making chips designed by other people. So far, the turnaround has been rough. Gelsinger, 62 years old and a devout Christian, said he takes inspiration from the biblical story of Nehemiah, who rebuilt the walls of Jerusalem under attack from his enemies. Last year, he told a Christian group in Singapore: "You'll have your bad days, and you need to have a deep passion to rebuild." Gelsinger's plan is to invest as much as hundreds of billions of dollars into new factories that would make semiconductors for other companies alongside Intel's own chips. Two years in, that contract-manufacturing operation, called a "foundry" business, is bogged down with problems.

Hardware

ASUS Shows Off Concept GeForce RTX 40 Graphics Card Without Power-Connectors, Uses Proprietary Slot (wccftech.com) 90

ASUS is extending its connector-less design to graphics cards and has showcased its first such GPU, a GeForce RTX 40 design that features no power plugs. From a report: During our tour of the ASUS HQ, the ROG team gave us a first look at an upcoming graphics card (currently still in the concept phase) that is part of its GeForce RTX 40 family. The graphics card itself was a GeForce RTX 4070 design, but it doesn't fall under any existing VGA product lineup and comes in an interesting design.

The graphics card itself is a 2.3-slot design that features a triple Axial-tech cooling fan system and, once again, it isn't part of any existing GPU lineup from ASUS such as ROG STRIX, TUF Gaming, Dual, etc. The backside of the card features an extended backplate that reaches beyond the PCB, with a cut-out for air to pass through. The card also comes with a dual-BIOS switch that lets you toggle between "Performance" and "Quiet" modes, and while there's "Megalodon" naming on the backplate, we were told that isn't the final branding for this card.

AMD

AMD's and Nvidia's Latest Sub-$400 GPUs Fail To Push the Bar on 1440p Gaming (theverge.com) 96

An anonymous reader shares a report: I'm disappointed. I've been waiting for AMD and Nvidia to offer up more affordable options for this generation of GPUs that could really push 1440p into the mainstream, but what I've been reviewing over the past week hasn't lived up to my expectations. Nvidia and AMD are both releasing new GPUs this week that are aimed at the budget PC gaming market. After seven years of 1080p dominating the mainstream, I was hopeful this generation would deliver 1440p value cards. Instead, Nvidia has started shipping a $399 RTX 4060 Ti today that the company is positioning as a 1080p card and not the 1440p sweet spot it really should be at this price point.

AMD is aggressively pricing its new Radeon RX 7600 at just $269, and it's definitely more suited to 1080p at that price point and performance level. I just wish there were an option between the $300 and $400 marks that offered enough performance to push us firmly into the 1440p era. More than 60 percent of PC gamers are playing at 1080p, according to Valve's latest Steam data. That means GPU makers like AMD and Nvidia don't have to target 1440p with cards that sell in high volume because demand seems to be low. Part of that low demand could be because a monitor upgrade isn't a common purchase for PC gamers, or because they'd have to pay more for a graphics card that can handle 1440p. That's probably also why both of these cards still ship with just 8GB of VRAM: why ship more if you're only targeting 1080p? A lower resolution doesn't need as much VRAM for texture quality.

I've been testing both cards at 1080p and 1440p to get a good idea of where they sit in the GPU market right now. It's fair to say that the RTX 4060 Ti essentially offers the same 1440p performance as an RTX 3070 for $399. That's $100 less than the RTX 3070's $499 price point, which, in October 2020, I said offered a 1440p sweet spot for games at the time. It's now nearly three years on, and I'd certainly expect more performance here at 1440p. Why is yesterday's 1440p card suddenly a 1080p one for Nvidia?

IT

Nvidia Announces a $299 RTX 4060 With the 4060 Ti Arriving May 24 For $399 (theverge.com) 50

Nvidia has officially announced its RTX 4060 family of GPUs. This includes the RTX 4060 Ti, which will debut next week on May 24th starting at $399, and -- perhaps the biggest news -- the RTX 4060, which will be available in July for just $299, $30 less than the RTX 3060's original retail price. A 16GB version of the RTX 4060 Ti is also due in July for $499. From a report: Nvidia's 60-class GPUs are the most popular among PC gamers on Steam, and the launch of the RTX 4060 family marks the first time we've seen Nvidia's latest RTX 40-series cards available under the $500 price point, let alone under $300. The $399 RTX 4060 Ti will ship on May 24th with just 8GB of VRAM, while a 16GB model is due in July priced at $499. There's an ongoing debate over the value of 8GB cards in the PC gaming community right now, particularly with the arrival of more demanding games that really push the limits of GPU memory even at 1080p (if you want all the max settings enabled, that is). It's a much bigger issue at 1440p and, of course, 4K resolutions, but Nvidia appears to be positioning its RTX 4060 Ti card for the 1080p market. [...] Specs-wise, the RTX 4060 Ti will be a 22 teraflop card with AV1 encoder support and more efficient energy usage. The total graphics power is 160 watts on both the RTX 4060 Ti 8GB and 16GB models, with Nvidia claiming the average gaming power usage will be around 140 watts. The RTX 3060 Ti had a total graphics power of 200 watts, and Nvidia says it uses 197 watts during games on average, so there are some impressive power efficiency improvements here.

AI

AI Needs Specialized Processors. Crypto Miners Say They Have Them (bloomberg.com) 23

When the Ethereum blockchain moved away from using a technique for verifying transactions known as proof of work last September, crypto market demand for the specialized processors that performed these calculations disappeared virtually overnight. Companies that used and hosted GPUs, or graphics processing units, saw a key part of their once-booming business vanish against an increasingly difficult backdrop for crypto. But now mining infrastructure companies like Hive Blockchain and Hut 8 Mining are finding opportunities to repurpose their GPU-based equipment for another industry on the precipice of a possible boom: artificial intelligence. From a report: "If you can reapply some of that investment in the GPU mining infrastructure and convert it to new cards and workloads, it makes sense," Hut 8 Chief Executive Officer Jaime Leverton said in an interview. GPUs -- designed to accelerate graphics rendering -- require constant maintenance and physical infrastructure not all users are prepared to provide. As such, Hut 8 and a few other miners have been using the chips to power high-performance computing, or HPC, services for clients across a range of industries. But inroads with the burgeoning and much-hyped AI sector -- which requires huge amounts of computing power -- represent the kind of transformational opportunity miners had been seeking when they originally bought the processors.

AI

What Happens When AI Tries to Generate a Pizza Commercial? (today.com) 61

The Today show's food reporter delivers a strange report on a viral AI-generated ad "for an imaginary pizza place called 'Pepperoni Hug Spot'."

Everything looks slightly ... off. Because the spot is AI-generated, the audience is constantly reminded through the uncanny valley that the people aren't real — and neither is the pizza. "Cheese, pepperoni, vegetable, and more secret things," says the voiceover, which is also artificially generated... "Knock, knock, who's there? Pizza magic," the AI narrator says after a delivery driver (whose steering column is on the left side of his car) is shown delivering a pizza.

"Eat Pepperoni Hug Spot pizza. Your tummy say 'Thank you.' Your mouth say, 'Mmm,'" the ad continues while showing a trio of women eating pizza in the oddest possible fashion, complete with bizarre cheese pulls and facial contortions out of a food-based nightmare. "Pepperoni Hug Spot: Like family, but with more cheese..."

Using AI technologies Runway Gen-2, ChatGPT-4, Eleven Labs, Midjourney and Soundraw AI, the creator was able to produce the background music, voiceover, graphics, video and even generate the script for the ad. "I used Adobe After Effects to combine all the elements, adding title cards, transitions, and graphics," he adds... Seeing it spread, he whipped up a website that fit the uncanny vibe of the commercial and even created merch including hats and T-shirts.

"I figured I should capitalize on my 15 minutes of internet fame, right?" he jokes.

Twitter CEO Elon Musk "simply responded with an exploding head emoji."

And Pizza Hut's official Twitter account posted their reaction: "My heebies have been jeebied."

UPDATE: On Saturday, Pizza Hut Canada "transformed" one of its restaurants into the restaurant from the commercial, emblazoning the Pepperoni Hug Spot logo onto its boxes, employee t-shirts, and the sign outside. There are two videos on the official Instagram feed for Pizza Hut Canada (which for the occasion changed its tagline to "Like family, but with more cheese.")

One video closes by promising the pizza does, indeed, contain "secret things."

Graphics

Nvidia Details 'Neural Texture Compression', Claims Significant Improvements (techspot.com) 17

Slashdot reader indominabledemon shared this article from TechSpot: Games today use highly-detailed textures that can quickly fill the frame buffer on many graphics cards, leading to stuttering and game crashes in recent AAA titles for many gamers... [T]he most promising development in this direction so far comes from Nvidia — neural texture compression could reduce system requirements for future AAA titles, at least when it comes to VRAM and storage.... In a research paper published this week, the company details a new algorithm for texture compression that is supposedly better than both traditional block compression (BC) methods as well as other advanced compression techniques such as AVIF and JPEG-XL.

The new algorithm is simply called neural texture compression (NTC), and as the name suggests it uses a neural network designed specifically for material textures. To make this fast enough for practical use, Nvidia researchers built several small neural networks optimized for each material... [T]extures compressed with NTC preserve a lot more detail while also being significantly smaller than even these same textures compressed with BC techniques to a quarter of the original resolution... Researchers explain the idea behind their approach is to compress all these maps along with their mipmap chain into a single file, and then have them be decompressed in real time with the same random access as traditional block texture compression...
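For readers curious what "a small neural network per material" means in practice, here is a minimal, purely illustrative sketch in Python: a tiny decoder that maps a texel coordinate and mip level, via a learned latent grid, to a full bundle of material channels. The grid resolution, layer sizes, and channel layout are invented for the example and are not Nvidia's actual NTC architecture or file format.

```python
import numpy as np

# Hypothetical sizes -- purely illustrative, not Nvidia's actual NTC design.
GRID_RES = 64        # resolution of the learned latent feature grid
FEAT_DIM = 8         # latent features stored per grid cell
HIDDEN = 32          # width of the tiny per-material MLP
OUT_CHANNELS = 9     # e.g. albedo (3) + normal (3) + roughness/metalness/AO (3)

rng = np.random.default_rng(0)

# In a real system these would be trained to reproduce the material's maps;
# here they are random so the example stays self-contained and runnable.
latent_grid = rng.standard_normal((GRID_RES, GRID_RES, FEAT_DIM)).astype(np.float32)
w1 = rng.standard_normal((FEAT_DIM + 1, HIDDEN)).astype(np.float32) * 0.1
w2 = rng.standard_normal((HIDDEN, OUT_CHANNELS)).astype(np.float32) * 0.1

def sample_material(u: float, v: float, mip: float) -> np.ndarray:
    """Decode all material channels for one texel, with random access.

    The latent grid plus the tiny MLP play the role of the compressed
    texture set: every map (albedo, normals, roughness, ...) and every
    mip level is reconstructed from the same shared representation.
    """
    # Nearest-neighbour fetch from the latent grid (real systems interpolate).
    x = int(u * (GRID_RES - 1))
    y = int(v * (GRID_RES - 1))
    features = latent_grid[y, x]

    # Condition the decoder on the requested mip level so one network
    # can serve the whole mipmap chain.
    inp = np.concatenate([features, [mip]])

    hidden = np.maximum(inp @ w1, 0.0)   # ReLU
    return hidden @ w2                   # all channels for this texel

if __name__ == "__main__":
    texel = sample_material(0.25, 0.75, mip=2.0)
    print("decoded channels:", np.round(texel, 3))
```

The point of the sketch is the shape of the idea: one compact learned representation serves every map and every mip level, and any texel can be decoded independently, which is what gives the scheme the same random-access property as block compression.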

However, NTC does have some drawbacks that may limit its appeal. First, as with any lossy compression, it can introduce visual degradation at low bitrates. Researchers observed mild blurring, the removal of fine details, color banding, color shifts, and features leaking between texture channels. Furthermore, game artists won't be able to optimize textures in all the same ways they do today, for instance, by lowering the resolution of certain texture maps for less important objects or NPCs. Nvidia says all maps need to be the same size before compression, which is bound to complicate workflows. This sounds even worse when you consider that the benefits of NTC don't apply at larger camera distances.

Perhaps the biggest disadvantages of NTC have to do with texture filtering. As we've seen with technologies like DLSS, there is potential for image flickering and other visual artifacts when using textures compressed through NTC. And while games can utilize anisotropic filtering to improve the appearance of textures in the distance at a minimal performance cost, the same isn't possible with Nvidia's NTC at this point.

Graphics

New Intel Linux Graphics Driver Patches Released, Up To 10-15% Better Performance (phoronix.com) 7

A new set of patches has been released for the Intel Linux graphics driver that "can provide 10-15% better performance when operating in the tuned mode," reports Phoronix. From the report: The set of Intel i915 Linux kernel graphics driver patches is about exposing the Intel RPS (Requested Power State) up/down thresholds. Right now the Intel Linux kernel driver has static values set for the up/down thresholds between power states, while these patches would make them dynamically configurable from user-space. Google engineer Syed Faaiz Hussain raised the issue after experimenting with Intel RPS tuning and managing up to 15% better performance. Counter-Strike: Global Offensive with OpenGL saw a 14.5% boost, CS:GO with Vulkan was 12.9% faster, and Civilization VI with OpenGL was 11% faster, while Strange Brigade was unchanged. No other game numbers were provided.

But as this is about changing the threshold for how aggressively the Intel graphics hardware switches power states, the proposed patches leave it up to user-space to adjust the thresholds as it sees fit. Google engineers are interested in hooking this into Feral's GameMode so that the values could be automatically tuned when launching games and then returned to their former state when done gaming, in order to maximize battery life / power efficiency. The only downside with these current patches is that they work only for non-GuC-based platforms... So the latest Alder/Raptor Lake notebooks as well as Intel DG2/Alchemist discrete graphics currently aren't able to make use of this tuning option.
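To illustrate the save-tune-restore pattern a GameMode-style hook would presumably follow, here is a hedged sketch. The sysfs paths and threshold names below are placeholders invented for the example; the real interface depends on how (and whether) the i915 patches land upstream, and writing the knobs would require root.

```python
import contextlib
from pathlib import Path

# Hypothetical sysfs knobs: the real names and locations depend on how the
# i915 patches land upstream, so treat these paths as placeholders only.
RPS_UP_THRESHOLD = Path("/sys/class/drm/card0/gt/gt0/rps_up_threshold_pct")
RPS_DOWN_THRESHOLD = Path("/sys/class/drm/card0/gt/gt0/rps_down_threshold_pct")

@contextlib.contextmanager
def tuned_rps(up: int, down: int):
    """Apply aggressive RPS thresholds for a gaming session, then restore.

    This mirrors what a GameMode hook would do: save the current values,
    write the tuned ones when a game launches, and put the originals back
    when the game exits so battery-friendly defaults return.
    """
    saved = {p: p.read_text() for p in (RPS_UP_THRESHOLD, RPS_DOWN_THRESHOLD)}
    try:
        RPS_UP_THRESHOLD.write_text(str(up))
        RPS_DOWN_THRESHOLD.write_text(str(down))
        yield
    finally:
        for path, value in saved.items():
            path.write_text(value)

if __name__ == "__main__":
    # Example: ramp up to higher GPU clocks sooner (and back off later)
    # while the game runs. Requires root and a kernel carrying the patches.
    with tuned_rps(up=85, down=60):
        print("launch the game here, e.g. via subprocess.run([...])")
```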

Open Source

Linux Kernel 6.3 Released (zdnet.com) 16

An anonymous reader quotes a report from ZDNet, written by Steven Vaughan-Nichols: The latest Linux kernel is out with a slew of new features -- and, for once, this release has been nice and easy. [...] Speaking of Rust, everyone's favorite memory-safe language, the new kernel comes with user-mode Linux support for Rust code. Miguel Ojeda, the Linux kernel developer who's led the efforts to bring Rust to Linux, said the additions mean we're "getting closer to a point where the first Rust modules can be upstreamed."

Other features in the Linux 6.3 kernel include support and enablement for upcoming and yet-to-be-released Intel and AMD CPUs and graphics hardware. While these updates will primarily benefit future hardware, several changes in this release directly impact today's users' day-to-day experience. The kernel now supports AMD's automatic Indirect Branch Restricted Speculation (IBRS) feature for Spectre mitigation, providing a less performance-intensive alternative to the retpoline speculative-execution mitigation.
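For users who want to see which Spectre v2 mitigation their own kernel picked (retpolines on older setups, automatic IBRS on supported AMD CPUs with newer kernels), the standard sysfs vulnerabilities interface already reports it; a minimal check might look like this.

```python
from pathlib import Path

# Standard kernel interface; present on any reasonably recent Linux system.
SPECTRE_V2 = Path("/sys/devices/system/cpu/vulnerabilities/spectre_v2")

def spectre_v2_mitigation() -> str:
    """Report the Spectre v2 mitigation the kernel selected at boot."""
    try:
        return SPECTRE_V2.read_text().strip()
    except FileNotFoundError:
        return "interface not available (very old kernel?)"

if __name__ == "__main__":
    # On an AMD CPU with Linux 6.3+ this may mention automatic IBRS
    # instead of retpolines; older setups typically report retpoline.
    print("spectre_v2:", spectre_v2_mitigation())
```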

Linux 6.3 also includes new power management drivers for ARM and RISC-V architectures. RISC-V has gained support for accelerated string functions via the Zbb bit manipulation extension, while ARM received support for scalable matrix extension 2 instructions. For filesystems, Linux 6.3 brings AES-SHA2-based encryption support for NFS, optimizations for EXT4 direct I/O performance, low-latency decompression for EROFS, and a faster Btrfs file-system driver. Bottom line: many file operations will be a bit more secure and faster.

For gamers, the new kernel provides a native Steam Deck controller interface in HID. It also includes compatibility for the Logitech G923 Xbox edition racing wheel and improvements to the 8BitDo Pro 2 wired game controllers. Who says you can't game on Linux? Single-board computers, such as the Banana Pi R3, BPI-M2 Pro, and Orange Pi R1 Plus, also benefit from updated drivers in this release. There's also support for more Wi-Fi adapters and chipsets. These include: Realtek RTL8188EU Wi-Fi adapter support; Qualcomm Wi-Fi 7 wireless chipset support; and Ethernet support for NVIDIA BlueField 3 DPU. For users dealing with complex networks that mix old-school and modern addressing, the new kernel's multi-path TCP support can also handle mixed IPv4 and IPv6 flows.
Linux 6.3 is available from kernel.org. You can learn how to compile the Linux kernel yourself here.
