Hardware

Nvidia Announces Next-Gen RTX 4090 and RTX 4080 GPUs (theverge.com) 178

Nvidia is officially announcing its RTX 40-series GPUs today. After months of rumors and some recent teasing from Nvidia, the RTX 4090 and RTX 4080 are now both official. The RTX 4090 arrives on October 12th priced at $1,599, with the RTX 4080 priced starting at $899 and available in November. Both are powered by Nvidia's next-gen Ada Lovelace architecture. From a report: The RTX 4090 is the top-end card for the Lovelace generation. It will ship with a massive 24GB of GDDR6X memory. Nvidia claims it's 2-4x faster than the RTX 3090 Ti, and it will consume the same amount of power as that previous generation card. Nvidia recommends a power supply of at least 850 watts based on a PC with a Ryzen 5900X processor. Inside the giant RTX 4090 there are 16,384 CUDA Cores, a base clock of 2.23GHz that boosts up to 2.52GHz, 1,321 Tensor-TFLOPs, 191 RT-TFLOPs, and 83 Shader-TFLOPs.

Nvidia is actually offering the RTX 4080 in two models, one with 12GB of GDDR6X memory and another with 16GB of GDDR6X memory, and Nvidia claims it's 2-4x faster than the existing RTX 3080 Ti. The 12GB model will start at $899 and include 7,680 CUDA Cores, a 2.31GHz base clock that boosts up to 2.61GHz, 639 Tensor-TFLOPs, 92 RT-TFLOPs, and 40 Shader-TFLOPs. The 16GB model of the RTX 4080 isn't just a bump to memory, though. Priced starting at $1,199, it's more powerful, with 9,728 CUDA Cores, a base clock of 2.21GHz that boosts up to 2.51GHz, 780 Tensor-TFLOPs, 113 RT-TFLOPs, and 49 Shader-TFLOPs of power. The 12GB RTX 4080 model will require a 700 watt power supply, with the 16GB model needing at least 750 watts. Both RTX 4080 models will launch in November.
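
Those shader-TFLOPs figures, incidentally, line up with the usual back-of-the-envelope formula of two FP32 operations (one fused multiply-add) per CUDA core per clock. A quick sketch of the arithmetic; the 2-FLOPs-per-core-per-cycle factor is the conventional assumption, not something Nvidia spells out in the announcement:

```python
# Rough FP32 shader-TFLOPs estimate: cores * boost clock (GHz) * 2 FLOPs per cycle / 1000.
# The factor of 2 (one fused multiply-add per core per clock) is an assumption,
# not a figure from Nvidia's announcement.
cards = {
    "RTX 4090":      (16_384, 2.52),
    "RTX 4080 16GB": (9_728,  2.51),
    "RTX 4080 12GB": (7_680,  2.61),
}

for name, (cuda_cores, boost_ghz) in cards.items():
    tflops = cuda_cores * boost_ghz * 2 / 1_000
    print(f"{name}: ~{tflops:.0f} shader-TFLOPs")
# Prints roughly 83, 49, and 40 -- matching Nvidia's quoted figures.
```
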
Further reading: Nvidia Puts AI at Center of Latest GeForce Graphics Card Upgrade.
Books

'Linux IP Stacks Commentary' Book Tries Free Online Updates (satchell.net) 13

Recently the authors of Elements of Programming shared an update. "After ten years in print, our publisher decided against further printings and has reverted the rights to us. We are publishing Elements of Programming in two forms: a free PDF and a no-markup paperback."

And that's not the only old book that's getting a new life on the web...

22 years ago, long-time Slashdot reader Stephen T. Satchell (satch89450) co-authored Linux IP Stacks Commentary, a book commenting on the TCP/IP code in Linux kernel 2.0.34. ("Old-timers will remember the Lions' Unix Commentary, the book published on the sly as university xerographic copies. Same sort of thing.") But the print edition couldn't keep pace with releases of the Linux kernel itself, and Satchell wrote a Slashdot post exploring ways to fund a possible update.

At the time Slashdot's editors noted that "One of the largest complaints about Linux is that there is a lack of high-profile documentation. It would be sad if this publication were not made simply because of the lack of funds (which some people would see as a lack of interest) necessary to complete it." But that's how things seemed to end up — until Satchell suddenly reappeared to share this update from 2022: When I was released from my last job, I tried retirement. Wasn't for me. I started going crazy with nothing significant to do. So, going through old hard drives (that's another story), I found the original manuscript files, plus the page proof files, for that two-decade-old book. Aha! Maybe it's time for an update. But how to keep it fresh, as Torvalds continues to release new updates of the Linux kernel?

Publish it on the Web. Carefully.

After four months (and three job interviews) I have the beginnings of the second edition up and available for reading. At the moment it's an updated, corrected, and expanded version of the "gray matter", the exposition portions of the first edition....

The URL for the alpha-beta version of this Web book is satchell.net/ipstacks for your reading pleasure. The companion e-mail address is up and running for you to provide feedback. There is no paywall.

But there's also an ingenious solution to the problem of updating the text as the code of the kernel keeps changing: Thanks to the work of Professor Donald Knuth (thank you!) on his WEB and CWEB programming languages, I have made modifications to devise a method for integrating code from the Git repository of the Linux kernel without making any modifications (let alone submissions) to said kernel code. The proposed method is described in the About section of the Web book. I have scaffolded the process and it works. But that's not the hard part.

The hard part is to write the commentary itself, and crib some kind of Markup language to make the commentary publishing quality. The programs I write will integrate the kernel code with the commentary verbiage into a set of Web pages. Or two slightly different sets of web pages, if I want to support a mobile-friendly version of the commentary.
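The description suggests a literate-programming pipeline in the CWEB tradition: pull a pinned revision of a kernel source file straight out of Git, then interleave it with separately written commentary. Here is a minimal sketch of that idea in Python; the commentary file layout, the "@code" tag format, and the use of git show are illustrative assumptions, not Satchell's actual tooling:

```python
import subprocess

# Hypothetical commentary markup: lines like "@code net/ipv4/icmp.c:10-25" pull
# in a range of kernel source lines; everything else is commentary prose.
KERNEL_REPO = "/path/to/linux"   # local clone of the kernel Git repository
KERNEL_TAG = "v6.0"              # pin the commentary to one released version

def kernel_lines(path, start, end):
    """Fetch lines [start, end] of a file at KERNEL_TAG without touching the tree."""
    blob = subprocess.check_output(
        ["git", "-C", KERNEL_REPO, "show", f"{KERNEL_TAG}:{path}"], text=True)
    return blob.splitlines()[start - 1:end]

def weave(commentary_path):
    """Interleave commentary prose with the kernel code it discusses."""
    html = ["<html><body>"]
    for line in open(commentary_path):
        if line.startswith("@code "):
            path, span = line.split()[1].rsplit(":", 1)
            start, end = map(int, span.split("-"))
            code = "\n".join(kernel_lines(path, start, end))
            html.append(f"<pre>{code}</pre>")
        else:
            html.append(f"<p>{line.strip()}</p>")
    html.append("</body></html>")
    return "\n".join(html)
```

Because `git show` reads files at a tagged revision, the kernel tree itself is never modified, which is the whole point of the scheme.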

Another reason for making it a web book is that I can write it and publish it as it comes out of my virtual typewriter. No hard deadlines. No waiting for the printers. And while this can save trees, that's not my intent. The back-of-the-napkin schedule calls for me to finish the expository text in September, start the Python coding for generating commentary pages at the same time, and start writing the commentary on the Internet Control Message Protocol in October. By then, Linus should have version 6.0.0 of the Linux kernel released.

I really, really, really don't want to charge readers to view the web book. Especially as it's still in the virtual typewriter. There isn't any commentary (yet). One thing I have done is to make it as mobile-friendly as I can, because I suspect the target audience will want to read this on a smartphone or tablet, and not be forced to resort to a large-screen laptop or desktop. Also, the graphics are lightweight to minimize the cost for people who pay by the kilopacket. (Does anywhere in the world still do this? Inquiring minds want to know.)

I host this web site on a Protectli appliance in my apartment, so I don't have that continuing expense. The power draw is around 20 watts. My network connection is AT&T fiber — and if it becomes popular I can always upgrade the upstream speed.

The thing is, the cat needs his kibble. I still want to know if there is a source of funding available.

Also, is it worthwhile to make the pages available in a zip file? Then a reader could download a snapshot of the book, and read it off-line.
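For what it's worth, producing such a snapshot is a small job for the Python tooling he already plans to write; a sketch, with the directory name invented for illustration:

```python
import shutil

# Bundle the rendered web book into ipstacks-snapshot.zip for offline reading.
# "site/" is a placeholder for wherever the generated pages actually live.
shutil.make_archive("ipstacks-snapshot", "zip", root_dir="site")
```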

Bitcoin

GPU Mining No Longer Profitable After Ethereum Merge (tomshardware.com) 163

Just one day after the Ethereum Merge, where the cryptocurrency successfully switched from Proof of Work (PoW) to Proof of Stake (PoS), the profitability of GPU mining has completely collapsed. Tom's Hardware reports: That means the best graphics cards should finally be back where they belonged, in your gaming PC, just as god intended. That's a quick drop, considering yesterday there were still a few cryptocurrencies that were technically profitable. Looking at WhatToMine, and using the standard $0.10 per kWh, the best-case results are with the GeForce RTX 3090 and Radeon RX 6800 and 6800 XT. Those are technically showing slightly positive results, to the tune of around $0.06 per day after power costs. However, that doesn't factor in the power drawn by the rest of the PC, or the wear and tear on your graphics card.

Even at a slightly positive net result, it would still take over 20 years to break even on the cost of an RX 6800. We say that tongue-in-cheek, because if there's one thing we know for certain, it's that no one can predict what the cryptocurrency market will look like even one year out, never mind 20 years in the future. It's a volatile market, and there are definitely lots of groups and individuals hoping to figure out a way to Make GPU Mining Profitable Again (MGMPA hats inbound...)
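The break-even arithmetic is easy to reproduce. A quick check, assuming a card price around the RX 6800's $579 launch MSRP (our figure, not one from the article):

```python
card_price = 579.00   # assumed RX 6800 launch MSRP in USD, not a figure from the article
net_per_day = 0.06    # best-case daily profit after electricity, per WhatToMine

days = card_price / net_per_day
print(f"{days:,.0f} days, or about {days / 365:.0f} years, to break even")
# -> 9,650 days, roughly 26 years -- before counting the rest of the PC's power or wear.
```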

Of the 21 current generation graphics cards from the AMD RX 6000-series and the Nvidia RTX 30-series, only five are theoretically profitable right now, and those are all just barely in the black. This is using data from NiceHash and WhatToMine, so perhaps there are ways to tune other GPUs to get into the net positive, but the bottom line is that no one should be using GPUs for mining right now, and certainly not buying more GPUs for mining purposes. [You can see a full list of the current profitability of the current generation graphics cards here.]

Graphics

EVGA Abandons the GPU Market, Reportedly Citing Conflicts With Nvidia (tomshardware.com) 72

UnknowingFool writes: After a decades-long partnership with Nvidia, EVGA has announced it is ending the relationship. Citing conflicts with Nvidia, EVGA CEO Andrew Han said the company will not partner with Intel or AMD, and will be exiting the GPU market completely. The company will continue to sell existing RTX 30-series cards until its stock runs out but will not release a 4000-series card. YouTube channels JayzTwoCents and GamersNexus broke the news after sitting down with Han to discuss his frustrations with Nvidia as a partner. Jon Peddie Research also published a brief article on the matter.
Graphics

Canva, the $26 Billion Design Startup, Launches a Productivity Suite To Take On Google Docs, Microsoft Office (fortune.com) 20

Canva, the Australian graphic design business valued at $26 billion, is introducing a new suite of digital workplace products that "represent a direct challenge to Google Docs, Microsoft Office, and Adobe, whose digital tools are mainstays of the modern workplace," reports Fortune. However, Cliff Obrecht, Canva co-founder and COO, claims that Canva isn't trying to compete with these corporate behemoths. "Instead, he sees Canva as a visual-first companion to these tools," reports TechCrunch.

"We're not trying to compete head-to-head with Google Docs," Obrecht told TechCrunch. "Our products are inherently visual, so we take a very visual lens on, what does a visual document look like? How do you turn that boring document that's all text based into something engaging?" Fortune reports: With the launch, Canva hopes to transform itself from a mainly consumer-focused brand often used by individual teams to design social media graphics and presentations to a critical business tool -- and, in the process, crack open the productivity management software market valued at $47.3 billion and growing at 13% a year, according to Grand View Research. "Visual communication is becoming an increasingly critical skill for teams of every size across almost every industry," cofounder and CEO Melanie Perkins said in a statement. "We're bringing simple design products to the workplace to empower every employee, at every organization, and on every device." The product offerings include Canva Docs, Canva Websites, Canva Whiteboards and Data Visualization -- all of which are interoperable, "so if you make a presentation, you can turn it into a document or a website too," notes TechCrunch.

"Canva also plans to launch its API in beta, enabling developers to more easily integrate with the worksuite. Plus, Canva is launching a creator program where highly-vetted designers can sell templates, photos and designs to Canva users."
Intel

Intel Reveals Specs of Arc GPU (windowscentral.com) 23

Intel has dripped out details about its upcoming Arc graphics cards over the last few months, but until recently, we didn't have full specifications for the GPUs. That changed when Intel dropped a video and a post breaking down the full Arc A-series. From a report: The company shared the spec sheets of the Arc A380, Arc A580, Arc A750, and Arc A770. It also explained the naming structure of the new GPUs along with other details. Just about the only major piece of information we're still missing is the release date for the cards. At the top end of the range, Intel's Arc A770 will have 32 Xe cores, 32 ray-tracing units, and a graphics clock of 2100MHz. That GPU will be available with either 8GB or 16GB of memory. Sitting just below the Arc A770, the Arc A750 will have 28 Xe cores, 28 ray-tracing units, and 8GB of memory. The Intel Arc A580 will sit between the company's high-end GPUs and the Intel Arc A380.
Intel

Asus Packs 12-Core Intel i7 Into a Raspberry Pi-Sized Board (theregister.com) 30

An anonymous reader quotes a report from The Register: Asus subsidiary Aaeon's GENE-ADP6, announced this week, can pack as much as a 12-core/16-thread Intel processor with Iris Xe graphics into a 3.5-inch form factor. The diminutive system is aimed at machine-vision applications and can be configured with your choice of Intel silicon: Celeron, Core i3, Core i5, or 10- or 12-core i7 processors. As with other SBCs we've seen from Aaeon and others, the processors aren't socketed, so you won't be upgrading later. This device is pretty much aimed at embedded and industrial use, mind. All five SKUs are powered by Intel's current-gen Alder Lake mobile processor family, including a somewhat unusual 5-core Celeron processor that pairs a single performance core with four efficiency cores. However, only the i5 and i7 SKUs come equipped with Intel's Iris Xe integrated graphics; the i3 and Celeron are stuck on UHD graphics. The board can be equipped with up to 64GB of DDR5 memory operating at up to 4800 megatransfers/sec by way of a pair of SODIMM modules.

For I/O the board features a nice set of connectivity including a pair of NICs operating at 2.5 Gbit/sec and 1 Gbit/sec, HDMI 2.1 and DisplayPort 1.4, three 10Gbit/sec-capable USB 3.2 Gen 2 ports, and a single USB-C port that supports up to 15W of power delivery and display out. For those looking for additional connectivity for their embedded applications, the system also features a plethora of pin headers for USB 2.0, display out, serial interfaces, and 8-bit GPIO. Storage is provided by your choice of a SATA 3.0 interface or an M.2 mSATA/NVMe SSD. Unlike Aaeon's Epic-TGH7 announced last month, the GENE-ADP6 is too small to accommodate a standard PCIe slot, but it does feature an FPC connector, which the company says supports additional NVMe storage or external graphics by way of a 4x PCIe 4.0 interface.

Intel

Intel Details 12th Gen Core SoCs Optimized For Edge Applications (theregister.com) 6

Intel has made available versions of its 12th-generation Core processors optimized for edge and IoT applications, claiming the purpose-built chips enable smaller form factor designs, but with the AI inferencing performance to analyze data right at the edge. The Register reports: The latest members of the Alder Lake family, the 12th Gen Intel Core SoC processors for IoT edge (formerly Alder Lake PS) combine the performance profile and power envelope of the mobile chips but the LGA socket flexibility of the desktop chips, according to Intel, meaning they can be mounted directly on a system board or in a socket for easy replacement. Delivered as a multi-chip package, the new processors combine the Alder Lake cores with an integrated Platform Controller Hub (PCH) providing I/O functions and integrated Iris Xe graphics with up to 96 graphics execution units. [...]

Intel VP and general manager of the Network and Edge Compute Division Jeni Panhorst said in a statement that the new processors were designed for a wide range of vertical industries. "As the digitization of business processes continues to accelerate, the amount of data created at the edge and the need for it to be processed and analyzed locally continues to explode," she said. Another key capability for managing systems deployed in edge scenarios is that these processors include Intel vPro features, which include remote management capabilities built into the hardware at the silicon level, so an IT admin can reach into a system and perform actions such as changing settings, applying patches or rebooting the platform.

The chips support up to eight PCIe 4.0 lanes and four Thunderbolt 4/USB4 lanes, with up to 64GB of DDR5 or DDR4 memory, and the integrated graphics can drive four 4K displays or a single 8K display. Operating system support includes Windows 10 IoT Enterprise 2021 Long Term Servicing Channel (LTSC) and Linux options. Intel said the new SoCs are aimed at a broad range of industries, including point-of-sale kit in the retail, banking, and hospitality sectors, industrial PCs and controllers for the manufacturing industry, plus healthcare.

AMD

AMD Launches Zen 4 Ryzen 7000 CPUs (tomshardware.com) 156

AMD unveiled its 5nm Ryzen 7000 lineup today, outlining the details of four new models that span from the 16-core $699 Ryzen 9 7950X flagship, which AMD claims is the fastest CPU in the world, to the six-core $299 Ryzen 5 7600X, the lowest bar of entry to the first family of Zen 4 processors. Tom's Hardware reports: Ryzen 7000 marks the first 5nm x86 chips for desktop PCs, but AMD's newest chips don't come with higher core counts than the previous-gen models. However, frequencies stretch up to 5.7 GHz -- an impressive 800 MHz improvement over the prior generation -- paired with an up to 13% improvement in IPC from the new Zen 4 microarchitecture. That results in a 29% improvement in single-threaded performance over the prior-gen chips. That higher performance also extends out to threaded workloads, with AMD claiming up to 45% more performance in some threaded workloads. AMD says these new chips power huge generational gains over the prior-gen Ryzen 5000 models, with 29% faster gaming and 44% more performance in productivity apps. Going head-to-head with Intel's chips, AMD claims the high-end 7950X is 11% faster overall in gaming than Intel's fastest chip, the 12900K, and that even the low-end Ryzen 5 7600X beats the 12900K by 5% in gaming. It's noteworthy that those claims come with a few caveats [...].
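
That 29% single-threaded figure is roughly what you get by compounding the IPC and clock gains. A back-of-the-envelope check; the 4.9 GHz prior-generation boost clock is our assumption (approximating the Ryzen 9 5950X), not a number from AMD's presentation:

```python
ipc_gain = 1.13                    # AMD's claimed Zen 4 IPC uplift
clock_new, clock_old = 5.7, 4.9    # GHz; 4.9 approximates the 5950X boost (our assumption)

single_thread_gain = ipc_gain * (clock_new / clock_old)
print(f"~{(single_thread_gain - 1) * 100:.0f}% single-threaded uplift")
# -> ~31%, in the same ballpark as AMD's claimed 29%
```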

The Ryzen 7000 processors come to market on September 27, and they'll be joined by new DDR5 memory products that support new EXPO overclocking profiles. AMD's partners will also offer a robust lineup of motherboards -- the chips will snap into new Socket AM5 motherboards that AMD says it will support until 2025+. These motherboards support DDR5 memory and the PCIe 5.0 interface, bringing the Ryzen family up to the latest connectivity standards. The X670 Extreme and standard X670 chipsets arrive first in September, while the more value-oriented B650 options will come to market in October. That includes the newly announced B650E chipset that brings full PCIe 5.0 connectivity to budget motherboards, while the B650 chipset slots in as a lower-tier option. The Ryzen 7000 lineup also brings integrated RDNA 2 graphics to all of the processors in the stack, a first for the Ryzen family.

Social Networks

'Facebook Misinformation Is Bad Enough. The Metaverse Will Be Worse' (rand.org) 53

The Rand Corporation is an American (nonprofit) think tank. And veliath (Slashdot reader #5,435) spotted their recent warning about "a plausible scenario that could soon take place in the metaverse." A political candidate is giving a speech to millions of people. While each viewer thinks they are seeing the same version of the candidate, in virtual reality they are actually each seeing a slightly different version. For each and every viewer, the candidate's face has been subtly modified to resemble the viewer.... The viewers are unaware of any manipulation of the image. Yet they are strongly influenced by it: Each member of the audience is more favorably disposed to the candidate than they would have been without any digital manipulation.

This is not speculation. It has long been known that mimicry can be exploited as a powerful tool for influence. A series of experiments by Stanford researchers has shown that slightly changing the features of an unfamiliar political figure to resemble each voter made people rate politicians more favorably. The experiments took pictures of study participants and real candidates in a mock-up of an election campaign. The pictures of each candidate were modified to resemble each participant. The studies found that even if 40 percent of the participant's features were blended into the candidate's face, the participants were entirely unaware the image had been manipulated.
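
The manipulation the researchers describe is, at its simplest, a weighted average of two aligned face images. A minimal sketch with NumPy and Pillow; the real studies used proper morphing software with aligned facial landmarks, which this plain pixel blend skips:

```python
import numpy as np
from PIL import Image

def blend_faces(candidate_path, participant_path, weight=0.4):
    """Mix `weight` of the participant's face into the candidate's.
    Assumes both images are the same size and roughly aligned; real
    morphing software warps facial landmarks into alignment first."""
    candidate = np.asarray(Image.open(candidate_path).convert("RGB"), dtype=float)
    participant = np.asarray(Image.open(participant_path).convert("RGB"), dtype=float)
    blended = (1 - weight) * candidate + weight * participant
    return Image.fromarray(blended.astype(np.uint8))

# weight=0.4 mirrors the 40 percent blend that study participants failed to notice.
```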

In the metaverse, it's easy to imagine this type of mimicry at a massive scale.

At the heart of all deception is emotional manipulation. Virtual reality environments, such as Facebook's (now Meta's) metaverse, will enable psychological and emotional manipulation of its users at a level unimaginable in today's media.... We are not even close to being able to defend users against the threats posed by this coming new medium.... In VR, body language and nonverbal signals such as eye gaze, gestures, or facial expressions can be used to communicate intentions and emotions. Unlike verbal language, we often produce and perceive body language subconsciously....

We must not wait until these technologies are fully realized to consider appropriate guardrails for them. We can reap the benefits of the metaverse while minimizing its potential for great harm.

They recommend developing technology that detects this kind of VR manipulation.

"Society did not start paying serious attention to classical social media — meaning Facebook, Twitter, and the like — until things got completely out of hand. Let us not make the same mistake as social media blossoms into the metaverse."
Businesses

The GPU Shortage is Over. The GPU Surplus Has Arrived (arstechnica.com) 76

A year ago, it was nearly impossible to buy a GeForce GPU for its intended retail price. Now, the company has the opposite problem. From a report: Nvidia CEO Jensen Huang said during the company's Q2 2023 earnings call yesterday that the company is dealing with "excess inventory" of RTX 3000-series GPUs ahead of its next-gen RTX 4000 series release later this year. To deal with this, according to Huang, Nvidia will reduce the number of GPUs it sells to manufacturers of graphics cards and laptops so that those manufacturers can clear out their existing inventory. Huang also says Nvidia has "instituted programs to price position our current products to prepare for next-generation products."

When translated from C-suite to English, this means the company will be cutting the prices of current-generation GPUs to make more room for next-generation ones. Those price cuts should theoretically be passed along to consumers somehow, though that will be up to Nvidia's partners. Nvidia announced earlier this month that it would be missing its quarterly projections by $1.4 billion, mainly due to decreased demand for its gaming GPUs. Huang said that "sell-through" of GPUs, or the number of cards being sold to users, had still "increased 70 percent since pre-COVID," though the company still expects year-over-year revenue from GPUs to decline next quarter.

Desktops (Apple)

Devs Make Progress Getting macOS Ventura Running On Unsupported, Decade-Old Macs (arstechnica.com) 20

An anonymous reader quotes a report from Ars Technica: Skirting the official macOS system requirements to run new versions of the software on old, unsupported Macs has a rich history. Tools like XPostFacto and LeopardAssist could help old PowerPC Macs run newer versions of Mac OS X, a tradition kept alive in the modern era by dosdude1's patchers for Sierra, High Sierra, Mojave, and Catalina. For Big Sur and Monterey, the OpenCore Legacy Patcher (OCLP for short) is the best way to get new macOS versions running on old Macs. It's an offshoot of the OpenCore Hackintosh bootloader, and it's updated fairly frequently with new features and fixes and compatibility for newer macOS versions. The OCLP developers have admitted that macOS Ventura support will be tough, but they've made progress in some crucial areas that should keep some older Macs kicking for a little bit longer.

[...] First, while macOS Ventura doesn't technically include system files for pre-AVX2 Intel CPUs, Apple's Rosetta 2 software does still include those files, since Rosetta 2 emulates the capabilities of a pre-AVX2 x86 CPU. By extracting and installing those files in Ventura, you can re-enable support on Ivy Bridge and older CPUs without AVX2 instructions. And this week, OCLP developer Grymalyuk showed off another breakthrough: working graphics support on old Metal-capable Macs, including machines as old as the 2014 5K iMac, the 2012 Mac mini, and even the 2008 cheese grater-style Mac Pro tower. The OCLP team still has other challenges to surmount, not least of which will involve automating all of these hacks so that users without a deep technical understanding of macOS's underpinnings can continue to set up and use the bootloader. Grymalyuk still won't speculate about a timeframe for official Ventura support in OCLP. But given the progress that has been made so far, it seems likely that people with 2012-and-newer Macs should still be able to run Ventura on their Macs without giving up graphics acceleration or other important features.
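
The dividing line is whether the CPU supports AVX2, which arrived with Intel's Haswell chips in 2013. On an Intel Mac you can check which side of the line a machine falls on with a quick sysctl query; a convenience sketch, not part of OCLP itself:

```python
import subprocess

def cpu_has_avx2():
    """Return True if the Mac's CPU advertises AVX2 (Haswell/2013 or newer).
    Ivy Bridge and older -- the machines needing the Rosetta 2 trick -- return False."""
    try:
        out = subprocess.check_output(
            ["sysctl", "-n", "machdep.cpu.leaf7_features"], text=True)
    except subprocess.CalledProcessError:
        return False  # key absent on very old CPUs (and on Apple silicon)
    return "AVX2" in out.upper()

print("AVX2 supported" if cpu_has_avx2() else "pre-AVX2 CPU")
```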

Intel

Why Stacking Chips Like Pancakes Could Mean a Huge Leap for Laptops (cnet.com) 46

For decades, you could test a computer chip's mettle by how small and tightly packed its electronic circuitry was. Now Intel believes another dimension is as big a deal: how artfully a group of such chips can be packaged into a single, more powerful processor. From a report: At the Hot Chips conference Monday, Intel Chief Executive Pat Gelsinger will shine a spotlight on the company's packaging prowess. It's a crucial element to two new processors: Meteor Lake, a next-generation Core processor family member that'll power PCs in 2023, and Ponte Vecchio, the brains of what's expected to be the world's fastest supercomputer, Aurora.

"Meteor Lake will be a huge technical innovation," thanks to how it packages, said Real World Tech analyst David Kanter. For decades, staying on the cutting edge of chip progress meant miniaturizing chip circuitry. Chipmakers make that circuitry with a process called photolithography, using patterns of light to etch tiny on-off switches called transistors onto silicon wafers. The smaller the transistors, the more designers can add for new features like accelerators for graphics or artificial intelligence chores. Now Intel believes building these chiplets into a package will bring the same processing power boost as the traditional photolithography technique.

Facebook

After Mockery, Mark Zuckerberg Promises Better Metaverse Graphics, Posts New Avatar (cnn.com) 63

What do you do when people hate your $10 billion selfie? "Mark Zuckerberg, in response to a torrent of critical memes mocking the graphics of Meta's newest project, has heard his critics — and changed his selfie," reports CNN: Zuckerberg debuted Horizon Worlds, a virtual reality social app, in France and Spain earlier this week, sharing a somewhat flat, goofy digital avatar in front of an animated Eiffel Tower and la Sagrada Família.

The internet immediately jumped in, mocking what many users viewed as (hopefully) preliminary graphics for a venture on which Meta has spent at least $10 billion in the last year.

New York Times tech columnist Kevin Roose called the graphics "worse than a 2008 Wii game" on Twitter. Slate used the term "buttcheeks." Twitter was less kind, with "eye-gougingly ugly" and "an international laughing stock" popping up. Many compared it to early-'90s graphics and pointed out how lifeless and childish the Zuckerberg selfie looked. It quickly won the designation "dead eyes."

Well, Zuckerberg has apparently seen the memes, because on Friday he announced there are major updates coming — along with new avatar graphics.

In a CNBC report on how Zuckerberg "is getting dragged on the internet for how ugly the graphics of this game are," they'd actually quoted a Forbes headline that asked, "Does Mark Zuckerberg not understand how bad his metaverse is?"
Intel

Intel Drops DirectX 9 Support On Xe, Arc GPUs, Switches To DirectX 12 Emulation (tomshardware.com) 45

An anonymous reader quotes a report from Tom's Hardware: Native DX9 hardware support is officially gone from Intel's Xe integrated graphics solutions on 12th Gen CPUs and A-Series Arc Alchemist discrete GPUs. To replace it, all DirectX 9 support will be transferred to DirectX 12 in the form of emulation. Emulation will run on an open-source conversion layer known as "D3D9On12" from Microsoft. Conversion works by sending 3D DirectX 9 graphics commands to the D3D9On12 layer instead of the D3D9 graphics driver directly. Once the D3D9On12 layer receives commands from the D3D9 API, it will convert all commands into D3D12 API calls. So basically, D3D9On12 will act as a GPU driver all on its own, instead of the actual GPU driver from Intel. Microsoft says this emulation process has become a relatively performant implementation of DirectX 9. As a result, performance should be nearly as good, if not just as good, as native DirectX 9 hardware support.
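
The general shape is a classic adapter: present the old API's surface, record the work, and replay it through the new API. A toy sketch of that pattern in Python; the method names here are invented for illustration (the real D3D9On12 is C++ implementing the actual D3D9 COM interfaces inside Windows):

```python
class D3D12Device:
    """Stand-in for the modern API the layer actually drives."""
    def submit_command_list(self, commands):
        print(f"D3D12: executing {len(commands)} translated commands")

class D3D9On12Layer:
    """Presents a D3D9-style interface while recording work to replay as D3D12.
    Method names are invented; the real layer implements the D3D9 COM API."""
    def __init__(self, d3d12_device):
        self.device = d3d12_device
        self.pending = []

    def set_texture(self, stage, texture):
        # A D3D9 state change becomes a recorded D3D12 descriptor update.
        self.pending.append(("bind_texture", stage, texture))

    def draw_primitive(self, primitive_type, start, count):
        self.pending.append(("draw", primitive_type, start, count))

    def present(self):
        # Flush everything built up this frame as one D3D12 command list.
        self.device.submit_command_list(self.pending)
        self.pending = []

# An app written against the old interface runs unmodified:
d3d9 = D3D9On12Layer(D3D12Device())
d3d9.set_texture(0, "bricks.dds")
d3d9.draw_primitive("TRIANGLELIST", 0, 128)
d3d9.present()
```
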
Open Source

NVIDIA Publishes 73k Lines Worth Of 3D Header Files For Fermi Through Ampere GPUs (phoronix.com) 6

In addition to NVIDIA being busy working on transitioning to an open-source GPU kernel driver, yesterday they made a rare public open-source documentation contribution... NVIDIA quietly published 73k lines worth of header files to document the 3D classes for their Fermi through current-generation Ampere GPUs. Phoronix's Michael Larabel reports: To NVIDIA's Open-GPU-Docs portal they have posted the 73k lines worth of 3D class header files covering RTX 30 "Ampere" GPUs back through the decade-old GeForce 400/500 "Fermi" graphics processors. These header files define the classes used to program the 3D engine of the GPU, the texture header and texture sampler layout are documented, and other 3D-related programming bits. Having all of these header files will be useful to the open-source Nouveau driver developers to save on their reverse-engineering and guessing/uncertainty over certain bits.

NVIDIA's Open GPU Kernel Driver covers only the GeForce RTX 20 "Turing" series and newer, so it's great to see NVIDIA now posting this documentation going back to Fermi, a move squarely aimed at helping the open-source community / Nouveau. [...] The timing of NVIDIA opening these 3D classes back to Fermi is interesting and potentially tied to SIGGRAPH 2022 happening this week. Those wanting to grab NVIDIA's latest open-source GPU documentation can find it via this GitHub repository.

Businesses

Crypto-Driven GPU Crash Makes Nvidia Miss Q2 Projections By $1.4 Billion (arstechnica.com) 46

In preliminary second-quarter financial results announced today, Nvidia's projected revenue is "down from a previously forecasted $8.1 billion, a miss of $1.4 billion," reports Ars Technica. "Nvidia blamed this shortfall on weaker-than-expected demand for its gaming products, including its GeForce graphics processors." The full results won't arrive until the end of the month. From the report: Nvidia pointed to "a reduction in channel partner sales," meaning that partners like Evga, MSI, Asus, Zotac, Gigabyte, and others were selling fewer new GPUs than anticipated. This drop can be attributed partly to a crash in the value of mining-based cryptocurrencies like Bitcoin and Ethereum -- fewer miners are buying these cards, and miners looking to unload their GPUs on the secondhand market are also giving gamers a cheaper source for graphics cards. "As we expect the macroeconomic conditions affecting sell-through to continue, we took actions with our Gaming partners to adjust channel prices and inventory," said Nvidia CEO Jensen Huang. That means we may see further price drops for existing GeForce GPUs, which have already been dropping in price throughout the year. Some cards still haven't reverted to their originally advertised prices, but they're getting closer all the time.

In better news for Nvidia, the small overall increase in revenue [$6.7 billion] is driven almost exclusively by the company's data center business, including GPU-accelerated AI and machine learning applications and GPU acceleration for cloud-hosted virtual machines. Nvidia's data center revenue is projected to be up 61 percent from last year, from $2.37 billion to $3.81 billion. Nvidia will supposedly launch its next-generation RTX 4000 series GPUs later this year. Based on the new Lovelace architecture, these GPUs may appeal to some gamers who originally sat out the RTX 3000 series due to shortages and inflated prices and are now avoiding the GPUs because they know a replacement is around the corner.

Intel

Intel Unveils Arc Pro GPUs (tomshardware.com) 23

Intel's Arc graphics cards aren't just for gamers, it seems, as the previously CPU-exclusive company has taken the lid off a new line of professional GPUs to complement the existing Arc line -- well, existing in China, maybe. From a report: The new cards are called Arc Pro, and target those who use their graphics cards for more than shooting bad guys. Maybe they won't be among the best graphics cards for gaming, but the AV1 encoding at least might get some takers. Intel today unveiled one mobile professional GPU, the A30M, and two desktop models: the single-slot A40 and double-slot A50. Both desktop cards are described as being for small form-factor machines, which makes us suspect Intel may have some much larger cards up its sleeve.

All the newly announced GPUs feature built-in ray tracing hardware, machine learning capabilities, and industry-first AV1 hardware encoding acceleration. A royalty-free, open source alternative to HEVC backed by Google, AV1 hasn't gained a lot of traction on the web so far despite promises from Netflix and YouTube, with its main use being in Google's Duo video calling, even though it beats HEVC for compression quality. It's always been very slow to encode, however, so a good hardware accelerator and Intel's backing could see it take off.

GNU is Not Unix

There Were 19 New GNU Releases Last Month (fsf.org) 30

"Nineteen new GNU releases in the last month," reads a "July GNU Spotlight" announcement from the Free Software Foundation.

Here are (edited and condensed) descriptions of some of the highlights:
  • GNU Datamash (version 1.8) — a command-line program performing basic numeric, textual, and statistical operations on input textual data files (designed to work within standard pipelines).
  • GNUnet (version 0.17.2) — a framework for secure peer-to-peer networking. "The high-level goal is to provide a strong foundation of free software for a global, distributed network that provides security and privacy. GNUnet in that sense aims to replace the current internet protocol stack. Along with an application for secure publication of files, it has grown to include all kinds of basic applications for the foundation of a GNU internet."
  • GnuTLS (version 3.7.7) — A secure communications library implementing the SSL, TLS and DTLS protocols, provided in the form of a C library.
  • Jami (version 20220726.1515.da8d1da) — a GNU package for universal communication that respects the freedom and privacy of its users, using distributed hash tables for establishing communication. ("This avoids keeping centralized registries of users and storing personal data.")
  • GNU Nettle (version 3.8.1) — a low-level cryptographic library. It is designed to fit easily in almost any context. It can be easily included in cryptographic toolkits for object-oriented languages or in applications themselves.
  • GNU Octave (version 7.2.0) — a high-level interpreted language specialized for numerical computations, for both linear and non-linear applications and with great support for visualizing results.
  • R (version 4.2.1) — a language and environment for statistical computing and graphics, along with robust support for producing publication-quality data plots. "A large amount of 3rd-party packages are available, greatly increasing its breadth and scope."
  • TRAMP (version 2.5.3) — a GNU Emacs package allowing you to access files on remote machines as though they were local files. "This includes editing files, performing version control tasks and modifying directory contents with dired. Access is performed via ssh, rsh, rlogin, telnet or other similar methods."

Click here to see the other new releases and download information.

The FSF announcement adds that "A number of GNU packages, as well as the GNU operating system as a whole, are looking for maintainers and other assistance."


Graphics

Raspberry Pi 4 Expands 3D Potential With Vulkan Update (arstechnica.com) 53

The Raspberry Pi 4 has hit a major graphics milestone, adding support for the more modern Vulkan 3D API. Ars Technica reports: Raspberry Pi CEO Eben Upton announced the Pi 4's Vulkan 1.2 conformance on Monday. Support isn't available yet in downloadable Pi-friendly operating systems but should be coming soon. For most people using their Pi as a server, a DIY controller, or a light desktop, Vulkan 1.2 conformance won't be noticeable. Desktop graphics on the standard Raspberry Pi OS are powered by OpenGL, the older graphics API that Vulkan is meant to replace. There is one group that benefits, says Upton: games and other 3D Android applications. Android uses Vulkan as its low-overhead graphics API.

As with most Raspberry Pi advancements, there could be unforeseen opportunities unleashed by this seemingly tiny change. Vulkan 1.2 support gives developers the same 3D-graphics interface (if not anywhere near the same power) as 2019 NVIDIA graphics cards, 2020 Intel chips with integrated graphics, and dozens of other devices. With a Vulkan 1.0 driver installed, developer Iago Toral was able in 2020 to get the original Quake trilogy mostly running on a Pi 4, with not-too-shabby frame rates.
