GNU is Not Unix

FSF Says Google's Decision to Deprecate JPEG-XL Emphasizes Need for Browser Choice (fsf.org) 130

"The fact remains that Google Chrome is the arbiter of web standards," argues FSF campaigns manager Greg Farough (while adding that Firefox, "through ethical distributions like GNU IceCat and Abrowser, can weaken that stranglehold.")

"Google's deprecation of the JPEG-XL image format in February in favor of its own patented AVIF format might not end the web in the grand scheme of things, but it does highlight, once again, the disturbing amount of control it has over the platform generally." Part of Google's official rationale for the deprecation is the following line: "There is not enough interest from the entire ecosystem to continue experimenting with JPEG-XL." Putting aside the problematic aspects of the term "ecosystem," let us remark that it's easy to gauge the response of the "entire ecosystem" when you yourself are by far the largest and most dangerous predator in said "ecosystem." In relation to Google's overwhelming power, the average web user might as well be a microbe. In supposedly gauging what the "ecosystem" wants, all Google is really doing is asking itself what Google wants...

While we can't link to Google's issue tracker directly because of another freedom issue — its use of nonfree JavaScript — we're told that the issue regarding JPEG-XL's removal is the second-most "starred" issue in the history of the Chromium project, the nominally free basis for the Google Chrome browser. Chromium users came out of the woodwork to plead with Google not to make this decision. It made it anyway, not bothering to respond to users' concerns. We're not sure what metric it's using to gauge the interest of the "entire ecosystem," but it seems users have given JPEG-XL a strong show of support. In turn, what users will be given is yet another facet of the web that Google itself controls: the AVIF format.

As the response to JPEG-XL's deprecation has shown, our rallying together and telling Google we want something isn't liable to get it to change its mind. It will keep on wanting what it wants: control; we'll keep on wanting what we want: freedom.

Only, the situation isn't hopeless. At the present moment, not even Google can stop us from creating the web communities that we want to see: pages that don't run huge chunks of malicious, nonfree code on our computers. We have the power to choose what we run or do not run in our browsers. Browsers like GNU IceCat (and extensions like LibreJS and JShelter) help with that. Google also can't prevent us from exploring networks beyond the web like Gemini. What our community can do is rally support behind those free browsers that choose to support JPEG-XL and similar formats, letting the big G know that even if we're smaller than it, we won't be bossed around.

Hardware

Nvidia's Top AI Chips Are Selling for More Than $40,000 on eBay (cnbc.com) 32

Nvidia's most-advanced graphics cards are selling for more than $40,000 on eBay, as demand soars for chips needed to train and deploy artificial intelligence software. From a report: The prices for Nvidia's H100 processors were noted by 3D gaming pioneer and former Meta consulting technology chief John Carmack on Twitter. On Friday, at least eight H100s were listed on eBay at prices ranging from $39,995 to just under $46,000. Some retailers have offered it in the past for around $36,000. The H100, announced last year, is Nvidia's latest flagship AI chip, succeeding the A100, a roughly $10,000 chip that's been called the "workhorse" for AI applications. Developers are using the H100 to build so-called large language models (LLMs), which are at the heart of AI applications like OpenAI's ChatGPT. Running those systems is expensive and requires powerful computers to churn through terabytes of data for days or weeks at a time. They also rely on hefty computing power so the AI model can generate text, images or predictions. Training AI models, especially large ones like GPT, requires hundreds of high-end Nvidia GPUs working together.

Hardware

Nvidia Announces the RTX 4070, a 'Somewhat Reasonably Priced Desktop GPU' (polygon.com) 89

Nvidia announced the GeForce RTX 4070 desktop GPU, a move that anyone who's been putting off a new midrange DIY PC build has likely been eagerly awaiting. It puts the company's impressive Ada Lovelace graphics architecture within reach for people who don't want to spend $1,000 or more on a huge graphics card. From a report: It'll launch Thursday, April 13, starting at $599 for Nvidia's Founders Edition single-fan model. As is always the case, other manufacturers such as Asus, Zotac, Gigabyte, and MSI are putting out factory-overclocked variants, too. The Verge already has a full review up for the RTX 4070.

The RTX 4070 Founders Edition card requires a 650 W power supply, and it connects via two PCIe 8-pin cables (an adapter comes in the box). Alternatively, it can connect via a PCIe Gen 5 cable that supports 300 W or higher. The RTX 4070 won't require a humongous case, as it's a two-slot card that's quite a bit smaller than the RTX 4080. It's 9.6 inches long and 4.4 inches wide, which is just about the same size as my RTX 3070 Ti Founders Edition card. Despite being a lower-end GPU compared to Nvidia's RTX 4080 or RTX 4090, it retains the DLSS 3 marquee selling point. It's the next iteration of Nvidia's upscaling technique that drops the render resolution to make games run better, then uses the GPU's AI cores to intelligently upscale what you see.

Google

Chrome 113 To Ship WebGPU By Default (phoronix.com) 43

While Chrome 112 just shipped this week and Chrome 113 is only in beta, there is already a big reason to look forward to that next Chrome web browser release: Google is finally ready to ship WebGPU support. From a report: WebGPU provides the next-generation high-performance 3D graphics API for the web. With next month's Chrome 113 stable release, the plan is to have this new web graphics API available out of the box. Though in that version Google is limiting it to ChromeOS, macOS, and Windows... Yes, Google says other platforms like Linux will see their roll-out later in the year. The WebGPU API is more akin to Direct3D 12, Vulkan, and Metal, whereas the existing WebGL is derived from OpenGL (ES). From Google's blog post: WebGPU is a new API for the web, which exposes modern hardware capabilities and allows rendering and computation operations on a GPU, similar to Direct3D 12, Metal, and Vulkan. Unlike the WebGL family of APIs, WebGPU offers access to more advanced GPU features and provides first-class support for general computations on the GPU. The API is designed with the web platform in mind, featuring an idiomatic JavaScript API, integration with promises, support for importing videos, and a polished developer experience with great error messages.

This initial release of WebGPU serves as a building block for future updates and enhancements. The API will offer more advanced graphics features, and developers are encouraged to send requests for additional features. The Chrome team also plans to provide deeper access to shader cores for even more machine learning optimizations and additional ergonomics in WGSL, the WebGPU Shading Language.
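
To make the promise-based flow concrete, here is a minimal sketch in TypeScript of the standard WebGPU calls described above: feature detection, requesting an adapter and device, and dispatching a tiny WGSL compute shader. The shader and buffer contents are illustrative only, and the WebGPU type declarations (e.g. the @webgpu/types package) are assumed to be available in the project.

```ts
// Minimal WebGPU sketch: detect support, request adapter/device, run a compute pass.
// Browser context; assumes WebGPU type declarations (e.g. @webgpu/types) are installed.

async function runWebGpuSample(): Promise<void> {
  if (!("gpu" in navigator)) {
    console.log("WebGPU is not available in this browser.");
    return;
  }

  // Adapter and device requests are promise-based, matching the API described above.
  const adapter = await navigator.gpu.requestAdapter();
  if (!adapter) {
    console.log("No suitable GPU adapter found.");
    return;
  }
  const device = await adapter.requestDevice();

  // A trivial WGSL compute shader that doubles every element of a storage buffer.
  const shader = device.createShaderModule({
    code: `
      @group(0) @binding(0) var<storage, read_write> data: array<f32>;

      @compute @workgroup_size(64)
      fn main(@builtin(global_invocation_id) id: vec3<u32>) {
        if (id.x < arrayLength(&data)) {
          data[id.x] = data[id.x] * 2.0;
        }
      }`,
  });

  const pipeline = device.createComputePipeline({
    layout: "auto",
    compute: { module: shader, entryPoint: "main" },
  });

  // Upload some input data into a storage buffer.
  const input = new Float32Array([1, 2, 3, 4]);
  const buffer = device.createBuffer({
    size: input.byteLength,
    usage: GPUBufferUsage.STORAGE | GPUBufferUsage.COPY_SRC,
    mappedAtCreation: true,
  });
  new Float32Array(buffer.getMappedRange()).set(input);
  buffer.unmap();

  const bindGroup = device.createBindGroup({
    layout: pipeline.getBindGroupLayout(0),
    entries: [{ binding: 0, resource: { buffer } }],
  });

  // Record and submit the compute pass; reading results back would need a
  // separate MAP_READ staging buffer, omitted here for brevity.
  const encoder = device.createCommandEncoder();
  const pass = encoder.beginComputePass();
  pass.setPipeline(pipeline);
  pass.setBindGroup(0, bindGroup);
  pass.dispatchWorkgroups(Math.ceil(input.length / 64));
  pass.end();
  device.queue.submit([encoder.finish()]);
  console.log("Compute pass submitted.");
}

runWebGpuSample();
```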

Facebook

Meta To Debut Ad-Creating Generative AI this Year, CTO Says (nikkei.com) 29

Facebook owner Meta intends to commercialize its proprietary generative artificial intelligence by December, joining Google in finding practical applications for the tech. From a report: The company, which began full-scale AI research in 2013, stands out along with Google in the number of studies published. "We've been investing in artificial intelligence for over a decade, and have one of the leading research institutes in the world," Andrew Bosworth, Meta's chief technology officer, told Nikkei in an exclusive interview on Wednesday in Tokyo. "We certainly have a large research organization, hundreds of people." Meta announced in February that it would establish a new organization to develop generative AI, but this is the first time it has indicated a timeline for commercialization. The technology, which can instantly create sentences and graphics, has already been commercialized by ChatGPT creator OpenAI of the U.S. But Bosworth insists Meta remains on the technology's cutting edge.

"We feel very confident that ... we are at the very forefront," he said. "Quite a few of the techniques that are in large language model development were pioneered [by] our teams. "[I] expect we'll start seeing some of them [commercialization of the tech] this year. We just created a new team, the generative AI team, a couple of months ago; they are very busy. It's probably the area that I'm spending the most time [in], as well as Mark Zuckerberg and [Chief Product Officer] Chris Cox." Bosworth believes Meta's artificial intelligence can improve an ad's effectiveness partly by telling the advertiser what tools to use in making it. He said that instead of a company using a single image in an advertising campaign, it can "ask the AI, 'Make images for my company that work for different audiences.' And it can save a lot of time and money."

Bitcoin

Cryptocurrencies Add Nothing Useful To Society, Says Nvidia (theguardian.com) 212

The US chip-maker Nvidia has said cryptocurrencies do not "bring anything useful for society" despite the company's powerful processors selling in huge quantities to the sector. From a report: Michael Kagan, its chief technology officer, said other uses of processing power such as the artificial intelligence chatbot ChatGPT were more worthwhile than mining crypto. Nvidia never embraced the crypto community with open arms. In 2021, the company even released software that artificially constrained its graphics cards from being used to mine the popular Ethereum cryptocurrency, in an effort to ensure supply went to its preferred customers instead, who include AI researchers and gamers. Kagan said the decision was justified because of the limited value of using processing power to mine cryptocurrencies.

The first version of ChatGPT was trained on a supercomputer made up of about 10,000 Nvidia graphics cards. "All this crypto stuff, it needed parallel processing, and [Nvidia] is the best, so people just programmed it to use for this purpose. They bought a lot of stuff, and then eventually it collapsed, because it doesn't bring anything useful for society. AI does," Kagan told the Guardian. "With ChatGPT, everybody can now create his own machine, his own programme: you just tell it what to do, and it will. And if it doesn't work the way you want it to, you tell it 'I want something different.'" Crypto, by contrast, was more like high-frequency trading, an industry that had led to a lot of business for Mellanox, the company Kagan founded before it was acquired by Nvidia. "We were heavily involved in also trading: people on Wall Street were buying our stuff to save a few nanoseconds on the wire, the banks were doing crazy things like pulling the fibres under the Hudson taut to make them a little bit shorter, to save a few nanoseconds between their datacentre and the stock exchange," he said. "I never believed that [crypto] is something that will do something good for humanity. You know, people do crazy things, but they buy your stuff, you sell them stuff. But you don't redirect the company to support whatever it is."

Build

The Orange Pi 5: a Fast Alternative To The Raspberry Pi 4 (phoronix.com) 81

"With an 8-core Rockchip RK3588S SoC, the Orange Pi 5 is leaps and bounds faster than the aging Raspberry Pi 4," writes Phoronix: With up to 32GB of RAM, the Orange Pi 5 is also capable of serving for a more diverse user-base and even has enough potential for assembling a budget Arm Linux developer desktop. I've been testing out the Orange Pi 5 the past few weeks and it's quite fast and nice for its low price point.

The Orange Pi 5 single board computer was announced last year and went up for pre-ordering at the end of 2022.... When it comes to the software support, among the officially available options for the Orange Pi 5 are Orange Pi OS, Ubuntu, Debian, Android, and Armbian. Other ARM Linux distributions will surely see varying levels of support while even the readily available ISO selection offered by Orange Pi is off to a great start....

Granted, the Orange Pi developer community isn't as large as the Raspberry Pi community, nor does it have the same range of accessories and documentation, but for those more concerned about features and performance, the Orange Pi 5 is extremely interesting.

The article includes Orange Pi 5 specs:
  • A 26-pin header
  • HDMI 2.1, Gigabit LAN, M.2 PCIe 2.0, and USB3 connectivity
  • A Mali-G610 MP4 graphics processor, "which has open-source driver hope via the Panfrost driver stack."
  • Four different versions with 4GB, 8GB, 16GB, or 32GB of RAM using LPDDR4 or LPDDR4X. "The Orange Pi [5] 4GB retails for ~$88, the Orange Pi 5 8GB version retails for $108, and the Orange Pi 5 16GB version retails for $138, while as of writing the 32GB version wasn't in stock."

In 169 performance benchmarks (compared to Raspberry Pi 4 boards), "this single board computer came out to delivering 2.85x the performance of the Raspberry Pi 400 overall." And through all this the average SoC temperature was 71 degrees Celsius with a peak of 85 degrees — without any extra heatsink or cooling.


Graphics

Intel Graphics Chief Leaves After Five Years (theverge.com) 25

After five years attempting to make Intel into a competitor for Nvidia and AMD in the realm of discrete graphics for gamers and beyond -- with limited success -- Raja Koduri is leaving Intel to form his own generative AI startup. The Verge reports: Intel hired him away from AMD in 2017, where he was similarly in charge of the entire graphics division, and it was an exciting get at the time! Not only had Intel poached a chief architect who'd just gone on sabbatical but Intel also revealed that it did so because it wanted to build discrete graphics cards for the first time in (what would turn out to be) 20 years. Koduri had previously been poached for similarly exciting projects, too -- Apple hired him away from AMD ahead of an impressive string of graphics improvements, and then AMD brought him back again in 2013.

Intel has yet to bring real competition to the discrete graphics card space as of Koduri's departure. [...] But the company has a long GPU roadmap, so it's possible things get better and more competitive in subsequent gens. It took a lot longer than five years for Nvidia and AMD to make it that far. By the time Koduri left, he wasn't just in charge of graphics but also Intel's "accelerated computing" initiatives, including things like a crypto chip.

AI

Nvidia DGX Cloud: Train Your Own ChatGPT in a Web Browser For $37K a Month 22

An anonymous reader writes: Last week, we learned that Microsoft spent hundreds of millions of dollars to buy tens of thousands of Nvidia A100 graphics chips so that partner OpenAI could train the large language models (LLMs) behind Bing's AI chatbot and ChatGPT.

Don't have access to all that capital or space for all that hardware for your own LLM project? Nvidia's DGX Cloud is an attempt to sell remote web access to the very same thing. Announced today at the company's 2023 GPU Technology Conference, the service rents virtual versions of its DGX Server boxes, each containing eight Nvidia H100 or A100 GPUs and 640GB of memory. The service includes interconnects that scale up to the neighborhood of 32,000 GPUs, storage, software, and "direct access to Nvidia AI experts who optimize your code," starting at $36,999 a month for the A100 tier.

Meanwhile, a physical DGX Server box can cost upwards of $200,000 for the same hardware if you're buying it outright, and that doesn't count the efforts companies like Microsoft say they made to build working data centers around the technology.

Linux

Linux 6.4 AMD Graphics Driver Picking Up New Power Features For The Steam Deck (phoronix.com) 2

An anonymous reader shared this report from Phoronix: A pull request of early AMDGPU kernel graphics driver changes was submitted for DRM-Next on Friday as some of the early feature work accumulating for the Linux 6.4 kernel cycle.

Among the AMDGPU kernel driver changes this round are a number of fixes affecting items such as the UMC RAS, DCN 3.2, FreeSync, SR-IOV, various IP blocks, USB4, and more. On the feature side, mentioned subtly in the change-log are a few power-related additions... These additions are largely focused on Van Gogh APUs, notably used in the Valve Steam Deck, and will benefit its graphics moving forward.

First up, this kernel pull request introduces a new sysfs interface for adjusting/setting thermal throttling. Wired up only for Van Gogh, it allows reading and updating the thermal limit temperature in millidegrees Celsius. This "APU thermal cap" interface appears to be Steam Deck-driven feature work so that SteamOS will be better able to manage the thermal handling of the APU graphics....

These power features will be exposed via sysfs, with SteamOS expected to wrap around them intelligently and possibly add new UI settings knobs for those wanting more control over their Steam Deck's thermals and performance.
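
As a rough illustration of how a sysfs knob like this is typically consumed from user space, below is a short TypeScript (Node.js) sketch that reads and updates a thermal-cap attribute. The attribute name and path used here ("apu_thermal_cap" under the card's device directory) are assumptions for illustration and should be checked against the amdgpu documentation for your kernel; per the pull request, values are expressed in millidegrees Celsius, and writing requires root privileges.

```ts
// Sketch only: reading/updating an assumed AMDGPU "APU thermal cap" sysfs attribute.
// The path below is a guess for illustration; verify it against your kernel's
// amdgpu documentation. Values are millidegrees Celsius; writes need root.

import { readFileSync, writeFileSync } from "node:fs";

const THERMAL_CAP_PATH = "/sys/class/drm/card0/device/apu_thermal_cap"; // assumed path

function readThermalCapMilliC(): number {
  return parseInt(readFileSync(THERMAL_CAP_PATH, "utf8").trim(), 10);
}

function setThermalCapMilliC(limitMilliC: number): void {
  // e.g. 90000 == 90 °C
  writeFileSync(THERMAL_CAP_PATH, String(limitMilliC));
}

const current = readThermalCapMilliC();
console.log(`Current APU thermal cap: ${current / 1000} °C`);
// setThermalCapMilliC(90000); // uncomment to lower the cap to 90 °C (root required)
```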

Microsoft

Microsoft Strung Together Tens of Thousands of Chips in a Pricey Supercomputer for OpenAI (bloomberg.com) 25

When Microsoft invested $1 billion in OpenAI in 2019, it agreed to build a massive, cutting-edge supercomputer for the artificial intelligence research startup. The only problem: Microsoft didn't have anything like what OpenAI needed and wasn't totally sure it could build something that big in its Azure cloud service without it breaking. From a report: OpenAI was trying to train an increasingly large set of artificial intelligence programs called models, which were ingesting greater volumes of data and learning more and more parameters, the variables the AI system has sussed out through training and retraining. That meant OpenAI needed access to powerful cloud computing services for long periods of time. To meet that challenge, Microsoft had to find ways to string together tens of thousands of Nvidia's A100 graphics chips -- the workhorse for training AI models -- and change how it positions servers on racks to prevent power outages. Scott Guthrie, the Microsoft executive vice president who oversees cloud and AI, wouldn't give a specific cost for the project, but said "it's probably larger" than several hundred million dollars. [...] Now Microsoft uses that same set of resources it built for OpenAI to train and run its own large artificial intelligence models, including the new Bing search bot introduced last month. It also sells the system to other customers. The software giant is already at work on the next generation of the AI supercomputer, part of an expanded deal with OpenAI in which Microsoft added $10 billion to its investment.

Bug

Nvidia Driver Bug Might Make Your CPU Work Harder After You Close Your Game (arstechnica.com) 13

An anonymous reader shares a report: Nvidia released a new driver update for its GeForce graphics cards that, among other things, introduced a new Video Super Resolution upscaling technology that could make low-resolution videos look better on high-resolution screens. But the driver (version 531.18) also apparently came with a bug that caused high CPU usage on some PCs after running and then closing a game. Nvidia has released a driver hotfix (version 531.26) that acknowledges and should fix the issue, which was apparently being caused by an undisclosed bug in the "Nvidia Container," a process that exists mostly to contain other processes that come with Nvidia's drivers. It also fixes a "random bugcheck" issue that may affect some older laptops with GeForce 1000-series or MX250 and MX350 GPUs.

iMac

Apple Readies Its Next Range of Macs (bloomberg.com) 29

According to a report from Bloomberg's Mark Gurman, Apple is readying a new batch of Macs to launch "between late spring and summer." This includes a 15-inch MacBook Air and a new 24-inch iMac. From the report: Apple's next iMac desktop is at an advanced stage of development called engineering validation testing, or EVT, and the company is conducting production tests of the machine. The next iMac will continue to come in the same 24-inch screen size as the current model, which was announced in April 2021. The versions being tested also come in the same colors as the current iMac, a palette that includes blue, silver, pink and orange.

The new iMacs will, of course, be more powerful -- with a new M-series chip to replace the M1. There also will be some behind-the-scenes changes. The computer will see some of its internal components relocated and redesigned, and the manufacturing process for attaching the iMac's stand is different. While development of the new iMacs -- codenamed J433 and J434 -- has reached a late stage, it's not expected to go into mass production for at least three months. That means it won't ship until the second half of the year at the earliest. Still, this is a great development for anyone disappointed that Apple's all-in-one desktop hasn't been updated in nearly two years.

Aside from the iMac, Apple is scheduled to launch about three new Macs between late spring and summer, I'm told. Those three models are likely to be the first 15-inch MacBook Air (codenamed J515), the first Mac Pro with homegrown Apple chips (J180) and an update to the 13-inch MacBook Air (J513). The big remaining question is which processors these new Macs will run on. We already know the Mac Pro will include the M2 Ultra, which will provide up to 24 CPU cores, 76 graphics cores and the ability to top out the machine with at least 192 gigabytes of memory. We also know that Apple has developed the next iMac on the same timeline as the M3 chip, so I'd expect it to be one of the company's first M3-based machines.

Youtube

Nvidia's Latest GPU Drivers Can Upscale Old Blurry YouTube Videos (theverge.com) 36

Nvidia is releasing new GPU drivers today that will upscale old blurry web videos on RTX 30- and 40-series cards. The Verge reports: RTX Video Super Resolution is a new AI upscaling technology from Nvidia that works inside Chrome or Edge to improve any video in a browser by sharpening the edges of objects and reducing video artifacts. Nvidia will support videos between 360p and 1440p up to 144Hz in frame rate and upscale all the way up to 4K resolution.

This impressive 4K upscaling has previously only been available on Nvidia's Shield TV, but recent advances to the Chromium engine have allowed Nvidia to bring this to its latest RTX 30- and 40-series cards. As this works on any web video, you could use it to upscale content from Twitch or even streaming apps like Netflix where you typically have to pay extra for 4K streams.

Businesses

Rovio Says Paid Angry Birds Had 'Negative Impact' on Free-to-Play Versions (arstechnica.com) 43

Back in the days before practically every mobile game was a free-to-play, ad- and microtransaction-laden sinkhole, Rovio found years of viral success selling paid downloads of Angry Birds to tens of millions of smartphone users. Today, though, the company is delisting the last "pay upfront" version of the game from mobile app stores because of what it says is a "negative impact" on the more lucrative free-to-play titles in the franchise. From a report: Years after its 2009 launch, the original Angry Birds was first pulled from mobile app stores in 2019, a move Rovio later blamed on "outdated game engines and design." The remastered "Rovio Classics" version of the original game launched last year, asking 99 cents for over 390 ad-free levels, complete with updated graphics and a new, future-proofed engine "built from the ground up in Unity." In a tweeted statement earlier this week, though, Rovio announced that it is delisting Rovio Classics: Angry Birds from the Google Play Store and renaming the game Red's First Flight on the iOS App Store (presumably to make it less findable in an "Angry Birds" search). That's because of the game's "impact on our wider games portfolio," Rovio said, including "live" titles such as Angry Birds 2, Angry Birds Friends, and Angry Birds Journey.

Businesses

Nvidia Is Still Making Billions In Q4 2023 Despite a Giant Drop In PC Demand (theverge.com) 22

In its fourth quarter and full-year earnings report yesterday, Nvidia reported $6.05 billion in revenue for Q4 of its fiscal 2023 and $26.92 billion for the full year. That's "almost identical to last year, though profit was down 55 percent," notes The Verge.

"Remember: in 2021, $5 billion in revenue a quarter was a new Nvidia record. Now it's the status quo: the company says it's expecting to see $6.5 billion next quarter, too." From the report: Nvidia's data center and automotive businesses were actually up this quarter, with record revenue for automotive of $294 million; the dip was largely in Nvidia's graphics business, particularly gaming, which were each down 46 percent. That gaming decline includes "lower shipments of SOCs for game consoles," which is code for "Nintendo isn't selling as many Switches anymore" -- it's the only game console that uses an Nvidia chip. Like other chipmakers, Nvidia is shipping fewer GPUs to retailers and partners instead of slashing prices. The polite phrase is "lower sell-in to partners to help align channel inventory levels with current demand expectations." Nvidia also blamed disruptions in China due to covid and other issues.

Every PC maker is reporting that demand for computers has tanked this past quarter, with research firm Gartner calling the 28.5 percent dip in shipments "the largest quarterly shipment decline since Gartner began tracking the PC market in the mid-1990s." That was on top of the slump companies like Nvidia had already seen. And while AMD seemed optimistic this quarter that the slump won't last for long, even it suggested that client processor and gaming revenue would continue to go down in the first half of the calendar year.

Earth

Where More People Will Die -- and Live -- Because of Climate Change (washingtonpost.com) 131

An anonymous reader shares this thought-provoking article by a graphics reporter at The Washington Post who was part of its Pulitzer Prize-winning Explanatory Reporting team: The scientific paper published in the June 2021 issue of the journal Nature Climate Change was alarming. Between 1991 and 2018, the peer-reviewed study reported, more than one-third of deaths from heat exposure were linked to global warming. Hundreds of news outlets covered the findings. The message was clear: climate change is here, and it's already killing people. But that wasn't all that was happening. A month later, the same research group, which is based out of the London School of Hygiene and Tropical Medicine but includes scientists from dozens of countries, released another peer-reviewed study that told a fuller, more complex story about the link between climate change, temperature and human mortality. The two papers' authors were mostly the same, and they used similar data and statistical methods.

Published in Lancet Planetary Health, the second paper reported that between 2000 and 2019, annual deaths from heat exposure increased. But deaths from cold exposure, which were far more common, fell by an even larger amount. All told, during those two decades the world warmed by about 0.9 degrees Fahrenheit, and some 650,000 fewer people died from temperature exposure....

But whose lives? Projections indicate milder temperatures may indeed spare people in the globe's wealthy north, where it's already colder and people can buy protection against the weather. Yet heat will punish people in warmer, less wealthy parts of the world, where each extra degree of temperature can kill and air conditioning will often remain a fantasy....

What about the long term? A groundbreaking peer-reviewed study, published in November in Harvard's Quarterly Journal of Economics, gives us a glimpse. In the study, a team of researchers projected how mortality from temperature would change in the future. The worldwide temperature-linked mortality rate is projected to stay about the same, but you can see enormous geographic variation: colder, wealthier countries do well, while hotter, poorer countries suffer.

Portables

System76 Announces Redesigned 'Pangolin' AMD/Linux Laptop (9to5linux.com) 42

System76 is announcing a "fully redesigned" version of its AMD-only Linux-powered "Pangolin" laptop with upgraded memory, storage, processor, and display.

9to5Linux reports: It features the AMD Ryzen 7 6800U processor with up to 4.7 GHz clock speeds, 8 cores, 16 threads, and AMD Radeon 680M integrated graphics.... a 15.6-inch 144Hz Full HD (1920 x 1080) display [using 12 integrated Radeon graphics cores] with a matte finish, a sleek magnesium alloy chassis, and promises up to 10 hours of battery life with its 70 Wh Li-Ion battery. It also features a single-color backlit US QWERTY Keyboard and a multitouch clickpad. Under the hood, the Linux-powered laptop boasts 32 GB LPDDR5 6400 MHz of RAM and it can be equipped with up to 16TB PCIe 4.0 NVMe M.2 SSD storage. Another cool feature is the hardware camera kill switch for extra privacy....

As with all of System76's Linux-powered laptops, the all-new Pangolin comes pre-installed with System76's in-house built Pop!_OS Linux distribution featuring the GNOME-based COSMIC desktop and full disk-encryption or with Ubuntu 22.04 LTS.

Microsoft

Microsoft Adds Adobe Acrobat PDF Tech To Its Edge Browser (betanews.com) 57

BetaNews: Yesterday, Microsoft announced it would be bringing AI to its Edge browser thanks to a partnership with ChatGPT owner OpenAI. Today the software giant adds something that many people will be less keen on -- Acrobat PDF technology. Describing the move as the next step in their "commitment to transform the future of digital work and life," Microsoft and Adobe say this addition will give users a unique PDF experience with extra features that will remain free of charge. By powering the built-in PDF reader with the Adobe Acrobat PDF engine, Microsoft says users will benefit from "higher fidelity for more accurate colors and graphics, improved performance, strong security for PDF handling, and greater accessibility -- including better text selection and read-aloud narration."

Games

After 16 Years of Freeware, 'Dwarf Fortress' Creators Get $7M Payday (arstechnica.com) 57

An anonymous reader shares a report from Ars Technica: The month before Dwarf Fortress was released on Steam (and Itch.io), the brothers Zach and Tarn Adams made $15,635 in revenue, mostly from donations for their 16-year freeware project. The month after the game's commercial debut, they made $7,230,123, or 462 times that amount....

Tarn Adams noted that "a little less than half will go to taxes," and that other people and expenses must be paid. But enough of it will reach the brothers themselves that "we've solved the main issues of health/retirement that are troubling for independent people." It also means that Putnam, a longtime modder and scripter and community member, can continue their work on the Dwarf Fortress code base, having been hired in December.

The "issues of health/retirement" became very real to the brothers in 2019 when Zach had to seek treatment for skin cancer. The $10,000 cost, mostly covered through his wife's employer-provided insurance, made them realize the need for more robust sustainability. "You're not just going to run GoFundMes until you can't and then die when you're 50," Tarn told The Guardian in late 2022. "That is not cool." This realization pushed them toward a (relatively) more accessible commercial release with traditional graphics, music, and tutorials.
