Cloud

Will the US Army, Not Meta, Build an 'Open' Metaverse? (venturebeat.com) 35

Just five weeks before his death in 2001, Douglas Adams made a mind-boggling pronouncement. "We are participating in a 3.5 billion-year program to turn dumb matter into smart matter..." He gave the keynote address for an embedded systems conference at San Francisco's Moscone Center... Adams dazzled the audience with a vision of a world where information devices are ultimately "as plentiful as chairs...." When the devices of the world were networked together, they could create a "soft earth" — a shared software model of the world assembled from all the bits of data. Communicating in real time, the soft earth would be alive and developing — and with the right instruments, humankind could just as easily tap into a soft solar system.
It's 21 years later, in a world where the long-time global software company Bohemia Interactive Simulations claims to be "at the forefront of simulation training solutions for defense and civilian organizations." And writing in VentureBeat, their chief commercial officer argues that "We do not yet have a shared imagination for the metaverse and the technology required to build it," complaining that big-tech companies "want to keep users reliant on their tech within a closed, commercialized ecosystem." He continues: "I envision an open virtual world that supports thousands of simultaneous players and offers valuable, immersive use cases."

The scope of this vision requires an open cloud architecture with native support for cloud scalability. By prioritizing cloud development and clear goal-setting, military organizations have taken significant leaps toward building an actual realization of this metaverse. In terms of industry progress towards the cloud-supported, scalable metaverse, no organization has come further than the U.S. Army.

Their Synthetic Training Environment (STE) has been in development since 2017. The STE aims to replace all legacy simulation programs and integrate different systems into a single, connected system for combined arms and joint training. The STE fundamentally differs from traditional, server-based approaches. For example, it will host a 1:1 digital twin of the Earth on a cloud architecture that will stream high fidelity (photo-realistic) terrain data to connected simulations. New terrain management platforms such as Mantle ETM will ensure that all connected systems operate on exactly the same terrain data. For example, trainees in a tank simulator will see the same trees, bushes and buildings as the pilot in a connected flight simulator, facilitating combined arms operations.

Cloud scalability (that is, scaling with available computational power) will allow for a better real-world representation of essential details such as population density and terrain complexity that traditional servers could not support. The ambition of STE is to automatically pull from available data resources to render millions of simulated entities, such as AI-based vehicles or pedestrians, all at once.... [D]evelopers are creating a high-fidelity, digital twin of the entire planet.

Metaverses created for entertainment or other commercial uses may not require an accurate representation of the earth.... Still, the military metaverse could be a microcosm of what may soon be a large-scale, open-source digital world that is not controlled or dominated by a few commercial entities....

STE success will pave the way for any cloud-based, open-source worlds that come after it, and will help prove that the metaverse's value extends far beyond that of a marketing gimmick.

Graphics

As Intel Gets Into Discrete GPUs, It Scales Back Support For Many Integrated GPUs (arstechnica.com) 47

An anonymous reader quotes a report from Ars Technica: Intel is slowly moving into the dedicated graphics market, and its graphics driver releases are looking a lot more like Nvidia's and AMD's than they used to. For its dedicated Arc GPUs and the architecturally similar integrated GPUs that ship with 11th- and 12th-generation Intel CPUs, the company promises monthly driver releases, along with "Day 0" drivers with specific fixes and performance enhancements for just-released games. At the same time, Intel's GPU driver updates are beginning to de-emphasize what used to be the company's bread and butter: low-end integrated GPUs. The company announced yesterday that it would be moving most of its integrated GPUs to a "legacy support model," which will provide quarterly updates to fix security issues and "critical" bugs but won't include the game-specific fixes that newer GPUs are getting.

The change affects a wide swath of GPUs, which are not all ancient history. Among others, the change affects all integrated GPUs in the following processor generations, from low-end unnumbered "HD/UHD graphics" to the faster Intel Iris-branded versions: 6th-generation Core (introduced 2015, codenamed Skylake), 7th-generation Core (introduced 2016, codenamed Kaby Lake), 8th-generation Core (introduced 2017-2018, codenamed Kaby Lake-R, Whiskey Lake, and Coffee Lake), 9th-generation Core (introduced 2018, codenamed Coffee Lake), 10th-generation Core (introduced 2019-2020, codenamed Comet Lake and Ice Lake), and various N4000, N5000, and N6000-series Celeron and Pentium CPUs (introduced 2017-2021, codenamed Gemini Lake, Elkhart Lake, and Jasper Lake).

Intel is still offering a single 1.1GB driver package that supports everything from its newest Iris Xe GPUs to Skylake-era integrated graphics. However, the install package now contains one driver for newer GPUs that are still getting new features and a second driver for older GPUs on the legacy support model. The company uses a similar approach for driver updates for its Wi-Fi adapters, including multiple driver versions in the same download package to support multiple generations of hardware.
"The upshot is that these GPUs' drivers are about as fast and well-optimized as they're going to get, and the hardware isn't powerful enough to play many of the newer games that Intel provides fixes for in new GPU drivers anyway," writes Ars Technica's Andrew Cunningham. "Practically speaking, losing out on a consistent stream of new gaming-centric driver updates is unlikely to impact the users of these GPUs much, especially since Intel will continue to fix problems as they occur."
Graphics

Coding Mistake Made Intel GPUs 100X Slower in Ray Tracing (tomshardware.com) 59

Intel Linux GPU driver developers have released an update that results in a massive 100X boost in ray tracing performance. This is something to be celebrated, of course. However, on the flip side, the driver was 100X slower than it should have been because of a memory allocation oversight. Tom's Hardware reports: Linux-centric news site Phoronix reports that a fix merged into the open-source Intel Mesa Vulkan driver was implemented by Intel Linux graphics driver engineering stalwart Lionel Landwerlin on Thursday. The developer wryly commented that the merge request, which already landed in Mesa 22.2, would deliver "Like a 100x (not joking) improvement." Intel has been working on Vulkan raytracing support since late 2020, but this fix is better late than never.

Normally, the Vulkan driver would ensure that temporary memory used for Vulkan ray-tracing work lives in local memory, i.e., the very fast graphics memory onboard the discrete GPU. A line of code was missing, so this memory allocation flag was never set. As a result, the Vulkan driver shuffled ray-tracing data to slower offboard system memory and back, and these continual, convoluted transfers slowed ray-tracing performance significantly. It turns out, as per our headline, that setting a flag for "ANV_BO_ALLOC_LOCAL_MEM" ensured that the VRAM would be used instead, and a 100X performance boost was the result.
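To make the nature of the fix concrete, here is a minimal C sketch of what setting such an allocation flag looks like. This is not the actual Mesa code: only the flag name ANV_BO_ALLOC_LOCAL_MEM comes from the report, and the structures and helper names below are hypothetical stand-ins.

    /* Illustrative sketch only -- not the actual Mesa driver code.  The flag
     * name ANV_BO_ALLOC_LOCAL_MEM is taken from the report; everything else
     * here is a hypothetical stand-in. */
    #include <stdint.h>

    enum anv_bo_alloc_flags {
        ANV_BO_ALLOC_NONE      = 0,
        ANV_BO_ALLOC_LOCAL_MEM = 1 << 0,  /* place the buffer in on-board VRAM */
    };

    struct scratch_bo {
        uint64_t size;
        uint32_t alloc_flags;
    };

    /* Allocate the temporary buffer used for ray-tracing work.  Before the fix,
     * the flags field defaulted to 0, so the buffer silently landed in system
     * RAM and every access crossed the PCIe bus.  Setting the local-memory flag
     * keeps it in VRAM. */
    static void alloc_rt_scratch(struct scratch_bo *bo, uint64_t size)
    {
        bo->size        = size;
        bo->alloc_flags = ANV_BO_ALLOC_LOCAL_MEM;  /* the one-line fix */
    }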
"Mesa 22.2, which includes the new code, is due to be branched in the coming days and will be included in a bundle of other driver refinements, which should reach end-users by the end of August," adds the report.
AMD

AMD Just Leaked Its Nvidia RTX Voice Competitor in a (Now Deleted) Video (theverge.com) 8

AMD looks to be on the cusp of releasing a competitor to RTX Voice, a feature for Nvidia graphics cards that cancels out background noise when you're on a call or otherwise using your mic. From a report: That's according to a trailer that AMD posted to its YouTube channel (apparently in error), Tom's Hardware reports. Thankfully, Reddit user u/zenobian downloaded a copy of the trailer before it was deleted and uploaded it to the AMD subreddit. The leaked trailer suggests that AMD's Noise Suppression feature will work very similarly to Nvidia's RTX Voice (which has subsequently been rolled into Nvidia's Broadcast app). It uses "a real-time deep learning algorithm" to offer "two-way noise-reduction" that filters background noise out of both outgoing and incoming microphone audio, and is apparently built into AMD's existing Adrenalin software.
Cloud

GeForce Now Rolling Out 120FPS Cloud Gaming To All Compatible Android Smartphones (9to5google.com) 15

Nvidia has just announced that GeForce Now is picking up support for 120fps gameplay on all Android smartphones, after previously limiting the functionality to only a few select models. 9to5Google reports: GeForce Now is a cloud gaming service that allows players to stream PC games from marketplaces such as Steam and the Epic Games Store, among others, to virtually any device. It's a great way to expand the gaming experience on your PC over to a mobile phone or your TV, or just to play games that your PC isn't powerful enough to run on its own. The service is free, but you can pay to get longer sessions and better quality.

Last year, the service picked up its RTX 3080 tier, which offers the power of the still-hard-to-find graphics card, but through the cloud. While it's a pricey option, it was quickly found to be the gold standard of cloud gaming thanks to minimal input latency, higher resolution, and faster refresh rate. It's that faster refresh rate that's boosting GeForce Now for Android players this week, with 120fps support expanding to all Android phones that have high-refresh-rate displays. If your phone has a 120Hz display, you can now stream games at 120fps.
The official list of supported devices can be found here.

Nvidia says that the expanded support will arrive "over the coming weeks" and that the experience could vary from device to device.
Ubuntu

The Dell XPS Developer Edition Will Soon Arrive With Ubuntu Linux 22.04 (zdnet.com) 31

The Dell XPS 13 Plus Developer Edition with Ubuntu 22.04 Long Term Support (LTS) will arrive on August 23rd. "This means, of course, Canonical and Dell have officially certified it for Ubuntu 22.04 LTS," writes ZDNet's Steven Vaughan-Nichols. "So if you already have a current XPS 13 Plus, you can install Ubuntu 22.04 and automatically receive the same hardware-optimized experience that will ship with the new Developer Edition." From the report: What this certification means is that all of the XPS's components have been tested to deliver the best possible experience out of the box. Ubuntu-certified devices are based on Long Term Support (LTS) releases and therefore receive updates for up to 10 years. So if you actually still have an XPS 13 that came with Ubuntu back in the day, it's still supported today. [...] Dell and Canonical have been at this for years. Today's Dell Developer Editions are the official continuation of Project Sputnik, an initiative that began 10 years ago to create high-end Dell systems with Ubuntu preinstalled. These were, and are, designed with programmer input and built for developers.

As Jaewook Woo, Dell's product manager, Linux, explained: "XPS is an innovation portal for Dell -- from its application of cutting-edge technology to experimentation of new user interfaces and experiential design. By bringing the enhanced performance and power management features of Ubuntu 22.04 LTS to our most advanced premium laptop, Dell and Canonical reinforce our joint commitment to continue delivering the best computing experience for developers using Ubuntu."

The forthcoming Dell XPS 13 Plus Developer Edition's specifications are impressive. The base configuration is powered by a 12th-generation Intel Core i5-1240P processor that runs at up to 4.4GHz. For graphics, it uses Intel Iris Xe graphics, which drives the 13.4-inch 1920x1200 60Hz display. For storage, it uses a 512GB SSD. The list price is $1,389.

Desktops (Apple)

Linux Distro For Apple Silicon Macs Is Already Up and Running On the Brand-New M2 (arstechnica.com) 129

An anonymous reader quotes a report from Ars Technica: Unlike Intel Macs, Apple silicon Macs were designed to run only Apple's software. But the developers on the Asahi Linux team have been working to change that, painstakingly reverse-engineering support for Apple's processors and other Mac hardware and releasing it as a work-in-progress distro that can actually boot up and run on bare metal, no virtualization required. The Asahi Linux team put out a new release today with plenty of additions and improvements. Most notably, the distro now supports the M1 Ultra and the Mac Studio and has added preliminary support for the M2 MacBook Pro (which has been tested firsthand by the team) and the M2 MacBook Air (which hasn't been tested but ought to work). Preliminary Bluetooth support for all Apple silicon Macs has also been added, though the team notes that it works poorly when connected to a 2.4GHz Wi-Fi network because "Wi-Fi/Bluetooth coexistence isn't properly configured yet."

There are still many other things that aren't working properly, including the USB-A ports on the Studio, faster-than-USB-2.0 speeds from any Type-C/Thunderbolt ports, and GPU acceleration, but progress is being made on all of those fronts. GPU work in particular is coming along, with a "prototype driver" that is "good enough to run real graphics applications and benchmarks" already up and running, though it's not included in this release. The Asahi team has said in the past that it expects support for new chips to be relatively easy to add to Asahi since Apple's chip designers frequently reuse things and don't make extensive hardware changes unless there's a good reason for it. Adding basic support for the M2 to Asahi happened over the course of a single 12-hour development session, and just "a few days" of additional effort were needed to get the rest of the hardware working as well as it does with M1-based Macs.

Graphics

SF Writer/Digital Art/NFT Pioneer Herbert W. Franke Dies at Age 95 (artnews.com) 20

On July 7th Art News explained how 95-year-old Austrian artist Herbert W. Franke "has recently become a sensation within the art world and the crypto space," describing the digital pioneer as a computer artist using algorithms and computer programs to visualize math as art. Last month, the physicist and science fiction writer was behind one of the most talked-about digital artworks at a booth by the blockchain company Tezos at Art Basel. Titled MONDRIAN (1979), the work paid tribute to artist Piet Mondrian's iconic geometric visuals using a program written on one of the first home computers.

Days before this, Franke, who studied physics in Vienna following World War II and started working at Siemens in 1953, where he conducted photographic experiments after office hours, launched 100 images from his famed series "Math Art" (1980-95) as NFTs on the Quantum platform. The drop was meant to commemorate his birthday on May 14 and to raise funds for his foundation. The NFTs sold out in 30 seconds, with the likes of pioneering blockchain artist Kevin Abosch purchasing a few.

In one of his last interviews, Franke told the site that blockchain "is a totally new environment, and this technology is still in its early stages, like at the beginning of computer art. But I am convinced that it has opened a new door for digital art and introduced the next generation to this new technology." It echoed something he'd said in his first book, published in 1957, which he later quoted in the interview (a full 65 years later). "Technology is usually dismissed as an element hostile to art. I want to try to prove that it is not..."

This morning, long-time Slashdot reader Qbertino wrote: The German IT news site heise reports (article in German) that digital art pioneer, SF author ("The Mind Net") and cyberspace avant-gardist Herbert W. Franke has died at age 95. His wife recounted on his Twitter account: "Herbert loved to call himself the dinosaur of computer art. I am [...] devastated to announce that our beloved dinosaur has left the earth.

"He passed away knowing there is a community of artists and art enthusiasts deeply caring about his art and legacy."
Among much pioneering work, he founded one of the world's first digital art festivals, "Ars Electronica," in Austria in 1979.

Franke's wife is still running the Art Meets Science web site dedicated to Franke's work. Some highlights from its biography of Franke's life: Herbert W. Franke, born in Vienna on May 14, 1927, studied physics and philosophy at the University of Vienna and received his doctorate in 1951... An Apple II, which he bought in 1980, was his first personal computer. As early as 1982 he developed a program that used a MIDI interface to control moving image sequences through music....

Only in recent years has "art from the machine" begun to interest traditional museums as a branch of modern art. Franke, who from the beginning was firmly convinced of the future importance of this art movement, has also assembled a collection of computer graphics that is unique in the world, documenting 50 years of this development with works by respected international artists, supplemented by his own works....

As a physicist with a talent for writing that became apparent early on, Franke was predestined to bring science and technology closer to the general public in popular form. About one-third of his nearly fifty books, as well as uncounted journal articles...

Franke's novels and stories are not about predicting future technologies, nor about forecasting our future way of life, but rather about the intellectual examination of possible models of our future and their philosophical as well as ethical interpretation. In this context, however, Franke attaches great importance to the seriousness of scientific or technological assessments of the future in the sense of a feasibility analysis. In his opinion, a serious and meaningful discussion about future developments can basically only be conducted on this basis. In this respect, Franke is not a typical representative of science fiction, but rather a visionary who, as a novelist, deals with relevant questions of social future and human destiny on a high intellectual level.

Technology

Samsung Develops GDDR6 DRAM With 24Gbps Speed for Graphics Cards (zdnet.com) 20

Samsung said on Thursday that it has developed a new GDDR6 (graphics double data rate) DRAM with a data transfer rate of 24 gigabits per second (Gbps). From a report: A premium graphics card that packs the chips will support a data processing rate of up to 1.1 terabytes per second (TB/s), equivalent to processing 275 movies in Full HD resolution within a second, the South Korean tech giant said. Samsung said the DRAM is composed of 16Gb chips built on its third-generation 10nm process node, which also incorporates extreme ultraviolet (EUV) lithography during production. The company also applied high-k metal gate (HKMG) technology, which replaces the traditional silicon dioxide gate insulator with a higher-permittivity material so the gate can hold more charge, to the DRAM. Samsung said this allowed its latest DRAM to operate at a rate over 30% faster than its 18Gbps GDDR6 predecessor.
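For a rough sense of where the quoted 1.1 TB/s figure comes from, the short C sketch below multiplies the 24 Gbps per-pin rate by a memory bus width. The 384-bit bus is an assumption (typical for a premium graphics card) rather than something stated in the report.

    /* Back-of-the-envelope check of the quoted ~1.1 TB/s figure.  Only the
     * 24 Gbps per-pin rate comes from the report; the 384-bit bus width is
     * an assumed value typical of premium graphics cards. */
    #include <stdio.h>

    int main(void)
    {
        const double gbps_per_pin = 24.0;  /* per-pin data rate in Gb/s */
        const int    bus_bits     = 384;   /* assumed memory bus width  */

        double total_gbps = gbps_per_pin * bus_bits;  /* 9216 Gb/s            */
        double gb_per_s   = total_gbps / 8.0;         /* 1152 GB/s, ~1.1 TB/s */

        printf("%.0f GB/s (about %.2f TB/s)\n", gb_per_s, gb_per_s / 1000.0);
        return 0;
    }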
Open Source

Gtk 5 Might Drop X.11 Support, Says GNOME Dev (theregister.com) 145

One of the GNOME developers has suggested that the next major release of Gtk could drop support for the X window system. The Register reports: Emmanuele Bassi opened a discussion last week on the GNOME project's Gitlab instance that asked whether the developers could drop X.11 support in the next release of Gtk. At this point, it is only a suggestion, but if it gets traction, this could significantly accelerate the move to the Wayland display server and the end of X.11.

Don't panic: Gtk 5 is not imminent. Gtk is a well-established toolkit, originally designed for the GIMP bitmap editing program back in 1998. Gtk 4 arrived relatively recently, shortly before the release of GNOME 40 in 2021. GNOME 40 has new user-interface guidelines, and as a part of this, Gtk 4 builds GNOME's Adwaita theme into the toolkit by means of the new libadwaita library, which is breaking the appearance of some existing apps.

Also, to be fair, as we recently covered, the X window system is very old now and isn't seeing major changes, although new releases of parts of it do still happen. This discussion is almost certain to get wildly contentious, and the thread on Gitlab has been closed to further comments for now. If this idea gains traction, one likely outcome might well be a fork of Gtk, just as happened when GNOME 3 came out. [...] A lot of the features of the current version, X.11, are no longer used or relevant to most users. Even so, X.12 is barely even in the planning stages yet.

Medicine

Smart Contact Lens Prototype Puts a Micro LED Display On Top of the Eye (arstechnica.com) 37

An anonymous reader quotes a report from Ars Technica: Since 2015, a California-based company called Mojo Vision has been developing smart contact lenses. Like smart glasses, the idea is to put helpful AR graphics in front of your eyes to help accomplish daily tasks. Now, a functioning prototype brings us closer to seeing a final product. In a blog post this week, Drew Perkins, the CEO of Mojo Vision, said he was the first to have an "on-eye demonstration of a feature-complete augmented reality smart contact lens." In an interview with CNET, he said he's been wearing only one contact at a time for hour-long durations. Eventually, Mojo Vision would like users to be able to wear two Mojo Lenses simultaneously and create 3D visual overlays, the publication said. According to his blog, the CEO could see a compass through the contact and an on-screen teleprompter with a quote written on it. He also described to CNET seeing a green, monochromatic image of Albert Einstein.

At the heart of the lens is an Arm M0 processor and a Micro LED display with 14,000 pixels per inch. It's just 0.02 inches (0.5 mm) in diameter with a 1.8-micron pixel pitch. Perkins claimed it's the "smallest and densest display ever created for dynamic content." Developing the contact overall included a focus on physics and electronics miniaturization, Perkins wrote. Mojo Lens developed its power management system with "medical-grade micro-batteries" and a proprietary power management integrated circuit. The Mojo Lens also uses a custom-configured magnetometer (CNET noted this drives the compass Perkins saw), accelerometer, and gyroscope for tracking. [...]
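As a quick sanity check on those display figures, the C sketch below derives the pixel pitch and the number of pixels spanning the display; it uses only the numbers quoted above.

    /* Sanity check: 14,000 pixels per inch implies roughly a 1.8-micron pitch,
     * so a 0.5 mm display is only a few hundred pixels across.  All inputs are
     * figures quoted in the article. */
    #include <stdio.h>

    int main(void)
    {
        const double microns_per_inch = 25400.0;
        const double pixels_per_inch  = 14000.0;
        const double diameter_microns = 500.0;   /* 0.5 mm display */

        double pitch_um      = microns_per_inch / pixels_per_inch;  /* ~1.81 um */
        double pixels_across = diameter_microns / pitch_um;         /* ~276 px  */

        printf("pixel pitch: %.2f um, pixels across: %.0f\n",
               pitch_um, pixels_across);
        return 0;
    }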

A contact lens sounds like it has the potential to be even more discreet than AR headgear posing as regular Ray-Bans. But the current prototype uses a "relay accessory," as Mojo Vision's rep put it, worn around the neck. It includes a processor, GPU, and 5 GHz radio for sending and receiving data to and from the lens. According to CNET, the accessory also sends information "back to computers that track the eye movement data for research." Perkins' blog said this tech required custom ASIC designs. [...] The current prototype also uses a hat with an integrated antenna for easier connecting, CNET reported, though we'd expect this to be omitted from a final product.
"There's no firm release date for the Mojo Lens, which could be the first AR contact lens to reach consumers," adds Ars. "Near-term goals include getting potential partners, investors, and journalists to try the smart lens."
Programming

Are Today's Programmers Leaving Too Much Code Bloat? (positech.co.uk) 296

Long-time Slashdot reader Artem S. Tashkinov shares a blog post from an indie game programmer who complains "The special upload tool I had to use today was a total of 230MB of client files, and involved 2,700 different files to manage this process." Oh and BTW it gives error messages and right now, it doesn't work. sigh.

I've seen coders do this. I know how this happens. It happens because not only are the coders not doing low-level, efficient code to achieve their goal, they have never even SEEN low level, efficient, well written code. How can we expect them to do anything better when they do not even understand that it is possible...? It's what they learned. They have no idea what high performance or constraint-based development is....

Computers are so fast these days that you should be able to consider them absolute magic. Everything that you could possibly imagine should happen between the 60ths of a second of the refresh rate. And yet, when I click the volume icon on my microsoft surface laptop (pretty new), there is a VISIBLE DELAY as the machine gradually builds up a new user interface element, and eventually works out what icons to draw and has them pop-in and they go live. It takes ACTUAL TIME. I suspect a half second, which in CPU time, is like a billion fucking years....

All I'm doing is typing this blog post. Windows has 102 background processes running. My nvidia graphics card currently has 6 of them, and some of those have sub tasks. To do what? I'm not running a game right now, I'm using about the same feature set from a video card driver as I would have done TWENTY years ago, but 6 processes are required. Microsoft edge web view has 6 processes too, as does Microsoft edge too. I don't even use Microsoft edge. I think I opened an SVG file in it yesterday, and here we are, another 12 useless pieces of code wasting memory, and probably polling the cpu as well.

This is utter, utter madness. Its why nothing seems to work, why everything is slow, why you need a new phone every year, and a new TV to load those bloated streaming apps, that also must be running code this bad. I honestly think its only going to get worse, because the big dumb, useless tech companies like facebook, twitter, reddit, etc are the worst possible examples of this trend....

There was a golden age of programming, back when you had actual limitations on memory and CPU. Now we just live in an ultra-wasteful pit of inefficiency. Its just sad.

Long-time Slashdot reader Z00L00K left a comment arguing that "All this is because everyone today programs on huge frameworks that have everything including two full size kitchen sinks, one for right handed people and one for left handed." But in another comment Slashdot reader youn blames code generators, cut-and-paste programming, and the need to support multiple platforms.

But youn adds that even with that said, "In the old days, there was a lot more blue screens of death... Sure it still happens but how often do you restart your computer these days." And they also submitted this list arguing "There's a lot more functionality than before."
  • Some software has been around a long time. Even though the /. crowd likes to bash Windows, you got to admit backward compatibility is outstanding
  • A lot of things like security were not taken into consideration
  • It's a different computing environment.... multi tasking, internet, GPUs
  • In the old days, there was one task running all the time. Today, a lot of error handling, soft failures if the app is put to sleep
  • A lot of code is due to software interacting with other software and maintaining compatibility with standards
  • Shiny technology like microservices allows scaling and heterogeneous integration

So who's right and who's wrong? Leave your own best answers in the comments.

And are today's programmers leaving too much code bloat?


Facebook

Facebook Unveils Future 'Near Retina-Quality' VR Headsets (theverge.com) 47

Artem S. Tashkinov writes: Meta's Reality Labs division has revealed new prototypes in its roadmap toward lightweight, hyper-realistic virtual reality graphics. The breakthroughs remain far from consumer-ready, but the designs -- codenamed Butterscotch, Starburst, Holocake 2, and Mirror Lake -- could add up to a slender, brightly lit headset that supports finer detail than its current Quest 2 display.

The yet-to-be-released headsets have features that have been sorely missing until now: a near-retina-quality image offering about 2.5 times the resolution of the Quest 2's (sort of) 1832 x 1920 pixels per eye, letting users read the 20/20 vision line on an eye chart; high dynamic range (HDR) lighting with 20,000 nits of brightness; and eye tracking. "The goal of all this work is to help us identify which technical paths are going to allow us to make meaningful enough improvements that we can start approaching visual realism," says the Meta CEO.

AMD

AMD Details RDNA 3 Graphics, Zen 4 Performance and Phoenix Point Laptop Products (hothardware.com) 16

Slashdot reader MojoKid writes: AMD unveiled new details of its technology roadmap Thursday at its 2022 Financial Analyst Day. Chief among them were disclosures on the company's next-gen RDNA 3 GPU architecture, Zen 4 CPU architecture and Phoenix Point laptop SoC. AMD's new RDNA 3 GPU architecture for Radeon graphics cards and mobile will be a chiplet-based design, much like the company's Ryzen CPU offerings. AMD also confirmed that RDNA 3 GPUs would be fabricated on a 5nm process, likely TSMC N5. The company also noted that an "optimized graphics pipeline" will enable yet higher clock rates, while the GPU's "rearchitected compute unit" will bring ray-tracing performance improvements over RDNA 2 as well. AMD says that RDNA 3 GPUs are coming later this year, with RDNA 4 likely arriving in late 2023.

Meanwhile, AMD's Zen 4 is set to be the "world's first 5nm CPU," arriving later this year with an 8 to 10 percent instructions-per-clock (IPC) lift and a greater than 15 percent single-threaded performance gain. Zen 4 will also support DDR5 and AVX-512 extensions for AI workloads, and will offer a massive 125 percent increase in memory bandwidth. AMD is claiming a 35% multithreaded performance lift for Zen 4.

And its Phoenix Point laptop platform SoC will be infused with both Zen 4 and RDNA 3. This is a first for AMD, since typically its laptop products' integrated graphics trail the company's current-gen GPU architecture by at least a generation. Phoenix Point is likely to arrive in the first half of 2023.

Games

'A Billion-Dollar Crypto Gaming Startup Promised Riches and Delivered Disaster' (bloomberg.com) 67

"Even many Axie regulars say it's not much fun, but that hasn't stopped people from dedicating hours to researching strategies, haunting Axie-themed Discord channels and Reddit forums, and paying for specialized software that helps them build stronger teams..."

Bloomberg pays a visit to the NFT-based game Axie Infinity with a 39-year-old player who's spent $40,000 there since last August — back when you could actually triple your money in a week. ("I was actually hoping that it could become my full-time job," he says.) The reason this is possible — or at least it seemed possible for a few weird months last year — is that Axie is tied to crypto markets. Players get a few Smooth Love Potion (SLP) tokens for each game they win and can earn another cryptocurrency, Axie Infinity Shards (AXS), in larger tournaments. The characters, themselves known as Axies, are nonfungible tokens, or NFTs, whose ownership is tracked on a blockchain, allowing them to be traded like a cryptocurrency as well....

Axie's creator, a startup called Sky Mavis Inc., heralded all this as a new kind of economic phenomenon: the "play-to-earn" video game. "We believe in a world future where work and play become one," it said in a mission statement on its website. "We believe in empowering our players and giving them economic opportunities. Welcome to our revolution." By last October the company, founded in Ho Chi Minh City, Vietnam, four years ago by a group of Asian, European, and American entrepreneurs, had raised more than $160 million from investors including the venture capital firm Andreessen Horowitz and the crypto-focused firm Paradigm, at a peak valuation of about $3 billion. That same month, Axie Infinity crossed 2 million daily users, according to Sky Mavis.

If you think the entire internet should be rebuilt around the blockchain — the vision now referred to as web3 — Axie provided a useful example of what this looked like in practice. Alexis Ohanian, co-founder of Reddit and an Axie investor, predicted that 90% of the gaming market would be play-to-earn within five years. Gabby Dizon, head of crypto gaming startup Yield Guild Games, describes Axie as a way to create an "investor mindset" among new populations, who would go on to participate in the crypto economy in other ways. In a livestreamed discussion about play-to-earn gaming and crypto on March 2, former Democratic presidential contender Andrew Yang called web3 "an extraordinary opportunity to improve the human condition" and "the biggest weapon against poverty that we have."

By the time Yang made his proclamations the Axie economy was deep in crisis. It had lost about 40% of its daily users, and SLP, which had traded as high as 40 cents, was at 1.8 cents, while AXS, which had once been worth $165, was at $56. To make matters worse, on March 23 hackers robbed Sky Mavis of what at the time was roughly $620 million in cryptocurrencies. Then in May the bottom fell out of the entire crypto market. AXS dropped below $20, and SLP settled in at just over half a penny. Instead of illustrating web3's utopian potential, Axie looked like validation for crypto skeptics who believe web3 is a vision that investors and early adopters sell people to get them to pour money into sketchy financial instruments while hackers prey on everyone involved.

The article does credit the company for building its own blockchain (Ronin) to provide cheaper and faster NFT transactions. "Purists might have taken issue with the decision to abandon the core blockchain precept of decentralization, but on the other hand, the game actually worked."

But the article also chronicles a fast succession of highs and lows:
  • "In Axie's biggest market, the Philippines, the average daily earnings from May to October 2021 for all but the lowest-ranked players were above minimum wage, according to the gaming research and consulting firm Naavik."
  • Axie raised $150 million to reimburse victims of the breach and repair its infrastructure. "But nearly two months later the systems compromised during the hack still weren't up and running, and the executives were vague about when everything would be repaired. (A company spokesperson said on June 3 that this could happen by midmonth, pending the results of an external audit....)"
  • Days after the breach it launched Axie: Origin, a new alternate version with better graphics/gameplay — and without a cryptocurrency element.
  • About 75% of the 39-year-old gamer's co-players have "largely" stopped playing the game. "But at least one was sufficiently seduced by Axie's potential to take a significant loan to buy AXS tokens, which he saw as a way to hedge against inflation of the Argentine peso. The local currency has indeed lost value since he took out the loan, but not nearly as much as AXS."

Thanks to long-time Slashdot reader Parker Lewis for sharing the article.


AMD

Apple's New MetalFX Upscaling System Will Compete With AMD FSR, Nvidia DLSS (arstechnica.com) 44

At this year's WWDC, Apple announced a surprising new system coming to its Metal 3 gaming API that may sound familiar to PC gamers: MetalFX Upscaling. Ars Technica reports: The system will leverage Apple's custom silicon to reconstruct video game graphics using lower-resolution source images so that games can run more efficiently at lower resolutions while looking higher-res. This "temporal reconstruction" system sounds similar to existing offerings from AMD (FidelityFX Super Resolution 2.0) and Nvidia (Deep Learning Super-Sampling), along with an upcoming "XeSS" system from Intel. Based on how the system is described, it will more closely resemble AMD's system, since Apple has yet to announce a way for MetalFX Upscaling to leverage its custom-made "Neural Engine" system.

By announcing this functionality for some of the world's most popular processors, Apple is arguably letting more game developers build their games and engines with image reconstruction -- even if MetalFX Upscaling isn't open source, unlike AMD's FSR 2.0 system. Still, these image reconstruction systems typically have temporal anti-aliasing (TAA) in common. So long as game devs keep that kind of anti-aliasing in mind with their games and engines, their titles will be more likely to take advantage of these upscalers and thus run more efficiently on a wide range of consoles, computers, and smartphones.
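To illustrate the common idea behind these temporal reconstruction systems, here is a minimal C sketch of per-pixel temporal accumulation, the core of TAA-style upscaling. It is purely conceptual: real systems such as FSR 2.0, DLSS, or MetalFX also reproject the history buffer with motion vectors and reject stale samples, and every name below is illustrative rather than taken from any of those APIs.

    /* Conceptual sketch of TAA-style temporal accumulation: each frame, a new
     * (jittered, lower-resolution) sample is blended into a persistent history
     * buffer, so detail accumulates over time.  Reprojection via motion vectors
     * and history rejection, which production upscalers rely on, are omitted. */

    /* Blend one channel of one pixel; alpha controls how quickly new samples
     * replace the accumulated history (values around 0.1 are typical). */
    static float temporal_accumulate(float history, float current, float alpha)
    {
        return history + alpha * (current - history);
    }

    /* Fold an entire frame into the history buffer in place. */
    static void accumulate_frame(float *history, const float *current,
                                 int pixel_count, float alpha)
    {
        for (int i = 0; i < pixel_count; i++)
            history[i] = temporal_accumulate(history[i], current[i], alpha);
    }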
The report notes that Metal 3 also includes "a new 'resource-loading' API designed to streamline asset-loading processes in video games." The same Metal 3 API benefits will also come to iPadOS 16 later this year.
Apple

Apple's Finally Making the iPad More Like a Mac (For Multitasking, at Least) (cnet.com) 15

Apple brought its iPad tablet a bit closer to the Mac computers in spirit on Monday at WWDC 2022, announcing new features for its iPadOS 16 software that add better multitasking. From a report: The new changes to the iPad represent another key shift to the device, aiming to advance the "pro" capabilities of Apple's tablets. While Apple's added to the power and capabilities of its iPads, the software has been criticized by many reviewers, including us at CNET, for not offering enough functionality. [...] Apple also has a collaborative workspace app called Freeform, coming later this year, that will work like a giant whiteboard. Invited collaborators can start adding stuff at the same time.

iPadOS 16 is also aiming to make better use of more advanced iPads that feature Apple's M1 chip. Metal 3 promises better graphics, but Apple's also aiming to add more desktop-like features in apps: Some will have customizable toolbars, and the Files app looks like it's finally getting a little more versatile for file management. M1 iPads are getting display scaling to create an effectively larger-feeling display, allowing more app screen space (but with smaller text and images). There's also free-form window resizing, along with external display support. Both features have been overdue on iPadOS. Stage Manager, a new MacOS feature that's coming later this year, is also coming to iPadOS. The result looks to be windows that can overlap and be different sizes, just like a Mac.

Operating Systems

Older iPads May Soon Be Able To Run Linux (arstechnica.com) 47

Older iPads with the Apple A7- and A8-based chips may soon be able to run Linux. "Developer Konrad Dybcio and a Linux enthusiast going by "quaack723" have collaborated to get Linux kernel version 5.18 booting on an old iPad Air 2, a major feat for a device that was designed to never run any operating system other than Apple's," reports Ars Technica. From the report: The project appears to use an Alpine Linux-based distribution called "postmarketOS," a relatively small but actively developed distribution made primarily for Android devices. Dybcio used a "checkm8" hashtag in his initial tweet about the project, strongly implying that they used the "Checkm8" bootrom exploit published back in 2019 to access the hardware. For now, the developers only have Linux running on some older iPad hardware using A7 and A8-based chips -- this includes the iPad Air, iPad Air 2, and a few generations of iPad mini. But subsequent tweets imply that it will be possible to get Linux up and running on any device with an A7 or A8 in it, including the iPhone 5S and the original HomePod.

Development work on this latest Linux-on-iDevices effort is still in its early days. The photos that the developers shared both show a basic boot process that fails because it can't mount a filesystem, and Dybcio notes that basic things like USB and Bluetooth support aren't working. Getting networking, audio, and graphics acceleration all working properly will also be a tall order. But being able to boot Linux at all could draw the attention of other developers who want to help the project.

Compared to modern hardware with an Apple M1 chip, A7 and A8-powered devices wouldn't be great as general-purpose Linux machines. While impressive at the time, their CPUs and GPUs are considerably slower than modern Apple devices, and they all shipped with either 1GB or 2GB of RAM. But their performance still stacks up well next to the slow processors in devices like the Raspberry Pi 4, and most (though not all) A7 and A8 hardware has stopped getting new iOS and iPadOS updates from Apple at this point; Linux support could give some of these devices a second life as retro game consoles, simple home servers, or other things that low-power Arm hardware is good for.
Further reading: Linux For M1 Macs? First Alpha Release Announced for Asahi Linux
HP

HP Dev One Laptop Running System76's Ubuntu Linux-based Pop!_OS Now Available (betanews.com) 54

An anonymous reader shares a report: Last month, the open source community was abuzz with excitement following a shocking announcement from System76 that HP was planning to release a laptop running the Pop!_OS operating system. This was significant for several reasons, but most importantly, it was a huge win for Linux users as yet another hardware option was becoming available. Best of all, HP employees have been trained by System76 to offer high-quality customer support. If you aren't aware, System76 support is legendary.

At the time of the announcement, details about the hardware were a bit scarce, but I am happy to report we now have full system specifications for the 14-inch HP Dev One laptop. Most interestingly, there is only one configuration to be had. The developer-focused computer is powered by an octa-core AMD Ryzen 7 PRO 5850U APU which features integrated Radeon graphics. The notebook comes with 16GB RAM and 1TB of NVMe storage, both of which can be user-upgraded later if you choose.
The laptop is priced at $1,099.
Graphics

Linux 5.19 Adds 500K Lines of New Graphics Driver Code (phoronix.com) 79

UnknowingFool writes: The current Linux kernel in development, 5.19, added 495,793 new lines of code for graphics driver updates. David Airlie sent in the new lines as part of the Direct Rendering Manager (DRM) subsystem of Linux. The majority of additions were for AMD's RDNA and CDNA platforms, but Intel also submitted changes for its DG2 graphics. Updates also came from Qualcomm and MediaTek for their GPU offerings.
