


Gtk 5 Might Drop X.11 Support, Says GNOME Dev (theregister.com) 145
Don't panic: Gtk 5 is not imminent. Gtk is a well-established toolkit, originally designed for the GIMP bitmap editing program back in 1998. Gtk 4 arrived relatively recently, shortly before the release of GNOME 40 in 2021. GNOME 40 has new user-interface guidelines, and as a part of this, Gtk 4 builds GNOME's Adwaita theme into the toolkit by means of the new libadwaita library, which is breaking the appearance of some existing apps.
Also, to be fair, as we recently covered, the X window system is very old now and isn't seeing major changes, although new releases of parts of it do still happen. This discussion is almost certain to get wildly contentious, and the thread on GitLab has been closed to further comments for now. If this idea gains traction, one likely outcome might well be a fork of Gtk, just as happened when GNOME 3 came out. [...] A lot of the features of the current version, X.11, are no longer used or relevant to most users. Even so, X.12 is barely even in the planning stages yet.

Smart Contact Lens Prototype Puts a Micro LED Display On Top of the Eye (arstechnica.com) 37
At the heart of the lens is an Arm M0 processor and a Micro LED display with 14,000 pixels per inch. It's just 0.02 inches (0.5 mm) in diameter with a 1.8-micron pixel pitch. Perkins claimed it's the "smallest and densest display ever created for dynamic content." Developing the contact overall included a focus on physics and electronics miniaturization, Perkins wrote. Mojo Lens developed its power management system with "medical-grade micro-batteries" and a proprietary power management integrated circuit. The Mojo Lens also uses a custom-configured magnetometer (CNET noted this drives the compass Perkins saw), accelerometer, and gyroscope for tracking. [...]
A contact lens sounds like it has the potential to be even more discreet than AR headgear posing as regular Ray-Bans. But the current prototype uses a "relay accessory," as Mojo Vision's rep put it, worn around the neck. It includes a processor, GPU, and 5 GHz radio for sending and receiving data to and from the lens. According to CNET, the accessory also sends information "back to computers that track the eye movement data for research." Perkins' blog said this tech required custom ASIC designs. [...] The current prototype also uses a hat with an integrated antenna for easier connecting, CNET reported; though, we'd expect this to be omitted from a final product. "There's no firm release date for the Mojo Lens, which could be the first AR contact lens to reach consumers," adds Ars. "Near-term goals include getting potential partners, investors, and journalists to try the smart lens."

Are Today's Programmers Leaving Too Much Code Bloat? (positech.co.uk) 296
I've seen coders do this. I know how this happens. It happens because not only are the coders not doing low-level, efficient code to achieve their goal, they have never even SEEN low-level, efficient, well-written code. How can we expect them to do anything better when they do not even understand that it is possible...? It's what they learned. They have no idea what high performance or constraint-based development is....
Computers are so fast these days that you should be able to consider them absolute magic. Everything that you could possibly imagine should happen within the 1/60th of a second between refreshes. And yet, when I click the volume icon on my Microsoft Surface laptop (pretty new), there is a VISIBLE DELAY as the machine gradually builds up a new user interface element, and eventually works out what icons to draw and has them pop in and go live. It takes ACTUAL TIME. I suspect a half second, which in CPU time is like a billion fucking years....
All I'm doing is typing this blog post. Windows has 102 background processes running. My Nvidia graphics card currently has 6 of them, and some of those have subtasks. To do what? I'm not running a game right now; I'm using about the same feature set from a video card driver as I would have done TWENTY years ago, but 6 processes are required. Microsoft Edge WebView has 6 processes too, as does Microsoft Edge itself. I don't even use Microsoft Edge. I think I opened an SVG file in it yesterday, and here we are, another 12 useless pieces of code wasting memory, and probably polling the CPU as well.
This is utter, utter madness. It's why nothing seems to work, why everything is slow, why you need a new phone every year, and a new TV to load those bloated streaming apps, which must also be running code this bad. I honestly think it's only going to get worse, because the big, dumb, useless tech companies like Facebook, Twitter, Reddit, etc. are the worst possible examples of this trend....
There was a golden age of programming, back when you had actual limitations on memory and CPU. Now we just live in an ultra-wasteful pit of inefficiency. It's just sad.
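The "billion years" is hyperbole, but the underlying arithmetic is easy to check. Assuming a typical 3 GHz clock (a figure of our choosing; the post doesn't name one), a half-second stall really does burn on the order of a billion cycles per core:

```python
# Back-of-the-envelope: CPU cycles elapsed during a half-second UI delay.
# The 3 GHz clock speed is an assumed, typical figure, not from the post.
clock_hz = 3_000_000_000   # 3 GHz, a single core
delay_s = 0.5              # the suspected volume-panel delay

cycles = int(clock_hz * delay_s)
print(f"{cycles:,} cycles")  # 1,500,000,000 cycles on one core
```

A billion and a half instructions' worth of time to draw a volume slider is the kind of figure that makes the rant's point for it.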
Long-time Slashdot reader Z00L00K left a comment arguing that "All this is because everyone today programs on huge frameworks that have everything including two full size kitchen sinks, one for right handed people and one for left handed." But in another comment Slashdot reader youn blames code generators, cut-and-paste programming, and the need to support multiple platforms.
But youn adds that even with that said, "In the old days, there was a lot more blue screens of death... Sure it still happens but how often do you restart your computer these days." And they also submitted this list arguing "There's a lot more functionality than before."
- Some software has been around a long time. Even though the /. crowd likes to bash Windows, you got to admit backward compatibility is outstanding
- A lot of things like security were not taken into consideration
- It's a different computing environment.... multi tasking, internet, GPUs
- In the old days, there was one task running all the time. Today, a lot of error handling, soft failures if the app is put to sleep
- A lot of code is due to pieces of software interacting with one another, and compatibility with standards
- Shiny technology like microservices allows scaling and heterogeneous integration
So who's right and who's wrong? Leave your own best answers in the comments.
And are today's programmers leaving too much code bloat?

Facebook Unveils Future 'Near Retina-Quality' VR Headsets (theverge.com) 47
Yet-to-be-released headsets have features which have been sorely missing previously: near-retina-quality images offering about 2.5 times the resolution of the Quest 2's (sort of) 1832 x 1920 pixels per eye, letting users read the 20/20 vision line on an eye chart; high dynamic range (HDR) lighting with 20,000 nits of brightness; and eye tracking. "The goal of all this work is to help us identify which technical paths are going to allow us to make meaningful enough improvements that we can start approaching visual realism," says the Meta CEO.

AMD Details RDNA 3 Graphics, Zen 4 Performance and Phoenix Point Laptop Products (hothardware.com) 16
Meanwhile, AMD's Zen 4 is set to be the "world's first 5nm CPU," arriving later this year with an 8 to 10 percent instructions-per-clock lift and greater than 15 percent single-threaded performance gain. Zen 4 will also support DDR5, AVX-512 extensions for AI workloads and a massive 125 percent increase in memory bandwidth. AMD is claiming a 35% multithreaded performance lift for Zen 4.
And, its Phoenix Point laptop platform SoC will be infused with both Zen 4 and RDNA 3. This is a first for AMD, since typically its laptop products' integrated graphics trail the company's current-gen GPU architecture by at least a generation. Phoenix Point is likely to arrive in the first half of 2023.

'A Billion-Dollar Crypto Gaming Startup Promised Riches and Delivered Disaster' (bloomberg.com) 67
Bloomberg pays a visit to the NFT-based game Axie Infinity with a 39-year-old player who's spent $40,000 there since last August — back when you could actually triple your money in a week. ("I was actually hoping that it could become my full-time job," he says.) The reason this is possible — or at least it seemed possible for a few weird months last year — is that Axie is tied to crypto markets. Players get a few Smooth Love Potion (SLP) tokens for each game they win and can earn another cryptocurrency, Axie Infinity Shards (AXS), in larger tournaments. The characters, themselves known as Axies, are nonfungible tokens, or NFTs, whose ownership is tracked on a blockchain, allowing them to be traded like a cryptocurrency as well....
Axie's creator, a startup called Sky Mavis Inc., heralded all this as a new kind of economic phenomenon: the "play-to-earn" video game. "We believe in a future where work and play become one," it said in a mission statement on its website. "We believe in empowering our players and giving them economic opportunities. Welcome to our revolution." By last October the company, founded in Ho Chi Minh City, Vietnam, four years ago by a group of Asian, European, and American entrepreneurs, had raised more than $160 million from investors including the venture capital firm Andreessen Horowitz and the crypto-focused firm Paradigm, at a peak valuation of about $3 billion. That same month, Axie Infinity crossed 2 million daily users, according to Sky Mavis.
If you think the entire internet should be rebuilt around the blockchain — the vision now referred to as web3 — Axie provided a useful example of what this looked like in practice. Alexis Ohanian, co-founder of Reddit and an Axie investor, predicted that 90% of the gaming market would be play-to-earn within five years. Gabby Dizon, head of crypto gaming startup Yield Guild Games, describes Axie as a way to create an "investor mindset" among new populations, who would go on to participate in the crypto economy in other ways. In a livestreamed discussion about play-to-earn gaming and crypto on March 2, former Democratic presidential contender Andrew Yang called web3 "an extraordinary opportunity to improve the human condition" and "the biggest weapon against poverty that we have."
By the time Yang made his proclamations the Axie economy was deep in crisis. It had lost about 40% of its daily users, and SLP, which had traded as high as 40 cents, was at 1.8 cents, while AXS, which had once been worth $165, was at $56. To make matters worse, on March 23 hackers robbed Sky Mavis of what at the time was roughly $620 million in cryptocurrencies. Then in May the bottom fell out of the entire crypto market. AXS dropped below $20, and SLP settled in at just over half a penny. Instead of illustrating web3's utopian potential, Axie looked like validation for crypto skeptics who believe web3 is a vision that investors and early adopters sell people to get them to pour money into sketchy financial instruments while hackers prey on everyone involved.
The article does credit the company for building its own blockchain (Ronin) to provide cheaper and faster NFT transactions. "Purists might have taken issue with the decision to abandon the core blockchain precept of decentralization, but on the other hand, the game actually worked."
But the article also chronicles a fast succession of highs and lows:
- "In Axie's biggest market, the Philippines, the average daily earnings from May to October 2021 for all but the lowest-ranked players were above minimum wage, according to the gaming research and consulting firm Naavik."
- Axie raised $150 million to reimburse victims of the breach and repair its infrastructure. "But nearly two months later the systems compromised during the hack still weren't up and running, and the executives were vague about when everything would be repaired. (A company spokesperson said on June 3 that this could happen by midmonth, pending the results of an external audit....)"
- Days after the breach it launched Axie: Origin, a new alternate version with better graphics/gameplay — and without a cryptocurrency element.
- About 75% of the 39-year-old gamer's co-players have "largely" stopped playing the game. "But at least one was sufficiently seduced by Axie's potential to take a significant loan to buy AXS tokens, which he saw as a way to hedge against inflation of the Argentine peso. The local currency has indeed lost value since he took out the loan, but not nearly as much as AXS."
Thanks to long-time Slashdot reader Parker Lewis for sharing the article.

Apple's New MetalFX Upscaling System Will Compete With AMD FSR, Nvidia DLSS (arstechnica.com) 44
By announcing this functionality for some of the world's most popular processors, Apple is arguably letting more game developers build their games and engines with image reconstruction -- even if MetalFX Upscaling isn't open source, unlike AMD's FSR 2.0 system. Still, these image reconstruction systems typically have temporal anti-aliasing (TAA) in common. So long as game devs keep that kind of anti-aliasing in mind with their games and engines, they'll be more likely to take advantage and thus run more efficiently on a wide range of consoles, computers, and smartphones. The report notes that Metal 3 also includes "a new 'resource-loading' API designed to streamline asset-loading processes in video games." The same Metal 3 API benefits will also come to iPadOS 16 later this year.

Apple's Finally Making the iPad More Like a Mac (For Multitasking, at Least) (cnet.com) 15
iPadOS 16 is also aiming to make better use of more advanced iPads that feature Apple's M1 chip. Metal 3 promises better graphics, but Apple's also aiming to add more desktop-like features in apps: Some will have customizable toolbars, and the Files app looks like it's finally getting a little more versatile for file management. M1 iPads are getting display scaling to create an effectively larger-feeling display, allowing more app screen space (but with smaller text and images). There's also free-form window resizing, along with external display support. Both features have been overdue on iPadOS. Stage Manager, a macOS feature that's coming later this year, is also coming to iPadOS. The result looks to be windows that can overlap and be different sizes, just like a Mac.

Older iPads May Soon Be Able To Run Linux (arstechnica.com) 47
Development work on this latest Linux-on-iDevices effort is still in its early days. The photos that the developers shared both show a basic boot process that fails because it can't mount a filesystem, and Dybcio notes that basic things like USB and Bluetooth support aren't working. Getting networking, audio, and graphics acceleration all working properly will also be a tall order. But being able to boot Linux at all could draw the attention of other developers who want to help the project.
Compared to modern hardware with an Apple M1 chip, A7- and A8-powered devices wouldn't be great as general-purpose Linux machines. While impressive at the time, their CPUs and GPUs are considerably slower than modern Apple devices, and they all shipped with either 1GB or 2GB of RAM. But their performance still stacks up well next to the slow processors in devices like the Raspberry Pi 4, and most (though not all) A7 and A8 hardware has stopped getting new iOS and iPadOS updates from Apple at this point; Linux support could give some of these devices a second life as retro game consoles, simple home servers, or other things that low-power Arm hardware is good for. Further reading: Linux For M1 Macs? First Alpha Release Announced for Asahi Linux

HP Dev One Laptop Running System76's Ubuntu Linux-based Pop!_OS Now Available (betanews.com) 54
At the time of the announcement, details about the hardware were a bit scarce, but I am happy to report we now have full system specifications for the 14-inch HP Dev One laptop. Most interestingly, there is only one configuration to be had. The developer-focused computer is powered by an octa-core AMD Ryzen 7 PRO 5850U APU which features integrated Radeon graphics. The notebook comes with 16GB RAM and 1TB of NVMe storage, both of which can be user-upgraded later if you choose. The laptop is priced at $1,099.

Linux 5.19 Adds 500K Lines of New Graphics Driver Code (phoronix.com) 79

Samsung Allegedly Assembling a 'Dream Team' To Take Down Apple's M1 In 2025 (neowin.net) 47

Knoxville Researcher Wins A.M. Turing Award (knoxnews.com) 18
A local computer scientist and professor at the University of Tennessee at Knoxville has been named an A.M. Turing Award winner by the Association for Computing Machinery. The Turing Award is often referred to as the "Nobel Prize of computer science." It carries a million-dollar prize.
"Oh, it was a complete shock. I'm still recovering from it," Jack Dongarra told Knox News with a warm laugh. "It's nice to see the work being recognized in this way but it couldn't have happened without the support and contribution of many people over time." Chances are Dongarra's work has touched your life, even if you don't know it. If you've ever used a speech recognition program or looked at a weather forecast, you're using technology that relies on Dongarra's software libraries. Dongarra has held a joint appointment at the University of Tennessee and Oak Ridge National Laboratory since 1989. While he doesn't have a household name, his foundational work in computer science has undergirded the development of high-performance computers over the course of his 40-year career...
Dongarra developed software to allow computers to use multiple processors simultaneously, and this is basically how all computer systems work today. Your laptop has multiple processing cores and might have an additional graphics processing core. Many phones have multiple processing cores. "He's continually rethought how to exploit today's computer architectures and done so very effectively," said Nicholas Higham, a Royal Society research professor of applied mathematics at the University of Manchester. "He's come up with ideas so that we can get the very best out of these machines." Dongarra also developed software that allowed computers with different hardware and operating systems to run in parallel, networking distant machines as a single computation device. This lets people make more powerful computers out of many smaller devices, which helped develop cloud computing, running high-end applications over the internet. Most of Dongarra's work was published open-source through a project called Netlib.
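The libraries descended from Dongarra's work (LINPACK/LAPACK, PVM, MPI) are far more sophisticated than anything that fits here, but the divide-and-combine pattern they apply at scale can be sketched in a few lines of Python; the function names below are illustrative, not from any of his libraries:

```python
# Illustrative sketch of splitting one computation across processor cores.
# This is NOT Dongarra's code; it only shows the divide-and-combine pattern
# that libraries like LAPACK, PVM, and MPI apply at far larger scale.
from multiprocessing import Pool

def partial_sum(chunk):
    """Work unit executed independently by one worker process."""
    return sum(x * x for x in chunk)

def parallel_sum_of_squares(data, n_workers=4):
    # Divide: slice the input into roughly equal chunks, one per worker.
    step = max(1, len(data) // n_workers)
    chunks = [data[i:i + step] for i in range(0, len(data), step)]
    # Combine: each worker returns a partial result; sum them at the end.
    with Pool(n_workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    data = list(range(100_000))
    assert parallel_sum_of_squares(data) == sum(x * x for x in data)
```

The same split/compute/combine shape, generalized across machines rather than cores, is what lets networked devices act as a single computation device.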
Congratulations!

Hollywood Designer 6.0 Released: Now a 'Full-Blown Multimedia Authoring System' (amigans.net) 20
Long-time Slashdot reader Mike Bouma explains: Airsoft Softwair has released Hollywood Designer 6.0, "a full-blown multimedia authoring system that runs on top of Hollywood and can be used to create all sorts of multimedia-based applications, for example presentations, slide shows, games, and applications. Thanks to Hollywood, all multimedia applications created using Hollywood Designer can be exported as stand-alone executables for the following systems: AmigaOS3, AmigaOS4, WarpOS, MorphOS, AROS, Windows, macOS, Linux, Android, and iOS."
The current version of Hollywood is v9.1 with various updated add-ons. To see earlier versions of Hollywood 9.0 & Designer 5.0 in action have a look at Kas1e's short demonstration on AmigaOS4 / AmigaOne X5000.

Boeing's Starliner Docks with International Space Station. Hatch Opening Now (nasa.gov) 59
And right now, Boeing is beginning the official hatch-opening ceremony, in which the space station astronauts already on the ISS "open the hatch to the vehicle and retrieve some cargo that's packed inside," explains the Verge: NASA tasked Boeing with conducting an uncrewed flight demonstration of Starliner to show that the capsule can hit all of the major milestones it'll need to hit when it is carrying passengers... This mission is called OFT-2 since it's technically a do-over of a mission that Boeing attempted back in 2019, called OFT. During that flight, Starliner launched to space as planned, but a software glitch prevented the capsule from getting in the right orbit it needed to reach to rendezvous with the ISS. Boeing had to bring the vehicle home early, and the company never demonstrated Starliner's ability to dock with the ISS....
Using a series of sensors, the capsule autonomously guided itself onto an open docking port on the space station.... Docking occurred a little over an hour behind schedule, due to some issues with Starliner's graphics and docking ring, which were resolved ahead of the docking....
[Thursday] At 6:54PM ET, Starliner successfully launched to space on top of an Atlas V rocket, built and operated by the United Launch Alliance. Once Starliner separated from the Atlas V, it had to fire its own thrusters to insert itself into the proper orbit for reaching the space station. However, after that maneuver took place, Boeing and NASA revealed that two of the 12 thrusters Starliner uses for the procedure failed and cut off too early. The capsule's flight control system was able to kick in and rerouted to a working thruster, which helped get Starliner into a stable orbit.... Today, Boeing revealed that a drop in chamber pressure had caused the early cutoff of the thruster, but that system behaved normally during follow-up burns of the thrusters. And with redundancies on the spacecraft, the issue "does not pose a risk to the rest of the flight test," according to Boeing.
Boeing also noted today that the Starliner team is investigating some weird behavior of a "thermal cooling loop" but said that temperatures are stable on the spacecraft.
From the space station, NASA astronaut Bob Hines said the achievement "marks a great milestone towards providing additional commercial access to low Earth orbit, sustaining the ISS and enabling NASA's goal of returning humans to the Moon and eventually to Mars.
"Great accomplishments in human spaceflight are long remembered by history. Today will be no different."
A long-time Slashdot reader shares this schedule (EST): 5/20, 3:30 pm — Starliner docking with ISS.
5/21, 11:30 am — Safety checks completed. Hatches opened.
5/24, 12:00 pm — Starliner loading completed. Hatches closed.
5/25, 2:00 pm — Starliner undocking from ISS.
5/25, 5:45 pm — Coverage of Starliner landing begins.
Again, the streams will be broadcast on NASA Television. I don't know about any of you, but I know what I'm doing this weekend.

EA Plans Free Mobile 'Lord of the Rings' Game (cnet.com) 35
"The team is filled with fans of The Lord of the Rings and The Hobbit and each day they bring their tremendous passion and talents together to deliver an authentic experience for players," Malachi Boyle, vice president of mobile RPG for Electronic Arts, said in a statement. "The combination of high-fidelity graphics, cinematic animations, and stylized art immerses players in the fantasy of Middle-earth where they'll go head-to-head with their favorite characters."

Report: 'Nvidia's LHR Limiter Has Fallen, But Gamers Shouldn't Worry' (tomshardware.com) 46
Graphics card pricing has been plummeting, and we're starting to see better availability at retailers, with some GPUs selling at or below the Manufacturer's Suggested Retail Price (MSRP). So QuickMiner's arrival shouldn't influence the current state of the graphics market unless big corporations want to buy out everything in sight for the last push before Ethereum's transition to Proof-of-Stake (PoS), often referred to as "The Merge," is complete. We see that as unlikely, considering current profitability even on a 3080 Ti sits at around $3.50 per day and would still need nearly a year to break even at current rates. Initially scheduled for June, The Merge won't finalize until "the few months after," as Ethereum developer Tim Beiko has expressed on Twitter.
It will be interesting to see if Nvidia responds to this with updated drivers or implements LHRv3 in the remaining GPUs. However, it's perhaps not worth the effort at this point, and all existing LHRv2 and earlier cards can just stay on current drivers for optimized mining performance.
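The "nearly a year to break even" figure is easy to sanity-check. Assuming a 3080 Ti street price of about $1,200 (our assumption; the article supplies only the $3.50/day profit):

```python
# Sanity check on the break-even claim. The card price is an assumed
# street-price figure; the article supplies only the $3.50/day profit.
card_price_usd = 1200.0
daily_profit_usd = 3.50

breakeven_days = card_price_usd / daily_profit_usd
print(f"{breakeven_days:.0f} days")  # 343 days, i.e. "nearly a year"
```

And that assumes profitability holds steady until The Merge, which would end Ethereum GPU mining well before the card pays for itself.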

Nvidia Transitioning To Official, Open-Source Linux GPU Kernel Driver (phoronix.com) 102
