Technology

Samsung Develops GDDR6 DRAM With 24Gbps Speed for Graphics Cards (zdnet.com) 20

Samsung said on Thursday that it has developed a new GDDR6 (graphics double data rate) DRAM with a data transfer rate of 24 gigabits per second (Gbps). From a report: A premium graphics card that packs the chips will support a data processing rate of up to 1.1 terabytes per second (TB/s), equivalent to processing 275 movies in Full HD resolution within a second, the South Korean tech giant said. Samsung said the DRAM is composed of 16Gb chips built on its third-generation 10nm-class process node, which also incorporates extreme ultraviolet (EUV) lithography during production. The company also applied high-k metal gates to the DRAM, replacing the conventional silicon dioxide gate dielectric with a higher-permittivity material so the gate can hold more charge. Samsung said this allowed its latest DRAM to operate more than 30% faster than its 18Gbps GDDR6 predecessor.
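
For a sense of where the 1.1 TB/s figure comes from: it is simple per-pin arithmetic once you assume a bus width. A minimal sketch in Python, assuming a 384-bit memory bus typical of premium graphics cards (the bus width is our assumption, not something the report specifies):

```python
# Back-of-the-envelope check of the "1.1 TB/s" claim.
pin_rate_gbps = 24        # per-pin data rate of the new GDDR6
bus_width_bits = 384      # assumed bus width of a high-end graphics card

bandwidth_gbps = pin_rate_gbps * bus_width_bits  # total gigabits per second
bandwidth_gb_s = bandwidth_gbps / 8              # gigabytes per second

print(f"{bandwidth_gb_s:.0f} GB/s")  # 1152 GB/s, i.e. roughly 1.1 TB/s
```
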
Open Source

Gtk 5 Might Drop X.11 Support, Says GNOME Dev (theregister.com) 145

One of the GNOME developers has suggested that the next major release of Gtk could drop support for the X window system. The Register reports: Emmanuele Bassi opened a discussion last week on the GNOME project's Gitlab instance that asked whether the developers could drop X.11 support in the next release of Gtk. At this point, it is only a suggestion, but if it gets traction, this could significantly accelerate the move to the Wayland display server and the end of X.11.

Don't panic: Gtk 5 is not imminent. Gtk is a well-established toolkit, originally designed for the GIMP bitmap editing program back in 1998. Gtk 4 arrived relatively recently, shortly before the release of GNOME 40 in 2021. GNOME 40 has new user-interface guidelines, and as a part of this, Gtk 4 builds GNOME's Adwaita theme into the toolkit by means of the new libadwaita library, which is breaking the appearance of some existing apps.

Also, to be fair, as we recently covered, the X window system is very old now and isn't seeing major changes, although new releases of parts of it do still happen. This discussion is almost certain to get wildly contentious, and the thread on Gitlab has been closed to further comments for now. If this idea gains traction, one likely outcome might well be a fork of Gtk, just as happened when GNOME 3 came out. [...] A lot of the features of the current version, X.11, are no longer used or relevant to most users. Even so, X.12 is barely even in the planning stages yet.

Medicine

Smart Contact Lens Prototype Puts a Micro LED Display On Top of the Eye (arstechnica.com) 37

An anonymous reader quotes a report from Ars Technica: Since 2015, a California-based company called Mojo Vision has been developing smart contact lenses. Like smart glasses, the idea is to put helpful AR graphics in front of your eyes to help accomplish daily tasks. Now, a functioning prototype brings us closer to seeing a final product. In a blog post this week, Drew Perkins, the CEO of Mojo Vision, said he was the first to have an "on-eye demonstration of a feature-complete augmented reality smart contact lens." In an interview with CNET, he said he's been wearing only one contact at a time for hour-long durations. Eventually, Mojo Vision would like users to be able to wear two Mojo Lens simultaneously and create 3D visual overlays, the publication said. According to his blog, the CEO could see a compass through the contact and an on-screen teleprompter with a quote written on it. He also described to CNET viewing a green, monochromatic image of Albert Einstein.

At the heart of the lens is an Arm M0 processor and a Micro LED display with 14,000 pixels per inch. It's just 0.02 inches (0.5 mm) in diameter with a 1.8-micron pixel pitch. Perkins claimed it's the "smallest and densest display ever created for dynamic content." Developing the contact overall included a focus on physics and electronics miniaturization, Perkins wrote. Mojo Vision developed the lens's power management system with "medical-grade micro-batteries" and a proprietary power management integrated circuit. The Mojo Lens also uses a custom-configured magnetometer (CNET noted this drives the compass Perkins saw), accelerometer, and gyroscope for tracking. [...]

A contact lens sounds like it has the potential to be even more discreet than AR headgear posing as regular Ray-Bans. But the current prototype uses a "relay accessory," as Mojo Vision's rep put it, worn around the neck. It includes a processor, GPU, and 5 GHz radio for sending and receiving data to and from the lens. According to CNET, the accessory also sends information "back to computers that track the eye movement data for research." Perkins' blog said this tech required custom ASIC designs. [...] The current prototype also uses a hat with an integrated antenna for easier connecting, CNET reported, though we'd expect this to be omitted from a final product.
"There's no firm release date for the Mojo Lens, which could be the first AR contact lens to reach consumers," adds Ars. "Near-term goals include getting potential partners, investors, and journalists to try the smart lens."
Programming

Are Today's Programmers Leaving Too Much Code Bloat? (positech.co.uk) 296

Long-time Slashdot reader Artem S. Tashkinov shares a blog post from an indie game programmer who complains "The special upload tool I had to use today was a total of 230MB of client files, and involved 2,700 different files to manage this process." Oh and BTW it gives error messages and right now, it doesn't work. sigh.

I've seen coders do this. I know how this happens. It happens because not only are the coders not doing low-level, efficient code to achieve their goal, they have never even SEEN low level, efficient, well written code. How can we expect them to do anything better when they do not even understand that it is possible...? It's what they learned. They have no idea what high performance or constraint-based development is....

Computers are so fast these days that you should be able to consider them absolute magic. Everything that you could possibly imagine should happen between the 60ths of a second of the refresh rate. And yet, when I click the volume icon on my Microsoft Surface laptop (pretty new), there is a VISIBLE DELAY as the machine gradually builds up a new user interface element, and eventually works out what icons to draw and has them pop-in and they go live. It takes ACTUAL TIME. I suspect a half second, which in CPU time, is like a billion fucking years....

All I'm doing is typing this blog post. Windows has 102 background processes running. My Nvidia graphics card currently has 6 of them, and some of those have sub tasks. To do what? I'm not running a game right now, I'm using about the same feature set from a video card driver as I would have done TWENTY years ago, but 6 processes are required. Microsoft Edge WebView has 6 processes too, as does Microsoft Edge itself. I don't even use Microsoft Edge. I think I opened an SVG file in it yesterday, and here we are, another 12 useless pieces of code wasting memory, and probably polling the CPU as well.

This is utter, utter madness. It's why nothing seems to work, why everything is slow, why you need a new phone every year, and a new TV to load those bloated streaming apps, that also must be running code this bad. I honestly think it's only going to get worse, because the big dumb, useless tech companies like Facebook, Twitter, Reddit, etc. are the worst possible examples of this trend....

There was a golden age of programming, back when you had actual limitations on memory and CPU. Now we just live in an ultra-wasteful pit of inefficiency. It's just sad.
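
To put the rant's numbers in perspective, here is a rough sketch of how much CPU time a 60 Hz frame budget and a half-second delay actually represent, assuming a 3 GHz core (the clock speed is our assumption; the post doesn't give one):

```python
# How many cycles fit in one 60 Hz frame, and in a visible half-second delay.
clock_hz = 3_000_000_000   # assumed 3 GHz core
frame_budget_s = 1 / 60    # one display refresh at 60 Hz

cycles_per_frame = clock_hz * frame_budget_s
cycles_in_half_second = clock_hz * 0.5

print(f"{cycles_per_frame:,.0f} cycles per frame")        # 50,000,000
print(f"{cycles_in_half_second:,.0f} cycles in 0.5 s")    # 1,500,000,000
```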

Long-time Slashdot reader Z00L00K left a comment arguing that "All this is because everyone today programs on huge frameworks that have everything including two full size kitchen sinks, one for right handed people and one for left handed." But in another comment Slashdot reader youn blames code generators, cut-and-paste programming, and the need to support multiple platforms.

But youn adds that even with that said, "In the old days, there was a lot more blue screens of death... Sure it still happens but how often do you restart your computer these days." And they also submitted this list arguing "There's a lot more functionality than before."
  • Some software has been around a long time. Even though the /. crowd likes to bash Windows, you got to admit backward compatibility is outstanding
  • A lot of things like security were not taken into consideration
  • It's a different computing environment.... multi tasking, internet, GPUs
  • In the old days, there was one task running all the time. Today, a lot of error handling, soft failures if the app is put to sleep
  • A lot of code is due to software interacting one with another, compatibility with standards
  • Shiny technology like microservices allows scaling, heterogeneous integration

So who's right and who's wrong? Leave your own best answers in the comments.

And are today's programmers leaving too much code bloat?


Facebook

Facebook Unveils Future 'Near Retina-Quality' VR Headsets (theverge.com) 47

Artem S. Tashkinov writes: Meta's Reality Labs division has revealed new prototypes in its roadmap toward lightweight, hyper-realistic virtual reality graphics. The breakthroughs remain far from consumer-ready, but the designs -- codenamed Butterscotch, Starburst, Holocake 2, and Mirror Lake -- could add up to a slender, brightly lit headset that supports finer detail than its current Quest 2 display.

The yet-to-be-released headsets have features that have been sorely missing so far: a near-retina-quality image offering about 2.5 times the resolution of the Quest 2's (sort of) 1832 x 1920 pixels per eye, letting users read the 20/20 vision line on an eye chart; high dynamic range (HDR) lighting with 20,000 nits of brightness; and eye tracking. "The goal of all this work is to help us identify which technical paths are going to allow us to make meaningful enough improvements that we can start approaching visual realism," says the Meta CEO.
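
For scale, here is what "about 2.5 times the resolution" works out to per eye, assuming the multiplier refers to total pixel count rather than per-axis resolution (that reading is our assumption):

```python
# Pixel-count arithmetic for the "2.5x the Quest 2" claim.
quest2_pixels_per_eye = 1832 * 1920          # ~3.5 million pixels
target_pixels_per_eye = 2.5 * quest2_pixels_per_eye

print(f"{quest2_pixels_per_eye / 1e6:.1f} MP -> "
      f"{target_pixels_per_eye / 1e6:.1f} MP per eye")   # 3.5 MP -> 8.8 MP
```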

AMD

AMD Details RDNA 3 Graphics, Zen 4 Performance and Phoenix Point Laptop Products (hothardware.com) 16

Slashdot reader MojoKid writes: AMD unveiled new details of its technology roadmap Thursday at its 2022 Financial Analyst Day. Chief among them were disclosures on the company's next-gen RDNA 3 GPU architecture, Zen 4 CPU architecture and Phoenix Point laptop SoC. AMD's new RDNA 3 GPU architecture for Radeon graphics cards and mobile will be a chiplet-based design, much like the company's Ryzen CPU offering. AMD also confirmed that RDNA 3 GPUs would be fabricated on a 5nm process, likely TSMC N5. The company continued to note that an "optimized graphics pipeline" will enable yet higher clock rates, while the GPU's "rearchitected compute unit" will have ray-tracing performance improvements over RDNA 2 as well. AMD says that RDNA 3 GPUs are coming later this year, with RDNA 4 likely arriving in late 2023.

Meanwhile, AMD's Zen 4 is set to be the "world's first 5nm CPU," arriving later this year with an 8 to 10 percent instructions-per-clock (IPC) lift and a greater than 15 percent single-threaded performance gain. Zen 4 will also support DDR5, AVX-512 extensions for AI workloads and a massive 125 percent increase in memory bandwidth. AMD is claiming a 35% multithreaded performance lift for Zen 4.
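
Since single-threaded performance scales roughly with IPC times clock frequency, the gap between the two quoted figures implies a modest clock-speed bump. A small sketch of that inference (the roughly 9% IPC midpoint and the model itself are our assumptions):

```python
# Implied clock uplift from the quoted IPC and single-thread figures.
ipc_gain = 1.09            # midpoint of the 8-10% IPC uplift
single_thread_gain = 1.15  # ">15%" single-threaded gain, taken at 15%

implied_clock_gain = single_thread_gain / ipc_gain - 1
print(f"Implied clock uplift: {implied_clock_gain:.1%}")  # ~5.5%
```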

And, its Phoenix Point laptop platform SoC will be both Zen 4 and RDNA 3 infused. This is a first for AMD, since typically its laptop products' integrated graphics trail the company's current-gen GPU architecture by at least a generation. Phoenix Point is likely to arrive in the first half of 2023.

Games

'A Billion-Dollar Crypto Gaming Startup Promised Riches and Delivered Disaster' (bloomberg.com) 67

"Even many Axie regulars say it's not much fun, but that hasn't stopped people from dedicating hours to researching strategies, haunting Axie-themed Discord channels and Reddit forums, and paying for specialized software that helps them build stronger teams..."

Bloomberg pays a visit to the NFT-based game Axie Infinity with a 39-year-old player who's spent $40,000 there since last August — back when you could actually triple your money in a week. ("I was actually hoping that it could become my full-time job," he says.) The reason this is possible — or at least it seemed possible for a few weird months last year — is that Axie is tied to crypto markets. Players get a few Smooth Love Potion (SLP) tokens for each game they win and can earn another cryptocurrency, Axie Infinity Shards (AXS), in larger tournaments. The characters, themselves known as Axies, are nonfungible tokens, or NFTs, whose ownership is tracked on a blockchain, allowing them to be traded like a cryptocurrency as well....

Axie's creator, a startup called Sky Mavis Inc., heralded all this as a new kind of economic phenomenon: the "play-to-earn" video game. "We believe in a world future where work and play become one," it said in a mission statement on its website. "We believe in empowering our players and giving them economic opportunities. Welcome to our revolution." By last October the company, founded in Ho Chi Minh City, Vietnam, four years ago by a group of Asian, European, and American entrepreneurs, had raised more than $160 million from investors including the venture capital firm Andreessen Horowitz and the crypto-focused firm Paradigm, at a peak valuation of about $3 billion. That same month, Axie Infinity crossed 2 million daily users, according to Sky Mavis.

If you think the entire internet should be rebuilt around the blockchain — the vision now referred to as web3 — Axie provided a useful example of what this looked like in practice. Alexis Ohanian, co-founder of Reddit and an Axie investor, predicted that 90% of the gaming market would be play-to-earn within five years. Gabby Dizon, head of crypto gaming startup Yield Guild Games, describes Axie as a way to create an "investor mindset" among new populations, who would go on to participate in the crypto economy in other ways. In a livestreamed discussion about play-to-earn gaming and crypto on March 2, former Democratic presidential contender Andrew Yang called web3 "an extraordinary opportunity to improve the human condition" and "the biggest weapon against poverty that we have."

By the time Yang made his proclamations the Axie economy was deep in crisis. It had lost about 40% of its daily users, and SLP, which had traded as high as 40 cents, was at 1.8 cents, while AXS, which had once been worth $165, was at $56. To make matters worse, on March 23 hackers robbed Sky Mavis of what at the time was roughly $620 million in cryptocurrencies. Then in May the bottom fell out of the entire crypto market. AXS dropped below $20, and SLP settled in at just over half a penny. Instead of illustrating web3's utopian potential, Axie looked like validation for crypto skeptics who believe web3 is a vision that investors and early adopters sell people to get them to pour money into sketchy financial instruments while hackers prey on everyone involved.

The article does credit the company for building its own blockchain (Ronin) to provide cheaper and faster NFT transactions. "Purists might have taken issue with the decision to abandon the core blockchain precept of decentralization, but on the other hand, the game actually worked."

But the article also chronicles a fast succession of highs and lows:
  • "In Axie's biggest market, the Philippines, the average daily earnings from May to October 2021 for all but the lowest-ranked players were above minimum wage, according to the gaming research and consulting firm Naavik."
  • Axie raised $150 million to reimburse victims of the breach and repair its infrastructure. "But nearly two months later the systems compromised during the hack still weren't up and running, and the executives were vague about when everything would be repaired. (A company spokesperson said on June 3 that this could happen by midmonth, pending the results of an external audit....)"
  • Days after the breach it launched Axie: Origin, a new alternate version with better graphics/gameplay — and without a cryptocurrency element.
  • About 75% of the 39-year-old gamer's co-players have "largely" stopped playing the game. "But at least one was sufficiently seduced by Axie's potential to take a significant loan to buy AXS tokens, which he saw as a way to hedge against inflation of the Argentine peso. The local currency has indeed lost value since he took out the loan, but not nearly as much as AXS."

Thanks to long-time Slashdot reader Parker Lewis for sharing the article.


AMD

Apple's New MetalFX Upscaling System Will Compete With AMD FSR, Nvidia DLSS (arstechnica.com) 44

At this year's WWDC, Apple announced a surprising new system coming to its Metal 3 gaming API that may sound familiar to PC gamers: MetalFX Upscaling. Ars Technica reports: The system will leverage Apple's custom silicon to reconstruct video game graphics using lower-resolution source images so that games can run more efficiently at lower resolutions while looking higher-res. This "temporal reconstruction" system sounds similar to existing offerings from AMD (FidelityFX Super Resolution 2.0) and Nvidia (Deep Learning Super-Sampling), along with an upcoming "XeSS" system from Intel. Based on how the system is described, it will more closely resemble AMD's system, since Apple has yet to announce a way for MetalFX Upscaling to leverage its custom-made "Neural Engine" system.

By announcing this functionality for some of the world's most popular processors, Apple is arguably letting more game developers build their games and engines with image reconstruction -- even if MetalFX Upscaling isn't open source, unlike AMD's FSR 2.0 system. Still, these image reconstruction systems typically have temporal anti-aliasing (TAA) in common. So long as game devs keep that kind of anti-aliasing in mind with their games and engines, their titles will be more likely to take advantage of these systems and thus run more efficiently on a wide range of consoles, computers, and smartphones.
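
The efficiency argument is easy to quantify: shading cost scales roughly with the number of pixels actually rendered before reconstruction. A small sketch with an example internal resolution of our own choosing (Apple has not published MetalFX's render scales):

```python
# Why rendering low and reconstructing high saves work.
output = (3840, 2160)    # 4K output resolution
internal = (2560, 1440)  # assumed lower-resolution internal render

output_px = output[0] * output[1]
internal_px = internal[0] * internal[1]

print(f"Shaded pixels: {internal_px / output_px:.0%} of native 4K")  # ~44%
```
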
The report notes that Metal 3 also includes "a new 'resource-loading' API designed to streamline asset-loading processes in video games." The same Metal 3 API benefits will also come to iPadOS 16 later this year.
Apple

Apple's Finally Making the iPad More Like a Mac (For Multitasking, at Least) (cnet.com) 15

Apple brought its iPad tablet a bit closer to the Mac computers in spirit on Monday at WWDC 2022, announcing updates to its iPadOS 16 software that add better multitasking features. From a report: The new changes to the iPad represent another key shift to the device, aiming to advance the "pro" capabilities of Apple's tablets. While Apple's added to the power and capabilities of its iPads, the software has been criticized by many reviewers, including us at CNET, for not offering enough functionality. [...] Apple also has a collaborative workspace app called Freeform, coming later this year, that will work like a giant whiteboard. Invited collaborators can start adding stuff at the same time.

iPadOS 16 is also aiming to make better use of more advanced iPads that feature Apple's M1 chip. Metal 3 promises better graphics, but Apple's also aiming to add more desktop-like features in apps: Some will have customizable toolbars, and the Files app looks like it's finally getting a little more versatile for file management. M1 iPads are getting display scaling to create an effectively larger-feeling display, allowing more app screen space (but with smaller text and images). There's also free-form window resizing, along with external display support. Both features have been overdue on iPadOS. Stage Manager, a macOS feature that's coming later this year, is also coming to iPadOS. The result looks to be windows that can overlap and be different sizes, just like a Mac.

Operating Systems

Older iPads May Soon Be Able To Run Linux (arstechnica.com) 47

Older iPads with Apple A7- and A8-based chips may soon be able to run Linux. "Developer Konrad Dybcio and a Linux enthusiast going by "quaack723" have collaborated to get Linux kernel version 5.18 booting on an old iPad Air 2, a major feat for a device that was designed to never run any operating system other than Apple's," reports Ars Technica. From the report: The project appears to use an Alpine Linux-based distribution called "postmarketOS," a relatively small but actively developed distribution made primarily for Android devices. Dybcio used a "checkm8" hashtag in his initial tweet about the project, strongly implying that they used the "Checkm8" bootrom exploit published back in 2019 to access the hardware. For now, the developers only have Linux running on some older iPad hardware using A7 and A8-based chips -- this includes the iPad Air, iPad Air 2, and a few generations of iPad mini. But subsequent tweets imply that it will be possible to get Linux up and running on any device with an A7 or A8 in it, including the iPhone 5S and the original HomePod.

Development work on this latest Linux-on-iDevices effort is still in its early days. The photos that the developers shared both show a basic boot process that fails because it can't mount a filesystem, and Dybcio notes that basic things like USB and Bluetooth support aren't working. Getting networking, audio, and graphics acceleration all working properly will also be a tall order. But being able to boot Linux at all could draw the attention of other developers who want to help the project.

Compared to modern hardware with an Apple M1 chip, A7 and A8-powered devices wouldn't be great as general-purpose Linux machines. While impressive at the time, their CPUs and GPUs are considerably slower than modern Apple devices, and they all shipped with either 1GB or 2GB of RAM. But their performance still stacks up well next to the slow processors in devices like the Raspberry Pi 4, and most (though not all) A7 and A8 hardware has stopped getting new iOS and iPadOS updates from Apple at this point; Linux support could give some of these devices a second life as retro game consoles, simple home servers, or other things that low-power Arm hardware is good for.
Further reading: Linux For M1 Macs? First Alpha Release Announced for Asahi Linux
HP

HP Dev One Laptop Running System76's Ubuntu Linux-based Pop!_OS Now Available (betanews.com) 54

An anonymous reader shares a report: Last month, the open source community was abuzz with excitement following a shocking announcement from System76 that HP was planning to release a laptop running the Pop!_OS operating system. This was significant for several reasons, but most importantly, it was a huge win for Linux users as yet another hardware option was becoming available. Best of all, HP employees have been trained by System76 to offer high-quality customer support. If you aren't aware, System76 support is legendary.

At the time of the announcement, details about the hardware were a bit scarce, but I am happy to report we now have full system specifications for the 14-inch HP Dev One laptop. Most interestingly, there is only one configuration to be had. The developer-focused computer is powered by an octa-core AMD Ryzen 7 PRO 5850U APU which features integrated Radeon graphics. The notebook comes with 16GB RAM and 1TB of NVMe storage, both of which can be user-upgraded later if you choose.
The laptop is priced at $1,099.
Graphics

Linux 5.19 Adds 500K Lines of New Graphics Driver Code (phoronix.com) 79

UnknowingFool writes: The current Linux kernel in development, 5.19, added 495,793 new lines of code for graphics driver updates. David Airlie sent in the new lines as part of the Direct Rendering Manager (DRM) subsystem of Linux. The majority of additions were for AMD's RDNA and CDNA platforms, but Intel also submitted changes for its DG2 graphics. Updates also came from Qualcomm and MediaTek for their GPU offerings.
Hardware

Samsung Allegedly Assembling a 'Dream Team' To Take Down Apple's M1 In 2025 (neowin.net) 47

Samsung is rumored to be assembling a special task force dubbed "Dream Platform One team" tasked with designing a custom in-house Samsung mobile Application Processor (AP) that can take on Apple Silicon. Neowin reports: It's probably fair to say that Samsung hasn't had the best time with its Exynos offerings when compared against rivals like Qualcomm or Apple. To shake its fortunes up, the company also paired up with AMD for its Exynos 2200 GPU, and results were a mixed bag. Both the AMD RDNA 2 Xclipse 920 graphics and the Exynos 2200 CPU were found to be pretty disappointing in terms of power efficiency as they were not much better than the previous Exynos 2100 offering. In a nutshell, the new CPU was around 5% faster while the AMD graphics was around 17% better, both of which were clearly not enough (via TechAltar on Twitter). However, the company is looking to get real serious and down to business come 2025. The new report coincides with a separate report suggesting that Samsung was working on a custom chipset for its Galaxy S series. The downside is that it's not slated for 2025 and will obviously have to compete against whatever Apple offers at that time.
Technology

Knoxville Researcher Wins A.M. Turing Award (knoxnews.com) 18

schwit1 writes: It's a few weeks old, but ...

A local computer scientist and professor at the University of Tennessee at Knoxville has been named an A.M. Turing Award winner by the Association for Computing Machinery. The Turing Award is often referred to as the "Nobel Prize of computer science." It carries a million-dollar prize.

"Oh, it was a complete shock. I'm still recovering from it," Jack Dongarra told Knox News with a warm laugh. "It's nice to see the work being recognized in this way but it couldn't have happened without the support and contribution of many people over time." Chances are Dongarra's work has touched your life, even if you don't know it. If you've ever used a speech recognition program or looked at a weather forecast, you're using technology that relies on Dongarra's software libraries. Dongarra has held a joint appointment at the University of Tennessee and Oak Ridge National Laboratory since 1989. While he doesn't have a household name, his foundational work in computer science has undergirded the development of high-performance computers over the course of his 40-year career...

Dongarra developed software to allow computers to use multiple processors simultaneously, and this is basically how all computer systems work today. Your laptop has multiple processing cores and might have an additional graphics processing core. Many phones have multiple processing cores. "He's continually rethought how to exploit today's computer architectures and done so very effectively," said Nicholas Higham, a Royal Society research professor of applied mathematics at the University of Manchester. "He's come up with ideas so that we can get the very best out of these machines." Dongarra also developed software that allowed computers with different hardware and operating systems to run in parallel, networking distant machines as a single computation device. This lets people make more powerful computers out of many smaller devices, which helped develop cloud computing, running high-end applications over the internet. Most of Dongarra's work was published open source through a project called Netlib.
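
For a sense of how routinely this work gets used today, here is a minimal NumPy example; a typical NumPy build is linked against BLAS/LAPACK, the family of numerical libraries (LINPACK, LAPACK, ScaLAPACK) that Dongarra helped create and distribute through Netlib (the specific matrix and sizes here are just an illustration):

```python
# Solving a dense linear system; NumPy dispatches this to LAPACK routines,
# part of the software lineage Dongarra helped build and publish via Netlib.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((1000, 1000))
b = rng.standard_normal(1000)

x = np.linalg.solve(A, b)      # calls LAPACK's gesv under the hood
print(np.allclose(A @ x, b))   # True: the solution satisfies Ax = b
```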

Congratulations!


Upgrades

Hollywood Designer 6.0 Released: Now a 'Full-Blown Multimedia Authoring System' (amigans.net) 20

After nearly 20 years, Hollywood Designer 6.0 is "very stable and mature," its developers write, envisioning both hobbyist and professional users (with support for modern graphics-editing features like filter effects and vector graphics) for this massive new evolution.

Long-time Slashdot reader Mike Bouma explains: Airsoft Softwair has released Hollywood Designer 6.0, "a full-blown multimedia authoring system that runs on top of Hollywood and can be used to create all sorts of multimedia-based applications, for example presentations, slide shows, games, and applications. Thanks to Hollywood, all multimedia applications created using Hollywood Designer can be exported as stand-alone executables for the following systems: AmigaOS3, AmigaOS4, WarpOS, MorphOS, AROS, Windows, macOS, Linux, Android, and iOS."

The current version of Hollywood is v9.1 with various updated add-ons. To see earlier versions of Hollywood 9.0 & Designer 5.0 in action have a look at Kas1e's short demonstration on AmigaOS4 / AmigaOne X5000.

ISS

Boeing's Starliner Docks with International Space Station. Hatch Opening Now (nasa.gov) 59

Boeing's Starliner successfully docked to the International Space Station Friday night for the first time.

And right now, Boeing is beginning the official hatch-opening ceremony, in which the astronauts already aboard the ISS "open the hatch to the vehicle and retrieve some cargo that's packed inside," explains the Verge: NASA tasked Boeing with conducting an uncrewed flight demonstration of Starliner to show that the capsule can hit all of the major milestones it'll need to hit when it is carrying passengers... This mission is called OFT-2 since it's technically a do-over of a mission that Boeing attempted back in 2019, called OFT. During that flight, Starliner launched to space as planned, but a software glitch prevented the capsule from getting in the right orbit it needed to reach to rendezvous with the ISS. Boeing had to bring the vehicle home early, and the company never demonstrated Starliner's ability to dock with the ISS....

Using a series of sensors, the capsule autonomously guided itself onto an open docking port on the space station.... Docking occurred a little over an hour behind schedule, due to some issues with Starliner's graphics and docking ring, which were resolved ahead of the docking....

[Thursday] At 6:54PM ET, Starliner successfully launched to space on top of an Atlas V rocket, built and operated by the United Launch Alliance. Once Starliner separated from the Atlas V, it had to fire its own thrusters to insert itself into the proper orbit for reaching the space station. However, after that maneuver took place, Boeing and NASA revealed that two of the 12 thrusters Starliner uses for the procedure failed and cut off too early. The capsule's flight control system was able to kick in and rerouted to a working thruster, which helped get Starliner into a stable orbit.... Today, Boeing revealed that a drop in chamber pressure had caused the early cutoff of the thruster, but that system behaved normally during follow-up burns of the thrusters. And with redundancies on the spacecraft, the issue "does not pose a risk to the rest of the flight test," according to Boeing.

Boeing also noted today that the Starliner team is investigating some weird behavior of a "thermal cooling loop" but said that temperatures are stable on the spacecraft.

From the space station, NASA astronaut Bob Hines said the achievement "marks a great milestone towards providing additional commercial access to low Earth orbit, sustaining the ISS and enabling NASA's goal of returning humans to the Moon and eventually to Mars.

"Great accomplishments in human spaceflight are long remembered by history. Today will be no different."

A long-time Slashdot reader shares this schedule (EST):
5/20, 3:30 pm — Starliner docking with ISS.
5/21, 11:30 am — Safety checks completed. Hatches opened.
5/24, 12:00 pm — Starliner loading completed. Hatches closed.
5/25, 2:00 pm — Starliner undocking from ISS.
5/25, 5:45 pm — Coverage of Starliner landing begins.

Again, the streams will be broadcast at NASA Television. I don't know about any of you, but I know what I'm doing this weekend.

Lord of the Rings

EA Plans Free Mobile 'Lord of the Rings' Game (cnet.com) 35

Electronic Arts and Middle-earth Enterprises "announced on Monday an upcoming free mobile game called The Lord of the Rings: Heroes of Middle-earth," reports CNET: With the role-playing game, Lord of the Rings fans can look forward to experiencing the iconic universe in a whole new way.... The game will feature immersive storytelling with iconic plot lines, turn-based combat and a selection of characters from both The Lord of the Rings and The Hobbit to battle the evils of Middle-earth.

"The team is filled with fans of The Lord of the Rings and The Hobbit and each day they bring their tremendous passion and talents together to deliver an authentic experience for players," Malachi Boyle, vice president of mobile RPG for Electronic Arts, said in a statement. "The combination of high-fidelity graphics, cinematic animations, and stylized art immerses players in the fantasy of Middle-earth where they'll go head-to-head with their favorite characters."

Graphics

Report: 'Nvidia's LHR Limiter Has Fallen, But Gamers Shouldn't Worry' (tomshardware.com) 46

Slashdot reader Hmmmmmm shared this report from Tom's Hardware: When Nvidia launched its Ampere Lite Hash Rate (LHR) graphics cards with the feared Ethereum anti-mining limiter, the world knew it was only a matter of time before someone or a team cracked it. NiceHash, the company that designed the QuickMiner software and Excavator miner, has finally broken Nvidia's algorithm, restoring LHR graphics cards to their 100% Ethereum mining performance....

Graphics card pricing has been plummeting, and we're starting to see better availability at retailers, with some GPUs selling at or below Manufacturer Suggested Retail Price. So QuickMiner's arrival shouldn't influence the current state of the graphics market unless big corporations want to buy out everything in sight for the last push before Ethereum's transition to Proof-of-Stake (PoS), often referred to as "The Merge," is complete. We see that as unlikely, considering current profitability even on a 3080 Ti sits at around $3.50 per day and would still need nearly a year to break even at current rates. Initially scheduled for June, The Merge won't finalize until "the few months after," as Ethereum developer Tim Beiko has expressed on Twitter.
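
The "nearly a year to break even" figure checks out against the quoted profitability, assuming a card bought at the GeForce RTX 3080 Ti's $1,199 launch MSRP and constant earnings (both assumptions are ours; real prices and profitability vary):

```python
# Break-even time at the quoted ~$3.50/day mining profit.
card_price_usd = 1199        # assumed purchase price (3080 Ti launch MSRP)
profit_per_day_usd = 3.50    # profitability figure quoted in the report

days_to_break_even = card_price_usd / profit_per_day_usd
print(f"{days_to_break_even:.0f} days")  # ~343 days, close to a year
```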

It will be interesting to see if Nvidia responds to this with updated drivers or implements LHRv3 in the remaining GPUs. However, it's perhaps not worth the effort at this point, and all existing LHRv2 and earlier cards can just stay on current drivers for optimized mining performance.

Open Source

Nvidia Transitioning To Official, Open-Source Linux GPU Kernel Driver (phoronix.com) 102

Nvidia is publishing its Linux GPU kernel modules as open source and will be maintaining them moving forward. Phoronix's Michael Larabel reports: To much excitement and a sign of the times, the embargo has just expired on this super-exciting milestone that many of us have been hoping to see for many years. Over the past two decades NVIDIA has offered great Linux driver support with their proprietary driver stack, but with the success of AMD's open-source driver effort going on for more than a decade, many have been calling for NVIDIA to open up their drivers. Their user-space software is remaining closed-source, but as of today they have formally opened up their Linux GPU kernel modules and will be maintaining them moving forward. [...] This isn't limited to just Tegra; it spans their desktop graphics and is already production-ready for data center GPU usage.
United States

SEC Charges NVIDIA with Inadequate Disclosures about Impact of Cryptomining (sec.gov) 31

The Securities and Exchange Commission today announced settled charges against NVIDIA for inadequate disclosures concerning the impact of cryptomining on the company's gaming business. From an SEC press release: The SEC's order finds that, during consecutive quarters in NVIDIA's fiscal year 2018, the company failed to disclose that cryptomining was a significant element of its material revenue growth from the sale of its graphics processing units (GPUs) designed and marketed for gaming. Cryptomining is the process of obtaining crypto rewards in exchange for verifying crypto transactions on distributed ledgers. As demand for and interest in crypto rose in 2017, NVIDIA customers increasingly used its gaming GPUs for cryptomining. In two of its Forms 10-Q for its fiscal year 2018, NVIDIA reported material growth in revenue within its gaming business. NVIDIA had information, however, that this increase in gaming sales was driven in significant part by cryptomining. Despite this, NVIDIA did not disclose in its Forms 10-Q, as it was required to do, these significant earnings and cash flow fluctuations related to a volatile business for investors to ascertain the likelihood that past performance was indicative of future performance. The SEC's order also finds that NVIDIA's omissions of material information about the growth of its gaming business were misleading given that NVIDIA did make statements about how other parts of the company's business were driven by demand for crypto, creating the impression that the company's gaming business was not significantly affected by cryptomining.
