


Knoxville Researcher Wins A.M. Turing Award (knoxnews.com) 18
A local computer scientist and professor at the University of Tennessee at Knoxville has been named an A.M. Turing Award winner by the Association for Computing Machinery. The Turing Award is often referred to as the "Nobel Prize of computer science." It carries a $1 million prize.
"Oh, it was a complete shock. I'm still recovering from it," Jack Dongarra told Knox News with a warm laugh. "It's nice to see the work being recognized in this way but it couldn't have happened without the support and contribution of many people over time." Chances are Dongarra's work has touched your life, even if you don't know it. If you've ever used a speech recognition program or looked at a weather forecast, you're using technology that relies on Dongarra's software libraries. Dongarra has held a joint appointment at the University of Tennessee and Oak Ridge National Laboratory since 1989. While he doesn't have a household name, his foundational work in computer science has undergirded the development of high-performance computers over the course of his 40-year career...
Dongarra developed software to allow computers to use multiple processors simultaneously, and this is basically how all computer systems work today. Your laptop has multiple processing cores and might have an additional graphics processing core. Many phones have multiple processing cores. "He's continually rethought how to exploit today's computer architectures and done so very effectively," said Nicholas Higham, a Royal Society research professor of applied mathematics at the University of Manchester. "He's come up with ideas so that we can get the very best out of these machines." Dongarra also developed software that allowed computers with different hardware and operating systems to run in parallel, networking distant machines as a single computation device. This lets people make more powerful computers out of many smaller devices, which helped pave the way for cloud computing: running high-end applications over the internet. Most of Dongarra's work was published open-source through a project called Netlib.
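To give a sense of how that work shows up in everyday software, here is a minimal sketch in Python. It assumes a standard NumPy install whose linear-algebra routines are backed by a BLAS/LAPACK library, the line of numerical software Dongarra helped create; solving a linear system from Python ultimately lands in one of those routines:

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((1000, 1000))
    b = rng.standard_normal(1000)

    x = np.linalg.solve(A, b)        # dispatches to a LAPACK solver under the hood
    print(np.allclose(A @ x, b))     # sanity check: True
    np.show_config()                 # reports which BLAS/LAPACK backend is linked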
Congratulations!

Hollywood Designer 6.0 Released: Now a 'Full-Blown Multimedia Authoring System' (amigans.net) 20
Long-time Slashdot reader Mike Bouma explains: Airsoft Softwair has released Hollywood Designer 6.0, "a full-blown multimedia authoring system that runs on top of Hollywood and can be used to create all sorts of multimedia-based applications, for example presentations, slide shows, games, and applications. Thanks to Hollywood, all multimedia applications created using Hollywood Designer can be exported as stand-alone executables for the following systems: AmigaOS3, AmigaOS4, WarpOS, MorphOS, AROS, Windows, macOS, Linux, Android, and iOS."
The current version of Hollywood is v9.1, with various updated add-ons. To see the earlier Hollywood 9.0 & Designer 5.0 in action, have a look at Kas1e's short demonstration on AmigaOS4 / AmigaOne X5000.

Boeing's Starliner Docks with International Space Station. Hatch Opening Now (nasa.gov) 59
And right now, Boeing is beginning the official hatch-opening ceremony, in which the astronauts already aboard the ISS "open the hatch to the vehicle and retrieve some cargo that's packed inside," explains the Verge: NASA tasked Boeing with conducting an uncrewed flight demonstration of Starliner to show that the capsule can hit all of the major milestones it'll need to hit when it is carrying passengers... This mission is called OFT-2 since it's technically a do-over of a mission that Boeing attempted back in 2019, called OFT. During that flight, Starliner launched to space as planned, but a software glitch prevented the capsule from reaching the orbit it needed to rendezvous with the ISS. Boeing had to bring the vehicle home early, and the company never demonstrated Starliner's ability to dock with the ISS....
Using a series of sensors, the capsule autonomously guided itself onto an open docking port on the space station.... Docking occurred a little over an hour behind schedule, due to some issues with Starliner's graphics and docking ring, which were resolved ahead of the docking....
[Thursday] At 6:54PM ET, Starliner successfully launched to space on top of an Atlas V rocket, built and operated by the United Launch Alliance. Once Starliner separated from the Atlas V, it had to fire its own thrusters to insert itself into the proper orbit for reaching the space station. However, after that maneuver took place, Boeing and NASA revealed that two of the 12 thrusters Starliner uses for the procedure failed and cut off too early. The capsule's flight control system was able to kick in and reroute to a working thruster, which helped get Starliner into a stable orbit.... Today, Boeing revealed that a drop in chamber pressure had caused the early cutoff of the thruster, but that the system behaved normally during follow-up burns of the thrusters. And with redundancies on the spacecraft, the issue "does not pose a risk to the rest of the flight test," according to Boeing.
Boeing also noted today that the Starliner team is investigating some weird behavior of a "thermal cooling loop" but said that temperatures are stable on the spacecraft.
From the space station, NASA astronaut Bob Hines said the achievement "marks a great milestone towards providing additional commercial access to low Earth orbit, sustaining the ISS and enabling NASA's goal of returning humans to the Moon and eventually to Mars.
"Great accomplishments in human spaceflight are long remembered by history. Today will be no different."
A long-time Slashdot reader shares this schedule (EST): 5/20, 3:30 pm — Starliner docking with ISS.
5/21, 11:30 am — Safety checks completed. Hatches opened.
5/24, 12:00 pm — Starliner loading completed. Hatches closed.
5/25, 2:00 pm — Starliner undocking from ISS.
5/25, 5:45 pm — Coverage of Starliner landing begins.
Again, the streams will be broadcast on NASA Television. I don't know about any of you, but I know what I'm doing this weekend.

EA Plans Free Mobile 'Lord of the Rings' Game (cnet.com) 35
"The team is filled with fans of The Lord of the Rings and The Hobbit and each day they bring their tremendous passion and talents together to deliver an authentic experience for players," Malachi Boyle, vice president of mobile RPG for Electronic Arts, said in a statement. "The combination of high-fidelity graphics, cinematic animations, and stylized art immerses players in the fantasy of Middle-earth where they'll go head-to-head with their favorite characters."

Report: 'Nvidia's LHR Limiter Has Fallen, But Gamers Shouldn't Worry' (tomshardware.com) 46
Graphics card pricing has been plummeting, and we're starting to see better availability at retailers, with some GPUs selling at or below manufacturer's suggested retail price (MSRP). So QuickMiner's arrival shouldn't influence the current state of the graphics market unless big corporations want to buy out everything in sight for the last push before Ethereum's transition to Proof-of-Stake (PoS), often referred to as "The Merge," is complete. We see that as unlikely, considering current profitability even on a 3080 Ti sits at around $3.50 per day, meaning the card would still need nearly a year to break even at current rates. Initially scheduled for June, The Merge won't finalize until "the few months after," as Ethereum developer Tim Beiko has said on Twitter.
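For context, that break-even claim is straightforward arithmetic. A back-of-the-envelope sketch in Python, assuming a card bought near its roughly $1,200 MSRP and ignoring electricity costs (which would only stretch the payback period further):

    # Rough break-even estimate for mining on an RTX 3080 Ti.
    # Assumptions: card bought near its ~$1,200 MSRP; electricity ignored.
    card_price = 1200.00    # dollars
    daily_profit = 3.50     # dollars per day, per the figure cited above

    days_to_break_even = card_price / daily_profit
    print(round(days_to_break_even))    # ~343 days, i.e. nearly a year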
It will be interesting to see if Nvidia responds to this with updated drivers or implements LHRv3 in the remaining GPUs. However, it's perhaps not worth the effort at this point, and all existing LHRv2 and earlier cards can just stay on current drivers for optimized mining performance.

Nvidia Transitioning To Official, Open-Source Linux GPU Kernel Driver (phoronix.com) 102

SEC Charges NVIDIA with Inadequate Disclosures about Impact of Cryptomining (sec.gov) 31

AMD Doubles the Number of CPU Cores It Offers In Chromebooks (arstechnica.com) 23
On their own, the chips aren't that exciting. They seemingly offer similar performance to the already-released Ryzen 5000 U-series chips. The Ryzen 5000 C-series also uses years-old Vega integrated graphics rather than the upgraded RDNA 2 found in Ryzen 6000 mobile chips, which, upon release, AMD said are "up to 2.1 times faster." But for someone who's constantly pushing their Chromebook to do more than just open a Chrome tab or two, the chips promise better performance than what's currently available.

OpenBSD 7.1 Released with Support for Apple M1, Improvements for ARM64 and RISC-V (openbsd.org) 26
Phoronix calls it "the newest version of this popular, security-minded BSD operating system." With OpenBSD 7.1, the Apple Silicon support is now considered "ready for general use" with keypad/touchpad support for M1 laptops, a power management controller driver added, I2C and SPI controller drivers, and a variety of other driver additions for supporting the Apple Silicon hardware.
OpenBSD 7.1 also has a number of other improvements benefiting the 64-bit ARM (ARM64) and RISC-V architectures, along with SMP kernel improvements, support for futexes with shared anonymous memory, and more. On the graphics front, the Linux DRM code has been updated to the state found in Linux 5.15.26, and Intel Elkhart Lake / Jasper Lake / Rocket Lake support is now enabled.
The Register notes OpenBSD now "supports a surprisingly wide range of hardware: x86-32, x86-64, Armv7, Arm64, DEC Alpha, HP PA-RISC, Hitachi SH4, Motorola 88000, MIPS64, SPARC64, RISC-V 64, and both Apple PowerPC and IBM POWER." The Register's FOSS desk ran up a copy in VirtualBox, and we were honestly surprised how quick and easy it was. By saying "yes" to everything, it automatically partitioned the VM's disk into a rather complex array of nine slices, installed the OS, a boot loader, an X server and display manager, plus the FVWM window manager. After a reboot, we got a graphical login screen and then a rather late-1980s Motif-style desktop with an xterm.
It was easy to install XFCE, which let us set the screen resolution and other modern niceties, and there are also KDE, GNOME, and other pretty front-ends, plus plenty of familiar tools such as Mozilla apps, LibreOffice and so on....
We were expecting to have to do a lot more work. Yes, OpenBSD is a niche OS, but the project gave the world OpenSSH, LibreSSL, the PF firewall as used in macOS, much of Android's Bionic C library, and more besides.... In a world of multi-gigabyte OSes, it's quite refreshing. It felt like stepping back into the early 1990s, the era of Real Unix, when you had to put in some real effort and learn stuff in order to bend the OS to your will — but in return, you got something relatively bulletproof.

Nvidia and AMD GPUs Are Returning To Shelves and Prices Are Finally Falling (theverge.com) 78
Just as importantly, some graphics cards are actually staying in stock at retailers when their prices are too high -- again, something that sounds perfectly normal but that we haven't seen in a while. For many months, boutiques like my local retailer Central Computers could only afford to sell you a GPU as part of a big PC bundle, but now it's making every card available on its own. GameStop is selling a Radeon RX 6600 for just $10 over MSRP, and it hasn't yet sold out. Newegg has also continually been offering an RTX 3080 Ti for just $10 over MSRP (after rebate, too) -- even if $1,200 still seems high for that card's level of performance.

Playdate, the Pocket-Sized Game Console With a Crank, Begins Shipping (oregonlive.com) 28
Panic is a software company, not an electronics manufacturer, and its first foray into computer hardware encountered a string of problems -- exacerbated by the pandemic and the resulting global shortage in computer chips. Most recently, Panic announced last November that many of its first 5,000 Playdates had faulty batteries. The company responded by sending them all back to its manufacturing contractor in Malaysia for replacement with new batteries from a different supplier.
Playdate fielded 20,000 orders in just 20 minutes when the first gadgets went on sale last July. And despite the delays, initial reviews Monday were very enthusiastic [...]. All the reviews noted, though, that Panic is a long way from untangling its production snarls. Only the first orders are going out now -- thousands more Playdates are coming sometime later, though Panic hasn't said just when. There's also good news for DIYers: iFixit's teardown says the gaming system is relatively easy to fix if you ever need to replace its battery or buttons.

Razer's First Linux Laptop Called 'Sexy' - But It's Not for Gamers (theverge.com) 45
And here's how Ars Technica describes the Razer x Lambda Tensorbook (announced Tuesday): Made in collaboration with Lambda, the Linux-based clamshell focuses on deep-learning development. Lambda, which has been around since 2012, is a deep-learning infrastructure provider used by the US Department of Defense and "97 percent of the top research universities in the US," according to the company's announcement. Lambda's offerings include GPU clusters, servers, workstations, and cloud instances that train neural networks for various use cases, including self-driving cars, cancer detection, and drug discovery.
Dubbed "The Deep Learning Laptop," the Tensorbook has an Nvidia RTX 3080 Max-Q (16GB) and targets machine-learning engineers, especially those who lack a laptop with a discrete GPU and thus have to share a remote machine's resources, which negatively affects development.... "When you're stuck SSHing into a remote server, you don't have any of your local data or code and even have a hard time demoing your model to colleagues," Lambda co-founder and CEO Stephen Balaban said in a statement, noting that the laptop comes with PyTorch and TensorFlow for quickly training and demoing models from a local GUI interface without SSH. Lambda isn't a laptop maker, so it recruited Razer to build the machine....
While there are more powerful laptops available, the Tensorbook stands out because of its software package and Ubuntu Linux 20.04 LTS.
The Verge writes: While Razer currently offers faster CPU, GPU and screens in today's Blade lineup, it's not necessarily a bad deal if you love the design, considering how pricey Razer's laptops can be. But we've generally found that Razer's thin machines run quite hot in our reviews, and the Blade in question was no exception even with a quarter of the memory and a less powerful RTX 3060 GPU. Lambda's FAQ page does not address heat as of today.
Lambda is clearly aiming this one at prospective MacBook Pro buyers, and I don't just say that because of the silver tones. The primary hardware comparison the company touts is a 4x speedup over Apple's M1 Max in a 16-inch MacBook Pro when running TensorFlow.
Specifically, Lambda's web site claims the new laptop "delivers model training performance up to 4x faster than Apple's M1 Max, and up to 10x faster than Google Colab instances." And it credits this to the laptop's use of NVIDIA's GeForce RTX 3080 Max-Q 16GB GPU, adding that NVIDIA GPUs "are the industry standard for parallel processing, ensuring leading performance and compatibility with all machine learning frameworks and tools."
"It looks like a fine package and machine, but pricing starts at $3,499," notes Hot Hardware, adding "There's a $500 up-charge to have it configured to dual-boot Windows 10."
The Verge speculates on what this might portend for the future. "Perhaps the recently renewed interest in Linux gaming, driven by the Steam Deck, will push Razer to consider Linux for its own core products as well."

How Apple's Monster M1 Ultra Chip Keeps Moore's Law Alive 109
Apple's new chip is all about increasing overall processing power. "Depending on how you define Moore's law, this approach allows you to create systems that engage many more transistors than what fits on one chip," says Jesus del Alamo, a professor at MIT who researches new chip components. He adds that it is significant that TSMC, at the cutting edge of chipmaking, is looking for new ways to keep performance rising. "Clearly, the chip industry sees that progress in the future is going to come not only from Moore's law but also from creating systems that could be fabricated by different technologies yet to be brought together," he says. "Others are doing similar things, and we certainly see a trend towards more of these chiplet designs," adds Linley Gwennap, author of the Microprocessor Report, an industry newsletter. The rise of modular chipmaking might help boost the performance of future devices, but it could also change the economics of chipmaking. Without Moore's law, a chip with twice the transistors may cost twice as much. "With chiplets, I can still sell you the base chip for, say, $300, the double chip for $600, and the uber-double chip for $1,200," says Todd Austin, an electrical engineer at the University of Michigan.

AMD Confirms Its GPU Drivers Are Overclocking CPUs Without Asking (tomshardware.com) 73
Overclocking a Ryzen CPU requires the software to manipulate the BIOS settings, just as we see with other software overclocking utilities. For AMD, this can mean simply engaging the auto-overclocking Precision Boost Overdrive (PBO) feature. This feature does all the dirty work, like adjusting voltages and frequency on the fly, to give you a one-click automatic overclock. However, applying a GPU profile in the AMD driver can now inexplicably alter the BIOS settings to enable automatic overclocking. This is problematic because of the potential ill effects of overclocking -- in fact, overclocking a Ryzen CPU automatically voids the warranty. AMD's software typically requires you to click a warning to acknowledge that you understand the risks associated with overclocking, and that it voids your warranty, before it allows you to overclock the system. Unfortunately, that isn't happening here. Until AMD issues a fix, "users have taken to using the Radeon Software Slimmer to delete the Ryzen Master SDK from the GPU driver, thus preventing any untoward changes to the BIOS settings," adds Tom's Hardware.

Epic's Unreal Engine 5 Has Officially Launched (axios.com) 60
Epic workers demonstrated how the engine could be used to make and tweak modern games. Then came the key slide showing dozens of partners, including PlayStation, Xbox, and Tencent. That was followed by testimonials from recent Unreal Engine converts such as CD Projekt RED, which had previously used its own tech to make games in the Witcher and Cyberpunk franchises, and by the kicker that Crystal Dynamics, another studio that long operated its own in-house engine, would use Unreal on its next Tomb Raider game. More details at Kotaku.

Intel Beats AMD and Nvidia with Arc GPU's Full AV1 Support (neowin.net) 81
But AV1 even turned up in Intel's announcement this week of the Arc A-series, a new line of discrete GPUs, Neowin reports: Intel has been quick to respond, becoming the first GPU hardware vendor to offer full AV1 support on its newly launched Arc GPUs. While AMD and Nvidia both offer AV1 decoding with their newest GPUs, neither supports AV1 encoding.
Intel says that hardware encoding of AV1 on its new Arc GPUs is 50 times faster than software-only solutions. It adds that AV1 encoding with Arc is 20% more efficient than HEVC. With this feature, Intel hopes to capture at least some of the streaming and video-editing market made up of users looking for a more robust AV1 encoding solution than CPU-based software approaches.
From Intel's announcement: Intel Arc A-Series GPUs are the first in the industry to offer full AV1 hardware acceleration, including both encode and decode, delivering faster video encode and higher quality streaming while consuming the same internet bandwidth. We've worked with industry partners to ensure that AV1 support is available today in many of the most popular media applications, with broader adoption expected this year. The AV1 codec will be a game changer for the future of video encoding and streaming.

Intel Enters Discrete GPU Market With Launch of Arc A-Series For Laptops (hothardware.com) 23
For example, Arc A370M arrives today with 8 Xe cores, 8 ray tracing units, 4GB of GDDR6 memory linked to a 64-bit memory bus, and a 1,550MHz graphics clock. Graphics power is rated at 35-50W. However, Arc A770M, Intel's highest-end mobile GPU, will come with 32 Xe cores, 32 ray tracing units, 16GB of GDDR6 memory over a 256-bit interface, and a 1,650MHz graphics clock. Doing the math, Arc A770M could be up to 4X more powerful than Arc A370M. In terms of performance, Intel showcased benchmarks from a laptop outfitted with a Core i7-12700H processor and Arc A370M GPU that can top the 60 FPS threshold at 1080p in many games where integrated graphics could come up far short. Examples included Doom Eternal (63 fps) at high quality settings, and Hitman 3 (62 fps) and Destiny 2 (66 fps) at medium settings. Intel is also showcasing new innovations for content creators, with its Deep Link, Hyper Encode and AV1 video compression support offering big gains in video upscaling, encoding and streaming. Finally, Intel Arc Control software will offer unique features like Smooth Sync, which blends tearing artifacts when V-Sync is turned off, as well as Creator Studio with background blur, frame tracking, and broadcast features for direct support of game streaming services.

The Untold Story of the Creation of GIF At CompuServe In 1987 (fastcompany.com) 43
GIF came to be because online services such as CompuServe were getting more graphical, but the computer makers of the time — such as Apple, Commodore, and IBM — all had their own proprietary image types. "We didn't want to have to put up images in 79 different formats," explains Trevor. CompuServe needed one universal graphics format.
Even though the World Wide Web and digital cameras were still in the future, work was already underway on the image format that came to be known as JPEG. But it wasn't optimized for CompuServe's needs: For example, stock charts and weather graphics didn't render crisply. So Trevor asked Wilhite to create an image file type that looked good and downloaded quickly at a time when a 2,400 bits-per-second dial-up modem was considered torrid. Reading a technical journal, Wilhite came across a discussion of an efficient compression technique known as LZW for its creators — Abraham Lempel, Jacob Ziv, and Terry Welch. It turned out to be an ideal foundation for what CompuServe was trying to build, and allowed GIF to pack a lot of image information into as few bytes as possible. (Much later, computing giant Unisys, which gained a patent for LZW, threatened companies that used it with lawsuits, leading to a licensing agreement with CompuServe and the creation of the patent-free PNG image format.)
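For the curious, the core LZW idea is small enough to sketch. Below is a simplified compressor in Python; real GIF encoding adds variable-width codes, clear/end codes, and palette handling on top of this, so treat it as an illustration of the dictionary-coding trick rather than a GIF writer:

    # Minimal LZW compression sketch: emit codes for the longest strings seen so far,
    # growing the dictionary as the input is scanned.
    def lzw_compress(data: bytes) -> list[int]:
        dictionary = {bytes([i]): i for i in range(256)}   # all single-byte strings
        next_code = 256
        current = b""
        output = []
        for byte in data:
            candidate = current + bytes([byte])
            if candidate in dictionary:
                current = candidate                  # keep extending the match
            else:
                output.append(dictionary[current])   # emit code for longest match
                dictionary[candidate] = next_code    # learn the new string
                next_code += 1
                current = bytes([byte])
        if current:
            output.append(dictionary[current])
        return output

    data = b"TOBEORNOTTOBEORTOBEORNOT"
    codes = lzw_compress(data)
    print(len(data), "input bytes ->", len(codes), "codes")

Repeated substrings collapse into single dictionary codes, which is exactly why the scheme worked well on the flat-colored charts and weather graphics CompuServe cared about.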
GIF officially debuted on June 15, 1987. "It met my requirements, and it was extremely useful for CompuServe," says Trevor....
GIF was also versatile, offering the ability to store multiple pictures in one file, which made it handy for creating mini-movies as well as static images. And it spread beyond CompuServe, showing up in Mosaic, the first graphical web browser, and then in Netscape Navigator. The latter browser gave GIFs the ability to run in an infinite loop, a crucial feature that only added to their hypnotic quality. Seeing cartoon hamsters dance for a split second is no big whoop, but watching them shake their booties endlessly was just one of many cultural moments that GIFs have given us.

Stephen Wilhite, Creator of the GIF, Has Died (theverge.com) 128
Although GIFs are synonymous with animated internet memes these days, that wasn't the reason Wilhite created the format. CompuServe introduced them in the late 1980s as a way to distribute "high-quality, high-resolution graphics" in color at a time when internet speeds were glacial compared to what they are today. "He invented GIF all by himself -- he actually did that at home and brought it into work after he perfected it," Kathaleen said. "He would figure out everything privately in his head and then go to town programming it on the computer."
If you want to go more in-depth into the history of the GIF, the Daily Dot has a good explainer of how the format became an internet phenomenon. In 2013, Wilhite weighed in on the long-standing debate about the correct pronunciation of the image format. He told The New York Times, "The Oxford English Dictionary accepts both pronunciations. They are wrong. It is a soft 'G,' pronounced 'jif.' End of story."