Hardware

Samsung Allegedly Assembling a 'Dream Team' To Take Down Apple's M1 In 2025 (neowin.net) 47

Samsung is rumored to be assembling a special task force, dubbed the "Dream Platform One team," tasked with designing a custom in-house Samsung mobile Application Processor (AP) that can take on Apple Silicon. Neowin reports: It's probably fair to say that Samsung hasn't had the best time with its Exynos offerings when compared against rivals like Qualcomm or Apple. To shake its fortunes up, the company also paired up with AMD for its Exynos 2200 GPU, and the results were a mixed bag. Both the AMD RDNA 2 Xclipse 920 graphics and the Exynos 2200 CPU were found to be pretty disappointing in terms of power efficiency, as they were not much better than the previous Exynos 2100 offering. In a nutshell, the new CPU was around 5% faster while the AMD graphics were around 17% better, both of which were clearly not enough (via TechAltar on Twitter). However, the company is reportedly looking to get serious and down to business come 2025. The new report coincides with a separate report suggesting that Samsung is working on a custom chipset for its Galaxy S series. The downside is that the chip isn't slated to arrive before 2025, and it will have to compete against whatever Apple offers at that time.
Technology

Knoxville Researcher Wins A.M. Turing Award (knoxnews.com) 18

schwit1 writes: It's a few weeks old, but ...

A local computer scientist and professor at the University of Tennessee at Knoxville has been named an A.M. Turing Award winner by the Association for Computing Machinery. The Turing Award is often referred to as the "Nobel Prize of computer science." It carries a $1 million prize.

"Oh, it was a complete shock. I'm still recovering from it," Jack Dongarra told Knox News with a warm laugh. "It's nice to see the work being recognized in this way but it couldn't have happened without the support and contribution of many people over time." Chances are Dongarra's work has touched your life, even if you don't know it. If you've ever used a speech recognition program or looked at a weather forecast, you're using technology that relies on Dongarra's software libraries. Dongarra has held a joint appointment at the University of Tennessee and Oak Ridge National Laboratory since 1989. While he doesn't have a household name, his foundational work in computer science has undergirded the development of high-performance computers over the course of his 40-year career...

Dongarra developed software to allow computers to use multiple processors simultaneously, which is basically how all computer systems work today. Your laptop has multiple processing cores and might have an additional graphics processing core. Many phones have multiple processing cores. "He's continually rethought how to exploit today's computer architectures and done so very effectively," said Nicholas Higham, a Royal Society research professor of applied mathematics at the University of Manchester. "He's come up with ideas so that we can get the very best out of these machines." Dongarra also developed software that allowed computers with different hardware and operating systems to run in parallel, networking distant machines into a single computation device. This lets people make more powerful computers out of many smaller devices, which helped pave the way for cloud computing: running high-end applications over the internet. Most of Dongarra's work was published open source through a project called Netlib.
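
A concrete, if modest, illustration of how this work surfaces in everyday tooling: NumPy's linear-algebra routines dispatch to a BLAS/LAPACK implementation (library interfaces Dongarra helped create and standardize), and a multithreaded BLAS such as OpenBLAS or Intel MKL spreads the work across a machine's cores automatically. The snippet below is a minimal sketch of that idea, not anything taken from Dongarra's own code:

    import numpy as np

    # Solve a dense linear system Ax = b. NumPy hands this to a LAPACK
    # routine (gesv) in whatever BLAS/LAPACK build it was linked against;
    # a multithreaded BLAS will use several CPU cores for large N.
    rng = np.random.default_rng(0)
    N = 2000
    A = rng.standard_normal((N, N))
    b = rng.standard_normal(N)

    x = np.linalg.solve(A, b)

    print("residual:", np.linalg.norm(A @ x - b))
    np.show_config()  # reports which BLAS/LAPACK libraries this build uses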

Congratulations!


Upgrades

Hollywood Designer 6.0 Released: Now a 'Full-Blown Multimedia Authoring System' (amigans.net) 20

After nearly 20 years of development, Hollywood Designer 6.0 is "very stable and mature," write its developers, who envision both hobbyist and professional users for this massive new evolution, with support for modern graphics-editing features like filter effects and vector graphics.

Long-time Slashdot reader Mike Bouma explains: Airsoft Softwair has released Hollywood Designer 6.0, "a full-blown multimedia authoring system that runs on top of Hollywood and can be used to create all sorts of multimedia-based applications, for example presentations, slide shows, games, and applications. Thanks to Hollywood, all multimedia applications created using Hollywood Designer can be exported as stand-alone executables for the following systems: AmigaOS3, AmigaOS4, WarpOS, MorphOS, AROS, Windows, macOS, Linux, Android, and iOS."

The current version of Hollywood is v9.1 with various updated add-ons. To see earlier versions of Hollywood 9.0 & Designer 5.0 in action have a look at Kas1e's short demonstration on AmigaOS4 / AmigaOne X5000.

ISS

Boeing's Starliner Docks with International Space Station. Hatch Opening Now (nasa.gov) 59

Boeing's Starliner successfully docked to the International Space Station Friday night for the first time.

And right now, Boeing is beginning the official hatch-opening ceremony, in which the astronauts already aboard the ISS "open the hatch to the vehicle and retrieve some cargo that's packed inside," explains the Verge: NASA tasked Boeing with conducting an uncrewed flight demonstration of Starliner to show that the capsule can hit all of the major milestones it'll need to hit when it is carrying passengers... This mission is called OFT-2 since it's technically a do-over of a mission that Boeing attempted back in 2019, called OFT. During that flight, Starliner launched to space as planned, but a software glitch prevented the capsule from reaching the orbit it needed to rendezvous with the ISS. Boeing had to bring the vehicle home early, and the company never demonstrated Starliner's ability to dock with the ISS....

Using a series of sensors, the capsule autonomously guided itself onto an open docking port on the space station.... Docking occurred a little over an hour behind schedule, due to some issues with Starliner's graphics and docking ring, which were resolved ahead of the docking....

[Thursday] At 6:54PM ET, Starliner successfully launched to space on top of an Atlas V rocket, built and operated by the United Launch Alliance. Once Starliner separated from the Atlas V, it had to fire its own thrusters to insert itself into the proper orbit for reaching the space station. However, after that maneuver took place, Boeing and NASA revealed that two of the 12 thrusters Starliner uses for the procedure failed and cut off too early. The capsule's flight control system was able to kick in and reroute to a working thruster, which helped get Starliner into a stable orbit.... Today, Boeing revealed that a drop in chamber pressure had caused the early cutoff of the thruster, but said the system behaved normally during follow-up burns of the thrusters. And with redundancies on the spacecraft, the issue "does not pose a risk to the rest of the flight test," according to Boeing.

Boeing also noted today that the Starliner team is investigating some weird behavior of a "thermal cooling loop" but said that temperatures are stable on the spacecraft.

From the space station, NASA astronaut Bob Hines said the achievement "marks a great milestone towards providing additional commercial access to low Earth orbit, sustaining the ISS and enabling NASA's goal of returning humans to the Moon and eventually to Mars.

"Great accomplishments in human spaceflight are long remembered by history. Today will be no different."

A long-time Slashdot reader shares this schedule (EST): 5/20, 3:30 pm — Starliner docking with ISS.
5/21, 11:30 am — Safety checks completed. Hatches opened.
5/24, 12:00 pm — Starliner loading completed. Hatches closed.
5/25, 2:00 pm — Starliner undocking from ISS.
5/25, 5:45 pm — Coverage of Starliner landing begins.

Again, the streams will be broadcast on NASA Television. I don't know about any of you, but I know what I'm doing this weekend.

Lord of the Rings

EA Plans Free Mobile 'Lord of the Rings' Game (cnet.com) 35

Electronic Arts and Middle-earth Enterprises "announced on Monday an upcoming free mobile game called The Lord of the Rings: Heroes of Middle-earth," reports CNET: With the role-playing game, Lord of the Rings fans can look forward to experiencing the iconic universe in a whole new way.... The game will feature immersive storytelling with iconic plot lines, turn-based combat and a selection of characters from both The Lord of the Rings and The Hobbit to battle the evils of Middle-earth.

"The team is filled with fans of The Lord of the Rings and The Hobbit and each day they bring their tremendous passion and talents together to deliver an authentic experience for players," Malachi Boyle, vice president of mobile RPG for Electronic Arts, said in a statement. "The combination of high-fidelity graphics, cinematic animations, and stylized art immerses players in the fantasy of Middle-earth where they'll go head-to-head with their favorite characters."

Graphics

Report: 'Nvidia's LHR Limiter Has Fallen, But Gamers Shouldn't Worry' (tomshardware.com) 46

Slashdot reader Hmmmmmm shared this report from Tom's Hardware: When Nvidia launched its Ampere Lite Hash Rate (LHR) graphics cards with the feared Ethereum anti-mining limiter, the world knew it was only a matter of time before someone or a team cracked it. NiceHash, the company that designed the QuickMiner software and Excavator miner, has finally broken Nvidia's algorithm, restoring LHR graphics cards to their 100% Ethereum mining performance....

Graphics card pricing has been plummeting, and we're starting to see better availability at retailers, with some GPUs selling at or below Manufacturer Suggested Retail Price. So QuickMiner's arrival shouldn't influence the current state of the graphics market unless big corporations want to buy out everything in sight for the last push before Ethereum's transition to Proof-of-Stake (PoS), often referred to as "The Merge," is complete. We see that as unlikely, considering current profitability even on a 3080 Ti sits at around $3.50 per day and would still need nearly a year to break even at current rates. Initially scheduled for June, The Merge won't finalize until "the few months after," as Ethereum developer Tim Beiko has expressed on Twitter.
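
To put the break-even claim in concrete terms: at roughly $3.50 per day, a card bought at a notional $1,200 (roughly what an RTX 3080 Ti sells for, per the GPU-pricing story further down this page) would need nearly a year of nonstop mining to pay for itself, before electricity costs. A minimal sketch of that arithmetic, using only the figures quoted here rather than live market data:

    # Rough GPU-mining break-even estimate, using figures quoted in the
    # stories on this page (not live data): ~$3.50/day profit, ~$1,200 card.
    card_price_usd = 1200.00
    daily_profit_usd = 3.50

    days_to_break_even = card_price_usd / daily_profit_usd
    print(f"Break-even after ~{days_to_break_even:.0f} days "
          f"(~{days_to_break_even / 30:.1f} months)")
    # -> ~343 days, i.e. "nearly a year", ignoring electricity costs.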

It will be interesting to see if Nvidia responds to this with updated drivers or implements LHRv3 in the remaining GPUs. However, it's perhaps not worth the effort at this point, and all existing LHRv2 and earlier cards can just stay on current drivers for optimized mining performance.

Open Source

Nvidia Transitioning To Official, Open-Source Linux GPU Kernel Driver (phoronix.com) 102

Nvidia is publishing its Linux GPU kernel modules as open source and will maintain them moving forward. Phoronix's Michael Larabel reports: To much excitement and a sign of the times, the embargo has just expired on this super-exciting milestone that many of us have been hoping to see for many years. Over the past two decades NVIDIA has offered great Linux driver support with their proprietary driver stack, but with the success of AMD's open-source driver effort going on for more than a decade, many have been calling for NVIDIA to open up their drivers. Their user-space software remains closed-source, but as of today they have formally opened up their Linux GPU kernel modules and will be maintaining them moving forward. [...] This isn't limited to just Tegra: it spans their desktop graphics and is already production-ready for data center GPU usage.
United States

SEC Charges NVIDIA with Inadequate Disclosures about Impact of Cryptomining (sec.gov) 31

The Securities and Exchange Commission today announced settled charges against NVIDIA for inadequate disclosures concerning the impact of cryptomining on the company's gaming business. From an SEC press release: The SEC's order finds that, during consecutive quarters in NVIDIA's fiscal year 2018, the company failed to disclose that cryptomining was a significant element of its material revenue growth from the sale of its graphics processing units (GPUs) designed and marketed for gaming. Cryptomining is the process of obtaining crypto rewards in exchange for verifying crypto transactions on distributed ledgers. As demand for and interest in crypto rose in 2017, NVIDIA customers increasingly used its gaming GPUs for cryptomining. In two of its Forms 10-Q for its fiscal year 2018, NVIDIA reported material growth in revenue within its gaming business. NVIDIA had information, however, that this increase in gaming sales was driven in significant part by cryptomining. Despite this, NVIDIA did not disclose in its Forms 10-Q, as it was required to do, these significant earnings and cash flow fluctuations related to a volatile business for investors to ascertain the likelihood that past performance was indicative of future performance. The SEC's order also finds that NVIDIA's omissions of material information about the growth of its gaming business were misleading given that NVIDIA did make statements about how other parts of the company's business were driven by demand for crypto, creating the impression that the company's gaming business was not significantly affected by cryptomining.
AMD

AMD Doubles the Number of CPU Cores It Offers In Chromebooks (arstechnica.com) 23

AMD announced the Ryzen 5000 C-series for Chromebooks today. "The top chip in the series has eight of AMD's Zen 3 cores, giving systems that use it more x86 CPU cores than any other Chromebook," reports Ars Technica. From the report: The 7nm Ryzen 5000 C-series ranges from the Ryzen 3 5125C with two Zen 3 cores and a base and boost clock speed of 3 GHz, up to the Ryzen 7 5825C with eight cores and a base clock speed of 2 GHz that can boost to 4.5 GHz. For comparison, Intel's Core i7-1185G7, found in some higher end Chromebooks, has four cores and a base clock speed of 3 GHz that can boost to 4.8 GHz.

On their own, the chips aren't that exciting. They seemingly offer similar performance to the already-released Ryzen 5000 U-series chips. The Ryzen 5000 C-series also uses years-old Vega integrated graphics rather than the upgraded RDNA 2 found in Ryzen 6000 mobile chips, which, upon release, AMD said are "up to 2.1 times faster." But for someone who's constantly pushing their Chromebook to do more than just open a Chrome tab or two, the chips potentially bring better performance than what's currently available.

Unix

OpenBSD 7.1 Released with Support for Apple M1, Improvements for ARM64 and RISC-V (openbsd.org) 26

"Everyone's favorite security focused operating system, OpenBSD 7.1 has been released for a number of architectures," writes long-time Slashdot reader ArchieBunker, "including Apple M1 chips."

Phoronix calls it "the newest version of this popular, security-minded BSD operating system." With OpenBSD 7.1, the Apple Silicon support is now considered "ready for general use" with keypad/touchpad support for M1 laptops, a power management controller driver added, I2C and SPI controller drivers, and a variety of other driver additions for supporting the Apple Silicon hardware.

OpenBSD 7.1 also has a number of other improvements benefiting the 64-bit ARM (ARM64) and RISC-V architectures. OpenBSD 7.1 also brings SMP kernel improvements, support for futexes with shared anonymous memory, and more. On the graphics front there is updating the Linux DRM code against the state found in Linux 5.15.26 as well as now enabling Intel Elkhart Lake / Jasper Lake / Rocket Lake support.

The Register notes OpenBSD now "supports a surprisingly wide range of hardware: x86-32, x86-64, ARM7, Arm64, DEC Alpha, HP PA-RISC, Hitachi SH4, Motorola 88000, MIPS64, SPARC64, RISC-V 64, and both Apple PowerPC and IBM POWER." The Register's FOSS desk ran up a copy in VirtualBox, and we were honestly surprised how quick and easy it was. By saying "yes" to everything, it automatically partitioned the VM's disk into a rather complex array of nine slices, installed the OS, a boot loader, an X server and display manager, plus the FVWM window manager. After a reboot, we got a graphical login screen and then a rather late-1980s Motif-style desktop with an xterm.

It was easy to install XFCE, which let us set the screen resolution and other modern niceties, and there are also KDE, GNOME, and other pretty front-ends, plus plenty of familiar tools such as Mozilla apps, LibreOffice and so on....

We were expecting to have to do a lot more work. Yes, OpenBSD is a niche OS, but the project gave the world OpenSSH, LibreSSL, the PF firewall as used in macOS, much of Android's Bionic C library, and more besides.... In a world of multi-gigabyte OSes, it's quite refreshing. It felt like stepping back into the early 1990s, the era of Real Unix, when you had to put in some real effort and learn stuff in order to bend the OS to your will — but in return, you got something relatively bulletproof.

Businesses

Nvidia and AMD GPUs Are Returning To Shelves and Prices Are Finally Falling (theverge.com) 78

For nearly two years, netting a PS5, an Xbox Series X, or an AMD Radeon or Nvidia RTX graphics card without paying a fortune has been a matter of luck (or a lot of skill). At the shortage's peak, scalpers were successfully charging double or even triple MSRP for a modern GPU. But it's looking like the great GPU shortage is nearly over. From a report: In January, sites including Tom's Hardware reported that prices were finally beginning to drop, and drop they did; they've now fallen an average of 30 percent in the three months since. On eBay, the most popular graphics cards are only commanding a street price of $200-$300 over MSRP. And while that might still seem like a lot, some have fallen further: used Nvidia RTX 3080 Ti and AMD RX 6900 XT cards are currently fetching less than their original asking price, a sure sign that sanity is returning to the marketplace.

Just as importantly, some graphics cards are actually staying in stock at retailers when their prices are too high -- again, something that sounds perfectly normal but that we haven't seen in a while. For many months, boutiques like my local retailer Central Computers could only afford to sell you a GPU as part of a big PC bundle, but now it's making every card available on its own. GameStop is selling a Radeon RX 6600 for just $10 over MSRP, and it hasn't yet sold out. Newegg has also continually been offering an RTX 3080 Ti for just $10 over MSRP (after rebate, too) -- even if $1,200 still seems high for that card's level of performance.

Portables (Games)

Playdate, the Pocket-Sized Game Console With a Crank, Begins Shipping (oregonlive.com) 28

Playdate, the hotly anticipated video game system from Portland tech company Panic, began shipping Monday after a succession of manufacturing setbacks delayed the gadget by more than two years. OregonLive reports: Playdate is a throwback to the handheld video games of the 1980s. Designers eschewed the latest graphics technology in favor of a simple, black-and-white screen and an old-fashioned directional button pad. In a note of whimsy, the $179 Playdate also has a crank on the side. The crank provides various functions across the 24 games that come with purchase. (Games will be released online, two at a time, over the next 12 weeks.)

Panic is a software company, not an electronics manufacturer, and its first foray into computer hardware encountered a string of problems -- exacerbated by the pandemic and the resulting global shortage in computer chips. Most recently, Panic announced last November that many of its first 5,000 Playdates had faulty batteries. The company responded by sending them all back to its manufacturing contractor in Malaysia for replacement with new batteries from a different supplier.

Playdate fielded 20,000 orders in just 20 minutes when the first gadgets went on sale last July. And despite the delays, initial reviews Monday were very enthusiastic [...]. All the reviews noted, though, that Panic is a long way from untangling its production snarls. Only the first orders are going out now -- thousands more Playdates are coming sometime later, though Panic hasn't said just when.
There's also good news for DIYers: iFixit's teardown says the gaming system is relatively easy to fix if you ever need to replace its battery or buttons.
Graphics

Razer's First Linux Laptop Called 'Sexy' - But It's Not for Gamers (theverge.com) 45

A headline at Hot Hardware calls it "a sexy Linux laptop with deep learning chops... being pitched as the world's most powerful laptop for machine learning workloads."

And here's how Ars Technica describes the Razer x Lambda Tensorbook (announced Tuesday): Made in collaboration with Lambda, the Linux-based clamshell focuses on deep-learning development. Lambda, which has been around since 2012, is a deep-learning infrastructure provider used by the US Department of Defense and "97 percent of the top research universities in the US," according to the company's announcement. Lambda's offerings include GPU clusters, servers, workstations, and cloud instances that train neural networks for various use cases, including self-driving cars, cancer detection, and drug discovery.

Dubbed "The Deep Learning Laptop," the Tensorbook has an Nvidia RTX 3080 Max-Q (16GB) and targets machine-learning engineers, especially those who lack a laptop with a discrete GPU and thus have to share a remote machine's resources, which negatively affects development.... "When you're stuck SSHing into a remote server, you don't have any of your local data or code and even have a hard time demoing your model to colleagues," Lambda co-founder and CEO Stephen Balaban said in a statement, noting that the laptop comes with PyTorch and TensorFlow for quickly training and demoing models from a local GUI interface without SSH. Lambda isn't a laptop maker, so it recruited Razer to build the machine....

While there are more powerful laptops available, the Tensorbook stands out because of its software package and Ubuntu Linux 20.04 LTS.

The Verge writes: While Razer currently offers faster CPU, GPU and screens in today's Blade lineup, it's not necessarily a bad deal if you love the design, considering how pricey Razer's laptops can be. But we've generally found that Razer's thin machines run quite hot in our reviews, and the Blade in question was no exception even with a quarter of the memory and a less powerful RTX 3060 GPU. Lambda's FAQ page does not address heat as of today.

Lambda is clearly aiming this one at prospective MacBook Pro buyers, and I don't just say that because of the silver tones. The primary hardware comparison the company touts is a 4x speedup over Apple's M1 Max in a 16-inch MacBook Pro when running TensorFlow.

Specifically, Lambda's web site claims the new laptop "delivers model training performance up to 4x faster than Apple's M1 Max, and up to 10x faster than Google Colab instances." And it credits this to the laptop's use of NVIDIA's GeForce RTX 3080 Max-Q 16GB GPU, adding that NVIDIA GPUs "are the industry standard for parallel processing, ensuring leading performance and compatibility with all machine learning frameworks and tools."

"It looks like a fine package and machine, but pricing starts at $3,499," notes Hot Hardware, adding "There's a $500 up-charge to have it configured to dual-boot Windows 10."

The Verge speculates on what this might portend for the future. "Perhaps the recently renewed interest in Linux gaming, driven by the Steam Deck, will push Razer to consider Linux for its own core products as well."
Apple

How Apple's Monster M1 Ultra Chip Keeps Moore's Law Alive 109

By combining two processors into one, the company has squeezed a surprising amount of performance out of silicon. From a report: "UltraFusion gave us the tools we needed to be able to fill up that box with as much compute as we could," Tim Millet, vice president of hardware technologies at Apple, says of the Mac Studio. Benchmarking of the M1 Ultra has shown it to be competitive with the fastest high-end computer chips and graphics processor on the market. Millet says some of the chip's capabilities, such as its potential for running AI applications, will become apparent over time, as developers port over the necessary software libraries. The M1 Ultra is part of a broader industry shift toward more modular chips. Intel is developing a technology that allows different pieces of silicon, dubbed "chiplets," to be stacked on top of one another to create custom designs that do not need to be redesigned from scratch. The company's CEO, Pat Gelsinger, has identified this "advanced packaging" as one pillar of a grand turnaround plan. Intel's competitor AMD is already using a 3D stacking technology from TSMC to build some server and high-end PC chips. This month, Intel, AMD, Samsung, TSMC, and ARM announced a consortium to work on a new standard for chiplet designs. In a more radical approach, the M1 Ultra uses the chiplet concept to connect entire chips together.

Apple's new chip is all about increasing overall processing power. "Depending on how you define Moore's law, this approach allows you to create systems that engage many more transistors than what fits on one chip," says Jesus del Alamo, a professor at MIT who researches new chip components. He adds that it is significant that TSMC, at the cutting edge of chipmaking, is looking for new ways to keep performance rising. "Clearly, the chip industry sees that progress in the future is going to come not only from Moore's law but also from creating systems that could be fabricated by different technologies yet to be brought together," he says. "Others are doing similar things, and we certainly see a trend towards more of these chiplet designs," adds Linley Gwennap, author of the Microprocessor Report, an industry newsletter. The rise of modular chipmaking might help boost the performance of future devices, but it could also change the economics of chipmaking. Without Moore's law, a chip with twice the transistors may cost twice as much. "With chiplets, I can still sell you the base chip for, say, $300, the double chip for $600, and the uber-double chip for $1,200," says Todd Austin, an electrical engineer at the University of Michigan.
AMD

AMD Confirms Its GPU Drivers Are Overclocking CPUs Without Asking (tomshardware.com) 73

AMD has confirmed to Tom's Hardware that a bug in its GPU driver is, in fact, changing Ryzen CPU settings in the BIOS without permission. This condition has been shown to auto-overclock Ryzen CPUs without the user's knowledge. From the report: Reports of this issue began cropping up on various social media outlets recently, with users reporting that their CPUs had mysteriously been overclocked without their consent. The issue was subsequently investigated and tracked back to AMD's GPU drivers. AMD originally added support for automatic CPU overclocking through its GPU drivers last year, with the idea that adding in a Ryzen Master module into the Radeon Adrenalin GPU drivers would simplify the overclocking experience. Users with a Ryzen CPU and Radeon GPU could use one interface to overclock both. Previously, it required both the GPU driver and AMD's Ryzen Master software.

Overclocking a Ryzen CPU requires the software to manipulate the BIOS settings, just as we see with other software overclocking utilities. For AMD, this can mean simply engaging the auto-overclocking Precision Boost Overdrive (PBO) feature. This feature does all the dirty work, like adjusting voltages and frequency on the fly, to give you a one-click automatic overclock. However, applying a GPU profile in the AMD driver can now inexplicably alter the BIOS settings to enable automatic overclocking. This is problematic because of the potential ill effects of overclocking -- in fact, overclocking a Ryzen CPU automatically voids the warranty. AMD's software typically requires you to click a warning to acknowledge that you understand the risks associated with overclocking, and that it voids your warranty, before it allows you to overclock the system. Unfortunately, that isn't happening here.
Until AMD issues a fix, "users have taken to using the Radeon Software Slimmer to delete the Ryzen Master SDK from the GPU driver, thus preventing any untoward changes to the BIOS settings," adds Tom's Hardware.
Games

Epic's Unreal Engine 5 Has Officially Launched (axios.com) 60

Epic Games has officially launched Unreal Engine 5, its newest set of software tools for making video games. From a report: While Epic may be known to much of the public for its hit game Fortnite, its core business has long been Unreal. Epic's goal is to make Unreal Engine the definitive toolset for building games, virtual worlds and other digital entertainment. The engine is free to download, but Epic then takes a 5% cut of games that generate more than $1 million in revenue. During a kickoff video showcase today, the selling point wasn't just what the engine can do, but who is using it.

Epic workers demonstrated how the engine could be used to make and tweak modern games. Then came the key slide showing dozens of partners, including PlayStation, Xbox, and Tencent, followed by testimonials from recent Unreal Engine convert CD Projekt RED (which had previously used its own tech to make games in the Witcher and Cyberpunk franchises), and ending with the kicker that Crystal Dynamics, another studio that long operated its own in-house engine, would use Unreal on its next Tomb Raider game.
More details at Kotaku.
Intel

Intel Beats AMD and Nvidia with Arc GPU's Full AV1 Support (neowin.net) 81

Neowin notes growing support for the "very efficient, potent, royalty-free video codec" AV1, including Microsoft adding support for AV1 hardware acceleration on Windows.

But AV1 even turned up in Intel's announcement this week of the Arc A-series, a new line of discrete GPUs, Neowin reports: Intel has been quick to respond, and the company has become the first GPU hardware vendor to offer full AV1 support on its newly launched Arc GPUs. While AMD and Nvidia both offer AV1 decoding with their newest GPUs, neither supports AV1 encoding.

Intel says that hardware encoding of AV1 on its new Arc GPUs is 50 times faster than software-only solutions. It also adds that the efficiency of AV1 encoding with Arc is 20% better than HEVC. With this feature, Intel hopes to capture at least some of the streaming and video-editing market among users looking for a more robust AV1 encoding solution than CPU-based software approaches.

From Intel's announcement: Intel Arc A-Series GPUs are the first in the industry to offer full AV1 hardware acceleration, including both encode and decode, delivering faster video encode and higher quality streaming while consuming the same internet bandwidth. We've worked with industry partners to ensure that AV1 support is available today in many of the most popular media applications, with broader adoption expected this year. The AV1 codec will be a game changer for the future of video encoding and streaming.
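
For a sense of what "CPU-based software approaches" versus hardware encoding looks like in practice, here is a minimal sketch that shells out to FFmpeg from Python. The software path uses the libaom-av1 encoder; the hardware path assumes an FFmpeg build exposing Intel's Quick Sync AV1 encoder (typically named av1_qsv) and an Arc GPU, which is an assumption about your local setup rather than anything stated in the article. File names are placeholders.

    import subprocess

    SRC = "input.mp4"  # placeholder input clip

    # Software (CPU) AV1 encode with libaom-av1 -- the slow baseline that
    # hardware encoders are compared against. CRF with -b:v 0 selects
    # constant-quality mode.
    subprocess.run([
        "ffmpeg", "-i", SRC,
        "-c:v", "libaom-av1", "-crf", "32", "-b:v", "0",
        "software_av1.mkv",
    ], check=True)

    # Hardware AV1 encode -- assumes an FFmpeg build with Intel Quick Sync
    # AV1 support (av1_qsv) and a GPU that can encode AV1, such as Arc.
    subprocess.run([
        "ffmpeg", "-i", SRC,
        "-c:v", "av1_qsv", "-b:v", "6M",
        "hardware_av1.mkv",
    ], check=True)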
Intel

Intel Enters Discrete GPU Market With Launch of Arc A-Series For Laptops (hothardware.com) 23

MojoKid writes: Today Intel finally launched its first major foray into discrete GPUs for gamers and creators. Dubbed Intel Arc A-Series, the lineup comprises five different chips built on two Arc Alchemist SoCs. The company announced that its entry-level Arc 3 graphics is shipping now, with laptop OEMs delivering new all-Intel products shortly. The two SoCs set the foundation across three performance tiers: Arc 3, Arc 5, and Arc 7.

For example, Arc A370M arrives today with 8 Xe cores, 8 ray tracing units, 4GB of GDDR6 memory linked to a 64-bit memory bus, and a 1,550MHz graphics clock. Graphics power is rated at 35-50W. However, Arc A770M, Intel's highest-end mobile GPU, will come with 32 Xe cores, 32 ray tracing units, 16GB of GDDR6 memory over a 256-bit interface, and a 1,650MHz graphics clock. Doing the math, Arc A770M could be up to 4X more powerful than Arc A370M. In terms of performance, Intel showcased benchmarks from a laptop outfitted with a Core i7-12700H processor and Arc A370M GPU that can top the 60 FPS threshold at 1080p in many games where integrated graphics would come up far short. Examples included Doom Eternal (63 fps) at high quality settings, and Hitman 3 (62 fps) and Destiny 2 (66 fps) at medium settings. Intel is also showcasing new innovations for content creators, with its Deep Link, Hyper Encode and AV1 video compression support offering big gains in video upscaling, encoding and streaming. Finally, Intel Arc Control software will offer unique features like Smooth Sync that blends tearing artifacts when V-Sync is turned off, as well as Creator Studio with background blur, frame tracking and broadcast features for direct game streaming services support.

Graphics

The Untold Story of the Creation of GIF At CompuServe In 1987 (fastcompany.com) 43

Back in 1987 Alexander Trevor worked with the GIF format's creator, Steve Wilhite, at CompuServe. 35 years later Fast Company tech editor Harry McCracken (also Slashdot reader harrymcc) located Trevor for the inside story: Wilhite did not come up with the GIF format in order to launch a billion memes. It was 1987, and he was a software engineer at CompuServe, the most important online service until an upstart called America Online took off in the 1990s. And he developed the format in response to a request from CompuServe executive Alexander "Sandy" Trevor. (Trevor's most legendary contribution to CompuServe was not instigating GIF: He also invented the service's CB Simulator — the first consumer chat rooms and one of the earliest manifestations of social networking, period. That one he coded himself as a weekend project in 1980.)

GIF came to be because online services such as CompuServe were getting more graphical, but the computer makers of the time — such as Apple, Commodore, and IBM — all had their own proprietary image types. "We didn't want to have to put up images in 79 different formats," explains Trevor. CompuServe needed one universal graphics format.

Even though the World Wide Web and digital cameras were still in the future, work was already underway on the image format that came to be known as JPEG. But it wasn't optimized for CompuServe's needs: For example, stock charts and weather graphics didn't render crisply. So Trevor asked Wilhite to create an image file type that looked good and downloaded quickly at a time when a 2,400 bits-per-second dial-up modem was considered torrid. Reading a technical journal, Wilhite came across a discussion of an efficient compression technique known as LZW for its creators — Abraham Lempel, Jacob Ziv, and Terry Welch. It turned out to be an ideal foundation for what CompuServe was trying to build, and allowed GIF to pack a lot of image information into as few bytes as possible. (Much later, computing giant Unisys, which gained a patent for LZW, threatened companies that used it with lawsuits, leading to a licensing agreement with CompuServe and the creation of the patent-free PNG image format.)
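
For readers curious what that compression step actually does, below is a minimal, illustrative LZW compressor in Python. It works on raw bytes and emits integer codes; GIF's real encoder differs in the details (it runs over palette indices, uses variable-width codes, and emits clear and end-of-information codes), so treat this as a sketch of the underlying idea rather than the GIF implementation:

    def lzw_compress(data: bytes) -> list[int]:
        """Textbook LZW: grow a dictionary of seen sequences, emit codes."""
        table = {bytes([i]): i for i in range(256)}  # codes 0-255 = single bytes
        next_code = 256
        w = b""
        codes = []
        for byte in data:
            wc = w + bytes([byte])
            if wc in table:
                w = wc                   # keep extending the current match
            else:
                codes.append(table[w])   # emit code for the longest known prefix
                table[wc] = next_code    # learn the new sequence
                next_code += 1
                w = bytes([byte])
        if w:
            codes.append(table[w])
        return codes

    # Repetitive input compresses well: 24 input bytes become 16 codes.
    print(lzw_compress(b"TOBEORNOTTOBEORTOBEORNOT"))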

GIF officially debuted on June 15, 1987. "It met my requirements, and it was extremely useful for CompuServe," says Trevor....

GIF was also versatile, offering the ability to store multiple pictures, which made it handy for creating mini-movies as well as static images. And it spread beyond CompuServe, showing up in Mosaic, the first graphical web browser, and then in Netscape Navigator. The latter browser gave GIFs the ability to run in an infinite loop, a crucial feature that only added to their hypnotic quality. Seeing cartoon hamsters dance for a split second is no big whoop, but watching them shake their booties endlessly was just one of many cultural moments that GIFs have given us.

Media

Stephen Wilhite, Creator of the GIF, Has Died (theverge.com) 128

Stephen Wilhite, one of the lead inventors of the GIF, died last week from COVID at the age of 74, according to his wife, Kathaleen, who spoke to The Verge. From the report: Stephen Wilhite worked on GIF, or Graphics Interchange Format, which is now used for reactions, messages, and jokes, while employed at CompuServe in the 1980s. He retired around the early 2000s and spent his time traveling, camping, and building model trains in his basement.

Although GIFs are synonymous with animated internet memes these days, that wasn't the reason Wilhite created the format. CompuServe introduced them in the late 1980s as a way to distribute "high-quality, high-resolution graphics" in color at a time when internet speeds were glacial compared to what they are today. "He invented GIF all by himself -- he actually did that at home and brought it into work after he perfected it," Kathaleen said. "He would figure out everything privately in his head and then go to town programming it on the computer."

If you want to go more in-depth into the history of the GIF, the Daily Dot has a good explainer of how the format became an internet phenomenon.
In 2013, Wilhite weighed in on the long-standing debate about the correct pronunciation of the image format. He told The New York Times, "The Oxford English Dictionary accepts both pronunciations. They are wrong. It is a soft 'G,' pronounced 'jif.' End of story."
