Apple

Apple Introduces M1 Pro and M1 Max (apple.com) 201

Apple today announced M1 Pro and M1 Max, its new chips for the Mac. Apple: M1 Pro and M1 Max introduce a system-on-a-chip (SoC) architecture to pro systems for the first time. The chips feature fast unified memory, industry-leading performance per watt, and incredible power efficiency, along with increased memory bandwidth and capacity. M1 Pro offers up to 200GB/s of memory bandwidth with support for up to 32GB of unified memory. M1 Max delivers up to 400GB/s of memory bandwidth -- 2x that of M1 Pro and nearly 6x that of M1 -- and support for up to 64GB of unified memory. And while the latest PC laptops top out at 16GB of graphics memory, having this huge amount of memory enables graphics-intensive workflows previously unimaginable on a notebook. The efficient architecture of M1 Pro and M1 Max means they deliver the same level of performance whether MacBook Pro is plugged in or using the battery. M1 Pro and M1 Max also feature enhanced media engines with dedicated ProRes accelerators specifically for pro video processing. M1 Pro and M1 Max are by far the most powerful chips Apple has ever built.

Utilizing the industry-leading 5-nanometer process technology, M1 Pro packs in 33.7 billion transistors, more than 2x the amount in M1. A new 10-core CPU, including eight high-performance cores and two high-efficiency cores, is up to 70 percent faster than M1, resulting in unbelievable pro CPU performance. Compared with the latest 8-core PC laptop chip, M1 Pro delivers up to 1.7x more CPU performance at the same power level and achieves the PC chip's peak performance using up to 70 percent less power. Even the most demanding tasks, like high-resolution photo editing, are handled with ease by M1 Pro. M1 Pro has an up-to-16-core GPU that is up to 2x faster than M1 and up to 7x faster than the integrated graphics on the latest 8-core PC laptop chip. Compared to a powerful discrete GPU for PC notebooks, M1 Pro delivers more performance while using up to 70 percent less power. And M1 Pro can be configured with up to 32GB of fast unified memory, with up to 200GB/s of memory bandwidth, enabling creatives like 3D artists and game developers to do more on the go than ever before.

M1 Max features the same powerful 10-core CPU as M1 Pro and adds a massive 32-core GPU for up to 4x faster graphics performance than M1. With 57 billion transistors -- 70 percent more than M1 Pro and 3.5x more than M1 -- M1 Max is the largest chip Apple has ever built. In addition, the GPU delivers performance comparable to a high-end GPU in a compact pro PC laptop while consuming up to 40 percent less power, and performance similar to that of the highest-end GPU in the largest PC laptops while using up to 100 watts less power. This means less heat is generated, fans run quietly and less often, and battery life is amazing in the new MacBook Pro. M1 Max transforms graphics-intensive workflows, including up to 13x faster complex timeline rendering in Final Cut Pro compared to the previous-generation 13-inch MacBook Pro. M1 Max also offers a higher-bandwidth on-chip fabric, and doubles the memory interface compared with M1 Pro for up to 400GB/s, or nearly 6x the memory bandwidth of M1. This allows M1 Max to be configured with up to 64GB of fast unified memory. With its unparalleled performance, M1 Max is the most powerful chip ever built for a pro notebook.
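As a quick sanity check, the multipliers Apple quotes line up with the raw numbers. Note that the baseline M1 figures used below (16 billion transistors, 68.25GB/s of memory bandwidth) are assumptions taken from Apple's original M1 announcement, not from this article:

```python
# Verify the "2x", "3.5x", "nearly 6x", and "70 percent more" claims.
# Baseline M1 figures (16e9 transistors, 68.25 GB/s) are assumed from
# Apple's 2020 M1 announcement -- they are not stated in this article.
m1_transistors = 16e9
m1_pro_transistors = 33.7e9
m1_max_transistors = 57e9

m1_bw, m1_pro_bw, m1_max_bw = 68.25, 200.0, 400.0  # GB/s

print(f"M1 Max vs M1 Pro transistors: {m1_max_transistors / m1_pro_transistors:.2f}x")  # ~1.69 ("70 percent more")
print(f"M1 Max vs M1 transistors:     {m1_max_transistors / m1_transistors:.2f}x")      # ~3.56 ("3.5x more")
print(f"M1 Max vs M1 Pro bandwidth:   {m1_max_bw / m1_pro_bw:.0f}x")                    # 2x
print(f"M1 Max vs M1 bandwidth:       {m1_max_bw / m1_bw:.2f}x")                        # ~5.86 ("nearly 6x")
```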

Programming

Apple Joins Blender's Development Fund To Support 3D Graphics Tool (macrumors.com) 51

Blender today announced that Apple has joined the Blender Development Fund to support continued development of the free open source 3D graphics tool. From a report: Alongside a contribution to the Development Fund, Apple will provide engineering expertise and additional resources to Blender and its broader development community to help support Blender artists and developers, according to the announcement. Blender CEO Ton Roosendaal said the announcement means that "macOS will be back as a complete supported Blender platform."
Hardware

The Mega65: A Modernization of the Canceled Commodore 65 Computer From 1991 113

Slashdot reader TommyROM writes: The Commodore 65 was a never-released computer slated to follow the fabled Commodore 64 from 1982. Developed between 1990 and 1991, it would have been the most powerful 8-bit computer on the market with 128K RAM, high-resolution graphics (up to 1280x400), and stereo sound. A few prototypes were made before Commodore canceled the project in 1991.

Now an updated version of the Commodore 65 has been realized. Project founder Paul Gardner-Stephen began working on recreating the C65 in 2014, and eventually teamed up with the non-profit Museum of Electronic Games & Art to create the FPGA-based Mega65, a modernization of the original Commodore 65 featuring a custom main board, mechanical keyboard, and injection-molded case. It uses the original C65 ROMs but improves on the design with SD card support, Ethernet, and HDMI output. It is about 40 times faster than a C64 and backwards compatible with it, down to the cartridge and joystick ports. The design is open-sourced for long-term compatibility. Additionally, there is a hand-held version in the works that is also a cellphone. They are currently taking pre-orders for the Mega65 at a price of 666.66 euros ($742 plus shipping).

The Retro Hour podcast has an interview with founder Paul Gardner-Stephen where he discusses the impetus of the project and goes into more details of the design.
Google

Google's Fuchsia Is Expanding To 'Additional Smart Devices and Other Form Factors' (9to5google.com) 32

According to new job listings, Google is looking to expand the Fuchsia operating system from its current home on the Nest Hub to "additional smart devices and other form factors." 9to5Google reports: The first listing, for "Staff Software Engineer, Fuchsia Devices," celebrates Fuchsia's recent milestone and points clearly to Google wanting Fuchsia to run on more "real world products" than just smart displays: "In 2021 we shipped Fuchsia to millions of Google smart displays, now it's time to expand to additional smart devices and other form factors. Come join us and work on the next-generation Google operating system! Although the first uses of Fuchsia are smart displays, we are working on expanding to additional form factors and use cases. The Fuchsia Devices team is responsible for making sure we can successfully apply the Fuchsia platform to real world products that make a difference to Google and our users."

So what types of devices should we expect Fuchsia to come to next? Well for one, we may look at the plural phrasing of "first uses of Fuchsia are smart displays," which suggests the Nest Hub Max and Nest Hub (2nd Gen) may be getting their chance to switch to Fuchsia soon. Of course, Google has made it abundantly clear here that smart displays are just the beginning. In another listing, for "Engineering Manager, Fuchsia Devices," the company explains that the Fuchsia Devices team is aiming to "[expand] the reach of Nest/Assistant to new form factors" through "real consumer devices." [From the listing:] "The Fuchsia Devices Smart Products team is part of the larger Fuchsia organization and is responsible for productionizing various types of Fuchsia Devices. Our team delivers real consumer devices to end users and enables you to have a large impact at Google by expanding the reach of Nest/Assistant to new form factors."

This could suggest entirely new device categories for Google's Nest lineup, powered by Fuchsia. In another section, we get some hints at what those proposed devices would be capable of: "Chromecast, Video Conferencing and Machine Learning are core parts of many of the upcoming smart products." It's important to note that "Chromecast" here is probably not referring to the lineup of Chromecast hardware for TVs switching to Fuchsia. Instead, it's more likely referring to how Google's speakers and displays can receive a "Cast" from your phone. This is somewhat clarified in the job listing's responsibilities section, which lists a handful of features that squarely line up with features of the camera-equipped Nest Hub Max, including "Face Match." The Nest Hub Max's store listing also references "Chromecast built-in" as a feature: "Plan, scope, and execute of features like Video Casting, Actions on Google, Video Calling, Face Match, and on device ML."

It also looks like Made by Google devices may soon not be the only ones shipping with Fuchsia. According to another listing, Google is looking for someone on Fuchsia's "platform graphics and media" team to, among other things, "influence hardware decisions made by partners." [From the listing:] "As a Staff Software Engineer on the Fuchsia team, you'll drive the technical direction for Graphics and Media and ensure that Fuchsia is bringing maximum value to partners and Fuchsia-based products. You also will influence hardware decisions made by partners to improve Fuchsia and Google's ability to deliver efficient software solutions for critical Graphics and Media workloads."
Fuchsia debuted on the first-generation Nest Hub earlier this year.
Microsoft

The Best Part of Windows 11 Is Its Linux, Argues Ars Technica (arstechnica.com) 148

The best part of Windows 11 is Linux, argues Ars Technica: For years now, Windows 10's Windows Subsystem for Linux has been making life easier for developers, sysadmins, and hobbyists who have one foot in the Windows world and one foot in the Linux world. But WSL, handy as it is, has been hobbled by several things it could not do. Installing WSL has never been as easy as it should be — and getting graphical apps to work has historically been possible but also a pain in the butt that required some fairly obscure third-party software. Windows 11 finally fixes both of those problems. The Windows Subsystem for Linux isn't perfect on Windows 11, but it's a huge improvement over what came before.

Microsoft has traditionally made installing WSL more of a hassle than it should be, but the company finally got the process right in Windows 10 version 2004. Just open an elevated Command Prompt (start --> type cmd --> click Run as Administrator), type wsl --install at the prompt, and you're good to go. Windows 11, thankfully, carries this process forward unchanged. A simple wsl --install with no further arguments gets you Hyper-V and the other underpinnings of WSL, along with the current version of Ubuntu. If you aren't an Ubuntu fan, you can see what other easily installable distributions are available with the command wsl --list --online. If you decide you'd prefer a different distro, you can install it instead with — for example — wsl --install -d openSUSE-42. If you're not sure which distribution you prefer, don't fret. You can install as many as you like, simply by repeating wsl --list --online to enumerate your options and wsl --install -d distroname to install whichever you like. Installing a second distribution doesn't uninstall the first; it creates a separate environment, independent of any others. You can run as many of these installed environments as you like simultaneously, without fear of one messing up another.
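Collected in one place, the install sequence described above looks like this. This is a transcript sketch: openSUSE-42 is simply the example distribution name used in the article, and the actual names come from whatever wsl --list --online reports on your machine:

```shell
:: Run from an elevated Command Prompt (Start > type cmd > Run as Administrator)

:: Install Hyper-V underpinnings, WSL, and the default Ubuntu distribution
wsl --install

:: See which other distributions are easily installable
wsl --list --online

:: Install an additional distribution alongside the first (example name)
wsl --install -d openSUSE-42
```

Each installed distribution gets its own independent environment, so repeating the last command with different names is safe.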

In addition to easy installation, WSL on Windows 11 brings support for both graphics and audio in WSL apps. This isn't exactly a first — Microsoft debuted WSLg in April, with Windows 10 Insider Build 21364. But Windows 11 is the first production Windows build with WSLg support. If this is your first time hearing of WSLg, the short version is simple: you can install GUI apps — for example, Firefox — from your Ubuntu (or other distro) command line, and they'll work as expected, including sound. When I installed WSLg on Windows 11 on the Framework laptop, running firefox from the Ubuntu terminal popped up the iconic browser automatically. Heading to YouTube in it worked perfectly, too, with neither frame drops in the video nor glitches in the audio....

[T]here is one obvious "killer app" for WSLg that has us excited — and that's virt-manager, the Red Hat-originated virtualization management tool. virt-manager is a simple tool that streamlines the creation, management, and operation of virtual machines using the Linux Kernel-based Virtual Machine (KVM)... virt-manager never got a Windows port and seems unlikely to. But it runs under WSLg like a champ.

They reported a few problems, like when running GNOME's Software Center app (and the GNOME shell desktop environment).

But "If you're already a Windows Subsystem for Linux (WSL) user, Windows 11 offers an enormously improved experience compared to what you're accustomed to from Windows 10. It installs more easily, makes more functionality available, and offers better desktop integration than older workarounds such as running MobaXTerm's X11 server."
AI

The Rise of the Robo-Voices (wsj.com) 54

The next time you see a movie or TV show that was dubbed from a foreign language, the voices you hear may not belong to actors who rerecorded dialogue in a sound booth. In fact, they may not belong to actors at all. From a report: Highly sophisticated digital voice manufacturing is coming, and entertainment executives say it could bring a revolution in sound as industry-changing as computer graphics were for visuals. New companies are using artificial intelligence to create humanlike voices from samples of a living actor's voice -- models that not only can sound like specific performers, but can speak any language, cry, scream, laugh, even talk with their mouths full. At the same time, companies are refining the visual technology so actors look like they are really speaking.

As streaming services export American fare globally and foreign markets send their hits to the U.S., dubbing is a bigger business than ever. But the uses of synthetic voices extend well beyond localizing foreign films. AI models can provide youthful voices for aging actors. The technology can resurrect audio from celebrities who have died or lost the ability to speak. And it can tweak dialogue in postproduction without the need for actors. All the tinkering raises thorny ethical questions. Where is the line between creating an engrossing screen experience and fabricating an effect that leaves audiences feeling duped?

The technology is set to hit a new target in the coming months, when foreign-language dubbed versions of the 2019 indie horror movie "Every Time I Die" are released in South America. Those versions mark one of the first times entire dubbed movies use computerized voice clones based on the voices of the original English-speaking cast. So when the film comes out abroad, audiences will hear the original actors "speaking" Spanish or Portuguese. Deepdub created the replicas based on 5-minute recordings of each actor speaking English.

ISS

Russian Actress and Director To Start Making First Movie on Space Station (nytimes.com) 45

The first dog in space. The first man and woman. Now Russia has clinched another spaceflight first before the United States: Beating Hollywood to orbit. From a report: A Russian actress, Yulia Peresild, a director, Klim Shipenko, and their veteran Russian cosmonaut guide, Anton Shkaplerov, launched on a Russian rocket toward the International Space Station on Tuesday. Their mission is to shoot scenes for the first feature-length film in space. While cinematic sequences in space have long been portrayed on big screens using sound stages and advanced computer graphics, never before has a full-length movie been shot and directed in space.

Whether or not the film they shoot in orbit is remembered as a cinematic triumph, the mission highlights the busy efforts of governments as well as private entrepreneurs to expand access to space. Earth's orbit and beyond were once visited only by astronauts handpicked by government space agencies. But a growing number of visitors in the near future will be more like Ms. Peresild and Mr. Shipenko, and less like the highly trained Mr. Shkaplerov and his fellow space explorers. A Soyuz rocket, the workhorse of Russia's space program, lifted off on time at 4:55 a.m. Eastern time from the Baikonur Cosmodrome in Kazakhstan. Before the launch on Tuesday, the MS-19 crew posed for photos and waved to family and fans in Baikonur. Mr. Shipenko, the director of the film, which is named "The Challenge," held up a script as he waved to cameras.

IT

New USB-C Logos Make Picking USB Cables, Chargers Less Confusing (pcworld.com) 87

Choosing the correct USB-C charger and cable for your laptop is about as fun as visiting the dentist, but new logos released today should go a long way toward making it easier. PCWorld: The USB Implementers Forum group that oversees the USB standard has released logos that easily indicate whether a cable or charger can hit the new 240 watt rating. Previous USB-C chargers and cables were rated to hit 65 watts or 100 watts, but a new version of USB Power Delivery released this May has pushed the limit to an impressive 240 watts. Obviously, that means if you're looking for a 240 watt aftermarket charger for a new gaming laptop that supports it, you want a certified one. With the new USB-C logos, all you have to do is look for a Certified USB Charger 240W logo with a lightning bolt like the one from the chart above. The other component you may need is a 240 watt USB-C cable, so consumers need only look for the Certified USB Charger 240W logo with a cable in it. Both logos can also be paired with USB 40Gbps branding to indicate whether the cable is certified to support USB4's 40Gbps speed.

The higher-output 240 watt power range is a welcome addition to USB-C, as it should allow laptop makers to bring universal USB-C charging to far more powerful laptops, including gaming laptops with discrete graphics chips -- something that was out of reach of previous USB-C chargers, cables, and ports. In fact, we found that we probably wouldn't want to use a small USB-C charger in a gaming laptop with today's technology. With a 240 watt USB-C charger, we'd probably change our mind. The problem, of course, is that while the USB-IF is the organization that certifies cables, chargers, and other USB-C bric-a-brac, certification is not mandatory. This has led to small-brand and no-name manufacturers getting the spec wrong in the past. The good news is that cables from companies that actually obtain certification should work correctly.
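The wattage tiers mentioned here fall directly out of voltage-times-current limits in USB Power Delivery. The 20V/5A and 48V/5A pairings below are assumptions based on the USB PD 3.1 spec (the old Standard Power Range versus the new Extended Power Range), not figures from the PCWorld article:

```python
# USB-C power ceilings as voltage x current (pairings assumed from USB PD 3.1).
spr_watts = 20 * 5   # Standard Power Range: 20 V x 5 A = 100 W (old ceiling)
epr_watts = 48 * 5   # Extended Power Range: 48 V x 5 A = 240 W (new ceiling)

print(f"old ceiling: {spr_watts} W, new ceiling: {epr_watts} W")  # 100 W / 240 W
```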

Education

School Reopenings Stymie Teens' Reseller Gigs (pcmag.com) 147

It turns out school reopenings are disrupting the cash flow of industrious teenagers who spent the pandemic scooping up in-demand products via bots and reselling them for a hefty profit. From a report: "Yes, I am back in school. Yea, it's very annoying," said one US high school student named Dillon, who regularly buys video game consoles and graphics cards with automated bots. "I am sitting in math class and drawing class with my computer open, and I get told to shut it down during a [product] drop sometimes," he told PCMag in an interview. Dillon may be young, but he's among the legion of online scalpers who spent the pandemic at home buying and reselling the tech world's most-wanted products. "I would say around $10,000 to $12,500 average a month," he told PCMag. "Some months it would be exponentially higher, some would be lower."

Using automated bots he purchased and installed on his computer, and intel from other online resellers, Dillon scooped up products like the PlayStation 5 ahead of other consumers and sold them off at inflated pricing. But lately, Dillon's reselling hit a snag. After months away from high school because of the pandemic, he's now back in the classroom, where computer use can be strictly controlled. "When everything closed [during the pandemic], I could do whatever I wanted because I was doing my school from home," he said. But with the return of in-classroom teaching, Dillon says his profits have now fallen by about 25%.

AMD

AMD Radeon Software Can Overclock Your Ryzen CPU Now, Too (pcworld.com) 25

An anonymous reader quotes a report from PCWorld: The latest version of Radeon Software adds an unusual (and welcome) new twist: The ability to automatically overclock your Ryzen processor if you're rocking an all-AMD gaming desktop. Yes, your GPU software can speed up your CPU now, too -- and it can do it all with a single click. [...] The addition of Ryzen auto-overclocking in Radeon Software 21.9.1 continues the theme, and might just allow you to ditch AMD's separate Ryzen Master tool if you're running a Team Red graphics card. They'll need to be newer hardware, though, as the feature currently only supports AMD's latest Ryzen 5000 CPUs and Radeon RX 6000 GPUs.

AMD's blog describes how to use the new tool: "To access this easy-to-use feature, open up Radeon Software using the hotkey 'ALT' + 'R', navigate to the 'Performance' tab found at the top of the window, and select 'Tuning' in the sub tab directly below it. If you have the latest generation of AMD Ryzen and Radeon product installed on your system, a 'Tuning Control' section should appear for your system, allowing you to select 'Auto Overclock' to increase performance on both your processor and graphics card. We also have a new tuning section for CPUs, allowing you to overclock just your CPU. When the feature is selected, the system will ask for a restart and once you are back in Windows, you will be good to go!"
"Radeon Software 21.9.1 also adds official Windows 11 support and the ability for Radeon RX 5000-series GPUs to tap into Smart Access Memory," adds PCWorld. "AMD also took the time to tout FidelityFX Super Resolution's rapid uptake. The DLSS rival is now supported in 27 games, with Arkane's awesome-looking Deathloop set to launch this week with native FSR support in place."

You can download these new drivers here.
Games

Nvidia Leak May Reveal Unannounced Games, Including God of War For PC (theverge.com) 35

An Nvidia GeForce Now server has leaked a confidential list of thousands of games, some of which have never been seen before, like the PlayStation exclusive God of War seemingly coming to Windows PC via Steam. Developer Ighor July has published the list to GitHub. The Verge reports: Here's a screenshot of what that looks like in the GeForce Now client, according to the developer. There are reasons to believe the list is legit. We know graphics giant Nvidia has access to games long before they're released -- and we know Sony in particular has been banking on PlayStation games on PC. It quietly revealed Uncharted 4 was coming to PC after seeing a 250 percent return on its investment porting Horizon: Zero Dawn to the platform, and it was just Thursday that Sony announced it would be part of the Uncharted: Legacy of Thieves Collection -- a name that we'd never heard of before then, but already appears in Nvidia's list as well. So too do Demon's Souls and Final Fantasy XVI -- the games where Sony had to retroactively retract all mentions of PC to make them sound like PlayStation exclusives. PS5 exclusive Returnal appears as well, as does a Final Fantasy VII Remake for PC.

And there are codenames for games in here that seem original, ones that bring up zero search results. Is "Platinum" the internal name for Bethesda's Indiana Jones games? But there are also a lot of mentions that seem rather out of date or out of place, like a whole host of Facebook-exclusive Oculus Rift titles that would make little sense on Nvidia's GeForce Now cloud gaming service, or a mention of a "Titanfall 3" which clarifies that it's actually "GAMEapex_legends_-_titanfall," aka Apex Legends, the popular battle royale game. And some of them may simply be guesses, like Kingdom Hearts IV, "BioShock 2022," and so on. All of that means you should probably take any given name on the list with a grain of salt...

AMD

Lenovo's First Windows 11 Laptops Run On Ryzen (pcworld.com) 59

Lenovo will ring in the arrival of Windows 11 with a pair of premium AMD Ryzen-based laptops. PCWorld reports: The IdeaPad Slim 7 Carbon will feature a carbon lid and aluminum body to go with its drop-dead gorgeous 14-inch OLED screen. Besides the infinite contrast an OLED provides, Lenovo will use a fast 90Hz, 2880x1800 panel with an aspect ratio of 16:10 on the Slim 7 Carbon. That's just over 5 megapixels with a density of 243 pixels-per-inch. This is one smoking screen. But this beauty goes deeper than the skin. Inside the Slim 7 Carbon, you'll find an 8-core Ryzen 7 5800U with an optional Nvidia GeForce MX450 GPU. Lenovo will offer up to 16GB of power efficient LPDDR4X RAM and up to a 1TB PCIe SSD.
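PCWorld's screen math checks out. A quick sketch, assuming the panel's stated 14-inch diagonal and 2880x1800 resolution:

```python
import math

w_px, h_px = 2880, 1800   # Slim 7 Carbon panel resolution
diagonal_in = 14.0        # screen diagonal in inches

megapixels = w_px * h_px / 1e6               # ~5.18 ("just over 5 megapixels")
ppi = math.hypot(w_px, h_px) / diagonal_in   # ~242.6 ("243 pixels-per-inch")

print(f"{megapixels:.2f} MP at {ppi:.0f} ppi")
```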
[...]
If a 14-inch screen laptop with a GeForce MX450 isn't enough for you, Lenovo also unveiled a new IdeaPad Slim 7 Pro. It's aimed at someone who needs a little more oomph. The laptop also features a 16:10 aspect ratio screen, which we consider superior to 16:9 aspect ratio laptops for getting work done. The IPS panel shines at a very bright 500 nits, and it's rated for 100 percent of the sRGB spectrum with an option for a 120Hz refresh rate version. Inside the laptop you'll find an 8-core Ryzen 7 5800H, up to 16GB of DDR4, a 1TB PCIe SSD, and up to a GeForce RTX 3050 Laptop graphics chip. Compared to the Slim 7 Carbon, you should expect the CPU to run faster thanks to the additional thermal headroom of the H-class Ryzen chip. The GeForce RTX 3050 Laptop GPU, meanwhile, is based on Nvidia's newest "Ampere" GPU cores instead of the older "Turing" GPU the GeForce MX450 uses in the Slim 7 Carbon. That upgrade translates to far better gaming performance, hardware ray tracing support, and the inclusion of Nvidia hardware encoding and decoding, which can help you use Adobe Premiere on the road.

The Internet

Jagex Nixes Community-Built RuneScape HD Client, Massive Backlash Follows (runescape.com) 22

New submitter Sauce Tin writes: In a blog post, Jagex announced the shutdown of a community-driven RuneScape HD graphics client. The announcement came at an inopportune time -- the community client was prepped for release this week and had been announced years beforehand, with 2,000+ hours of effort by a single individual behind it. Jagex had noticed the effort but raised no opposition -- until recently. Thousands of players vented on the game's subreddit, ultimately reaching the top of r/all. Jagex has a history of infuriating its player base over the years, including the removal of free trade, PvP combat, and LGBT holiday content.
Hardware

ASUS Bets on OLED for All of Its New Creator Laptops (engadget.com) 93

ASUS has just four letters to sell you on its latest creator-focused notebooks: OLED. From a report: The company is bringing OLED screens to all of its new models, a move meant to differentiate them in the increasingly crowded PC market. Compared to traditional LCD screens, OLED offers deeper black levels, vastly better contrast, and more responsiveness. Even today, as LCDs have evolved to be brighter and faster, OLED offers a more pronounced visual "pop." We've been seeing notebooks with OLED for years, like on the XPS 15 and ZenBook, but they've typically been positioned as a premium feature for select models. Now ASUS is trying to make its name synonymous with OLED, so much so that it's bringing it to new mid-range notebooks like the VivoBook Pro 14X and 16X. It's also touting the first 16-inch 4K OLED HDR screens on notebooks across several models: the ProArt Studiobook Pro, ProArt Studiobook and the Vivobook Pro.

Befitting its name, you can expect to see the fastest hardware on the market in the StudioBook Pro 16 OLED (starting at $2,500). It'll be powered by H-series Ryzen 5000 processors, 3rd-gen Intel Xeon chips and NVIDIA's professional-grade RTX A2000 and A5000 GPUs. And if you don't need all of that power, there's also the Studiobook 16 OLED ($2,000), which has the same Ryzen chips, Intel Core i7 CPUs and either RTX 3070 or 3060 graphics. Both notebooks will be equipped with 4K OLED HDR screens that reach up to 550 nits and cover 100 percent of DCI-P3 color gamut. They'll also sport ASUS Dial, a new rotary accessory located at the top of their trackpads, offering similar functionality to Microsoft's forgotten Surface Dial.

Chrome

Chrome 94 Beta Adds WebGPU API With Support For Apple's Metal (9to5mac.com) 36

An anonymous reader quotes a report from 9to5Mac, written by Filipe Esposito: Google this week announced the beta release of Chrome 94, the next update to Google's desktop web browser. In addition to general improvements, the update also adds support for the new WebGPU API, which is intended to replace WebGL and can even access Apple's Metal API. As described by Google in a blog post, WebGPU is a new, more advanced graphics API for the web that is able to access GPU hardware, resulting in better performance for rendering interfaces in websites and web apps.

For those unfamiliar, Metal is an API introduced by Apple in 2014 that provides low-level access to GPU hardware for iOS, macOS, and tvOS apps. In other words, apps can access the GPU without overloading the CPU, which is one of the limitations of old APIs like OpenGL. Google says WebGPU is not expected to come enabled by default for all Chrome users until early 2022. The final release of Chrome 94 should enable WebCodecs for everyone, which is another API designed to improve the encoding and decoding of streaming videos.

PlayStation (Games)

Emulator Runs PS1 Games in 4K on the New Xboxes (inputmag.com) 13

Duckstation, an emulator that allows users to run Playstation games, was recently made available for installation onto the latest generation of Xbox consoles. From a report: It's time to jog those nostalgia muscles, as the emulator will not only be able to play your PS1 favorites but also scale those games up to native 4K resolution at 60fps. In addition to the 4K treatment, Duckstation will let gamers improve the overall look of the emulation experience in a couple of other ways.

One setting disables dithering, an effect that was built into the original PlayStation hardware. Dithering, in layman's terms, was a technique to improve perceived color depth by underpinning graphics with a pattern of lines or dots, which were then blurred by the system's video encoders. Another setting improves graphics by smoothing out the blocky textures on 3D objects. The original low-poly graphics of the PS1 would often look cruder as they were enlarged, so this function smooths out those clunky compositions.
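To make the dithering idea concrete, here is a minimal sketch of ordered (Bayer) dithering, the general family of technique the PS1 hardware used. The 2x2 matrix and helper function are illustrative only, not Duckstation's actual implementation:

```python
# Ordered (Bayer) dithering: a repeating threshold pattern decides whether
# each pixel rounds up or down, so patterns of dots approximate shades the
# limited color depth cannot represent directly.
BAYER_2X2 = [[0, 2],
             [3, 1]]  # threshold indices 0..3

def dither_row(shades, y, levels=4):
    """Quantize 0..255 gray values to pure 0/255 using a 2x2 Bayer matrix."""
    out = []
    for x, value in enumerate(shades):
        threshold = (BAYER_2X2[y % 2][x % 2] + 0.5) / levels  # 0..1
        out.append(255 if value / 255 > threshold else 0)
    return out

# A flat mid-gray (128) becomes an alternating dot pattern rather than a
# uniform block -- blurred by a video encoder, it reads as gray.
print(dither_row([128] * 4, y=0))  # [255, 0, 255, 0]
print(dither_row([128] * 4, y=1))  # [0, 255, 0, 255]
```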

Transportation

Older Tesla Vehicles To Get UI Performance Boost Thanks To Famed Video Game Engineer (electrek.co) 86

Tesla is working with famed video game engineer John Carmack to improve the interface performance in older vehicles. Electrek reports: Carmack is a legend in the video game world and in the broader computer science industry. He made important advancements in 3D computer graphics and was the lead programmer on game-changing video games like Doom and Quake. Later in his career, he focused his talents on virtual reality and became CTO of Oculus. More recently, he stepped down from his role at Oculus to focus on general artificial intelligence. In the 2000s, Carmack also had an interest in rocketry and started Armadillo Aerospace.

Several of these interests overlap with Elon Musk's, who has a lot of respect for Carmack and tried to hire him for a long time. While it doesn't sound like Musk has convinced him to come work with him just yet, Carmack confirmed that he is actually working on a Tesla product. Carmack drives a Tesla Model S, and he confirmed that he is working with Tesla engineers to improve interface performance: "I did kind of volunteer to help them fix what I consider very poor user interface performance on the older model S (that I drive). Their engineers have been sharing data with me." Tesla has had performance issues with its older media control unit found in older Tesla Model S vehicles. The automaker offers a media computer upgrade to improve performance, but you are stuck if you don't want to pay the $2,500 upgrade.

Intel

45 Teraflops: Intel Unveils Details of Its 100-Billion Transistor AI Chip (siliconangle.com) 16

At its annual Architecture Day semiconductor event Thursday, Intel revealed new details about its powerful Ponte Vecchio chip for data centers, reports SiliconANGLE: Intel is looking to take on Nvidia Corp. in the AI silicon market with Ponte Vecchio, which the company describes as its most complex system-on-chip or SOC to date. Ponte Vecchio features some 100 billion transistors, nearly twice as many as Nvidia's flagship A100 data center graphics processing unit. The chip's 100 billion transistors are divided among no fewer than 47 individual processing modules made using five different manufacturing processes. Normally, an SOC's processing modules are arranged side by side in a flat two-dimensional design. Ponte Vecchio, however, stacks the modules on one another in a vertical, three-dimensional structure created using Intel's Foveros technology.

The bulk of Ponte Vecchio's processing power comes from a set of modules aptly called the Compute Tiles. Each Compute Tile has eight Xe cores, GPU cores specifically optimized to run AI workloads. Every Xe core, in turn, consists of eight vector engines and eight matrix engines, processing modules specifically built to run the narrow set of mathematical operations that AI models use to turn data into insights... Intel shared early performance data about the chip in conjunction with the release of the technical details. According to the company, early Ponte Vecchio silicon has demonstrated performance of more than 45 teraflops, or about 45 trillion operations per second.
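The per-tile figures above can be tallied directly. This back-of-envelope sketch uses only the numbers quoted in the article; it says nothing about clock speeds or per-engine throughput, which Intel did not disclose here:

```python
# Tally Ponte Vecchio's compute hierarchy from the article's figures.
XE_CORES_PER_TILE = 8        # Xe cores per Compute Tile
VECTOR_ENGINES_PER_CORE = 8  # vector engines per Xe core
MATRIX_ENGINES_PER_CORE = 8  # matrix engines per Xe core

engines_per_tile = XE_CORES_PER_TILE * (
    VECTOR_ENGINES_PER_CORE + MATRIX_ENGINES_PER_CORE
)
print(engines_per_tile)  # 128 engines in each Compute Tile
```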

The article adds that it achieved those speeds while processing 32-bit single-precision floating-point values — and that at least one customer has already signed up to use Ponte Vecchio. The Argonne National Laboratory will include Ponte Vecchio chips in its upcoming $500 million Aurora supercomputer. When fully operational, Aurora will deliver one exaflop of performance, the equivalent of a quintillion calculations per second.
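To put the two round numbers side by side, here is a rough scale comparison between one early Ponte Vecchio part and the full Aurora machine. This is purely arithmetic on the article's figures, not a statement about how many chips Aurora will actually contain (memory, interconnect, and CPU contributions are all ignored):

```python
# Compare one chip's demonstrated FP32 rate to Aurora's exaflop target.
chip_flops = 45e12     # ~45 teraflops demonstrated per early chip
aurora_flops = 1e18    # one exaflop target for the full system

chips_equivalent = aurora_flops / chip_flops
print(round(chips_equivalent))  # ~22222 chip-equivalents of compute
```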

Businesses

Laptop Shortage is Easing as Pandemic Demand Wanes (bloomberg.com) 17

Since early in the pandemic, soaring demand for consumer electronics led to persistent chip shortages. Some recent signs suggest the situation may finally be starting to change. From a report: An executive at the memory chip maker Micron Technology said last week at an investor conference that demand for consumer PCs is slowing and that some of its customers have more chips lying around. A day later, Morgan Stanley downgraded several chip stocks in a note titled "Winter is Coming." The analysts said PC inventory is rising and that the smartphone market is likely to experience similar deterioration. An old investor maxim says technology companies tend to handily outperform during cyclical upswings while the reverse is true on the downside. Well, the industry is beginning to fall short of estimates.

Global PC shipments grew by 13% in the second quarter, according to research firm IDC. That was below Evercore ISI's expectation of 18% and a big deceleration from the 55% rise in the first quarter. Furthermore, wireless router manufacturer Netgear Inc. gave disappointing guidance last month, adding that sales were worse than expected in its consumer networking category. Still, it's probably too soon to declare an end to the shortage. Outbreaks of the delta variant and the long-term efficacy of vaccines make predictions even harder than usual. Some chip analysts have said reports of weakness are primarily seasonal and that sales will pick up through next year. Shortages also vary by part. So even if you can walk into a store and find plenty of laptops, you'll still struggle to get a new car or a video game console.

In some cases, chip delivery times are longer than 20 weeks, the longest wait in at least four years. But as I wrote last month, the pandemic rush to computers and printers won't repeat itself. Once a worker or student buys a laptop, they don't need another one for several years. Retailers are offering extensive discounts on nearly every PC-related category, with the exception of graphics cards. (It's still a good time to be in the games business.) The waning demand for PCs will likely last for at least several more quarters.

Intel

Intel Enters the PC Gaming GPU Battle With Arc 92

Dave Knott writes: Intel is branding its upcoming consumer GPUs as Intel Arc. This new Arc brand will cover both the hardware and software powering Intel's high-end discrete GPUs, as well as multiple hardware generations. The first of those, known previously as DG2, is expected to arrive in the form of codename "Alchemist" in Q1 2022. Intel's Arc GPUs will be capable of mesh shading, variable rate shading, video upscaling, and real-time ray tracing. Most importantly, Intel is also promising AI-accelerated super sampling, which sounds like Intel has its own competitor to Nvidia's Deep Learning Super Sampling (DLSS) technology.
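For context on what AI-accelerated super sampling improves upon, here is the naive non-AI baseline: plain nearest-neighbor upscaling of a low-resolution render. DLSS-style approaches (and, presumably, Intel's Arc equivalent) replace this kind of fixed resampling with a learned reconstruction; the function below is an illustrative sketch, not anything from Intel's or Nvidia's actual pipelines:

```python
def upscale_nearest(frame, factor):
    """Nearest-neighbor upscale of a 2D frame (list of pixel rows).
    Each source pixel is duplicated into a factor x factor block."""
    out = []
    for row in frame:
        # Repeat each pixel horizontally, then repeat the row vertically.
        wide = [v for v in row for _ in range(factor)]
        out.extend(list(wide) for _ in range(factor))
    return out
```

Rendering at low resolution and upscaling this way is cheap but blocky; the promise of AI super sampling is recovering near-native detail from the same low-resolution input.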

Slashdot Top Deals