Google

Google Used Reinforcement Learning To Design Next-Gen AI Accelerator Chips (venturebeat.com) 18

Chip floorplanning is the engineering task of designing the physical layout of a computer chip. In a paper published in the journal Nature, Google researchers applied a deep reinforcement learning approach to chip floorplanning, creating a new technique that "automatically generates chip floorplans that are superior or comparable to those produced by humans in all key metrics, including power consumption, performance and chip area." VentureBeat reports: The Google team's solution is a reinforcement learning method capable of generalizing across chips, meaning that it can learn from experience to become both better and faster at placing new chips. Training AI-driven design systems that generalize across chips is challenging because it requires learning to optimize the placement of all possible chip netlists (graphs of circuit components like memory components and standard cells including logic gates) onto all possible canvases. [...] The researchers' system aims to place a "netlist" graph of logic gates, memory, and more onto a chip canvas, such that the design optimizes power, performance, and area (PPA) while adhering to constraints on placement density and routing congestion. The graphs range in size from millions to billions of nodes grouped in thousands of clusters, and typically, evaluating the target metrics takes from hours to over a day.

Starting with an empty chip, the Google team's system places components sequentially until it completes the netlist. To guide the system in selecting which components to place first, components are sorted by descending size; placing larger components first reduces the chance that no feasible placement remains for them later. Training the system required creating a dataset of 10,000 chip placements, where the input is the state associated with the given placement and the label is the reward for the placement (i.e., wirelength and congestion). The researchers built it by first picking five different chip netlists, to which an AI algorithm was applied to create 2,000 diverse placements for each netlist. The system took 48 hours to "pre-train" on an Nvidia Volta graphics card and 10 CPUs, each with 2GB of RAM. Fine-tuning initially took up to 6 hours, but applying the pre-trained system to a new netlist without fine-tuning generated a placement in less than a second on a single GPU in later benchmarks. In one test, the Google researchers compared their system's recommendations with a manual baseline: the production design of a previous-generation TPU chip created by Google's TPU physical design team. Both the system and the human experts consistently generated viable placements that met timing and congestion requirements, but the AI system also outperformed or matched manual placements in area, power, and wirelength while taking far less time to meet design criteria.
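The two ideas the article describes — sequential placement in descending size order, and a reward based on wirelength — can be illustrated with a toy sketch. This is not Google's system (which uses a learned policy network); it is a minimal, hypothetical stand-in showing the ordering heuristic and a half-perimeter wirelength score:

```python
# Toy sketch (not Google's actual method): place components one at a
# time, largest first, and score the result by total half-perimeter
# wirelength, the proxy reward mentioned in the article.

def place_sequentially(components, grid_size):
    """components: list of (name, width, height); returns {name: (x, y)}."""
    # Sort by descending area so large blocks go down while space remains.
    ordered = sorted(components, key=lambda c: c[1] * c[2], reverse=True)
    placement, cursor_x, cursor_y, row_h = {}, 0, 0, 0
    for name, w, h in ordered:
        if cursor_x + w > grid_size:          # row full: start a new row
            cursor_x, cursor_y = 0, cursor_y + row_h
            row_h = 0
        placement[name] = (cursor_x, cursor_y)
        cursor_x += w
        row_h = max(row_h, h)
    return placement

def wirelength(placement, nets):
    """Half-perimeter wirelength over nets (lists of component names)."""
    total = 0
    for net in nets:
        xs = [placement[n][0] for n in net]
        ys = [placement[n][1] for n in net]
        total += (max(xs) - min(xs)) + (max(ys) - min(ys))
    return total
```

In the real system this greedy row-packing is replaced by a trained policy, and the reward also folds in congestion; the sketch only shows why placing big blocks first and scoring by wirelength gives a well-defined optimization target.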

Operating Systems

Linux X86/x86_64 Will Now Always Reserve the First 1MB of RAM (phoronix.com) 77

AmiMoJo shares a report from Phoronix: The Linux x86/x86_64 kernel code already had logic in place for reserving portions of the first 1MB of RAM to avoid the BIOS or kernel potentially clobbering that space, among other reasons. Now Linux 5.13 is doing away with that 'wankery' and will just unconditionally reserve the first 1MB of RAM. The kernel was already catering to Intel Sandy Bridge graphics accessing memory below the 1MB mark, the first 64K of memory being known to be corrupted by some BIOSes, and similar problems coming up in that low area of memory. Rather than dealing with all that logic and other possible niche cases besides the EGA/VGA frame-buffer and BIOS, the kernel is playing it safe and just always reserving the first 1MB of RAM so it will not get clobbered.
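The change amounts to replacing several conditional reservations with one unconditional one. A hypothetical simplification (not the actual kernel patch, which works on the memblock allocator in C) of the before-and-after behavior:

```python
# Hypothetical simplification of the Linux 5.13 change: instead of
# conditionally reserving sub-ranges (the first 64K for buggy BIOSes,
# the EGA/VGA frame-buffer region, etc.), the kernel now reserves the
# entire first 1MB up front.

SZ_1M = 1 << 20  # 1,048,576 bytes

reserved = []  # list of (start, end) byte ranges the allocator must skip

def reserve(start, size):
    reserved.append((start, start + size))

def is_usable(addr):
    """True if the allocator may hand out memory at this address."""
    return not any(start <= addr < end for start, end in reserved)

reserve(0, SZ_1M)  # Linux 5.13: always reserve the low 1MB
```

The trade-off is losing at most 1MB of usable RAM in exchange for never having to enumerate every firmware quirk that can corrupt low memory.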
KDE

KDE Plasma 5.22 Released (phoronix.com) 13

KDE Plasma 5.22 is now available, bringing "hugely improved" Wayland support, better performance for gaming, adaptive transparency for the panel and widgets, and more. Phoronix reports: There is now support for variable refresh rate (VRR) / Adaptive-Sync on Wayland, vertical/horizontal maximization now working with KWin Wayland, global menu applet support under Wayland, support for activities, and a lot of other general improvements and fixes, so the overall Wayland support is much more polished and nearly on par with the X.Org Server support.

The performance for gaming with KDE Plasma on Wayland should also be better now that there is direct scan-out support for full-screen windows. Rounding out the graphics fun with this release is GPU hot-plugging support on Wayland for KWin, such as when using an external GPU or USB display adapter. KDE Plasma 5.22 also delivers adaptive transparency for the panel and widgets, desktop notification improvements, Plasma System Monitor replacing KSysGuard as the default system monitoring application, and a variety of other improvements.
You can view the full changelog for Plasma 5.22 here.
The Courts

Dell Hit With Fraud Case Over Alienware Area-51m Upgrade Claims (tomshardware.com) 43

A California man has filed a class action lawsuit against PC manufacturer Dell, claiming that the company "intentionally misled and deceived" buyers of its Alienware Area 51-m R1 gaming laptop, which was advertised as more upgradeable than other gaming notebooks. From a report: The plaintiff, Robert Felter, who is based in San Francisco, alleges that Dell misled customers into believing that the laptop would be upgradeable, possibly into future generations of components. The case, Felter v. Dell Technologies, Inc. (3:21-cv-04187), has been filed with the United States District Court in the Northern District of California. The Alienware Area 51-m was announced at CES 2019 and launched soon after. (The complaint claims the announcement was made in the summer of 2019, which is incorrect.) Among the Area 51-m's biggest touted innovations were a user-replaceable CPU and GPU.

At media briefings, Alienware representatives told the press that the CPU could be upgraded as long as it used Intel's Z390 chipset. The laptop used Intel's 9th Gen Core desktop processors, up to the Intel Core i9-9900K. Dell developed separate proprietary Dell Graphics Form Factor (DGFF) modules for the Nvidia graphics. The lawsuit, however, claims that consumers were told that "core components" (meaning the CPU and GPU) could be replaced beyond the current generation of hardware. "Dell's advertisement to the public didn't place any restrictions on the upgradeability of the laptop," lawyer David W. Kani said in an email to Tom's Hardware. "They also never disclosed that those with the highest spec CPU and/or GPU that their device would not be upgradeable."

Bitcoin

Norton 360 Antivirus Now Lets You Mine Ethereum Cryptocurrency (bleepingcomputer.com) 66

NortonLifeLock has added the ability to mine Ethereum cryptocurrency directly within its Norton 360 antivirus program as a way to "protect" users from malicious mining software. BleepingComputer reports: This new mining feature is called 'Norton Crypto' and will be rolling out tomorrow to Norton 360 users enrolled in Norton's early adopter program. When Norton Crypto is enabled, the software will use the device's graphics card (GPU) to mine for Ethereum, which will then be transferred into a Norton wallet hosted in the cloud. It is not clear if every device running Norton Crypto is mining independently or as part of a pool of users for a greater chance of earning rewards of Ethereum.

As the difficulty of mining Ethereum by yourself is very high, Norton users will likely be pooled together for greater chances of mining a block. If Norton is operating a pool for this new feature, it may take a small fee on all mined Ethereum, as is common among pool operators, making this new feature a revenue generator for the company.
"As the crypto economy continues to become a more important part of our customers' lives, we want to empower them to mine cryptocurrency with Norton, a brand they trust," said Vincent Pilette, CEO of NortonLifeLock. "Norton Crypto is yet another innovative example of how we are expanding our Cyber Safety platform to protect our customers' ever-evolving digital lives."
AMD

AMD Unveils Radeon RX 6000M Mobile GPUs For New Breed of All-AMD Gaming Laptops (hothardware.com) 15

MojoKid writes: AMD just took the wraps off its new line of Radeon RX 6000M GPUs for gaming laptops. Combined with its Ryzen 5000 series processors, the company claims all-AMD powered "AMD Advantage" machines will deliver new levels of performance, visual fidelity and value for gamers. AMD unveiled three new mobile GPUs. Sitting at the top is the Radeon RX 6800M, featuring 40 compute units, 40 ray accelerators, a 2,300MHz game clock and 12GB of GDDR6 memory. According to AMD, its flagship Radeon RX 6800M mobile GPU can deliver 120 frames per second at 1440p with a blend of raytracing, compute, and traditional effects.

Next, the new Radeon RX 6700M sports 36 compute units, 36 ray accelerators, a 2,300MHz game clock and 10GB of GDDR6 memory. Finally, the Radeon RX 6600M comes armed with 28 compute units and 28 ray accelerators, a 2,177MHz game clock and 8GB of GDDR6 memory. HotHardware has a deep dive review of a new ASUS ROG Strix G15 gaming laptop with the Radeon RX 6800M on board, as well as an 8-core Ryzen 9 5900HX processor. In the benchmarks, the Radeon RX 6800M-equipped machine puts up numbers that rival GeForce RTX 3070 and 3080 laptop GPUs in traditional rasterized game engines, though it trails a bit in ray tracing enhanced gaming. You can expect this new breed of all-AMD laptops to arrive on the market sometime later this month.

Hardware

The GeForce RTX 3080 Ti is Nvidia's 'New Gaming Flagship' (pcworld.com) 60

Nvidia officially announced the long-awaited GeForce RTX 3080 Ti during its Computex keynote late Monday night, and this $1,200 graphics card looks like an utter beast. The $600 GeForce RTX 3070 Ti also made its debut with faster GDDR6X memory. From a report: All eyes are on the RTX 3080 Ti, though. Nvidia dubbed it GeForce's "new gaming flagship" as the $1,500 RTX 3090 is built for work and play alike, but the new GPU is a 3090 in all but name (and memory capacity). While Nvidia didn't go into deep technical details during the keynote, the GeForce RTX 3080 Ti's specifications page shows it packing a whopping 10,240 CUDA cores -- just a couple hundred less than the 3090's 10,496 count, but massively more than the 8,704 found in the vanilla 3080.

Expect this card to chew through games on par with the best, especially in games that support real-time ray tracing and Nvidia's amazing DLSS feature. The memory system can handle the ride, as it's built using the RTX 3090's upgraded bones. The GeForce RTX 3080 Ti comes with a comfortable 12GB of blazing-fast GDDR6X memory over a wide 384-bit bus, which is half the ludicrous 24GB capacity found in the 3090, but more than enough to handle any gaming workload you throw at it. That's not true of the vanilla RTX 3080, which comes with 10GB of GDDR6X over a smaller bus, as rare titles (like Doom Eternal) can already use more than 10GB of memory when you're playing at 4K resolution with the eye candy cranked to the max. The extra two gigs make the RTX 3080 Ti feel much more future-proof.

AMD

Samsung Exynos Chip With AMD Graphics To Bring Ray-Tracing To Mobile (liliputing.com) 26

Two years after announcing plans to bring AMD graphics to Samsung Exynos mobile chips, it looks like the first of those chips could be ready to launch soon. From a report: During a Computex keynote, AMD's Lisa Su said that Samsung's "next flagship" mobile system-on-a-chip would feature custom graphics from AMD based on the company's RDNA 2 graphics architecture. What does that mean for mobile devices powered by the chip? The kind of graphics horsepower that has usually been associated with discrete GPUs. Su says that the upcoming Exynos chip will support features including ray tracing and variable rate shading. While that wouldn't make it the first ARM-based chip with those features (Apple's M1 processor also supports ray tracing), it could still be enough to help give Samsung an edge over rival Qualcomm.
Intel

Intel's latest 11th Gen Processor Brings 5.0GHz Speeds To Thin and Light Laptops (theverge.com) 51

Intel made a splash earlier in May with the launch of its first 11th Gen Tiger Lake H-series processors for more powerful laptops, but at Computex 2021, the company is also announcing a pair of new U-series chips -- one of which marks the first 5.0GHz clock speed for the company's U-series lineup of lower voltage chips. From a report: Specifically, Intel is announcing the Core i7-1195G7 -- its new top-of-the-line chip in the U-series range -- and the Core i5-1155G7, which takes the crown of Intel's most powerful Core i5-level chip, too. Like the original 11th Gen U-series chips, the new chips operate in the 12W to 28W range. Both new chips are four core / eight thread configurations, and feature Intel's Iris Xe integrated graphics (the Core i7-1195G7 comes with 96 EUs, while the Core i5-1155G7 has 80 EUs).

The Core i7-1195G7 features a base clock speed of 2.9GHz, but cranks up to a 5.0GHz maximum single core speed using Intel's Turbo Boost Max 3.0 technology. The Core i5-1155G7, on the other hand, has a base clock speed of 2.5GHz and a boosted speed of 4.5GHz. Getting to 5GHz out of the box is a fairly recent development for laptop CPUs, period: Intel's first laptop processor to cross the 5GHz mark arrived in 2019.

Graphics

Resale Prices Triple for NVIDIA Chips as Gamers Compete with Bitcoin Miners (yahoo.com) 108

"In the niche world of customers for high-end semiconductors, a bitter feud is pitting bitcoin miners against hardcore gamers," reports Quartz: At issue is the latest line of NVIDIA graphics cards — powerful, cutting-edge chips with the computational might to display the most advanced video game graphics on the market. Gamers want the chips so they can experience ultra-realistic lighting effects in their favorite games. But they can't get their hands on NVIDIA cards, because miners are buying them up and adapting them to crunch cryptographic codes and harvest digital currency. The fierce competition to buy chips — combined with a global semiconductor shortage — has driven resale prices up as much as 300%, and led hundreds of thousands of desperate consumers to sign up for daily raffles for the right to buy chips at a significant mark-up.

To broker a peace between its warring customers, NVIDIA is, essentially, splitting its cutting-edge graphics chips into two dumbed-down products: GeForce for gamers and the Cryptocurrency Mining Processor (CMP) for miners. GeForce is the latest NVIDIA graphics card — except key parts of it have been slowed down to make it less valuable for miners racing to solve crypto puzzles. CMP is based on a slightly older version of NVIDIA's graphics card which has been stripped of all of its display outputs, so gamers can't use it to render graphics.

NVIDIA's goal in splitting its product offerings is to incentivize miners to only buy CMP chips, and leave the GeForce chips for the gamers. "What we hope is that the CMPs will satisfy the miners...[and] steer our GeForce supply to gamers," said CEO Jensen Huang on a May 26 conference call with investors and analysts... It won't be easy to keep the miners at bay, however. NVIDIA tried releasing slowed-down graphics chips in February in an effort to deter miners from buying them, but it didn't work. The miners quickly figured out how to hack the chips and make them perform at full-speed again.

Microsoft

Millions Can Now Run Linux GUI Apps in Windows 10 (bleepingcomputer.com) 203

"You can now use GUI app support on Windows Subsystem for Linux (WSL)," Microsoft announced this week, "so that all the tools and workflows of Linux run on your developer machine." Bleeping Computer has already tested it running Gnome's file manager Nautilus, the open-source application monitor/task manager Stacer, the backup software Timeshift, and even the game Hedgewars.

Though it's currently available only to the millions who've registered for Windows 10 "Insider Preview" builds, it's already drawing positive reviews. "With the Windows Subsystem for Linux, developers no longer need to dual-boot a Windows and Linux system," argues the Windows Central site, "as you can now install all the Linux stuff a developer would need right on top of Windows instead."

Finally formally announced at this week's annual Microsoft Build conference, the new functionality runs graphical Linux apps "seamlessly," according to Tech Radar, calling the feature "highly anticipated." Arguably one of the biggest, and surely the most exciting, updates to the Windows 10 WSL, WSLg has been in the works for quite a while; Microsoft first demoed it at last year's conference, before releasing the preview in April... Microsoft recommends running WSLg after enabling support for virtual GPU (vGPU) for WSL, in order to take advantage of 3D acceleration within the Linux apps.... WSLg also supports audio and microphone devices, which means the graphical Linux apps will also be able to record and play audio.

Keeping in line with its developer slant, Microsoft also announced that since WSLg can now help Linux apps leverage the graphics hardware on the Windows machine, the subsystem can be used to efficiently run Linux AI and ML workloads... If WSLg developers are to be believed, the update is expected to be generally available alongside the upcoming release of Windows.

Bleeping Computer explains that WSLg launches a "companion system distro" with Wayland, X, and Pulse Audio servers, calling its bundling with Windows 10 "an exciting development as it blurs the lines between Linux and Windows 10, and fans get the benefits of both worlds."
Piracy

German 'Upload Filter' Law Sets Standards To Prevent Overblocking 31

AmiMoJo writes: The German Parliament has adopted new legislation that will implement the EU Copyright Directive into local law. This includes the controversial Article 17 that, according to some, would lead to overbroad upload filters. To deal with these concerns, the German law prevents 'minor' and limited use of copyrighted content from being blocked automatically. These 'presumably authorized' uploads should not be blocked automatically if they qualify for all of the selection criteria below:

1. The upload should use less than 50% of the original copyrighted work
2. The upload must use the copyrighted work in combination with other content
3. The use should be 'minor'

The term 'minor' applies to non-commercial uses of fewer than 15 seconds of video or audio, 160 characters of text, or 125 kB of graphics. If the use of a copyrighted work exceeds these 'minor' thresholds, it can still qualify as 'presumably authorized' when the uploader flags it as an exception.
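The three criteria and the 'minor' thresholds combine into a simple decision rule. A sketch of that rule follows (function and parameter names are mine, not from the law, and this simplifies the statute's actual text):

```python
# Sketch of the 'presumably authorized' test described above: an upload
# should not be auto-blocked if it uses less than 50% of the original
# work, combines it with other content, and is 'minor' -- or, failing
# the 'minor' thresholds, if the uploader flags it as an exception.

MINOR_LIMITS = {
    "video_seconds": 15,   # fewer than 15 seconds of video
    "audio_seconds": 15,   # fewer than 15 seconds of audio
    "text_chars": 160,     # fewer than 160 characters of text
    "image_kb": 125,       # less than 125 kB of graphics
}

def is_minor(kind, amount, commercial):
    """'Minor' use: non-commercial and under the per-medium threshold."""
    return not commercial and amount < MINOR_LIMITS[kind]

def presumably_authorized(fraction_of_work, combined_with_other_content,
                          kind, amount, commercial,
                          flagged_as_exception=False):
    # Criteria 1 and 2 are mandatory.
    if fraction_of_work >= 0.5 or not combined_with_other_content:
        return False
    # Criterion 3: 'minor' use, or an explicit exception flag if over limit.
    return is_minor(kind, amount, commercial) or flagged_as_exception
```

Note the asymmetry: exceeding the 'minor' thresholds does not make an upload blockable outright; it only shifts the burden to the uploader to flag the use.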
Technology

Snap's New Spectacles Let You See the World in Augmented Reality (theverge.com) 34

Snap's new Spectacles glasses are its most ambitious yet. But there's a big catch: you can't buy them. From a report: On Thursday, Snap CEO Evan Spiegel unveiled the company's first true augmented reality glasses, technology that he and rivals like Facebook think will one day be as ubiquitous as mobile phones. A demo showed virtual butterflies fluttering over colorful plants and landing in Spiegel's extended hand. The new Spectacles have dual waveguide displays capable of superimposing AR effects made with Snapchat's software tools. The frame features four built-in microphones, two stereo speakers, and a built-in touchpad. Front-facing cameras help the glasses detect objects and surfaces you're looking at so that graphics more naturally interact with the world around you.

[...] The idea is to encourage a small portion of the 200,000 people who already make AR effects in Snapchat to experiment with creating experiences for the new Spectacles, according to Spiegel. Like the bright yellow vending machines Snap used to sell the first version of Spectacles several years ago, the approach could end up being a clever way to build buzz for the glasses ahead of their wide release. Spiegel has said that AR glasses will take roughly a decade to reach mainstream adoption. "I don't believe the phone is going away," he told The Verge in an interview this week. "I just think that the next generation of Spectacles can help unlock a new way to use AR hands-free, and the ability to really roam around with your eyes looking up at the horizon, out at the world."

IT

Nvidia is Nerfing New RTX 3080 and 3070 Cards for Ethereum Cryptocurrency Mining (theverge.com) 122

Nvidia is extending its cryptocurrency mining limits to newly manufactured GeForce RTX 3080, RTX 3070, and RTX 3060 Ti graphics cards. From a report: After nerfing the hash rates of the RTX 3060 for its launch in February, Nvidia is now starting to label new cards with a "Lite Hash Rate" or "LHR" identifier to let potential customers know the cards will be restricted for mining. "This reduced hash rate only applies to newly manufactured cards with the LHR identifier and not to cards already purchased," says Matt Wuebbling, Nvidia's head of GeForce marketing.

"We believe this additional step will get more GeForce cards at better prices into the hands of gamers everywhere." These new RTX 3060 Ti, RTX 3070, and RTX 3080 cards will start shipping later this month, and the LHR identifier will be displayed in retail product listings and on the box. Nvidia originally started hash limiting with the RTX 3060, and the company has already committed to not limiting the performance of GPUs already sold.

Classic Games (Games)

After 35 Years, Classic Shareware Game 'Cap'n Magneto' Finally Fully Resurrected (statesman.com) 23

A newspaper in Austin, Texas shares the story behind a cult-classic videogame, the 1985 Macintosh shareware game "Cap'n Magneto."

It was the work of Al Evans, who'd "decided to live life to the fullest after suffering severe burn injuries in 1963" at the age of 17. Beneath the surface, "Cap'n Magneto" is a product of its creator's own quest to overcome adversity after a terrible car crash — an amalgamation of hard-earned lessons on the value of relationships, being an active participant in shaping the world and knowing how to move on... "Whether I was going to survive at all was very iffy," Evans said. "The chance of me living to the age of 28 or 30 was below 30% or something like that." Regardless of how much time he had left, Evans said he refused to let his injuries hold him back from living his life to the fullest. He would live his life with honesty, he decided, and do his best to always communicate with others truthfully. "I wasn't going to spend the next two years of my life dorking around different hospitals. So I said what's the alternative?" Evans said...

To float his many hobbies and interests, however, Evans knew he had to make money. In addition to doing work as a graphic designer and a translator, he picked up computer programming, which opened his eyes to a digital frontier that allowed for the creation of new worlds with the stroke of a keyboard. When he realized the technical capabilities of the Macintosh — the first personal computer that had a graphics-driven user interface and a built-in mouse function — Evans said he set out to build a world that could marry storytelling and graphics. With the help of his wife Cea, Evans created his one and only computer game: "Cap'n Magneto."

"I really wanted to write a good game, and I definitely think it was that," Evans said...

"What really marked it as different, though, was that the alien speech, once ungarbled by a tricorder item that players had to find, would be spoken aloud through the Mac's built-in speech synthesizer and written on-screen in comic-style speech bubbles," said Australia-based gaming historian, author and journalist Richard Moss. "And unlike most role playing games of the time, every character you'd meet in the game could be friendly and helpful or cold and dismissive or aggressive and hostile — depending on a mix of random chance and player choice...."

With "Cap'n Magneto," Evans said he wanted to make sure that players could befriend the non-playable alien characters that the hero encounters. Though the game is beatable without their help, it is significantly easier with the help of allies. A reality in which everyone was an enemy, to Evans, was simply dishonest.

"That doesn't reflect the game of life, you know? Some people, well, most people actually, are probably pretty friendly," he said.

35 years after its release, Evans — now 75 years old — received a message on Facebook informing him that the game was still being played — but no one could finish it because the built-in "nagware" required payments that couldn't be completed.

That problem has finally been fixed, and long-time Slashdot reader shanen now shares the web site where the full game can finally be downloaded.
Security

Hackers Used Fake GPU Overclocking Software To Push Malware (vice.com) 11

Computer hardware maker MSI is warning gamers not to visit a website that's impersonating the brand and its graphics card overclocking software, Afterburner, to push malware. From a report: On Thursday, MSI published a press release warning of "a malicious software being disguised as the official MSI Afterburner." "The malicious software is being unlawfully hosted on a suspicious website impersonating as MSI's official website with the domain name https:// afterburner - msi [ . ] space," the company wrote. "MSI has no relation with this website or the aforementioned domain. [...] This webpage is hosting software which may contain virus, trojan, keylogger, or other type of malicious program that have been disguised to look like MSI Afterburner," the company added. "DO NOT DOWNLOAD ANY SOFTWARE FROM THIS WEBSITE."
AI

GTA 5 Graphics Are Now Being Boosted By Advanced AI At Intel (gizmodo.com) 44

Researchers at Intel Labs have applied machine learning techniques to GTA 5 to make it look incredibly realistic. Gizmodo reports: [I]nstead of training a neural network on famous masterpieces, the researchers at Intel Labs relied on the Cityscapes Dataset, a collection of images of a German city's urban center captured by a car's built-in camera, for training. When a different artistic style is applied to footage using machine learning techniques, the results are often temporally unstable, which means that frame by frame there are weird artifacts jumping around, appearing and disappearing, that diminish how real the results look. With this new approach, the rendered effects exhibit none of those telltale artifacts, because in addition to processing the footage rendered by Grand Theft Auto V's game engine, the neural network also uses other rendered data the game's engine has access to, like the depth of objects in a scene, and information about how the lighting is being processed and rendered.

That's a gross simplification -- you can read a more in-depth explanation of the research here -- but the results are remarkably photorealistic. The surface of the road is smoothed out, highlights on vehicles look more pronounced, and the surrounding hills in several clips look more lush and alive with vegetation. What's even more impressive is that the researchers think, with the right hardware and further optimization, the gameplay footage could be enhanced by their convolutional network at "interactive rates" -- another way to say in real-time -- when baked into a video game's rendering engine.

Apple

Apple's M2 Chip Goes Into Mass Production for Mac (nikkei.com) 235

The next generation of Mac processors designed by Apple entered mass production this month, Nikkei Asia reported Tuesday, citing sources, bringing the U.S. tech giant one step closer to its goal of replacing Intel-designed central processing units with its own. From the report: Shipments of the new chipset -- tentatively known as the M2, after Apple's current M1 processor -- could begin as early as July for use in MacBooks that are scheduled to go on sale in the second half of this year, the people said. The new chipset is produced by key Apple supplier Taiwan Semiconductor Manufacturing Co., the world's largest contract chipmaker, using the latest semiconductor production technology, known as 5-nanometer plus, or N5P. Producing such advanced chipsets takes at least three months. The start of mass production came as Apple introduced new iMac and iPad Pro models using the M1. The company said the M1 offers CPU performance up to 85% faster than an iMac using an Intel chipset, and graphics performance that is twice as fast.
Ubuntu

Canonical Launches Ubuntu 21.04 'Hirsute Hippo' 46

Canonical released Ubuntu 21.04 with native Microsoft Active Directory integration, Wayland graphics by default, and a Flutter application development SDK. Separately, Canonical and Microsoft have announced performance optimization and joint support for Microsoft SQL Server on Ubuntu. Canonical blog adds: "Native Active Directory integration and certified Microsoft SQL Server on Ubuntu are top priorities for our enterprise customers," said Mark Shuttleworth, CEO of Canonical. "For developers and innovators, Ubuntu 21.04 delivers Wayland and Flutter for smoother graphics and clean, beautiful, design-led cross-platform development." You can read the full list of new features and changelog here.
Desktops (Apple)

Parallels 16.5 Can Virtualize ARM Windows Natively on M1 Macs With Up to 30% Faster Performance (macrumors.com) 60

Parallels today announced the release of Parallels Desktop 16.5 for Mac with full support for M1 Macs, allowing for the Windows 10 ARM Insider Preview and ARM-based Linux distributions to be run in a virtual machine at native speeds on M1 Macs. From a report: Parallels says running a Windows 10 ARM Insider Preview virtual machine natively on an M1 Mac results in up to 30 percent better performance compared to a 2019 model 15-inch MacBook Pro with an Intel Core i9 processor, 32GB of RAM, and Radeon Pro Vega 20 graphics. Parallels also indicates that on an M1 Mac, Parallels Desktop 16.5 uses 2.5x less energy than on the latest Intel-based MacBook Air. Microsoft does not yet offer a retail version of ARM-based Windows, with the Windows 10 ARM Insider Preview available on Microsoft's website for Windows Insider program members. The ability to run macOS Big Sur in a virtual machine is a feature that Parallels hopes to add support for in Parallels Desktop later this year as well.
