Graphics

Vice Mocks GIFs as 'For Boomers Now, Sorry'. (And For Low-Effort Millennials) (vice.com) 227

"GIF folders were used by ancient civilisations as a way to store and catalogue animated pictures that were once employed to convey emotion," Vice writes: Okay, you probably know what a GIF folder is — but the concept of a special folder needed to store and save GIFs is increasingly alien in an era where every messaging app has its own in-built GIF library you can access with a single tap. And to many youngsters, GIFs themselves are increasingly alien too — or at least, okay, increasingly uncool. "Who uses gifs in 2020 grandma," one Twitter user speedily responded to Taylor Swift in August that year when the singer-songwriter opted for an image of Dwayne "The Rock" Johnson mouthing the words "oh my god" to convey her excitement at reaching yet another career milestone.

You don't have to look far to find other tweets or TikToks mocking GIFs as the preserve of old people — which, yes, now means millennials. How exactly did GIFs become so embarrassing? Will they soon disappear forever, like Homer Simpson backing up into a hedge...?

Gen Z might think GIFs are beloved by millennials, but at the same time, many millennials are starting to see GIFs as a boomer plaything. And this is the first and easiest explanation as to why GIFs are losing their cultural cachet. Whitney Phillips, an assistant professor of communication at Syracuse University and author of multiple books on internet culture, says that early adopters have always grumbled when new (read: old) people start to encroach on their digital space. Memes, for example, were once subcultural and niche. When Facebook came along and made them more widespread, Redditors and 4Chan users were genuinely annoyed that people capitalised on the fruits of their posting without putting in the cultural work. "That democratisation creates a sense of disgust with people who consider themselves insiders," Phillips explains. "That's been central to the process of cultural production online for decades at this point...."

In 2016, Twitter launched its GIF search function, as did WhatsApp and iMessage. A year later, Facebook introduced its own GIF button in the comment section on the site. GIFs became not only centralised but highly commercialised, culminating in Facebook buying GIPHY for $400 million in 2020. "The more GIFs there are, maybe the less they're regarded as being special treasures or gifts that you're giving people," Phillips says. "Rather than looking far and wide to find a GIF to send you, it's clicking the search button and typing a word. The gift economy around GIFs has shifted...."

Linda Kaye, a cyberpsychology professor at Edge Hill University, hasn't done direct research in this area but theorises that the ever-growing popularity of video-sharing on TikTok means younger generations are more used to "personalised content creation", and GIFs can seem comparatively lazy.

The GIF was invented in 1987 "and it's important to note the format has already fallen out of favour and had a comeback multiple times before," the article points out. It cites Jason Eppink, an independent artist and curator who curated an exhibition on GIFs for the Museum of the Moving Image in New York in 2014, who highlighted how GIFs were popular with GeoCities users in the 90s, "so when Facebook launched, they didn't support GIFs.... They were like, 'We don't want this ugly symbol of amateur web to clutter our neat and uniform cool new website.'" But then GIFs had a resurgence on Tumblr.

Vice concludes that while even Eppink no longer uses GIFs, "Perhaps the waxing and waning popularity of the GIF is an ironic mirror of the format itself — destined to repeat endlessly, looping over and over again."
Graphics

Blender 3.0 Released With More New Features and Improvements 37

Long-time Slashdot reader Qbertino writes: The Free Open Source 3D production software Blender has been released in version 3.0 (official showreel) with more new features, improvements and performance optimizations as well as further improved workflows.

In recent years Blender has received increasing attention from the 3D industry, with various larger businesses such as Epic, Microsoft, Apple and most recently Intel joining the Blender Foundation and donating to its development fund. Blender has also seen rising use across industries such as animated feature film production, architecture and game development.
Google

Google is Building an AR Headset (theverge.com) 52

Meta may be the loudest company building AR and VR hardware. Microsoft has HoloLens. Apple is working on something, too. But don't count out Google. The Verge: The search giant has recently begun ramping up work on an AR headset, internally codenamed Project Iris, that it hopes to ship in 2024, according to two people familiar with the project who requested anonymity to speak without the company's permission. Like forthcoming headsets from Meta and Apple, Google's device uses outward-facing cameras to blend computer graphics with a video feed of the real world, creating a more immersive, mixed reality experience than existing AR glasses from the likes of Snap and Magic Leap. Early prototypes being developed at a facility in the San Francisco Bay Area resemble a pair of ski goggles and don't require a tethered connection to an external power source.

Google's headset is still early in development without a clearly defined go-to-market strategy, which indicates that the 2024 target year may be more aspirational than set in stone. The hardware is powered by a custom Google processor, like its newest Google Pixel smartphone, and runs on Android, though recent job listings indicate that a unique OS is in the works. Given power constraints, Google's strategy is to use its data centers to remotely render some graphics and beam them into the headset via an internet connection. I'm told that the Pixel team is involved in some of the hardware pieces, but it's unclear if the headset will ultimately be Pixel-branded.

Intel

Intel To Unveil 'Ultra Low-Voltage Bitcoin Mining ASIC' In February (coindesk.com) 31

Intel, one of the world's largest chip makers, is likely to unveil a specialized crypto-mining chip at the International Solid-State Circuits Conference (ISSCC) in February, according to the conference's agenda (PDF). CoinDesk reports: One of Intel's "highlighted chip releases" at the conference is entitled "Bonanza Mine: An Ultra-Low-Voltage Energy-Efficient Bitcoin Mining ASIC." The session is scheduled for Feb. 23. This brings the company into direct competition with the likes of Bitmain and MicroBT in the market for bitcoin mining ASICs, or application-specific integrated circuits, for the first time. [...] Unlike its competitor Nvidia, Intel has said it doesn't plan to add ether mining limits on its graphics cards.
AMD

AMD Returns To Smartphone Graphics (theregister.com) 13

AMD's GPU technology is returning to mobile handsets with Samsung's Exynos 2200 system-on-chip, which was announced on Tuesday. The Register reports: The Exynos 2200 processor, fabricated using a 4nm process, has Armv9 CPU cores and the oddly named Xclipse GPU, which is an adaptation of AMD's RDNA 2 mainstream GPU architecture. AMD was in the handheld GPU market until 2009, when it sold the Imageon GPU and handheld business for $65m to Qualcomm, which turned the tech into the Adreno GPU for its Snapdragon family. AMD's Imageon processors were used in devices from Motorola, Panasonic, Palm and others making Windows Mobile handsets. AMD's now returning to a more competitive mobile graphics market with Apple, Arm and Imagination also possessing homegrown smartphone GPUs.

Samsung and AMD announced in June last year that the two companies were working together on graphics. With Exynos 2200, Samsung has moved on from Arm's Mali GPU family, which was in the predecessor Exynos 2100 used in the current flagship Galaxy smartphones. Samsung says the power-optimized GPU has hardware-accelerated ray tracing, which simulates lighting effects and other features to make gaming a better experience. [...] The Exynos 2200 has an image signal processor that can apparently handle 200-megapixel pictures and record 8K video. Other features include HDR10+ support, and 4K video decoding at up to 240fps or 8K decoding at up to 60fps. It supports display refresh rates of up to 144Hz.

The eight-core CPU cluster features a balance of high-performing and power-efficient cores: one Arm Cortex-X2 flagship core, three Cortex-A710 big cores and four Cortex-A510s, putting it in the same ballpark as Qualcomm's Snapdragon 8 Gen 1 and MediaTek's Dimensity 9000, the only other chips using Arm's Armv9 cores and a 4nm process. An integrated 5G modem supports both sub-6GHz and millimeter wave bands, and a feature that mixes LTE and 5G signals speeds up data transfers to 10Gbps. The chip also has a security processor and an AI engine that is said to be twice as fast as its predecessor in the Exynos 2100.

AI

Nvidia's AI-Powered Scaling Makes Old Games Look Better Without a Huge Performance Hit (theverge.com) 41

Nvidia's latest game-ready driver includes a tool that could let you improve the image quality of games that your graphics card can easily run, alongside optimizations for the new God of War PC port. The Verge reports: The tech is called Deep Learning Dynamic Super Resolution, or DLDSR, and Nvidia says you can use it to make "most games" look sharper by running them at a higher resolution than your monitor natively supports. DLDSR builds on Nvidia's Dynamic Super Resolution tech, which has been around for years. Essentially, regular old DSR renders a game at a higher resolution than your monitor can handle and then downscales it to your monitor's native resolution. This leads to an image with better sharpness but usually comes with a dip in performance (you are asking your GPU to do more work, after all). So, for instance, if you had a graphics card capable of running a game at 4K but only had a 1440p monitor, you could use DSR to get a boost in clarity.

DLDSR takes the same concept and incorporates AI that can also work to enhance the image. According to Nvidia, this means you can upscale less (and therefore lose less performance) while still getting similar image quality improvements. In real numbers, Nvidia claims you'll get image quality similar to running at four times the resolution using DSR with only 2.25 times the resolution with DLDSR. Nvidia gives an example using 2017's Prey: Digital Deluxe running on a 1080p monitor: 4x DSR runs at 108 FPS, while 2.25x DLDSR is getting 143 FPS, only two frames per second slower than running at native 1080p.
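
For a sense of what those factors mean in pixels, here is a quick back-of-the-envelope sketch. It assumes, as with Nvidia's existing DSR factors, that the multiplier refers to total pixel count rather than per-axis resolution:

```python
# Rough arithmetic for DSR/DLDSR factors, treated as multipliers on the
# total pixel count relative to the native resolution (an assumption based
# on how DSR factors have historically been presented).

native = (1920, 1080)  # 1080p monitor

def scaled_resolution(width, height, factor):
    """Scale both axes by sqrt(factor) so the pixel count grows by `factor`."""
    scale = factor ** 0.5
    return round(width * scale), round(height * scale)

for factor in (1.0, 2.25, 4.0):
    w, h = scaled_resolution(*native, factor)
    print(f"{factor:>4}x -> {w}x{h} ({w * h / 1e6:.1f} MP rendered)")

# Expected output:
#  1.0x -> 1920x1080 (2.1 MP rendered)
# 2.25x -> 2880x1620 (4.7 MP rendered)
#  4.0x -> 3840x2160 (8.3 MP rendered)
# DLDSR's claim is that the 2.25x render plus AI filtering looks roughly
# as good as the 4x render, which is why the FPS hit is much smaller.
```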

IOS

Fortnite Sneaks Back Onto iPhone By Way Of GeForce Now (kotaku.com) 13

It's been 518 days since Apple kicked Fortnite off of the App Store after Epic Games tried to bypass its payment system. Now the popular free-to-play battle royale is once again playable on iPhones, sort of. From a report: Starting next week, Fortnite will be available on iOS by way of streaming, as part of an upcoming closed beta for Nvidia's GeForce Now game streaming program. "Fortnite on GeForce NOW will launch in a limited-time closed beta for mobile, all streamed through the Safari web browser on iOS and the GeForce NOW Android app," Nvidia announced on its blog today. "The beta is open for registration for all GeForce NOW members, and will help test our server capacity, graphics delivery and new touch controls performance."

GeForce Now, subscriptions for which range from free to $200 a year for the premium tier, lets users stream games they already own to PCs, tablets, and smartphones. It's one way to make blockbuster PC games portable, or to play them on rigs with beefier specs than the ones people already have at home. In Fortnite's case, GeForce Now subscribers will soon be able to stream the shooter to iOS devices and play it using touch controls via Apple's Safari. The browser workaround is one way companies like Microsoft have been able to get their game streaming platforms on iPhones despite Apple's ban on allowing them inside its App Store. Now it's bringing back the game that kicked off a massive, messy, year-long legal battle that's still raging to this day.

Data Storage

PCI Express 6.0 Specification Finalized: x16 Slots To Reach 128GBps (anandtech.com) 31

PCI Special Interest Group (PCI-SIG) has released the much-awaited final (1.0) specification for PCI Express 6.0. From a report: The next generation of the ubiquitous bus is once again doubling the data rate of a PCIe lane, bringing it to 8GB/second in each direction -- and far, far higher for multi-lane configurations. With the final version of the specification now sorted and approved, the group expects the first commercial hardware to hit the market in 12-18 months, which in practice means it should start showing up in servers in 2023.

First announced in the summer of 2019, PCI Express 6.0 is, as the name implies, the immediate follow-up to the current-generation PCIe 5.0 specification. Having made it their goal to keep doubling PCIe bandwidth roughly every 3 years, the PCI-SIG almost immediately set about work on PCIe 6.0 once the 5.0 specification was completed, looking at ways to once again double the bandwidth of PCIe. The product of those development efforts is the new PCIe 6.0 spec, and while the group has missed their original goal of a late 2021 release by mere weeks, today they are announcing that the specification has been finalized and is being released to the group's members.

As always, the creation of an even faster version of PCIe technology has been driven by the insatiable bandwidth needs of the industry. The amount of data being moved by graphics cards, accelerators, network cards, SSDs, and other PCIe devices only continues to increase, and thus so must bus speeds to keep these devices fed. As with past versions of the standard, the immediate demand for the faster specification comes from server operators, who are already regularly using large amounts of high-speed hardware. But in due time the technology should filter down to consumer devices (i.e. PCs) as well.
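
The headline figure follows from simple doubling arithmetic: roughly 8 GB/s per lane per direction for PCIe 6.0, times 16 lanes. A rough sketch (per-lane numbers are approximate and ignore encoding and protocol overhead):

```python
# Approximate per-direction PCIe bandwidth by generation and lane count.
# PCIe 6.0 delivers roughly 8 GB/s per lane per direction; each earlier
# generation is roughly half the one after it (overhead ignored).

GEN6_PER_LANE_GBPS = 8.0  # GB/s per lane, per direction (PCIe 6.0)

def bandwidth(gen, lanes):
    """Rough per-direction bandwidth in GB/s for a PCIe `gen` x`lanes` link."""
    per_lane = GEN6_PER_LANE_GBPS / (2 ** (6 - gen))
    return per_lane * lanes

for gen in (3, 4, 5, 6):
    print(f"PCIe {gen}.0: x1 = {bandwidth(gen, 1):>4.1f} GB/s, "
          f"x16 = {bandwidth(gen, 16):>5.1f} GB/s")

# PCIe 3.0: x1 =  1.0 GB/s, x16 =  16.0 GB/s
# PCIe 4.0: x1 =  2.0 GB/s, x16 =  32.0 GB/s
# PCIe 5.0: x1 =  4.0 GB/s, x16 =  64.0 GB/s
# PCIe 6.0: x1 =  8.0 GB/s, x16 = 128.0 GB/s  <- the 128GBps in the headline
```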
The Almighty Buck

Norton 360 Criticized For Installing a Cryptominer (krebsonsecurity.com) 96

"Norton 360, one of the most popular antivirus products on the market today, has installed a cryptocurrency mining program on its customers' computers," reports security researcher Brian Krebs.

The Verge follows up: The TL;DR is that yes, Norton does install a crypto miner with its software, without making that clear in the initial setup process. But it isn't going to do anything unless you specifically opt in, so it's not a situation where you'll install the security suite and instantly start seeing your computer lag as it crunches crypto in the background.

A NortonLifeLock spokesperson also told The Verge in an email that you can completely remove NCrypt.exe by temporarily turning off Norton's tamper protection feature, and then deleting the executable. We confirmed that ourselves, and it could be good news for anyone worried about Norton remotely activating the feature.

But Krebs reports the product has drawn some bad reactions — and not just because Norton is keeping 15% of the currencies mined: [M]any Norton users complain the mining program is difficult to remove, and reactions from longtime customers have ranged from unease and disbelief to, "Dude, where's my crypto...?"

According to the FAQ posted on its site, "Norton Crypto" will mine Ethereum cryptocurrency while the customer's computer is idle. The FAQ also says Norton Crypto will only run on systems that meet certain hardware and software requirements (such as an NVIDIA graphics card with at least 6 GB of memory). "Norton creates a secure digital Ethereum wallet for each user," the FAQ reads. "The key to the wallet is encrypted and stored securely in the cloud. Only you have access to the wallet." NortonLifeLock began offering the mining service in July 2021...

[M]any users have reported difficulty removing the mining program.

From reading user posts on the Norton Crypto community forum, it seems some longtime Norton customers were horrified at the prospect of their antivirus product installing coin-mining software, regardless of whether the mining service was turned off by default. "How on Earth could anyone at Norton think that adding crypto mining within a security product would be a good thing?" reads a Dec. 28 thread titled "Absolutely furious."

"Norton should be DETECTING and killing off crypto mining hijacking, not installing their own," the post reads....

"Norton is pretty much amplifying energy consumption worldwide, costing their customers more in electricity use than the customer makes on the mining, yet allowing Norton to make a ton of profit," tweeted security researcher Chris Vickery. "It's disgusting, gross, and brand-suicide."

Then there's the matter of getting paid.... "Transfers of cryptocurrencies may result in transaction fees (also known as "gas" fees) paid to the users of the cryptocurrency blockchain network who process the transaction," the FAQ explains... Which might explain why so many Norton Crypto users have taken to the community's online forum to complain they were having trouble withdrawing their earnings. Those gas fees are the same regardless of the amount of crypto being moved, so the system simply blocks withdrawals if the amount requested can't cover the transfer fees.
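
In other words, the withdrawal gate is just a comparison against a roughly fixed network fee. A minimal, purely hypothetical sketch of that logic (names and numbers are made up; this is not Norton's code):

```python
# Hypothetical illustration of the fixed-fee withdrawal gate the FAQ
# describes: the gas fee doesn't shrink with the amount being moved, so a
# small balance can never cover it and the withdrawal is simply refused.

def can_withdraw(balance_eth: float, gas_fee_eth: float) -> bool:
    """Allow a withdrawal only if the balance covers the flat network fee."""
    return balance_eth > gas_fee_eth

gas_fee = 0.01  # example flat fee in ETH; real gas prices vary with the network

for balance in (0.002, 0.009, 0.05):
    status = "allowed" if can_withdraw(balance, gas_fee) else "blocked (fee exceeds balance)"
    print(f"balance {balance} ETH -> withdrawal {status}")
```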

Thanks to Slashdot reader JustAnotherOldGuy for tipping us off to the story!
AMD

AMD Announces Ryzen 6000 Mobile CPUs for Laptops: Zen3+ on 6nm with RDNA2 Graphics (anandtech.com) 20

AnandTech: The notebook market is a tough nut to crack with a single solution. People want that mix of high performance at the top, cost effectiveness at the bottom, and throughout there has to be efficiency, utility, and function. On the back of a successful ramp last year, AMD is striking the notebook market hot again in 2022 with the launch of its new Ryzen 6000 Mobile processors. These 'Rembrandt' APUs feature AMD's latest RDNA2 graphics, up to eight Zen3+ cores with enhanced power management features, and are built on TSMC's N6 manufacturing process for performance and efficiency improvements. Yesterday AMD disclosed that they would be launching the new Ryzen 6000 Mobile series today -- updated cores, better graphics, more features, all in a single monolithic package a little over 200 mm².

There will be 10 new processors, ranging from the traditional portable 15 W and 28 W hardware, up to 35 W and 45 W plus for the high-end gaming machines. AMD is expecting 200+ premium systems in the market with Ryzen Mobile in 2022. At the heart of the design is AMD's Zen 3+ core, which affords an improvement in power management between the cores, but keeps the Zen 3 performance characteristics. The focus here is mainly to improve idle power consumption and power when using accelerators, to help extend the life of ultraportable devices -- AMD is claiming 15-40% lower power between web browsing and video streaming. There is a frequency uplift as well, with the top processors going up to 5.0 GHz. AMD is claiming up to 1.3x single thread performance for the Ryzen 7 6800U.

Intel

Intel Demos Lightning Fast 13.8 GBps PCIe 5.0 SSD with Alder Lake (tomshardware.com) 40

Intel has demonstrated how its Core i9-12900K Alder Lake processor can work with Samsung's recently announced PM1743 PCIe 5.0 x4 SSD. The result is as astonishing as it is predictable: the platform demonstrated approximately 13.8 GBps throughput in the IOMeter benchmark. From a report: Intel planned to show the demo at CES; however, the company is no longer going in person. So, Ryan Shrout, Intel's chief performance strategist, decided to share the demo publicly via Twitter. The system used for the demonstration included a Core i9-12900K processor, an Asus Z690 motherboard and an EVGA GeForce RTX 3080 graphics board. Intel hooked up Samsung's PM1743 SSD using a special PCIe 5.0 interposer card and the drive certainly did not disappoint. From a practical standpoint, 13.8 GBps may be overkill for regular desktop users, but those who need to load huge games or work with large 8K video files or ultra-high-resolution images will appreciate the added performance. However, there is a small catch with this demo. Apparently, Samsung will be among the first to ship its PM1743 PCIe 5.0 drives, which is why Intel decided to use this SSD for the demonstration. But Samsung's PM1743 series is aimed at enterprises and will be available in 2.5-inch/15mm (with dual-port support) and new-generation E3.S (76 × 112.75 × 7.5 mm) form factors, so it is not aimed at desktops (and Intel admits that).
Graphics

'Quite OK Image' Format (QOI) Coming To a Graphics Program Near You? (phoboslab.org) 103

Slashdot reader Tesseractic comes bearing gifts — specifically, news of "a new image format that is lossless, gives much faster encodes and decodes, and roughly comparable compression to what's in use today."

Quite OK Image format (or QOI) is the brainchild of developer Dominic Szablewski, who complains current image formats like PNG, JPEG, MPEG, MOV and MP4 "burst with complexity at the seams," the Register reports: "Every tiny aspect screams 'design by consortium'," he added, going on to lament the fact that most common codecs are old, closed, and "require huge libraries, are compute hungry and difficult to work with." Szablewski thought he could do better and appears to have achieved that objective by cooking up some code, floating it on GitHub, and paying attention to the 500-plus comments it generated.

While Szablewski admits that QOI will not compress images as well as an optimized PNG encoder, he claims it "losslessly compresses images to a similar size of PNG, while offering 20x-50x faster encoding and 3x-4x faster decoding." Most importantly to Szablewski, the reference en-/decoder fits in about 300 lines of C and the file format spec is just one page long.
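
For a sense of why such a simple format can still compress reasonably well, here is a toy, QOI-inspired sketch in Python. It illustrates two ideas from the published one-page spec, a run-length op for repeated pixels and a small index of recently seen pixels; it is not a conforming QOI encoder, and its output is symbolic tokens rather than the real bitstream:

```python
# Simplified, QOI-inspired single-pass encoder sketch (not the real QOI
# bitstream): repeated pixels become a run token, pixels seen recently are
# replaced by a reference into a 64-entry table, and everything else is
# stored as a literal RGBA value.

def toy_encode(pixels):
    """pixels: list of (r, g, b, a) tuples in row-major order."""
    index = [(0, 0, 0, 0)] * 64          # small cache of recently seen pixels
    prev = (0, 0, 0, 255)                # QOI also starts from an implicit previous pixel
    out, run = [], 0

    for px in pixels:
        if px == prev:                   # extend the current run of identical pixels
            run += 1
            continue
        if run:
            out.append(("RUN", run))
            run = 0
        slot = (px[0] * 3 + px[1] * 5 + px[2] * 7 + px[3] * 11) % 64
        if index[slot] == px:            # seen recently: emit a tiny index reference
            out.append(("INDEX", slot))
        else:                            # otherwise store the literal pixel
            index[slot] = px
            out.append(("RGBA", px))
        prev = px

    if run:
        out.append(("RUN", run))
    return out

# A flat red row compresses to one literal plus one run token:
print(toy_encode([(255, 0, 0, 255)] * 8))
# [('RGBA', (255, 0, 0, 255)), ('RUN', 7)]
```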

"In the last few weeks QOI implementations for lot of different languages and libraries popped up," Szablewski wrote on his blog, with Zig, Rust,Go, TypeScript, Haskell, Ä, Python, C#, Elixir, Swift, Java, and Pascal among the options.

Hardware

This 8-bit Processor Built in Minecraft Can Run Its Own Games (pcworld.com) 60

The months-long project demonstrates the physics behind the CPUs we take for granted. From a report: Computer chips have become so tiny and complex that it's sometimes hard to remember that there are real physical principles behind them. They aren't just a bunch of ever-increasing numbers. For a practical (well, virtual) example, check out the latest version of a computer processor built exclusively inside the Minecraft game engine. Minecraft builder "Sammyuri" spent seven months building what they call the Chungus 2, an enormously complex computer processor that exists virtually inside the Minecraft game engine. This project isn't the first time a computer processor has been virtually rebuilt inside Minecraft, but the Chungus 2 (Computation Humongous Unconventional Number and Graphics Unit) might very well be the largest and most complex, simulating an 8-bit processor with a one hertz clock speed and 256 bytes of RAM. Minecraft processors use the physics engine of the game to recreate the structure of real processors on a macro scale, with materials including redstone dust, torches, repeaters, pistons, levers, and other simple machines. For a little perspective, each "block" inside the game is one virtual meter on each side, so recreating this build in the real world would make it approximately the size of a skyscraper or cruise ship.
Power

Metaverse Vision Requires 1000x More Computational Power, Says Intel (intel.com) 79

Leading chip-maker Intel has stressed that building the Metaverse -- at scale and accessible by billions of humans in real time -- will require a 1,000-times increase in computational efficiency from what we have today. Insider reports: Raja Koduri, a senior vice president and head of Intel's Accelerated Computing Systems and Graphics Group, said that our computing, storage and networking infrastructure today is simply not enough to enable this Metaverse vision, being popularized by Meta (formerly Facebook) and other companies. "We need several orders of magnitude more powerful computing capability, accessible at much lower latencies across a multitude of device form factors," Koduri said in a blog post. To enable these capabilities at scale, the entire plumbing of the internet will need major upgrades, he added.
Businesses

Adobe Takes on Canva With Freemium Offering (ft.com) 36

Adobe unveiled its first comprehensive package of design software for non-professionals on Monday, taking direct aim at a booming market that has turned Australian start-up Canva into one of the world's most valuable private tech companies. From a report: The service includes versions of widely used professional design tools such as the Photoshop picture editor, Illustrator graphics tool and video-editing service Premiere, behind a simpler interface that analysts said bore a striking resemblance to Canva. The move follows a leap in the valuation of companies that have extended the market for design software with tools aimed at non-expert users. Canva's fundraising round in September valued it at $40bn, more than double what it was judged to be worth five months before. Figma, which makes software for product designers and more general business users, saw its value rise fivefold in little more than a year to $10bn. Adobe's move is partly defensive, since it could face disruption as Canva's simple tool moves deeper into the business world, said Liz Miller, an analyst at advisory firm Constellation Research. Adobe's new service, called Creative Cloud Express, is likely to appeal to many people in small or medium-sized businesses who might have been thought of before as customers for Adobe's more expensive software, but who are happy to use simpler design tools with fewer features, she said. [...] A basic version of the new service would be available free of charge through app stores and its own website, Adobe said, with a premium version priced at $9.99 a month. [Editor's note: the aforementioned link may be paywalled; alternative source]
The Matrix

'Matrix' Stars Discuss Free 'Matrix Awakens' Demo Showing Off Epic's Unreal Engine 5 (theverge.com) 34

This year's Game Awards also saw the premiere of The Matrix Awakens, a new in-world "tech demonstrator" written by Lana Wachowski, the co-writer/director of the original Matrix trilogy and director of the upcoming sequel. It's available free on the PS5 and Xbox Series X/S, reports the Verge, and they also scored a sit-down video interview with Keanu Reeves and Carrie-Anne Moss about the new playable experience — and the new Matrix movie: Reeves also revealed that he thinks there should be a modern Matrix video game, that he's flattered by Cyberpunk 2077 players modding the game to have sex with his character, and why he thinks Facebook shouldn't co-opt the metaverse.

Apart from serving as a clever promotion vehicle for the new Matrix movie premiering December 22nd, The Matrix Awakens is designed to showcase what's possible with the next major version of Epic's Unreal Engine coming next year. It's structured as a scripted intro by Wachowski, followed by a playable car chase scene and then an open-world sandbox experience you can navigate as one of Epic's metahuman characters. A big reason for doing the demo is to demonstrate how Epic thinks its technology can be used to blend scripted storytelling with games and much more, according to Epic CTO Kim Libreri, who worked on the special effects for the original Matrix trilogy...

Everything in the virtual city is fully loaded no matter where your character is located (rather than rendered only when the character gets near), down to the detail of a chain link fence in an alley. All of the moving vehicles, people, and lighting in the city are generated by AI, the latter of which Libreri describes as a breakthrough that means lighting is no longer "this sort of niche art form." Thanks to updates coming to Unreal Engine, which powers everything from Fortnite to special effects in Disney's The Mandalorian, developers will be able to use the same, hyper-realistic virtual assets across different experiences. It's part of Epic's goal to help build the metaverse.

Elsewhere the site writes that The Matrix Awakens "single-handedly proves next-gen graphics are within reach of Sony and Microsoft's new game consoles." It's unlike any tech demo you've ever tried before. When we said the next generation of gaming didn't actually arrive with Xbox Series X and PS5, this is the kind of push that has the potential to turn that around....

Just don't expect it to make you question your reality — the uncanny valley is still alive and well.... But from a "is it time for photorealistic video game cities?" perspective, The Matrix Awakens is seriously convincing. It's head-and-shoulders above the most photorealistic video game cities we've seen so far, including those in the Spider-Man, Grand Theft Auto and Watch Dogs series... Despite glitches and an occasionally choppy framerate, The Matrix Awakens city feels more real, thanks to Unreal Engine's incredible global illumination and real-time raytracing ("The entire world is lit by only the sun, sky and emissive materials on meshes," claims Epic), the detail of the procedurally generated buildings, and how dense it all is in terms of cars and foot traffic.

And the most convincing part is that it's not just a scripted sequence running in real-time on your PS5 or Xbox like practically every other tech demo you've seen — you get to run, drive, and fly through it, manipulate the angle of the sun, turn on filters, and dive into a full photo mode, as soon as the scripted and on-rails shooter parts of the demo are done. Not that there's a lot to do in The Matrix Awakens except finding different ways to take in the view. You can't land on buildings, there's no car chases except for the scripted one, no bullets to dodge. You can crash any one of the game's 38,146 drivable cars into any of the other cars or walls, I guess. I did a bunch of that before I got bored, though, just taking in the world.... Almost 10 million unique and duplicated assets were created to make the city....

Epic Games' pitch is that Unreal Engine 5 developers can do this or better with its ready-made tools at their disposal, and I can't wait to see them try.

Operating Systems

Qualcomm Has an Exclusivity Deal With Microsoft For Windows On ARM (xda-developers.com) 49

An anonymous reader quotes a report from XDA Developers: Last week, we reported that MediaTek is planning to build a chipset for Windows on ARM. As it turns out, the Windows on ARM chipset space could be even hotter than that, because there's a reason that we've only seen Qualcomm SoCs in ARM PCs so far. Qualcomm actually has an exclusivity deal with Microsoft for Windows on ARM, and speaking with people familiar with it, we've learned that the deal is set to expire soon. Other than the fact that Microsoft has publicly said that anyone who wants to can build a Windows on ARM chip, this really shouldn't come as a surprise. Qualcomm didn't just start building PC chips hoping that Microsoft would compile Windows to support it. No, these two companies worked together to make it happen. Because of that, Qualcomm gets to enjoy a bit of exclusivity.

One thing I wasn't able to learn is when the deal will expire, only that it's the thing holding back other chip vendors from competing in the space. It's possible that Samsung might want to throw its hat into the ring with its Exynos processors too, especially given its recent partnership with AMD for graphics power. This is also presumably why Apple Silicon Macs aren't officially supported for running Windows 11, so hopefully that will change as well. [...] Between MediaTek's Executive Summit and Qualcomm's Investor Day, there's been a very clear message that ARM SoC vendors absolutely believe that the 'Wintel' partnership is going to fade and that the transition to ARM isn't just happening, it's inevitable. Naturally, that means that all of these companies are going to want to be part of it when it opens up. Qualcomm has quite a head start though, given that it's been doing this for a few years and on top of that, it's going to start building its own custom silicon thanks to its Nuvia acquisition.

PlayStation (Games)

The Next Generation of Gaming Didn't Actually Arrive With Xbox Series X and PS5 (theverge.com) 46

A year ago, the next generation of console gaming was supposed to have arrived. The Xbox Series X (and Series S) and PlayStation 5 strode boldly onto the scene, with massive chassis and even bigger promises of games with better graphics, shorter loading times, and revolutionary new breakthroughs. But a year in, and that next generation of gaming has yet to arrive. From a report: There are still too few consoles, and more importantly, too few games that truly take advantage of them, leaving the first year of the PS5 and Xbox Series X more of a beta test for the lucky few who have been able to get ahold of one, rather than the proper start of a new era of gaming.

A complicated mess of factors has led to the next-gen bottleneck. The physical consoles themselves are still nigh-impossible to buy, which naturally limits the number of customers who own them and can buy games for them. That in turn means that there's little incentive for developers to aim for exclusive next-gen titles that truly harness the power of the PS5 or Xbox Series X. Why limit yourself (and your sales) to the handful of next-gen console owners when there are millions of Xbox One and PS4 customers to whom you can sell copies of games? Adding to the mess has been the fact that industry-wide delays (many of which are due to similar pandemic-related issues as the broader supply chain problems) have also seen tons of next-gen optimized or exclusive games moved out to 2022 and beyond. Meaning even if you can get ahold of a console, there are still relatively few blockbuster titles to actually play on them.

Cloud

NVIDIA's Cloud Gaming Service Quietly Capped Frame Rates on 12 Games (theverge.com) 24

Nvidia's "GeForce Now" cloud gaming service has been quietly capping the frame rates for a handful of 12 specific games on certain tiers "to ensure consistent performance," reports the Verge.

"Nvidia says the vast majority of games run at 60fps, but not these 12." Nvidia's GeForce Now cloud gaming service just leapfrogged Google Stadia in performance, with a new $200-a-year tier that practically gives you the power of an RTX 3080 desktop graphics card in the cloud. But if you're grandfathered into the original $4.99 a month "Founders" tier, or pay $100 a year for "Priority" access, you may not be getting quite what you expected...

Nvidia now has an official support page (via 9to5Google) explaining the practice, after Redditors and others revealed that a variety of games were locked to frame rates lower than 60fps. It appears that Nvidia's been doing this for quite a while but only for a handful of demanding games. I did a little searching, and some people were already complaining about being locked to 45fps in Cyberpunk 2077 in December 2020, just as Nvidia admits here.

Assassin's Creed Odyssey and Immortals Fenyx Rising are the other games that have sub-50fps frame rates, while others run a bit higher.

"For our Priority Members, the maximum frames rendered per second is generally set to 60, or higher, for most of the 1,100+ games we've onboarded so far," NVIDIA explains on its official support page. "There are some exceptions that we determined do not run well enough at 60 FPS on the GPUs used by Priority members. So the default OPS for these specific graphics-intensive games cannot be overridden.

"This is to ensure all Priority members are running a consistent, high-quality experience."
China

Have Scientists Disproven Google's Quantum Supremacy Claim? (scmp.com) 35

Slashdot reader AltMachine writes: In October 2019, Google said its Sycamore processor was the first to achieve quantum supremacy by completing a task in three minutes and 20 seconds that would have taken the best classical supercomputer, IBM's Summit, 10,000 years. That claim — particularly how Google scientists arrived at the "10,000 years" conclusion — has been questioned by some researchers, but the counterclaim itself was not definitive.

Now though, in a paper to be submitted to a scientific journal for peer review, scientists at the Institute of Theoretical Physics under the Chinese Academy of Sciences said their algorithm on classical computers completed the simulation for the Sycamore quantum circuits [possibly paywalled; alternative source of the same article] "in about 15 hours using 512 graphics processing units (GPUs)" at a higher fidelity than Sycamore's. Further, the team said "if our simulation of the quantum supremacy circuits can be implemented in an upcoming exaflop supercomputer with high efficiency, in principle, the overall simulation time can be reduced to a few dozens of seconds, which is faster than Google's hardware experiments".

Since China unveiled a photonic quantum computer in December 2020 that solved a Gaussian boson sampling problem in 200 seconds, a task that would have taken 600 million years on a classical computer, disproving Sycamore's claim would make China the first country to have achieved quantum supremacy.
