Intel

Intel Arc Update: Alchemist Laptops Q1, Desktops Q2; 4M GPUs Total for 2022 (anandtech.com) 12

As part of Intel's annual investor meeting taking place today, Raja Koduri, Intel's SVP and GM of the Accelerated Computing Systems and Graphics (AXG) Group, delivered an update to investors on the state of Intel's GPU and accelerator group, including some fresh news on the state of Intel's first generation of Arc graphics products. AnandTech: Among other things, the GPU frontman confirmed that while Intel will indeed ship the first Arc mobile products in the current quarter, desktop products will not come until Q2. Meanwhile, in the first disclosure of chip volumes, Intel is now projecting that they'll ship 4mil+ Arc GPUs this year. In terms of timing, today's disclosure confirms some earlier suspicions that developed following Intel's CES 2022 presentation: that the company would get its mobile Arc products out before their desktop products. Desktop products will now follow in the second quarter of this year, a couple of months behind the mobile parts. And finally, workstation products, which Intel has previously hinted at, are on their way and will land in Q3.
Intel

Intel To Enter Bitcoin Mining Market With Energy-Efficient GPU (pcmag.com) 52

Intel is entering the blockchain mining market with an upcoming GPU capable of mining Bitcoin. From a report: Intel insists the effort won't put a strain on energy supplies or deprive consumers of chips. The goal is to create the most energy-efficient blockchain mining equipment on the planet, it says. "We expect that our circuit innovations will deliver a blockchain accelerator that has over 1,000x better performance per watt than mainstream GPUs for SHA-256 based mining," Intel's General Manager for Graphics, Raja Koduri, said in the announcement. (SHA-256 is a reference to the mining algorithm used to create Bitcoins.)
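
For readers unfamiliar with what "SHA-256 based mining" actually involves, the core loop is just repeated double-SHA-256 hashing of a block header while varying a nonce until the result falls below a difficulty target, which is why hashes-per-watt is the metric Koduri quotes. Below is a deliberately simplified Python sketch (a made-up header and target check, not Bitcoin's real 80-byte header serialization or difficulty encoding):

    import hashlib

    def mine(header_prefix: bytes, difficulty_zero_bits: int, max_nonce: int = 2**22):
        """Toy proof-of-work loop: vary a nonce until the double-SHA-256 hash of
        (header_prefix + nonce) falls below a simplified difficulty target."""
        target = 1 << (256 - difficulty_zero_bits)
        for nonce in range(max_nonce):
            data = header_prefix + nonce.to_bytes(4, "little")
            digest = hashlib.sha256(hashlib.sha256(data).digest()).digest()
            if int.from_bytes(digest, "big") < target:
                return nonce, digest.hex()
        return None

    # A made-up "header" and a very low difficulty so the loop finishes instantly.
    print(mine(b"example-block-header", difficulty_zero_bits=16))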

News of Intel's blockchain-mining effort first emerged last month after the ISSCC technology conference posted details about an upcoming Intel presentation titled "Bonanza Mine: An Ultra-Low-Voltage Energy-Efficient Bitcoin Mining ASIC." ASICs are chips designed for a specific purpose; in the Bitcoin context, the term usually refers to dedicated mining hardware. Friday's announcement from Koduri added that Intel is establishing a new "Custom Compute Group" to create chip platforms optimized for customers' workloads, including for blockchains.

Transportation

Tesla Now Runs the Most Productive Auto Factory In America (bloomberg.com) 198

An anonymous reader quotes a report from Bloomberg: Elon Musk has a very specific vision for the ideal factory: densely packed, vertically integrated and unusually massive. During Tesla's early days of mass production, he was chided for what was perceived as hubris. Now, Tesla's original California factory has achieved a brag-worthy title: the most productive auto plant in North America. Last year Tesla's factory in Fremont, California, produced an average of 8,550 cars a week. That's more than Toyota's juggernaut in Georgetown, Kentucky (8,427 cars a week), BMW AG's Spartanburg hub in South Carolina (8,343) or Ford's iconic truck plant in Dearborn, Michigan (5,564), according to a Bloomberg analysis of production data from more than 70 manufacturing facilities.

In a year when auto production around the world was stifled by supply-chain shortages, Tesla expanded its global production by 83% over 2020 levels. Its other auto factory, in Shanghai, tripled output to nearly 486,000. In the coming weeks, Tesla is expected to announce the start of production at two new factories -- Gigafactory Berlin-Brandenburg, its first in Europe, and Gigafactory Texas in Austin. Musk said in October that he plans to further increase production in Fremont and Shanghai by 50%. [...] Once Tesla flips the switch on two new factories, what comes next? Musk has a longstanding target to increase vehicle deliveries by roughly 50% a year. To continue such growth, Tesla will need to either open more factories or make the facilities even more productive. Musk said in October that he's working on both. Site selection for the next Gigafactories begins this year.

AI

Meta Unveils New AI Supercomputer (wsj.com) 48

An anonymous reader quotes a report from The Wall Street Journal: Meta said Monday that its research team built a new artificial intelligence supercomputer that the company maintains will soon be the fastest in the world. The supercomputer, the AI Research SuperCluster, was the result of nearly two years of work, often conducted remotely during the height of the pandemic, and led by the Facebook parent's AI and infrastructure teams. Several hundred people, including researchers from partners Nvidia, Penguin Computing and Pure Storage, were involved in the project, the company said.

Meta, which announced the news in a blog post Monday, said its research team currently is using the supercomputer to train AI models in natural-language processing and computer vision for research. The aim is to boost capabilities to one day train models with more than a trillion parameters on data sets as large as an exabyte, which is roughly equivalent to 36,000 years of high-quality video. "The experiences we're building for the metaverse require enormous compute power, and RSC will enable new AI models that can learn from trillions of examples, understand hundreds of languages, and more," Meta CEO Mark Zuckerberg said in a statement provided to The Wall Street Journal. Meta's AI supercomputer houses 6,080 Nvidia graphics-processing units, putting it fifth among the fastest supercomputers in the world, according to Meta.
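
As a quick sanity check on the "one exabyte is roughly 36,000 years of high-quality video" figure, the implied bitrate works out to about 7 Mbit/s, which is indeed in the range of high-quality HD streaming. A back-of-envelope calculation (Python, approximate figures only):

    # Back-of-envelope check of the "1 exabyte ~= 36,000 years of video" figure.
    exabyte_bytes = 1e18
    seconds = 36_000 * 365.25 * 24 * 3600
    bytes_per_second = exabyte_bytes / seconds
    print(f"~{bytes_per_second / 1e6:.2f} MB/s, i.e. ~{bytes_per_second * 8 / 1e6:.1f} Mbit/s")
    # ~0.88 MB/s (~7 Mbit/s), roughly the bitrate of a high-quality HD video stream.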

By mid-summer, when the AI Research SuperCluster is fully built, it will house some 16,000 GPUs, becoming the fastest AI supercomputer in the world, Meta said. The company declined to comment on the location of the facility or the cost. [...] Eventually the supercomputer will help Meta's researchers build AI models that can work across hundreds of languages, analyze text, images and video together and develop augmented reality tools, the company said. The technology also will help Meta more easily identify harmful content and will aim to help Meta researchers develop artificial-intelligence models that think like the human brain and support rich, multidimensional experiences in the metaverse. "In the metaverse, it's one hundred percent of the time, a 3-D multi-sensorial experience, and you need to create artificial-intelligence agents in that environment that are relevant to you," said Jerome Pesenti, vice president of AI at Meta.

Graphics

Vice Mocks GIFs as 'For Boomers Now, Sorry'. (And For Low-Effort Millennials) (vice.com) 227

"GIF folders were used by ancient civilisations as a way to store and catalogue animated pictures that were once employed to convey emotion," Vice writes: Okay, you probably know what a GIF folder is — but the concept of a special folder needed to store and save GIFs is increasingly alien in an era where every messaging app has its own in-built GIF library you can access with a single tap. And to many youngsters, GIFs themselves are increasingly alien too — or at least, okay, increasingly uncool. "Who uses gifs in 2020 grandma," one Twitter user speedily responded to Taylor Swift in August that year when the singer-songwriter opted for an image of Dwayne "The Rock" Johnson mouthing the words "oh my god" to convey her excitement at reaching yet another career milestone.

You don't have to look far to find other tweets or TikToks mocking GIFs as the preserve of old people — which, yes, now means millennials. How exactly did GIFs become so embarrassing? Will they soon disappear forever, like Homer Simpson backing up into a hedge...?

Gen Z might think GIFs are beloved by millennials, but at the same time, many millennials are starting to see GIFs as a boomer plaything. And this is the first and easiest explanation as to why GIFs are losing their cultural cachet. Whitney Phillips, an assistant professor of communication at Syracuse University and author of multiple books on internet culture, says that early adopters have always grumbled when new (read: old) people start to encroach on their digital space. Memes, for example, were once subcultural and niche. When Facebook came along and made them more widespread, Redditors and 4Chan users were genuinely annoyed that people capitalised on the fruits of their posting without putting in the cultural work. "That democratisation creates a sense of disgust with people who consider themselves insiders," Phillips explains. "That's been central to the process of cultural production online for decades at this point...."

In 2016, Twitter launched its GIF search function, as did WhatsApp and iMessage. A year later, Facebook introduced its own GIF button in the comment section on the site. GIFs became not only centralised but highly commercialised, culminating in Facebook buying GIPHY for $400 million in 2020. "The more GIFs there are, maybe the less they're regarded as being special treasures or gifts that you're giving people," Phillips says. "Rather than looking far and wide to find a GIF to send you, it's clicking the search button and typing a word. The gift economy around GIFs has shifted...."

Linda Kaye, a cyberpsychology professor at Edge Hill University, hasn't done direct research in this area but theorises that the ever-growing popularity of video-sharing on TikTok means younger generations are more used to "personalised content creation", and GIFs can seem comparatively lazy.

The GIF was invented in 1987, "and it's important to note the format has already fallen out of favour and had a comeback multiple times before," the article points out. It cites Jason Eppink, an independent artist and curator who organized an exhibition on GIFs for the Museum of the Moving Image in New York in 2014, and who highlighted how GIFs were popular with GeoCities users in the 90s, "so when Facebook launched, they didn't support GIFs.... They were like, 'We don't want this ugly symbol of amateur web to clutter our neat and uniform cool new website.'" But then GIFs had a resurgence on Tumblr.

Vice concludes that while even Eppink himself no longer uses GIFs, "Perhaps the waxing and waning popularity of the GIF is an ironic mirror of the format itself — destined to repeat endlessly, looping over and over again."
Graphics

Blender 3.0 Released With More New Features and Improvements 37

Long-time Slashdot reader Qbertino writes: The Free Open Source 3D production software Blender has been released in version 3.0 (official showreel) with more new features, improvements and performance optimizations as well as further improved workflows.

In recent years Blender has received increasing attention from the 3D industry, with various larger businesses such as Epic, Microsoft, Apple and most recently Intel joining the Blender Foundation and donating to its development fund. Blender has also seen rising usage across industries such as animated feature film production, architecture and game development.
Google

Google is Building an AR Headset (theverge.com) 52

Meta may be the loudest company building AR and VR hardware. Microsoft has HoloLens. Apple is working on something, too. But don't count out Google. The Verge: The search giant has recently begun ramping up work on an AR headset, internally codenamed Project Iris, that it hopes to ship in 2024, according to two people familiar with the project who requested anonymity to speak without the company's permission. Like forthcoming headsets from Meta and Apple, Google's device uses outward-facing cameras to blend computer graphics with a video feed of the real world, creating a more immersive, mixed reality experience than existing AR glasses from the likes of Snap and Magic Leap. Early prototypes being developed at a facility in the San Francisco Bay Area resemble a pair of ski goggles and don't require a tethered connection to an external power source.

Google's headset is still early in development without a clearly defined go-to-market strategy, which indicates that the 2024 target year may be more aspirational than set in stone. The hardware is powered by a custom Google processor, like its newest Google Pixel smartphone, and runs on Android, though recent job listings indicate that a unique OS is in the works. Given power constraints, Google's strategy is to use its data centers to remotely render some graphics and beam them into the headset via an internet connection. I'm told that the Pixel team is involved in some of the hardware pieces, but it's unclear if the headset will ultimately be Pixel-branded.

Intel

Intel To Unveil 'Ultra Low-Voltage Bitcoin Mining ASIC' In February (coindesk.com) 31

Intel, one of the world's largest chip makers, is likely to unveil a specialized crypto-mining chip at the International Solid-State Circuits Conference (ISSCC) in February, according to the conference's agenda (PDF). CoinDesk reports: One of Intel's "highlighted chip releases" at the conference is entitled "Bonanza Mine: An Ultra-Low-Voltage Energy-Efficient Bitcoin Mining ASIC." The session is scheduled for Feb. 23. This brings the company into direct competition with the likes of Bitmain and MicroBT in the market for bitcoin mining ASICs, or application-specific integrated circuits, for the first time. [...] Unlike its competitor Nvidia, Intel has said it doesn't plan to add ether mining limits on its graphics cards.
AMD

AMD Returns To Smartphone Graphics (theregister.com) 13

AMD's GPU technology is returning to mobile handsets with Samsung's Exynos 2200 system-on-chip, which was announced on Tuesday. The Register reports: The Exynos 2200 processor, fabricated using a 4nm process, has Armv9 CPU cores and the oddly named Xclipse GPU, which is an adaptation of AMD's RDNA 2 mainstream GPU architecture. AMD was in the handheld GPU market until 2009, when it sold the Imageon GPU and handheld business for $65m to Qualcomm, which turned the tech into the Adreno GPU for its Snapdragon family. AMD's Imageon processors were used in devices from Motorola, Panasonic, Palm and others making Windows Mobile handsets. AMD is now returning to a more competitive mobile graphics market, with Apple, Arm and Imagination also possessing homegrown smartphone GPUs.

Samsung and AMD announced the companies were working together on graphics in June last year. With Exynos 2200, Samsung has moved on from Arm's Mali GPU family, which was in the predecessor Exynos 2100 used in the current flagship Galaxy smartphones. Samsung says the power-optimized GPU has hardware-accelerated ray tracing, which simulates lighting effects and other features to make gaming a better experience. [...] The Exynos 2200 has an image signal processor that can apparently handle 200-megapixel pictures and record 8K video. Other features include HDR10+ support, and 4K video decoding at up to 240fps or 8K decoding at up to 60fps. It supports display refresh rates of up to 144Hz.

The eight-core CPU cluster features a balance of high-performing and power-efficient cores. It has one Arm Cortex-X2 flagship core, three Cortex-A710 big cores and four Cortex-A510s, which is in the same ballpark as Qualcomm's Snapdragon 8 Gen 1 and Mediatek's Dimensity 9000, which are the only other chips using Arm's Armv9 cores and are made using a 4nm process. An integrated 5G modem supports both sub-6GHz and millimeter wave bands, and a feature to mix LTE and 5G signals speeds up data transfers to 10Gbps. The chip also has a security processor and an AI engine that is said to be two times faster than its predecessor in the Exynos 2100.

AI

Nvidia's AI-Powered Scaling Makes Old Games Look Better Without a Huge Performance Hit (theverge.com) 41

Nvidia's latest game-ready driver includes a tool that could let you improve the image quality of games that your graphics card can easily run, alongside optimizations for the new God of War PC port. The Verge reports: The tech is called Deep Learning Dynamic Super Resolution, or DLDSR, and Nvidia says you can use it to make "most games" look sharper by running them at a higher resolution than your monitor natively supports. DLDSR builds on Nvidia's Dynamic Super Resolution tech, which has been around for years. Essentially, regular old DSR renders a game at a higher resolution than your monitor can handle and then downscales it to your monitor's native resolution. This leads to an image with better sharpness but usually comes with a dip in performance (you are asking your GPU to do more work, after all). So, for instance, if you had a graphics card capable of running a game at 4K but only had a 1440p monitor, you could use DSR to get a boost in clarity.

DLDSR takes the same concept and incorporates AI that can also work to enhance the image. According to Nvidia, this means you can upscale less (and therefore lose less performance) while still getting similar image quality improvements. In real numbers, Nvidia claims you'll get image quality similar to running at four times the resolution using DSR with only 2.25 times the resolution with DLDSR. Nvidia gives an example using 2017's Prey: Digital Deluxe running on a 1080p monitor: 4x DSR runs at 108 FPS, while 2.25x DLDSR gets 143 FPS, only two frames per second slower than running at native 1080p.
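
The "4x" and "2.25x" factors refer to total pixel counts rather than per-axis scaling, so the render resolution grows by the square root of the factor on each axis. A small sketch of that arithmetic (Python, using Nvidia's 1080p example; the function name is ours, not part of any Nvidia API):

    import math

    def dsr_render_resolution(native_w: int, native_h: int, factor: float):
        """DSR/DLDSR factors count total pixels, so each axis scales by sqrt(factor)."""
        scale = math.sqrt(factor)
        return round(native_w * scale), round(native_h * scale)

    for factor in (4.0, 2.25):  # 4x DSR vs. 2.25x DLDSR, as in Nvidia's Prey example
        print(factor, "->", dsr_render_resolution(1920, 1080, factor))
    # 4.0  -> (3840, 2160): render 4x the pixels, then downscale to 1080p
    # 2.25 -> (2880, 1620): 2.25x the pixels, with the AI pass making up the difference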

IOS

Fortnite Sneaks Back Onto iPhone By Way Of GeForce Now (kotaku.com) 13

It's been 518 days since Apple kicked Fortnite off of the App Store after Epic Games tried to bypass its payment system. Now the popular free-to-play battle royale is once again playable on iPhones, sort of. From a report: Starting next week, Fortnite will be available on iOS by way of streaming, as part of an upcoming closed beta for Nvidia's GeForce Now game streaming program. "Fortnite on GeForce NOW will launch in a limited-time closed beta for mobile, all streamed through the Safari web browser on iOS and the GeForce NOW Android app," Nvidia announced on its blog today. "The beta is open for registration for all GeForce NOW members, and will help test our server capacity, graphics delivery and new touch controls performance."

GeForce Now, subscriptions for which range from free to $200 a year for the premium tier, lets users stream games they already own to PCs, tablets, and smartphones. It's one way to make blockbuster PC games portable, or to play them on rigs with beefier specs than the ones people already have at home. In Fortnite's case, GeForce Now subscribers will soon be able to stream the shooter to iOS devices and play it using touch controls via Apple's Safari. The browser workaround is one way companies like Microsoft have been able to get their game streaming platforms on iPhones despite Apple's ban on allowing them inside its App Store. Now it's bringing back the game that kicked off a massive, messy, year-long legal battle that's still raging to this day.

Data Storage

PCI Express 6.0 Specification Finalized: x16 Slots To Reach 128GBps (anandtech.com) 31

PCI Special Interest Group (PCI-SIG) has released the much-awaited final (1.0) specification for PCI Express 6.0. From a report: The next generation of the ubiquitous bus is once again doubling the data rate of a PCIe lane, bringing it to 8GB/second in each direction -- and far, far higher for multi-lane configurations. With the final version of the specification now sorted and approved, the group expects the first commercial hardware to hit the market in 12-18 months, which in practice means it should start showing up in servers in 2023.

First announced in the summer of 2019, PCI Express 6.0 is, as the name implies, the immediate follow-up to the current-generation PCIe 5.0 specification. Having made it their goal to keep doubling PCIe bandwidth roughly every 3 years, the PCI-SIG almost immediately set about work on PCIe 6.0 once the 5.0 specification was completed, looking at ways to once again double the bandwidth of PCIe. The product of those development efforts is the new PCIe 6.0 spec, and while the group has missed their original goal of a late 2021 release by mere weeks, today they are announcing that the specification has been finalized and is being released to the group's members.

As always, the creation of an even faster version of PCIe technology has been driven by the insatiable bandwidth needs of the industry. The amount of data being moved by graphics cards, accelerators, network cards, SSDs, and other PCIe devices only continues to increase, and thus so must bus speeds to keep these devices fed. As with past versions of the standard, the immediate demand for the faster specification comes from server operators, who are already regularly using large amounts of high-speed hardware. But in due time the technology should filter down to consumer devices (i.e. PCs) as well.
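
To put the doubling cadence in numbers, here is a rough sketch of approximate per-lane, per-direction bandwidth by generation and what that means for an x16 slot (Python; the figures are rounded and ignore the encoding and protocol-overhead differences between generations):

    # Approximate per-lane, per-direction PCIe bandwidth by generation (GB/s).
    # Figures are rounded; real numbers vary slightly with encoding overhead.
    GBPS_PER_LANE = {3: 1.0, 4: 2.0, 5: 4.0, 6: 8.0}

    def link_bandwidth_gbps(gen: int, lanes: int = 16) -> float:
        """Rough one-direction bandwidth of a PCIe link in GB/s."""
        return GBPS_PER_LANE[gen] * lanes

    for gen in (3, 4, 5, 6):
        print(f"PCIe {gen}.0 x16: ~{link_bandwidth_gbps(gen):.0f} GB/s per direction")
    # PCIe 6.0 x16: ~128 GB/s per direction, matching the headline figure.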
The Almighty Buck

Norton 360 Criticized For Installing a Cryptominer (krebsonsecurity.com) 96

"Norton 360, one of the most popular antivirus products on the market today, has installed a cryptocurrency mining program on its customers' computers," reports security researcher Brian Krebs.

The Verge follows up: The TL;DR is that yes, Norton does install a crypto miner with its software, without making that clear in the initial setup process. But it isn't going to do anything unless you specifically opt in, so it's not a situation where you'll install the security suite and instantly start seeing your computer lag as it crunches crypto in the background.

A NortonLifeLock spokesperson also told The Verge in an email that you can completely remove NCrypt.exe by temporarily turning off Norton's tamper protection feature, and then deleting the executable. We confirmed that ourselves, and it could be good news for anyone worried about Norton remotely activating the feature.

But Krebs reports the product has drawn some bad reactions — and not just because Norton is keeping 15% of the currencies mined: [M]any Norton users complain the mining program is difficult to remove, and reactions from longtime customers have ranged from unease and disbelief to, "Dude, where's my crypto...?"

According to the FAQ posted on its site, "Norton Crypto" will mine Ethereum cryptocurrency while the customer's computer is idle. The FAQ also says Norton Crypto will only run on systems that meet certain hardware and software requirements (such as an NVIDIA graphics card with at least 6 GB of memory). "Norton creates a secure digital Ethereum wallet for each user," the FAQ reads. "The key to the wallet is encrypted and stored securely in the cloud. Only you have access to the wallet." NortonLifeLock began offering the mining service in July 2021...

[M]any users have reported difficulty removing the mining program.

From reading user posts on the Norton Crypto community forum, it seems some longtime Norton customers were horrified at the prospect of their antivirus product installing coin-mining software, regardless of whether the mining service was turned off by default. "How on Earth could anyone at Norton think that adding crypto mining within a security product would be a good thing?," reads a Dec. 28 thread titled "Absolutely furious."

"Norton should be DETECTING and killing off crypto mining hijacking, not installing their own," the post reads....

"Norton is pretty much amplifying energy consumption worldwide, costing their customers more in electricity use than the customer makes on the mining, yet allowing Norton to make a ton of profit," tweeted security researcher Chris Vickery. "It's disgusting, gross, and brand-suicide."

Then there's the matter of getting paid.... "Transfers of cryptocurrencies may result in transaction fees (also known as "gas" fees) paid to the users of the cryptocurrency blockchain network who process the transaction," the FAQ explains... Which might explain why so many Norton Crypto users have taken to the community's online forum to complain they were having trouble withdrawing their earnings. Those gas fees are the same regardless of the amount of crypto being moved, so the system simply blocks withdrawals if the amount requested can't cover the transfer fees.
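
The withdrawal behavior described above amounts to a simple threshold check: if the flat gas fee exceeds the amount being moved, the transfer never makes sense. A hypothetical sketch (Python; the function and its parameters are illustrative, not Norton's actual logic):

    def can_withdraw(balance_eth: float, requested_eth: float, gas_fee_eth: float) -> bool:
        """Illustrative rule only: a withdrawal goes through if the wallet holds the
        requested amount and that amount more than covers the flat network gas fee."""
        return requested_eth <= balance_eth and requested_eth > gas_fee_eth

    # With a flat fee, small mined balances can never be withdrawn sensibly:
    print(can_withdraw(balance_eth=0.002, requested_eth=0.002, gas_fee_eth=0.005))  # False
    print(can_withdraw(balance_eth=0.050, requested_eth=0.050, gas_fee_eth=0.005))  # True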

Thanks to Slashdot reader JustAnotherOldGuy for tipping us off to the story!
AMD

AMD Announces Ryzen 6000 Mobile CPUs for Laptops: Zen3+ on 6nm with RDNA2 Graphics (anandtech.com) 20

AnandTech: The notebook market is a tough nut to crack with a single solution. People want that mix of high performance at the top, cost effectiveness at the bottom, and throughout there has to be efficiency, utility, and function. On the back of a successful ramp last year, AMD is striking the notebook market hot again in 2022 with the launch of its new Ryzen 6000 Mobile processors. These 'Rembrandt' APUs feature AMD's latest RDNA2 graphics and up to eight Zen3+ cores with enhanced power management features, and they use TSMC's N6 manufacturing process for performance and efficiency improvements. Yesterday AMD disclosed that they would be launching the new Ryzen 6000 Mobile series today -- updated cores, better graphics, more features, all in a single monolithic package a little over 200 mm2.

There will be 10 new processors, ranging from the traditional portable 15 W and 28 W hardware, up to 35 W and 45 W plus for the high-end gaming machines. AMD is expecting 200+ premium systems in the market with Ryzen Mobile in 2022. At the heart of the design is AMD's Zen 3+ core, which affords an improvement in power management between the cores, but keeps the Zen 3 performance characteristics. The focus here is mainly to improve idle power consumption and power when using accelerators, to help extend the life of ultraportable devices -- AMD is claiming 15-40% lower power between web browsing and video streaming. There is a frequency uplift as well, with the top processors going up to 5.0 GHz. AMD is claiming up to 1.3x single thread performance for the Ryzen 7 6800U.

Intel

Intel Demos Lightning Fast 13.8 GBps PCIe 5.0 SSD with Alder Lake (tomshardware.com) 40

Intel has demonstrated how its Core i9-12900K Alder Lake processor can work with Samsung's recently announced PM1743 PCIe 5.0 x4 SSD. The result is as astonishing as it is predictable: the platform demonstrated approximately 13.8 GBps throughput in the IOMeter benchmark. From a report: Intel planned to show the demo at CES, however, the company is no longer attending in person. So, Ryan Shrout, Intel's chief performance strategist, decided to share the demo publicly via Twitter. The system used for the demonstration included a Core i9-12900K processor, an Asus Z690 motherboard and an EVGA GeForce RTX 3080 graphics board. Intel hooked up Samsung's PM1743 SSD using a special PCIe 5.0 interposer card and the drive certainly did not disappoint. From a practical standpoint, 13.8 GBps may be overkill for regular desktop users, but those who need to load huge games or work with large 8K video files or ultra-high-resolution images will appreciate the added performance.

However, there is a small catch with this demo. Apparently, Samsung will be among the first to ship its PM1743 PCIe 5.0 drives, which is why Intel decided to use this SSD for the demonstration. But Samsung's PM1743 series is aimed at enterprises, so it will be available in 2.5-inch/15mm (with dual-port support) and new-generation E3.S (76 × 112.75 × 7.5 mm) form factors; it is not aimed at desktops (and Intel admits that).
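
As a sanity check on that number: PCIe 5.0 runs at 32 GT/s per lane with 128b/130b encoding, so a x4 link tops out around 15.75 GB/s before protocol overhead, putting the demo at roughly 88% of the raw line rate. A quick calculation (Python):

    # How close is the measured 13.8 GB/s to the PCIe 5.0 x4 ceiling?
    raw_rate = 32e9          # 32 GT/s per lane
    encoding = 128 / 130     # 128b/130b line encoding
    lanes = 4
    ceiling_gbps = raw_rate * encoding * lanes / 8 / 1e9
    print(f"theoretical ~{ceiling_gbps:.2f} GB/s; measured 13.8 GB/s "
          f"is ~{13.8 / ceiling_gbps:.0%} of the line rate (before protocol overhead)")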
Graphics

'Quite OK Image' Format (QOI) Coming To a Graphics Program Near You? (phoboslab.org) 103

Slashdot reader Tesseractic comes bearing gifts — specifically, news of "a new image format that is lossless, gives much faster encodes, faster decodes and roughly comparable compression compared to what's in use today."

Quite OK Image format (or QOI) is the brainchild of developer Dominic Szablewski, who complains current image formats like PNG, JPEG, MPEG, MOV and MP4 "burst with complexity at the seams," the Register reports: "Every tiny aspect screams 'design by consortium'," he added, going on to lament the fact that most common codecs are old, closed, and "require huge libraries, are compute hungry and difficult to work with." Szablewski thought he could do better and appears to have achieved that objective by cooking up some code, floating it on GitHub, and paying attention to the 500-plus comments it generated.

While Szablewski admits that QOI will not compress images as well as an optimized PNG encoder, he claims it "losslessly compresses images to a similar size of PNG, while offering 20x-50x faster encoding and 3x-4x faster decoding." Most importantly, to Szablewski, the reference en-/decoder fits in about 300 lines of C and the file format spec is just one page long.
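
Two of the ideas that keep QOI so small are run-length encoding of repeated pixels and a 64-slot, hash-indexed cache of recently seen colors. The toy Python sketch below illustrates only that control flow; it is not the real QOI bitstream or a conforming encoder.

    def toy_qoi_like_encode(pixels):
        """Toy illustration of two QOI ideas: run-length encoding of repeated pixels
        and a 64-slot hash-indexed cache of recently seen colors. Not the real QOI
        byte stream -- just the control flow that keeps the real encoder ~300 lines."""
        index = [None] * 64          # recently-seen-pixel cache
        prev, run, out = None, 0, []
        for px in pixels:            # px is an (r, g, b, a) tuple
            if px == prev:
                run += 1
                continue
            if run:
                out.append(("RUN", run)); run = 0
            h = (px[0] * 3 + px[1] * 5 + px[2] * 7 + px[3] * 11) % 64
            if index[h] == px:
                out.append(("INDEX", h))     # short reference to a cached color
            else:
                index[h] = px
                out.append(("RGBA", px))     # fall back to storing the color itself
            prev = px
        if run:
            out.append(("RUN", run))
        return out

    print(toy_qoi_like_encode([(0, 0, 0, 255)] * 4 + [(10, 20, 30, 255), (0, 0, 0, 255)]))
    # [('RGBA', (0, 0, 0, 255)), ('RUN', 3), ('RGBA', (10, 20, 30, 255)), ('INDEX', 53)]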

"In the last few weeks QOI implementations for lot of different languages and libraries popped up," Szablewski wrote on his blog, with Zig, Rust,Go, TypeScript, Haskell, Ä, Python, C#, Elixir, Swift, Java, and Pascal among the options.

Hardware

This 8-bit Processor Built in Minecraft Can Run Its Own Games (pcworld.com) 60

The months-long project demonstrates the physics behind the CPUs we take for granted. From a report: Computer chips have become so tiny and complex that it's sometimes hard to remember that there are real physical principles behind them. They aren't just a bunch of ever-increasing numbers. For a practical (well, virtual) example, check out the latest version of a computer processor built exclusively inside the Minecraft game engine. Minecraft builder "Sammyuri" spent seven months building what they call the Chungus 2, an enormously complex computer processor that exists virtually inside the Minecraft game engine. This project isn't the first time a computer processor has been virtually rebuilt inside Minecraft, but the Chungus 2 (Computation Humongous Unconventional Number and Graphics Unit) might very well be the largest and most complex, simulating an 8-bit processor with a one hertz clock speed and 256 bytes of RAM. Minecraft processors use the physics engine of the game to recreate the structure of real processors on a macro scale, with materials including redstone dust, torches, repeaters, pistons, levers, and other simple machines. For a little perspective, each "block" inside the game is one virtual meter on each side, so recreating this build in the real world would make it approximately the size of a skyscraper or cruise ship.
Power

Metaverse Vision Requires 1000x More Computational Power, Says Intel (intel.com) 79

Leading chip-maker Intel has stressed that building the metaverse -- at scale and accessible by billions of humans in real time -- will require a 1,000-times increase in computational efficiency from what we have today. Insider reports: Raja Koduri, a senior vice president and head of Intel's Accelerated Computing Systems and Graphics Group, said that our computing, storage and networking infrastructure today is simply not enough to enable this Metaverse vision, being popularized by Meta (formerly Facebook) and other companies. "We need several orders of magnitude more powerful computing capability, accessible at much lower latencies across a multitude of device form factors," Koduri said in a blog post. To enable these capabilities at scale, the entire plumbing of the internet will need major upgrades, he added.
Businesses

Adobe Takes on Canva With Freemium Offering (ft.com) 36

Adobe unveiled its first comprehensive package of design software for non-professionals on Monday, taking direct aim at a booming market that has turned Australian start-up Canva into one of the world's most valuable private tech companies. From a report: The service includes versions of widely used professional design tools such as the Photoshop picture editor, Illustrator graphics tool and video-editing service Premiere, behind a simpler interface that analysts said bore a striking resemblance to Canva. The move follows a leap in the valuation of companies that have extended the market for design software with tools aimed at non-expert users. Canva's fundraising round in September valued it at $40bn, more than double what it was judged to be worth five months before. Figma, which makes software for product designers and more general business users, saw its value rise fivefold in little more than a year to $10bn. Adobe's move is partly defensive, since it could face disruption as Canva's simple tool moves deeper into the business world, said Liz Miller, an analyst at advisory firm Constellation Research. Adobe's new service, called Creative Cloud Express, is likely to appeal to many people in small or medium-sized businesses who might have been thought of before as customers for Adobe's more expensive software, but who are happy to use simpler design tools with fewer features, she said. [...] A basic version of the new service would be available free of charge through app stores and its own website, Adobe said, with a premium version priced at $9.99 a month. [Editor's note: the aforementioned link may be paywalled; alternative source]
The Matrix

'Matrix' Stars Discuss Free 'Matrix Awakens' Demo Showing Off Epic's Unreal Engine 5 (theverge.com) 34

This year's Game Awards also saw the premiere of The Matrix Awakens, a new in-world "tech demonstrator" written by Lana Wachowski, the co-writer/director of the original Matrix trilogy and director of the upcoming sequel. It's available free on the PS5 and Xbox Series X/S, reports The Verge, and they also scored a sit-down video interview with Keanu Reeves and Carrie-Anne Moss about the new playable experience — and the new Matrix movie: Reeves also revealed that he thinks there should be a modern Matrix video game, that he's flattered by Cyberpunk 2077 players modding the game to have sex with his character, and why he thinks Facebook shouldn't co-opt the metaverse.

Apart from serving as a clever promotion vehicle for the new Matrix movie premiering December 22nd, The Matrix Awakens is designed to showcase what's possible with the next major version of Epic's Unreal Engine coming next year. It's structured as a scripted intro by Wachowski, followed by a playable car chase scene and then an open-world sandbox experience you can navigate as one of Epic's metahuman characters. A big reason for doing the demo is to demonstrate how Epic thinks its technology can be used to blend scripted storytelling with games and much more, according to Epic CTO Kim Libreri, who worked on the special effects for the original Matrix trilogy...

Everything in the virtual city is fully loaded no matter where your character is located (rather than rendered only when the character gets near), down to the detail of a chain link fence in an alley. All of the moving vehicles, people, and lighting in the city are generated by AI, the latter of which Libreri describes as a breakthrough that means lighting is no longer "this sort of niche art form." Thanks to updates coming to Unreal Engine, which powers everything from Fortnite to special effects in Disney's The Mandalorian, developers will be able to use the same, hyper-realistic virtual assets across different experiences. It's part of Epic's goal to help build the metaverse.

Elsewhere the site writes that The Matrix Awakens "single-handedly proves next-gen graphics are within reach of Sony and Microsoft's new game consoles." It's unlike any tech demo you've ever tried before. When we said the next generation of gaming didn't actually arrive with Xbox Series X and PS5, this is the kind of push that has the potential to turn that around....

Just don't expect it to make you question your reality — the uncanny valley is still alive and well.... But from a "is it time for photorealistic video game cities?" perspective, The Matrix Awakens is seriously convincing. It's head-and-shoulders above the most photorealistic video game cities we've seen so far, including those in the Spider-Man, Grand Theft Auto and Watch Dogs series... Despite glitches and an occasionally choppy framerate, The Matrix Awakens city feels more real, thanks to Unreal Engine's incredible global illumination and real-time raytracing ("The entire world is lit by only the sun, sky and emissive materials on meshes," claims Epic), the detail of the procedurally generated buildings, and how dense it all is in terms of cars and foot traffic.

And the most convincing part is that it's not just a scripted sequence running in real-time on your PS5 or Xbox like practically every other tech demo you've seen — you get to run, drive, and fly through it, manipulate the angle of the sun, turn on filters, and dive into a full photo mode, as soon as the scripted and on-rails shooter parts of the demo are done. Not that there's a lot to do in The Matrix Awakens except finding different ways to take in the view. You can't land on buildings, there's no car chases except for the scripted one, no bullets to dodge. You can crash any one of the game's 38,146 drivable cars into any of the other cars or walls, I guess. I did a bunch of that before I got bored, though, just taking in the world.... Almost 10 million unique and duplicated assets were created to make the city....

Epic Games' pitch is that Unreal Engine 5 developers can do this or better with its ready-made tools at their disposal, and I can't wait to see them try.
