Google

Steam (Officially) Comes To Chrome OS 24

An anonymous reader shares a report: This may feel like deja vu because Google itself mistakenly leaked this announcement a few days ago, but the company today officially announced the launch of Steam on Chrome OS. Before you run off to install it, there are a few caveats: This is still an alpha release and only available on the more experimental and unstable Chrome OS Dev channel. The number of supported devices is also still limited since it'll need at least 8GB of memory, an 11th-generation Intel Core i5 or i7 processor and Intel Iris Xe Graphics. That's a relatively high-end configuration for what are generally meant to be highly affordable devices and somewhat ironically means that you can now play games on Chrome OS devices that are mostly meant for business users. The list of supported games is also still limited but includes the likes of Portal 2, Skyrim, The Witcher 3: Wild Hunt, Half-Life 2, Stardew Valley, Factorio, Stellaris, Civilization V, Fallout 4, Disco Elysium and Untitled Goose Game.
Technology

Nvidia Takes the Wraps off Hopper, Its Latest GPU Architecture (venturebeat.com) 58

After much speculation, Nvidia today at its March 2022 GTC event announced the Hopper GPU architecture, a line of graphics cards that the company says will accelerate the types of algorithms commonly used in data science. Named for Grace Hopper, the pioneering U.S. computer scientist, the new architecture succeeds Nvidia's Ampere architecture, which launched roughly two years ago. From a report: The first card in the Hopper lineup is the H100, containing 80 billion transistors and a component called the Transformer Engine that's designed to speed up specific categories of AI models. Another architectural highlight is Nvidia's MIG technology, which allows an H100 to be partitioned into seven smaller, isolated instances to handle different types of jobs. "Datacenters are becoming AI factories -- processing and refining mountains of data to produce intelligence," Nvidia founder and CEO Jensen Huang said in a press release. "Nvidia H100 is the engine of the world's AI infrastructure that enterprises use to accelerate their AI-driven businesses."

The H100 is the first Nvidia GPU to feature dynamic programming instructions (DPX), "instructions" in this context referring to segments of code containing steps that need to be executed. Developed in the 1950s, dynamic programming is an approach to solving problems using two key techniques: recursion and memoization. Recursion in dynamic programming involves breaking a problem down into sub-problems, ideally saving time and computational effort. In memoization, the answers to these sub-problems are stored so that the sub-problems don't need to be recomputed when they're needed later on in the main problem. Dynamic programming is used to find optimal routes for moving machines (e.g., robots), streamline operations on sets of databases, align unique DNA sequences, and more.
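To make the recursion-plus-memoization idea concrete, here is a minimal, generic Python sketch (it illustrates dynamic programming in general, not Nvidia's DPX instructions): the classic edit-distance problem, the same kind of sequence-alignment computation mentioned at the end of the paragraph above.

from functools import lru_cache

def edit_distance(a: str, b: str) -> int:
    """Minimum number of single-character edits (insert, delete, substitute)
    needed to turn string a into string b. Recursion breaks the problem into
    sub-problems on shorter prefixes; memoization (lru_cache) stores each
    sub-result so it is computed only once."""

    @lru_cache(maxsize=None)
    def solve(i: int, j: int) -> int:
        if i == 0:                # a exhausted: insert the rest of b
            return j
        if j == 0:                # b exhausted: delete the rest of a
            return i
        if a[i - 1] == b[j - 1]:  # characters match, no edit needed
            return solve(i - 1, j - 1)
        return 1 + min(
            solve(i - 1, j),      # delete from a
            solve(i, j - 1),      # insert into a
            solve(i - 1, j - 1),  # substitute
        )

    return solve(len(a), len(b))

print(edit_distance("GATTACA", "GCATGCU"))  # -> 4

Without the cache, the same recursion would recompute overlapping sub-problems exponentially many times; with it, the work is proportional to the number of distinct (i, j) pairs.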

Iphone

Apple's iPhone Cameras Accused of Being 'Too Smart' (newyorker.com) 162

The New Yorker argues that photos on newer iPhones are "coldly crisp and vaguely inhuman, caught in the uncanny valley where creative expression meets machine learning...."

"[T]he truth is that iPhones are no longer cameras in the traditional sense. Instead, they are devices at the vanguard of 'computational photography,' a term that describes imagery formed from digital data and processing as much as from optical information. Each picture registered by the lens is altered to bring it closer to a pre-programmed ideal." In late 2020, Kimberly McCabe, an executive at a consulting firm in the Washington, D.C. area, upgraded from an iPhone 10 to an iPhone 12 Pro... But the 12 Pro has been a disappointment, she told me recently, adding, "I feel a little duped." Every image seems to come out far too bright, with warm colors desaturated into grays and yellows. Some of the photos that McCabe takes of her daughter at gymnastics practice turn out strangely blurry. In one image that she showed me, the girl's upraised feet smear together like a messy watercolor. McCabe said that, when she uses her older digital single-lens-reflex camera (D.S.L.R.), "what I see in real life is what I see on the camera and in the picture." The new iPhone promises "next level" photography with push-button ease. But the results look odd and uncanny. "Make it less smart — I'm serious," she said. Lately she's taken to carrying a Pixel, from Google's line of smartphones, for the sole purpose of taking pictures....

Gregory Gentert, a friend who is a fine-art photographer in Brooklyn, told me, "I've tried to photograph on the iPhone when light gets bluish around the end of the day, but the iPhone will try to correct that sort of thing." A dusky purple gets edited, and in the process erased, because the hue is evaluated as undesirable, as a flaw instead of a feature. The device "sees the things I'm trying to photograph as a problem to solve," he added. The image processing also eliminates digital noise, smoothing it into a soft blur, which might be the reason behind the smudginess that McCabe sees in photos of her daughter's gymnastics. The "fix" ends up creating a distortion more noticeable than whatever perceived mistake was in the original.

Earlier this month, Apple's iPhone team agreed to provide me information, on background, about the camera's latest upgrades. A staff member explained that, when a user takes a photograph with the newest iPhones, the camera creates as many as nine frames with different levels of exposure. Then a "Deep Fusion" feature, which has existed in some form since 2019, merges the clearest parts of all those frames together, pixel by pixel, forming a single composite image. This process is an extreme version of high-dynamic range, or H.D.R., a technique that previously required some software savvy.... The iPhone camera also analyzes each image semantically, with the help of a graphics-processing unit, which picks out specific elements of a frame — faces, landscapes, skies — and exposes each one differently. On both the 12 Pro and 13 Pro, I've found that the image processing makes clouds and contrails stand out with more clarity than the human eye can perceive, creating skies that resemble the supersaturated horizons of an anime film or a video game. Andy Adams, a longtime photo blogger, told me, "H.D.R. is a technique that, like salt, should be applied very judiciously." Now every photo we take on our iPhones has had the salt applied generously, whether it is needed or not....
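Apple's pipeline is proprietary, but the underlying idea of merging differently exposed frames pixel by pixel can be sketched in a few lines. The toy example below is only an illustration of multi-frame exposure blending under simple assumptions (grayscale frames in [0, 1], a hand-tuned "well-exposedness" weight), not Deep Fusion:

import numpy as np

def merge_exposures(frames: list[np.ndarray]) -> np.ndarray:
    """Toy multi-frame exposure merge (illustrative only).

    frames: grayscale images with float values in [0, 1], all the same shape.
    Each pixel in each frame is weighted by how close it is to mid-gray
    (i.e. how well exposed it is), then the frames are blended per pixel.
    """
    stack = np.stack(frames)                        # shape: (n_frames, H, W)
    weights = np.exp(-((stack - 0.5) ** 2) / 0.08)  # favor mid-tone pixels
    weights /= weights.sum(axis=0, keepdims=True)   # normalize per pixel
    return (weights * stack).sum(axis=0)            # weighted composite

# Example: a dark, a normal, and a bright capture of the same (random) scene.
rng = np.random.default_rng(0)
scene = rng.random((4, 4))
frames = [np.clip(scene * gain, 0, 1) for gain in (0.4, 1.0, 2.0)]
print(merge_exposures(frames).round(2))

Real computational-photography pipelines layer frame alignment, noise modeling and the semantic per-region exposure described above on top of this kind of per-pixel weighting, which is where the heavy-handed look the article describes comes from.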

The average iPhone photo strains toward the appearance of professionalism and mimics artistry without ever getting there. We are all pro photographers now, at the tap of a finger, but that doesn't mean our photos are good.

Data Storage

Nvidia Wants To Speed Up Data Transfer By Connecting Data Center GPUs To SSDs (arstechnica.com) 15

Microsoft brought DirectStorage to Windows PCs this week. The API promises faster load times and more detailed graphics by letting game developers make apps that load graphical data from the SSD directly to the GPU. Now, Nvidia and IBM have created a similar SSD/GPU technology, but they are aiming it at the massive data sets in data centers. From a report: Instead of targeting console or PC gaming like DirectStorage, Big accelerator Memory (BaM) is meant to provide data centers quick access to vast amounts of data in GPU-intensive applications, like machine-learning training, analytics, and high-performance computing, according to a research paper spotted by The Register this week. Entitled "BaM: A Case for Enabling Fine-grain High Throughput GPU-Orchestrated Access to Storage" (PDF), the paper by researchers at Nvidia, IBM, and a few US universities proposes a more efficient way to run next-generation applications in data centers with massive computing power and memory bandwidth. BaM also differs from DirectStorage in that the creators of the system architecture plan to make it open source.
Security

Cybercriminals Who Breached Nvidia Issue One of the Most Unusual Demands Ever (arstechnica.com) 60

shanen shares a report: Data extortionists who stole up to 1 terabyte of data from Nvidia have delivered one of the most unusual ultimatums ever in the annals of cybercrime: allow Nvidia's graphics cards to mine cryptocurrencies faster or face the imminent release of the company's crown-jewel source code. A ransomware group calling itself Lapsus$ first claimed last week that it had hacked into Nvidia's corporate network and stolen more than 1TB of data. Included in the theft, the group claims, are schematics and source code for drivers and firmware. A relative newcomer to the ransomware scene, Lapsus$ has already published one tranche of leaked files, which among other things included the usernames and cryptographic hashes for 71,335 of the chipmaker's employees.
Hardware

Raspberry Pi Alternative Banana Pi Reveals Powerful New Board (tomshardware.com) 78

Banana Pi has revealed a new board resembling the Raspberry Pi Compute Module 3. According to Tom's Hardware, it features a powerful eight-core processor, up to 8GB of RAM and 32GB eMMC. Additional features like ports will require you to connect it to a carrier board. From the report: At the core of the Banana Pi board is a Rockchip RK3588 SoC. This brings together four Arm Cortex-A76 cores at up to 2.6 GHz with four Cortex-A55 cores at 1.8 GHz in Arm's new DynamIQ configuration - essentially big.LITTLE in a single fully integrated cluster. It uses an 8nm process. The board is accompanied by an Arm Mali-G610 MP4 Odin GPU with support for OpenGL ES 1.1, 2.0, and 3.2, OpenCL up to 2.2, and Vulkan 1.2. There's a 2D graphics engine supporting resolutions up to 8K too, with four separate displays catered for (one of which can be 8K 30FPS), and up to 8GB of RAM, though the SoC supports up to 32GB. Built-in storage is catered for by up to 128GB of eMMC flash. It offers 8K 30fps video encoding in the H.265, VP9, AVS2 and (at 30fps) H.264 codecs.

That carrier board is a monster, with ports along every edge. It looks to be about four times the area of the compute board, though no official measurements have been given. You get three HDMIs (the GPU supports version 2.1), two gigabit Ethernet, two SATA, three USB Type-A (two 2.0 and one 3.0), one USB Type-C, micro SD, 3.5mm headphones, ribbon connectors, and what looks very like a PCIe 3.0 x4 micro slot. The PCIe slot seems to break out horizontally, an awkward angle if you are intending to house the board in a case. Software options include Android and Linux.

Security

Nvidia Says Employee, Company Information Leaked Online After Cyber Attack (cnn.com) 9

U.S. chipmaker Nvidia said on Tuesday a cyber attacker has leaked employee credentials and some company proprietary information online after its systems were breached. From a report: "We have no evidence of ransomware being deployed on the Nvidia environment or that this is related to the Russia-Ukraine conflict," the company's spokesperson said in a statement. The Santa Clara, California-based company said it became aware of the breach on Feb. 23. Nvidia added it was working to analyze the information that has been leaked and does not anticipate any disruption to the company's business. A ransomware outfit operating under the name "Lapsus$" has reportedly claimed responsibility for the leak and seemingly holds schematics, drivers and firmware, among other data, related to the company's graphics chips.
Security

Utility Promising To Restore Mining Performance on Nvidia GPUs Actually Malware (web3isgoinggreat.com) 23

Web3 is Going Great reports: The popular Tom's Hardware and PC Gamer websites both ran articles about a utility called "Nvidia RTX LHR v2 Unlocker", which claimed to increase the artificially-limited cryptocurrency mining performance of Nvidia's RTX graphics cards. These graphics cards are shipped with performance-limiting software to reduce the GPUs' attractiveness to cryptocurrency miners, whose thirst for GPUs has made it difficult and expensive for gamers and various others to acquire the hardware. Unfortunately, both publications had to run a second article just a day later to warn their readers away from the software they had just advertised.
Intel

Intel Arc Update: Alchemist Laptops Q1, Desktops Q2; 4M GPUs Total for 2022 (anandtech.com) 12

As part of Intel's annual investor meeting taking place today, Raja Koduri, Intel's SVP and GM of the Accelerated Computing Systems and Graphics (AXG) Group, delivered an update to investors on the state of Intel's GPU and accelerator group, including some fresh news on the state of Intel's first generation of Arc graphics products. AnandTech: Among other things, the GPU frontman confirmed that while Intel will indeed ship the first Arc mobile products in the current quarter, desktop products will not come until Q2. Meanwhile, in the first disclosure of chip volumes, Intel is now projecting that they'll ship 4mil+ Arc GPUs this year. In terms of timing, today's disclosure confirms some earlier suspicions that developed following Intel's CES 2022 presentation: that the company would get its mobile Arc products out before their desktop products. Desktop products will now follow in the second quarter of this year, a couple of months behind the mobile parts. And finally, workstation products, which Intel has previously hinted at, are on their way and will land in Q3.
Intel

Intel To Enter Bitcoin Mining Market With Energy-Efficient GPU (pcmag.com) 52

Intel is entering the blockchain mining market with an upcoming GPU capable of mining Bitcoin. From a report: Intel insists the effort won't put a strain on energy supplies or deprive consumers of chips. The goal is to create the most energy-efficient blockchain mining equipment on the planet, it says. "We expect that our circuit innovations will deliver a blockchain accelerator that has over 1,000x better performance per watt than mainstream GPUs for SHA-256 based mining," Intel's General Manager for Graphics, Raja Koduri, said in the announcement. (SHA-256 is a reference to the mining algorithm used to create Bitcoins.)
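For readers unfamiliar with what "SHA-256 based mining" involves: the proof of work boils down to repeatedly double-hashing a candidate block header and checking whether the result falls below a difficulty target. The Python sketch below is a toy illustration only -- a made-up header and a trivially low difficulty, not the real 80-byte Bitcoin header format or anything resembling ASIC-class performance:

import hashlib

def double_sha256(data: bytes) -> bytes:
    """Bitcoin-style proof of work applies SHA-256 twice."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def toy_mine(header: bytes, difficulty_bits: int = 16, max_nonce: int = 1_000_000):
    """Toy miner: find a nonce so that the double-SHA-256 of (header + nonce)
    falls below a target, i.e. starts with `difficulty_bits` zero bits.
    Illustrative only; real mining uses vastly higher difficulty."""
    target = 1 << (256 - difficulty_bits)
    for nonce in range(max_nonce):
        digest = double_sha256(header + nonce.to_bytes(4, "little"))
        if int.from_bytes(digest, "big") < target:
            return nonce, digest.hex()
    return None  # no solution found within max_nonce attempts

print(toy_mine(b"example block header"))

Dedicated mining hardware wins by performing exactly this hash loop in fixed-function circuitry, which is why performance per watt is the metric Koduri cites.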

News of Intel's blockchain-mining effort first emerged last month after the ISSCC technology conference posted details about an upcoming Intel presentation titled: "Bonanza Mine: An Ultra-Low-Voltage Energy-Efficient Bitcoin Mining ASIC." ASICs are chips designed for a specific purpose; in this context, the term refers to dedicated Bitcoin-mining hardware. Friday's announcement from Koduri added that Intel is establishing a new "Custom Compute Group" to create chip platforms optimized for customers' workloads, including for blockchains.

Transportation

Tesla Now Runs the Most Productive Auto Factory In America (bloomberg.com) 198

An anonymous reader quotes a report from Bloomberg: Elon Musk has a very specific vision for the ideal factory: densely packed, vertically integrated and unusually massive. During Tesla's early days of mass production, he was chided for what was perceived as hubris. Now, Tesla's original California factory has achieved a brag-worthy title: the most productive auto plant in North America. Last year Tesla's factory in Fremont, California, produced an average of 8,550 cars a week. That's more than Toyota's juggernaut in Georgetown, Kentucky (8,427 cars a week), BMW AG's Spartanburg hub in South Carolina (8,343) or Ford's iconic truck plant in Dearborn, Michigan (5,564), according to a Bloomberg analysis of production data from more than 70 manufacturing facilities.

In a year when auto production around the world was stifled by supply-chain shortages, Tesla expanded its global production by 83% over 2020 levels. Its other auto factory, in Shanghai, tripled output to nearly 486,000. In the coming weeks, Tesla is expected to announce the start of production at two new factories -- Gigafactory Berlin-Brandenburg, its first in Europe, and Gigafactory Texas in Austin. Musk said in October that he plans to further increase production in Fremont and Shanghai by 50%. [...] Once Tesla flips the switch on two new factories, what comes next? Musk has a longstanding target to increase vehicle deliveries by roughly 50% a year. To continue such growth, Tesla will need to either open more factories or make the facilities even more productive. Musk said in October that he's working on both. Site selection for the next Gigafactories begins this year.

AI

Meta Unveils New AI Supercomputer (wsj.com) 48

An anonymous reader quotes a report from The Wall Street Journal: Meta said Monday that its research team built a new artificial intelligence supercomputer that the company maintains will soon be the fastest in the world. The supercomputer, the AI Research SuperCluster, was the result of nearly two years of work, often conducted remotely during the height of the pandemic, and led by the Facebook parent's AI and infrastructure teams. Several hundred people, including researchers from partners Nvidia, Penguin Computing and Pure Storage, were involved in the project, the company said.

Meta, which announced the news in a blog post Monday, said its research team currently is using the supercomputer to train AI models in natural-language processing and computer vision for research. The aim is to boost capabilities to one day train models with more than a trillion parameters on data sets as large as an exabyte, which is roughly equivalent to 36,000 years of high-quality video. "The experiences we're building for the metaverse require enormous compute power, and RSC will enable new AI models that can learn from trillions of examples, understand hundreds of languages, and more," Meta CEO Mark Zuckerberg said in a statement provided to The Wall Street Journal. Meta's AI supercomputer houses 6,080 Nvidia graphics-processing units, putting it fifth among the fastest supercomputers in the world, according to Meta.

By mid-summer, when the AI Research SuperCluster is fully built, it will house some 16,000 GPUs, becoming the fastest AI supercomputer in the world, Meta said. The company declined to comment on the location of the facility or the cost. [...] Eventually the supercomputer will help Meta's researchers build AI models that can work across hundreds of languages, analyze text, images and video together and develop augmented reality tools, the company said. The technology also will help Meta more easily identify harmful content and will aim to help Meta researchers develop artificial-intelligence models that think like the human brain and support rich, multidimensional experiences in the metaverse. "In the metaverse, it's one hundred percent of the time, a 3-D multi-sensorial experience, and you need to create artificial-intelligence agents in that environment that are relevant to you," said Jerome Pesenti, vice president of AI at Meta.

Graphics

Vice Mocks GIFs as 'For Boomers Now, Sorry'. (And For Low-Effort Millennials) (vice.com) 227

"GIF folders were used by ancient civilisations as a way to store and catalogue animated pictures that were once employed to convey emotion," Vice writes: Okay, you probably know what a GIF folder is — but the concept of a special folder needed to store and save GIFs is increasingly alien in an era where every messaging app has its own in-built GIF library you can access with a single tap. And to many youngsters, GIFs themselves are increasingly alien too — or at least, okay, increasingly uncool. "Who uses gifs in 2020 grandma," one Twitter user speedily responded to Taylor Swift in August that year when the singer-songwriter opted for an image of Dwayne "The Rock" Johnson mouthing the words "oh my god" to convey her excitement at reaching yet another career milestone.

You don't have to look far to find other tweets or TikToks mocking GIFs as the preserve of old people — which, yes, now means millennials. How exactly did GIFs become so embarrassing? Will they soon disappear forever, like Homer Simpson backing up into a hedge...?

Gen Z might think GIFs are beloved by millennials, but at the same time, many millennials are starting to see GIFs as a boomer plaything. And this is the first and easiest explanation as to why GIFs are losing their cultural cachet. Whitney Phillips, an assistant professor of communication at Syracuse University and author of multiple books on internet culture, says that early adopters have always grumbled when new (read: old) people start to encroach on their digital space. Memes, for example, were once subcultural and niche. When Facebook came along and made them more widespread, Redditors and 4Chan users were genuinely annoyed that people capitalised on the fruits of their posting without putting in the cultural work. "That democratisation creates a sense of disgust with people who consider themselves insiders," Phillips explains. "That's been central to the process of cultural production online for decades at this point...."

In 2016, Twitter launched its GIF search function, as did WhatsApp and iMessage. A year later, Facebook introduced its own GIF button in the comment section on the site. GIFs became not only centralised but highly commercialised, culminating in Facebook buying GIPHY for $400 million in 2020. "The more GIFs there are, maybe the less they're regarded as being special treasures or gifts that you're giving people," Phillips says. "Rather than looking far and wide to find a GIF to send you, it's clicking the search button and typing a word. The gift economy around GIFs has shifted...."

Linda Kaye, a cyberpsychology professor at Edge Hill University, hasn't done direct research in this area but theorises that the ever-growing popularity of video-sharing on TikTok means younger generations are more used to "personalised content creation", and GIFs can seem comparatively lazy.

The GIF was invented in 1987 "and it's important to note the format has already fallen out of favour and had a comeback multiple times before," the article points out. It cites Jason Eppink, an independent artist and curator who organized an exhibition on GIFs for the Museum of the Moving Image in New York in 2014; he highlighted how GIFs were popular with GeoCities users in the 90s, "so when Facebook launched, they didn't support GIFs.... They were like, 'We don't want this ugly symbol of amateur web to clutter our neat and uniform cool new website.'" But then GIFs had a resurgence on Tumblr.

Vice concludes that while even Eppink no longer uses GIFs, "Perhaps the waxing and waning popularity of the GIF is an ironic mirror of the format itself — destined to repeat endlessly, looping over and over again."
Graphics

Blender 3.0 Released With More New Features and Improvements 37

Long-time Slashdot reader Qbertino writes: The free, open-source 3D production software Blender has been released in version 3.0 (official showreel) with new features, improvements and performance optimizations, as well as further improved workflows.

In recent years Blender has received increasing attention from the 3D industry, with various larger businesses such as Epic, Microsoft, Apple and most recently Intel joining the Blender Foundation and donating to its development fund. Blender has seen rising usage in various industries, such as animated feature film production, architecture and game development.
Google

Google is Building an AR Headset (theverge.com) 52

Meta may be the loudest company building AR and VR hardware. Microsoft has HoloLens. Apple is working on something, too. But don't count out Google. The Verge: The search giant has recently begun ramping up work on an AR headset, internally codenamed Project Iris, that it hopes to ship in 2024, according to two people familiar with the project who requested anonymity to speak without the company's permission. Like forthcoming headsets from Meta and Apple, Google's device uses outward-facing cameras to blend computer graphics with a video feed of the real world, creating a more immersive, mixed reality experience than existing AR glasses from the likes of Snap and Magic Leap. Early prototypes being developed at a facility in the San Francisco Bay Area resemble a pair of ski goggles and don't require a tethered connection to an external power source.

Google's headset is still early in development without a clearly defined go-to-market strategy, which indicates that the 2024 target year may be more aspirational than set in stone. The hardware is powered by a custom Google processor, like its newest Google Pixel smartphone, and runs on Android, though recent job listings indicate that a unique OS is in the works. Given power constraints, Google's strategy is to use its data centers to remotely render some graphics and beam them into the headset via an internet connection. I'm told that the Pixel team is involved in some of the hardware pieces, but it's unclear if the headset will ultimately be Pixel-branded.

Intel

Intel To Unveil 'Ultra Low-Voltage Bitcoin Mining ASIC' In February (coindesk.com) 31

Intel, one of the world's largest chip makers, is likely to unveil a specialized crypto-mining chip at the International Solid-State Circuits Conference (ISSCC) in February, according to the conference's agenda (PDF). CoinDesk reports: One of Intel's "highlighted chip releases" at the conference is entitled "Bonanza Mine: An Ultra-Low-Voltage Energy-Efficient Bitcoin Mining ASIC." The session is scheduled for Feb. 23. This brings the company into direct competition with the likes of Bitmain and MicroBT in the market for bitcoin mining ASICs, or application-specific integrated circuits, for the first time. [...] Unlike its competitor Nvidia, Intel has said it doesn't plan to add ether mining limits on its graphics cards.
AMD

AMD Returns To Smartphone Graphics (theregister.com) 13

AMD's GPU technology is returning to mobile handsets with Samsung's Exynos 2200 system-on-chip, which was announced on Tuesday. The Register reports: The Exynos 2200 processor, fabricated using a 4nm process, has Armv9 CPU cores and the oddly named Xclipse GPU, which is an adaptation of AMD's RDNA 2 mainstream GPU architecture. AMD was in the handheld GPU market until 2009, when it sold the Imageon GPU and handheld business for $65m to Qualcomm, which turned the tech into the Adreno GPU for its Snapdragon family. AMD's Imageon processors were used in devices from Motorola, Panasonic, Palm and others making Windows Mobile handsets. AMD's now returning to a more competitive mobile graphics market with Apple, Arm and Imagination also possessing homegrown smartphone GPUs.

Samsung and AMD announced the companies were working together on graphics in June last year. With Exynos 2200, Samsung has moved on from Arm's Mali GPU family, which was in the predecessor Exynos 2100 used in the current flagship Galaxy smartphones. Samsung says the power-optimized GPU has hardware-accelerated ray tracing, which simulates lighting effects and other features to make gaming a better experience. [...] The Exynos 2200 has an image signal processor that can apparently handle 200-megapixel pictures and record 8K video. Other features include HDR10+ support, and 4K video decoding at up to 240fps or 8K decoding at up to 60fps. It supports display refresh rates of up to 144Hz.

The eight-core CPU cluster features a balance of high-performing and power-efficient cores. It has one Arm Cortex-X2 flagship core, three Cortex-A710 big cores and four Cortex-A510s, which is in the same ballpark as Qualcomm's Snapdragon 8 Gen 1 and Mediatek's Dimensity 9000, which are the only other chips using Arm's Armv9 cores and are made using a 4nm process. An integrated 5G modem supports both sub-6GHz and millimeter wave bands, and a feature to mix LTE and 5G signals speeds up data transfers to 10Gbps. The chip also has a security processor and an AI engine that is said to be two times faster than its predecessor in the Exynos 2100.

AI

Nvidia's AI-Powered Scaling Makes Old Games Look Better Without a Huge Performance Hit (theverge.com) 41

Nvidia's latest game-ready driver includes a tool that could let you improve the image quality of games that your graphics card can easily run, alongside optimizations for the new God of War PC port. The Verge reports: The tech is called Deep Learning Dynamic Super Resolution, or DLDSR, and Nvidia says you can use it to make "most games" look sharper by running them at a higher resolution than your monitor natively supports. DLDSR builds on Nvidia's Dynamic Super Resolution tech, which has been around for years. Essentially, regular old DSR renders a game at a higher resolution than your monitor can handle and then downscales it to your monitor's native resolution. This leads to an image with better sharpness but usually comes with a dip in performance (you are asking your GPU to do more work, after all). So, for instance, if you had a graphics card capable of running a game at 4K but only had a 1440p monitor, you could use DSR to get a boost in clarity.

DLDSR takes the same concept and incorporates AI that can also work to enhance the image. According to Nvidia, this means you can upscale less (and therefore lose less performance) while still getting similar image quality improvements. In real numbers, Nvidia claims you'll get image quality similar to running at four times the resolution using DSR with only 2.25 times the resolution with DLDSR. Nvidia gives an example using 2017's Prey: Digital Deluxe running on a 1080p monitor: 4x DSR runs at 108 FPS, while 2.25x DLDSR is getting 143 FPS, only two frames per second slower than running at native 1080p.
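One detail worth spelling out: the DSR/DLDSR factors refer to total pixel count, not per-axis resolution, so each axis scales by the square root of the factor. A quick Python sketch of the arithmetic, using the 1080p monitor from the article's example:

import math

def render_resolution(native_w: int, native_h: int, pixel_factor: float) -> tuple[int, int]:
    """DSR/DLDSR factors describe total pixel count, so each axis scales by
    the square root of the factor before the image is downscaled back to
    the native resolution."""
    axis_scale = math.sqrt(pixel_factor)
    return round(native_w * axis_scale), round(native_h * axis_scale)

native = (1920, 1080)  # the 1080p monitor in Nvidia's Prey example
for label, factor in (("4.00x DSR", 4.0), ("2.25x DLDSR", 2.25)):
    w, h = render_resolution(*native, factor)
    print(f"{label}: render at {w}x{h}, then downscale to {native[0]}x{native[1]}")

# 4.00x DSR renders at 3840x2160 (4K); 2.25x DLDSR renders at 2880x1620,
# which is roughly half the pixel workload -- hence the smaller performance hit.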

IOS

Fortnite Sneaks Back Onto iPhone By Way Of GeForce Now (kotaku.com) 13

It's been 518 days since Apple kicked Fortnite off of the App Store after Epic Games tried to bypass its payment system. Now the popular free-to-play battle royale is once again playable on iPhones, sort of. From a report: Starting next week, Fortnite will be available on iOS by way of streaming, as part of an upcoming closed beta for Nvidia's GeForce Now game streaming program. "Fortnite on GeForce NOW will launch in a limited-time closed beta for mobile, all streamed through the Safari web browser on iOS and the GeForce NOW Android app," Nvidia announced on its blog today. "The beta is open for registration for all GeForce NOW members, and will help test our server capacity, graphics delivery and new touch controls performance."

GeForce Now, subscriptions for which range from free to $200 a year for the premium tier, lets users stream games they already own to PCs, tablets, and smartphones. It's one way to make blockbuster PC games portable, or to play them on rigs with beefier specs than the ones people already have at home. In Fortnite's case, GeForce Now subscribers will soon be able to stream the shooter to iOS devices and play it using touch controls via Apple's Safari. The browser workaround is one way companies like Microsoft have been able to get their game streaming platforms on iPhones despite Apple's ban on allowing them inside its App Store. Now it's bringing back the game that kicked off a massive, messy, year-long legal battle that's still raging to this day.

Data Storage

PCI Express 6.0 Specification Finalized: x16 Slots To Reach 128GBps (anandtech.com) 31

PCI Special Interest Group (PCI-SIG) has released the much-awaited final (1.0) specification for PCI Express 6.0. From a report: The next generation of the ubiquitous bus is once again doubling the data rate of a PCIe lane, bringing it to 8GB/second in each direction -- and far, far higher for multi-lane configurations. With the final version of the specification now sorted and approved, the group expects the first commercial hardware to hit the market in 12-18 months, which in practice means it should start showing up in servers in 2023.

First announced in the summer of 2019, PCI Express 6.0 is, as the name implies, the immediate follow-up to the current-generation PCIe 5.0 specification. Having made it their goal to keep doubling PCIe bandwidth roughly every 3 years, the PCI-SIG almost immediately set about work on PCIe 6.0 once the 5.0 specification was completed, looking at ways to once again double the bandwidth of PCIe. The product of those development efforts is the new PCIe 6.0 spec, and while the group has missed their original goal of a late 2021 release by mere weeks, today they are announcing that the specification has been finalized and is being released to the group's members.

As always, the creation of an even faster version of PCIe technology has been driven by the insatiable bandwidth needs of the industry. The amount of data being moved by graphics cards, accelerators, network cards, SSDs, and other PCIe devices only continues to increase, and thus so must bus speeds to keep these devices fed. As with past versions of the standard, the immediate demand for the faster specification comes from server operators, who are already regularly using large amounts of high-speed hardware. But in due time the technology should filter down to consumer devices (i.e. PCs) as well.
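The headline x16 number follows directly from the per-lane rate. A quick sketch of the arithmetic in Python, using the 8GB/second-per-lane, per-direction figure quoted above (treat these as raw spec ceilings; real-world throughput is somewhat lower once protocol overhead is accounted for):

# PCIe 6.0 bandwidth by lane count, per direction, from the quoted
# ~8 GB/s-per-lane figure (raw spec rate, before protocol overhead).
PER_LANE_GBPS = 8  # GB/s per lane, each direction, PCIe 6.0

for lanes in (1, 2, 4, 8, 16):
    print(f"x{lanes:<2} -> {lanes * PER_LANE_GBPS:>3} GB/s per direction")

# x16 -> 128 GB/s per direction, which is the figure in the headline.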
