Bug

NVIDIA Fixes High Severity Flaws Affecting Windows, Linux Devices (bleepingcomputer.com) 24

Bleeping Computer reports: NVIDIA has released security updates to address six security vulnerabilities found in Windows and Linux GPU display drivers, as well as ten additional flaws affecting the NVIDIA Virtual GPU (vGPU) management software. The vulnerabilities expose Windows and Linux machines to attacks leading to denial of service, escalation of privileges, data tampering, or information disclosure.

All these security bugs require local user access, which means that potential attackers will first have to gain access to vulnerable devices using an additional attack vector. Following successful exploitation of one of the vulnerabilities patched today, attackers can easily escalate privileges to gain permissions above the default ones granted by the OS.

United Kingdom

UK Watchdog Begins Investigating Nvidia's $40 Billion Takeover of Arm (theguardian.com) 22

Britain's competition watchdog has launched an investigation into the $40 billion takeover of the UK-based chip designer Arm by the US company Nvidia. From a report: The Competition and Markets Authority (CMA) has called for interested parties to submit views on the contentious deal before the launch of a formal investigation later this year. Arm Holdings, which employs 6,500 staff including 3,000 in the UK, is a global leader in designing chips for smartphones, computers and tablets. California-based Nvidia, a graphics chip specialist, announced its plan to buy the British tech group from Japan's SoftBank in September. SoftBank had acquired Arm for $32 billion in 2016, when the Japanese company took advantage of the fall in value of the pound after the Brexit vote. Arm is based in Cambridge but has operations in a number of UK towns and cities, including Manchester, Belfast and Warwick. Its chief executive, Simon Segars, acknowledged at the time of the Nvidia deal that it could take up to 18 months to win approval from regulators around the world.

Hardware

Graphics Cards Are About To Get a Lot More Expensive, Asus Warns (pcworld.com) 159

Ever since Nvidia's GeForce RTX 30-series and AMD's Radeon RX 6000-series graphics cards launched last fall, the overwhelming demand and tight supply, exacerbated by a cryptocurrency boom, have caused prices for all graphics cards to go nuts. Brace yourself: It looks like it's about to get even worse. From a report: In the Asus DIY PC Facebook group, Asus technical marketing manager Juan Jose Guerrero III warned that prices for the company's components will increase in the new year. "We have an announcement in regards to MSRP price changes that are effective in early 2021 for our award-winning series of graphic cards and motherboards," Guerrero wrote, though he warned that "additional models" may wind up receiving price increases as well. "Our new MSRP reflects increases in cost for components, operating costs, and logistical activities plus a continuation of import tariffs. We worked closely with our supply and logistic partners to minimize price increases. ASUS greatly appreciates your continued business and support as we navigate through this time of unprecedented market change."

AMD

Speculation Grows As AMD Files Patent for GPU Design (hothardware.com) 39

Long-time Slashdot reader UnknowingFool writes: AMD filed a patent on using chiplets for a GPU, with hints on why it has waited this long to extend its CPU chiplet strategy to GPUs. The latency between chiplets poses more of a performance problem for GPUs, and AMD is attempting to solve it with a new interconnect called a high bandwidth passive crosslink. This new interconnect would allow each GPU chiplet to communicate more effectively with the others and with the CPU.
"With NVIDIA working on its own MCM design with Hopper architecture, it's about time that we left monolithic GPU designs in the past and enable truly exponential performance growth," argues Wccftech.

And Hot Hardware delves into the details, calling it a "hybrid CPU-FPGA design that could be enabled by Xilinx tech." While they often aren't as great as CPUs on their own, FPGAs can do a wonderful job accelerating specific tasks... [A]n FPGA in the hands of a capable engineer can offload a wide variety of tasks from a CPU and speed processes along. Intel has talked a big game about integrating Xeons with FPGAs over the last six years, but it hasn't resulted in a single product hitting its lineup. A new patent by AMD, though, could mean that the FPGA newcomer might be ready to make one of its own...

AMD made 20 claims in its patent application, but the gist is that a processor can include one or more execution units that can be programmed to handle different types of custom instruction sets. That's exactly what an FPGA does...

AMD has been working on different ways to speed up AI calculations for years. First, the company announced and released the Radeon Instinct series of AI accelerators, which were just big headless Radeon graphics processors with custom drivers. The company doubled down on that with the release of the MI60, its first 7-nm GPU ahead of the Radeon RX 5000 series launch, in 2018. A shift to focusing on AI via FPGAs after the Xilinx acquisition makes sense, and we're excited to see what the company comes up with.

Graphics

Flash Is About To Die, But Classic Flash Games Will Live On (fastcompany.com) 45

Fast Company's technology editor harrymcc writes: After years of growing technical irrelevance and security concerns, the Flash browser plug-in will reach the end of the road on January 12 when Adobe blocks its ability to display content. The web will survive just fine. But there's a huge library of old Flash games — some of them quirky, interesting, and worth preserving. Over at Fast Company, Jared Newman wrote about several grassroots initiatives that will allow us to continue to enjoy these artifacts of the Flash era even after Flash is history.
Some tips from the article:
  • If you have a Windows PC, the best way to replay old Flash content is with Flashpoint, a free program with more than 70,000 web games and 8,000 animations, most of which are Flash-based. (Experimental Mac and Linux versions are also available, but are complicated to set up...)
  • Ruffle is the underlying emulation software that The Internet Archive is using. You can also install it as a standalone program or browser extension...
  • Newgrounds has released its own Flash Player for Windows that safely loads content from its website, so you still get the full experience of using Newgrounds proper.

But the article opens with a sentence reminding us that "After all the challenges of 2020, there's one thing we can all look forward to in the new year: Adobe Flash Player will finally be dead."


Nintendo

Linux Kernel Ported to the Nintendo 64 (phoronix.com) 33

Phoronix reports: It's been a turbulent year and 2020 is certainly ending interesting in the Linux/open-source space... If it wasn't odd enough seeing Sony providing a new official Linux driver for their PlayStation 5 DualSense controller for ending out the year, there is also a new Linux port to the Nintendo 64 game console... Yes, a brand new port to the game console that launched more than two decades ago.

Open-source developer Lauri Kasanen who has contributed to Mesa and the Linux graphics stack took to developing a new Nintendo 64 port and announced it this Christmas day. This isn't the first time Linux has been ported to the N64 but prior attempts weren't aimed at potentially upstreaming it into the mainline Linux kernel...

This fresh port to the N64 was pursued in part to help port emulators and frame-buffer or console games.

And also, the announcement adds, "Most importantly, because I can."

Microsoft

Microsoft Flight Simulator In VR: a Turbulent Start For Wide-Open Skies (arstechnica.com) 19

An anonymous reader quotes a report from Ars Technica: After over a year of requests from fans and enthusiasts, and months of official teases, Microsoft Flight Simulator has a virtual reality mode. Whether you play the game via Steam or the Windows Store, you can now take advantage of "OpenXR" calls to seemingly any PC-VR system on the market, aided by an "enable/disable VR" keyboard shortcut at any time. This summer, ahead of the game's final-stretch beta test, the developers at Asobo Studio used a screen-share feature in a video call to tease the VR mode to us at Ars Technica. This is never an ideal way to show off VR, in part because the platform requires high refresh rates for comfortable play, which can't be smoothly sent in a pandemic-era video call. But even for a video call, it looked choppy. Asobo's team assured us that the incomplete VR mode was running well -- but of course, we're all on edge about game-preview assurances as of late. Now that users have been formally invited to slap Microsoft Flight Simulator onto their faces, I must strongly urge users not to do so -- or at least heavily temper their expectations. Honestly, Asobo Studio should've issued these warnings, not me, because this mode is nowhere near retail-ready.

Ultimately, trying to use the 2020 version of MSFS within its VR mode's "potato" settings is a stupid idea until some kinks get worked out. It's bad enough how many visual toggles must be dropped to PS2 levels to reach a comfortable 90 fps refresh; what's worse is that even in this low-fidelity baseline, you'll still face serious stomach-turning anguish in the form of constant frametime spikes. Turn the details up to a "medium" level in order to savor the incredible graphics engine Asobo built, of course, and you're closer to 45 fps. I didn't even bother finding an average performance for the settings at maximum. That test made me sick enough to delay this article by a few hours. [...] The thing is, my VR stomach can always survive the first few minutes of a bumpy refresh before I have to rip my headset off in anguish -- and this was long enough to see the absolute potential of MSFS as a must-play VR library addition. I don't have an ultrawide monitor, so testing MSFS has always been an exercise in wishing for a better field of view -- to replicate the glance-all-over behavior of actual flight. Getting a taste of that in my headset -- with accurate cockpit lighting, impressive volumetric clouds, and 3D modeling of my plane's various sounds -- made me want to sit for hours in this mode and get lost in compelling, realistic flight. But even the most iron stomachs can only take so much screen flicker within VR before churning, and that makes MSFS's demanding 3D engine a terrible fit for the dream of hours-long VR flight... at least, for the time being.

GNU is Not Unix

A New Release For GNU Octave (lwn.net) 59

Long-time Slashdot reader lee1 shares his recent article from LWN: On November 26, version 6.1 of GNU Octave, a language and environment for numerical computing, was released. There are several new features and enhancements in the new version, including improvements to graphics output, better communication with web services, and over 40 new functions...

In the words of its manual:

GNU Octave is a high-level language primarily intended for numerical computations. It is typically used for such problems as solving linear and nonlinear equations, numerical linear algebra, statistical analysis, and for performing other numerical experiments.

Octave is free software distributed under the GPLv3. The program was first publicly released in 1993; it began as a teaching tool for students in a chemical engineering class. The professors, James B. Rawlings and John G. Ekerdt, tried to have the students use Fortran, but found that they were spending too much time trying to get their programs to compile and run instead of working on the actual substance of their assignments... Octave became part of the GNU project in 1997...

Octave, written in C, C++, and Fortran, soon adopted the goal and policy of being a fully compatible replacement for MATLAB. According to the Octave Wiki, any differences between Octave and MATLAB are considered to be bugs, "in general", and most existing MATLAB scripts will work unmodified when fed to Octave, and vice versa...

When octave is started in the terminal it brings up an interactive prompt. The user can type in expressions, and the results are printed immediately.
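
To make that concrete, here is a minimal, illustrative Octave session (the values and the prompt shown are assumptions for illustration, not taken from the LWN article, and the exact prompt and output formatting vary by Octave version). It defines a small matrix, solves a linear system with the backslash operator, and shows how a trailing semicolon suppresses printed output:

    $ octave
    octave:1> A = [1 2; 3 4];   % a 2x2 matrix; the semicolon suppresses printing
    octave:2> b = [5; 6];       % a column vector
    octave:3> x = A \ b         % solve the linear system A*x = b
    x =
      -4.0000
       4.5000
    octave:4> sum (x .^ 2)      % element-wise square, then sum
    ans = 36.250

Scripts written in MATLAB-style M-file syntax can be run in the same environment, which is where the compatibility goal described above comes into play.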

Google

'Google is Getting Left Behind Due To Horrible UI/UX' (danielmiessler.com) 269

Daniel Miessler, a widely respected infosec professional in San Francisco, writes about design and user experience choices Google has made across its services in recent years: I've been writing for probably a decade about how bad Google's GUI is for Google Analytics, Google Apps, and countless of their other properties -- not to mention their multiple social media network attempts, like Google+ and Wave. Back then it was super annoying, but kind of ok. They're a hardcore engineering group, and their backend services are without equal. But lately it's just becoming too much.

1. Even Gmail is a cesspool at this point. Nobody would ever design a webmail interface like that, starting from scratch.
2. What happened to Google Docs? Why does it not look and behave more like Notion, or Quip, or any of the other alternatives that made progress in the last 5-10 years?
3. What college course do I take to manage a Google Analytics property?
4. Google just rolled out Google Analytics 4 -- I think -- and the internet is full of people asking the same question I am. "Is this a real rollout?"

[...] My questions are simple:
1. How the hell is this possible? I get it 10 years ago. But then they came out with the new design language. Materialize, or whatever it was. Cool story, and cool visuals. But it's not about the graphics, it's about the experience.
2. How can you be sitting on billions of dollars and be unable to hire product managers that can create usable interfaces?
3. How can you run Gmail on an interface that's tangibly worse than anything else out there?
4. How can you let Google Docs get completely obsoleted by startups?

I've heard people say that Google has become the new Microsoft, or the new Oracle, but damn -- at least Microsoft is innovating. At least Oracle has a sailing team, or whatever else they do. I'm being emotional at this point.

Google, you are made out of money. Fix your fucking interfaces. Focus on the experience. Focus on simplicity. And use navigation language that's similar across your various properties, so that I'll know what to do whether I'm managing my Apps account, or my domains, or my Analytics. You guys are awesome at so many things. Make the commitment to fix how we interact with them.

GUI

Creator of DirectX Dies at Age 55 (livemint.com) 94

The Wall Street Journal looks back to the days when Windows was "a loser in the world of computer games." But to change that, Eric Engstrom and his cohorts "secretly hired programmers to get the work done, and they had to do an end run around partners like Intel," remembers VentureBeat.

Long-time Slashdot reader whh3 shares The Wall Street Journal's report: Windows inserted itself between game programs and the computer hardware in a way that slowed down graphics and animation. Game developers vastly preferred the DOS operating system, which didn't gum up their special effects. That created an opportunity for three Microsoft misfits — Eric Engstrom, Alex St. John and Craig Eisler. Mr. Engstrom, who died Dec. 1 at the age of 55, and his pals formed one of several factions within Microsoft trying to solve the game problem. Openly contemptuous of colleagues who didn't share their ideas, they were so obnoxious that Brad Silverberg, who ran the Windows business, dubbed them the Beastie Boys. He had to fend off frequent demands for their dismissal.

Yet the solution they developed, DirectX, beat anything else on offer inside Microsoft. DirectX software recognized games and allowed them direct access to the computer's graphical capabilities, allowing a richer game experience than DOS could. "It was brilliant," Mr. Silverberg said. Launched in 1995, DirectX wowed game developers and led to a flood of new games for computers loaded with Windows. That success emboldened Microsoft to plunge deeper into the lucrative gaming market by developing the Xbox console.

Microsoft's game business produced $11.6 billion of revenue in the year ended June 30...

"He thought things were possible that nobody else on the planet thought would be possible," said Ben G. Wolff, a friend who runs a robotics company, "and sometimes he'd be right."

"DirectX remains the foundation for many games on Windows 10 and the Xbox Series X," writes Engadget, "and it's likely to remain relevant for years to come."

And VentureBeat shared this remark from Alex St. John at a memorial service for Engstrom. "He had huge dreams and huge fantasies, and he always took us all with him."

Games

Do Games Made Under Crunch Conditions Deserve 'Best Direction' Awards? (kotaku.com) 146

The annual Game Awards ceremony gave this year's "Best Direction" award to Naughty Dog's The Last of Us Part II, provoking a strong reaction from Kotaku's staff writer.

"I think it's pretty obvious that no game that required its developers to crunch, like The Last of Us Part II did, should be given a Best Direction award." It's no secret that Naughty Dog subjected its workers to unbelievable levels of crunch to get The Last of Us Part II out the door, but that's hardly an innovation when it comes to Naughty Dog or game development in general. Over the years, the studio has seen constant employee turnover as developers crunch on games like The Last of Us and Uncharted, burn out, and throw in the towel. Relentless overtime, missed weekends, long stretches of time without seeing your family — these things take a toll on even the most passionate artist.

"This can't be something that's continuing over and over for each game, because it is unsustainable," one The Last of Us Part II developer told Kotaku earlier this year. "At a certain point you realize, 'I can't keep doing this. I'm getting older. I can't stay and work all night.'"

Let's be clear: the existence of crunch indicates a failure in leadership. It's up to game directors and producers to ensure workloads are being managed properly and goals are being met. If workers are being forced to crunch, explicitly or otherwise, it means the managers themselves have fallen short somewhere, either in straining the limits of their existing staff, fostering an environment where overtime is an implied (if unspoken) requirement, or both. And as ambitious as The Last of Us Part II director Neil Druckmann and his projects may be, "questionable experiments in the realm of pushing human limits" are not required to make a great game...

I feel like the industry, now more than ever, is willing to discuss the dangers of crunch culture and solutions to eradicate it. But lavishing praise on the way The Last of Us Part II was directed feels like a tacit endorsement of crunch and only serves to push that conversation to the backburner again. A popular online statement, first coined by Fanbyte podcast producer Jordan Mallory, says, "I want shorter games with worse graphics made by people who are paid more to work less and I'm not kidding." The message from all those who share it is clear: No game, not even industry darling The Last of Us Part II, is worth destroying lives to create.

Graphics

NVIDIA Apologizes, 'Walks Back' Threat to Withhold GPUs From Reviewer (techspot.com) 111

This week NVIDIA threatened to stop providing GeForce Founders Edition review units to reviewer Steven Walton, who runs the YouTube channel Hardware Unboxed (and is also an editor/reviewer at TechSpot). NVIDIA had complained "your GPU reviews and recommendations have continued to focus singularly on rasterization performance, and you have largely discounted all of the other technologies we offer gamers. It is very clear from your community commentary that you do not see things the same way that we, gamers, and the rest of the industry do."

NVIDIA's email to Walton had said that henceforward their review products would instead be allocated to other media outlets "that recognize the changing landscape of gaming and the features that are important to gamers and anyone buying a GPU today, be it for gaming, content creation, or studio and stream."

But TechSpot reports tonight that "Less than 48 hours later, Steve received the good news. Nvidia apologized and walked everything back." Great news indeed, but let's be clear: this wouldn't have happened if not for the support of the community at large and key people in the tech space with such enormous influence that it was too much for Nvidia to ignore. Linus from LinusTechTips (his angry rant on the WAN Show is pure gold) and Steve from Gamers Nexus were two of those people.
And unfortunately, by then TechSpot had already composed a scathing takedown of NVIDIA's email: As a corporation, it's Nvidia's prerogative to decide on the reviewers it chooses to collaborate with. However, this and other related incidents raise serious questions around journalistic independence and what they are expecting of reviewers when they are sent products for an unbiased opinion...

In today's dynamic graphics hardware space, with 350W flagships, hardware ray tracing, and exotic cooling solutions, there's a wide range of data points Hardware Unboxed looks at. But at the end of the day, there's only one real question every GPU buyer wants to know: how well do games run on a particular piece of hardware? Considering that 99 percent of Steam games feature raster-only rendering pipelines, rasterization performance was, is, and will be a key point that Steve considers in GPU reviews...

[M]ost games (including almost all RTX titles) are built on raster renderers. A hypothetical graphics card with most of its die space reserved for ray tracing would run Quake II RTX great and... not much else. Ray tracing absolutely deserves a place in modern GPU reviews. But there's simply not enough of it in enough games for any responsible reviewer to put it center-stage, in place of raster performance. It wouldn't do justice to consumers, who will primarily be running raster workloads. This is why Nvidia's complaint is so puzzling.

Games

Inside the Obsessive World of Miniature Arcade Machine Makers (wired.co.uk) 25

The success of Nintendo's diminutive gadget led to a flurry of copycats, from a tiny Commodore 64 to a miniaturised Sony PlayStation. Some were good; many were flawed, with the play experience only being surface deep. Fortunately, some companies wanted to go further than fashioning yet another miniature plug-and-play TV console. From a report: One, the ZX Spectrum Next, brought into being a machine from an alternate universe in which Sinclair was never sold to Amstrad and instead built a computer to take on the might of the Amiga and Atari ST. Two other companies headed further back into gaming's past and set themselves an equally ambitious challenge: recreating the exciting, noisy, visually arresting classic cabinets you once found in arcades. "I always saw them as more than just a game, with their unique shapes, art, sounds and lights acting together to lure money from your pocket," explains Matt Precious, managing partner at Quarter Arcades creators Numskull Designs. "I was disappointed you couldn't purchase models of these machines during a time when physical items like LPs were booming in an increasingly sterile world of digital downloads."

Quarter Arcades was subsequently born as a project "trying to capture a piece of gaming history" in quarter-scale cabinets. The machines are in exact scale, including the controls, and play the original arcade ROMs. But look closer and there's an obsessive level of detail: the rough texture of the control panel art; mimicking an original cab's acoustics by careful speaker positioning; recreating the Space Invaders 'Pepper's ghost' illusion effect where graphics 'float' above an illuminated backdrop -- all realised by dismantling and reverse-engineering original cabs.

AI

'Cyberpunk 2077' Finally Shows What DLSS Is Good For (vice.com) 69

An anonymous reader shares a report: More recent Nvidia graphics cards have a proprietary feature called Deep Learning Super Sampling (DLSS), and while it's often been touted as a powerful new rendering tool, the results have sometimes been underwhelming. Some of this is down to the oddly mixed messaging around how DLSS was rolled out: it only works on more recent Nvidia cards that are still near the cutting edge of PC graphics hardware... yet DLSS is designed to render images at lower resolutions and display them as if they were rendered natively at a higher resolution. If you had just gotten a new Nvidia card and were excited to see what kind of framerates and detail levels it could sustain, what DLSS actually did sounded counterintuitive. Even games like Control, whose support of DLSS was especially praised, left me scratching my head about why I would want to use the feature. On my 4K TV, Control looked and ran identically well with and without DLSS, so why wouldn't I just max out my native graphics settings rather than use a fancy upscaler? Intellectually, I understood that DLSS could produce similarly great-looking images without taxing my hardware as much, but I neither fully believed it, nor had I seen a game where the performance gain was meaningful.

Cyberpunk 2077 converted me. DLSS is a miracle, and without it there's probably no way I would ever have been happy with my graphics settings or the game's performance. I have a pretty powerful video card, an RTX 2080 TI, but my CPU is an old i5 overclocked to about 3.9 GHz and it's a definite bottleneck on a lot of games. Without DLSS, Cyberpunk 2077 was very hard to get running smoothly. The busiest street scenes would look fine if I were in a static position, but a quick pan with my mouse would cause the whole world to stutter. If I was walking around Night City, I would get routine slow-downs. Likewise, sneaking around and picking off guards during encounters was all well and good but the minute the bullets started flying, with grenades exploding everywhere and positions changing rapidly, my framerate would crater to the point where the game verged on unplayable. To handle these peaks of activity, I had to lower my detail settings way below what I wanted, and what my hardware could support for about 80 percent of my time with the game. Without DLSS, I never found a balance I was totally happy with. The game neither looked particularly great, nor did it run very well. DLSS basically solved this problem for me. With it active, I could run Cyberpunk at max settings, with stable framerates in all but the busiest scenes.

PlayStation (Games)

Is Sony Developing a Dual-GPU PS5 Pro? (collider.com) 60

According to a Sony patent spotted by T3, the console maker may be working on a new PlayStation 5 with two graphics cards. From the report: The patent describes a "scalable game console" where "a second GPU [is] communicatively coupled to the first GPU" and that the system is for "home console and cloud gaming" usage. To us here at T3 that suggests a next-gen PlayStation console, most likely a new PS5 Pro flagship, supercharged with two graphics cards instead of just one. These would both come in the same APU (accelerated processing unit) format as the PlayStation 5's system-on-a-chip (SoC), with two custom-made AMD APUs working together to deliver enhanced gaming performance and cloud streaming.

The official Sony patent notes that "plural SoCs may be used to provide a 'high-end' version of the console with greater processing and storage capability," while "the 'high end' system can also contain more memory such as random-access memory (RAM) and other features and may also be used for a cloud-optimized version using the same game console chip with more performance." And, with the PlayStation 5 console only marginally weaker on paper than the Xbox Series X (the PS5 delivers 10.28 teraflops compared to the Xbox Series X's 12 teraflops), a new PS5 Pro console that comes with two APUs rather than one, improving local gaming performance as well as cloud gaming, would no doubt be a death blow to the Xbox Series X's reign as king of the next-gen consoles.

The cloud gaming part of the patent is particularly interesting, too, as it seems to suggest that this technology could find itself not just in a new flagship PS5 Pro console, but also in more streamlined cloud-based hardware. An upgraded PS5 Digital Edition seems a smart bet, as does the much-rumored PSP 5G. [...] Will we see a PS5 Pro anytime soon? Here at T3 we think absolutely not -- we imagine we'll get at least two straight years of PS5 before we see anything at all. As for a cloud-based next-gen PSP 5G, though...

Entertainment

'Code Switch' From NPR Is Apple's Podcast of the Year (engadget.com) 48

Apple has picked "Code Switch" as the best audio show of the year, marking the first time the company has recognized a single podcast in this way. Engadget reports: Code Switch is NPR's weekly discussion on race. While the series has been on the air for the better part of seven years, it became significantly more popular over the summer as people across the US took to the streets to protest the death of George Floyd and other instances of racial injustice.

As in past years, the company also shared a selection of the most popular audio shows people listened to through Apple Podcasts. Few surprises here as old favorites like Stuff You Should Know, This American Life and The Daily came out as the most popular shows in the US. When it comes to shows new to 2020, Unlocking Us, Nice White Parents and CounterClock made the top three for the year. Apple's editorial team had their say as well. They picked California Love, Canary by the Washington Post and Dying for Sex as their favorites of 2020. If you're looking for something new to listen to, all three lists are a good place to start.

Hardware

NVIDIA Launches GeForce RTX 3060 Ti, Sets a New Gaming Performance Bar At $399 (hothardware.com) 70

MojoKid writes: NVIDIA expanded its line-up of Ampere-based graphics cards today with a new lower cost GeForce RTX 3060 Ti. As its name suggests, the new $399 NVIDIA GPU supplants the previous-gen GeForce RTX 2060 / RTX 2060 Super, and slots in just behind the recently-released GeForce RTX 3070. The GeForce RTX 3060 Ti features 128 CUDA cores per SM (4,864 in total), 4 Third-Gen Tensor cores per SM (152 total), and 38 Second-Gen RT cores. The GPU has a typical boost clock of 1,665MHz and it is linked to 8GB of standard GDDR6 memory (not the GDDR6X of the RTX 3080/3090) via a 256-bit memory interface that offers up to 448GB/s of peak bandwidth. In terms of overall performance, the RTX 3060 Ti lands in the neighborhood of the GeForce RTX 2080 Super, and well ahead of cards like AMD's Radeon RX 5700 XT. The GeForce RTX 3060 Ti's 8GB frame buffer may give some users pause, but for 1080p and 1440p gaming, it shouldn't be a problem for the overwhelming majority of titles. It's also par for the course in this $399 price band. Cards are reported to be shipping in retail tomorrow.

Graphics

Cerebras' Wafer-Size Chip Is 10,000 Times Faster Than a GPU (venturebeat.com) 123

An anonymous reader quotes a report from VentureBeat: Cerebras Systems and the federal Department of Energy's National Energy Technology Laboratory today announced that the company's CS-1 system is more than 10,000 times faster than a graphics processing unit (GPU). On a practical level, this means AI neural networks that previously took months to train can now train in minutes on the Cerebras system.

Cerebras makes the world's largest computer chip, the WSE. Chipmakers normally slice a wafer from a 12-inch-diameter ingot of silicon to process in a chip factory. Once processed, the wafer is sliced into hundreds of separate chips that can be used in electronic hardware. But Cerebras, started by SeaMicro founder Andrew Feldman, takes that wafer and makes a single, massive chip out of it. Each piece of the chip, dubbed a core, is interconnected in a sophisticated way to other cores. The interconnections are designed to keep all the cores functioning at high speeds so the transistors can work together as one. [...] A single Cerebras CS-1 is 26 inches tall, fits in one-third of a rack, and is powered by the industry's only wafer-scale processing engine, Cerebras' WSE. It combines memory performance with massive bandwidth, low latency interprocessor communication, and an architecture optimized for high bandwidth computing.

Cerebras's CS-1 system uses the WSE wafer-size chip, which has 1.2 trillion transistors, the basic on-off electronic switches that are the building blocks of silicon chips. Intel's first 4004 processor in 1971 had 2,300 transistors, and the Nvidia A100 80GB chip, announced yesterday, has 54 billion transistors. Feldman said in an interview with VentureBeat that the CS-1 was also 200 times faster than the Joule Supercomputer, which is No. 82 on a list of the top 500 supercomputers in the world. [...] In this demo, the Joule Supercomputer used 16,384 cores, and the Cerebras computer was 200 times faster, according to energy lab director Brian Anderson. Cerebras costs several million dollars and uses 20 kilowatts of power.

Graphics

Radeon RX 6800 and 6800 XT Performance Marks AMD's Return To High-End Graphics (hothardware.com) 62

MojoKid writes: AMD officially launched its Radeon RX 6800 and Radeon RX 6800 XT graphics cards today, previously known in the PC gaming community as Big Navi. The company claimed these high-end GPUs would compete with NVIDIA's best GeForce RTX 30 series and it appears AMD made good on its claims. AMD's new Radeon RX 6800 XT and Radeon RX 6800 are based on the company's RDNA 2 GPU architecture, with the former sporting 72 Compute Units (CUs) and a 2250MHz boost clock, while the RX 6800 sports 60 CUs at a 2105MHz boost clock. Both cards come equipped with 16GB of GDDR6 memory and 128MB of on-die cache AMD calls Infinity Cache, which improves effective bandwidth and latency by sitting in front of the GPU's 256-bit GDDR6 memory interface.

In the benchmarks, it is fair to say the Radeon RX 6800 is typically faster than an NVIDIA GeForce RTX 3070, just as AMD suggested. Things are not as cut and dried for the Radeon RX 6800 XT, though: the GeForce RTX 3080 and Radeon RX 6800 XT trade victories depending on the game title or workload, but the RTX 3080 has an edge overall. In DXR ray tracing performance, NVIDIA has a distinct advantage at the high end. Though the Radeon RX 6800 wasn't too far behind an RTX 3070, neither the Radeon RX 6800 XT nor the RX 6800 came close to the GeForce RTX 3080. Pricing is set at $649 and $579 for the AMD Radeon RX 6800 XT and Radeon RX 6800, respectively, and the cards are on sale as of today. However, demand is likely to be fierce, as this new crop of high-end graphics cards from both companies has been quickly evaporating from retail shelves.

Desktops (Apple)

Apple's M1 Is Exceeding Expectations (extremetech.com) 274

Reviews are starting to pour in of Apple's MacBook Pro, MacBook Air and Mac Mini featuring the new M1 ARM-based processor -- and they're overwhelmingly positive. "As with the Air, the Pro's performance exceeds expectations," writes Nilay Patel via The Verge.

"Apple's next chapter offers strong performance gains, great battery and starts at $999," says Brian Heater via TechCrunch.

"When Apple said it would start producing Macs with its own system-on-chip processors, custom CPU and GPU silicon (and a bunch of other stuff) to replace parts from Intel and AMD, we figured it would be good. I never expected it would be this good," says Jason Cross in his review of the MacBook Air M1.

"The M1 is a serious, serious contender for one of the all-time most efficient and highest-performing architectures we've ever seen deploy," says ExtremeTech's Joel Hruska.

"Spending a few days with the 2020 Mac mini has shown me that it's a barnburner of a miniature desktop PC," writes Chris Welch via The Verge. "It outperforms most Intel Macs in several benchmarks, runs apps reliably, and offers a fantastic day-to-day experience whether you're using it for web browsing and email or for creative editing and professional work. That potential will only grow when Apple inevitably raises the RAM ceiling and (hopefully) brings back those missing USB ports..."

"Quibbling about massively parallel workloads -- which the M1 wasn't designed for -- aside, Apple has clearly broken the ice on high-performance ARM desktop and laptop designs," writes Jim Salter via Ars Technica. "Yes, you can build an ARM system that competes strongly with x86, even at very high performance levels."

"The M1-equipped MacBook Air now packs far better performance than its predecessors, rivaling at times the M1-based MacBook Pro. At $999, it's the best value among macOS laptops," concludes PCMag.

"For developers, the Apple Silicon Macs also represent the very first full-fledged Arm machines on the market that have few-to-no compromises. This is a massive boost not just for Apple, but for the larger Arm ecosystem and the growing Arm cloud-computing business," writes Andrei Frumusanu via AnandTech. "Overall, Apple hit it out of the park with the M1."
