The Internet

Jagex Nixes Community-Built RuneScape HD Client, Massive Backlash Follows (runescape.com) 22

New submitter Sauce Tin writes: In a blog post, Jagex announced the shutdown of a community-driven RuneScape HD graphics client. The announcement came at an inopportune time -- the community client was prepped for release this week, had been announced years beforehand, and represented 2,000+ hours of effort by a single individual. Jagex had taken notice of the project but raised no objection -- until recently. Thousands of players vented on the game's subreddit, ultimately reaching the top of r/all. Jagex has a history of infuriating its player base over the years, including the removal of free trade, PvP combat, and LGBT holiday content.
Hardware

ASUS Bets on OLED for All of Its New Creator Laptops (engadget.com) 93

ASUS has just four letters to sell you on its latest creator-focused notebooks: OLED. From a report: The company is bringing OLED screens to all of its new models, a move meant to differentiate them in the increasingly crowded PC market. Compared to traditional LCD screens, OLED offers deeper black levels, vastly better contrast, and more responsiveness. Even today, as LCDs have evolved to be brighter and faster, OLED offers a more pronounced visual "pop." We've been seeing notebooks with OLED for years, like on the XPS 15 and ZenBook, but they've typically been positioned as a premium feature for select models. Now ASUS is trying to make its name synonymous with OLED, so much so that it's bringing it to new mid-range notebooks like the VivoBook Pro 14X and 16X. It's also touting the first 16-inch 4K OLED HDR screens on notebooks across several models: the ProArt Studiobook Pro, ProArt Studiobook and the Vivobook Pro.

Befitting its name, you can expect to see the fastest hardware on the market in the StudioBook Pro 16 OLED (starting at $2,500). It'll be powered by H-series Ryzen 5000 processors, 3rd-gen Intel Xeon chips and NVIDIA's professional-grade RTX A2000 and A5000 GPUs. And if you don't need all of that power, there's also the Studiobook 16 OLED ($2,000), which has the same Ryzen chips, Intel Core i7 CPUs and either RTX 3070 or 3060 graphics. Both notebooks will be equipped with 4K OLED HDR screens that reach up to 550 nits and cover 100 percent of the DCI-P3 color gamut. They'll also sport ASUS Dial, a new rotary accessory located at the top of their trackpads, offering similar functionality to Microsoft's forgotten Surface Dial.

Chrome

Chrome 94 Beta Adds WebGPU API With Support For Apple's Metal (9to5mac.com) 36

An anonymous reader quotes a report from 9to5Mac, written by Filipe Esposito: Google this week announced the beta release of Chrome 94, the next update to Google's desktop web browser. In addition to general improvements, the update also adds support for the new WebGPU API, which is intended to replace WebGL and can even access Apple's Metal API. As described by Google in a blog post, WebGPU is a new, more advanced graphics API for the web that is able to access GPU hardware, resulting in better performance for rendering interfaces in websites and web apps.

For those unfamiliar, Metal is an API introduced by Apple in 2014 that provides low-level access to GPU hardware for iOS, macOS, and tvOS apps. In other words, apps can access the GPU without overloading the CPU, which is one of the limitations of old APIs like OpenGL. Google says WebGPU is not expected to come enabled by default for all Chrome users until early 2022. The final release of Chrome 94 should enable WebCodecs for everyone, which is another API designed to improve the encoding and decoding of streaming videos.
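For context, here is a minimal sketch of what requesting a GPU device looks like through WebGPU's JavaScript interface, written in TypeScript against the standardized API (which may differ slightly from the surface Chrome 94 exposed during its origin trial). The canvas element, the @webgpu/types definitions, and the function name are assumptions for illustration, not code from Google's blog post.

```typescript
// Minimal WebGPU bootstrap sketch (assumes a <canvas> on the page and the
// @webgpu/types definitions; API surface per the standardized spec).
async function initWebGPU(canvas: HTMLCanvasElement): Promise<GPUDevice> {
  if (!navigator.gpu) {
    throw new Error("WebGPU is not available in this browser");
  }
  // The adapter is the browser's handle to a physical GPU; on macOS,
  // Chrome backs this with Apple's Metal API.
  const adapter = await navigator.gpu.requestAdapter();
  if (!adapter) {
    throw new Error("No suitable GPU adapter found");
  }
  // The device is the logical connection used to create buffers,
  // pipelines, and command encoders.
  const device = await adapter.requestDevice();
  const context = canvas.getContext("webgpu") as GPUCanvasContext;
  context.configure({
    device,
    format: navigator.gpu.getPreferredCanvasFormat(),
  });
  return device;
}
```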

PlayStation (Games)

Emulator Runs PS1 Games in 4K on the New Xboxes (inputmag.com) 13

Duckstation, an emulator that allows users to run PlayStation games, was recently made available for installation on the latest generation of Xbox consoles. From a report: It's time to jog those nostalgia muscles, as the emulator will not only be able to play your PS1 favorites but also scale those games up to native 4K resolution at 60fps. In addition to the 4K treatment, Duckstation will let gamers improve the overall look of the emulation experience in a couple of other ways.

One setting disables dithering, an effect that was built into the original PlayStation hardware. Dithering, in layman's terms, was a technique for improving apparent color depth by underpinning graphics with a pattern of lines or dots, which were then blurred by the system's video encoders. A separate setting improves the look of 3D objects by smoothing out their blocky textures. The PS1's original low-poly graphics often looked cruder as they were enlarged, so this function smooths out those clunky compositions.
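To make the dithering idea concrete, here is a toy TypeScript sketch of ordered (Bayer) dithering, the general family of techniques the PS1's hardware pattern belongs to. The 4x4 matrix, the 5-bit target depth, and the function names are illustrative assumptions, not the console's actual implementation.

```typescript
// Toy ordered-dithering sketch: quantize an 8-bit color channel down to
// 5 bits (roughly the PS1's 15-bit framebuffer depth) while adding a
// position-dependent offset, so smooth gradients break up into a fine
// dot pattern instead of visible banding. Illustrative only.
const BAYER_4X4: number[][] = [
  [ 0,  8,  2, 10],
  [12,  4, 14,  6],
  [ 3, 11,  1,  9],
  [15,  7, 13,  5],
];

function ditherChannel(value: number, x: number, y: number): number {
  // Offset spans roughly one quantization step (255 / 31 ~ 8).
  const offset = (BAYER_4X4[y % 4][x % 4] / 16 - 0.5) * 8;
  const adjusted = Math.min(255, Math.max(0, value + offset));
  // Quantize to 32 levels, then scale back to the 0-255 range.
  return Math.round((adjusted / 255) * 31) * (255 / 31);
}

// Example: vertically adjacent pixels with the same input value land on
// different quantized levels, which the eye averages into extra depth.
console.log(ditherChannel(94, 0, 0), ditherChannel(94, 0, 1));
```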

Transportation

Older Tesla Vehicles To Get UI Performance Boost Thanks To Famed Video Game Engineer (electrek.co) 86

Tesla is working with famed video game engineer John Carmack to improve the interface performance in older vehicles. Electrek reports: Carmack is a legend in the video game world and in the broader computer science industry. He made important advancements in 3D computer graphics and was the lead programmer on game-changing video games like Doom and Quake. Later in his career, he focused his talents on virtual reality and became CTO of Oculus. More recently, he stepped down from his role at Oculus to focus on general artificial intelligence. In the 2000s, Carmack also had an interest in rocketry and started Armadillo Aerospace.

Several of these interests overlap with those of Elon Musk, who has a lot of respect for Carmack and has long tried to hire him. While it doesn't sound like Musk has convinced him to come work with him just yet, Carmack confirmed that he is actually working on a Tesla product. Carmack drives a Tesla Model S, and he is working with Tesla engineers to improve interface performance: "I did kind of volunteer to help them fix what I consider very poor user interface performance on the older model S (that I drive). Their engineers have been sharing data with me." Tesla has had performance issues with the older media control unit found in earlier Model S vehicles. The automaker offers a media computer upgrade to improve performance, but you're stuck if you don't want to pay $2,500 for it.

Intel

45 Teraflops: Intel Unveils Details of Its 100-Billion Transistor AI Chip (siliconangle.com) 16

At its annual Architecture Day semiconductor event Thursday, Intel revealed new details about its powerful Ponte Vecchio chip for data centers, reports SiliconANGLE: Intel is looking to take on Nvidia Corp. in the AI silicon market with Ponte Vecchio, which the company describes as its most complex system-on-chip or SOC to date. Ponte Vecchio features some 100 billion transistors, nearly twice as many as Nvidia's flagship A100 data center graphics processing unit. The chip's 100 billion transistors are divided among no fewer than 47 individual processing modules made using five different manufacturing processes. Normally, an SOC's processing modules are arranged side by side in a flat two-dimensional design. Ponte Vecchio, however, stacks the modules on one another in a vertical, three-dimensional structure created using Intel's Foveros technology.

The bulk of Ponte Vecchio's processing power comes from a set of modules aptly called the Compute Tiles. Each Compute Tile has eight Xe cores, GPU cores specifically optimized to run AI workloads. Every Xe core, in turn, consists of eight vector engines and eight matrix engines, processing modules specifically built to run the narrow set of mathematical operations that AI models use to turn data into insights... Intel shared early performance data about the chip in conjunction with the release of the technical details. According to the company, early Ponte Vecchio silicon has demonstrated performance of more than 45 teraflops, or about 45 trillion operations per second.

The article adds that it achieved those speeds while processing 32-bit single-precision floating-point values -- and that at least one customer has already signed up to use Ponte Vecchio. The Argonne National Laboratory will include Ponte Vecchio chips in its upcoming $500 million Aurora supercomputer. Aurora will provide one exaflop of performance when it becomes fully operational, the equivalent of a quintillion calculations per second.
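As a rough illustration of the "narrow set of mathematical operations" those vector and matrix engines are built to accelerate, the sketch below shows a plain multiply-accumulate over matrices in TypeScript. The code and its dimensions are purely illustrative and bear no relation to Intel's actual engine design.

```typescript
// Illustrative multiply-accumulate (C += A * B), the core operation that
// matrix engines in AI accelerators are built to run at enormous rates.
// Plain nested loops here; real hardware performs many of these fused
// multiply-adds per clock across wide tiles.
function matmulAccumulate(
  A: number[][],
  B: number[][],
  C: number[][]
): void {
  const m = A.length;
  const k = B.length;
  const n = B[0].length;
  for (let i = 0; i < m; i++) {
    for (let j = 0; j < n; j++) {
      let acc = C[i][j];
      for (let p = 0; p < k; p++) {
        acc += A[i][p] * B[p][j]; // one fused multiply-add
      }
      C[i][j] = acc;
    }
  }
}
```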
Businesses

Laptop Shortage is Easing as Pandemic Demand Wanes (bloomberg.com) 17

Since early in the pandemic, soaring demand for consumer electronics led to persistent chip shortages. Some recent signs suggest the situation may finally be starting to change. From a report: An executive at the memory chip maker Micron Technology said last week at an investor conference that demand for consumer PCs is slowing and that some of its customers have more chips lying around. A day later, Morgan Stanley downgraded several chip stocks in a note titled "Winter is Coming." The analysts said PC inventory is rising and that the smartphone market is likely to experience similar deterioration. An old investor maxim says technology companies tend to handily outperform during cyclical upswings while the reverse is true on the downside. Well, the industry is beginning to fall short of estimates.

Global PC shipments grew by 13% in the second quarter, according to research firm IDC. That was below Evercore ISI's expectation of 18% and a big deceleration from the 55% rise in the first quarter. Furthermore, wireless router manufacturer Netgear Inc. gave disappointing guidance last month, adding that sales were worse than expected in its consumer networking category. Still, it's probably too soon to declare the shortage over. Outbreaks of the delta variant and questions about the long-term efficacy of vaccines make predictions even harder than usual. Some chip analysts have said reports of weakness are primarily seasonal and that sales will pick up through next year. Shortages also vary by part. So even if you can walk into a store and find plenty of laptops, you'll still struggle to get a new car or a video game console.

In some cases, chip delivery times are longer than 20 weeks, the longest wait in at least four years. But as I wrote last month, the pandemic rush to computers and printers won't repeat itself. Once a worker or student buys a laptop, they don't need another one for several years. Retailers are offering extensive discounts on nearly every PC-related category, with the exception of graphics cards. (It's still a good time to be in the games business.) The waning demand for PCs will likely last for at least several more quarters.

Intel

Intel Enters the PC Gaming GPU Battle With Arc 92

Dave Knott writes: Intel is branding its upcoming consumer GPUs as Intel Arc. The new Arc brand will cover both the hardware and software powering Intel's high-end discrete GPUs, as well as multiple hardware generations. The first of those, known previously as DG2, is expected to arrive under the codename "Alchemist" in Q1 2022. Intel's Arc GPUs will be capable of mesh shading, variable rate shading, video upscaling, and real-time ray tracing. Most importantly, Intel is also promising AI-accelerated super sampling, which suggests Intel has its own competitor to Nvidia's Deep Learning Super Sampling (DLSS) technology.
IBM

The IBM PC Turns 40 (theregister.com) 117

The Register's Richard Speed commemorates the 40th anniversary of the introduction of the IBM Model 5150: IBM was famously late to the game when the Model 5150 (or IBM PC) put in an appearance. The likes of Commodore and Apple pretty much dominated the microcomputer world as the 1970s came to a close and the 1980s began. Big Blue, on the other hand, was better known for its sober, business-orientated products and its eyewatering price tags. However, as its customers began eying Apple products, IBM lumbered toward the market, creating a working group that could dispense with the traditional epic lead-times of Big Blue and take a more agile approach. A choice made was to use off-the-shelf hardware and software and adopt an open architecture. A significant choice, as things turned out.

Intel's 8088 was selected over the competition (including IBM's own RISC processor) and, famously, Microsoft was tapped to provide PC DOS as well as the BASIC included in ROM. So this marks the 40th anniversary of PC DOS, aka MS-DOS, too. You can find Microsoft's old MS-DOS source code here. The basic price for the 5150 was $1,565, with a fully loaded system rising to more than $3,000. Users could enjoy high-resolution monochrome text via the MDA card or some low-resolution graphics (and vaguely nauseating colors) through a CGA card (which could be installed simultaneously). RAM landed in 16 or 64kB flavors and could be upgraded to 256kB, while the Intel 8088 CPU chugged along at 4.77 MHz.

Storage came courtesy of up to two 5.25" floppy disks, and the ability to attach a cassette recorder -- an option swiftly stripped from later models. There was no hard disk, and adding one presented a problem for users with deep enough pockets: the motherboard and software didn't support it and the power supply was a bit weedy. IBM would resolve this as the PC evolved. Importantly, the motherboard also included slots for expansion, which eventually became known as the Industry Standard Architecture (ISA) bus as the IBM PC clone sector exploded. IBM's approach resulted in an immense market for expansion cards and third party software.
While the Model 5150 "sold like hotcakes," Speed notes that it was eventually discontinued in 1987.
AI

Nvidia Reveals Its CEO Was Computer Generated in Keynote Speech (vice.com) 61

Graphics processor company Nvidia showcased its prowess at computer animation by sneaking a virtual replica of its CEO into a keynote speech. From a report: On Wednesday, Nvidia revealed in a blog post that its CEO Jensen Huang did not do the keynote presentation at the company's GTC conference in April. At least part of it was actually led by a virtual replica of Huang, created by digitizing Huang with a truck full of DSLR cameras, and then animating him with the help of an AI, according to the company.

Huang's kitchen, which has become Nvidia's venue for speaking to customers and investors since the beginning of the pandemic, was also entirely computer generated. It's not clear exactly which part of the keynote speech features CGI Huang (which is what makes the replica so impressive), but if you jump to this part of the presentation you can see Huang magically disappear and his kitchen explode into multiple different 3D models.

Hardware

Wear OS Is Getting a Multi-Generational Leap In Power Thanks To Samsung (arstechnica.com) 37

An anonymous reader quotes a report from Ars Technica: Google is cooking up the first major Wear OS release since 2018, and Samsung is abandoning Tizen for smartwatches and going all-in on Wear OS with the Galaxy Watch 4. Last night, Samsung took the wraps off the main SoC for the Galaxy Watch 4, and compared to what Wear OS usually gets, Samsung is shipping a beast of an SoC. The "Samsung Exynos W920" will be a multi-generational leap in performance for Wear OS. Samsung says this is a 5 nm chip with two ARM Cortex A55 cores and an ARM Mali-G68 GPU. For the always-on display mode, there's an additional Cortex M55 CPU, which can keep the watch face ticking along while using minimal power. There's also an integrated LTE modem for on-the-go connectivity.

Compared to Samsung's previous smartwatch chip, the Tizen-only Exynos 9110 (10 nm, 2x Cortex A53), the company is promising "around 20 percent" better CPU performance and "ten times better graphics performance." Remember that the Exynos 9110 is from 2018, so those comparative numbers are inflated, but at 5 nm, this is a more modern chip than Wear OS has ever seen. Wear OS has suffered for years at the hands of Qualcomm, which has been starving the ecosystem of quality SoCs for wearables. Most people's last experience with Wear OS is the Snapdragon Wear 2100 or 3100 SoCs, both of which were ancient Cortex A7 CPUs built on a 28 nm process. Qualcomm introduced a slightly more modern chip, the Wear 4100, in 2020 (a Cortex A53-based, 12 nm chip), but a year later almost no manufacturers have actually shipped it, and we're still getting Wear 3100 launches today. Qualcomm's answer to Samsung's chip will be the Wear 5100, which isn't due until 2022.
We should know more about Wear OS 3.0 tomorrow when Samsung holds its Aug. 11 "Unpacked" event. Not only are they expected to reveal the Galaxy Watch 4 and the big Wear OS revamp, but they're planning to launch at least two new foldable smartphones -- the Galaxy Z Fold 3 and Galaxy Z Flip 3.
Graphics

AMD's Radeon RX 6600 XT Launched To Compete Against NVIDIA GeForce RTX 3060 (hothardware.com) 21

MojoKid writes: AMD officially unveiled the Radeon RX 6600 XT in late July, but the cards officially launched today, aimed at 1080p gaming. In its review, HotHardware looks at two PowerColor cards: a high-end Radeon RX 6600 XT Red Devil and its somewhat more mainstream "Fighter" branded counterpart. Whereas AMD's reference Radeon RX 6600 XT offers a Game clock of up to 2359MHz and a Boost clock of 2589MHz, the PowerColor Red Devil peaks at 2428MHz (Game) and 2607MHz (Boost). Those higher GPU clocks result in higher compute performance, fillrate, etc., but the memory configuration and frequency are the same -- so in memory bandwidth constrained situations, performance won't be all that much different.

Performance-wise, with most game titles that use traditional rasterization, the Radeon RX 6600 XT is clearly faster than the GeForce RTX 3060 and previous-gen cards like the Radeon RX 5700 XT or GeForce RTX 2060 Super. However, when you factor ray tracing into the equation, NVIDIA has a distinct and significant advantage still. The Radeon RX 6600 XT Fighter should sell for at or close to its $379 MSRP. PowerColor says that they should be readily available for gamers to purchase today.

Movies

Pixar Co-founder Shares 'the Real Story of Pixar' (ieee.org) 41

Alvy Ray Smith cofounded Pixar. He was the first director of computer graphics at Lucasfilm and the first graphics fellow at Microsoft. He has received two technical Academy Awards for his contributions to digital movie-making technology.

This week he shared "The Real Story of Pixar" in an article in IEEE Spectrum that Slashdot reader Tekla Perry says "corrects some of the things legend got wrong, and gives a fast tour of computer graphics history, as he talks about the key computer graphics breakthroughs that led up to Pixar and how Moore's Law saved the company." It starts in 1980, when Smith was part of a team hired by Lucasfilm to create its Computer Division: This division was charged with computerizing editing, sound design and mixing, special effects, and accounting for the company's books, as if this fourth challenge would be as difficult as the other three. Ed Catmull, who led the Computer Division, made me head of the Computer Graphics Group, which was tasked with the special-effects project. At Lucasfilm, we continued to develop the software needed for three-dimensional computer-generated movies. And we worked on specialized hardware as well, designing a computer, called the Pixar Image Computer, that could run its calculations four times as fast as comparable general-purpose systems — but only for pixels. We were still waiting for Moore's Law to get general computers to where we needed them — it did, but this strategy gave us a boost for a few years.

We didn't get one of our fully computer-generated movie sequences into a major motion picture until 1982, with our one-minute "Genesis" sequence in Star Trek II: The Wrath of Khan. It showed a bare planet catching on fire, melting, and then forming mountains and seas and green forests. We followed that groundbreaking piece of a movie with a brief sequence in Return of the Jedi in 1983, featuring a "hologram" of the Death Star... But then our Computer Graphics Group, now numbering 40 people, got the news that the Computer Division was on the chopping block.

Then Smith continues the story with an excerpt from his new book, "A Biography of the Pixel." ("We did have a prototype special-purpose computer, the Pixar Image Computer. So Ed and I wrote up a business plan to build and sell Pixar Image Computers, calling them 'supercomputers for pixels'...") 35 venture capital firms turned them down, as did 10 corporations where they'd proposed a "strategic partnership." Finally, they made a desperate pitch to Steve Jobs: Steve, who had just been ousted from Apple, proposed that he buy us from Lucasfilm and run us as his next company. We said no, that we wanted to run the company ourselves, but we would accept his money in the form of a venture investment. And he agreed...

Pixar was a lousy hardware company. We failed several times over our first five years. That's failure measured the usual way: We ran out of money and couldn't pay our bills or our employees. If we'd had any other investor than Steve, we would have been dead in the water. But at every failure — presumably because Steve couldn't sustain the embarrassment that his next enterprise after the Apple ouster would be a failure — he'd berate those of us in management . . . then write another check. And each check effectively reduced employee equity. After several such "refinancings," he had poured about $50 million (half of the fortune he had made from Apple) into Pixar. In today's money, that's more than $100 million. On March 6, 1991, in Pixar's fifth year, he finally did buy the company from the employees outright.

The company was still in financial trouble — but expanding computing power eventually made it possible to render an entire full-length movie, and Disney financed the years of production necessary for the 1995 movie Toy Story. But even before its release, Steve Jobs "took Pixar public on November 29, 1995, on nothing more than the promise of Toy Story.

"It salvaged his reputation and made him a billionaire."

The article's subheading? "How a bad hardware company turned itself into a great movie studio."
AI

Self-Driving Car Startup Wants to Spare AI From Making Life-or-Death Decisions (washingtonpost.com) 134

Instead of having AI in a self-driving car decide whether to kill its driver or pedestrians, the Washington Post reports there's a new philosophy gaining traction: Why not stop cars from getting into life-or-death situations in the first place? (Alternate URL): After all, the whole point of automated cars is to create road conditions where vehicles are more aware than humans are, and thus better at predicting and preventing accidents. That might avoid some of the rare occurrences where human life hangs in the balance of a split-second decision... Whom to kill or injure probably isn't a decision you'd like to leave up to your car, or the company manufacturing it, anytime soon. That's the thinking now about advanced AI: It's supposed to prevent the scenarios that lead to crashes, making the choice of who's to die one that the AI should never have to face.

Humans get distracted by texting, while cars don't care what your friends have to say. Humans might miss objects obscured by their vehicle's blind spot. Lidar can pick those things up, and 360 cameras should work even if your eyes get tired. Radar can bounce around from one vehicle to the next, and might spot a car decelerating up ahead faster than a human can... [Serial entrepreneur Barry] Lunn is the founder and CEO of Provizio, an accident-prevention technology company. Provizio's secret sauce is a "five-dimensional" vision system made up of high-end radar, lidar and camera imaging. The company builds an Intel vision processor and Nvidia graphics processor directly onto its in-house radar sensor, enabling cars to run machine-learning algorithms directly on the radar sensor. The result is a stack of perception technology that sees farther and wider, and processes road data faster than traditional autonomy tech, Lunn says. Swift predictive analytics gives vehicles and drivers more time to react to other cars.

The founder has worked in vision technology for nearly a decade, previously collaborating with NASA, General Motors and Boeing at the radar company Arralis, which Lunn sold in 2017. The start-up is in talks with big automakers, and its vision has a strong team of trailblazers behind it, including Scott Thayer and Jeff Mishler, developers of early versions of autonomous tech for Google's Waymo and Uber... Lunn thinks the auto industry prematurely pushed autonomy as a solution, long before it was safe or practical to remove human drivers from the equation. He says AI decision-making will play a pivotal role in the future of auto safety, but only after it has been shown to reduce the issues that lead to crashes. The goal is to get the tech inside passenger cars so that the system can learn from human drivers, and understand how they make decisions before allowing the AI to decide what happens in specified instances.

Transportation

Edmunds Reviews Ford's BlueCruise Hands-Free Driving Technology (edmunds.com) 52

An anonymous reader quotes a report from Edmunds, written by Cameron Rogers: BlueCruise operates on the same principle as Super Cruise. Once the vehicle is traveling on one of the 100,000 miles of qualified roadways (Ford dubs these Hands-Free Blue Zones) and certain conditions have been met, a graphic appears in the instrument panel to let you know that BlueCruise is ready for activation. Simply press the cruise control button on the steering wheel and you can take your hands off the wheel to let the vehicle drive itself. Like Super Cruise, Ford's BlueCruise system is not autonomous. As the driver, you have to be alert and prepared to take the wheel at any time. BlueCruise will not take evasive action if there is a small obstruction in the road -- a box on the freeway, for instance -- and you must be ready to perform advanced maneuvers if necessary. To that end, BlueCruise includes a head and eye position sensor to make sure you're watching the road ahead. Divert your attention for too long and the system will deactivate. And because BlueCruise relies on clearly visible lane markers, traveling on highway sections that lack them will deactivate the system. The first vehicles to receive BlueCruise functionality will be two of Ford's newest models -- the 2021 Ford Mustang Mach-E and 2021 Ford F-150. In both cases, the BlueCruise hardware is tied to the Ford Co-Pilot360 Active 2.0 package.

I had the opportunity to drive both an F-150 and Mustang Mach-E with BlueCruise, and there was no functional difference in how the system behaved in each vehicle. The system itself melds several driver aids that are already present on the majority of cars today, but with a head- and eye-tracking component that makes sure you're paying attention. Once this is established -- and you're driving on a preapproved road -- a ring will appear around a graphic of the vehicle in the digital instrument panel. This lets you know that BlueCruise is ready to activate. Simply press the cruise control button and acceleration, braking and turning are handed over to BlueCruise. In this way, BlueCruise functions similarly to GM's Super Cruise. The primary difference is that GM vehicles with Super Cruise have an LED light bar integrated into the steering wheel to let you know when Super Cruise can be activated. Ford's system isn't so obvious in letting you know when it's ready. When you press the cruise control button, however, the instrument panel graphics turn blue to inform you that BlueCruise is active and you can take your hands off the wheel.

The other difference between the two competing systems is that GM's Super Cruise has one prescribed distance for the adaptive cruise control (ACC) aspect. Ford has decided to treat BlueCruise like a typical ACC system in which you can choose one of four following distances. When engaged, BlueCruise does a good job at approximating typical human driving behavior. I never had to adjust the following distance from one of the medium settings, and the system gives you a few beats to put your hands on the wheel when it needs you to resume control. I didn't experience many technical issues in either vehicle on my limited test drive, but there was one instance in which I was forced to make an emergency maneuver. A Civic driver with little concern for their personal safety accelerated to merge right in front of my F-150, and the truck didn't slow down quickly enough. This wasn't necessarily a fault of BlueCruise itself -- I have found that ACC systems in general are slow to react to vehicles merging into or out of my lane -- but it goes to show that you still need to have your wits about you at all times.
"Like GM's Super Cruise, Ford's BlueCruise provides a hands-free driving experience on certain limited-access highways," writes Rogers in closing. "It certainly takes some stress out of driving in bumper-to-bumper traffic, and should be similarly pleasant on long-distance road trips. But these are not autonomous systems, and drivers need to be ready to take the wheel at any time to react to changing road conditions."
AMD

AMD Ryzen 5000G Series Launches With Integrated Graphics At Value Price Points (hothardware.com) 69

MojoKid writes: AMD is taking the wraps off its latest integrated processors, the Ryzen 7 5700G and the Ryzen 5 5600G. As their branding suggests, these new products are based on the same excellent AMD Zen 3 core architecture, but with integrated graphics capabilities on board as well, hence the "G" designation. AMD is targeting more mainstream applications with these chips. The Ryzen 7 5700G is an 8-core/16-thread CPU with 4MB of L2 cache and 16MB of L3. Those CPU cores are mated to an 8 CU (Compute Unit) Radeon Vega graphics engine, and it has 24 lanes of PCIe Gen 3 connectivity. The 5700G's base CPU clock is 3.8GHz, with a maximum boost clock of 4.6GHz. The on-chip GPU can boost up to 2GHz, which is a massive uptick from the 1.4GHz of previous-gen 3000-series APUs.

The Ryzen 5 5600G takes things down a notch with 6 CPU cores (12 threads) and a smaller 3MB L2 cache while L3 cache size remains unchanged. The 5600G's iGPU is scaled down slightly as well with only 7 CUs. At 3.9GHz, the 5600G's base CPU clock is 100MHz higher than the 5700G's, but its max boost lands at 4.4GHz with a slightly lower GPU boost clock of 1.9GHz. In the benchmarks, the Ryzen 5 5600G and Ryzen 7 5700G both offer enough multi-threaded muscle for the vast majority of users, often besting similar Intel 11th Gen Core series chips, with highly competitive single-thread performance as well.

Desktops (Apple)

Mac Pro Gets a Graphics Update (sixcolors.com) 23

On Tuesday, Apple rolled out three new graphics card modules for the Intel-based Mac Pro, all based on AMD's Radeon Pro W6000 series GPUs. From a report: (Apple posted a Mac Pro performance white paper [PDF] to celebrate.) The new modules (in Apple's MPX format) come in three variants: one with a single Radeon Pro W6800X, one with two W6800X GPUs, and one with the W6900X. Each module also adds four Thunderbolt 3 ports and an HDMI 2 port to the Mac Pro. The Mac Pro supports two MPX modules, so you could pop in two of the dual-GPU modules to max out performance. They can connect using AMD's Infinity Fabric Link, which allows up to four GPUs to communicate with one another over a super-fast link with much more bandwidth than is available via the PCIe bus.
Games

Someone Made a Playable Clone of Pokemon For the Pebble Smartwatch (gizmodo.com) 19

Developer Harrison Allen has created a playable clone of Pokemon for the Pebble smartwatch, which was officially discontinued in late 2016 after the company was sold to Fitbit. Gizmodo reports: According to Allen, Pebblemon uses a graphics library they created that replicates Pokémon Yellow, the first version of the popular game series to take advantage of the Game Boy Color's limited color palette. Pebblemon is playable using the Pebble smartwatch's buttons (the wearable lacked a touchscreen), and it's a smaller version of the original game featuring "various areas within the Johto region." Players will still "Encounter all 251 Pokemon from the Game Boy Color games" and will still be able to find items to help them out during gameplay.

Pebblemon is currently available through the Rebble.io repository, which was created shortly after the company's demise to let users keep their smart wearables working and to give developers a way to distribute new apps. If you don't already use it, you'll have to jump through a few hoops to get it to play nice with your Pebble watch, but it doesn't appear terribly difficult. Alternatively, Allen has provided all of his source code through GitHub, if you're in the mood to compile or adapt it into something else yourself. There are two things to keep in mind if you want to try Pebblemon out: it's only compatible with the Pebble Time, Pebble Time Round, and Pebble 2 models -- not the original version of the wearable -- and you're going to want to jump on this as soon as possible because there's a very good chance Nintendo's eager lawyers are already aware of the game, and are already working to wipe it off the face of the Earth.

Graphics

Amazon MMO New World Is Bricking RTX 3090s, Players Say; Amazon Responds (gamespot.com) 144

An anonymous reader quotes a report from GameSpot: Amazon [...] is now bricking high-end graphics cards with a beta for its MMO, New World, according to players. Amazon has now responded to downplay the incident but says it plans to implement a frame rate cap on the game's menus. According to users on Twitter and Reddit, New World has been frying extremely high-end graphics cards, namely Nvidia's RTX 3090. It's worth noting that while the RTX 3090 has an MSRP of $1,500, it's often selling for much more due to scarcity and scalpers, so players could easily be losing upwards of $2,000 if their card stops working.

Specifically, it seems that one model of the RTX 3090 is being consistently fried by New World. On Reddit, a lengthy thread of over 600 posts includes multiple users claiming that their EVGA 3090 graphics cards are now little more than expensive paperweights after playing the New World beta. The "red light of death," an indicator that something is disastrously wrong with your EVGA 3090, doesn't pop up consistently for players though. Some report their screen going black after a cutscene in the game while others have said that simply using the brightness calibration screen was enough to brick their card.
Amazon Games says a patch is on the way to prevent further issues. "Hundreds of thousands of people played in the New World Closed Beta yesterday, with millions of total hours played. We've received a few reports of players using high-performance graphics cards experiencing hardware failure when playing New World," said Amazon Games in an official statement.

"New World makes standard DirectX calls as provided by the Windows API. We have seen no indication of widespread issues with 3090s, either in the beta or during our many months of alpha testing. The New World Closed Beta is safe to play. In order to further reassure players, we will implement a patch today that caps frames per second on our menu screen. We're grateful for the support New World is receiving from players around the world, and will keep listening to their feedback throughout Beta and beyond."

New World is currently set to launch for PC on August 31.
Games

Valve Launches Steam Deck, a $400 PC Gaming Portable (techcrunch.com) 110

A new challenger has emerged in the gaming hardware category. Game distribution giant Valve today announced the launch of Steam Deck, a $399 gaming portable designed to take PC games on the go. From a report: The handheld (which has echoes of several portable gaming rigs of years past) features a seven-inch screen and runs on a quad-core Zen 2 CPU, coupled with AMD RDNA 2 graphics and 16GB of RAM. Storage runs 64GB to 512GB, the latter of which bumps the price up to $649. The built-in storage can be augmented via microSD.

[...] Flanking the 1280 x 800 touchscreen are a pair of trackpads and thumb sticks. A built-in gyroscope also uses movement to control the gaming experience. There's a single USB-C port for charging, peripherals and connecting to a big screen, while a 40Wh battery promises 7-8 hours of gameplay, by Valve's numbers.
