IBM

The IBM PC Turns 40 (theregister.com) 117

The Register's Richard Speed commemorates the 40th anniversary of the introduction of the IBM Model 5150: IBM was famously late to the game when the Model 5150 (or IBM PC) put in an appearance. The likes of Commodore and Apple pretty much dominated the microcomputer world as the 1970s came to a close and the 1980s began. Big Blue, on the other hand, was better known for its sober, business-orientated products and its eyewatering price tags. However, as its customers began eying Apple products, IBM lumbered toward the market, creating a working group that could dispense with the traditional epic lead-times of Big Blue and take a more agile approach. A choice made was to use off-the-shelf hardware and software and adopt an open architecture. A significant choice, as things turned out.

Intel's 8088 was selected over the competition (including IBM's own RISC processor) and, famously, Microsoft was tapped to provide PC DOS as well as the BASIC included in ROM. So this marks the 40th anniversary of PC DOS, aka MS-DOS, too. You can find Microsoft's old MS-DOS source code here. The basic price for the 5150 was $1,565, with a fully loaded system rising to more than $3,000. Users could enjoy high-resolution monochrome text via the MDA card or some low-resolution graphics (and vaguely nauseating colors) through a CGA card (which could be installed simultaneously). RAM landed in 16 or 64kB flavors and could be upgraded to 256kB while the Intel 8088 CPU chugged along at 4.77 MHz.

Storage came courtesy of up to two 5.25" floppy disks, and the ability to attach a cassette recorder -- an option swiftly stripped from later models. There was no hard disk, and adding one presented a problem for users with deep enough pockets: the motherboard and software didn't support it and the power supply was a bit weedy. IBM would resolve this as the PC evolved. Importantly, the motherboard also included slots for expansion, which eventually became known as the Industry Standard Architecture (ISA) bus as the IBM PC clone sector exploded. IBM's approach resulted in an immense market for expansion cards and third party software.
While the Model 5150 "sold like hotcakes," Speed notes that it was eventually discontinued in 1987.
AI

Nvidia Reveals Its CEO Was Computer Generated in Keynote Speech (vice.com) 61

Graphics processor company Nvidia showcased its prowess at computer animation by sneaking a virtual replica of its CEO into a keynote speech. From a report: On Wednesday, Nvidia revealed in a blog post that its CEO Jensen Huang did not do the keynote presentation at the company's GTC conference in April. At least part of it was actually led by a virtual replica of Huang, created by digitizing Huang with a truck full of DSLR cameras, and then animating him with the help of an AI, according to the company.

Huang's kitchen, which has become Nvidia's venue for speaking to customers and investors since the beginning of the pandemic, was also entirely computer generated. It's not clear exactly which part of the keynote speech features CGI Huang (which is what makes the replica so impressive), but if you jump to this part of the presentation you can see Huang magically disappear and his kitchen explode into multiple different 3D models.

Hardware

Wear OS Is Getting a Multi-Generational Leap In Power Thanks To Samsung (arstechnica.com) 37

An anonymous reader quotes a report from Ars Technica: Google is cooking up the first major Wear OS release since 2018, and Samsung is abandoning Tizen for smartwatches and going all-in on Wear OS with the Galaxy Watch 4. Last night, Samsung took the wraps off the main SoC for the Galaxy Watch 4, and compared to what Wear OS usually gets, Samsung is shipping a beast of an SoC. The "Samsung Exynos W920" will be a multi-generational leap in performance for Wear OS. Samsung says this is a 5 nm chip with two ARM Cortex A55 cores and an ARM Mali-G68 GPU. For the always-on display mode, there's an additional Cortex M55 CPU, which can keep the watch face ticking along while using minimal power. There's also an integrated LTE modem for on-the-go connectivity.

Compared to Samsung's previous smartwatch chip, the Tizen-only Exynos 9110 (10 nm, 2x Cortex A53), the company is promising "around 20 percent" better CPU performance and "ten times better graphics performance." Remember that the Exynos 9110 is from 2018, so those comparative numbers are inflated, but at 5 nm, this is a more modern chip than Wear OS has ever seen. Wear OS has suffered for years at the hands of Qualcomm, which has been starving the ecosystem of quality SoCs for wearables. Most people's last experience with Wear OS is the Snapdragon Wear 2100 or 3100 SoCs, both of which were ancient Cortex A7 CPUs built on a 28 nm process. Qualcomm introduced a slightly more modern chip, the Wear 4100 in 2020 (a Cortex A53-based, 12 nm chip), but almost no manufacturers actually shipped that chip a year later, and we're still getting Wear 3100 launches today. Qualcomm's answer to Samsung's chip will be the Wear 5100, which isn't due until 2022.
We should know more about Wear OS 3.0 tomorrow when Samsung holds its Aug. 11 "Unpacked" event. Not only is the company expected to reveal the Galaxy Watch 4 and the big Wear OS revamp, but it's also planning to launch at least two new foldable smartphones -- the Galaxy Z Fold 3 and Galaxy Z Flip 3.
Graphics

AMD's Radeon RX 6600 XT Launched To Compete Against NVIDIA GeForce RTX 3060 (hothardware.com) 21

MojoKid writes: AMD officially unveiled the Radeon RX 6600 XT in late July, but the cards, aimed at 1080p gaming, officially launch today. HotHardware's review looks at two PowerColor models: the high-end Radeon RX 6600 XT Red Devil and its somewhat more mainstream "Fighter" branded counterpart. Whereas AMD's reference Radeon RX 6600 XT offers a Game clock up to 2359MHz and a Boost clock of 2589MHz, the PowerColor Red Devil peaks at 2428MHz (Game) and 2607MHz (Boost). Those higher GPU clocks result in higher compute performance and fillrate, but the memory configuration and frequency are the same -- so in memory-bandwidth-constrained situations, performance won't be all that much different.

Performance-wise, with most game titles that use traditional rasterization, the Radeon RX 6600 XT is clearly faster than the GeForce RTX 3060 and previous-gen cards like the Radeon RX 5700 XT or GeForce RTX 2060 Super. However, when you factor ray tracing into the equation, NVIDIA still has a distinct and significant advantage. The Radeon RX 6600 XT Fighter should sell for at or close to its $379 MSRP, and PowerColor says the cards should be readily available for gamers to purchase today.
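
For a sense of how little those clock differences matter on their own, here is a quick back-of-the-envelope calculation of theoretical FP32 throughput. It assumes the RX 6600 XT's publicly listed 2,048 stream processors, and counts 2 FLOPs per clock per shader (fused multiply-add):

    # Rough theoretical FP32 throughput for the Radeon RX 6600 XT at different boost clocks.
    # Assumes the publicly listed 2,048 stream processors (32 CUs x 64 shaders per CU).
    STREAM_PROCESSORS = 2048

    def peak_tflops(clock_mhz: int) -> float:
        # Peak FP32 = 2 FLOPs per clock (fused multiply-add) x shaders x clock
        return 2 * STREAM_PROCESSORS * clock_mhz * 1e6 / 1e12

    for name, clock in (("AMD reference boost", 2589), ("PowerColor Red Devil boost", 2607)):
        print(f"{name}: {peak_tflops(clock):.2f} TFLOPS at {clock} MHz")
    # ~10.60 vs ~10.68 TFLOPS -- well under 1% apart, which is why the identical memory
    # configuration, not the clock bump, decides most real-world differences.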

Movies

Pixar Co-founder Shares 'the Real Story of Pixar' (ieee.org) 41

Alvy Ray Smith cofounded Pixar. He was the first director of computer graphics at Lucasfilm and the first graphics fellow at Microsoft. He has received two technical Academy Awards for his contributions to digital movie-making technology.

This week he shared "The Real Story of Pixar" in an article in IEEE Spectrum that Slashdot reader Tekla Perry says "corrects some of the things legend got wrong, and gives a fast tour of computer graphics history, as he talks about the key computer graphics breakthroughs that led up to Pixar and how Moore's Law saved the company." It starts in 1980, when Smith was part of a team hired by Lucasfilm to create its Computer Division: This division was charged with computerizing editing, sound design and mixing, special effects, and accounting for the company's books, as if this fourth challenge would be as difficult as the other three. Ed Catmull, who led the Computer Division, made me head of the Computer Graphics Group, which was tasked with the special-effects project. At Lucasfilm, we continued to develop the software needed for three-dimensional computer-generated movies. And we worked on specialized hardware as well, designing a computer, called the Pixar Image Computer, that could run its calculations four times as fast as comparable general-purpose systems — but only for pixels. We were still waiting for Moore's Law to get general computers to where we needed them — it did, but this strategy gave us a boost for a few years.

We didn't get one of our fully computer-generated movie sequences into a major motion picture until 1982, with our one-minute "Genesis" sequence in Star Trek II: The Wrath of Khan. It showed a bare planet catching on fire, melting, and then forming mountains and seas and green forests. We followed that groundbreaking piece of a movie with a brief sequence in Return of the Jedi in 1983, featuring a "hologram" of the Death Star... But then our Computer Graphics Group, now numbering 40 people, got the news that the Computer Division was on the chopping block.

Then Smith continues the story with an excerpt from his new book, "A Biography of the Pixel." ("We did have a prototype special-purpose computer, the Pixar Image Computer. So Ed and I wrote up a business plan to build and sell Pixar Image Computers, calling them 'supercomputers for pixels'...") 35 venture capital firms turned them down, as did 10 corporations where they'd proposed a "strategic partnership." Finally, they made a desperate pitch to Steve Jobs: Steve, who had just been ousted from Apple, proposed that he buy us from Lucasfilm and run us as his next company. We said no, that we wanted to run the company ourselves, but we would accept his money in the form of a venture investment. And he agreed...

Pixar was a lousy hardware company. We failed several times over our first five years. That's failure measured the usual way: We ran out of money and couldn't pay our bills or our employees. If we'd had any other investor than Steve, we would have been dead in the water. But at every failure — presumably because Steve couldn't sustain the embarrassment that his next enterprise after the Apple ouster would be a failure — he'd berate those of us in management . . . then write another check. And each check effectively reduced employee equity. After several such "refinancings," he had poured about $50 million (half of the fortune he had made from Apple) into Pixar. In today's money, that's more than $100 million. On March 6, 1991, in Pixar's fifth year, he finally did buy the company from the employees outright.

The company was still in financial trouble — but expanding computing power eventually made it possible to render an entire full-length movie, and Disney financed the years of production necessary for the 1995 movie Toy Story. But even before its release, Steve Jobs "took Pixar public on November 29, 1995, on nothing more than the promise of Toy Story.

"It salvaged his reputation and made him a billionaire."

The article's subheading? "How a bad hardware company turned itself into a great movie studio."
AI

Self-Driving Car Startup Wants to Spare AI From Making Life-or-Death Decisions (washingtonpost.com) 134

Instead of having AI in a self-driving car decide whether to kill its driver or pedestrians, the Washington Post reports there's a new philosophy gaining traction: Why not stop cars from getting in life-or-death situations in the first place? (Alternate URL): After all, the whole point of automated cars is to create road conditions where vehicles are more aware than humans are, and thus better at predicting and preventing accidents. That might avoid some of the rare occurrences where human life hangs in the balance of a split-second decision... The best way to kill or injure people probably isn't a decision you'd like to leave up to your car, or the company manufacturing it, anytime soon. That's the thinking now about advanced AI: It's supposed to prevent the scenarios that lead to crashes, making the choice of who's to die one that the AI should never have to face.

Humans get distracted by texting, while cars don't care what your friends have to say. Humans might miss objects obscured by their vehicle's blind spot. Lidar can pick those things up, and 360 cameras should work even if your eyes get tired. Radar can bounce around from one vehicle to the next, and might spot a car decelerating up ahead faster than a human can... [Serial entrepreneur Barry] Lunn is the founder and CEO of Provizio, an accident-prevention technology company. Provizio's secret sauce is a "five-dimensional" vision system made up of high-end radar, lidar and camera imaging. The company builds an Intel vision processor and Nvidia graphics processor directly onto its in-house radar sensor, enabling cars to run machine-learning algorithms directly on the radar sensor. The result is a stack of perception technology that sees farther and wider, and processes road data faster than traditional autonomy tech, Lunn says. Swift predictive analytics gives vehicles and drivers more time to react to other cars.
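
To make "more time to react" concrete, here is a small illustrative calculation. The speed and warning times are made-up example values, not Provizio figures; the point is simply how much distance each extra second of warning buys at highway speed:

    # Hypothetical illustration: extra warning time vs. distance covered at highway speed.
    # The numbers are illustrative, not Provizio's claims.
    def meters_traveled(speed_kmh: float, seconds: float) -> float:
        return speed_kmh / 3.6 * seconds

    speed = 110  # km/h, a typical highway speed
    for extra_warning in (0.5, 1.0, 2.0):  # seconds of earlier detection
        print(f"{extra_warning:.1f}s earlier warning = "
              f"{meters_traveled(speed, extra_warning):.0f} m more room to brake or steer")
    # 0.5s -> ~15 m, 1.0s -> ~31 m, 2.0s -> ~61 m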

The founder has worked in vision technology for nearly a decade, previously working with NASA, General Motors and Boeing through the radar company Arralis, which Lunn sold in 2017. The start-up is in talks with big automakers, and its vision has a strong team of trailblazers behind it, including Scott Thayer and Jeff Mishler, developers of early versions of autonomous tech for Google's Waymo and Uber... Lunn thinks the auto industry prematurely pushed autonomy as a solution, long before it was safe or practical to remove human drivers from the equation. He says AI decision-making will play a pivotal role in the future of auto safety, but only after it has been shown to reduce the issues that lead to crashes. The goal is to get the tech inside passenger cars so that the system can learn from human drivers, and understand how they make decisions, before allowing the AI to decide what happens in specific instances.

Transportation

Edmunds Reviews Ford's BlueCruise Hands-Free Driving Technology (edmunds.com) 52

An anonymous reader quotes a report from Edmunds, written by Cameron Rogers: BlueCruise operates on the same principle as Super Cruise. Once the vehicle is traveling on one of the 100,000 miles of qualified roadways (Ford dubs these Hands-Free Blue Zones) and certain conditions have been met, a graphic appears in the instrument panel to let you know that BlueCruise is ready for activation. Simply press the cruise control button on the steering wheel and you can take your hands off the wheel to let the vehicle drive itself. Like Super Cruise, Ford's BlueCruise system is not autonomous. As the driver, you have to be alert and prepared to take the wheel at any time. BlueCruise will not take evasive action if there is a small obstruction in the road -- a box on the freeway, for instance -- and you must be ready to perform advanced maneuvers if necessary. To that end, BlueCruise includes a head and eye position sensor to make sure you're watching the road ahead. Divert your attention for too long and the system will deactivate. And because BlueCruise relies on clearly visible lane markers, traveling on highway sections that lack them will deactivate the system. The first vehicles to receive BlueCruise functionality will be two of Ford's newest models -- the 2021 Ford Mustang Mach-E and 2021 Ford F-150. In both cases, the BlueCruise hardware is tied to the Ford Co-Pilot360 Active 2.0 package.

I had the opportunity to drive both an F-150 and Mustang Mach-E with BlueCruise, and there was no functional difference in how the system behaved in each vehicle. The system itself melds several driver aids that are already present on the majority of cars today, but adds a head- and eye-tracking component that makes sure you're paying attention. Once this is established -- and you're driving on a preapproved road -- a ring will appear around a graphic of the vehicle in the digital instrument panel. This lets you know that BlueCruise is ready to activate. Simply press the cruise control button and acceleration, braking and turning are handed over to BlueCruise. In this way, BlueCruise functions similarly to GM's Super Cruise. The primary difference is that GM vehicles with Super Cruise have an LED light bar integrated into the steering wheel to let you know when Super Cruise can be activated. Ford's system isn't so obvious in letting you know when it's ready. When you press the cruise control button, however, the instrument panel graphics turn blue to inform you that BlueCruise is active and you can take your hands off the wheel.

The other difference between the two competing systems is that GM's Super Cruise has one prescribed distance for the adaptive cruise control (ACC) aspect. Ford has decided to treat BlueCruise like a typical ACC system in which you can choose one of four following distances. When engaged, BlueCruise does a good job at approximating typical human driving behavior. I never had to adjust the following distance from one of the medium settings, and the system gives you a few beats to put your hands on the wheel when it needs you to resume control. I didn't experience many technical issues in either vehicle on my limited test drive, but there was one instance in which I was forced to make an emergency maneuver. A Civic driver with little concern for their personal safety accelerated to merge right in front of my F-150, and the truck didn't slow down quickly enough. This wasn't necessarily a fault of BlueCruise itself -- I have found that ACC systems in general are slow to react to vehicles merging into or out of my lane -- but it goes to show that you still need to have your wits about you at all times.
"Like GM's Super Cruise, Ford's BlueCruise provides a hands-free driving experience on certain limited-access highways," writes Rogers in closing. "It certainly takes some stress out of driving in bumper-to-bumper traffic, and should be similarly pleasant on long-distance road trips. But these are not autonomous systems, and drivers need to be ready to take the wheel at any time to react to changing road conditions."
AMD

AMD Ryzen 5000G Series Launches With Integrated Graphics At Value Price Points (hothardware.com) 69

MojoKid writes: AMD is taking the wraps off its latest integrated processors, the Ryzen 7 5700G and the Ryzen 5 5600G. As their branding suggests, these new products are based on the same excellent AMD Zen 3 core architecture, but with integrated graphics capabilities on board as well, hence the "G" designation. AMD is targeting more mainstream applications with these chips. The Ryzen 7 5700G is an 8-core/16-thread CPU with 4MB of L2 cache and 16MB of L3. Those CPU cores are mated to an 8 CU (Compute Unit) Radeon Vega graphics engine, and it has 24 lanes of PCIe Gen 3 connectivity. The 5700G's base CPU clock is 3.8GHz, with a maximum boost clock of 4.6GHz. The on-chip GPU can boost up to 2GHz, which is a massive uptick from the 1.4GHz of previous-gen 3000-series APUs.

The Ryzen 5 5600G takes things down a notch with 6 CPU cores (12 threads) and a smaller 3MB L2 cache while L3 cache size remains unchanged. The 5600G's iGPU is scaled down slightly as well with only 7 CUs. At 3.9GHz, the 5600G's base CPU clock is 100MHz higher than the 5700G's, but its max boost lands at 4.4GHz with a slightly lower GPU boost clock of 1.9GHz. In the benchmarks, the Ryzen 5 5600G and Ryzen 7 5700G both offer enough multi-threaded muscle for the vast majority of users, often besting similar Intel 11th Gen Core series chips, with highly competitive single-thread performance as well.

Desktops (Apple)

Mac Pro Gets a Graphics Update (sixcolors.com) 23

On Tuesday, Apple rolled out three new graphics card modules for the Intel-based Mac Pro, all based on AMD's Radeon Pro W6000-series GPUs. From a report: (Apple posted a Mac Pro performance white paper [PDF] to celebrate.) The new modules (in Apple's MPX format) come in three variants, with a Radeon Pro W6800X, two W6800X GPUs, and the W6900X. Each module also adds four Thunderbolt 3 ports and an HDMI 2 port to the Mac Pro. The Mac Pro supports two MPX modules, so you could pop in two of the dual-GPU modules to max out performance. They can connect using AMD's Infinity Fabric Link, which allows up to four GPUs to communicate with one another over a link with much more bandwidth than is available via the PCIe bus.
Games

Someone Made a Playable Clone of Pokemon For the Pebble Smartwatch (gizmodo.com) 19

Developer Harrison Allen has created a playable clone of Pokemon for the Pebble smartwatch, which was officially discontinued in late 2016 after the company was sold to Fitbit. Gizmodo reports: According to the game's developer, Harrison Allen, Pebblemon uses a graphics library they created that replicates Pokémon Yellow, which was the first version of the popular game series to take advantage of the Game Boy Color's limited color palette. Pebblemon is playable using the Pebble smartwatch's buttons (the wearable lacked a touchscreen), and while it's a smaller version of the original game featuring "various areas within the Johto region," players will still "Encounter all 251 Pokemon from the Game Boy Color games" and will still be able to find items to help them out during gameplay.

Pebblemon is currently available through the Rebble.io repository, which was created shortly after the company's demise as a place for users to continue maintaining their smart wearables, and for developers to distribute new apps. If you don't already use it, you'll have to jump through a few hoops to get it to play nice with your Pebble watch, but it doesn't appear terribly difficult. Alternatively, Allen has provided all of his source code through GitHub, if you're in the mood to compile or adapt it into something else yourself. There are two things to keep in mind if you want to try Pebblemon out: it's only compatible with the Pebble Time, Pebble Time Round, and Pebble 2 models -- not the original version of the wearable -- and you're going to want to jump on this as soon as possible because there's a very good chance Nintendo's eager lawyers are already aware of the game and working to wipe it off the face of the Earth.

Graphics

Amazon MMO New World Is Bricking RTX 3090s, Players Say; Amazon Responds (gamespot.com) 144

An anonymous reader quotes a report from GameSpot: Amazon [...] is now bricking high-end graphics cards with a beta for its MMO, New World, according to players. Amazon has now responded to downplay the incident but says it plans to implement a frame rate cap on the game's menus. According to users on Twitter and Reddit, New World has been frying extremely high-end graphics cards, namely Nvidia's RTX 3090. It's worth noting that while the RTX 3090 has an MSRP of $1,500, it's often selling for much more due to scarcity and scalpers, so players could easily be losing upwards of $2,000 if their card stops working.

Specifically, it seems that one model of the RTX 3090 is being consistently fried by New World. On Reddit, a lengthy thread of over 600 posts includes multiple users claiming that their EVGA 3090 graphics cards are now little more than expensive paperweights after playing the New World beta. The "red light of death," an indicator that something is disastrously wrong with your EVGA 3090, doesn't pop up consistently for players though. Some report their screen going black after a cutscene in the game while others have said that simply using the brightness calibration screen was enough to brick their card.
Amazon Games says a patch is on the way to prevent further issues. "Hundreds of thousands of people played in the New World Closed Beta yesterday, with millions of total hours played. We've received a few reports of players using high-performance graphics cards experiencing hardware failure when playing New World," said Amazon Games in an official statement.

"New World makes standard DirectX calls as provided by the Windows API. We have seen no indication of widespread issues with 3090s, either in the beta or during our many months of alpha testing. The New World Closed Beta is safe to play. In order to further reassure players, we will implement a patch today that caps frames per second on our menu screen. We're grateful for the support New World is receiving from players around the world, and will keep listening to their feedback throughout Beta and beyond."

New World is currently set to launch for PC on August 31.
Games

Valve Launches Steam Deck, a $400 PC Gaming Portable (techcrunch.com) 110

A new challenger has emerged in the gaming hardware category. Game distribution giant Valve today announced the launch of Steam Deck, a $399 gaming portable designed to take PC games on the go. From a report: The handheld (which has echoes of several portable gaming rigs of years past) features a seven-inch screen and runs on a quad-core Zen 2 CPU, coupled with AMD RDNA 2 graphics and 16GB of RAM. Storage runs 64GB to 512GB, the latter of which bumps the price up to $649. The built-in storage can be augmented via microSD.

[...] Flanking the 1280 x 800 touchscreen are a pair of trackpads and thumb sticks. A built-in gyroscope also uses movement to control the gaming experience. There's a single USB-C port for charging, peripherals and connecting to a big screen, while a 40Wh battery promises 7-8 hours of gameplay, by Valve's numbers.
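
Taking the quoted 40Wh capacity and the 7-8 hour claim at face value, the implied average power draw is easy to sanity-check:

    # Average power draw implied by Valve's numbers: battery capacity / runtime.
    battery_wh = 40
    for hours in (7, 8):
        print(f"{hours}h of gameplay implies ~{battery_wh / hours:.1f} W average draw")
    # ~5.7 W at 7 hours, ~5.0 W at 8 hours -- light workloads; demanding 3D games
    # will draw more and land below that runtime range.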

AMD

AMD CEO Says Chip Shortages Will Continue Through 2021 (laptopmag.com) 39

During an interview, AMD CEO Lisa Su reiterated that the current chip supply shortages would continue to adversely affect makers' ability to meet consumer demands until the end of 2021. From a report: The interview also gets into the effects of the Covid-19 pandemic on AMD, with Su mentioning the chipmaker's plans after its $35 billion purchase of Xilinx. Su discussed how tightly pressed supply chains have been, stating that over the past 12 months, "demand had far exceeded even our aggressive expectations." That's an understatement, as consumers have felt the impact of supply shortages through the scarce availability of everything from graphics cards to CPUs.
Open Source

Linux 5.13 Kernel Released, Includes Apple M1 Support, Clang CFI, and Landlock's Linux Security Module (phoronix.com) 33

"Linus Torvalds has just released the Linux 5.13 kernel as stable," reports Phoronix: Linux 5.13 brings initial but still early support for the Apple M1 with basic support but not yet accelerated graphics and a lot more to iron out moving ahead. There are also new Linux 5.13 security features like the Landlock security module, Clang control flow integrity support, and optionally randomizing the kernel stack offset at each system call. There is also AMD fun this cycle around FreeSync HDMI support, initial Aldebaran bring-up, and more. Intel has more work on Alder Lake, a new cooling driver, and more discrete graphics bring-up. There are also other changes for Linux 5.13 around faster IO_uring, a generic USB display driver, and other new hardware enablement.
"5.13 overall is actually fairly large," Linus Torvalds posted on the Linux Kernel Mailing List, calling it "one of the bigger 5.x releases, with over 16,000 commits (over 17k if you count merges), from over 2,000 developers. But it's a "big all over" kind of thing, not something particular that stands out as particularly unusual..."
Data Storage

Xbox's DirectStorage API Will Speed Up Gaming PCs On Windows 11 Only (pcgamesn.com) 93

An anonymous reader quotes a report from PCGamesN: Microsoft has finally debuted Windows 11, and it's not just packing auto HDR and native Android apps. The long-teased DirectStorage API -- meant to cut down loading times on gaming PCs much in the same way the Xbox Velocity Architecture speeds things up on Microsoft's consoles -- is on its way, and it won't be coming to Windows 10 like we originally thought. The Windows 11-exclusive feature improves communication between your storage device and graphics card, allowing assets to load quicker without having to pass through the CPU first. Naturally, this means more time spent gaming and less time reading the same hints as you move from area to area.

It'll work best with systems that are dubbed 'DirectStorage Optimized', containing the right hardware and drivers for the job. If you're more of the DIY type who prefers to build the best gaming PC yourself, requirements demand an NVMe SSD with 1TB of storage or more. PCIe 4.0 NVMe SSDs and the latest GPUs from Nvidia and AMD will offer a better experience, but DirectStorage will still work with older standards like PCIe 3.0 -- you won't have much luck with 2.5-inch SATA drives, though. DirectStorage will only work with games built using DirectX 12, so there's no telling how many titles will support the feature when you upgrade to Windows 11 for free later this year.
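
DirectStorage itself is a native Windows API exposed to DirectX 12 titles, but the access pattern it encourages -- queue up many small asset reads and let them complete concurrently rather than issuing blocking reads one at a time -- can be sketched in a few lines. The snippet below is a conceptual illustration only, not the DirectStorage API; the file name and byte ranges are made-up examples:

    from concurrent.futures import ThreadPoolExecutor

    # Conceptual illustration only (not the real DirectStorage API): enqueue many small
    # asset reads and let them complete concurrently instead of issuing them serially.
    def read_chunk(path: str, offset: int, length: int) -> bytes:
        with open(path, "rb") as f:   # one handle per request, like a queue entry
            f.seek(offset)
            return f.read(length)

    def load_assets(path: str, requests: list[tuple[int, int]]) -> list[bytes]:
        with ThreadPoolExecutor(max_workers=8) as pool:
            return list(pool.map(lambda r: read_chunk(path, *r), requests))

    # e.g. load_assets("level1.pak", [(0, 65536), (65536, 65536), (131072, 65536)])
    # "level1.pak" and the byte ranges are hypothetical example values.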

Open Source

Ubuntu-maker Canonical Will Support Open Source Blender on Windows, Mac, and Linux (betanews.com) 24

An anonymous reader shares a report: Blender is one of the most important open source projects, as the 3D graphics application suite is used by countless people at home, for business, and in education. The software can be used on many platforms, such as Windows, Mac, and of course, Linux. Today, Ubuntu-maker Canonical announces it will offer paid enterprise support for Blender LTS. Surprisingly, this support will not only be for Ubuntu users. Heck, it isn't even limited to Linux installations. Actually, Canonical will offer this support to Blender LTS users on Windows, Mac, and Linux.
Graphics

Open Source AMD FidelityFX Super Resolution Impresses In PC Game Tests (hothardware.com) 35

MojoKid writes: AMD's FidelityFX Super Resolution (FSR) PC graphics up-scaling technology is ready for prime time, and the company has allowed members of the press to showcase performance and visuals of the tech in action with a number of game engines. AMD FidelityFX Super Resolution is vendor-agnostic and doesn't require specialized hardware to function, unlike NVIDIA DLSS, which relies on Tensor cores on board NVIDIA Turing or Ampere GPUs to accelerate neural network models that have been specifically trained on game engines. In contrast, AMD FSR utilizes more traditional spatial upscaling to create a super resolution image from a single input frame, not multiple frames. AMD FSR then employs a library of open-source algorithms that work on sharpening both image edges and texture detail. In game testing at HotHardware, frame rates can jump dramatically with little to no perceptible reduction in image quality, and the technology even works on many NVIDIA GPUs as well. There are currently 19 titles that are available or planned with support for AMD FSR, but with the open nature of the technology and cross-GPU compatibility, game developers theoretically should have significant incentive to adopt it and breathe new performance into their titles.
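
AMD publishes FSR's actual EASU and RCAS passes as open-source GPU shaders, but the two-stage idea -- spatially upscale a single frame, then sharpen edge and texture detail -- can be roughly approximated on the CPU with Pillow. This is only a stand-in for the general technique, not AMD's algorithm, and the file names are hypothetical:

    from PIL import Image, ImageFilter

    # Rough stand-in for the FSR idea (not AMD's EASU/RCAS shaders): upscale a single
    # frame spatially, then run a sharpening pass to restore edge and texture detail.
    def upscale_and_sharpen(frame: Image.Image, scale: float = 1.3) -> Image.Image:
        w, h = frame.size
        upscaled = frame.resize((int(w * scale), int(h * scale)), Image.LANCZOS)
        return upscaled.filter(ImageFilter.UnsharpMask(radius=2, percent=120, threshold=2))

    # Example: a frame rendered at 1477x831 scaled up ~1.3x per axis toward 1920x1080.
    # frame = Image.open("rendered_frame.png")   # hypothetical input frame
    # upscale_and_sharpen(frame, scale=1.3).save("presented_frame.png")
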
Portables (Apple)

Apple Developing a Whole New Kind of MacBook Air (macrumors.com) 174

Bloomberg's Mark Gurman, Apple analyst Ming-Chi Kuo, and leaker Jon Prosser say Apple is working on a completely new, high-end version of the MacBook Air. MacRumors reports: The high-end MacBook Air will feature two USB-C ports and a more powerful Apple silicon chip, according to Gurman. The chip will apparently be a direct successor to Apple's M1 chip, featuring the same number of computing cores, but it will run faster. It is also expected to see an increase in graphics cores from seven or eight to nine or 10. This high-end MacBook Air would sit above the current MacBook Air models with the M1 chip, but below the MacBook Pro. Prosser recently unveiled renders that purport to depict the next-generation MacBook Air based on leaked images.

Prosser says that the MacBook Air will be available in a range of color options, much like the 24-inch iMac, and will feature larger function keys, a smaller trackpad, and redesigned feet on the underside of the machine. The biggest design change from the current MacBook Air seems to be the loss of its iconic tapered shape. Instead, the MacBook Air will become considerably thinner as a whole, Prosser explained. Other features rumored to be coming to next-generation MacBook Air models include a mini-LED display and a MagSafe charging port. Gurman believes that the high-end MacBook Air could launch in the second half of this year at the earliest or in 2022, a timeframe that has been echoed by Prosser.

AMD

Falling GPU Pricing in Europe Suggests Shortage Is Easing 20

According to ComputerBase, graphics card prices have begun to drop as much as 50% in Europe. From a report: Availability has also improved significantly, with sales of most GPU models from both AMD and Nvidia doubling month-over-month. This report comes on the heels of ASRock, a GPU maker, noting that GPU pricing is easing as demand from Chinese cryptocurrency miners wanes. More budget-oriented cards like the Nvidia GeForce RTX 3060 and AMD Radeon RX 6700 XT are seeing the most positive results, with a near 50% drop in price compared to last month. For flagship cards like the RTX 3080 and RX 6800 XT, however, prices haven't moved as much. They have dipped a respectable 10-15% which is still a very positive change considering the shortage issues plaguing the technology industry. In the United States, GPU pricing is slowly catching up to Europe, but it's still going down nonetheless.
E3

Low-Budget Games Steal Spotlight After Covid Delays Big Names (bloomberg.com) 42

The annual video game convention E3 is normally full of teasers for splashy, graphic-rich games from big-name studios and surprise announcements about new titles. But this year's online-only event was much quieter, with many hot releases delayed as a result of the pandemic. That gave games from independent studios a chance to steal the show. From a report: Some of the most impressive reveals this year were small-scale, indie games that may not have the wow factor of something like Ubisoft Entertainment SA's Assassin's Creed but appealed to fans with interesting story lines, quirky graphics or unusual gameplay. Highlights included Replaced, a gorgeous cyberpunk-themed action game and debut title from Sad Cat Studios, and Twelve Minutes, in which players must break a time loop full of betrayal and murder. The game, from a division of film company Annapurna Pictures, stars Daisy Ridley and Willem Dafoe. Entries like these delighted fans and showcased the breadth of possibilities of video games.

Most years, E3 takes place in Los Angeles, where fans and industry professionals convene at the convention center to play demos and watch trailers for the hottest new games. Commercials and giant posters from expensive series like Call of Duty compete for attendees' eyeballs, and fans come away excited about what's coming in the fall. This year, while there will be Microsoft's Halo Infinite, promised in time for the holidays after a year's delay, Nintendo's highly anticipated next game in the Zelda series won't come until next year. Same with Elden Ring, a much-hyped dark fantasy created in collaboration with George R.R. Martin, author of the books that inspired Game of Thrones. Fans didn't seem to mind, and left the show raving instead about Tunic, a Zelda-inspired action-adventure game starring a small fox developed by Canadian creator Andrew Shouldice, and Neko Ghost, Jump, a platforming game from Burgos Games, in which you can shift between 2D and 3D perspectives.

This explosion of independent games, which are usually made by small teams that aren't funded by multi-billion-dollar corporations like Electronic Arts or Activision Blizzard, is a relatively recent phenomenon. Until the late 2000s, developers mostly had to partner with big publishers to get their games to audiences. The rise of digital distribution on PCs and consoles, combined with the increased accessibility of game-making tools such as the Unity Engine, has made it easy for solo developers, or two or three people working in a garage, to release successful games on their own. Some companies, such as Annapurna Interactive and Devolver Digital, have thrived as independent publishers, partnering with developers to release exclusively small, creative games.
