Graphics

Was Flash Responsible For 'The Internet's Most Creative Era'? (vice.com) 72

A new article this week on Motherboard argues that Flash "is responsible for the internet's most creative era," citing a new 640-page book by Rob Ford on the evolution of web design.

[O]ne could argue that the web has actually gotten less creative over time, not more. This interpretation of events is a key underpinning of Web Design: The Evolution of the Digital World 1990-Today (Taschen, $50), a new visual-heavy book from author Rob Ford and editor Julius Wiedemann that does something that hasn't been done on the broader internet in quite a long time: It praises the use of Flash as a creative tool, rather than a bloated malware vessel, and laments the ways that visual convention, technical shifts, and walled gardens have started to rein in much of this unvarnished creativity.

This is a realm where small agencies supporting big brands, creative experimenters with nothing to lose, and teenage hobbyists could stand out simply by being willing to try something risky. It was a canvas with a built-in distribution model. What wasn't to like, besides a whole host of malware?

The book's author tells Motherboard that "Without the rebels we'd still be looking at static websites with gray text and blue hyperlinks." But instead we got wild experiments like Burger King's "Subservient Chicken" site or the interactive "Wilderness Downtown" site coded by Google.

There were also entire cartoon series like Radiskull and Devil Doll or Zombie College -- not to mention games like "A Murder of Scarecrows" or the laughably unpredictable animutations of 14-year-old Neil Cicierega. But Ford tells Motherboard that today, many of the wild ideas have moved from the web to augmented reality and other "physical mediums... The rise in interactive installations, AR, and experiential in general is where the excitement of the early days is finally happening again."

Motherboard calls the book "a fitting coda for a kind of digital creativity that -- like Geocities and MySpace pages, multimedia CD-ROMs, and Prodigy graphical interfaces before it -- has faded in prominence."
Graphics

Wired Remembers the Glory Days of Flash (wired.co.uk) 95

Wired recently remembered Flash as "the annoying plugin" that transformed the web "into a cacophony of noise, colour, and controversy, presaging the modern web."

They write that its early popularity in the mid-1990s came in part because "Microsoft needed software capable of showing video on their website, MSN.com, then the default homepage of every Internet Explorer user." But Flash allowed anyone to become an animator. (One Disney artist tells them that Flash could do in three days what would take a professional animator seven months -- and cost $10,000.)

Their article opens in 2008, a golden age when Flash was installed on 98% of desktops -- then looks back on its impact: The online world Flash entered was largely static. Blinking GIFs delivered the majority of online movement. Constructed in early HTML and CSS, websites lifted clumsily from the metaphors of magazine design: boxy and grid-like, they sported borders and sidebars and little clickable numbers to flick through their pages (the horror).

Flash changed all that. It transformed the look of the web...

Some of these websites were, to put it succinctly, absolute trash. Flash was applied enthusiastically and inappropriately. The gratuitous animation of restaurant websites was particularly grievous -- kitsch abominations, these could feature thumping bass music and teleporting ingredients. Ishkur's 'guide to electronic music' is a notable example from the era you can still view -- a chaos of pop arty lines and bubbles and audio samples, it looks like the mind map of a naughty child...

In contrast to the web's modern, business-like aesthetic, there is something bizarre, almost sentimental, about billion-dollar multinationals producing websites in line with Flash's worst excess: long loading times, gaudy cartoonish graphics, intrusive sound and incomprehensible purpose... "Back in 2007, you could be making Flash games and actually be making a living," remembers Newgrounds founder Tom Fulp, when asked about Flash's golden age. "That was a really fun time, because that's kind of what everyone's dream is: to make the games you want and be able to make a living off it."

Wired summarizes Steve Jobs' "brutally candid" diatribe against Flash in 2010. "Flash drained batteries. It ran slow. It was a security nightmare. He asserted that an era had come to an end... '[T]he mobile era is about low power devices, touch interfaces and open web standards -- all areas where Flash falls short.'" Wired also argues that "It was economically viable for him to rubbish Flash -- he wanted to encourage people to create native games for iOS."

But they also write that today, "The post-Flash internet looks different. The software's downfall precipitated the rise of a new aesthetic...one moulded by the specifications of the smartphone and the growth of social media," favoring hits of information rather than striving for more immersive, movie-emulating thrills.

And they add that though Newgrounds moved away from Flash long ago, the site's founder is now working on a Flash emulator to keep all that early classic content playable in a browser.
Graphics

NVIDIA's Job Listings Reveal 'Game Remastering' Studio, New Interest In RISC-V (forbes.com) 40

An anonymous reader quotes Forbes: Nvidia has a lot riding on the success of its GeForce RTX cards. The Santa Clara, California-based company is beating the real-time ray tracing drum loudly, adamant about being known as a champion of the technology before AMD steals some of its thunder next year with the PlayStation 5 and its own inevitable release of ray-tracing-enabled PC graphics cards.

Nvidia has shown that, with ray tracing, it can breathe new life into a decades-old PC shooter like id Software's Quake 2, so why not dedicate an entire game studio to remastering timeless PC classics? A new job listing spotted by DSOGaming confirms that's exactly what Nvidia is cooking up.

The ad says NVIDIA's new game remastering program is "cherry-picking some of the greatest titles from the past decades and bringing them into the ray tracing age, giving them state-of-the-art visuals while keeping the gameplay that made them great." (And it adds that the initiative is "starting with a title that you know and love but we can't talk about here!")

Meanwhile, a China-based industry watcher on Medium reports that "six RISC-V positions have been advertised by NVIDIA, based in Shanghai and pertaining to architecture, design, and verification."
Intel

Intel Kills Kaby Lake G, Vows To Offer Drivers For Five Years (pcworld.com) 16

When Kaby Lake G debuted at CES 2018, it made a big bang. No one expected sworn rivals Intel and AMD to collaborate on a CPU package, marrying a 7th-gen Kaby Lake CPU with a unique AMD Radeon RX Vega GPU. But what began with a bang ended Monday with an unceremonious memo. From a report: The Product Change Notification published by Intel on Monday confirmed that pretty much every single Kaby Lake G, including the Core i7-8706G, the Core i7-8705G, and the Core i5-8305G, would be discontinued. Last call for orders will be on January 17, 2020, and the final shipments are scheduled for July 31, 2020.

While the end of life of a processor isn't typically a big deal to the consumers who own one, one sticking point could have been driver support. Specifically, Kaby Lake G drivers for the custom AMD Radeon RX Vega M graphics come only from Intel. With a normal discrete GPU, the consumer would download drivers from the original company, such as Nvidia or AMD. With Kaby Lake G kaput, where does that leave Kaby Lake G owners?

Intel said the company will follow its standard policy and provide driver support for Kaby Lake G for five years from the launch of the product. All told, that probably means another 3.5 years of driver updates.
Graphics

You Can Now Overclock a Raspberry Pi 4 For Some Nice Performance Gains (hothardware.com) 93

MojoKid writes: The Raspberry Pi 4 is one of the cheapest single-board computers around. The new 4th generation is a solid performance lift over its predecessor and good bang for the buck if you're interested in learning Linux, working with embedded computing, or just want to kick back and play some retro games on an emulator. In addition, the latest version of the Raspberry Pi Foundation's Linux distribution, Raspbian Buster, comes with a new firmware revision for the tiny DIY PC that removes its 2GHz clock speed limit and allows voltage adjustments to wring out additional performance, with proper cooling, of course. In testing, while there are no guarantees in overclocking, HotHardware was able to realize more than a 40% lift in their Raspberry Pi 4's processor clock speed, and a 50% boost to the GPU with an air-cooled mini case kit. Combined, they're not enough to turn the Pi 4 into your everyday PC, but the performance gains are measurable and valuable. All it takes is a quick firmware update and a couple of Linux commands to dial things in.
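For the curious, the dial-things-in step amounts to a few lines in the Pi's boot configuration. A minimal sketch, assuming adequate cooling and freshly updated firmware -- the exact stable values vary from board to board, so treat these numbers as illustrative, not guaranteed:

```
# /boot/config.txt -- illustrative Raspberry Pi 4 overclock
over_voltage=6     # raise the SoC core voltage to sustain higher clocks
arm_freq=2100      # CPU at 2.1GHz, roughly a 40% lift over the stock 1.5GHz
gpu_freq=750       # GPU at 750MHz, a 50% boost over the stock 500MHz
```

On Raspbian Buster, a standard `sudo apt update && sudo apt full-upgrade` pulls in the firmware that lifts the old 2GHz ceiling; after a reboot, `vcgencmd measure_clock arm` and `vcgencmd measure_temp` confirm the new clock and let you keep an eye on thermals.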
Graphics

Adobe Is Deactivating All Venezuelan Accounts To Comply With US Sanctions (theverge.com) 262

An anonymous reader quotes a report from The Verge: Adobe is shutting down service for users in Venezuela in order to comply with a U.S. executive order issued in August that prohibits trade with the country. The company sent out an email to customers in Venezuela today to let them know their accounts would be deactivated, and posted a support document further explaining the decision. In the document, Adobe explains: "The U.S. Government issued Executive Order 13884, the practical effect of which is to prohibit almost all transactions and services between U.S. companies, entities, and individuals in Venezuela. To remain compliant with this order, Adobe is deactivating all accounts in Venezuela."

Users will have until October 28th to download any content stored in their accounts, and will lose access the next day. To make matters worse, customers won't be able to receive refunds for any purchases or outstanding subscriptions, as Adobe says that the executive order calls for "the cessation of all activity with the entities including no sales, service, support, refunds, credits, etc."

Businesses

Intel Announces Price Cut for 9th Generation F and KF Processors (anandtech.com) 30

An anonymous reader shares a report: One of the interesting developments of Intel's 9th Generation Core processors for desktops, known as the S-series, was that the company decided to release versions of the hardware with the graphics disabled in order to use every chip from the wafer. At the time Intel was criticised on its pricing: it was offering the same processor minus graphics for the same bulk unit cost, with no discount. Today Intel is adjusting its strategy, and pricing these F and KF processors lower than before. Nearly every 9th Generation Core processor for the desktop has a corresponding graphics-free option: the Core i9-9900K has its Core i9-9900KF, the Core i5-9500 has a Core i5-9500F. The difference between these two parts is just a matter of disabled graphics, which means the user can't take advantage of Intel's QuickSync or drive a display; however, most of these processors end up in systems with discrete graphics cards anyway. At the time of launch, Intel priced them identically to the parts that did have graphics, but ultimately retail outlets were selling the F and KF processors at a small discount. Intel's announcement today makes that price difference official.
Microsoft

Microsoft Will Model the Entire Planet For 'Breathtakingly Lifelike' New Flight Simulator (eaa.org) 84

A senior editor at the Experimental Aircraft Association recounts the long and storied history of Microsoft's Flight Simulator, remembering how he'd used version 1.0 of the product "when I was about 12 years old (nearly 40 years ago)" before working on it as a Microsoft employee for more than 10 years, until it was cancelled in 2009. But Microsoft now plans to release a stunningly realistic new version for the PC and Xbox in 2020.

Long-time Slashdot reader ShoulderOfOrion shared their report: After the shutdown, variations of the product lived on here and there, including the enterprise edition, which Lockheed Martin now develops and publishes as Prepar3D, and a version that was licensed by Dovetail Games in the United Kingdom and sold on the Steam marketplace. Dovetail pursued further development with a product called Flight Sim World, and Microsoft itself briefly returned to the genre in 2012 with a limited product called Flight. But it was the community of hardcore simmers and add-on developers who truly kept the product alive for the past 10 years.
The essay describes the new version as "stunning" and "breathtakingly lifelike," using 2 petabytes of data to virtually model the entire planet, "including something like 40,000 airports... The scenery is built on Bing satellite and aerial imagery, augmented with cool buzzwordy stuff like photogrammetric 3D modeling and multiple other data sources, all of which is streamed via Microsoft's Azure cloud service... Throw in 1.5 trillion trees, individual blades of grass modeled in 3D, and a complete overhaul of lighting and shadows, and the result is an unprecedented level of detail for a flight simulator of any kind."

The simulator also features realistic modelling of the weather, including temperature, air pressure, humidity, dew point, wind direction and speed, and of course, clouds and precipitation. "You'll even see rainbows when conditions are just right... Weather is automatically downloaded from real-world sources, creating accurate conditions that change over time." (Though there's a drop-down menu that finally lets you do something about the weather.) And that's just the beginning...

Microsoft is incorporating a legacy mode that it expects will provide near-complete backward compatibility, so those of us who have huge libraries of old favorites won't be starting entirely from scratch. In addition, Microsoft is committed to providing a software development kit (SDK) with the product at launch that will give developers the tools they need to build add-ons, though they caution that it is something that will be polished and expanded through post-launch updates. In other news for add-on aircraft builders, every parameter is now exposed in plain text, with no more binaries. This means it's going to be easier than ever to create high-quality add-on aircraft, or to tinker with the ones you already have. For those who like emulating glass cockpits, those displays are fully programmable based on straightforward coding instead of a library of animations, and support things like touch screens and synthetic vision. While the team is currently evaluating something like an in-sim store for supplemental content, there will be no requirements to use it, and no restrictions of any kind on downloading freeware or payware add-ons from other sources.
The article includes some fond thoughts from the software's director of technology Jorg Neumann explaining the simulator's significance. "It is in the fiber of the company's being. It is older than Windows.

"I think there is a pride that comes with it, and I think seeing it come back in a meaningful way, I think makes lots of people proud."
Microsoft

Microsoft Unveils Surface Pro 7 and Surface Pro X (venturebeat.com) 41

At an event today, where Microsoft announced the Surface Laptop 3, Windows 10X, and an Android smartphone, the company also unveiled refreshed editions of its laptop-tablet hybrids: the Surface Pro 7 and the Surface Pro X.

About the Surface Pro 7, which features a USB-C port: The price tag has changed slightly: the Surface Pro 7 starts at $749 ($150 less than its predecessor). It's available for preorder today and ships on October 22. Microsoft has simply replaced the Mini DisplayPort with USB-C. There is still a USB-A port for all your existing accessories. Adding a USB-C port finally puts the Surface Pro on par with the Surface Book 2 of two years ago and last year's Surface Go. Surface fans have long asked for USB-C ports and Microsoft has been very slowly delivering. The Surface Pro 7 comes with 10th-generation Intel Core processors (configurable all the way up to quad-core) and starts at 128GB of SSD storage (configurable up to 1TB). Like its predecessor, the Surface Pro 7 comes with 4GB, 8GB, or 16GB of RAM. Otherwise, the design is largely unchanged. The Surface Pro 7 still has a 12.3-inch display, 2736 x 1824 resolution, and 267ppi. The Surface Pro 6 was available in black and silver, and so is the Surface Pro 7.

About the Surface Pro X: The Seattle tech giant unveiled the Surface Pro X, the spiritual successor to the Surface, the Surface 2, the Surface 3, and the Surface Go. It's ultra-slim and lightweight, with a bezel-to-bezel 13-inch display and an adjustable kickstand. And it's the first machine to ship with a custom-designed, ARM-based Microsoft SQ1 system-on-chip co-engineered with Qualcomm. The Surface Pro X will be available on November 5, starting at $999, and Microsoft will begin taking preorders today.

On the display front, you're looking at a PixelSense panel with 2880 x 1920 resolution, a 267-pixel-per-inch screen density, and a 1400:1 contrast ratio. Microsoft says it has the thinnest bezels of any 2-in-1. Under the hood, the Surface Pro X sports the aforementioned 7-nanometer SQ1, which Microsoft says delivers more performance per watt than the chip in the Surface Pro 6. It's an octa-core processor with Qualcomm-designed Kryo cores clocked at 3GHz and running at 7 watts maximum, sitting alongside a redesigned GPU and an integrated AI accelerator. Altogether, it delivers 9 teraflops of computational power, with the graphics chip alone pushing 2.1 teraflops.

Facebook

Facebook To Create Virtual Reality Social Media World Called Horizon (bbc.com) 58

dryriver shares a report from the BBC: Facebook is creating an immersive environment called Horizon to tempt people into spending more time in virtual reality. The VR app will be a mix of social places where users can mingle and chat, and other areas where they can play games against each other. People will inhabit and explore the virtual spaces via a cartoon avatar. The app will be made available and tested in early 2020 by a small group of Facebook users. Details about Horizon and early footage of the virtual space were shown off at Facebook's Oculus Connect 6 developer conference this week. Facebook said anyone using Horizon would be able to call on human "guides" to help them navigate and become more familiar with the virtual environment. The guides will not be "moderators" who will police behavior, said Facebook. It added that it would include tools that let people manage how they interact with other users. It will also have options that let people shape and build their own part of the environment. They will also be able to design their own avatars. The entire space has been given a cartoon-like feel as it is intended to be used on Facebook's Oculus Quest headset, which does not have the high resolution graphics of PC-linked headsets.

Sam Machkovech, an Ars Technica reporter who has tried Horizon, said Facebook had put "a ton of work" into the version he saw, to make it as welcoming as possible. But he noted that Horizon was "yet another" combination of apps, chat and avatars that Facebook had produced in just a few years. He suggested that the company was still searching for a combination tempting enough to win users over. "We're still waiting for Facebook to inspire confidence that it will launch a social-VR app and stick with it for more than two years," he wrote. Anyone interested in joining Horizon can sign up to be an early tester.
You can watch the strange pre-rendered CGI YouTube ad for Facebook Horizon here.
Graphics

Ask Slashdot: Why Doesn't the Internet In 2019 Use More Interactive 3D? 153

dryriver writes: For the benefit of those who are not much into 3D technologies: as far back as the year 2000 and even earlier, there was excitement about "Web3D" -- interactive 3D content embedded in HTML webpages, using technologies like VRML and ShockWave 3D. 2D vector-based Flash and Flash animation were a big deal back then. Very popular with internet users. The more powerful but less installed ShockWave browser plugin -- also made by Macromedia -- got a fairly capable DirectX 7/OpenGL-based realtime 3D engine, developed by Intel Labs around 2001, that could put 3D games, 3D product configurators and VR-style building/environment walkthroughs into an HTML page, and also go full-screen on demand. There were significant problems on the hardware side -- 20 years ago, not every PC or Mac connected to the internet had a decently capable 3D GPU by a long shot. But the 3D software technology was there, it was promising even then, and somehow it died -- ShockWave 3D was neglected and killed off by Adobe shortly after they bought Macromedia, and VRML died pretty much on its own.

Now we are in 2019. Mobile devices like smartphones and tablets, PCs/Macs as well as Game Consoles have powerful 3D GPUs in them that could render great interactive 3D experiences in a web browser. The hardware is there, but 99% of the internet today is in flat 2D. Why is this? Why do tens of millions of gamers spend hours in 3D game worlds every day, and even the websites that cater to this "3D loving" demographic use nothing but text, 2D JPEGs and 2D YouTube videos on their webpages? Everything 3D -- 3D software, 3D hardware, 3D programming and scripting languages -- is far more evolved than it was around 2000. And yet there appears to be very little interest in putting interactive 3D anything into webpages. What causes this? Do people want to go into the 2020s with a 2D-based internet? Is the future of the internet text, 2D images, and streaming 2D videos?
Iphone

Apple Launches iPhone 11, iPhone 11 Pro, and iPhone 11 Pro Max (theverge.com) 91

Apple today unveiled the iPhone 11, iPhone 11 Pro, and iPhone 11 Pro Max, its new smartphone lineup. While the iPhone 11 is the cheaper alternative, following on from the iPhone XR, the iPhone 11 Pro and Pro Max bring a few design changes, like a "surgical-grade stainless steel" case and matte finish, and focus on cramming in as much power as possible.

About the iPhone 11: Like last year's model, the iPhone 11 includes a 6.1-inch display, and the design is almost identical to last year's, too, with the notch at the front for the Face ID camera. Apple is adding new color options, with purple, white, green, yellow, black, and red all available. Apple's biggest design changes are in the camera at the rear of the device. Last year's iPhone XR had a single 12-megapixel wide-angle camera, but the iPhone 11 now includes a dual-camera system with an additional 12-megapixel ultra-wide camera that enables 2x optical zoom out. There's even a new immersive camera interface that lets you see outside the frame, so you can see the details of the photos you're taking with the ultra-wide camera. [...] Inside the iPhone 11 is Apple's latest A13 Bionic processor, and naturally it's the "fastest CPU in a smartphone" and also the "fastest GPU in a smartphone." Apple demonstrated the performance on stage with a game called Pascal's Wager, which is launching on the App Store next month with some pretty impressive-looking mobile graphics. Other than the gaming demo, Apple didn't reveal any additional performance improvements with the A13. The iPhone 11 starts at $699.

The 5.8-inch iPhone 11 Pro and 6.5-inch iPhone 11 Pro Max: Despite the number change, the two phones look pretty similar to last year's iPhone XS and iPhone XS Max, but with one major change: a third rear camera. Apple's also upgraded the display to a new OLED panel, which goes up to an even brighter 1,200 nits, has a 2 million to 1 contrast ratio, and is 15 percent more energy efficient. Apple calls it a Super Retina XDR display (similar branding to the Pro Display XDR that the company announced earlier this year). Apple also claims that the glass here is the "toughest glass in a smartphone." Just like the standard iPhone 11, the new iPhone 11 Pro models feature Apple's A13 Bionic chip, which Apple says has both the fastest CPU and GPU ever in a smartphone. Apple also touted improved machine learning performance ("the best machine learning platform in a smartphone," it says).

Apple says that with all the improvements to efficiency, the 5.8-inch iPhone 11 Pro should get up to four hours better battery life than last year's XS, and the larger iPhone 11 Pro Max will get up to five hours better battery life than the XS Max. The new camera system is one of the standout upgrades (quite literally, as it dominates the back of the phone in a gigantic square camera module). The new lens is a 12-megapixel ultra-wide lens with a 120-degree field of view, joining the wide-angle and telephoto cameras Apple has offered in the past. The telephoto camera is also getting an upgrade with a larger ƒ/2.0 aperture, which Apple says will capture up to 40 percent more light compared to the XS camera. And like the iPhone 11, the front-facing camera is now a 12-megapixel sensor, and can shoot both 4K and slow-motion videos.
The iPhone 11 Pro will start at $999, and the iPhone 11 Pro Max will start at $1199.
Supercomputing

University of Texas Announces Fastest Academic Supercomputer In the World (utexas.edu) 31

On Tuesday the University of Texas at Austin launched the fastest supercomputer at any academic facility in the world.

The computer -- named "Frontera" -- is also the fifth most powerful supercomputer on Earth. Slashdot reader aarondubrow quotes their announcement: The Texas Advanced Computing Center (TACC) at The University of Texas at Austin is also home to Stampede2, the second-fastest supercomputer at any American university. The launch of Frontera solidifies UT Austin among the world's academic leaders in this realm...

Joined by representatives from the National Science Foundation (NSF) -- which funded the system with a $60 million award -- UT Austin, and technology partners Dell Technologies, Intel, Mellanox Technologies, DataDirect Networks, NVIDIA, IBM, CoolIT and Green Revolution Cooling, TACC inaugurated a new era of academic supercomputing with a resource that will help the nation's top researchers explore science at the largest scale and make the next generation of discoveries.

"Scientific challenges demand computing and data at the largest and most complex scales possible. That's what Frontera is all about," said Jim Kurose, assistant director for Computer and Information Science and Engineering at NSF. "Frontera's leadership-class computing capability will support the most computationally challenging science applications that U.S. scientists are working on today."

Frontera has been supporting science applications since June and has already enabled more than three dozen teams to conduct research on a range of topics from black hole physics to climate modeling to drug design, employing simulation, data analysis, and artificial intelligence at a scale not previously possible.

Here are more technical details from the announcement about just how fast this supercomputer really is.
Amiga

Ask Slashdot: What Would Computing Look Like Today If the Amiga Had Survived? 221

dryriver writes: The Amiga was a remarkable machine at the time it was released -- 1985. It had a multitasking-capable GUI-driven OS and a mouse. It had a number of cleverly designed custom chips that gave the Amiga amazing graphics and sound capabilities far beyond the typical IBM/DOS PCs of its time. The Amiga was the multimedia beast of its time -- you could create animated and still 2D or 3D graphics on it, compose sophisticated electronic music, develop 2D or 3D 16-bit games, edit and process digital video (using Video Toaster), and of course, play some amazing games. And after the Amiga -- as well as the Atari ST, Archimedes and so on -- died, everybody pretty much had to migrate to either the PC or Mac platforms. If Commodore and the Amiga had survived and thrived, there might have been four major desktop platforms in use today: Windows, OSX, AmigaOS and Linux. And who knows what the custom chips (ASICs? FPGAs?) of an Amiga in 2019 might have been capable of -- the Amiga could possibly have been the platform that makes nearly life-like games and VR/AR a reality, and given Nvidia and AMD's GPUs a run for their money.

What do you think the computing landscape in 2019 would have looked like if the Amiga and AmigaOS as a platform had survived? Would Macs be as popular with digital content creators as they are today? Would AAA games target Windows 7/8/10 by default or tilt more towards the Amiga? Could there have been an Amiga hardware-based game console? Might AmigaOS and Linux have had a symbiotic existence of sorts, with AmigaOS co-existing with Linux on many enthusiasts' Amigas, or even becoming compatible with each other over time?
AMD

New Stats Suggest Strong Sales For AMD (techspot.com) 32

Windows Central reports: AMD surpassed NVIDIA when it comes to total GPU shipments, according to new data from Jon Peddie Research (via Tom's Hardware). This is the first time that AMD has ranked above NVIDIA in total GPU shipments since Q3 of 2014. AMD now has a 17.2 percent market share compared to NVIDIA's 16 percent, according to the most recent data. Jon Peddie Research also reports that "AMD's overall unit shipments increased 9.85% quarter-to-quarter."

AMD gained 2.4 percentage points of market share over the last year while NVIDIA lost 1 point. Much of AMD's growth came in the last quarter, in which AMD's share grew by 1.5 points compared to NVIDIA's 0.1.

The Motley Fool points out that "NVIDIA doesn't sell CPUs, so this comparison isn't apples-to-apples."

But meanwhile, TechSpot reports: German hardware retailer Mindfactory has published their CPU sales and revenue figures, and they show that for the past year AMD had sold slightly more units than Intel -- until Ryzen 3000 arrived. When the new hardware launched in July, AMD's sales volume doubled and their revenue tripled, going from 68% to 79% volume market share and 52% to 75% revenue share -- this is for a single major PC hardware retailer in Germany -- but the breakdown is very interesting to watch nonetheless...

Full disclaimer: German markets have historically been more biased towards Ryzen than American ones, and AMD's sales will fall a bit before stabilizing, while Intel's appear to have already plateaued.

Businesses

Ask Slashdot: Who Are the 'Steve Wozniaks' of the 21st Century? 155

dryriver writes: There are some computer engineers -- working in software or hardware, or both -- who were true pioneers. Steve Wozniak needs no introduction. Neither do Alan Turing, Ada Lovelace or Charles Babbage. Gordon Moore and Robert Noyce started Intel decades ago. John Carmack of Doom is a legend in realtime 3D graphics coding. Aleksey Pajitnov created Tetris. Akihiro Yokoi and Aki Maita invented the Tamagotchi. Jaron Lanier is the father of VR. Palmer Luckey hacked together the first Oculus Rift VR headset in his parents' garage in 2011. To the question: Who in your opinion are the 21st Century "Steve Wozniaks," working in either hardware or software, or both?
Science

Graphics That Seem Clear Can Easily Be Misread (scientificamerican.com) 54

An anonymous reader shares a report: "A picture is worth a thousand words." That saying leads us to believe that we can readily interpret a chart correctly. But charts are visual arguments, and they are easy to misunderstand if we do not pay close attention. Alberto Cairo, chair of visual journalism at the University of Miami, reveals pitfalls in an example diagrammed here. Learning how to better read graphics can help us navigate a world in which truth may be hidden or twisted. Say that you are obese, and you've grown tired of family, friends and your doctor telling you that obesity may increase your risk for diabetes, heart disease, even cancer -- all of which could shorten your life. One day you see this chart. Suddenly you feel better because it shows that, in general, the more obese people a country has (right side of chart), the higher the life expectancy (top of chart). Therefore, obese people must live longer, you think. After all, the correlation (red line) is quite strong.

The chart itself is not incorrect. But it doesn't really show that the more obese people are, the longer they live. A more thorough description would be: "At the national level -- country by country -- there is a positive association between obesity rates and life expectancy at birth, and vice versa." Still, this does not mean that a positive association will hold at the local or individual level or that there is a causal link. Two fallacies are involved. First, a pattern in aggregated data can disappear or even reverse once you explore the numbers at different levels of detail. If the countries are split by income levels, the strong positive correlation becomes much weaker as income rises. In the highest-income nations (chart on bottom right), the association is negative (higher obesity rates mean lower life expectancy). The pattern remains negative when you look at the U.S., state by state: life expectancy at birth drops as obesity rises. Yet this hides the second fallacy: the negative association can be affected by many other factors. Exercise and access to health care, for example, are associated with life expectancy. So is income.
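Cairo's first fallacy is easy to reproduce with toy numbers. Below is a minimal sketch -- synthetic data invented for illustration, not Cairo's or any real country's figures -- in which three "income level" groups each show a negative obesity-longevity association, yet pooling them yields a strong positive correlation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Three synthetic "income levels": richer groups have both higher
# obesity baselines and higher life-expectancy baselines.
groups = []
for obesity_base, life_base in [(10, 65), (20, 72), (30, 79)]:
    obesity = obesity_base + rng.uniform(0, 5, 50)
    # Within each group, higher obesity means LOWER life expectancy.
    life = life_base - 0.8 * (obesity - obesity_base) + rng.normal(0, 0.5, 50)
    groups.append((obesity, life))

pooled_x = np.concatenate([g[0] for g in groups])
pooled_y = np.concatenate([g[1] for g in groups])
print(f"pooled correlation: {np.corrcoef(pooled_x, pooled_y)[0, 1]:+.2f}")
for i, (x, y) in enumerate(groups):
    print(f"group {i} correlation: {np.corrcoef(x, y)[0, 1]:+.2f}")
```

The pooled figure comes out strongly positive while every within-group figure is strongly negative -- the same reversal that makes the obesity chart so easy to misread.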

Intel

Intel's Line of Notebook CPUs Gets More Confusing With 14nm Comet Lake (arstechnica.com) 62

Intel today launched a new series of 14nm notebook CPUs code-named Comet Lake. Going by Intel's numbers, Comet Lake looks like a competent upgrade to its predecessor Whiskey Lake. The interesting question -- and one largely left unanswered by Intel -- is why the company has decided to launch a new line of 14nm notebook CPUs less than a month after launching Ice Lake, its first 10nm notebook CPUs. From a report: Both the Comet Lake and Ice Lake notebook CPU lines this month consist of a full range of i3, i5, and i7 mobile CPUs in both high-power (U-series) and low-power (Y-series) variants. This adds up to a total of 19 Intel notebook CPU models released in August, and we expect to see a lot of follow-on confusion. During the briefing call, Intel executives did not want to respond to questions about differentiation between the Comet Lake and Ice Lake lines based on either performance or price, but the technical specs lead us to believe that Ice Lake is likely the far more attractive product line for most users.

Intel's U-series CPUs for both Comet Lake and Ice Lake operate at a nominal 15W TDP. Both lines also support a "Config Up" 25W TDP, which can be enabled by OEMs who choose to provide the cooling and battery resources necessary to support it. Things get more interesting for the lower-powered Y-series -- Ice Lake offers 9W/12W configurable TDP, but Comet Lake undercuts that to 7W/9W. This is already a significant drop in power budget, which Comet Lake takes even further by offering a new Config Down TDP, which is either 4.5W or 5.5W, depending on which model you're looking at. Comet Lake's biggest and meanest i7, the i7-10710U, sports 6 cores and 12 threads at a slightly higher boost clock rate than Ice Lake's 4C/8T i7-1068G7. However, the Comet Lake parts are still using the older UHD graphics chipset -- they don't get access to Ice Lake's shiny new Iris+, which offers up to triple the onboard graphics performance. This sharply limits the appeal of the Comet Lake i7 CPUs in any OEM design that doesn't include a separate Nvidia or Radeon GPU -- which would in turn bump the real-world power consumption and heat generation of such a system significantly.

The Internet

The Truth About Faster Internet: It's Not Worth It (wsj.com) 253

Americans are spending ever more for blazing internet speeds, on the promise that faster is better. Is that really the case? For most people, the answer is no. From a report: The Wall Street Journal studied the internet use of 53 of our journalists across the country, over a period of months, in coordination with researchers at Princeton University and the University of Chicago. Our panelists used only a fraction of their available bandwidth to watch streaming services including Netflix, Amazon Prime Video and YouTube, even simultaneously. Quality didn't improve much with higher speeds. Picture clarity was about the same. Videos didn't launch quicker. Broadband providers such as Comcast, Charter and AT&T are marketing speeds in the range of 250, 500 or even 1,000 megabits a second, often promising that streaming-video bingers will benefit. "Fast speeds for all of your shows," declares one online ad from Comcast. But for a typical household, the benefits of paying for more than 100 megabits a second are marginal at best, according to the researchers. That means many households are paying a premium for services they don't need.

To gauge how much bandwidth, or speed capacity, households need, it helps to look at an extreme scenario. Our users spent an evening streaming up to seven services simultaneously, including on-demand services like Netflix and live-TV services like Sling TV. We monitored the results. Peter Loftus, one of our panelists, lives outside Philadelphia and is a Comcast customer with a speed package of 150 megabits a second. Peter's median usage over 35 viewing minutes was 6.9 Mbps, 5% of the capacity he pays for. For the portion when all seven of his streams were going at once, he averaged 8.1 Mbps. At one point, for one second, Peter reached 65% of his capacity. Did his video launch faster or play more smoothly? Not really. The researchers said that to the extent there were differences in video quality such as picture resolution or the time it took to launch a show, they were marginal.
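The finding squares with simple arithmetic. Using published per-stream rates rather than the Journal's measurements (Netflix, for example, suggests roughly 5 Mbps for HD and 25 Mbps for 4K), even the extreme seven-stream scenario leaves most of a 150 Mbps plan idle:

```python
# Back-of-envelope peak demand for a heavy streaming household.
# Per-stream rates are typical published recommendations, not measurements.
hd_mbps, uhd_mbps = 5, 25
plan_mbps = 150

seven_hd = 7 * hd_mbps
print(f"7 HD streams: {seven_hd} Mbps = {seven_hd / plan_mbps:.0%} of the plan")

two_uhd = 2 * uhd_mbps
print(f"2 4K streams: {two_uhd} Mbps = {two_uhd / plan_mbps:.0%} of the plan")
```

Even that naive 35 Mbps estimate overshoots reality: adaptive players buffer ahead and compress aggressively, which is why Peter averaged just 8.1 Mbps with all seven streams going at once.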

AI

Cerebras Systems Unveils a Record 1.2 Trillion Transistor Chip For AI (venturebeat.com) 67

An anonymous reader quotes a report from VentureBeat: New artificial intelligence company Cerebras Systems is unveiling the largest semiconductor chip ever built. The Cerebras Wafer Scale Engine has 1.2 trillion transistors, the basic on-off electronic switches that are the building blocks of silicon chips. Intel's first processor, the 4004, had 2,300 transistors in 1971, and a recent Advanced Micro Devices processor has 32 billion transistors. Samsung has actually built a flash memory chip, the eUFS, with 2 trillion transistors. But the Cerebras chip is built for processing, and it boasts 400,000 cores on 42,225 square millimeters. It is 56.7 times larger than the largest Nvidia graphics processing unit, which measures 815 square millimeters and packs 21.1 billion transistors. The WSE also contains 3,000 times more high-speed, on-chip memory and has 10,000 times more memory bandwidth.
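One clarifying bit of arithmetic on those numbers: divide transistors by area and the WSE's density lands in the same ballpark as Nvidia's biggest GPU. The chip is record-setting because of its sheer area, not because it packs transistors more tightly:

```python
# Transistor densities implied by the article's own figures.
wse_density = 1.2e12 / 42_225   # Cerebras WSE: transistors per mm^2
gpu_density = 21.1e9 / 815      # largest Nvidia GPU: transistors per mm^2
print(f"Cerebras WSE:       {wse_density / 1e6:.1f}M transistors/mm^2")
print(f"Largest Nvidia GPU: {gpu_density / 1e6:.1f}M transistors/mm^2")
```

Both work out to the high 20-millions of transistors per square millimeter, consistent with the 56.7x area ratio tracking the roughly 57x transistor ratio.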
