Television

The BBC's 1992 TV Show About VR, 3D TVs With Glasses, and Holographic 3D Screens (youtu.be) 54

dryriver writes: 27 years ago, the BBC's "Tomorrow's World" show broadcast this little gem of a program [currently available on YouTube]. After showing old red-cyan anaglyph movies, Victorian stereoscopes, lenticular-printed holograms and a monochrome laser hologram projected into a sheet of glass, the presenter shows off a stereoscopic 3D CRT computer display with active shutter glasses. The program then takes us to a laboratory at the Massachusetts Institute of Technology, where a supercomputer is feeding 3D wireframe graphics into the world's first glasses-free holographic 3D display prototype, built around a tellurium dioxide crystal. One of the researchers at the lab predicts that "years from now, advances in LCD technology may make this kind of display cheap enough to use in the home."

A presenter then shows a bulky plastic VR headset larger than an Oculus Rift and explains how VR will let you experience completely computer-generated worlds as if you are there. The presenter notes that 1992 VR headsets may be "too bulky" for the average user, and shows a mockup of much smaller VR glasses about the size of Magic Leap's AR glasses, noting that "these are already in development." What is astonishing about watching this 27-year-old TV broadcast is a) the realization that much of today's stereo 3D tech was already around in some form in the early 1990s; b) that VR headsets took an incredibly long time to reach consumers and are still too bulky; and c) that almost three decades later, MIT's prototype glasses-free holographic 3D display technology never made its way into consumer hands or households.

Portables (Apple)

Walt Mossberg: Tim Cook's Apple Had a Great Decade But No New Blockbusters (theverge.com) 59

The veteran tech columnist, who retired two years ago, returns with one story to cap the end of the decade: Apple hasn't said how many Watches and AirPods it's sold, but they're widely believed to be the dominant players in each of their categories and, in the grand Apple tradition, the envy of competitors that scramble to ape them. Neither of these hardware successes has matched the impact or scale of Jobs' greatest hits. Even the iPad, despite annual unit sales that are sharply down from its heyday, generated almost as much revenue by itself in fiscal 2019 as the entire category of "wearables, home and accessories" where the Apple Watch and AirPods are slotted by Apple. [...] Cook does bear the responsibility for a series of actions that screwed up the Macintosh for years. The beloved mainstream MacBook Air was ignored for five years. At the other end of the scale, the Mac Pro, the mainstay of professional audio, graphics, and video producers, was first neglected, then reissued in 2013 in a way that put form so far ahead of function that it enraged its customer base. Some insiders think Cook allowed Ive's design team far too much power and that the balance Jobs was able to strike between the designers and the engineers was gone, at least until Ive left the company earlier this year.

The design-first culture that took root under Cook struck again with the MacBook Pro, yielding new laptops so thin their keyboards were awful, and featuring USB-C ports that required sleek Macs to be used with ugly dongles. Apple has only recently retreated to decent keyboards on the latest MacBook Pro, and it has issued a much more promising Mac Pro. But dongles are still a part of the Apple experience across its product lines. Cook's other success this decade was to nurture the iPhone along as smartphone sales first plateaued and then began to decline. The biggest change he made came in 2014, before the dip, when Apple introduced two new iPhone 6 models, which belatedly adopted the big screens that Android phones had pioneered. Sales took off like a rocket, and there's been a big iPhone option every year since.

Graphics

Qualcomm To Offer GPU Driver Updates On Google Play Store For Some Snapdragon Chips (hothardware.com) 8

MojoKid writes: At its Snapdragon Summit in Maui, Hawaii this week, Qualcomm unveiled the new Snapdragon 865 Mobile Platform, which will power next year's flagship 5G Android phones with more performance, a stronger Tensor-based AI processor, and a very interesting forthcoming feature not yet offered on any smartphone platform to date. The company announced that it will eventually start delivering driver updates for the Adreno GPU on board the Snapdragon 865 as downloadable packages via the Google Play Store. This is big news for smartphones, as GPU drivers are rarely updated out of band, if ever, and typically have to wait for the next major Android release. Even then, many OEMs don't bother putting in the effort to ensure that mobile GPUs are running the most current graphics drivers from Qualcomm. The process, which would have to be pre-qualified by major OEMs as well, will be akin to the out-of-band driver updates the PC 3D graphics ecosystem has long benefited from for maximum performance and compatibility. Unfortunately, driver update support is currently limited to the Adreno 650 GPU on board the new Snapdragon 865. Here's hoping the program is met with success and that Qualcomm will begin to enable the feature for legacy and new midrange Snapdragon platforms as well.

Graphics

Ask Slashdot: How Much Faster Is an ASIC Than a Programmable GPU? 63

dryriver writes: When you run a real-time video processing algorithm on a GPU, you notice that some math functions execute very quickly on the GPU and some math functions take up a lot more processing time or cycles, slowing down the algorithm. If you were to implement that exact GPU algorithm as a dedicated ASIC hardware chip or perhaps on a beefy FPGA, what kind of speedup -- if any -- could you expect over a midrange GPU like a GTX 1070? Would hardwiring the same math operations as ASIC circuitry lead to a massive execution time speedup as some people claim -- e.g. 5x or 10x faster than a general purpose Nvidia GPU -- or are GPUs and ASICs close to each other in execution speed?

Bonus question: Is there a way to calculate the speed of an algorithm implemented as an ASIC chip without having an actual physical ASIC chip produced? Could you port the algorithm to, say, Verilog or similar languages and then use a software tool to calculate or predict how fast it would run if implemented as an ASIC with certain properties (clock speed, core count, manufacturing process... )?
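
On the bonus question, a rough sanity check is possible long before synthesis: treat both devices as collections of pipelines and compare theoretical result throughput. Below is a toy Python model of that reasoning; every constant is an illustrative placeholder, not a measurement of a GTX 1070 or of any real ASIC, so only the shape of the comparison matters.

    # Toy throughput model -- all constants are illustrative assumptions,
    # not measurements of any real chip.
    GPU_CLOCK_HZ      = 1.7e9   # rough boost-clock ballpark for a midrange GPU
    GPU_CORES         = 1920    # CUDA cores on a GTX 1070
    CYCLES_PER_RESULT = 40      # cycles a slow math function costs per output
    GPU_UTILIZATION   = 0.6     # memory stalls, divergence, scheduling overhead

    ASIC_CLOCK_HZ     = 0.8e9   # dedicated logic often clocks lower than a GPU
    ASIC_PIPELINES    = 256     # parallel hard-wired datapaths (a design choice)
    ASIC_UTILIZATION  = 0.95    # a fixed-function pipeline rarely stalls

    # A fully pipelined ASIC datapath retires one result per cycle no matter
    # how many operations the function chains together; a GPU core spends
    # CYCLES_PER_RESULT cycles computing the same result in software.
    gpu_rps  = GPU_CLOCK_HZ * GPU_CORES * GPU_UTILIZATION / CYCLES_PER_RESULT
    asic_rps = ASIC_CLOCK_HZ * ASIC_PIPELINES * ASIC_UTILIZATION

    print(f"GPU : {gpu_rps:.2e} results/sec")
    print(f"ASIC: {asic_rps:.2e} results/sec")
    print(f"ASIC speedup: {asic_rps / gpu_rps:.1f}x")

With these made-up numbers the ASIC comes out roughly 4x ahead, and the sensitivity is obvious: the win grows with the cycle cost of the function being hard-wired and shrinks if the GPU keeps its cores busy. The conventional route to a real answer is the one the question suggests: write the design in Verilog or VHDL, simulate it, and let a synthesis tool report the achievable clock frequency and area for a target process.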

Cloud

Google Addresses Complaints of Sub-4K Image Quality On Stadia (arstechnica.com) 44

An anonymous reader quotes a report from Ars Technica: Since March, Google has been promising that its streaming Stadia platform would be capable of full 4K, 60fps gameplay (for users with a robust Internet connection and $10/month Stadia Pro subscription). But technical analyses since launch have shown that some of the service's highest profile games aren't hitting that mark. A Digital Foundry analysis of Red Dead Redemption 2 on Stadia, for instance, found that the game actually runs at a native 2560x1440 resolution, which is then upscaled to the 4K standard of 4096x2160 via the Chromecast Ultra. And a Bungie representative said that the Stadia version of Destiny 2 runs at the PC equivalent of "medium" graphics settings and that the game will "render at a native 1080p and then upsample [to 4K] and apply a variety of techniques to increase the overall quality of effect."
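
The gap those findings describe is easy to put in pixel terms. A quick Python check, using the resolutions cited above:

    # Pixel counts for the resolutions cited in the article.
    full_4k = 4096 * 2160
    for name, (w, h) in {
        "4K target":        (4096, 2160),
        "RDR2 native":      (2560, 1440),
        "Destiny 2 native": (1920, 1080),
    }.items():
        pixels = w * h
        print(f"{name:>16}: {pixels:>9,} pixels ({pixels / full_4k:.0%} of 4K)")

So Red Dead Redemption 2 natively renders about 42% of the pixels in the 4K frame it is upscaled to, and Destiny 2 well under a quarter.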

Over the weekend, Google issued a statement to 9to5Google that essentially places the blame for this situation on Stadia developers themselves (emphasis added): "Stadia streams at 4K and 60fps -- and that includes all aspects of our graphics pipeline from game to screen: GPU, encoder, and Chromecast Ultra all outputting at 4K to 4K TVs, with the appropriate Internet connection. Developers making Stadia games work hard to deliver the best streaming experience for every game. Like you see on all platforms, this includes a variety of techniques to achieve the best overall quality. We give developers the freedom of how to achieve the best image quality and frame rate on Stadia, and we are impressed with what they have been able to achieve for day one. We expect that many developers can, and in most cases will, continue to improve their games on Stadia. And because Stadia lives in our data centers, developers are able to innovate quickly while delivering even better experiences directly to you without the need for game patches or downloads."

Transportation

Analysts, Gamers, and Blade Runner's Artistic Director React To The Look of Tesla's Cybertruck (businessinsider.com) 293

Syd Mead, the artistic director on Blade Runner, says Tesla's new Cybertruck "has completely changed the vocabulary of the personal truck market design."

Or, for another perspective, "Tesla's Cybertruck looks weird... like, really weird," wrote Toni Sacconaghi, a senior equity research analyst at the global asset management firm AllianceBernstein. "Add a little bit of dirt, and you could even say it gives off a retro-future vibe a la Mad Max."

That's from a Market Insider article citing Wall Street analysts who, it says, "aren't buying the futuristic design of Tesla's new electric pickup truck." Dan Levy of Credit Suisse, for example, wrote that "amid the radical design for Cybertruck, it's somewhat unclear to us who the core buyer will be." "We do not see this vehicle in its current form being a success," Jeffrey Osborne of Cowen wrote in a note on Friday, adding that he doesn't see the Tesla brand or the Cybertruck design "resonating with existing pickup truck owners...."

Still, the Cybertruck's design wasn't unanimously disliked by Wall Street. The design "will be a hit with the company's fanatic EV installed base globally as Musk & Co. are clearly thinking way out of the box on this model design," Dan Ives of Wedbush wrote in a Friday note....

[And] "While styling will always be subjective, we believe the unique and futuristic design will resonate with consumers, leading to solid demand," Jed Dorsheimer of Canaccord Genuity wrote in a Friday note.

The article also quotes Toni Sacconaghi of Bernstein, who calls the design "really futuristic, like cyberpunk Blade Runner" and says that "is too bad, because its on-paper specs are insane."

But IGN reports there's another group commenting enthusiastically on the Cybertruck's looks: gamers. Unlike anything else we've seen from Musk's line of vehicles before, the Tesla truck resembles something you'd see in an old video game set in the future or a sci-fi flick from the late '90s to the early 2000s.

Of course, gamers all over the internet couldn't help sharing images, making memes, and drawing comparisons to look-alikes we've seen in games, TV shows, and movies... According to the internet, the Tesla Cybertruck either hasn't finished rendering yet or is made of some very dated graphics. Either way, it takes us back to the days when we got to experience the famous low-poly Lara Croft.

Open Source

System76 Will Start Designing and Building Its Own Linux Laptops Beginning January 2020 (forbes.com) 24

An anonymous reader quotes a report from Forbes: Denver-based PC manufacturer and Pop!_OS Linux developer System76 plans to follow up its custom Thelio desktop PC with an in-house laptop beginning next year, according to founder and CEO Carl Richell. During a recent interview, Richell was quick to emphasize that the entire process of designing, prototyping and iterating the final product could take two to three years. But the company is eager to break into this market and put the same signature "stamp" on its laptop hardware that graces its custom-built Thelio desktop.

System76 sells an extensive lineup of laptops, but the machines are designed by the likes of Sager and Clevo. The company doesn't merely buy a chassis and slap Pop!_OS on it, but Richell tells me he's confident that with the experience gained from developing Thelio -- and the recent investment into a factory at the company's Denver headquarters -- System76 is capable of building a laptop from the ground up that meets market needs and carries a unique value proposition. Richell says the company's first priority is locking down the aesthetic of the laptop and how various materials look and feel. It will simultaneously begin working on the supply chain aspects and speaking with various display and component manufacturers. System76 will design and build a U-class laptop first (basically an Ultrabook form factor like the existing Darter and Galago) and then evaluate what it might do with higher-end gaming and workstation notebooks with dedicated graphics.

Intel

Intel Unveils 7nm Ponte Vecchio GPU Architecture For Supercomputers and AI (hothardware.com) 28

MojoKid writes: Intel has unveiled its first discrete GPU solution that will hit the market in 2020, codenamed Ponte Vecchio. Based on 7nm silicon manufacturing and a stacked chiplet design built with Intel's Foveros tech, Ponte Vecchio will target HPC markets for supercomputers and AI training in the datacenter. According to HotHardware, Ponte Vecchio will employ a combination of both its Foveros 3D packaging and EMIB (Embedded Multi-die Interconnect Bridge) technologies, along with High Bandwidth Memory (HBM) and Compute Express Link (CXL), which will operate over the newly ratified PCIe 5.0 interface and serve as Ponte Vecchio's high-speed switch fabric connecting all GPU resources. Intel is billing Ponte Vecchio as its first exascale GPU, proving its mettle in the U.S. Department of Energy's (DOE) Aurora supercomputer. Aurora will employ a topology of six Ponte Vecchio GPUs and two Intel Xeon Scalable processors based on Intel's next-generation Sapphire Rapids architecture, along with Optane DC Persistent Memory, on a single blade. The new supercomputer is scheduled to arrive sometime in 2021.

Intel

Intel To Remove Old Drivers and BIOS Updates From Its Site (zdnet.com) 130

By Friday this week, Intel plans to remove old drivers and BIOS updates from its official website. From a report: "This download, BIOS Update [BLH6710H.86A] 0163, will no longer be available after November 22, 2019 and will not be supported with any additional functional, security, or other updates," reads a message posted to the download page of one of the impacted components. "Intel recommends that users of BIOS Update [BLH6710H.86A] 0163 uninstall and/or discontinue use as soon as possible," the message continues. The downloads are drivers and BIOS updates for Intel desktop components and motherboards the company released in the '90s and early-to-mid 2000s. Downloads for hundreds of components are believed to have been impacted, from motherboards to NICs and graphics cards. Most of the drivers are for Windows versions like 98, ME, XP, and older Windows Server editions -- old Windows releases that have themselves reached end-of-life (EOL). All of these components and motherboards reached EOL years ago, and Intel stopped delivering firmware updates as a result. Its website was merely hosting the older files for convenience.
Transportation

Tesla Owners Say Autopilot Makes Them Feel Safer (bloomberg.com) 135

"Bloomberg has conducted a survey of Tesla Model 3 owners," writes Slashdot reader Thelasko. "Some of the most interesting data are responses to questions about Autopilot." Here's an excerpt from the report: We asked 5,000 Model 3 owners about their experience with the electric sedan that Tesla Chief Executive Officer Elon Musk says will lead the world into a new era of driverless transportation. [...] Six drivers claimed that Autopilot actually contributed to a collision, while nine people in the Bloomberg survey went so far as to credit the system with saving their lives. Hundreds of owners recalled dangerous behaviors, such as phantom braking, veering or failing to stop for a road hazard. But even those who reported shortcomings gave Autopilot high overall ratings. More than 90% of owners said driving with Autopilot makes them safer -- including most of the respondents who simultaneously faulted the software for creating dangerous situations. Bloomberg also asked Model 3 owners about the quality and reliability of their vehicles, as well as the service and charging.
Graphics

Adobe and Twitter Are Designing a System For Permanently Attaching Artists' Names To Pictures (theverge.com) 62

Adobe, Twitter, and The New York Times Company have announced a new system for adding attribution to photos and other content. A tool will record who created a piece of content and whether it's been modified by someone else, then let other people and platforms check that data. The Verge reports: The overall project is called the Content Authenticity Initiative, and its participants will hold a summit on the system in the next few months. Based on what Adobe has announced, the attribution tool is a piece of metadata that can be attached to a file. Adobe doesn't describe precisely how it will keep the tag secure or prevent someone from copying the content in a way that strips it out. Adobe chief product officer Scott Belsky said that some technical details will be worked out at the summit. Adobe described this system as a way to verify "authenticity" online. And The New York Times Company's research and development head, Marc Lavallee, suggested it could fight misinformation by helping people discern "trusted news" on digital media platforms.

But the most obvious uses include identifying a photo's source and making sure artists get credit for their work. Many photos and webcomics circulate anonymously on platforms like Twitter, and an attribution tag would help trace those images back to their creator. This depends entirely on how well the CAI system works, however. Tags wouldn't be very useful if they could be easily altered or removed, but if the system preserves security by tightly controlling how people can interact with the image, it could have the same downsides as other digital rights management (DRM) systems.
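
Adobe hasn't said how the tag will be secured, but the familiar building block for this kind of claim is a signed manifest covering the image bytes, so that any modification is detectable. Here is a minimal Python sketch of that idea. It uses a toy HMAC with a creator-held secret purely for brevity; a real system would presumably use public-key signatures so anyone can verify without the secret, and nothing here reflects the actual, unpublished CAI design.

    # Hypothetical attribution tag: a signed manifest over the image bytes.
    # Toy scheme for illustration only; not the (unpublished) CAI design.
    import hashlib, hmac, json

    CREATOR_KEY = b"creator-secret-key"  # stand-in for a real signing key

    def make_tag(image_bytes: bytes, creator: str) -> dict:
        manifest = {
            "creator": creator,
            "sha256": hashlib.sha256(image_bytes).hexdigest(),
        }
        payload = json.dumps(manifest, sort_keys=True).encode()
        manifest["signature"] = hmac.new(CREATOR_KEY, payload, hashlib.sha256).hexdigest()
        return manifest

    def verify_tag(image_bytes: bytes, tag: dict) -> bool:
        claimed = {k: v for k, v in tag.items() if k != "signature"}
        payload = json.dumps(claimed, sort_keys=True).encode()
        expected = hmac.new(CREATOR_KEY, payload, hashlib.sha256).hexdigest()
        return (hmac.compare_digest(expected, tag["signature"])
                and claimed["sha256"] == hashlib.sha256(image_bytes).hexdigest())

    photo = b"...image bytes..."
    tag = make_tag(photo, "@artist")
    print(verify_tag(photo, tag))         # True
    print(verify_tag(photo + b"x", tag))  # False: the content was modified

Even a sketch this small shows where the hard problems live: the tag travels with the file as metadata, so a platform that re-encodes or screenshots an image strips or invalidates it, which is exactly the weakness the paragraph above flags.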

Graphics

Helvetica's Evil Twin, Hellvetica, Will Haunt Your Nightmares (fastcompany.com) 47

Freshly Exhumed shares a report from Fast Company: Hold your favorite graphic design tome close. We now know what the classic typeface Helvetica would look like if it came from the underworld. Yes, it will keep type enthusiasts up at night. The design darling Helvetica -- that ubiquitous sans-serif typeface developed by Max Miedinger in 1957, representative of the crisp Swiss design aesthetic of that period, and star of its own documentary of the same name -- has made a deal with the kerning devil. The results aren't pretty. They're not meant to be.

Zack Roif and Matthew Woodward, both associate creative directors at the international advertising agency R/GA, have released a new typeface available free to download, Hellvetica, and it will make all your worst kerning nightmares come true. While each character has the same form as the classic typeface it's riffing on, Hellvetica utilizes inconsistent, variable spacing between each letterform to give an overall effect that something has gone terribly astray. Nope, that wasn't a mistake. You might just say it was intentionally erroneous.

The project is a study in playfulness and rule-breaking, "an exercise in going against the 'designer instincts' to fix up that awful kerning. Hundred percent break the rules," says Woodward. "Don't listen to your gut. Forget your training... and make that logo kern in hell!"
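
The real typeface bakes its sabotage into the font's kerning metrics, but the visual effect is easy to approximate. A purely illustrative Python sketch that fakes Hellvetica-style spacing by wrapping each character in a span with randomized letter-spacing:

    # Fake Hellvetica: keep the Helvetica glyphs, randomize the spacing.
    import random

    def hellvetica(text: str) -> str:
        random.seed(666)  # reproducible chaos
        spans = "".join(
            f'<span style="letter-spacing:{random.uniform(-0.15, 0.6):.2f}em">{ch}</span>'
            for ch in text
        )
        return f'<p style="font-family:Helvetica,sans-serif">{spans}</p>'

    print(hellvetica("KERNING IN HELL"))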

Intel

Intel Launches Core i9-9900KS 8-Core CPU At 5GHz Across All Cores (hothardware.com) 89

MojoKid writes: As the "S" in its name implies, the new Intel Core i9-9900KS that launched today is something akin to a Special Edition version of the company's existing Core i9-9900K 8-core CPU. The processors are built from the same slab of silicon -- an 8-core, Coffee Lake-refresh based die -- and packaged up for Intel's LGA1151 socket. What makes the Core i9-9900KS different from its predecessor are its base and turbo boost clocks, which are rated for 4GHz and 5GHz across all cores, respectively, with enhanced binning of the chips to meet its performance criteria. The Core i9-9900KS is arguably the fastest processor available right now for single and lightly-threaded workloads, and offers the highest performance in gaming and graphics tests. In more heavily-threaded workloads that can leverage all of the additional processing resources available in a 12-core CPU like the Ryzen 9 3900X, however, the 8-core Core i9-9900KS doesn't fare as well. It did catch AMD's 12-core Threadripper 2920X, which is based on the previous-gen Zen+ architecture, on a couple of occasions. Intel's new Core i9-9900KS desktop processor is available starting today at $513 MSRP.

Graphics

Nvidia Launches GeForce GTX 1660 Super For Faster 1080p Gaming (hothardware.com) 14

MojoKid writes: NVIDIA is expanding its GeForce GTX family of GPUs today with a pair of new Turing-based graphics cards, the GeForce GTX 1660 Super and GeForce GTX 1650 Super. As their model numbers suggest, these new cards reside above their "non-Super" branded counterparts in NVIDIA's line-up, but a notch below the GeForce GTX Ti variants. The GeForce GTX 1660 Super is something of a cross between the standard GTX 1660 and the GTX 1660 Ti: it has a similar number of CUDA cores and texture units to the vanilla GTX 1660, but with faster 14 Gbps GDDR6 memory, and its GPU and memory clocks are higher than a GeForce GTX 1660 Ti's. In its price range, the GeForce GTX 1660 Super appears to be a solid value. Its gaming performance is strong, especially at 1080p, and generally faster than AMD's Radeon RX 590. The cards are also power-efficient, cool and quiet, and the GPUs have significant overclocking headroom as well. GeForce GTX 1660 Super cards should be available at retail today for around $230, with the GTX 1650 Super on its way late next month.

Android

Nvidia Revamps Shield Android TV Streamer, Introduces New $150 Model (variety.com) 47

Nvidia today introduced a revamped version of its Android TV streaming device, complete with a faster processor and a new remote control. From a report: In addition to a $200 Pro model, Nvidia is also introducing a new $150 version with a home theater-friendly form factor that is meant to cater to a mainstream audience interested in great picture quality and sound. Nvidia Shield has long been a favorite with Android enthusiasts looking for the most advanced streaming hardware. First introduced in 2015, the device doubles as a console for both local and cloud-based video games, and thanks to a powerful processor, it's capable of running a whole bunch of software that wouldn't work on your average Roku or Fire TV streamer. A Shield can be turned into a DVR to record over-the-air television, a hub to control smart home devices and even a Plex server to manage expansive media collections.

The flip side of this has always been the price: the 2017 version of the device started at $180. You could buy three Rokus for the same amount and still have money left over. Still, for its 2019 revamp, the company decided to stay the course and keep aiming for the upper end of the market. "We don't look to do easy, cheap products," said Nvidia Shield director of product management Chris Daniel during a recent interview with Variety. Nvidia's goal was to use its expertise in graphics and AI to push the limits of what a streaming device can do, Daniel said. To stay true to that mission, Nvidia based its new generation of Shield streamers on its Tegra X1+ processor, which promises to be 25% faster than the processor used in previous-generation Shield devices.

Earth

Facing Unbearable Heat, Qatar Has Begun To Air-Condition the Outdoors (washingtonpost.com) 183

It was 116 degrees Fahrenheit in the shade outside the new Al Janoub soccer stadium, and the air felt to air-conditioning expert Saud Ghani as if God had pointed "a giant hair dryer" at Qatar. From a report: Yet inside the open-air stadium, a cool breeze was blowing. Beneath each of the 40,000 seats, small grates adorned with Arabic-style patterns were pushing out cool air at ankle level. And since cool air sinks, waves of it rolled gently down to the grassy playing field. Vents the size of soccer balls fed more cold air onto the field. Ghani, an engineering professor at Qatar University, designed the system at Al Janoub, one of eight stadiums that the tiny but fabulously rich Qatar must get in shape for the 2022 World Cup. His breakthrough realization was that he had to cool only people, not the upper reaches of the stadium -- a graceful structure designed by the famed Zaha Hadid Architects and inspired by traditional boats known as dhows. "I don't need to cool the birds," Ghani said.

Qatar, the world's leading exporter of liquefied natural gas, may be able to cool its stadiums, but it cannot cool the entire country. Fears that the hundreds of thousands of soccer fans might wilt or even die while shuttling between stadiums and metros and hotels in the unforgiving summer heat prompted the decision to delay the World Cup by five months. It is now scheduled for November, during Qatar's milder winter. The change in the World Cup date is a symptom of a larger problem -- climate change. Already one of the hottest places on Earth, Qatar has seen average temperatures rise more than 2 degrees Celsius (3.6 F) above preindustrial times, the current international goal for limiting the damage of global warming. The 2015 Paris climate summit said it would be better to keep temperatures "well below" that, ideally to no more than 1.5 degrees Celsius (2.7 F).

[...] To survive the summer heat, Qatar not only air-conditions its soccer stadiums, but also the outdoors -- in markets, along sidewalks, even at outdoor malls so people can window shop with a cool breeze. "If you turn off air conditioners, it will be unbearable. You cannot function effectively," says Yousef al-Horr, founder of the Gulf Organization for Research and Development. Yet outdoor air conditioning is part of a vicious cycle. Carbon emissions create global warming, which creates the desire for air conditioning, which creates the need for burning fuels that emit more carbon dioxide. In Qatar, total cooling capacity is expected to nearly double from 2016 to 2030, according to the International District Cooling & Heating Conference. And it's going to get hotter.

Graphics

Was Flash Responsible For 'The Internet's Most Creative Era'? (vice.com) 72

A new article this week on Motherboard argues that Flash "is responsible for the internet's most creative era," citing a new 640-page book by Rob Ford on the evolution of web design.

[O]ne could argue that the web has actually gotten less creative over time, not more. This interpretation of events is a key underpinning of Web Design: The Evolution of the Digital World 1990-Today (Taschen, $50), a new visual-heavy book from author Rob Ford and editor Julius Wiedemann that does something that hasn't been done on the broader internet in quite a long time: It praises the use of Flash as a creative tool, rather than a bloated malware vessel, and laments the ways that visual convention, technical shifts, and walled gardens have started to rein in much of this unvarnished creativity.

This is a realm where small agencies supporting big brands, creative experimenters with nothing to lose, and teenage hobbyists could stand out simply by being willing to try something risky. It was a canvas with a built-in distribution model. What wasn't to like, besides a whole host of malware?

The book's author tells Motherboard that "Without the rebels we'd still be looking at static websites with gray text and blue hyperlinks." But instead we got wild experiments like Burger King's "Subservient Chicken" site or the interactive "Wilderness Downtown" site coded by Google.

There were also entire cartoon series like Radiskull and Devil Doll or Zombie College -- not to mention games like "A Murder of Scarecrows" or the laughably unpredictable animutations of 14-year-old Neil Cicierega. But Ford tells Motherboard that today, many of the wild ideas have moved from the web to augmented reality and other "physical mediums... The rise in interactive installations, AR, and experiential in general is where the excitement of the early days is finally happening again."

Motherboard calls the book "a fitting coda for a kind of digital creativity that -- like Geocities and MySpace pages, multimedia CD-ROMs, and Prodigy graphical interfaces before it -- has faded in prominence."

Graphics

Wired Remembers the Glory Days of Flash (wired.co.uk) 95

Wired recently remembered Flash as "the annoying plugin" that transformed the web "into a cacophony of noise, colour, and controversy, presaging the modern web."

They write that its early popularity in the mid-1990s came in part because "Microsoft needed software capable of showing video on their website, MSN.com, then the default homepage of every Internet Explorer user." But Flash allowed anyone to become an animator. (One Disney artist tells them that Flash could do in three days what would take a professional animator 7 months -- and cost $10,000.)

Their article opens in 2008, a golden age when Flash was installed on 98% of desktops -- then looks back on its impact: The online world Flash entered was largely static. Blinking GIFs delivered the majority of online movement. Constructed in early HTML and CSS, websites lifted clumsily from the metaphors of magazine design: boxy and grid-like, they sported borders and sidebars and little clickable numbers to flick through their pages (the horror).

Flash changed all that. It transformed the look of the web...

Some of these websites were, to put it succinctly, absolute trash. Flash was applied enthusiastically and inappropriately. The gratuitous animation of restaurant websites was particularly grievous -- kitsch abominations, these could feature thumping bass music and teleporting ingredients. Ishkur's 'guide to electronic music' is a notable example from the era you can still view -- a chaos of pop arty lines and bubbles and audio samples, it looks like the mind map of a naughty child...

In contrast to the web's modern, business-like aesthetic, there is something bizarre, almost sentimental, about billion-dollar multinationals producing websites in line with Flash's worst excess: long loading times, gaudy cartoonish graphics, intrusive sound and incomprehensible purpose... "Back in 2007, you could be making Flash games and actually be making a living," remembers Newgrounds founder Tom Fulp, when asked about Flash's golden age. "That was a really fun time, because that's kind of what everyone's dream is: to make the games you want and be able to make a living off it."

Wired summarizes Steve Jobs' "brutally candid" diatribe against Flash in 2010. "Flash drained batteries. It ran slow. It was a security nightmare. He asserted that an era had come to an end... '[T]he mobile era is about low power devices, touch interfaces and open web standards -- all areas where Flash falls short.'" Wired also argues that "It was economically viable for him to rubbish Flash -- he wanted to encourage people to create native games for iOS."

But they also write that today, "The post-Flash internet looks different. The software's downfall precipitated the rise of a new aesthetic...one moulded by the specifications of the smartphone and the growth of social media," favoring hits of information rather than striving for more immersive, movie-emulating thrills.

And they add that though Newgrounds long-ago moved away from Flash, the site's founder is now working on a Flash emulator to keep all that early classic content playable in a browser.

Graphics

NVIDIA's Job Listings Reveal 'Game Remastering' Studio, New Interest In RISC-V (forbes.com) 40

An anonymous reader quotes Forbes: Nvidia has a lot riding on the success of its GeForce RTX cards. The Santa Clara, California-based company is beating the real-time ray tracing drum loudly, adamant about being known as a champion of the technology before AMD steals some of its thunder next year with the PlayStation 5 and its own inevitable release of ray-tracing enabled PC graphics cards.

Nvidia has shown that, with ray tracing, it can breathe new life into a decades-old PC shooter like id Software's Quake 2, so why not dedicate an entire game studio to remastering timeless PC classics? A new job listing spotted by DSOGaming confirms that's exactly what Nvidia is cooking up.

The ad says NVIDIA's new game remastering program is "cherry-picking some of the greatest titles from the past decades and bringing them into the ray tracing age, giving them state-of-the-art visuals while keeping the gameplay that made them great." (And it adds that the initiative is "starting with a title that you know and love but we can't talk about here!")

Meanwhile, a China-based industry watcher on Medium reports that "six RISC-V positions have been advertised by NVIDIA, based in Shanghai and pertaining to architecture, design, and verification."

Intel

Intel Kills Kaby Lake G, Vows To Offer Drivers For Five Years (pcworld.com) 16

When Kaby Lake G debuted at CES 2018, it made a big bang. No one expected sworn rivals Intel and AMD to collaborate on a CPU package, marrying a 7th-gen Kaby Lake CPU with a unique AMD Radeon RX Vega GPU. But what began with a bang ended Monday with an unceremonious memo. From a report: The Product Change Notification published by Intel on Monday confirmed that pretty much every single Kaby Lake G chip, including the Core i7-8706G, the Core i7-8705G, and the Core i5-8305G, would be discontinued. Last call for orders will be on January 17, 2020, and the final shipments are scheduled for July 31, 2020. While the end of life of a processor isn't typically a big deal to the consumers who own one, one sticking point could be driver support. Specifically, Kaby Lake G drivers for the custom AMD Radeon RX Vega M graphics come only from Intel. With a normal discrete GPU, the consumer would download drivers from the original company, such as Nvidia or AMD. With Kaby Lake G kaput, where does that leave Kaby Lake G owners? Intel said the company will follow its standard policy and provide driver support for Kaby Lake G for five years from the launch of the product. All told, that probably means another 3.5 years of driver updates.
