Graphics

Ask Slashdot: Why Is 3D Technology Stagnating So Badly? 188

dryriver writes: If you had asked someone doing 3D graphics seriously back in 2000 what 3D technology would look like two decades later, in 2019, they might have said: "Most internet websites will have realtime 3D content embedded or will be completely in 3D. 3D Games will look as good as movies or reality. Everyone will have a cheap handheld 3D scanner to capture 3D models with. High-end VR headsets, gloves, bodysuits and haptics devices will be sold in electronics stores. Still and video cameras will be able to capture true holographic 3D images and video of the real world. TVs and broadcast TV content will be in holographic 3D. 3D stuff you create on a PC will be realtime -- no more waiting for images to slowly render thanks to really advanced new 3D hardware. 3D content creation software will be incredibly advanced and fast to work with in 2019. Many new types of 3D input devices will be available that make working in 3D a snap."

Except, of course, that in the real 2019, none of this has come true at all, and the entire 3D field has been stagnating very, very badly since around 2010. It almost seems like a small army of 3D technology geniuses pushed and pushed 3D software and hardware hard during the '80s, '90s and 2000s, then retired or dropped off the face of the earth completely around 10 years ago. Why is this? Are consumers now interested only in Facebook, YouTube, cartoony PlayStation graphics and smartphones? Are we never going to see another major 3D technology innovation push again?
Operating Systems

Dell Begins Pre-Installing Linux On Range of Precision Laptops (phoronix.com) 139

"While Linux-preloaded laptops have been available for years from smaller companies, and have represented a fraction of their own sales with the much-admired XPS 13 developer model, Dell now offers a range of Precision models pre-installed with Ubuntu Linux," writes Slashdot reader Freshly Exhumed. Phoronix reports: At the start of May Dell announced an Ubuntu Linux option for their entry-level ~$700 Precision laptop while now they are closing out May by offering up Ubuntu 18.04 LTS on their higher-tier Precision laptop models. Ubuntu Linux has landed for the rest of Dell's current generation Precision mobile workstation line-up with support for the Precision 5540, 7540, and 7740. The Precision 5540 offers options of Xeon E or 9th Gen Core CPUs with up to 64GB of RAM and options for a NVIDIA Quadro T2000. The Precision 7540/7740 meanwhile are more powerful mobile workstations with supporting up to 128GB of ECC RAM and latest generation processors. The Precision 7740 model can also accomodate NVIDIA Quadro RTX 5000 series graphics. Additional details can be found via this blog post by Dell's Barton George.
Graphics

Intel Graphics Division Shares Wild Futuristic GPU Concept Cards (hothardware.com) 70

MojoKid writes: What do you think graphics cards will look like in the next decade and a half? Intel wanted to know that as well, so it commissioned designer Cristiano Siquiera to give us a taste of what graphics cards might look like in the year 2035. Siquiera, the talented designer who created the first set of Intel Odyssey GPU renders not long ago, focused primarily on the fan/shroud designs and what innovations could be fostered in the coming years. He was tasked with thinking far beyond the design conventions, materials and cooling technologies of current-gen graphics cards, and with envisioning new designs that could employ technologies and materials not even invented yet. One concept, called Gemini, shows an ionic cooling system that isn't too far beyond the realm of feasibility. Another design, called Prometheus, showcases a top edge-mounted display readout that could be fairly easily realized with flexible OLED display technology. Intel also just launched a new Graphics Command Center driver package today, which offers more customization, better control of power and cooling, and one-click game optimization for Intel GPU-enabled systems.
Intel

Intel Boldly Claims Its 'Ice Lake' Integrated Graphics Are As Good as AMD's (pcworld.com) 147

While Intel is expected to detail its upcoming 10nm processor, Ice Lake, during its Tuesday keynote here at Computex, the company is already making one bold claim -- that Ice Lake's integrated Gen11 graphics engine is on par with or better than AMD's current Ryzen 7 graphics. From a report: It's a bold claim, and one that Ryan Shrout, a former journalist and now the chief performance strategist for Intel, said Intel doesn't make lightly. "I don't think we can overstate how important this is for us, to make this claim and this statement about the one area that people railed on us for in the mobile space," Shrout said shortly before Computex began. Though Intel actually supplies the largest number of integrated graphics chipsets in the PC space, it does so on the strength of its CPU performance (and also thanks to strong relationships with laptop makers). Historically, AMD has leveraged its Radeon "Vega" GPUs to attract buyers seeking a more powerful integrated graphics solution. But what Intel is trying to do now, with its Xe discrete graphics on the horizon, is let its GPUs stand on their own merits. Referencing a series of benchmarks and games from the 3DMark Sky Diver test to Fortnite to Overwatch, Intel claims performance that's 3 to 15 percent faster than the Ryzen 7. Intel's argument is based on a comparison of a U-series Ice Lake part at 25 watts versus a Ryzen 7 3700U, also at 25 watts.
AMD

AMD Unveils the 12-Core Ryzen 9 3900X, at Half the Price of Intel's Competing Core i9 9920X Chipset (techcrunch.com) 261

AMD CEO Lisa Su today unveiled news about the company's chips and graphics processors that will increase pressure on competitors Intel and Nvidia, both in terms of pricing and performance. From a report: All new third-generation Ryzen CPUs, the first with 7-nanometer desktop chips, will go on sale on July 7. The showstopper of Su's keynote was the announcement of AMD's 12-core, 24-thread Ryzen 9 3900X chip, the flagship of its third-generation Ryzen family. It will retail starting at $499, half the price of Intel's competing Core i9 9920X chipset, which is priced at $1,189 and up. The 3900X has a 4.6 GHz boost speed and 70 MB of total cache and uses 105 watts of thermal design power (versus the i9-9920X's 165 watts), making it more efficient. AMD says that in a Blender demo against the Intel i9-9920X, the 3900X finished about 18 percent more quickly. Starting prices for other chips in the family are $199 for the 6-core, 12-thread 3600; $329 for the 8-core, 16-thread Ryzen 3700X (with a 4.4 GHz boost, 36 MB of total cache and a 65-watt TDP); and $399 for the 8-core, 16-thread Ryzen 3800X (4.5 GHz, 32 MB cache, 105W).
Hardware

Nvidia Unveils RTX Studio For Desktop-Style Performance on Laptops (venturebeat.com) 47

Nvidia today unveiled the tech behind new RTX Studio laptops, which can provide desktop-level computing performance for laptop users. From a report: Aimed at creators, the machines are targeted at independent artists who are fueling growing fields like social media, digital advertising, and 3D development. Nvidia says these laptops can deliver up to seven times the performance of a MacBook Pro. The 17 new laptop models from seven manufacturers are powered by a range of Nvidia GeForce and Quadro RTX graphics processing units (GPUs), and promise ultra-long battery life and stability when using the newly developed Nvidia Studio Drivers.

Laptops that meet the highest qualifications for creative capabilities will be badged RTX Studio. That will help creators easily identify the right hardware to meet their demands. These Quadro and GeForce RTX-based laptops are purpose-built for GPU-accelerated content creation. The laptops feature the new Quadro RTX 5000 mobile GPU and GeForce RTX 2080, 2070 and 2060 GPUs. Quadro RTX 5000-based laptops are the world's first with 16GB of graphics memory, enabling advanced multi-app creative workflows and use of large 3D models that previously were not possible while on the go, Nvidia said. [...] RTX Studio laptops will be available starting in June from top computer makers, including Acer, Asus, Dell, Gigabyte, HP, MSI, and Razer. Pricing starts at $1,600 and will vary based on partner designs, features, and region.

Microsoft

Sony and Microsoft Set Rivalry Aside For Streaming Alliance (nikkei.com) 33

Sony and Microsoft, bitter rivals in the video game console wars, will team up in on-demand gaming to better compete with newcomers like Google as the industry's main battlefield looks poised to shift to the cloud, news outlet Nikkei reported Thursday. From a report: Sony President and CEO Kenichiro Yoshida has signed a memorandum of understanding with Microsoft CEO Satya Nadella on a strategic tie-up. While details have yet to be hammered out, the partnership will center on artificial intelligence and the cloud. The latter category includes plans for joint development of cloud gaming technology. While this market is expected to grow as ultrafast fifth-generation wireless gains traction, such services require much processing power on the provider's end to deliver games with high-quality graphics and minimal lag. Sony and Microsoft plan to leverage the American computing behemoth's data centers for this purpose. The two companies, along with Nintendo, long dominated the gaming landscape. But the rise of mobile gaming has brought competition from such other players as China's Tencent Holdings, which publishes the mobile version of the wildly popular PlayerUnknown's Battlegrounds (PUBG). Press release: Sony and Microsoft to explore strategic partnership.
NES (Games)

28 Years Later, Hacker Fixes Rampant Slowdown of SNES' Gradius III (arstechnica.com) 58

Ars Technica's Kyle Orland reports that Brazilian ROM hacker Vitor Vilela has released a ROM patch for the SNES port of the hit arcade game Gradius III, creating a new, slowdown-free version of the game for play on SNES emulators and standard hardware. "In magazine screenshots, the game's huge, colorful sprites were a sight to behold, comparable to the 1989 arcade original," writes Orland. "In action, though, any scene with more than a handful of enemies would slow to a nearly unplayable crawl on the underpowered SNES hardware." From the report: The key to Vilela's efforts is the SA-1 chip, an enhancement co-processor that was found in some late-era SNES cartridges like Super Mario RPG and Kirby Super Star. Besides sporting a faster clock speed than the standard SNES CPU (up to 10.74 MHz versus 3.58 MHz for the stock CPU), SA-1 also opens up faster mathematical functions, improved graphics manipulation, and parallel processing capabilities for SNES programmers.

The result, as is apparent in the comparison videos embedded here, is a version of Gradius III that Vilela says runs two to three times faster than the original. It also keeps its silky smooth frame rate no matter how many detailed, screen-filling sprites clutter the scene. That's even true in the game's notorious, bubble-filled Stage 2, which is transformed from a jittery slide show to an amazing showcase of the SNES' enhanced power. As if that wasn't enough, the patch even slashes the game's loading times, cutting a full 3.25 seconds from the notably slow startup animation.
Vilela notes that the lack of slowdown "makes it incredibly super difficult" and even suggests that "some arcade segments of the game do not look RTA (real-time attack) viable with SA-1. But we shouldn't underestimate the human capabilities."
Intel

Apple's Tim Cook and Luca Maestri on Intel (daringfireball.net) 174

Tim Cook and Luca Maestri's remarks on Apple's quarterly analyst call earlier this week: CEO Tim Cook: "For our Mac business overall, we faced some processor constraints in the March quarter, leading to a 5 percent revenue decline compared to last year. But we believe that our Mac revenue would have been up compared to last year without those constraints, and don't believe this challenge will have a significant impact on our Q3 results."

CFO Luca Maestri: "Next I'd like to talk about the Mac. Revenue was 5.5 billion compared to 5.8 billion a year ago, with the decline driven primarily by processor constraints on certain popular models."
Apple commentator John Gruber adds, "I asked an Apple source last fall why it took so long for Apple to release the new MacBook Air. Their one-word answer: 'Intel.' One of the big questions for next month's WWDC is whether this is the year Apple announces Macs with Apple's own ARM processors (and graphics?)."
Software

Blender Developers Find Old Linux Drivers Are Better Maintained Than Windows (phoronix.com) 151

It comes as little surprise that, compared with the world of proprietary graphics drivers on Windows, where driver releases stop once support is retired, old open-source Linux OpenGL drivers turn out to be better maintained. From a report: Blender developers, working to ship Blender 2.80 this July as the big update to the open-source 3D modeling software, today rolled out the Linux GPU requirements for the next release. The requirements themselves aren't too surprising and cover NVIDIA GPUs released in the last ten years, AMD GCN for best support, and Intel Haswell graphics or newer. In the case of NVIDIA graphics, the company tends to do a good job maintaining its legacy driver branches. With AMD Radeon and Intel graphics, Blender developers acknowledge that older hardware may work better on Linux.
AMD

AMD Gained Market Share For 6th Straight Quarter, CEO Says (venturebeat.com) 123

Advanced Micro Devices CEO Lisa Su said during her remarks on AMD's first quarter earnings conference call with analysts today that she was confident about the state of competition with rivals like Intel and Nvidia in processors and graphics chips. She also pointed out that the company gained market share in processors for the 6th straight quarter. From a report: AMD's revenue was $1.27 billion for the first quarter, down 23% from the same quarter a year ago. But Su noted that Ryzen and Epyc processor and datacenter graphics processing units (GPUs) revenue more than doubled year-over-year, helping expand the gross margin by 5 percentage points. If there was a lag in the quarter, it was due to softness in the graphics channel and lower semi-custom revenue (which includes game console chips). Su said AMD's unit shipments increased significantly and the company's new products drove a higher client average selling price (ASP).
Graphics

Ask Slashdot: Why Are 3D Games, VR/AR Still Rendered Using Polygons In 2019? 230

dryriver writes: A lot of people seem to believe that computers somehow need polygons, NURBS surfaces, voxels or point clouds "to be able to define and render 3D models to the screen at all." This isn't really true. All a computer needs to light, shade, and display a 3D model is to know the answer to the question "is there a surface point at coordinate XYZ or not." Many different mathematical structures or descriptors can be dreamed up that can tell a computer whether there is indeed a 3D model surface point at coordinate XYZ or behind a given screen pixel XY. Polygons/triangles are a very old approach to 3D graphics that was primarily designed not to overstress the very limited CPU and RAM resources of the first computers capable of displaying raster 3D graphics. The brains who invented the technique back in the late 1960s probably figured that by the 1990s at the latest, their method would be replaced by something better and more clever. Yet here we are in 2019 buying pricey Nvidia, AMD, and other GPUs that are primarily polygon/triangle accelerators.

Why is this? Creating good-looking polygon models is still a slow, difficult, iterative and money intensive task in 2019. A good chunk of the $60 you pay for an AAA PC or console game is the sheer amount of time, manpower and effort required to make everything in a 15-hour-long game experience using unwieldy triangles and polygons. So why still use polygons at all? Why not dream up a completely new "there is a surface point here" technique that makes good 3D models easier to create and may render much, much faster than polygons/triangles on modern hardware to boot? Why use a 50-year-old approach to 3D graphics when new, better approaches can be pioneered?
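To make the submitter's "is there a surface point at coordinate XYZ" framing concrete, here is a minimal sketch (not from the submission) of one polygon-free alternative: an implicit surface described by a signed distance function, rendered by ray marching. The sphere scene, ASCII resolution, and shading constants are illustrative assumptions.

```python
import math

# Signed distance function: answers "how far is point (x, y, z) from the
# nearest surface?" -- negative inside, zero on the surface, positive outside.
# A sphere of radius 1 centered at the origin is the simplest example.
def sdf_sphere(x, y, z, radius=1.0):
    return math.sqrt(x * x + y * y + z * z) - radius

# Ray-march ("sphere trace") from the camera through a pixel until the SDF
# says we are effectively on the surface. No triangles are ever generated.
def trace(ox, oy, oz, dx, dy, dz, max_steps=64, eps=1e-3):
    t = 0.0
    for _ in range(max_steps):
        d = sdf_sphere(ox + dx * t, oy + dy * t, oz + dz * t)
        if d < eps:
            return t          # hit: surface point is origin + t * direction
        t += d                # safe to step forward by the distance bound
        if t > 20.0:
            break
    return None               # miss: the ray never reached a surface

# Render a tiny ASCII image; brightness comes from one directional light.
WIDTH, HEIGHT = 60, 30
SHADES = " .:-=+*#%@"
for j in range(HEIGHT):
    row = ""
    for i in range(WIDTH):
        # Camera at z = -3 looking toward +z; map the pixel to a view-plane point.
        px = (i / WIDTH - 0.5) * 2.0
        py = (0.5 - j / HEIGHT) * 2.0
        length = math.sqrt(px * px + py * py + 1.0)
        dx, dy, dz = px / length, py / length, 1.0 / length
        t = trace(0.0, 0.0, -3.0, dx, dy, dz)
        if t is None:
            row += " "
        else:
            # A sphere's normal at the hit point is the hit point normalized;
            # use it for simple Lambert shading.
            hx, hy, hz = dx * t, dy * t, -3.0 + dz * t
            n = math.sqrt(hx * hx + hy * hy + hz * hz)
            nx, ny, nz = hx / n, hy / n, hz / n
            light = max(0.0, nx * 0.57 + ny * 0.57 - nz * 0.57)
            row += SHADES[min(len(SHADES) - 1, int(light * len(SHADES)))]
    print(row)
```

Renderers built on this idea (demoscene ray marchers and SDF-based engines, for example) do exist today; they just tend to run on the same GPUs via shader cores rather than the fixed-function triangle pipeline.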
Hardware

NVIDIA Launches GeForce GTX 16 Series Turing Gaming Laptop GPUs (hothardware.com) 22

MojoKid writes: NVIDIA has launched a new family of more budget-friendly Turing graphics chips for gaming laptops, called the GeForce GTX 1650, GeForce GTX 1660, and GeForce GTX 1660 Ti. The new GPUs will power roughly 80 different OEM mainstream gaming notebook designs, starting in the $799 price range. Compared to a 4-year-old gaming laptop with a GeForce GTX 960M, NVIDIA says that a modern counterpart equipped with a GeForce GTX 1660 Ti can deliver 4x the performance in today's battle royale-style games like Apex Legends, Fortnite, and PUBG. As for the GeForce GTX 1650, NVIDIA is promising a 2.5x performance advantage compared to the GTX 950M and a 1.7x advantage compared to the previous-generation GTX 1050. Gamers should expect consistent 60 fps performance in the above-mentioned gaming titles at 1080p, though the company didn't specifically mention GTX 1660 vs. 1060 performance comparisons. According to NVIDIA, every major OEM will be releasing GeForce GTX 16 Series laptops, including well-known brands like ASUS, Dell/Alienware, Acer, Hewlett-Packard, and Lenovo (among others).
PlayStation (Games)

Sony Cracks Down On Sexually Explicit Content In Games (engadget.com) 299

Slashdot reader xavdeman writes: Hot on the heels of its announcement of the specifications of the next PlayStation, Sony has revealed a new crackdown on explicit content. Citing "the rise of the #MeToo movement" and concerns about "legal and social action" in the USA, Sony has said it wants to address concerns about the depiction of women in video games playable on its platform, as well as protect children's "sound growth and development." The new rules were reportedly already responsible for puritan cutscene alterations in the Western PS4 release of the multi-platform title Devil May Cry 5, where lens flares were used to cover up partial nudity.
Emulation (Games)

HD Emulation Mod Makes 'Mode 7' SNES Games Look Like New (arstechnica.com) 44

An anonymous reader quotes a report from Ars Technica: Gamers of a certain age probably remember being wowed by the quick, smooth scaling and rotation effects of the Super Nintendo's much-ballyhooed "Mode 7" graphics. Looking back, though, those gamers might also notice how chunky and pixelated those background transformations could end up looking, especially when viewed on today's high-end screens. Emulation to the rescue. A modder going by the handle DerKoun has released an "HD Mode 7" patch for the accuracy-focused SNES emulator bsnes. In their own words, the patch "performs Mode 7 transformations... at up to 4 times the horizontal and vertical resolution" of the original hardware.

The results, as you can see in the above gallery and the below YouTube video, are practically miraculous. Pieces of Mode 7 maps that used to be boxy smears of color far in the distance are now sharp, straight lines with distinct borders and distinguishable features. It's like looking at a brand-new game. Perhaps the most impressive thing about these effects is that they take place on original SNES ROM and graphics files; DerKoun has said that "no artwork has been modified" in the games since the project was just a proof of concept a month ago. That makes this project different from upscaling emulation efforts for the N64 and other retro consoles, which often require hand-drawn HD texture packs to make old art look good at higher resolutions.
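For readers who want a feel for what is being scaled up: Mode 7 maps every screen pixel to a background-map coordinate through an affine transform, and evaluating that same transform on a finer screen grid is, roughly, where the extra sharpness comes from. The snippet below is a simplified illustration with made-up matrix values, not DerKoun's bsnes code (which works in 8.8 fixed point and handles per-scanline register changes).

```python
import math

# Mode 7 maps a screen coordinate (sx, sy) to a background-map coordinate
# (tx, ty) with an affine transform around a center point (cx, cy).
# The SNES stores the matrix entries in 8.8 fixed point; floats are used here.
def mode7_map(sx, sy, a, b, c, d, cx, cy):
    tx = a * (sx - cx) + b * (sy - cy) + cx
    ty = c * (sx - cx) + d * (sy - cy) + cy
    return tx, ty

# Illustrative transform: rotate the background 30 degrees and zoom out 2x
# around the center of the SNES' 256x224 output.
angle = math.radians(30)
zoom = 2.0
a, b = zoom * math.cos(angle), -zoom * math.sin(angle)
c, d = zoom * math.sin(angle), zoom * math.cos(angle)

# Original hardware evaluates the transform once per screen pixel, then rounds
# to a whole map texel. Evaluating the same transform at sub-pixel steps
# (here 4 samples per pixel along a row) is, roughly, what an HD renderer does.
for step in (1.0, 0.25):
    samples = [mode7_map(120 + i * step, 100, a, b, c, d, 128, 112)
               for i in range(4)]
    label = "1x grid" if step == 1.0 else "4x grid"
    print(label, [(round(tx, 2), round(ty, 2)) for tx, ty in samples])
```

The 4x grid lands on map coordinates between the texels the original hardware would have picked, which is why distant map detail stops collapsing into chunky blocks.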

Hardware

Qualcomm's Snapdragon 665, 730, and 730G Target AI and Gaming (venturebeat.com) 13

Today at its annual AI Day conference in San Francisco, chipmaker Qualcomm revamped the midrange products in its system-on-chip portfolio with faster graphics, more power-efficient cores, and other silicon accouterments. From a report: The Snapdragon 670 gained a counterpart in the Snapdragon 665, and the Snapdragon 700 series has two new SKUs in the long-rumored Snapdragon 730 and a gaming-optimized variant dubbed Snapdragon 730G. "In the last several years, we've had a few different technologies that we've [explored]," said vice president of product management Kedar Kondap during a press briefing. "One is obviously [the] camera. Secondly, AI, and ... gaming ... [We've] focused on ... power, [making] sure we drive very high performance." The 11-nm Snapdragon 665 packs Kryo 260 cores and offers up to 20 percent power savings with the Adreno 610 GPU. The 8-nm Snapdragon 730 has Kryo 470 cores inside.
Android

'SPURV' Project Brings Windowed Android Apps To Desktop Linux (androidpolice.com) 52

mfilion shares a report from Android Police: A new "experimental containerized Android environment" from a company called Collabora allows Android apps to run in floating windows alongside native applications on desktop Linux. You can read all the technical details at the source link, but put simply, 'SPURV' creates a virtual Android device on your Linux computer, much like Bluestacks and other similar tools. There are various components of SPURV that allow the Android environment to play audio, connect to networks, and display hardware-accelerated graphics through the underlying Linux system.

The most interesting part is 'SPURV HWComposer,' which renders Android applications in windows, alongside the windows from native Linux applications. This is what sets SPURV apart from (most) other methods of running Android on a computer. For this to work, the Linux desktop has to be using the Wayland display server (some Linux-based OSes use X11). Pre-built binaries for SPURV are not currently available -- you have to build it yourself from the source code. Still, it's an interesting proof-of-concept, and hopefully someone turns it into a full-featured product.

Graphics

What's The Correct Way to Pronounce 'GIF'? (thenewstack.io) 453

"Apparently we're all fighting about how to pronounce 'GIF' again on Twitter," writes technology columnist Mike Melanson: I personally find the argument of web designer Aaron Bazinet, who managed to secure the domain howtoreallypronouncegif.com, rather convincing in its simplicity: "It's the most natural, logical way to pronounce it. That's why when everyone comes across the word for the first time, they use a hard G [as in "gift"]." Bazinet relates the origin of the debate as such:

"The creator of the GIF image format, Steve Wilhite of CompuServe, when deciding on the pronunciation, said he deliberately chose to echo the American peanut butter brand, Jif, and CompuServe employees would often say 'Choosy developers choose GIF(jif)', playing off of Jif's television commercials. If you hear anyone pronounce GIF with a soft G, it's because they know something of this history."

Wilhite attempted to settle the controversy in 2013 when accepting a lifetime achievement award at the 17th annual Webby Awards. Using an actual animated .gif for his five-word acceptance speech, he authoritatively announced his preferred pronunciation. However, the chief editor of the Oxford English Dictionary argues that "A coiner effectively loses control of a word once it's out there," adding that "the pronunciation with a hard g is now very widespread and readily understood."

One linguist addressed the topic on Twitter this week, noting studies that found past usage of "gi" in words has been almost evenly split between hard and soft g sounds. Their thread also answers a related question: how will I weaponize a trivial and harmless consonant difference to make other people feel bad and self-conscious about themselves?

Her response? "Maybe just....don't do this."
Portables (Apple)

Apple Still Hasn't Fixed Its MacBook Keyboard Problem (wsj.com) 125

Joanna Stern, writing for the Wall Street Journal [the link may be paywalled]: Why is the breaking of my MacBook Air keyboard so insanely maddening? Let's take a trip down Memory Lane.
April 2015: Apple releases the all-new MacBook with a "butterfly" keyboard. In order to achieve extreme thinness, the keys are much flatter than older generations but the butterfly mechanism underneath, for which the keyboard is named, aims to replicate the bounce of a more traditional keyboard.
October 2016: The MacBook Pro arrives with a second-generation butterfly keyboard. A few months later, some begin to report that letters or characters don't appear, that keys get stuck or that letters unexpectedly repeat.
June 2018: Apple launches a keyboard repair program for what the company says is a "small percentage" of MacBook and MacBook Pro keyboards impacted.
July 2018: Apple releases a new high-end MacBook Pro with the third-generation of the keyboard that's said to fix the issues.
October 2018: Apple's new MacBook Air also has the third-generation keyboard. I recommend it, and even get one for myself.

Which brings us to the grand year 2019 and my MacBook Air's faulty E and R keys. Others have had problems with Apple's latest laptops, too. A proposed nationwide class-action suit alleges that Apple has been aware of the defective nature of these keyboards since 2015 yet sold affected laptops without disclosing the problem. "We are aware that a small number of users are having issues with their third-generation butterfly keyboard and for that we are sorry," an Apple spokesman said in a statement. "The vast majority of Mac notebook customers are having a positive experience with the new keyboard." If you have a problem, contact Apple customer service, he added.
John Gruber, a longtime Apple columnist: I consider these keyboards the worst products in Apple history. MacBooks should have the best keyboards in the industry; instead they're the worst. They're doing lasting harm to the reputation of the MacBook brand.
AI

MIT Develops Algorithm To Accelerate Neural Networks By 200x (extremetech.com) 43

An anonymous reader quotes a report from ExtremeTech: MIT researchers have reportedly developed an algorithm that can accelerate [neural architecture search] by up to 200x. The NAS (Neural Architecture Search, in this context) algorithm they developed "can directly learn specialized convolutional neural networks (CNNs) for target hardware platforms -- when run on a massive image dataset -- in only 200 GPU hours," MIT News reports. This is a massive improvement over the 48,000 GPU hours a recent state-of-the-art Google NAS reportedly required for image classification. The goal of the researchers is to democratize AI by allowing researchers to experiment with various aspects of CNN design without needing enormous GPU arrays to do the front-end work. If finding state-of-the-art approaches requires 48,000 GPU hours, precious few people, even at large institutions, will ever have the opportunity to try.

CNNs produced by the new NAS ran, on average, 1.8x faster on a mobile device than comparable models of similar accuracy. The new algorithm leveraged techniques like path-level binarization, which stores just one path at a time to reduce memory consumption by an order of magnitude. MIT doesn't actually link out to specific research reports, but from a bit of Google sleuthing, the referenced articles appear to be here and here -- two different research reports from an overlapping group of researchers. The teams focused on pruning entire potential paths for CNNs to use, evaluating each in turn. Lower-probability paths are successively pruned away, leaving the final, best-case path. The new model incorporated other improvements as well. Architectures were checked against hardware platforms for latency when evaluated. In some cases, their model predicted superior performance for platforms that had been dismissed as inefficient. For example, 7x7 filters for image classification are typically not used because they're quite computationally expensive -- but the research team found that these actually worked well for GPUs.
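As a toy illustration of the "one path at a time" idea described above: treat each layer of the search space as a set of candidate operations with learned probabilities, sample a single path per update (so only that path's weights and activations live in memory), nudge the probabilities by how well the sampled path scores, and finally prune everything but the most probable op per layer. The candidate ops, scoring table, and update rule below are invented stand-ins, not the MIT researchers' code.

```python
import random

random.seed(0)

# Each "layer" of the over-parameterized search network has several candidate
# operations; the search learns a probability for each. These candidates and
# the scoring table below are illustrative stand-ins for real measurements.
CANDIDATES = ["conv3x3", "conv5x5", "conv7x7", "identity", "maxpool3x3"]
SCORE = {"conv3x3": 0.70, "conv5x5": 0.72, "conv7x7": 0.74,
         "identity": 0.50, "maxpool3x3": 0.55}   # toy "accuracy minus latency"

def sample_path(probs):
    """Path-level binarization: pick exactly ONE candidate per layer, so only
    that path needs to be held in memory during the update step."""
    return [random.choices(CANDIDATES, weights=[p[op] for op in CANDIDATES])[0]
            for p in probs]

num_layers = 4
probs = [{op: 1.0 / len(CANDIDATES) for op in CANDIDATES}
         for _ in range(num_layers)]

for step in range(300):
    path = sample_path(probs)
    reward = sum(SCORE[op] for op in path) / num_layers
    for layer, op in zip(probs, path):
        # Nudge the sampled op's probability up or down by how good the whole
        # path was, then renormalize -- a crude stand-in for the gradient update.
        layer[op] *= 1.0 + 0.1 * (reward - 0.6)
        total = sum(layer.values())
        for k in layer:
            layer[k] /= total

# Progressive pruning, reduced to its last step: keep only the
# highest-probability op in each layer as the final architecture.
final = [max(layer, key=layer.get) for layer in probs]
print("selected architecture:", final)
```

Because only the sampled path is instantiated at each step, memory use stays close to that of a single compact network rather than the full over-parameterized search space, which is the practical point of the binarization trick.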
