Intel

Apple's Tim Cook and Luca Maestri on Intel (daringfireball.net) 174

Tim Cook and Luca Maestri's remarks on Apple's quarterly analyst call earlier this week: CEO Tim Cook: "For our Mac business overall, we faced some processor constraints in the March quarter, leading to a 5 percent revenue decline compared to last year. But we believe that our Mac revenue would have been up compared to last year without those constraints, and don't believe this challenge will have a significant impact on our Q3 results."

CFO Luca Maestri: "Next I'd like to talk about the Mac. Revenue was 5.5 billion compared to 5.8 billion a year ago, with the decline driven primarily by processor constraints on certain popular models."
Apple commentator John Gruber adds, "I asked an Apple source last fall why it took so long for Apple to release the new MacBook Air. Their one-word answer: 'Intel.' One of the big questions for next month's WWDC is whether this is the year Apple announces Macs with Apple's own ARM processors (and graphics?)."
Software

Blender Developers Find Old Linux Drivers Are Better Maintained Than Windows (phoronix.com) 151

Unsurprisingly, compared to the world of proprietary graphics drivers on Windows, where driver releases stop once support is retired, old open-source Linux OpenGL drivers turn out to be better maintained. From a report: Blender developers, working to ship Blender 2.80 this July as the big update to the open-source 3D modeling software, today rolled out the Linux GPU requirements for the next release. The requirements themselves aren't too surprising: NVIDIA GPUs released in the last ten years, AMD GCN for best support, and Intel Haswell graphics or newer. In the case of NVIDIA graphics, the company tends to do a good job maintaining its legacy driver branches. With AMD Radeon and Intel graphics, Blender developers acknowledge that older hardware may work better on Linux.
AMD

AMD Gained Market Share For 6th Straight Quarter, CEO Says (venturebeat.com) 123

Advanced Micro Devices CEO Lisa Su said during her remarks on AMD's first quarter earnings conference call with analysts today that she was confident about the state of competition with rivals like Intel and Nvidia in processors and graphics chips. She also pointed out that the company gained market share in processors for the 6th straight quarter. From a report: AMD's revenue was $1.27 billion for the first quarter, down 23% from the same quarter a year ago. But Su noted that Ryzen and Epyc processor and datacenter graphics processing units (GPUs) revenue more than doubled year-over-year, helping expand the gross margin by 5 percentage points. If there was a lag in the quarter, it was due to softness in the graphics channel and lower semi-custom revenue (which includes game console chips). Su said AMD's unit shipments increased significantly and the company's new products drove a higher client average selling price (ASP).
Graphics

Ask Slashdot: Why Are 3D Games, VR/AR Still Rendered Using Polygons In 2019? 230

dryriver writes: A lot of people seem to believe that computers somehow need polygons, NURBS surfaces, voxels or point clouds "to be able to define and render 3D models to the screen at all." This isn't really true. All a computer needs to light, shade, and display a 3D model is to know the answer to the question "is there a surface point at coordinate XYZ or not." Many different mathematical structures or descriptors can be dreamed up that can tell a computer whether there is indeed a 3D model surface point at coordinate XYZ or behind a given screen pixel XY. Polygons/triangles are a very old approach to 3D graphics that was primarily designed not to overstress the very limited CPU and RAM resources of the first computers capable of displaying raster 3D graphics. The brains who invented the technique back in the late 1960s probably figured that by the 1990s at the latest, their method would be replaced by something better and more clever. Yet here we are in 2019 buying pricey Nvidia, AMD, and other GPUs that are primarily polygon/triangle accelerators.

Why is this? Creating good-looking polygon models is still a slow, difficult, iterative and money-intensive task in 2019. A good chunk of the $60 you pay for an AAA PC or console game covers the sheer amount of time, manpower and effort required to build everything in a 15-hour game experience out of unwieldy triangles and polygons. So why still use polygons at all? Why not dream up a completely new "there is a surface point here" technique that makes good 3D models easier to create and may render much, much faster than polygons/triangles on modern hardware to boot? Why use a 50-year-old approach to 3D graphics when new, better approaches could be pioneered?
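The submitter's "is there a surface point at coordinate XYZ" framing maps naturally onto signed distance fields, one of the alternative representations the question alludes to. Here is a minimal Python sketch of sphere-tracing an SDF; the sphere, ray values, and function names are illustrative, not taken from the submission:

```python
import math

def sphere_sdf(x, y, z, cx=0.0, cy=0.0, cz=0.0, r=1.0):
    """Signed distance from point (x, y, z) to a sphere's surface:
    negative inside, zero on the surface, positive outside."""
    return math.sqrt((x - cx)**2 + (y - cy)**2 + (z - cz)**2) - r

def raymarch(ox, oy, oz, dx, dy, dz, sdf, max_steps=128, eps=1e-4):
    """Sphere-trace along a ray: step forward by the distance the SDF
    reports until we are within eps of a surface (hit) or give up (miss)."""
    t = 0.0
    for _ in range(max_steps):
        d = sdf(ox + dx * t, oy + dy * t, oz + dz * t)
        if d < eps:
            return t      # distance along the ray to the surface point
        t += d
        if t > 100.0:     # far plane: nothing out here
            break
    return None           # no surface behind this pixel

# A ray starting at z = -5 aimed down +z hits the unit sphere at z = -1,
# i.e. at distance 4 along the ray; a ray offset by y = 3 misses entirely.
hit = raymarch(0.0, 0.0, -5.0, 0.0, 0.0, 1.0, sphere_sdf)
miss = raymarch(0.0, 3.0, -5.0, 0.0, 0.0, 1.0, sphere_sdf)
```

The same yes/no surface query works for any function that returns a distance, which is why SDFs can describe smooth shapes that would take many thousands of triangles to approximate.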
Hardware

NVIDIA Launches GeForce GTX 16 Series Turing Gaming Laptop GPUs (hothardware.com) 22

MojoKid writes: NVIDIA has launched a new family of more budget-friendly Turing graphics chips for gaming laptops, called the GeForce GTX 1650, GeForce GTX 1660, and GeForce GTX 1660 Ti. The new GPUs will power roughly 80 different OEM mainstream gaming notebook designs, starting in the $799 price range. Compared to a 4-year-old gaming laptop with a GeForce GTX 960M, NVIDIA says that a modern counterpart equipped with a GeForce GTX 1660 Ti can deliver 4x the performance in today's battle royale-style games like Apex Legends, Fortnite, and PUBG. As for the GeForce GTX 1650, NVIDIA is promising a 2.5x performance advantage compared to the GTX 950M and a 1.7x advantage compared to the previous-generation GTX 1050. Gamers should expect consistent 60 fps performance in the above-mentioned titles at 1080p, though the company didn't offer GTX 1660 vs. GTX 1060 performance comparisons. According to NVIDIA, every major OEM will be releasing GeForce GTX 16 Series laptops, including well-known brands like ASUS, Dell/Alienware, Acer, Hewlett-Packard, and Lenovo (among others).
PlayStation (Games)

Sony Cracks Down On Sexually Explicit Content In Games (engadget.com) 299

Slashdot reader xavdeman writes: Hot on the heels of its announcement of the specifications of the next PlayStation, Sony has revealed a new crackdown on explicit content. Citing "the rise of the #MeToo movement" and a concern of "legal and social action" in the USA, Sony has said it wants to address concerns about the depiction of women in video games playable on its platform as well as protect children's "sound growth and development." The new rules were reportedly already responsible for puritan cutscene alterations in the Western PS4 release of the multi-platform title Devil May Cry 5, where lens flares were used to cover up partial nudity.
Emulation (Games)

HD Emulation Mod Makes 'Mode 7' SNES Games Look Like New (arstechnica.com) 44

An anonymous reader quotes a report from Ars Technica: Gamers of a certain age probably remember being wowed by the quick, smooth scaling and rotation effects of the Super Nintendo's much-ballyhooed "Mode 7" graphics. Looking back, though, those gamers might also notice how chunky and pixelated those background transformations could end up looking, especially when viewed on today's high-end screens. Emulation to the rescue. A modder going by the handle DerKoun has released an "HD Mode 7" patch for the accuracy-focused SNES emulator bsnes. In their own words, the patch "performs Mode 7 transformations... at up to 4 times the horizontal and vertical resolution" of the original hardware.

The results, as you can see in the above gallery and the below YouTube video, are practically miraculous. Pieces of Mode 7 maps that used to be boxy smears of color far in the distance are now sharp, straight lines with distinct borders and distinguishable features. It's like looking at a brand-new game. Perhaps the most impressive thing about these effects is that they take place on original SNES ROM and graphics files; DerKoun has said that "no artwork has been modified" in the games since the project was just a proof of concept a month ago. That makes this project different from upscaling emulation efforts for the N64 and other retro consoles, which often require hand-drawn HD texture packs to make old art look good at higher resolutions.
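For context, Mode 7 is just a per-scanline 2x2 affine transform of the background layer, and the "HD" patch evaluates that transform at sub-pixel positions. A rough Python sketch of the idea follows; the function names and the 4x sampling loop are illustrative assumptions, not DerKoun's actual code:

```python
def mode7_texel(screen_x, screen_y, a, b, c, d, x0, y0):
    """Mode 7 maps a screen coordinate to a background-map coordinate
    with a 2x2 affine matrix plus an origin; the SNES updates the
    matrix per scanline to fake perspective."""
    tx = a * screen_x + b * screen_y + x0
    ty = c * screen_x + d * screen_y + y0
    return tx, ty

def hd_samples(screen_x, screen_y, params, scale=4):
    """The 'HD Mode 7' idea, sketched: evaluate the same transform at
    sub-pixel offsets, so the rotated/scaled background is sampled at
    `scale` times the horizontal and vertical resolution."""
    out = []
    for sy in range(scale):
        for sx in range(scale):
            out.append(mode7_texel(screen_x + sx / scale,
                                   screen_y + sy / scale, *params))
    return out

# With the identity transform, one original screen pixel expands into a
# 4x4 grid of distinct map samples instead of one chunky texel lookup.
samples = hd_samples(10, 20, (1, 0, 0, 1, 0, 0))
```

Because the transform is pure math rather than baked-in artwork, sampling it more densely sharpens the output without touching the original graphics files, which is exactly the property the article highlights.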

Hardware

Qualcomm's Snapdragon 665, 730, and 730G Target AI and Gaming (venturebeat.com) 13

Today at its annual AI Day conference in San Francisco, chipmaker Qualcomm revamped the midrange products in its system-on-chip portfolio with faster graphics, more power-efficient cores, and other silicon accouterments. From a report: The Snapdragon 670 gained a counterpart in the Snapdragon 665, and the Snapdragon 700 series has two new SKUs in the long-rumored Snapdragon 730 and a gaming-optimized variant dubbed Snapdragon 730G. "In the last several years, we've had a few different technologies that we've [explored]," said vice president of product management Kedar Kondap during a press briefing. "One is obviously [the] camera. Secondly, AI, and ... gaming ... [We've] focused on ... power, [making] sure we drive very high performance." The 11-nm Snapdragon 665 packs Kryo 260 cores and offers up to 20 percent power savings with the Adreno 610 GPU. The 8-nm Snapdragon 730 has Kryo 470 cores inside.
Android

'SPURV' Project Brings Windowed Android Apps To Desktop Linux (androidpolice.com) 52

mfilion shares a report from Android Police: A new "experimental containerized Android environment" from a company called Collabora allows Android apps to run in floating windows alongside native applications on desktop Linux. You can read all the technical details at the source link, but put simply, 'SPURV' creates a virtual Android device on your Linux computer, much like Bluestacks and other similar tools. There are various components of SPURV that allow the Android environment to play audio, connect to networks, and display hardware-accelerated graphics through the underlying Linux system.

The most interesting part is 'SPURV HWComposer,' which renders Android applications in windows, alongside the windows from native Linux applications. This is what sets SPURV apart from (most) other methods of running Android on a computer. For this to work, the Linux desktop has to be using the Wayland display server (some Linux-based OSes use X11). Pre-built binaries for SPURV are not currently available -- you have to build it yourself from the source code. Still, it's an interesting proof-of-concept, and hopefully someone turns it into a full-featured product.
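Since SPURV requires a Wayland session, it is worth checking what your desktop is running before attempting a build. A small Python sketch using the conventional environment variables (an illustration only, not part of SPURV itself):

```python
import os

def session_is_wayland(env=None):
    """Best-effort check for whether the current desktop session runs on
    Wayland rather than X11, via the conventional environment variables."""
    env = os.environ if env is None else env
    # Login managers typically set XDG_SESSION_TYPE to "wayland" or "x11".
    if env.get("XDG_SESSION_TYPE", "").lower() == "wayland":
        return True
    # Wayland compositors also export WAYLAND_DISPLAY (e.g. "wayland-0").
    return bool(env.get("WAYLAND_DISPLAY"))

on_wayland = session_is_wayland({"XDG_SESSION_TYPE": "wayland"})
on_x11 = session_is_wayland({"XDG_SESSION_TYPE": "x11"})
```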

Graphics

What's The Correct Way to Pronounce 'GIF'? (thenewstack.io) 453

"Apparently we're all fighting about how to pronounce 'GIF' again on Twitter," writes technology columnist Mike Melanson: I personally find the argument of web designer Aaron Bazinet, who managed to secure the domain howtoreallypronouncegif.com, rather convincing in its simplicity: "It's the most natural, logical way to pronounce it. That's why when everyone comes across the word for the first time, they use a hard G [as in "gift"]." Bazinet relates the origin of the debate as such:

"The creator of the GIF image format, Steve Wilhite of CompuServe, when deciding on the pronunciation, said he deliberately chose to echo the American peanut butter brand, Jif, and CompuServe employees would often say 'Choosy developers choose GIF(jif)', playing off of Jif's television commercials. If you hear anyone pronounce GIF with a soft G, it's because they know something of this history."

Wilhite attempted to settle the controversy in 2013 when accepting a lifetime achievement award at the 17th annual Webby awards. Using an actual animated .gif for his five-word acceptance speech, he authoritatively announced his preferred pronunciation. However, the chief editor of the Oxford English Dictionary argues that "A coiner effectively loses control of a word once it's out there," adding that "the pronunciation with a hard g is now very widespread and readily understood."

One linguist addressed the topic on Twitter this week, noting studies that found past usage of "gi" in words has been almost evenly split between hard and soft g sounds. Their thread also answers a related question: how will I weaponize a trivial and harmless consonant difference to make other people feel bad and self-conscious about themselves?

Her response? "Maybe just....don't do this."
Portables (Apple)

Apple Still Hasn't Fixed Its MacBook Keyboard Problem (wsj.com) 125

Joanna Stern, writing for the Wall Street Journal [the link may be paywalled]: Why is the breaking of my MacBook Air keyboard so insanely maddening? Let's take a trip down Memory Lane.
April 2015: Apple releases the all-new MacBook with a "butterfly" keyboard. In order to achieve extreme thinness, the keys are much flatter than older generations but the butterfly mechanism underneath, for which the keyboard is named, aims to replicate the bounce of a more traditional keyboard.
October 2016: The MacBook Pro arrives with a second-generation butterfly keyboard. A few months later, some begin to report that letters or characters don't appear, that keys get stuck or that letters unexpectedly repeat.
June 2018: Apple launches a keyboard repair program for what the company says is a "small percentage" of MacBook and MacBook Pro keyboards impacted.
July 2018: Apple releases a new high-end MacBook Pro with the third-generation of the keyboard that's said to fix the issues.
October 2018: Apple's new MacBook Air also has the third-generation keyboard. I recommend it, and even get one for myself.

Which brings us to the grand year 2019 and my MacBook Air's faulty E and R keys. Others have had problems with Apple's latest laptops, too. A proposed nationwide class-action suit alleges that Apple has been aware of the defective nature of these keyboards since 2015 yet sold affected laptops without disclosing the problem. "We are aware that a small number of users are having issues with their third-generation butterfly keyboard and for that we are sorry," an Apple spokesman said in a statement. "The vast majority of Mac notebook customers are having a positive experience with the new keyboard." If you have a problem, contact Apple customer service, he added.
John Gruber, a longtime Apple columnist: I consider these keyboards the worst products in Apple history. MacBooks should have the best keyboards in the industry; instead they're the worst. They're doing lasting harm to the reputation of the MacBook brand.
AI

MIT Develops Algorithm To Accelerate Neural Networks By 200x (extremetech.com) 43

An anonymous reader quotes a report from ExtremeTech: MIT researchers have reportedly developed an algorithm that can accelerate [neural networks] by up to 200x. The NAS (Neural Architecture Search, in this context) algorithm they developed "can directly learn specialized convolutional neural networks (CNNs) for target hardware platforms -- when run on a massive image dataset -- in only 200 GPU hours," MIT News reports. This is a massive improvement over the 48,000 GPU hours Google reported taking to develop a state-of-the-art NAS algorithm for image classification. The goal of the researchers is to democratize AI by allowing researchers to experiment with various aspects of CNN design without needing enormous GPU arrays to do the front-end work. If finding state-of-the-art approaches requires 48,000 GPU hours, precious few people, even at large institutions, will ever have the opportunity to try.

Algorithms produced by the new NAS were, on average, 1.8x faster than the CNNs tested on a mobile device with similar accuracy. The new algorithm leveraged techniques like path-level binarization, which stores just one path at a time to reduce memory consumption by an order of magnitude. MIT doesn't actually link out to specific research reports, but from a bit of Google sleuthing, the referenced articles appear to be here and here -- two different research reports from an overlapping group of researchers. The teams focused on pruning entire potential paths for CNNs to use, evaluating each in turn. Lower-probability paths are successively pruned away, leaving the final, best-case path. The new model incorporated other improvements as well. Architectures were checked against hardware platforms for latency when evaluated. In some cases, their model predicted superior performance for platforms that had been dismissed as inefficient. For example, 7x7 filters for image classification are typically not used, because they're quite computationally expensive -- but the research team found that these actually worked well for GPUs.
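As a toy illustration of the pruning loop described above (not the researchers' actual implementation; the candidate operations and probabilities are made up):

```python
def prune_paths(paths):
    """Iteratively drop the lowest-probability candidate path until one
    remains. `paths` maps a path name to its current architecture-parameter
    probability. This is only a sketch of the search idea; the real method
    also retrains the probabilities between pruning steps."""
    paths = dict(paths)          # don't mutate the caller's dict
    while len(paths) > 1:
        worst = min(paths, key=paths.get)   # lowest-probability path
        del paths[worst]
    return next(iter(paths))     # the surviving, best-case path

# Hypothetical candidate operations on one edge of the search graph:
candidates = {"conv3x3": 0.45, "conv5x5": 0.20, "conv7x7": 0.25, "skip": 0.10}
best = prune_paths(candidates)
```

Because only one candidate path needs to be held in memory at a time (the binarization trick), the search fits on far less hardware than approaches that evaluate every architecture in full.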

AI

NVIDIA's Latest AI Software Turns Rough Doodles Into Realistic Landscapes (theverge.com) 35

An anonymous reader quotes a report from The Verge: AI is going to be huge for artists, and the latest demonstration comes from Nvidia, which has built prototype software that turns doodles into realistic landscapes. Using a type of AI model known as a generative adversarial network (GAN), the software gives users what Nvidia is calling a "smart paint brush." This means someone can make a very basic outline of a scene (drawing, say, a tree on a hill) before filling in their rough sketch with natural textures like grass, clouds, forests, or rocks. The results are not quite photorealistic, but they're impressive all the same. The software generates AI landscapes instantly, and it's surprisingly intuitive. For example, when a user draws a tree and then a pool of water underneath it, the model adds the tree's reflection to the pool. Nvidia didn't say if it has any plans to turn the software into an actual product, but it suggests that tools like this could help "everyone from architects and urban planners to landscape designers and game developers" in the future. The company has published a video showing off the imagery it handles particularly well.
Graphics

Crytek Shows 4K 30 FPS Ray Tracing On Non-RTX AMD and NVIDIA GPUs (techspot.com) 140

dryriver writes: Crytek has published a video showing an ordinary AMD Vega 56 GPU -- which has no ray-tracing-specific circuitry and costs only around $450 -- real-time ray tracing a complex 3D city environment at 4K 30 FPS. Crytek says the technology demo runs fine on most normal NVIDIA and AMD gaming GPUs. As if this weren't impressive enough, the software real-time ray tracing technology is still in development and not even final. The framerates achieved may thus go up further, raising the question of precisely what the benefits of owning a super-expensive NVIDIA RTX 20xx series GPU are. NVIDIA has claimed over and over again that without its new RTX cores and AI denoiser, GPUs will choke on real-time ray tracing tasks in games. Crytek appears to have proven that with some intelligently written code, bog-standard GPU cores can handle real-time ray tracing just fine -- no RTX cores, AI denoiser or anything else NVIDIA touts as necessary.
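The core operation behind any software ray tracer is plain arithmetic that ordinary GPU shader cores (or a CPU) can execute without dedicated hardware. A minimal Python illustration of analytic ray/sphere intersection follows; it is purely illustrative, and Crytek's demo obviously uses far more sophisticated techniques:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Solve the quadratic |o + t*d - c|^2 = r^2 for the nearest positive t.
    Returns the distance along the ray to the hit point, or None on a miss.
    This is the kind of arithmetic any shader core can run."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx*dx + dy*dy + dz*dz
    b = 2.0 * (ox*dx + oy*dy + oz*dz)
    c = ox*ox + oy*oy + oz*oz - radius*radius
    disc = b*b - 4.0*a*c
    if disc < 0:
        return None                          # ray misses the sphere
    t = (-b - math.sqrt(disc)) / (2.0*a)     # nearer of the two roots
    return t if t > 0 else None              # ignore hits behind the origin

# A ray from (0, 0, -5) aimed down +z strikes the unit sphere at distance 4.
t = ray_sphere_hit((0, 0, -5), (0, 0, 1), (0, 0, 0), 1.0)
```

Dedicated RT hardware accelerates the traversal of scene acceleration structures, but as Crytek's demo suggests, careful software can keep the per-ray math cheap enough for general-purpose cores.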
iMac

Apple Finally Updates the iMac With Significantly More Powerful CPU and GPU Options (arstechnica.com) 143

Today, Apple will finally begin taking orders for newly refreshed 21- and 27-inch iMacs. The new versions don't change the basic design or add major new features, but they offer substantially faster configuration options for the CPU and GPU. From a report: The 21.5-inch iMac now has a 6-core, 8th-generation Intel CPU option -- up from a maximum of four cores before. The 27-inch now has six cores as the standard configuration, with an optional upgrade to a 3.6GHz, 9th-generation, 8-core Intel Core i9 CPU that Apple claims will double performance over the previous 27-inch iMac. The base 27-inch model has a 3GHz 6-core Intel Core i5 CPU, with intermediate configurations at 3.1GHz and 3.7GHz (both Core i5). The big news is arguably that both sizes now offer high-end, workstation-class Vega-graphics options for the first time. Apple added a similar upgrade option to the 15-inch MacBook Pro late last year. In this case, the 21.5-inch iMac has an option for the 20-compute-unit version of Vega with 4GB of HBM2 video memory. That's the same as the top-end 15-inch MacBook Pro option.

The 27-inch iMac can now be configured with the Radeon Pro Vega 48 with 8GB of HBM2. For reference, the much pricier iMac Pro has Vega 56 and Vega 64 options. Apple claims the Vega 48 will net a 50-percent performance improvement over the Radeon Pro 580, the previous top configuration. Speaking of the previous top configuration, the non-Vega GPU options are the same as what was available yesterday. The only difference is that they now have an "X" affixed to the numbers in their names, per AMD branding conventions -- i.e., Radeon Pro 580X instead of 580. RAM options are the same in terms of volume (up to 32GB for the 21.5-inch and 64GB for the 27-inch), but the DDR4 RAM is slightly faster now, at 2666MHz.

Graphics

NVIDIA's Ray Tracing Tech Will Soon Run On Older GTX Cards (engadget.com) 98

NVIDIA's older GeForce GTX 10-series cards will be getting the company's new ray-tracing tech in April. The technology, which is currently only available on its new RTX cards, "will work on GPUs from the 1060 and up, albeit with some serious caveats," reports Engadget. "Some games like Battlefield V will run just fine and deliver better visuals, but other games, like the freshly released Metro Exodus, will run at just 18 fps at 1440p -- obviously an unplayable frame-rate." From the report: What games you'll be able to play with ray-tracing tech (also known as DXR) on NVIDIA GTX cards depends entirely on how it's implemented. In Battlefield V, for instance, the tech is only used for things like reflections. On top of that, you can dial down the strength of the effect so that it consumes less computing horsepower. Metro Exodus, on the other hand, uses ray tracing to create highly realistic "global illumination" effects, simulating lighting from the real world. It's the first game that really showed the potential of RTX cards and actually generated some excitement about the tech. However, because it's so computationally intensive, GTX cards (which don't have the RTX tensor cores) will effectively be too slow to run it.

NVIDIA explained that when it was first developing the next-gen RTX tech, it found chips using Pascal tech would be "monster" sized and consume up to 650 watts. That's because the older cards lack both the integer cores and tensor cores found on the RTX cards. They get particularly stuck on ray tracing, running about four times slower than the RTX cards on Metro Exodus. Since Metro Exodus is so heavily ray-traced, the RTX cards run it three times quicker than older GTX 10-series cards. However, that falls to two times for Shadow of the Tomb Raider, and 1.6 times for Battlefield V, because both of those games use ray tracing less. The latest GTX 1660 and 1660 Ti GPUs, which don't have RT cores but do have integer cores, will run ray-traced games moderately better than last-gen 10-series GPUs.
NVIDIA also announced that Unity and Unreal Engine now support ray-tracing, allowing developers to implement the tech into their games. Developers can use NVIDIA's new set of tools called GameWorks RTX to achieve this.

"It includes the RTX Denoiser SDK that enables real-time ray-tracing through techniques that reduce the required ray count and number of samples per pixel," adds Engadget. "It will support ray-traced effects like area light shadows, glossy reflections, ambient occlusion and diffuse global illumination (the latter is used in Metro Exodus). Suffice to say, all of those things will make game look a lot prettier."
Displays

VR Company Co-Founder Spends an Entire Week in a VR Headset (pcgamer.com) 39

An anonymous reader quotes PC Gamer: Not too long into a 168-hour VR marathon session, Jak Wilmot admits the monotony got to him. Wilmot, who is the co-founder of Disrupt VR, also says this experiment is "quite possibly the dumbest thing" he's ever done. So, why do it? For science, of course. I can't imagine immersing myself in a virtual world for a full week, nonstop night and day. Wilmot did it, though, for the most part -- he allowed himself 30 seconds to switch VR headsets when needed, and 30 seconds without a headset on to eat, if required. Other than those small breaks, he spent every other moment in VR...

There doesn't seem to be some big takeaway from this experiment (aside from, perhaps, don't drink coffee while playing VR), though one thing I also found interesting was his integration back into the real world when the experiment was over. "I have never appreciated the smell of outside air so much. One thing we cannot replicate is nature. We can do it visually and auditorily, but there is something about the energy of outside that is amazing," Wilmot observed.

PC Gamer calls it "probably at least partially a publicity stunt. But it's still interesting to see how donning a VR headset for an extended period of time and essentially living in virtual worlds can mess with the mind." Wilmot wore VR gear while working -- and even while showering (with the VR gear protected by plastic), blacking out his windows so he couldn't tell day from night, calling it "a week in the future..."

"I almost feel like I'm in my own 500-suare-foot spaceship," he says at one point, "and I'm really missing earth, and I'm missing nature." Early on he also reported some mild claustrophobia.

You can watch the moment where, after seven days, he removes the headset and returns to conventional reality, joking "Oh my gosh, the graphics are so good." He reports a slight disorientation as his eyes catch up with real life, and says it changed his perspective on people in the real world, seeing them as "individuals in one collection, one environment -- as avatars."
Patents

Apple Dealt Legal Blow as Jury Awards Qualcomm $31 Million (cnet.com) 47

Apple violated three Qualcomm patents and should pay the chipmaker $31 million for infringing on its technology, a jury decided Thursday, giving Qualcomm momentum as it heads into another legal skirmish with the iPhone maker next month. From a report: Qualcomm, which filed the suit in July 2017, alleged Apple had used its technology without permission in some versions of its popular iPhone. The jury awarded Qualcomm the full amount it requested at the start of the two-week trial, which was held in San Diego. One disputed Qualcomm patent covers technology that lets a smartphone quickly connect to the internet once the device is turned on. Another deals with graphics processing and battery life. The third addresses technology that shifts traffic between a phone's apps processor and modem. The $31 million in damages -- or $1.41 per infringing iPhone -- is a drop in the bucket for Apple, which briefly became a $1 trillion company last year. But it marks an important victory for Qualcomm, burnishing its reputation as a mobile components innovator. The win also lends credibility to the notion that much of the company's innovation is reflected in iPhones.
Graphics

NVIDIA Launches New $219 Turing-Powered GeForce GTX 1660 (hothardware.com) 101

MojoKid writes: NVIDIA took the wraps off yet another lower-cost Turing-based graphics card today, dubbed the GeForce GTX 1660. For a $219 MSRP, the card offers a cut-down NVIDIA TU116 GPU comprising 1,408 CUDA cores with a 1785MHz boost clock and 6GB of GDDR6 RAM with 192.1GB/s of bandwidth. Generally speaking, the new GeForce GTX 1660 is 15% to 30% faster than NVIDIA's previous-generation GeForce GTX 1060 but doesn't support the ray tracing and DLSS features that the majority of NVIDIA's new Turing cards support. Performance-wise, the GeForce GTX 1660 is generally faster than an AMD Radeon RX 590 overall. Boards from various OEM partners should be in the channel for purchase this week.
Businesses

Amazon Lobbied More Government Entities Than Any Other Public US Company Last Year (fortune.com) 41

Amazon lobbied more government entities last year than any other public U.S. company, covering issues like healthcare, transportation, defense, and labor regulation. "Across 2018, Amazon contacted 40 different federal entities on 21 different general issue areas," reports Fortune, citing a report from Axios. "The only tech giant to lobby on more issues than Amazon was Google's Alphabet." From the report: In terms of money spent, Amazon's $14.4 million is topped only by Alphabet's $21 million, says Bloomberg. While the tech industry overall spent less than half of the $280 million from pharmaceutical and healthcare products companies in Washington, Amazon has increased spending 460% since 2012, growing quickly within its trade. According to Axios, Amazon lobbied on self-driving car and drone issues, hinting at new methods of delivery. It supported a law allowing pharmacists to tell patients when using their insurance is actually more expensive, aiding Amazon's new investment in PillPack. It also covered the labeling of bioengineered food and a pilot program allowing online shoppers to use the Supplemental Nutritional Assistance Program -- signs of Amazon's emerging grocery business.
