Android

'SPURV' Project Brings Windowed Android Apps To Desktop Linux (androidpolice.com) 52

mfilion shares a report from Android Police: A new "experimental containerized Android environment" from a company called Collabora allows Android apps to run in floating windows alongside native applications on desktop Linux. You can read all the technical details at the source link, but put simply, 'SPURV' creates a virtual Android device on your Linux computer, much like Bluestacks and other similar tools. There are various components of SPURV that allow the Android environment to play audio, connect to networks, and display hardware-accelerated graphics through the underlying Linux system.

The most interesting part is 'SPURV HWComposer,' which renders Android applications in windows, alongside the windows from native Linux applications. This is what sets SPURV apart from (most) other methods of running Android on a computer. For this to work, the Linux desktop has to be using the Wayland display server (some Linux-based OSes use X11). Pre-built binaries for SPURV are not currently available -- you have to build it yourself from the source code. Still, it's an interesting proof-of-concept, and hopefully someone turns it into a full-featured product.

Graphics

What's The Correct Way to Pronounce 'GIF'? (thenewstack.io) 453

"Apparently we're all fighting about how to pronounce 'GIF' again on Twitter," writes technology columnist Mike Melanson: I personally find the argument of web designer Aaron Bazinet, who managed to secure the domain howtoreallypronouncegif.com, rather convincing in its simplicity: "It's the most natural, logical way to pronounce it. That's why when everyone comes across the word for the first time, they use a hard G [as in "gift"]." Bazinet relates the origin of the debate as such:

"The creator of the GIF image format, Steve Wilhite of CompuServe, when deciding on the pronunciation, said he deliberately chose to echo the American peanut butter brand, Jif, and CompuServe employees would often say 'Choosy developers choose GIF(jif)', playing off of Jif's television commercials. If you hear anyone pronounce GIF with a soft G, it's because they know something of this history."

Wilhite attempted to settle the controversy in 2013 when accepting a lifetime achievement award at the 17th annual Webby Awards. Using an actual animated .gif for his five-word acceptance speech, he authoritatively announced his preferred pronunciation. However, the chief editor of the Oxford English Dictionary argues that "A coiner effectively loses control of a word once it's out there," adding that "the pronunciation with a hard g is now very widespread and readily understood."

One linguist addressed the topic on Twitter this week, noting studies that found past usage of "gi" in words has been almost evenly split between hard and soft g sounds. Their thread also answers a related question: how will I weaponize a trivial and harmless consonant difference to make other people feel bad and self-conscious about themselves?

Her response? "Maybe just....don't do this."
Portables (Apple)

Apple Still Hasn't Fixed Its MacBook Keyboard Problem (wsj.com) 125

Joanna Stern, writing for the Wall Street Journal [the link may be paywalled]: Why is the breaking of my MacBook Air keyboard so insanely maddening? Let's take a trip down Memory Lane.
April 2015: Apple releases the all-new MacBook with a "butterfly" keyboard. In order to achieve extreme thinness, the keys are much flatter than older generations but the butterfly mechanism underneath, for which the keyboard is named, aims to replicate the bounce of a more traditional keyboard.
October 2016: The MacBook Pro arrives with a second-generation butterfly keyboard. A few months later, some begin to report that letters or characters don't appear, that keys get stuck or that letters unexpectedly repeat.
June 2018: Apple launches a keyboard repair program for what the company says is a "small percentage" of affected MacBook and MacBook Pro keyboards.
July 2018: Apple releases a new high-end MacBook Pro with a third-generation keyboard that's said to fix the issues.
October 2018: Apple's new MacBook Air also has the third-generation keyboard. I recommend it, and even get one for myself.

Which brings us to the grand year 2019 and my MacBook Air's faulty E and R keys. Others have had problems with Apple's latest laptops, too. A proposed nationwide class-action suit alleges that Apple has been aware of the defective nature of these keyboards since 2015 yet sold affected laptops without disclosing the problem. "We are aware that a small number of users are having issues with their third-generation butterfly keyboard and for that we are sorry," an Apple spokesman said in a statement. "The vast majority of Mac notebook customers are having a positive experience with the new keyboard." If you have a problem, contact Apple customer service, he added.
John Gruber, a longtime Apple columnist: I consider these keyboards the worst products in Apple history. MacBooks should have the best keyboards in the industry; instead they're the worst. They're doing lasting harm to the reputation of the MacBook brand.
AI

MIT Develops Algorithm To Accelerate Neural Networks By 200x (extremetech.com) 43

An anonymous reader quotes a report from ExtremeTech: MIT researchers have reportedly developed an algorithm that can accelerate [neural networks] by up to 200x. The NAS (Neural Architecture Search, in this context) algorithm they developed "can directly learn specialized convolutional neural networks (CNNs) for target hardware platforms -- when run on a massive image dataset -- in only 200 GPU hours," MIT News reports. This is a massive improvement over the 48,000 hours Google reported taking to develop a state-of-the-art NAS algorithm for image classification. The researchers' goal is to democratize AI by allowing people to experiment with various aspects of CNN design without needing enormous GPU arrays to do the front-end work. If finding state-of-the-art approaches requires 48,000 GPU hours, precious few people, even at large institutions, will ever have the opportunity to try.

Algorithms produced by the new NAS were, on average, 1.8x faster than the CNNs tested on a mobile device with similar accuracy. The new algorithm leveraged techniques like path-level binarization, which stores just one path at a time to reduce memory consumption by an order of magnitude. MIT doesn't actually link out to specific research reports, but from a bit of Google sleuthing, the referenced articles appear to be here and here -- two different research reports from an overlapping group of researchers. The teams focused on pruning entire potential paths for CNNs to use, evaluating each in turn. Lower-probability paths are successively pruned away, leaving the final, best-case path. The new model incorporated other improvements as well. Architectures were checked against hardware platforms for latency when evaluated. In some cases, their model predicted superior performance for platforms that had been dismissed as inefficient. For example, 7x7 filters for image classification are typically not used, because they're quite computationally expensive -- but the research team found that these actually worked well for GPUs.
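To make the pruning idea concrete, here is a toy Python sketch (not the researchers' code; the candidate operations, latency costs, and reward function are all invented for illustration) of path-level search: keep a probability per candidate op, sample ("binarize") a single path at a time so only one path's weights sit in memory, reinforce paths that score well against a hardware latency budget, and successively prune low-probability candidates.

```python
import random

# Toy, hypothetical sketch of path-level architecture search with pruning.
CANDIDATE_OPS = ["conv3x3", "conv5x5", "conv7x7", "skip"]

def init_layer():
    # Start each layer with uniform probabilities over its candidate ops.
    return {op: 1.0 / len(CANDIDATE_OPS) for op in CANDIDATE_OPS}

def sample_path(layer):
    # "Binarization": activate exactly one candidate per layer at a time,
    # so only that path's weights need to be held in memory.
    ops, weights = zip(*layer.items())
    return random.choices(ops, weights=weights, k=1)[0]

def reward(path, latency_budget_ms=30.0):
    # Stand-in for measured accuracy minus a latency penalty on the target device.
    latency = {"conv3x3": 5.0, "conv5x5": 9.0, "conv7x7": 14.0, "skip": 1.0}
    quality = {"conv3x3": 0.6, "conv5x5": 0.7, "conv7x7": 0.8, "skip": 0.2}
    total_latency = sum(latency[op] for op in path)
    return sum(quality[op] for op in path) - 0.05 * max(0.0, total_latency - latency_budget_ms)

def search(num_layers=4, steps=500, prune_below=0.05):
    arch = [init_layer() for _ in range(num_layers)]
    for _ in range(steps):
        path = [sample_path(layer) for layer in arch]
        r = reward(path)
        for layer, op in zip(arch, path):
            layer[op] = max(layer[op] + 0.01 * r, 1e-6)  # reinforce or penalize the sampled op
            total = sum(layer.values())
            for k in layer:
                layer[k] /= total
            # Successively prune candidates whose probability has collapsed.
            for k in [k for k, p in layer.items() if p < prune_below and len(layer) > 1]:
                del layer[k]
    return [max(layer, key=layer.get) for layer in arch]

print(search())  # e.g. ['conv5x5', 'conv3x3', ...] -- results vary with the random seed
```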

AI

NVIDIA's Latest AI Software Turns Rough Doodles Into Realistic Landscapes (theverge.com) 35

An anonymous reader quotes a report from The Verge: AI is going to be huge for artists, and the latest demonstration comes from Nvidia, which has built prototype software that turns doodles into realistic landscapes. Using a type of AI model known as a generative adversarial network (GAN), the software gives users what Nvidia is calling a "smart paint brush." This means someone can make a very basic outline of a scene (drawing, say, a tree on a hill) before filling in their rough sketch with natural textures like grass, clouds, forests, or rocks. The results are not quite photorealistic, but they're impressive all the same. The software generates AI landscapes instantly, and it's surprisingly intuitive. For example, when a user draws a tree and then a pool of water underneath it, the model adds the tree's reflection to the pool. Nvidia didn't say if it has any plans to turn the software into an actual product, but it suggests that tools like this could help "everyone from architects and urban planners to landscape designers and game developers" in the future. The company has published a video showing off the imagery it handles particularly well.
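The Verge article doesn't detail the model's internals, but semantic-map-to-image GANs of this kind typically inject the user's label map into the generator by letting it modulate normalization layers at every resolution. Below is a minimal, hypothetical PyTorch sketch of that general idea; the class name, channel counts, and label classes are invented for illustration, and this is not NVIDIA's actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LayoutModulatedNorm(nn.Module):
    """Hypothetical sketch: normalize a feature map, then re-scale and shift it
    using parameters predicted from the user's semantic layout, so the painted
    labels (sky, water, tree, ...) steer the generator at every resolution."""
    def __init__(self, feature_channels: int, label_channels: int, hidden: int = 128):
        super().__init__()
        self.norm = nn.InstanceNorm2d(feature_channels, affine=False)
        self.shared = nn.Sequential(
            nn.Conv2d(label_channels, hidden, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
        )
        self.gamma = nn.Conv2d(hidden, feature_channels, kernel_size=3, padding=1)
        self.beta = nn.Conv2d(hidden, feature_channels, kernel_size=3, padding=1)

    def forward(self, x: torch.Tensor, segmap: torch.Tensor) -> torch.Tensor:
        # Resize the label map to the feature map's spatial size, then predict
        # per-pixel scale/shift from it and apply them to the normalized features.
        segmap = F.interpolate(segmap, size=x.shape[2:], mode="nearest")
        h = self.shared(segmap)
        return self.norm(x) * (1 + self.gamma(h)) + self.beta(h)

# Toy usage: 8 invented label classes, a 64-channel feature map inside a generator.
features = torch.randn(1, 64, 32, 32)
layout = torch.randn(1, 8, 256, 256)  # stand-in for a one-hot map painted by the user
out = LayoutModulatedNorm(64, 8)(features, layout)
print(out.shape)  # torch.Size([1, 64, 32, 32])
```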
Graphics

Crytek Shows 4K 30 FPS Ray Tracing On Non-RTX AMD and NVIDIA GPUs (techspot.com) 140

dryriver writes: Crytek has published a video showing an ordinary AMD Vega 56 GPU -- which has no ray tracing-specific circuitry and only costs around $450 -- ray tracing a complex 3D city environment in real time at 4K 30 FPS. Crytek says that the technology demo runs fine on most normal NVIDIA and AMD gaming GPUs. As if this wasn't impressive enough already, the software real-time ray tracing technology is still in development and not even final. The framerates achieved may thus go up further, raising the question of precisely what the benefits of owning a super-expensive NVIDIA RTX 20xx series GPU are. Nvidia has claimed over and over again that without its amazing new RTX cores and AI denoiser, GPUs will choke on real-time ray tracing tasks in games. Crytek appears to have proven already that with some intelligently written code, bog-standard GPU cores can handle real-time ray tracing just fine -- no RTX cores, AI denoiser or anything else NVIDIA touts as necessary.
iMac

Apple Finally Updates the iMac With Significantly More Powerful CPU and GPU Options (arstechnica.com) 143

Today, Apple will finally begin taking orders for newly refreshed 21- and 27-inch iMacs. The new versions don't change the basic design or add major new features, but they offer substantially faster configuration options for the CPU and GPU. From a report: The 21.5-inch iMac now has a 6-core, eighth-generation Intel CPU option -- up from a maximum of four cores before. The 27-inch now has six cores as the standard configuration, with an optional upgrade to a 3.6GHz, 9th-gen, 8-core Intel Core i9 CPU that Apple claims will double performance over the previous 27-inch iMac. The base 27-inch model has a 3GHz 6-core Intel Core i5 CPU, with intermediate configurations at 3.1GHz and 3.7GHz (both Core i5). The big news is arguably that both sizes now offer high-end, workstation-class Vega-graphics options for the first time. Apple added a similar upgrade option to the 15-inch MacBook Pro late last year. In this case, the 21.5-inch iMac has an option for the 20-compute-unit version of Vega with 4GB of HBM2 video memory. That's the same as the top-end 15-inch MacBook Pro option.

The 27-inch iMac can now be configured with the Radeon Pro Vega 48 with 8GB of HBM2. For reference, the much pricier iMac Pro has Vega 56 and Vega 64 options. Apple claims the Vega 48 will net a 50-percent performance improvement over the Radeon Pro 580, the previous top configuration. Speaking of the previous top configuration, the non-Vega GPU options are the same as what was available yesterday. The only difference is that they now have an "X" affixed to the numbers in their names, per AMD branding conventions -- i.e., Radeon Pro 580X instead of 580. RAM options are the same in terms of volume (up to 32GB for the 21.5-inch and 64GB for the 27-inch), but the DDR4 RAM is slightly faster now, at 2666MHz.

Graphics

NVIDIA's Ray Tracing Tech Will Soon Run On Older GTX Cards (engadget.com) 98

NVIDIA's older GeForce GTX 10-series cards will be getting the company's new ray-tracing tech in April. The technology, which is currently only available on its new RTX cards, "will work on GPUs from the 1060 and up, albeit with some serious caveats," reports Engadget. "Some games like Battlefield V will run just fine and deliver better visuals, but other games, like the freshly released Metro Exodus, will run at just 18 fps at 1440p -- obviously an unplayable frame-rate." From the report: What games you'll be able to play with ray-tracing tech (also known as DXR) on NVIDIA GTX cards depends entirely on how it's implemented. In Battlefield V, for instance, the tech is only used for things like reflections. On top of that, you can dial down the strength of the effect so that it consumes less computing horsepower. Metro Exodus, on the other hand, uses ray tracing to create highly realistic "global illumination" effects, simulating lighting from the real world. It's the first game that really showed the potential of RTX cards and actually generated some excitement about the tech. However, because it's so computationally intensive, GTX cards (which don't have the RTX tensor cores) will effectively be too slow to run it.

NVIDIA explained that when it was first developing the next-gen RTX tech, it found chips using Pascal tech would be "monster"-sized and consume up to 650 watts. That's because the older cards lack both the integer cores and tensor cores found on the RTX cards. They get particularly stuck on ray-tracing, running about four times slower than the RTX cards on Metro Exodus. Since Metro Exodus is so heavily ray-traced, the RTX cards run it three times quicker than older GTX 10-series cards. However, that falls to two times for Shadow of the Tomb Raider, and 1.6 times for Battlefield V, because both of those games use ray tracing less. The latest GTX 1660 and 1660 Ti GPUs, which don't have RT cores but do have integer cores, will run ray-traced games moderately better than last-gen 10-series GPUs.
NVIDIA also announced that Unity and Unreal Engine now support ray-tracing, allowing developers to integrate the tech into their games. Developers can use NVIDIA's new set of tools, called GameWorks RTX, to achieve this.

"It includes the RTX Denoiser SDK that enables real-time ray-tracing through techniques that reduce the required ray count and number of samples per pixel," adds Engadget. "It will support ray-traced effects like area light shadows, glossy reflections, ambient occlusion and diffuse global illumination (the latter is used in Metro Exodus). Suffice to say, all of those things will make game look a lot prettier."
Displays

VR Company Co-Founder Spends an Entire Week in a VR Headset (pcgamer.com) 39

An anonymous reader quotes PC Gamer: Not too long into a 168-hour VR marathon session, Jak Wilmot admits the monotony got to him. Wilmot, who is the co-founder of Disrupt VR, also says this experiment is "quite possibly the dumbest thing" he's ever done. So, why do it? For science, of course. I can't imagine immersing myself in a virtual world for a full week, nonstop night and day. Wilmot did it, though, for the most part -- he allowed himself 30 seconds to switch VR headsets when needed, and 30 seconds without a headset on to eat, if required. Other than those small breaks, he spent every other moment in VR...

There doesn't seem to be some big takeaway from this experiment (aside from, perhaps, don't drink coffee while playing VR), though one thing I also found interesting was his integration back into the real world when the experiment was over. "I have never appreciated the smell of outside air so much. One thing we cannot replicate is nature. We can do it visually and auditorally, but there is something about the energy of outside that is amazing," Wilmot observed.

PC Gamer calls it "probably at least partially a publicity stunt. But it's still interesting to see how donning a VR headset for an extended period of time and essentially living in virtual worlds can mess with the mind." Wilmot wore VR gear while working -- and even while showering (with the VR gear protected by plastic), blacking out his windows so he couldn't tell day from night, calling it "a week in the future..."

"I almost feel like I'm in my own 500-suare-foot spaceship," he says at one point, "and I'm really missing earth, and I'm missing nature." Early on he also reported some mild claustrophobia.

You can watch the moment where after seven days he removes the headset and returns to conventional reality, joking "Oh my gosh, the graphics are so good." He reports a slight disorientation as his eyes catch up with real life, and says it changed his perspective on people in the real world, seeing them as "individuals in one collection, one environment -- as avatars."
Patents

Apple Dealt Legal Blow as Jury Awards Qualcomm $31 Million (cnet.com) 47

Apple violated three Qualcomm patents and should pay the chipmaker $31 million for infringing on its technology, a jury decided Thursday, giving Qualcomm momentum as it heads into another legal skirmish with the iPhone maker next month. From a report: Qualcomm, which filed the suit in July 2017, alleged Apple had used its technology without permission in some versions of its popular iPhone. The jury awarded Qualcomm the full amount it requested at the start of the two-week trial, which was held in San Diego. One disputed Qualcomm patent covers technology that lets a smartphone quickly connect to the internet once the device is turned on. Another deals with graphics processing and battery life. The third addresses technology that shifts traffic between a phone's apps processor and modem. The $31 million in damages -- or $1.41 per infringing iPhone -- is a drop in the bucket for Apple, a company that briefly became a $1 trillion company last year. But it marks an important victory for Qualcomm, burnishing its reputation as a mobile components innovator. The win also lends credibility to the notion that much of the company's innovation is reflected in iPhones.
Graphics

NVIDIA Launches New $219 Turing-Powered GeForce GTX 1660 (hothardware.com) 101

MojoKid writes: NVIDIA took the wraps off yet another lower cost Turing-based graphics card today, dubbed the GeForce GTX 1660. For a $219 MSRP, the card offers a cut-down NVIDIA TU116 GPU with 1408 CUDA cores, a 1785MHz boost clock and 6GB of GDDR5 RAM with 192.1GB/s of bandwidth. Generally speaking, the new GeForce GTX 1660 is 15% to 30% faster than NVIDIA's previous generation GeForce GTX 1060 but doesn't support the new ray tracing and DLSS features that the majority of NVIDIA's new Turing cards support. Performance-wise, GeForce GTX 1660 is generally faster than an AMD Radeon RX 590 overall. Boards from various OEM partners should be in the channel for purchase this week.
Businesses

Amazon Lobbied More Government Entities Than Any Other Public US Company Last Year (fortune.com) 41

Amazon lobbied more government entities last year than any other public U.S. company, covering issues like healthcare, transportation, defense, and labor regulation. "Across 2018, Amazon contacted 40 different federal entities on 21 different general issue areas," reports Fortune, citing a report from Axios. "The only tech giant to lobby on more issues than Amazon was Google's Alphabet." From the report: In terms of money spent, Amazon's $14.4 million is topped only by Alphabet's $21 million, says Bloomberg. While the tech industry overall spent less than half of the $280 million from pharmaceutical and healthcare products companies in Washington, Amazon has increased spending 460% since 2012, growing quickly within its trade. According to Axios, Amazon lobbied on self-driving car and drone issues, hinting at new methods of delivery. It supported a law allowing pharmacists to tell patients when using their insurance is actually more expensive, aiding Amazon's new investment in PillPack. It also covered the labeling of bioengineered food and a pilot program allowing online shoppers to use the Supplemental Nutritional Assistance Program -- signs of Amazon's emerging grocery business.
Graphics

Microsoft Brings DirectX 12 To Windows 7 (anandtech.com) 119

Microsoft has announced a form of DirectX 12 that will support Windows 7. "Now before you get too excited, this is currently only enabled for World of Warcraft; and indeed it's not slated to be a general-purpose solution like DX12 on Win10," reports AnandTech. "Instead, Microsoft has stated that they are working with a few other developers to bring their DX12 games/backends to Windows 7 as well. As a consumer it's great to see them supporting their product ten years after it launched, but with the entire OS being put out to pasture in nine months, it seems like an odd time to be dedicating resources to bringing it new features." From the report: For some background, Microsoft's latest DirectX API was created to remove some of the CPU bottlenecks for gaming by allowing developers to use low-level programming conventions to shift some of the pressure points away from the CPU. This was a response to single-threaded CPU performance plateauing, making complex graphical workloads increasingly CPU-bound. There are many advantages to using this API over traditional DX11, especially for threading and draw calls. But Microsoft made the decision long ago to only support DirectX 12 on Windows 10, with its WDDM 2.0 driver stack.

Today's announcement is a pretty big surprise on a number of levels. If Microsoft had wanted to back-port DX12 to Windows 7, you would have thought they'd have done it before Windows 7 entered its long-term servicing state. As it is, even free security patches for Windows 7 are set to end on January 14, 2020, which is well under a year away, and the company is actively trying to migrate users to Windows 10 to avoid having a huge swath of machines sitting in an unpatched state. In fact, they are about to add a pop-up notification to Windows 7 to let users know that they are running out of support very soon. So adding a big feature like DX12 now not only risks undermining their own efforts to migrate people away from Windows 7, but also means adding a new feature well after Windows 7 entered long-term support. It's just bizarre.

First Person Shooters (Games)

Study Shows Gamers At High FPS Have Better Kill-To-Death Ratios In Battle Royale Games (hothardware.com) 149

MojoKid writes: Gaming enthusiasts and pro-gamers have believed for a long time that playing on high refresh rate displays with high frame rates offers a competitive edge in fast-action games like PUBG, Fortnite and Apex Legends. The premise is that the faster the display can update the action for you, every millisecond saved will count when it comes to tracking targets and reaction times. This sounds logical, but there's never been specific data tabulated to back this theory up and prove it. NVIDIA, however, just took it upon itself to compile anonymous data -- using its GeForce Experience tool -- on gamers by hours played per week, panel refresh rate and graphics card type. Though this data obviously speaks only to NVIDIA GPU users, the numbers do speak for themselves.

The more powerful the GPU with a higher frame rate, along with higher panel refresh rate, generally speaking, the higher the kill-to-death ratio (K/D) for the gamers that were profiled. In fact, it really didn't matter how many hours per week were played. Casual gamers and heavy-duty daily players alike could see anywhere from about a 50 to 150 percent increase in K/D ratio for significantly better overall player performance. It should be underscored that it really doesn't matter what GPU is at play; gamers with AMD graphics cards that can push high frame rates at 1080p or similar can see similar K/D gains. However, the performance sweet spot seems to be around 144Hz/144FPS: the closer your system can get to that, the better off you'll be, and higher frame rates and refresh rates beyond that help as well.
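The "every millisecond saved" premise is easy to put numbers on: the gap between successive displayed frames is simply 1000 ms divided by the refresh rate, so going from 60Hz to 144Hz trims roughly 10 ms from how old the on-screen frame can be. A quick back-of-the-envelope calculation:

```python
# Frame time at common refresh rates, and the saving relative to a 60Hz baseline.
BASELINE_HZ = 60
for hz in (60, 75, 120, 144, 240):
    frame_time_ms = 1000.0 / hz
    saved_ms = 1000.0 / BASELINE_HZ - frame_time_ms
    print(f"{hz:>3} Hz -> {frame_time_ms:5.2f} ms per frame ({saved_ms:+5.2f} ms vs 60 Hz)")
```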

Open Source

Linux 5.0 Released (phoronix.com) 107

An anonymous reader writes: Linus Torvalds has released Linux 5.0, kicking off the kernel's 28th year of development. Linux 5.0 features include AMD FreeSync support, open-source NVIDIA Turing GPU support, Intel Icelake graphics, Intel VT-d scalable mode, Spectre Variant Two mitigations for NXP PowerPC processors, and countless other additions. eWeek adds: Among the new features that have landed in Linux 5.0 is support for the Adiantum encryption system, developed by Google for low-power devices. Google's Android mobile operating system and ChromeOS desktop operating system both rely on the Linux kernel. "Storage encryption protects your data if your phone falls into someone else's hands," Paul Crowley and Eric Biggers of Google's Android Security and Privacy Team wrote in a blog post. "Adiantum is an innovation in cryptography designed to make storage encryption more efficient for devices without cryptographic acceleration, to ensure that all devices can be encrypted." Memory management in Linux also gets a boost in the 5.0 kernel, with a series of improvements designed to help prevent memory fragmentation, which can reduce performance.
Games

The New 'Red Dead Redemption' Reveals the Biggest Problem With Marquee Games Today: They're Boring as Hell. (theoutline.com) 211

An anonymous reader shares a column: Everything about "Red Dead Redemption 2" is big. The latest open-world western, released in October by Rockstar Games, constantly reminds you of this. It takes roughly 15 minutes for its bland everycowboy star, Arthur Morgan, to gallop across the 29-square-mile map. It has 200 species of animals, including grizzly bears, alligators, and a surprising number of birds. It takes about 45.5 hours to play through the main quest, and 150-plus hours to reach 100 percent completion. There are more than 50 weapons to choose from, such as a double-barreled shotgun and a rusty hatchet. It's big, big, big.

[...] On top of all the bigness, "Red Dead Redemption 2" is also incredibly dull. I've been playing it off and on since it was released, and I'm still waiting for it to get fun. I'm not alone in thinking so -- Mark Brown of Game Maker's Toolkit called it "quite boring" and Mashable said it's a "monumental disappointment." There is a glut of Reddit posts from people complaining about how slow the game feels, usually with a tone of extreme self-consciousness. Unless you're a real a**hole, it's not exactly fun to stray from popular consensus. Perhaps the general hesitancy to criticize the game is due to the fact that it's not technically bad. Its graphics and scale really are impressive. It is designed to please.

And yet "RDR2" seems to exemplify a certain kind of hollowness that's now standard among Triple-A titles. It's very big, with only tedium inside. Call it a Real World Game. The main problem with "RDR2" is that it's comprised almost entirely of tedious, mandatory chores. It always feels like it's stalling for time, trying to juke the number of hours it takes to complete it.

The Military

Microsoft CEO Defends Pentagon Contract Following Employee Outcry (theverge.com) 221

Microsoft CEO Satya Nadella is defending the company's $479 million contract with the Pentagon to supply augmented reality headsets to the U.S. military. "We made a principled decision that we're not going to withhold technology from institutions that we have elected in democracies to protect the freedoms we enjoy," he told CNN Business at Mobile World Congress. "We were very transparent about that decision and we'll continue to have that dialogue [with employees]," he added during the exclusive interview. From the report: Microsoft was awarded the contract to supply "Integrated Visual Augmentation System" prototypes to the U.S. military in November. The company could eventually deliver over 100,000 headsets under the contract. Microsoft's HoloLens augmented reality technology allows users to see the world around them, but with virtual graphics overlaid. The Israeli military, which has taken delivery of some HoloLens headsets, says the technology can be used to help commanders visualize the battlefield and field medics to consult doctors. According to procurement documents, the U.S. military is seeking a single platform that provides its forces with "increased lethality, mobility and situational awareness" in combat. Microsoft employees have recently circulated a letter addressed to Nadella and Brad Smith, the company's president and chief legal officer, arguing that the company should not supply its HoloLens technology to the U.S. military. "It's not about taking arbitrary action by a single company, it's not about 50 people or 100 people or even 100,000 people in a company," he said. "It's really about being a responsible corporate citizen in a democracy."
Graphics

NVIDIA Turing-Based GeForce GTX 1660 Ti Launched At $279 (hothardware.com) 94

MojoKid writes: NVIDIA has launched yet another graphics card today based on the company's new Turing GPU. This latest GPU, however, doesn't support NVIDIA's RTX ray-tracing technology or its DLSS (Deep Learning Super Sampling) image quality tech. The new GeForce GTX 1660 Ti does, however, bring with it all of the other GPU architecture improvements NVIDIA Turing offers. The new TU116 GPU on board the GeForce GTX 1660 Ti supports concurrent integer and floating point instructions (rather than serializing integer and FP instructions), and it also has a redesigned cache structure with double the amount of L2 cache versus its predecessor, while its L1 cache has been outfitted with a wider memory bus that ultimately doubles the bandwidth. NVIDIA's TU116 has 1,536 active CUDA cores, which is a decent uptick from the GTX 1060, but less than the current-gen RTX 2060. Cards will also come equipped with 6GB of GDDR6 memory at 12 Gbps for 288GB/s of bandwidth. Performance-wise, the new GeForce GTX 1660 Ti is typically slightly faster than a previous-gen GeForce GTX 1070, and much faster than a GTX 1060. Cards should be available at retail in the next few days, starting at $279.
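As a sanity check on the quoted figures, peak memory bandwidth follows directly from the per-pin data rate and the memory bus width. The 192-bit bus width below is an assumption (the summary doesn't state it), but it is the width that reproduces the quoted 288GB/s:

```python
def peak_bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    # Peak bandwidth = per-pin data rate x total bus width, converted from bits to bytes.
    return data_rate_gbps * bus_width_bits / 8

# GeForce GTX 1660 Ti figures from the summary: 12 Gbps GDDR6.
# The 192-bit bus width is assumed, not stated above.
print(peak_bandwidth_gb_s(12, 192))  # 288.0 -> matches the quoted 288GB/s
```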
Intel

Intel Starts Publishing Open-Source Linux Driver Code For Discrete GPUs (phoronix.com) 43

fstack writes: Intel is still a year out from releasing their first discrete graphics processors, but the company has begun publishing their open-source Linux GPU driver code. This week they began by publishing patches on top of their existing Intel Linux driver that add support for device local memory (i.e., dedicated video memory) as part of their restructuring effort to support discrete graphics cards. Intel later confirmed this is the start of their open-source driver support for discrete graphics solutions. They have also begun working on Linux driver support for Adaptive-Sync and better reset recovery.
Hardware

Nvidia CEO Foresees a Great Year for PC Gaming Laptops (venturebeat.com) 36

Nvidia has predicted that the year ahead would be a good one for the company, with demand for laptop gaming gear remaining strong. From a report: Looking forward, Nvidia CEO Jensen Huang said it would be a big year for gaming laptops, as Nvidia knows that more than 40 Turing-based gaming laptops (based on the GeForce RTX 2060) are poised to launch during the year. Those laptops use mid-range RTX cards -- graphics processing units (GPUs) built on Nvidia's new Turing architecture, the GeForce RTX line that can do real-time ray tracing -- that are also battery efficient.

Huang acknowledged that visibility is limited. I asked him if cloud gaming would be a disruptive force during the year. But he noted that Nvidia had been providing its own cloud gaming solution, GeForce Now, with relatively little impact on the market for three years. So he said it remains to be seen if cloud gaming and the "Netflix of games" would make an impact on the market. In the meantime, he said that gaming laptops would launch.
