Businesses

Amazon Lobbied More Government Entities Than Any Other Public US Company Last Year (fortune.com) 41

Amazon lobbied more government entities last year than any other public U.S. company, covering issues like healthcare, transportation, defense, and labor regulation. "Across 2018, Amazon contacted 40 different federal entities on 21 different general issue areas," reports Fortune, citing a report from Axios. "The only tech giant to lobby on more issues than Amazon was Google's Alphabet." From the report: In terms of money spent, Amazon's $14.4 million is topped only by Alphabet's $21 million, says Bloomberg. While the tech industry overall spent less than half of the $280 million from pharmaceutical and healthcare products companies in Washington, Amazon has increased spending 460% since 2012, growing quickly within its trade. According to Axios, Amazon lobbied on self-driving car and drone issues, hinting at new methods of delivery. It supported a law allowing pharmacists to tell patients when using their insurance is actually more expensive, aiding Amazon's new investment in PillPack. It also covered the labeling of bioengineered food and a pilot program allowing online shoppers to use the Supplemental Nutritional Assistance Program -- signs of Amazon's emerging grocery business.
Graphics

Microsoft Brings DirectX 12 To Windows 7 (anandtech.com) 119

Microsoft has announced a form of DirectX 12 that will support Windows 7. "Now before you get too excited, this is currently only enabled for World of Warcraft; and indeed it's not slated to be a general-purpose solution like DX12 on Win10," reports AnandTech. "Instead, Microsoft has stated that they are working with a few other developers to bring their DX12 games/backends to Windows 7 as well. As a consumer it's great to see them supporting their product ten years after it launched, but with the entire OS being put out to pasture in nine months, it seems like an odd time to be dedicating resources to bringing it new features." From the report: For some background, Microsoft's latest DirectX API was created to remove some of the CPU bottlenecks for gaming by allowing developers to use low-level programming conventions to shift some of the pressure points away from the CPU. This was a response to single-threaded CPU performance plateauing, making complex graphical workloads increasingly CPU-bound. There are many advantages to using this API over traditional DX11, especially for threading and draw calls. But Microsoft made the decision long ago to only support DirectX 12 on Windows 10, with its WDDM 2.0 driver stack.
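The draw-call argument above can be made concrete with a toy model: if CPU-side submission cost dominates, the driver overhead per draw call caps the frame rate, and spreading command-list recording across threads raises that cap. The per-call costs and counts below are illustrative assumptions, not measured DX11/DX12 numbers.

```python
# Toy model of why a low-level API like DirectX 12 helps CPU-bound games:
# the CPU cost of submitting draw calls sets a frame-rate ceiling.

def max_fps(draw_calls, cost_us_per_call, threads=1):
    """CPU-side frame-rate ceiling if submission cost dominates the frame."""
    frame_ms = draw_calls * cost_us_per_call / 1000 / threads
    return 1000 / frame_ms

# DX11-style: one submission thread, ~2 microseconds of assumed driver
# overhead per draw call -> 20 ms of CPU work per frame, a 50 fps cap.
single = max_fps(10_000, 2.0, threads=1)

# DX12-style: command lists recorded on 4 threads with lower assumed
# per-call overhead (0.5 us) -> 1.25 ms per frame, an 800 fps cap.
multi = max_fps(10_000, 0.5, threads=4)
```

The GPU may of course be the bottleneck long before either ceiling is reached; the model only illustrates why threading and cheaper draw calls relieve CPU-bound scenes.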

Today's announcement is a pretty big surprise on a number of levels. If Microsoft had wanted to back-port DX12 to Windows 7, you would have thought they'd have done it before Windows 7 entered its long-term servicing state. As it is, even free security patches for Windows 7 are set to end on January 14, 2020, which is well under a year away, and the company is actively trying to migrate users to Windows 10 to avoid having a huge swath of machines sitting in an unpatched state. In fact, they are about to add a pop-up notification to Windows 7 to let users know that they are running out of support very soon. So adding a big feature like DX12 now not only risks undermining their own efforts to migrate people away from Windows 7, but also means adding a new feature well after Windows 7 entered long-term support. It's just bizarre.

First Person Shooters (Games)

Study Shows Gamers At High FPS Have Better Kill-To-Death Ratios In Battle Royale Games (hothardware.com) 149

MojoKid writes: Gaming enthusiasts and pro-gamers have believed for a long time that playing on high refresh rate displays with high frame rates offers a competitive edge in fast-action games like PUBG, Fortnite and Apex Legends. The premise is that the faster the display can update the action for you, every millisecond saved counts when it comes to tracking targets and reaction times. This sounds logical, but there's never been specific data tabulated to back this theory up and prove it. NVIDIA, however, just took it upon themselves, with the use of their GeForce Experience tool, to compile anonymous data on gamers by hours played per week, panel refresh rate and graphics card type. Though obviously this data speaks only to NVIDIA GPU users, the numbers do speak for themselves.

The more powerful the GPU with a higher frame rate, along with higher panel refresh rate, generally speaking, the higher the kill-to-death ratio (K/D) for the gamers that were profiled. In fact, it really didn't matter how many hours per week were played. Casual gamers and heavy-duty daily players alike could see anywhere from about a 50 to 150 percent increase in K/D ratio for significantly better overall player performance. It should be underscored that it isn't really the specific GPU that matters; gamers with AMD graphics cards that can push high frame rates at 1080p or similar can see similar K/D gains. The new performance sweet spot seems to be 144Hz/144FPS; the closer your system can get to that, the better off you'll be, and beyond it, the higher the frame rate and refresh rate, the better.
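The latency argument behind the 144Hz figure is simple arithmetic: each refresh delivers a newer view of the game state, so the interval between screen updates shrinks as the refresh rate rises. A quick sketch of the numbers:

```python
# Interval between display updates at a given refresh rate. A 60Hz
# panel redraws every ~16.7 ms; a 144Hz panel every ~6.9 ms, so the
# on-screen picture is up to ~9.7 ms fresher on every single frame.

def refresh_interval_ms(hz):
    """Milliseconds between successive display refreshes."""
    return 1000 / hz

lag_60 = refresh_interval_ms(60)     # ~16.7 ms between updates
lag_144 = refresh_interval_ms(144)   # ~6.9 ms between updates
lag_240 = refresh_interval_ms(240)   # ~4.2 ms between updates

saved = lag_60 - lag_144             # ~9.7 ms shaved off every refresh
```

This only counts display latency, not input, engine, or network delays, but it shows why the per-frame milliseconds the submitter mentions add up.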

Open Source

Linux 5.0 Released (phoronix.com) 107

An anonymous reader writes: Linus Torvalds has released Linux 5.0, kicking off the kernel's 28th year of development. Linux 5.0 features include AMD FreeSync support, open-source NVIDIA Turing GPU support, Intel Icelake graphics, Intel VT-d scalable mode, Spectre Variant Two mitigations for NXP PowerPC processors, and countless other additions. eWeek adds: Among the new features that have landed in Linux 5.0 is support for the Adiantum encryption system, developed by Google for low-power devices. Google's Android mobile operating system and ChromeOS desktop operating system both rely on the Linux kernel. "Storage encryption protects your data if your phone falls into someone else's hands," Paul Crowley and Eric Biggers of the Android Security and Privacy Team at Google wrote in a blog post. "Adiantum is an innovation in cryptography designed to make storage encryption more efficient for devices without cryptographic acceleration, to ensure that all devices can be encrypted." Memory management in Linux also gets a boost in the 5.0 kernel, with a series of improvements designed to help prevent memory fragmentation, which can reduce performance.
Games

The New 'Red Dead Redemption' Reveals the Biggest Problem With Marquee Games Today: They're Boring as Hell. (theoutline.com) 211

An anonymous reader shares a column: Everything about "Red Dead Redemption 2" is big. The latest open-world western, released in October by Rockstar Games, constantly reminds you of this. It takes roughly 15 minutes for its bland everycowboy star, Arthur Morgan, to gallop across the 29-square-mile map. It has 200 species of animals, including grizzly bears, alligators, and a surprising number of birds. It takes about 45.5 hours to play through the main quest, and 150-plus hours to reach 100 percent completion. There are more than 50 weapons to choose from, such as a double-barreled shotgun and a rusty hatchet. It's big, big, big.

[...] On top of all the bigness, "Red Dead Redemption 2" is also incredibly dull. I've been playing it off and on since it was released, and I'm still waiting for it to get fun. I'm not alone in thinking so -- Mark Brown of Game Maker's Toolkit called it "quite boring" and Mashable said it's a "monumental disappointment." There is a glut of Reddit posts from people complaining about how slow the game feels, usually with a tone of extreme self-consciousness. Unless you're a real a**hole, it's not exactly fun to stray from popular consensus. Perhaps the general hesitancy to criticize the game is due to the fact that it's not technically bad. Its graphics and scale really are impressive. It is designed to please.

And yet "RDR2" seems to exemplify a certain kind of hollowness that's now standard among Triple-A titles. It's very big, with only tedium inside. Call it a Real World Game. The main problem with "RDR2" is that it's composed almost entirely of tedious, mandatory chores. It always feels like it's stalling for time, trying to juke the number of hours it takes to complete it.

The Military

Microsoft CEO Defends Pentagon Contract Following Employee Outcry (theverge.com) 221

Microsoft CEO Satya Nadella is defending the company's $479 million contract with the Pentagon to supply augmented reality headsets to the U.S. military. "We made a principled decision that we're not going to withhold technology from institutions that we have elected in democracies to protect the freedoms we enjoy," he told CNN Business at Mobile World Congress. "We were very transparent about that decision and we'll continue to have that dialogue [with employees]," he added during the exclusive interview. From the report: Microsoft was awarded the contract to supply "Integrated Visual Augmentation System" prototypes to the U.S. military in November. The company could eventually deliver over 100,000 headsets under the contract. Microsoft's HoloLens augmented reality technology allows users to see the world around them, but with virtual graphics overlaid. The Israeli military, which has taken delivery of some HoloLens headsets, says the technology can be used to help commanders visualize the battlefield and field medics consult doctors. According to procurement documents, the U.S. military is seeking a single platform that provides its forces with "increased lethality, mobility and situational awareness" in combat. Microsoft employees have recently circulated a letter addressed to Nadella and Brad Smith, the company's president and chief legal officer, arguing that the company should not supply its HoloLens technology to the U.S. military. "It's not about taking arbitrary action by a single company, it's not about 50 people or 100 people or even 100,000 people in a company," he said. "It's really about being a responsible corporate citizen in a democracy."
Graphics

NVIDIA Turing-Based GeForce GTX 1660 Ti Launched At $279 (hothardware.com) 94

MojoKid writes: NVIDIA has launched yet another graphics card today based on the company's new Turing GPU. This latest GPU, however, doesn't support NVIDIA's RTX ray-tracing technology or its DLSS (Deep Learning Super Sampling) image quality tech. The new GeForce GTX 1660 Ti does, however, bring with it all of the other GPU architecture improvements NVIDIA Turing offers. The new TU116 GPU on board the GeForce GTX 1660 Ti supports concurrent integer and floating point instructions (rather than serializing integer and FP instructions), and it also has a redesigned cache structure with double the amount of L2 cache versus its predecessor, while its L1 cache has been outfitted with a wider memory bus that ultimately doubles the bandwidth. NVIDIA's TU116 has 1,536 active CUDA cores, which is a decent uptick from the GTX 1060, but less than the current gen RTX 2060. Cards will also come equipped with 6GB of GDDR6 memory at 12 Gbps for 288GB/s of bandwidth. Performance-wise, the new GeForce GTX 1660 Ti is typically slightly faster than a previous gen GeForce GTX 1070, and much faster than a GTX 1060. Cards should be available at retail in the next few days, starting at $279.
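The 288GB/s figure follows directly from the quoted 12 Gbps memory speed once you know the bus width. The summary doesn't state it, but assuming a 192-bit bus (6GB of GDDR6 as six 32-bit channels, which matches the quoted bandwidth), the arithmetic works out like this:

```python
# Sanity-checking the quoted 288 GB/s memory bandwidth for the GTX 1660 Ti.
# The 192-bit bus width is an assumption, not stated in the summary above.

data_rate_gbps = 12      # per-pin GDDR6 data rate quoted in the summary
bus_width_bits = 192     # assumed: six 32-bit GDDR6 channels for 6 GB

# Bandwidth = per-pin rate x number of pins, converted from bits to bytes.
bandwidth_gbs = data_rate_gbps * bus_width_bits / 8  # 12 * 192 / 8 = 288
```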
Intel

Intel Starts Publishing Open-Source Linux Driver Code For Discrete GPUs (phoronix.com) 43

fstack writes: Intel is still a year out from releasing their first discrete graphics processors, but the company has begun publishing their open-source Linux GPU driver code. This week they began by publishing patches on top of their existing Intel Linux driver for supporting device local memory for dedicated video memory as part of their restructuring effort to support discrete graphics cards. Intel later confirmed this is the start of their open-source driver support for discrete graphics solutions. They have also begun working on Linux driver support for Adaptive-Sync and better reset recovery.
Hardware

Nvidia CEO Foresees a Great Year for PC Gaming Laptops (venturebeat.com) 36

Nvidia has predicted that the year ahead would be a good one for the company, with demand for laptop gaming gear remaining strong. From a report: Looking forward, Huang said it would be a big year for gaming laptops, as Nvidia knows that more than 40 Turing-based gaming laptops (based on the GeForce RTX 2060) are poised to launch during the year. Those laptops use mid-range RTX cards based on graphics processing units (GPUs) using Nvidia's new Turing architecture -- the GeForce RTX graphics cards that can do real-time ray tracing -- that are battery efficient.

Huang acknowledged that visibility is limited. I asked him if cloud gaming would be a disruptive force during the year. But he noted that Nvidia had been providing its own cloud gaming solution, GeForce Now, with relatively little impact on the market for three years. So he said it remains to be seen if cloud gaming and the "Netflix of games" would make an impact on the market. In the meantime, he said that gaming laptops would launch.

Graphics

AMD Radeon VII Graphics Card Launched, Benchmarks Versus NVIDIA GeForce RTX (hothardware.com) 73

MojoKid writes: AMD officially launched its new Radeon VII flagship graphics card today, based on the company's 7nm second-generation Vega architecture. In addition to core GPU optimizations, Radeon VII provides 2X the graphics memory at 16GB and 2.1X the memory bandwidth at a full 1TB/s, compared to AMD's previous generation Radeon RX Vega 64. The move to 7nm allowed AMD to shrink the Vega 20 GPU die down to 331 square millimeters. This shrink and the resulting silicon die area savings are what allowed them to add an additional two stacks of HBM2 memory and increase the high-bandwidth cache (frame buffer) capacity to 16GB. The GPU on board the Radeon VII has 60 CUs and a total of 3,840 active stream processors with a board power TDP of 300 Watts. As you might expect, it's a beast in the benchmarks that's able to pull ahead of NVIDIA's GeForce RTX 2080 in spots but ultimately lands somewhere in between the performance of an RTX 2070 and 2080 overall. AMD Radeon VII cards will be available in a matter of days at an MSRP of $699 with custom boards from third-party partners showing up shortly as well.
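The 1TB/s figure falls out of the HBM2 configuration described above: two extra stacks on top of Vega 64's original two gives four stacks, each with HBM2's 1024-bit interface. The ~2 Gbps effective per-pin data rate below is an assumption chosen to match the quoted bandwidth, not a number from the summary:

```python
# How the Radeon VII's quoted 1 TB/s of memory bandwidth falls out of
# four stacks of HBM2. The per-pin data rate is an assumption.

stacks = 4                          # Vega 64's two stacks plus two more
bus_width_bits = stacks * 1024      # each HBM2 stack has a 1024-bit interface
data_rate_gbps = 2.0                # assumed effective per-pin data rate

bandwidth_gbs = bus_width_bits * data_rate_gbps / 8  # 4096 * 2 / 8 = 1024
# 1024 GB/s, i.e. the full 1 TB/s quoted above
```

This also shows why the extra stacks were needed: at the same per-pin rate, Vega 64's 2048-bit bus tops out at half that bandwidth.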
AI

The World's Fastest Supercomputer Breaks an AI Record (wired.com) 66

Along America's west coast, the world's most valuable companies are racing to make artificial intelligence smarter. Google and Facebook have boasted of experiments using billions of photos and thousands of high-powered processors. But late last year, a project in eastern Tennessee quietly exceeded the scale of any corporate AI lab. It was run by the US government. From a report: The record-setting project involved the world's most powerful supercomputer, Summit, at Oak Ridge National Lab. The machine captured that crown in June last year, reclaiming the title for the US after five years of China topping the list. As part of a climate research project, the giant computer booted up a machine-learning experiment that ran faster than any before. Summit, which occupies an area equivalent to two tennis courts, used more than 27,000 powerful graphics processors in the project. It tapped their power to train deep-learning algorithms, the technology driving AI's frontier, chewing through the exercise at a rate of a billion billion operations per second, a pace known in supercomputing circles as an exaflop.
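The exaflop figure can be rough-checked against the GPU count mentioned above. Summit has 27,648 GPUs (the summary rounds to "more than 27,000"); treating the exaflop as spread evenly across them is a simplification, but the per-GPU number it implies is plausible for mixed-precision tensor-core arithmetic:

```python
# Rough per-GPU arithmetic behind the exaflop figure. Assumes the work
# is spread evenly across Summit's 27,648 GPUs, which is a simplification.

EXAFLOP = 1e18            # a billion billion operations per second
gpus = 27_648             # Summit's full GPU complement

per_gpu_tflops = EXAFLOP / gpus / 1e12
# ~36 TFLOPS per GPU, in the right range for mixed-precision deep
# learning throughput on a single high-end data-center GPU of that era
```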

"Deep learning has never been scaled to such levels of performance before," says Prabhat, who leads a research group at the National Energy Research Scientific Computing Center at Lawrence Berkeley National Lab. His group collaborated with researchers at Summit's home base, Oak Ridge National Lab. Fittingly, the world's most powerful computer's AI workout was focused on one of the world's largest problems: climate change. Tech companies train algorithms to recognize faces or road signs; the government scientists trained theirs to detect weather patterns like cyclones in the copious output from climate simulations that spool out a century's worth of three-hour forecasts for Earth's atmosphere.

Ubuntu

System76 Unveils 'Darter Pro' Linux Laptop With Choice of Ubuntu or Pop!_OS (betanews.com) 86

An anonymous reader writes: Today, System76 unveiled its latest laptop -- the 15.6-inch (full-HD) "Darter Pro." It is thin, but not overly so -- it still has USB-A ports (thankfully). The computer is quite modern, however, as it also has a USB-C/Thunderbolt 3 port. It supports the Pop!_OS 18.04 LTS (64-bit), Pop!_OS 18.10 (64-bit), or Ubuntu 18.04 LTS (64-bit) operating systems. It comes in two variants, with the following processor options: 8th Gen Intel Core i5-8265U: 1.6 up to 3.90 GHz -- 6MB Cache -- 4 Cores -- 8 Threads, or 8th Gen Intel Core i7-8565U: 1.8 up to 4.60 GHz -- 8MB Cache -- 4 Cores -- 8 Threads, either coupled with an Intel UHD Graphics 620 GPU, up to 32GB Dual Channel DDR4 @ 2400 MHz, and M.2 SATA or PCIe NVMe SSD storage. As for ports, there is one USB 3.1 Type-C with Thunderbolt 3, two USB 3.0 Type-A, one USB 2.0, and an SD card reader. The company says it will announce pricing at a later stage.
AI

DeepMind AI AlphaStar Wins 10-1 Against 'StarCraft II' Pros (newscientist.com) 103

In a series of matches streamed on YouTube and Twitch, DeepMind AI AlphaStar defeated two top-ranked professionals 10-1 at real-time strategy game StarCraft II. "This is of course an exciting moment for us," said David Silver at DeepMind in a live stream watched by more than 55,000 people. "For the first time we saw an AI that was able to defeat a professional player." New Scientist reports: DeepMind created five versions of their AI, called AlphaStar, and trained them on footage of human games. The different AIs then played against each other in a league, with the leading AI accumulating the equivalent of 200 years of game experience. With this, AlphaStar beat professional players Dario Wunsch and Grzegorz Komincz -- ranked 44th and 13th in the world respectively. AlphaStar's success came with some caveats: the AI played only on a single map, and using a single kind of player (there are three in the game). The professionals also had to contend with playing different versions of AlphaStar from match to match. While AlphaStar was playing on a single graphics processing unit, a computer chip found in many gaming computers, it was trained on 16 tensor processing units hosted in the Google cloud -- processing power beyond the realms of many.
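The league setup described above -- several agent versions playing one another, with the strongest emerging as the leader -- can be sketched with a round-robin schedule and an Elo-style rating update. The agent names, random match outcomes, and Elo update here are purely illustrative; they are not DeepMind's actual training procedure:

```python
# Minimal sketch of a self-play league: five agent versions play
# round-robin matches and a standard Elo update tracks the leader.
# Match outcomes are random stand-ins for real games.

from itertools import combinations
import random

def update_elo(ra, rb, a_won, k=32):
    """Standard Elo update after one match between ratings ra and rb."""
    expected_a = 1 / (1 + 10 ** ((rb - ra) / 400))
    score_a = 1.0 if a_won else 0.0
    delta = k * (score_a - expected_a)
    return ra + delta, rb - delta

random.seed(0)
ratings = {f"AlphaStar-{i}": 1200.0 for i in range(5)}  # five versions
for a, b in combinations(ratings, 2):                   # round-robin league
    a_won = random.random() < 0.5                       # stand-in for a game
    ratings[a], ratings[b] = update_elo(ratings[a], ratings[b], a_won)

leader = max(ratings, key=ratings.get)                  # strongest agent
```

In the real system each "match" is a full StarCraft II game and the agents are updated by reinforcement learning rather than a rating table, but the league structure is the same idea.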
Wine

Wine 4.0 Released With Vulkan Support, Initial Direct3D 12 and Better HiDPI (phoronix.com) 73

Michael Larabel writes via Phoronix: Wine 4.0 is now officially available as the new annual stable release to Wine for running Windows programs and games on Linux and other operating systems. Following seven weekly release candidates, Wine 4.0 was ready to ship today as judged by Wine founder Alexandre Julliard. Wine 4.0 is a big release bringing initial Vulkan graphics API support, Direct3D CSMT is enabled by default, early Direct3D 12 support via VKD3D, continued HiDPI work, various OpenGL improvements, multi-sample D3D texture support, 64-bit improvements, continued Android support, and much more. The release announcement and notes can be read via WineHQ.org. The source can be downloaded here.
Businesses

Battlefield 5's Poor Sales Numbers Have Become a Disaster For Electronic Arts (seekingalpha.com) 715

dryriver writes: Electronic Arts has mismanaged the Battlefield franchise in the past -- BF3 and BF4 were not great from a gameplay perspective -- but with Battlefield 5, Electronic Arts is facing a real disaster that has sent its stock plummeting on the stock exchanges. First came the fierce cultural internet backlash from gamers to the Battlefield 5 reveal trailer -- EA tried to inject so much 21st Century gender diversity and Hollywood action-movie style fighting into what was supposed to be a reasonably historically accurate WWII shooter trailer, that many gamers felt the game would be "a seriously inauthentic portrayal of what WW2 warfare really was like." Then the game sold very poorly after a delayed launch date -- far less than the mildly successful WW1 shooter Battlefield 1 for example -- and is currently discounted by 33% to 50% at all major game retailers to try desperately to push sales numbers up. This was also a disaster for Nvidia, as Battlefield 5 was the tentpole title supposed to entice gamers into buying expensive new realtime ray-tracing Nvidia 2080 RTX GPUs.

Electronic Arts had to revise its earnings estimates for 2019, some hedge funds sold off their EA stock, fearing low sales and stiff competition from popular Battle Royale games like Fortnite and PUBG, and EA stock is currently 45% down from its peak value in July 2018. EA had already become seriously unpopular with gamers because of annoying Battlefield franchise in-game mechanisms such as having to buy decent-aiming-accuracy weapons with additional cash, having to constantly pay for additional DLC content and game maps, and the very poor multiplayer gameplay of its two Star Wars: Battlefront titles (essentially Battlefield with laser blasters set in the Star Wars Universe). It seems that with Battlefield 5, EA -- not a company known for listening to its customers -- finally hit a brick wall, in the form of many Battlefield fans simply not buying or playing Battlefield 5.

AMD

Nvidia CEO Trashes AMD's New GPU: 'The Performance Is Lousy' (gizmodo.com) 115

An anonymous reader shares a report: Yesterday, AMD announced a new graphics card, the $700 Radeon VII, based on its second-generation Vega architecture. The GPU is the first one available to consumers based on the 7nm process. Smaller processes tend to be faster and more energy efficient, which means it could theoretically be faster than GPUs with larger processes, like the first generation Vega GPU (14nm) or Nvidia's RTX 20-series (12nm). I say "could," because so far Nvidia's RTX 20-series has been speedy in our benchmarks. From the $1,000+ 2080 Ti down to the $350 2060 announced Sunday, all of these cards support ray tracing. This complex technology allows you to trace a point of light from a source to a surface in a digital environment. What it means in practice is video games with hyperrealistic reflections and shadows.
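The core operation behind the ray tracing described above -- following a ray through the scene and testing whether it hits a surface -- reduces to intersection tests like the classic ray-sphere case. A pure-Python sketch, nothing like a production RTX pipeline, but it shows the geometric kernel that dedicated ray-tracing hardware accelerates:

```python
# Minimal ray-sphere intersection: the basic hit test at the heart of
# ray tracing. Solves the quadratic |o + t*d - c|^2 = r^2 for t.

import math

def ray_hits_sphere(origin, direction, center, radius):
    """Return the distance to the first hit, or None if the ray misses."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx*dx + dy*dy + dz*dz
    b = 2 * (ox*dx + oy*dy + oz*dz)
    c = ox*ox + oy*oy + oz*oz - radius*radius
    disc = b*b - 4*a*c
    if disc < 0:
        return None                       # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2*a)    # nearest of the two intersections
    return t if t >= 0 else None          # ignore hits behind the origin

# A ray from the origin straight down the z-axis hits a unit sphere
# centered at z=5 at distance 4 (one radius short of the center).
hit = ray_hits_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0)
```

A renderer runs millions of such tests per frame (against triangles rather than spheres, organized in acceleration structures), which is why hardware support matters for real-time use.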

It's impressive technology, and Nvidia has touted it as the primary reason to upgrade from previous generation GPUs. AMD's GPUs, notably, do not support it. And at a round table Gizmodo attended with Nvidia CEO Jensen Huang, he jokingly dismissed AMD's Tuesday announcement, claiming the announcement itself was "underwhelming" and that his company's 2080 would "crush" the Radeon VII in benchmarks. "The performance is lousy," he said of the rival product. When asked to comment about these slights, AMD CEO Lisa Su told a collection of reporters, "I would probably suggest he hasn't seen it." When pressed about his comments, especially his touting of ray tracing, she said, "I'm not gonna get into it tit for tat; that's just not my style."

AMD

AMD Announces Radeon VII, Its Next-Generation $699 Graphics Card (theverge.com) 145

An anonymous reader shares a report: AMD has been lagging behind Nvidia for years in the high-end gaming graphics card race, to the point that it's primarily been pushing bang-for-the-buck cards like the RX 580 instead. But at CES, the company says it has a GPU that's competitive with Nvidia's RTX 2080. It's called the Radeon VII ("Seven"), and it uses the company's first 7nm graphics chip that we'd seen teased previously. It'll ship on February 7th for $699, according to the company. That's the same price as a standard Nvidia RTX 2080. [...] AMD says the second-gen Vega architecture offers 25 percent more performance at the same power as previous Vega graphics, and the company showed it running Devil May Cry 5 here at 4K resolution, ultra settings, and frame rates "way above 60 fps." AMD says it has a terabyte-per-second of memory bandwidth.
Graphics

Dell Alienware Area-51m Packs Desktop Hardware Into Powerful, Upgradeable Laptop (hothardware.com) 89

MojoKid writes: Dell just unveiled its latest desktop-replacement class notebook, the new Alienware Area-51m. Unlike most other notebooks, however, the Area-51m is actually packing an array of desktop-class hardware. Intel's Core i9-9900K is an available CPU option, for example, and NVIDIA's GeForce RTX 2080 will be offered in the machine as well. The Area-51m also supports up to 64GB of RAM via quad SO-DIMM slots, multiple NVMe M.2 solid state drives and a SATA drive can be installed, and numerous 17.3" display options will be available as well, including a 144Hz IPS G-SYNC model. The Alienware Area-51m is also upgradeable, thanks to the use of socketed desktop processors and a custom GPU module. The machine will be available starting January 29th in two color options, Lunar Light and Dark Side of the Moon.
Intel

Intel Demonstrates 10nm Ice Lake Processor, Promises PCs Will Ship With it Later this Year (theverge.com) 80

Intel announced a major rethink of its chip design back in December, just before it finally delivers 10nm chips for PCs and laptops. At CES 2019 this week, Intel is demonstrating its first Ice Lake 10nm processor that's based on its new Sunny Cove microarchitecture. From a report: Intel is building Thunderbolt 3, Wi-Fi 6, and DL Boost (deep learning boost) into these Ice Lake chips for laptops and PCs to take advantage of. Intel is now promising that PC makers will have devices with Ice Lake processors on shelves by the end of 2019. At its CES keynote today, Intel demonstrated ODM systems from Pegatron and Wistron, and Dell even joined Intel on stage to show off an Ice Lake-powered XPS laptop that will be available later this year. Dell didn't show the device powered on, but it appeared to be a 2-in-1 device that looked similar to the XPS 13. Intel is looking to the future, too. The chip giant is planning to use Foveros 3D chip stacking technology to build future chips, a method that allows Intel's chip designers to stack extra processing power on top of an already-assembled chip die. These "chiplets" can be stacked atop one another to form a processor that includes graphics, AI processing, and more.
Graphics

NVIDIA Launches $349 GeForce RTX 2060, Will Support Other Adaptive Sync Monitors (hothardware.com) 145

MojoKid writes from a report via Hot Hardware: NVIDIA launched a new, more reasonably-priced GeForce RTX card today, dubbed the GeForce RTX 2060. The new midrange graphics card will list for $349 and pack all the same features as NVIDIA's higher-end GeForce RTX 2080 and 2070 series cards. The card is also somewhat shorter than other RTX 20-series cards at only 9.5" (including the case bracket), and its GPU has a few functional blocks disabled. Although it's packing a TU106 like the 2070, six Streaming Multiprocessors (SMs) have been disabled, along with 20% of its Tensor and RT cores. All told, the RTX 2060 has 1,920 active CUDA cores, with 240 Tensor cores, and 30 RT cores. Although the GeForce RTX 2060 seems like the next-gen cousin to the 1060, the RTX 2060 is significantly more powerful and more in line with the GeForce GTX 1070 Ti and GTX 1080 in terms of raw performance in the benchmarks. It can also play ray-tracing enabled games like Battlefield V with decent frame rates at 1080p with high image quality and max ray-tracing enabled. NVIDIA has also apparently decided to support open standards-based adaptive refresh rate monitor technology and will soon begin supporting even AMD FreeSync monitors in a future driver update.
