Graphics

Nvidia Stops Promotional Game Resales By Tying Codes To Hardware (arstechnica.com) 120

Nvidia is putting a stop to the resale of bundled promotional game keys by tying them to a specific graphics card purchase, according to Ars Technica. Users will now have to redeem codes via the GeForce Experience (GFE) app, which is directly linked to third-party services like Steam and Uplay. Users must also ensure that the requisite graphics card is "installed before redemption." GFE then performs "a hardware verification step to ensure the coupon code is redeemed on the system with the qualifying GPU." From the report: Previously, retailers sent promotional game codes to customers who purchased a qualifying product. Those codes could then be redeemed on Nvidia's website, which spit out the relevant Steam, Uplay, Origin, or Microsoft Store key. Since the promotional game codes were not tied to a specific account, many users took to either gifting spare keys to friends or selling them on eBay in order to offset the cost of the graphics card purchase. [Ars Technica has updated their report with additional information:] Nvidia has confirmed that while GFE checks to ensure a user has installed a qualifying graphics card like a GTX 1070 or GTX 1080, the game itself is not permanently linked to the hardware. GFE's hardware check is based only on the wider product range, and not on a specific serial number. The company has also confirmed that the redemption process permanently adds the game to the appropriate third-party service. For example, if a user redeems a promotional game key to Steam, that game will be usable on any other device, just like normal Steam games. Users can also opt to uninstall GFE, or install a different graphics card, once the promotional code has been redeemed and still retain full ownership of the game. A full set of instructions for redeeming codes is now available on Nvidia's website.
Displays

LG's UltraFine 5K Display Becomes Useless When It's Within Two Meters of a Router (9to5mac.com) 173

The LG UltraFine 5K Display was designed in part by Apple to work with the new MacBook Pro and as a replacement for the Thunderbolt Display, which was discontinued late last year. According to 9to5Mac, the display apparently wasn't designed to work next to routers, as it will flicker, disconnect, or freeze computers when it's within two meters of a router due to electromagnetic interference. The Verge reports: In emails to 9to5Mac, LG acknowledged the problem -- which LG says isn't an issue for any of its other monitors -- noting that routers "may affect the performance of the monitor" and that users should "have the router placed at least two meters away from the monitor" to avoid issues. Once the monitor was moved into a different room away from the router, 9to5Mac says the issues subsided. Despite the fact that it's insane to require a router to be far away from what is likely the main computer in your home, there's been no indication that LG is working on a fix, which makes the problem all the more troublesome.
Wine

Wine 2.0 Released (softpedia.com) 202

An anonymous reader quotes a report from Softpedia: It's finally here! After so many months of development and hard work, during which over 6,600 bugs have been patched, the Wine project is happy to announce today, January 24, 2017, the general availability of Wine 2.0. Wine 2.0 is the biggest and most complete version of the open-source software project that allows Linux and macOS users to run applications and games designed only for Microsoft Windows operating systems. As expected, it's a massive release that includes dozens of improvements and new features, starting with support for Microsoft Office 2013 and 64-bit application support on macOS. Highlights of Wine 2.0 include the implementation of more DirectWrite features, such as drawing of underlines, font fallback support, and improvements to font metrics resolution, font embedding in PDF files, Unicode 9.0.0 support, Retina rendering mode for the macOS graphics driver, and support for gradients in GDI enhanced metafiles. Additional Shader Model 4 and 5 shader instructions have been added to the Direct3D 10 and Direct3D 11 implementations, along with support for more graphics cards, support for Direct3D 11 feature levels, full support for the D3DX (Direct3D Extension) 9 effect framework, as well as support for the GStreamer 1.0 multimedia framework. The Gecko engine was updated to Firefox 47, IDN name resolutions are now supported out-of-the-box, and Wine can correctly handle long URLs. The included Mono engine now offers 64-bit support, as well as support for debug registers. Other than that, the winebrowser, winhlp32, wineconsole, and reg components received improvements. You can read the full list of features and download Wine 2.0 from WineHQ's website.
Firefox

Firefox 51 Arrives With HTTP Warning, WebGL 2 and FLAC Support (venturebeat.com) 130

Reader Krystalo writes: Mozilla today launched Firefox 51 for Windows, Mac, Linux, and Android. The new version includes a new warning for websites which collect passwords but don't use HTTPS, WebGL 2 support for better 3D graphics, and FLAC (Free Lossless Audio Codec) playback. Mozilla doesn't break out the exact numbers for Firefox, though the company does say "half a billion people around the world" use the browser. In other words, it's a major platform that web developers target -- even in a world increasingly dominated by mobile apps.
Portables (Apple)

Apple To Offer 32GB of Desktop RAM, Kaby Lake In Top-End 2017 MacBook Pro, Says Analyst (appleinsider.com) 300

AppleInsider has obtained a note to investors from KGI analyst Ming-Chi Kuo that says Apple's 2017 laptop line will focus on internal component updates, including the platform-wide adoption of Intel's Kaby Lake architecture. What's more, Apple is expected to manufacture a 15-inch MacBook Pro with up to 32GB of RAM in the fourth quarter of 2017. AppleInsider reports: Apple took flak for releasing its latest MacBook Pro with Touch Bar models with a hard memory cap of 16GB, a minimal allotment viewed as a negative for imaging and video professionals. Responding to customer criticism, Apple said the move was made in a bid to maximize battery life. Essentially, the Intel Skylake CPUs used in Apple's MacBook Pro only support up to 16GB of LPDDR3 RAM at 2133MHz. Though Intel does make processors capable of addressing more than 16GB of memory, those particular chipsets rely on less efficient DDR4 RAM and are usually deployed in desktops with access to dedicated mains power. In order to achieve high memory allotments and keep unplugged battery life performance on par with existing MacBook Pro models, Apple will need to move to an emerging memory technology like LPDDR4 or DDR4L. Such hardware is on track for release later this year. As for the 12-inch MacBook, Kuo believes next-generation versions of the thin-and-light will enter mass production in the second quarter with the same basic design aesthetic introduced in 2015. New for 2017 is a 16GB memory option that will make an appearance thanks to Intel's new processor class.
Data Storage

Raspberry Pi Upgrades Compute Module With 10 Times the CPU Performance (arstechnica.com) 71

An anonymous reader quotes a report from Ars Technica: The Raspberry Pi Compute Module is getting a big upgrade, with the same processor used in the recently released Raspberry Pi 3. The Compute Module, which is intended for industrial applications, was first released in April 2014 with the same CPU as the first-generation Raspberry Pi. The upgrade announced today has 1GB of RAM and a Broadcom BCM2837 processor that can run at up to 1.2GHz. "This means it provides twice the RAM and roughly ten times the CPU performance of the original Compute Module," the Raspberry Pi Foundation announcement said. This is the second major version of the Compute Module, but it's being called the "Compute Module 3" to match the last flagship Pi's version number. The new Compute Module has more flexible storage options than the original. "One issue with the [Compute Module 1] was the fixed 4GB of eMMC flash storage," the announcement said. But some users wanted to add their own flash storage. "To solve this, two versions of the [Compute Module 3] are being released: one with 4GB eMMC on-board and a 'Lite' model which requires the user to add their own SD card socket or eMMC flash." The core module is tiny so that it can fit into other hardware, but for development purposes there is a separate I/O board with GPIO, USB and MicroUSB, CSI and DSI ports for camera and display boards, HDMI, and MicroSD. The Compute Module 3 and the lite version cost $30 and $25, respectively.
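For readers who want to experiment with the development kit's I/O board, below is a minimal blink sketch using the stock RPi.GPIO Python library; the pin choice (BCM 18) and the timing are arbitrary illustrations, not anything the Raspberry Pi Foundation specifies for the Compute Module.

```python
import time
import RPi.GPIO as GPIO  # ships with Raspbian; runs on the Compute Module's BCM2837 as well

LED_PIN = 18  # arbitrary example pin in BCM numbering; wire to suit your carrier board

GPIO.setmode(GPIO.BCM)         # address pins by Broadcom SoC channel number
GPIO.setup(LED_PIN, GPIO.OUT)  # configure the pin as an output

try:
    for _ in range(10):        # blink ten times, then clean up
        GPIO.output(LED_PIN, GPIO.HIGH)
        time.sleep(0.5)
        GPIO.output(LED_PIN, GPIO.LOW)
        time.sleep(0.5)
finally:
    GPIO.cleanup()             # release the pin so other processes can use it
```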
Patents

Apple Patent Paves Way For iPhone With Full-Face Display, HUD Windows (appleinsider.com) 75

An anonymous reader quotes a report from Apple Insider: Apple on Tuesday was granted a patent detailing technology that allows for ear speakers, cameras and even a heads-up display to hide behind an edge-to-edge screen, a design rumored to debut in a next-generation iPhone later this year. Awarded by the U.S. Patent and Trademark Office, Apple's U.S. Patent No. 9,543,364 for "Electronic devices having displays with openings" describes a method by which various components can be mounted behind perforations in a device screen that are so small as to be imperceptible to the human eye. This arrangement would allow engineers to design a smartphone or tablet with a true edge-to-edge, or "full face," display. With smartphones becoming increasingly compact, there has been a push to move essential components behind the active -- or light-emitting -- area of incorporated displays. Apple in its patent suggests mounting sensors and other equipment behind a series of openings, or through-holes, in the active portion of an OLED or similar panel. These openings might be left empty or, if desired, filled with glass, polymers, radio-transparent ceramic or other suitable material. Positioning sensor inputs directly in line with said openings facilitates the gathering of light, radio waves and acoustic signals. Microphones, cameras, antennas, light sensors and other equipment would therefore have unimpeded access beyond the display layer. The design also accommodates larger structures like iPhone's home button. According to the document, openings are formed between pixels, suggesting a self-illuminating display technology like OLED is preferred over traditional LCD structures that require backlight and filter layers. Hole groupings can be arranged in various shapes depending on the application, and might be larger or smaller than the underlying component. If implemented into a future iPhone, the window-based HUD could be Apple's first foray into augmented reality. Apple leaves the mechanics unmentioned, but the system could theoretically go beyond AR and into mixed reality applications.
AMD

AMD Announces X300 and X370 AM4 Motherboards For Ryzen, All CPUs Unlocked (hothardware.com) 71

MojoKid writes: AMD has a lot riding on Ryzen, its new-generation CPU architecture that is supposed to return the chip designer to a competitive position versus Intel in the high-end desktop x86 processor market. Late last week, at CES 2017, AMD lined up over a dozen high-performance AM4 motherboards from five hardware partners, including ASRock, ASUS, Biostar, Gigabyte, and MSI. All AM4 motherboards are built around one of two desktop chipsets for Ryzen, the AMD X370 or X300. Motherboards based on the X370 chipset are intended for power users and gamers. These boards bring more robust overclocking controls and support for dual graphics cards, along with more I/O connectivity and dual-channel DDR4 memory support. The X300 is AMD's chipset for mini-ITX motherboards for small form factor (SFF) system platforms. The X300 also supports dual-channel DDR4 memory, PCIe 3.0, M.2 SATA devices, NVMe, and USB 3.1 Gen 1 and Gen 2. Finally, AMD representatives on hand at CES also reported that all Ryzen processors will be multiplier unlocked, which should allow for some rather flexible overclocking options. There will also be several processors in the family, with varying core counts depending on SKU, at launch.
Music

Dell Unveils XPS 27 All-In-One With 10-Speaker Dual 50W Sound System (hothardware.com) 53

MojoKid writes: Over the past couple of years, Dell has been driving a redesign effort of its consumer and commercial product lines and has systematically been updating both design signatures and the technology platforms within them. Dell's premium consumer XPS product line, perhaps more so than any other, has seen the most significant design reinvention with the likes of its XPS 13 and XPS 15 notebook line. At CES 2017, Dell announced the XPS 27 7760 all-in-one PC that has a radically new look that draws at least one design cue from its XPS notebook siblings, specifically with respect to the display bezel, or the lack thereof. Though Dell isn't officially branding the touch-enabled version of XPS 27 with an "InfinityEdge" display, the side and top bezel is cut to a minimum, accentuating a beautiful 4K IPS panel. However, the machine's display might not be the most standout feature of the 2017 Dell XPS 27. Under that display, Dell actually expanded things mechanically to make room not only for a Windows Hello-capable camera but also for a 10-speaker sound system designed in conjunction with Grammy Award-winning music producer and audio engineer JJ Puig, one that takes the system's audio reproduction and output capabilities to a whole new level. The sound system is very accurate, with dual 50-watt amplifiers at less than 1% THD (Total Harmonic Distortion) and a 70Hz to 20KHz frequency response. Though the system is currently built on Intel's Skylake platform, Kaby Lake versions are imminent, and with discrete AMD Radeon R9 M470X graphics, it has decent gaming and multimedia chops as well.
AMD

AMD Unveils Vega GPU Architecture With 512 Terabytes of Memory Address Space (hothardware.com) 125

MojoKid writes: AMD lifted the veil on its next generation GPU architecture, codenamed Vega, this morning. One of the underlying forces behind Vega's design is that conventional GPU architectures have not been scaling well for diverse data types. Gaming and graphics workloads have shown steady progress, but today's GPUs are used for much more than just graphics. In addition, the compute capability of GPUs may have been increasing at a good pace, but memory capacity has not kept up. Vega aims to improve both compute performance and addressable memory capacity through new technologies not available on any previous-gen architecture. First, Vega has the most scalable GPU memory architecture built to date, with 512TB of address space. It also has a new geometry pipeline tuned for more performance and better efficiency with over 2X peak throughput per clock, a new Compute Unit design, and a revamped pixel engine. The pixel engine features a new draw stream binning rasterizer (DSBR), which reportedly improves performance and saves power. All told, Vega should offer significant improvements in terms of performance and efficiency when products based on the architecture begin shipping in a few months.
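As a quick sanity check on that headline figure, 512TB works out to a 49-bit address space, assuming the usual byte-addressable, binary-prefix interpretation of the number:

```python
import math

address_space_bytes = 512 * 2**40  # 512 TiB, assuming the usual binary interpretation
address_bits = math.log2(address_space_bytes)
print(f"{address_space_bytes:,} bytes -> {address_bits:.0f}-bit address space")
# 562,949,953,421,312 bytes -> 49-bit address space
```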
HP

HP Made a Laptop Slightly Thicker To Add 3 Hours of Battery Life (theverge.com) 167

When a technology company like Apple releases a new product, chances are it's going to be thinner than its predecessor -- even if it may be slightly worse off for it. HP is taking a different approach with its new 15.6-inch Spectre x360 laptop, which was recently announced at CES. The machine is slightly thicker than its predecessor, and HP claims it features three hours of additional battery life. The Verge reports: The difference between the new x360 and the old x360, in terms of thickness, is minimal, from 15.9mm to 17.8mm. (For reference, the 2015 MacBook Pro was 18mm thick.) It's an increase of 1.9mm for the Spectre, but HP says it's now including a battery that's 23 percent larger in exchange. At the same time, the laptop is also getting narrower, with its body shrinking from 14.8 inches wide to 14 inches wide. Unfortunately, the claimed three hours of additional battery life aren't meant to make this laptop into some long-lasting wonder -- they're really just meant to normalize its battery life. HP will only be selling the 15.6-inch x360 with a 4K display this year, and that requires a lot more power. By increasing the laptop's battery capacity, HP is able to push the machine's battery life from the 9.5 hours it estimated for the 4K version of its 2016 model to about 12 hours and 45 minutes for this model. So it is adding three hours of battery life, but in doing so, it's merely matching the battery life of last year's 1080p model. The x360 is also being updated to include Intel's Kaby Lake processors. It includes options that max out at an i7 processor, 16GB of RAM, a 1TB SSD, and Nvidia GeForce 940MX graphics. It's supposed to be released February 26th, with pricing starting at $1,278 for an entry-level model.
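It's worth running the claimed numbers: a 23 percent larger battery by itself doesn't account for going from 9.5 hours to roughly 12.75, so the platform move to Kaby Lake presumably contributes the remainder. A minimal sketch of the arithmetic, using only the figures from the story:

```python
old_hours = 9.5       # HP's estimate for the 4K 2016 model
new_hours = 12.75     # claimed runtime for the 4K 2017 model
capacity_gain = 0.23  # "23 percent larger" battery

runtime_gain = new_hours / old_hours - 1
print(f"runtime gain: {runtime_gain:.0%}")          # ~34%
print(f"capacity gain alone: {capacity_gain:.0%}")  # 23%
# The ~11-point gap implies lower average platform power,
# consistent with the move to Kaby Lake.
```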
Android

Qualcomm Details Snapdragon 835 Processor (pcmag.com) 42

Qualcomm has detailed the Snapdragon 835 processor, which will power most of the leading Android smartphones this year. It's designed to grab information from the air at gigabit speeds and turn it into rich virtual and augmented reality experiences, according to several executives at a pre-CES briefing. Qualcomm SVP Keith Kressin said, "The 835 is going to be one of the key devices that propels the VR use case." PC Magazine reports: The hardest thing to understand about the Snapdragon 835, especially if you're thinking in desktop CPU terms, is how much Qualcomm has been prioritizing elements of the system-on-chip other than the CPU. This has been coming for years, and it can be tricky because it relies on firmware and the Android OS to properly distribute work to non-CPU components of the chip. During the briefing, it was striking how little Qualcomm talked about its Kryo 280 CPU, as compared to other components. Qualcomm tries to counter that by pointing out that this is the first 10nm mobile processor, which will improve efficiency, and also by saying the CPU is "tightly integrated" with other components using the new Symphony system manager, which operates automatically yet can be customized by application developers. This distributes work across the CPU, GPU, DSP, and more exotic components, letting the Snapdragon 835 work better than it would with CPU alone. How that will combine with Qualcomm's recent announcement that it will support Windows 10 on mobile PCs, including legacy Win32 apps, is yet to be seen. The Snapdragon 835 consumes 25 percent less power than the 820, according to Qualcomm. That means seven hours of 4K streaming video and two hours of VR gaming on a typical device, the company said. These new uses are really power hungry. Since Qualcomm can only do so much on power efficiency, it's also introducing Quick Charge 4, which supposedly charges a phone to five hours of use in five minutes and is USB-C power delivery compliant. The new Adreno 540 graphics chip improves 3D performance by 25 percent over the previous generation, Qualcomm said. But it also enables features like HDR10, which improves colors; foveated rendering, which most clearly renders what you're looking at rather than elements in the periphery of a scene; and low latency, which allows you to move your head smoothly around VR scenes. With one 32MP or two 16MP cameras running at the same time, the Snapdragon 835 supports various dual-camera functions. The Snapdragon 835 will feature the X16 modem, which Qualcomm announced last year and will be able to boost LTE to gigabit speeds. The keys to gigabit LTE are triple 20MHz carrier aggregation with 256-QAM encoding and 4x4 MIMO antennas, said Qualcomm's senior director of marketing, Peter Carson. That's going to be first introduced with a Netgear hotspot in Australia this January, but Sprint and T-Mobile have said they're trying to assemble this set of technologies.
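Carson's gigabit recipe is mostly arithmetic. Here's a rough back-of-the-envelope estimate -- assuming the textbook ~75 Mbps per 20MHz spatial layer at 64-QAM, and a common Cat 16 layout of 4x4 MIMO on two carriers plus 2x2 on the third; those layout details are our assumptions, not Qualcomm's published configuration:

```python
# Back-of-the-envelope gigabit LTE math, not an official Qualcomm spec sheet.
per_layer_64qam = 75.0  # Mbps per 20MHz spatial layer at 64-QAM (standard LTE figure)
qam256_scale = 8 / 6    # 256-QAM carries 8 bits/symbol vs 6 bits/symbol for 64-QAM
layers = 4 + 4 + 2      # assumed spatial layers across the three aggregated carriers

peak_mbps = layers * per_layer_64qam * qam256_scale
print(f"approximate peak: {peak_mbps:.0f} Mbps")  # ~1000 Mbps, i.e. gigabit-class
```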
Intel

Intel Core i7-7700K Kaby Lake Review By Ars Technica: Is the Desktop CPU Dead? (arstechnica.co.uk) 240

Reader joshtops writes: Ars Technica has reviewed the much-anticipated Intel Core i7-7700K Kaby Lake, the recently launched desktop processor from the giant chipmaker. And it's anything but a good sign for enthusiasts who were hoping to see significant improvements in performance. From the review, "The Intel Core i7-7700K is what happens when a chip company stops trying. The i7-7700K is the first desktop Intel chip in a brave new post-"tick-tock" world -- which means that instead of major improvements to architecture, process, and instructions per clock (IPC), we get slightly higher clock speeds and a way to decode DRM-laden 4K streaming video. [...] If you're still rocking an older Ivy Bridge or Haswell processor and weren't convinced to upgrade to Skylake, there's little reason to upgrade to Kaby Lake. Even Sandy Bridge users may want to consider other upgrades first, such as a new SSD or graphics card. The first Sandy Bridge parts were released six years ago, in January 2011. [...] As it stands, what we have with Kaby Lake desktop is effectively Sandy Bridge polished to within an inch of its life, a once-groundbreaking CPU architecture hacked, and tweaked, and mangled into ever smaller manufacturing processes and power envelopes. Where the next major leap in desktop computing power comes from is still up for debate -- but if Kaby Lake is any indication, it won't be coming from Intel. While Ars Technica has complained about the minimal upgrades, AnandTech looks at the positive side: The Core i7-7700K sits at the top of the stack, and performs like it. A number of enthusiasts complained when they launched the Skylake Core i7-6700K with a 4.0/4.2 GHz rating, as this was below the 4.0/4.4 GHz rating of the older Core i7-4790K. At this level, 200-400 MHz has been roughly the difference of a generational IPC upgrade, so users ended up with similar performing chips and the difference was more in the overclocking. However, given the Core i7-7700K comes out of the box with a 4.2/4.5 GHz arrangement, and support for Speed Shift v2, it handily mops the floor with the Devil's Canyon part, resigning it to history.
AMD

AMD Debuts Radeon FreeSync 2 For Gaming Displays With Stunning Image Quality (venturebeat.com) 67

AMD announced Tuesday it is introducing Radeon FreeSync 2, a new display technology that will enable monitors to show exactly the image that a game or other application intends. The result will be better image quality for gamers, according to AMD. From a report on VentureBeat: With the FreeSync 2 specification, monitor makers will be able to create higher-quality monitors that build on the two-year-old FreeSync technology. Sunnyvale, Calif.-based AMD is on a quest for "pixel perfection," said David Glen, senior fellow at AMD, in a press briefing. With FreeSync 2, you won't have to mess with your monitor's settings to get the perfect setting for your game, Glen said. It will be plug-and-play, deliver brilliant pixels that have twice the color gamut and brightness of other monitors, and have low-latency performance for high-speed games. AMD's FreeSync technology and Nvidia's rival G-Sync allow a graphics card to adjust the monitor's refresh rate on the fly, matching it to the computer's frame rate. This synchronization prevents the screen-tearing effect -- with visibly mismatched graphics on different parts of the screen -- which happens when the refresh rate of the display is out of sync with the computer.
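To make the refresh-rate-matching idea concrete, here's a small, deliberately simplified simulation of our own (not AMD code) comparing how long each frame stays on screen with a fixed 60Hz vsync'd display versus an adaptive-sync display that refreshes the moment a frame is ready:

```python
# Simplified model: a fixed 60Hz display with vsync can only swap on ~16.7ms
# boundaries, so a slow frame is shown for two refresh intervals (visible
# stutter). An adaptive-sync display refreshes when the frame is done, so
# on-screen time tracks render time. Frame times below are hypothetical.
import math

REFRESH_MS = 1000 / 60                 # fixed 60Hz refresh interval
frame_times_ms = [14, 18, 15, 22, 16]  # hypothetical per-frame render times

for t in frame_times_ms:
    fixed_ms = math.ceil(t / REFRESH_MS) * REFRESH_MS  # wait for next vsync boundary
    adaptive_ms = t                                    # adaptive sync: swap immediately
    print(f"render {t:2d} ms -> fixed sync shows {fixed_ms:4.1f} ms, "
          f"adaptive sync shows {adaptive_ms:4.1f} ms")
```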
Microsoft

Specs of Qualcomm's First ARM Processor Capable of Running Windows 10 Leak (mspoweruser.com) 107

Qualcomm's upcoming Snapdragon 835's specs have leaked ahead of its CES reveal. An anonymous reader writes: According to the leaked press release, Qualcomm's Snapdragon 835 sports the Qualcomm Kryo 280 CPU (quad-core), Qualcomm Adreno 540 GPU, and Qualcomm Hexagon DSP to manage the different workloads. All of this combined will result in a 27% increase in performance when compared to the previous generation. Qualcomm is also making significant improvements with the Snapdragon 835 when it comes to power consumption. To be precise, the Snapdragon 835 consumes 40% less power than the older generation, which is supposed to offer the following: "1+ day of talk time, 5+ days of music playback, and 7+ hours of 4K video streaming. Should your phone need more power, Qualcomm Quick Charge 4 provides five hours of battery life for five minutes of charging." Qualcomm stated in the press release that the Snapdragon also comes with substantial improvements to graphics rendering and virtual reality. According to the company, the Snapdragon 835 includes "game-changing" enhancements to improve audio, intuitive interactions, and vibrant visuals. The processor also offers 25 percent faster 3D graphic rendering and produces 60X more display colors than the Snapdragon 820.
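One plausible reading of that "60X display colors" claim is the move from standard 8-bit-per-channel output to the 10-bit color that HDR10 requires; the arithmetic lands at 64x, close to the marketing figure. This interpretation is ours, not Qualcomm's:

```python
colors_8bit = 2 ** (8 * 3)    # 8 bits per RGB channel -> ~16.7 million colors
colors_10bit = 2 ** (10 * 3)  # 10 bits per channel (HDR10) -> ~1.07 billion colors
print(f"{colors_10bit / colors_8bit:.0f}x more colors")  # 64x, i.e. roughly "60X"
```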
Windows

Dell Launches XPS 13 2-in-1 Laptop With Intel Kaby Lake Chip, Starts at $1,000 (venturebeat.com) 114

Ahead of CES 2017, which officially kicks off Tuesday, Dell has announced a convertible version of its popular XPS 13 laptop. The machine is powered by a seventh-generation Kaby Lake Intel Core i chip (i5 and i7 options are available), Intel HD Graphics 615 integrated GPU, 4 to 16GB LPDDR3 RAM, a 128GB-1TB solid-state drive (SSD), a 720p webcam on the bottom of the display with support for Windows Hello, a fingerprint scanner, a 46 watt-hour battery, and a 13.3-inch touchscreen, available in QHD+ or FHD configurations. From a report on VentureBeat: The bezel is very narrow, in keeping with the XPS style. The fanless PC offers an SD card slot and two USB-C ports, and a USB-A to USB-C adapter comes in the box. The laptop is 0.32-0.54 inch thick, which is thinner than Dell's 2016 XPS. But the keyboard hasn't been squished down -- the keys have 1.3mm travel, or just a tad bit (0.1mm) more than you get on the XPS laptop -- which is impressive. The laptop weighs 2.7 pounds. The question is whether people will want the convertible option when the laptop is fine as is. The convertible XPS 13 starts at $1,000, which is $200 more than the XPS 13 laptop.
Businesses

Foxconn and Sharp Team Up To Build $8.8 Billion LCD Plant In China (reuters.com) 66

Foxconn was in the news recently for plans to "automate entire factories" in China, but the electronics manufacturing company has also announced plans with Sharp to build an $8.8 billion (61 billion yuan) factory in China to produce liquid-crystal displays (LCDs). Reuters reports: Sakai Display Products Corp's plant will be a so-called Gen-10.5 facility specializing in large-screen LCDs and will be operational by 2019, the company said at a signing event with local officials in Guangzhou on Friday. It said the plant will have capacity equating to 92 billion yuan a year. The heavy investment is aimed at increasing production to meet expected rising demand for large-screen televisions and monitors in Asia. Sakai Display Products Corp's plans for the Guangzhou plant come as Hon Hai seeks to turn the joint venture into a subsidiary, investing a total of 15.1 billion yuan in the company. The venture will also sell 436,000 shares for 17.1 billion yuan to an investment firm co-owned by Hon Hai Chairman Terry Gou, giving Hon Hai a 53 percent interest in the business and lowering Sharp's stake to 26 percent from 40 percent.
Software

Ask Slashdot: Why Are Some Great Games Panned and Some Inferior Games Praised? (soldnersecretwars.de) 145

dryriver writes: A few years ago I bought a multiplayer war game called Soldner: Secret Wars that I had never heard of before. (The game is entirely community maintained now and free to download and play at www.soldnersecretwars.de.) The professional reviews completely and utterly destroyed Soldner -- buggy, bad gameplay, no single-player mode, disappointing graphics, server problems and so on. For me and many other players who did give it a chance beyond the first 30 minutes, Soldner turned out to be the most fun, addictive, varied, satisfying and multi-featured multiplayer war game ever. It had innovative features that AAA titles like Battlefield and COD did not have at all at the time -- fully destructible terrain, walls and buildings, cool physics on everything from jeeps flying off mountaintops to Apache helicopters crashing into Hercules transport aircraft, to dozens of trees being blown down by explosions and then blocking an incoming tank's way. Soldner took a patch or three to become fully stable, but then was just fun, fun, fun to play. So much freedom, so much cool stuff you can do in-game, so many options and gadgets you can play with. By contrast, the far, far simpler -- but better looking -- Battlefield, COD, Medal Of Honor, CounterStrike war games got all the critical praise, made tens of millions in profit per release, became longstanding franchises and are, to this day, not half the fun to play that Soldner is. How does this happen? How does a title like Soldner, that tried to do more new stuff than the other war games combined, get trashed by every reviewer, and then far less innovative and fun to play war games like BF, COD, CS sell tens of millions of copies per release and get rave reviews all around?
Advertising

Ask Slashdot: Is Computing As Cool and Fun As It Once Was? 449

dryriver writes: I got together with old computer nerd friends the other day. All of us have been at it since the 8-bit/1980s days of Amstrad, Atari, Commodore 64-type home computers. Everybody at the meeting agreed on one thing -- computing is just not as cool and as much fun as it once was. One person lamented that computer games nowadays are tied to internet DRM like Steam, that some crucial DCC software is available to rent only now (e.g. Photoshop) and that many "basic freedoms" of the old-school computer nerd are increasingly disappearing. Another said that Windows 10's spyware aspects made him give up on his beloved PC platform and that he will use Linux and Android devices only from now on, using consoles to game on instead of a PC because of this. A third complained about zero privacy online, internet advertising, viruses, ransomware, hacking, crapware. I lamented that the hardware industry still hasn't given us anything resembling photorealistic realtime 3D graphics, and that the current VR trend arrived a full decade later than it should have. A point of general agreement was that big tech companies in particular don't treat computer users with enough respect anymore. What do Slashdotters think? Is computing still as cool and fun as it once was, or has something "become irreversibly lost" as computing evolved into a multi-billion dollar global business?
Hardware

NVIDIA Quadro P6000 and P5000 Pascal Pro Graphics Powerhouses Put To the Test (hothardware.com) 21

Reader MojoKid writes: NVIDIA's Pascal architecture has been wildly successful in the consumer space. The various GPUs that power the GeForce GTX 10 series are all highly competitive at their respective price points, and the higher-end variants are currently unmatched by any single competing GPU. NVIDIA has since retooled Pascal for the professional workstation market as well, with products that make even the GeForce GTX 1080 and TITAN X look quaint in comparison. NVIDIA's beastly Quadro P6000 and Quadro P5000 are Pascal powered behemoths, packing up to 24GB of GDDR5X memory and GPUs that are more capable than their consumer-targeted counterparts. Though it is built around the same GP102 GPU, the Quadro P6000 is particularly interesting, because it is outfitted with a fully-functional Pascal GPU with all of its SMs enabled, which results in 3,840 active cores, versus 3,584 on the TITAN X. The P5000 has the same GP104 GPU as the GTX 1080, but packs in twice the amount of memory -- 16GB versus the GTX 1080's 8GB. In the benchmarks, with cryptographic workloads and pro-workstation targeted graphics tests, the Quadro P6000 and Quadro P5000 are dominant across the board. The P6000 significantly outpaced the previous-generation Maxwell-based Quadro M6000 throughout testing, and the P5000 managed to outpace the M6000 on a few occasions as well. Of particular note is that the Quadro P6000 and P5000, while offering better performance than NVIDIA's previous-gen, high-end professional graphics cards, do it in much lower power envelopes, and they're quieter too. In a couple of quick gaming benchmarks, the P6000 may give us a hint at what NVIDIA has in store for the rumored GeForce GTX 1080 Ti, with all CUDA cores enabled in its GP102 GPU and performance over 10% faster than a Titan X.
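The core-count comparison translates directly into peak-throughput math via the standard formula: peak FP32 FLOPS = cores x 2 (a fused multiply-add counts as two operations) x clock. The clock value below is an assumed round number for illustration, since actual boost clocks vary by card and workload:

```python
# Peak FP32 throughput = cores * 2 (FMA = two FLOPs per clock) * clock speed.
def peak_tflops(cores: int, clock_ghz: float) -> float:
    return cores * 2 * clock_ghz / 1000

p6000 = peak_tflops(3840, 1.5)   # fully enabled GP102 (assumed 1.5GHz boost)
titanx = peak_tflops(3584, 1.5)  # GP102 with two SMs disabled (same assumed clock)
print(f"P6000 ~{p6000:.1f} TFLOPS, TITAN X ~{titanx:.1f} TFLOPS "
      f"({p6000 / titanx - 1:+.0%} from the extra cores alone)")
```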
