AMD

AMD Unveils Vega GPU Architecture With 512 Terabytes of Memory Address Space (hothardware.com) 125

MojoKid writes: AMD lifted the veil on its next generation GPU architecture, codenamed Vega, this morning. One of the underlying forces behind Vega's design is that conventional GPU architectures have not been scaling well for diverse data types. Gaming and graphics workloads have shown steady progress, but today's GPUs are used for much more than just graphics. In addition, the compute capability of GPUs may have been increasing at a good pace, but memory capacity has not kept up. Vega aims to improve both compute performance and addressable memory capacity through new technologies not available on any previous-gen architecture. First, Vega has the most scalable GPU memory architecture built to date, with 512TB of address space. It also has a new geometry pipeline tuned for more performance and better efficiency with over 2X peak throughput per clock, a new Compute Unit design, and a revamped pixel engine. The pixel engine features a new draw stream binning rasterizer (DSBR), which reportedly improves performance and saves power. All told, Vega should offer significant improvements in terms of performance and efficiency when products based on the architecture begin shipping in a few months.
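As a quick sanity check (not from the announcement), the 512TB figure corresponds to a 49-bit virtual address space, assuming binary terabytes (1 TB = 2^40 bytes):

```python
# Sanity check: 512 TB of addressable memory corresponds to a 49-bit
# virtual address space (assuming binary terabytes, 1 TB = 2**40 bytes).
ADDRESS_SPACE_BYTES = 512 * 2**40  # 512 TB

bits = ADDRESS_SPACE_BYTES.bit_length() - 1  # exact log2 for a power of two
print(f"{ADDRESS_SPACE_BYTES} bytes -> {bits}-bit address space")
```

For comparison, a 32-bit address space tops out at 4GB, which is why large datasets have historically had to be paged in and out of GPU memory manually.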
HP

HP Made a Laptop Slightly Thicker To Add 3 Hours of Battery Life (theverge.com) 167

When a technology company like Apple releases a new product, chances are it's going to be thinner than its predecessor -- even if it may be slightly worse off for it. HP is taking a different approach with its new 15.6-inch Spectre x360 laptop, which was recently announced at CES. The machine is slightly thicker than its predecessor, and HP claims it features three hours of additional battery life. The Verge reports: The difference between the new x360 and the old x360, in terms of thickness, is minimal, from 15.9mm to 17.8mm. (For reference, the 2015 MacBook Pro was 18mm thick.) It's an increase of 1.9mm for the Spectre, but HP says it's now including a battery that's 23 percent larger in exchange. At the same time, the laptop is also getting narrower, with its body shrinking from 14.8 inches wide to 14 inches wide. Unfortunately, the claimed three hours of additional battery life aren't meant to make this laptop into some long-lasting wonder -- they're really just meant to normalize its battery life. HP will only be selling the 15.6-inch x360 with a 4K display this year, and that requires a lot more power. By increasing the laptop's battery capacity, HP is able to push the machine's battery life from the 9.5 hours it estimated for the 4K version of its 2016 model to about 12 hours and 45 minutes for this model. So it is adding three hours of battery life, but in doing so, it's merely matching the battery life of last year's 1080p model. The x360 is also being updated to include Intel's Kaby Lake processors. It includes options that max out at an i7 processor, 16GB of RAM, a 1TB SSD, and Nvidia GeForce 940MX graphics. It's supposed to be released February 26th, with pricing starting at $1,278 for an entry-level model.
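A rough back-of-envelope check (the linear-scaling assumption is ours, not HP's) shows that the larger battery alone doesn't fully account for the claimed gain:

```python
# Rough estimate: does a 23% larger battery alone explain the claimed jump
# from 9.5 hours (2016 4K model) to 12.75 hours? Assumes battery life
# scales linearly with capacity, which ignores efficiency changes.
old_hours = 9.5
capacity_gain = 1.23

capacity_only = old_hours * capacity_gain
print(f"Capacity scaling alone: {capacity_only:.1f} hours")  # ~11.7 hours
# The remaining ~1 hour of the claimed 12.75 would have to come from
# elsewhere, presumably Kaby Lake's improved power efficiency.
```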
Android

Qualcomm Details Snapdragon 835 Processor (pcmag.com) 42

Qualcomm has detailed the Snapdragon 835 processor, which will power most of the leading Android smartphones this year. It's designed to grab information from the air at gigabit speeds and turn it into rich virtual and augmented reality experiences, according to several executives at a pre-CES briefing. Qualcomm SVP Keith Kressin said, "The 835 is going to be one of the key devices that propels the VR use case." PC Magazine reports: The hardest thing to understand about the Snapdragon 835, especially if you're thinking from a desktop CPU space, is how much Qualcomm has been prioritizing elements of the system-on-chip other than the CPU. This has been coming for years, and it can be tricky because it relies on firmware and the Android OS to properly distribute work to non-CPU components of the chip. During the briefing, it was striking how little Qualcomm talked about its Kryo 280 CPU, as compared to other components. Qualcomm tries to counter that by pointing out that this is the first 10nm mobile processor, which will improve efficiency, and also by saying the CPU is "tightly integrated" with other components using the new Symphony system manager, which operates automatically yet can be customized by application developers. This distributes work across the CPU, GPU, DSP, and more exotic components, letting the Snapdragon 835 work better than it would with CPU alone. How that will combine with Qualcomm's recent announcement that it will support Windows 10 on mobile PCs, including legacy Win32 apps, is yet to be seen. The Snapdragon 835 consumes 25 percent less power than the 820, according to Qualcomm. That means seven hours of 4K streaming video and two hours of VR gaming on a typical device, the company said. These new uses are really power hungry. Since Qualcomm can only do so much on power efficiency, it's also introducing Quick Charge 4, which supposedly charges a phone to five hours of use in five minutes and is USB-C power delivery compliant. 
The new Adreno 540 graphics chip improves 3D performance by 25 percent over the previous generation, Qualcomm said. But it also enables features like HDR10, which improves colors; foveated rendering, which most clearly renders what you're looking at rather than elements in the periphery of a scene; and low latency, which allows you to move your head smoothly around VR scenes. With one 32MP or two 16MP cameras running at the same time, the Snapdragon 835 supports various dual-camera functions. The Snapdragon 835 will feature the X16 modem, which Qualcomm announced earlier this year and will be able to boost LTE to gigabit speeds. The keys to gigabit LTE are triple 20MHz carrier aggregation with 256-QAM encoding and 4x4 MIMO antennas, said Qualcomm's senior director of marketing, Peter Carson. That's going to be first introduced with a Netgear hotspot in Australia this January, but Sprint and T-Mobile have said they're trying to assemble this set of technologies.
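The gigabit-LTE recipe Carson describes can be sketched with back-of-envelope arithmetic. The resource-grid constants below are standard LTE figures (100 resource blocks per 20MHz carrier, 12 subcarriers per block, 14 OFDM symbols per millisecond), not numbers from the briefing, and the raw result ignores control-channel and coding overhead:

```python
# Back-of-envelope peak rate for the gigabit LTE recipe described above:
# 3 x 20 MHz carrier aggregation, 256-QAM (8 bits/symbol), 4x4 MIMO.
# Standard LTE grid: a 20 MHz carrier has 100 resource blocks of 12
# subcarriers, each carrying 14 OFDM symbols per millisecond.
symbols_per_sec = 100 * 12 * 14 * 1000   # per carrier, per spatial layer
bits_per_symbol = 8                      # 256-QAM
layers = 4                               # 4x4 MIMO
carriers = 3                             # 3 x 20 MHz aggregation

raw_mbps = symbols_per_sec * bits_per_symbol * layers * carriers / 1e6
print(f"Raw peak rate: {raw_mbps:.1f} Mbps")  # ~1612.8 Mbps before overhead
# Real-world throughput is lower: reference signals, control channels,
# and channel coding consume a large share, landing near the 1 Gbps mark.
```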
Intel

Intel Core i7-7700K Kaby Lake Review By Ars Technica: Is the Desktop CPU Dead? (arstechnica.co.uk) 240

Reader joshtops writes: Ars Technica has reviewed the much-anticipated Intel Core i7-7700K Kaby Lake, the recently launched desktop processor from the giant chipmaker. And it's anything but a good sign for enthusiasts who were hoping to see significant improvements in performance. From the review, "The Intel Core i7-7700K is what happens when a chip company stops trying. The i7-7700K is the first desktop Intel chip in a brave new post-"tick-tock" world -- which means that instead of major improvements to architecture, process, and instructions per clock (IPC), we get slightly higher clock speeds and a way to decode DRM-laden 4K streaming video. [...] If you're still rocking an older Ivy Bridge or Haswell processor and weren't convinced to upgrade to Skylake, there's little reason to upgrade to Kaby Lake. Even Sandy Bridge users may want to consider other upgrades first, such as a new SSD or graphics card. The first Sandy Bridge parts were released six years ago, in January 2011. [...] As it stands, what we have with Kaby Lake desktop is effectively Sandy Bridge polished to within an inch of its life, a once-groundbreaking CPU architecture hacked, and tweaked, and mangled into ever smaller manufacturing processes and power envelopes. Where the next major leap in desktop computing power comes from is still up for debate -- but if Kaby Lake is any indication, it won't be coming from Intel. While Ars Technica has complained about the minimal upgrades, AnandTech looks at the positive side: The Core i7-7700K sits at the top of the stack, and performs like it. A number of enthusiasts complained when Intel launched the Skylake Core i7-6700K with a 4.0/4.2 GHz rating, as this was below the 4.0/4.4 GHz rating of the older Core i7-4790K. At this level, 200-400 MHz has been roughly the difference of a generational IPC upgrade, so users ended up with similarly performing chips and the difference was more in the overclocking.
However, given the Core i7-7700K comes out of the box with a 4.2/4.5 GHz arrangement, and support for Speed Shift v2, it handily mops the floor with the Devil's Canyon part, consigning it to history.
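The generational clock gap the two reviews are debating is small in percentage terms. Using the base/boost ratings quoted above:

```python
# Clock-speed comparison from the reviews: (base GHz, boost GHz) per part.
chips = {
    "i7-4790K (Devil's Canyon)": (4.0, 4.4),
    "i7-6700K (Skylake)":        (4.0, 4.2),
    "i7-7700K (Kaby Lake)":      (4.2, 4.5),
}
base_sky, boost_sky = chips["i7-6700K (Skylake)"]
base_kbl, boost_kbl = chips["i7-7700K (Kaby Lake)"]

print(f"Base uplift:  {100 * (base_kbl / base_sky - 1):.1f}%")    # 5.0%
print(f"Boost uplift: {100 * (boost_kbl / boost_sky - 1):.1f}%")  # ~7.1%
# With essentially unchanged IPC, these single-digit clock gains are the
# whole generational story -- which is exactly Ars Technica's complaint.
```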
AMD

AMD Debuts Radeon FreeSync 2 For Gaming Displays With Stunning Image Quality (venturebeat.com) 67

AMD announced Tuesday it is introducing Radeon FreeSync 2, a new display technology that will enable monitors to display exactly the image a game or other application intends. The result will be better image quality for gamers, according to AMD. From a report on VentureBeat: With the FreeSync 2 specification, monitor makers will be able to create higher-quality monitors that build on the two-year-old FreeSync technology. Sunnyvale, Calif.-based AMD is on a quest for "pixel perfection," said David Glen, senior fellow at AMD, in a press briefing. With FreeSync 2, you won't have to mess with your monitor's settings to get the perfect setting for your game, Glen said. It will be plug-and-play, deliver brilliant pixels with twice the color gamut and brightness of other monitors, and have low-latency performance for high-speed games. AMD's FreeSync technology and Nvidia's rival G-Sync allow a graphics card to adjust the monitor's refresh rate on the fly, matching it to the computer's frame rate. This synchronization prevents the screen-tearing effect -- with visibly mismatched graphics on different parts of the screen -- which happens when the refresh rate of the display is out of sync with the computer.
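The refresh-matching idea behind FreeSync and G-Sync can be illustrated with a toy timing model (a simplification, not AMD's or Nvidia's actual implementation): with a fixed refresh and vsync enabled, a frame that misses a refresh tick must wait for the next one, producing judder; with adaptive sync, the display simply refreshes when the frame is ready.

```python
# Toy model of fixed-refresh vsync vs adaptive-sync presentation. Frame
# render times (ms) each miss a fixed 60 Hz cadence (16.67 ms); with
# vsync on, every late frame waits for the next refresh boundary.
import math

REFRESH_MS = 1000 / 60  # fixed 60 Hz display

def present_fixed(render_times_ms):
    """Display times with vsync: snap each frame up to the next refresh tick."""
    t, out = 0.0, []
    for r in render_times_ms:
        t += r
        out.append(math.ceil(t / REFRESH_MS) * REFRESH_MS)
    return out

def present_adaptive(render_times_ms):
    """Display times with adaptive sync: refresh whenever the frame is ready."""
    t, out = 0.0, []
    for r in render_times_ms:
        t += r
        out.append(t)
    return out

frames = [20.0, 18.0, 22.0]  # each slightly slower than 16.67 ms
print(present_fixed(frames))     # snapped to refresh boundaries -> judder
print(present_adaptive(frames))  # shown as soon as rendered -> smooth pacing
```

Tearing is the other failure mode: with vsync off, the buffer swap happens mid-scan, so the top and bottom of the screen show different frames -- the mismatch the summary describes.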
Microsoft

Specs of Qualcomm's First ARM Processor Capable of Running Windows 10 Leak (mspoweruser.com) 107

Qualcomm's upcoming Snapdragon 835's specs have leaked ahead of its CES reveal. An anonymous reader writes: According to the leaked press release, Qualcomm's Snapdragon 835 sports the Qualcomm Kryo 280 CPU (quad-core), Qualcomm Adreno 540 GPU, and Qualcomm Hexagon DSP to manage the different workloads. All of this combined together will result in a 27% increase in performance when compared to the previous generation. Qualcomm is also making significant improvements with the Snapdragon 835 when it comes to power consumption. To be precise, the Snapdragon 835 consumes 40% less power than the older generation, which is supposed to offer the following: "1+ day of talk time, 5+ days of music playback, and 7+ hours of 4K video streaming. Should your phone need more power, Qualcomm Quick Charge 4 provides five hours of battery life for five minutes of charging." Qualcomm stated in the press release that the Snapdragon also comes with substantial improvements to graphics rendering and virtual reality. According to the company, the Snapdragon 835 includes "game-changing" enhancements to improve audio, intuitive interactions, and vibrant visuals. The processor also offers 25 percent faster 3D graphic rendering and produces 60 times more display colors than the Snapdragon 820.
Windows

Dell Launches XPS 13 2-in-1 Laptop With Intel Kaby Lake Chip, Starts at $1,000 (venturebeat.com) 114

Ahead of CES 2017, which officially kicks off Tuesday, Dell has announced a convertible version of its popular XPS 13 laptop. The machine is powered by a seventh-generation Kaby Lake Intel Core i chip (i5 and i7 options are available), Intel HD Graphics 615 integrated GPU, 4 to 16GB LPDDR3 RAM, a 128GB-1TB solid-state drive (SSD), a 720p webcam on the bottom of the display with support for Windows Hello, a fingerprint scanner, a 46 watt-hour battery, and a 13.3-inch touchscreen, available in QHD+ or FHD configurations. From a report on VentureBeat: The bezel is very narrow, in keeping with the XPS style. The fanless PC offers an SD card slot and two USB-C ports, and a USB-A to USB-C adapter comes in the box. The laptop is 0.32-0.54 inch thick, which is thinner than Dell's 2016 XPS. But the keyboard hasn't been squished down -- the keys have 1.3mm travel, or just a tad bit (0.1mm) more than you get on the XPS laptop -- which is impressive. The laptop weighs 2.7 pounds. The question is whether people will want the convertible option when the laptop is fine as is. The convertible XPS 13 starts at $1,000, which is $200 more than the XPS 13 laptop.
Businesses

Foxconn and Sharp Team Up To Build $8.8 Billion LCD Plant In China (reuters.com) 66

Foxconn was in the news recently for plans to "automate entire factories" in China, but the electronics manufacturing company has also announced plans with Sharp to build an $8.8 billion (61 billion yuan) factory in China to produce liquid-crystal displays (LCDs). Reuters reports: Sakai Display Products Corp's plant will be a so-called Gen-10.5 facility specializing in large-screen LCDs and will be operational by 2019, the company said at a signing event with local officials in Guangzhou on Friday. It said the plant will have capacity equating to 92 billion yuan a year. The heavy investment is aimed at increasing production to meet expected rising demand for large-screen televisions and monitors in Asia. Sakai Display Products Corp's plans for the Guangzhou plant come as Hon Hai seeks to turn the joint venture into a subsidiary, investing a total of 15.1 billion yuan in the company. The venture will also sell 436,000 shares for 17.1 billion yuan to an investment firm co-owned by Hon Hai Chairman Terry Gou, giving Hon Hai a 53 percent interest in the business and lowering Sharp's stake to 26 percent from 40 percent.
Software

Ask Slashdot: Why Are Some Great Games Panned and Some Inferior Games Praised? (soldnersecretwars.de) 145

dryriver writes: A few years ago I bought a multiplayer war game called Soldner: Secret Wars that I had never heard of before. (The game is entirely community maintained now and free to download and play at www.soldnersecretwars.de.) The professional reviews completely and utterly destroyed Soldner -- buggy, bad gameplay, no single-player mode, disappointing graphics, server problems and so on. For me and many other players who did give it a chance beyond the first 30 minutes, Soldner turned out to be the most fun, addictive, varied, satisfying and multi-featured multiplayer war game ever. It had innovative features that AAA titles like Battlefield and COD did not have at all at the time -- fully destructible terrain, walls and buildings, cool physics on everything from jeeps flying off mountaintops to Apache helicopters crashing into Hercules transport aircraft, to dozens of trees being blown down by explosions and then blocking an incoming tank's way. Soldner took a patch or three to become fully stable, but then was just fun, fun, fun to play. So much freedom, so much cool stuff you can do in-game, so many options and gadgets you can play with. By contrast, the far, far simpler -- but better looking -- Battlefield, COD, Medal Of Honor, CounterStrike war games got all the critical praise, made the tens of millions in profit per release, became longstanding franchises and are, to this day, not half the fun to play that Soldner is. How does this happen? How does a title like Soldner, that tried to do more new stuff than the other war games combined, get trashed by every reviewer, and then far less innovative and fun to play war games like BF, COD, CS sell tens of millions of copies per release and get rave reviews all around?
Advertising

Ask Slashdot: Is Computing As Cool and Fun As It Once Was? 449

dryriver writes: I got together with old computer nerd friends the other day. All of us have been at it since the 8-bit/1980s days of Amstrad, Atari, Commodore 64-type home computers. Everybody at the meeting agreed on one thing -- computing is just not as cool and as much fun as it once was. One person lamented that computer games nowadays are tied to internet DRM like Steam, that some crucial DCC software is available to rent only now (e.g. Photoshop) and that many "basic freedoms" of the old-school computer nerd are increasingly disappearing. Another said that Windows 10's spyware aspects made him give up on his beloved PC platform and that he will use Linux and Android devices only from now on, using consoles to game on instead of a PC because of this. A third complained about zero privacy online, internet advertising, viruses, ransomware, hacking, crapware. I lamented that the hardware industry still hasn't given us anything resembling photorealistic realtime 3D graphics, and that the current VR trend arrived a full decade later than it should have. A point of general agreement was that big tech companies in particular don't treat computer users with enough respect anymore. What do Slashdotters think? Is computing still as cool and fun as it once was, or has something "become irreversibly lost" as computing evolved into a multi-billion dollar global business?
Hardware

NVIDIA Quadro P6000 and P5000 Pascal Pro Graphics Powerhouses Put To the Test (hothardware.com) 21

Reader MojoKid writes: NVIDIA's Pascal architecture has been wildly successful in the consumer space. The various GPUs that power the GeForce GTX 10 series are all highly competitive at their respective price points, and the higher-end variants are currently unmatched by any single competing GPU. NVIDIA has since retooled Pascal for the professional workstation market as well, with products that make even the GeForce GTX 1080 and TITAN X look quaint in comparison. NVIDIA's beastly Quadro P6000 and Quadro P5000 are Pascal powered behemoths, packing up to 24GB of GDDR5X memory and GPUs that are more capable than their consumer-targeted counterparts. Though it is built around the same GP102 GPU, the Quadro P6000 is particularly interesting, because it is outfitted with a fully-functional Pascal GPU with all of its SMs enabled, which results in 3,840 active cores, versus 3,584 on the TITAN X. The P5000 has the same GP104 GPU as the GTX 1080, but packs in twice the amount of memory -- 16GB vs 8GB. In the benchmarks, with cryptographic workloads and pro-workstation targeted graphics tests, the Quadro P6000 and Quadro P5000 are dominant across the board. The P6000 significantly outpaced the previous-generation Maxwell-based Quadro M6000 throughout testing, and the P5000 managed to outpace the M6000 on a few occasions as well. Of particular note is that the Quadro P6000 and P5000, while offering better performance than NVIDIA's previous-gen, high-end professional graphics cards, do it in much lower power envelopes, and they're quieter too. In a couple of quick gaming benchmarks, the P6000 may give us a hint at what NVIDIA has in store for the rumored GeForce GTX 1080 Ti, with all CUDA cores enabled in its GP102 GPU and performance over 10% faster than a Titan X.
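The core-count figures above put a number on the fully enabled GP102's advantage:

```python
# Core-count comparison: the fully enabled GP102 in the Quadro P6000
# versus the partially disabled GP102 in the TITAN X.
p6000_cores = 3840
titan_x_cores = 3584

extra = 100 * (p6000_cores / titan_x_cores - 1)
print(f"P6000 has {extra:.1f}% more active CUDA cores")  # ~7.1%
# At equal clocks that is roughly the ceiling on the shader-throughput
# gain; the >10% gaming advantage noted above suggests clock or driver
# differences contribute as well.
```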
The Almighty Buck

Worldwide Gaming Market Hits $91 Billion In 2016, Says Report (venturebeat.com) 76

According to a new SuperData Research report, the worldwide gaming market was worth a whopping $91 billion this year, with mobile gaming leading the way with a total estimated market value of $41 billion. The PC gaming market did very well too, as it pulled in nearly $36 billion over the year. PC Gamer reports: The mobile game segment was the largest at $41 billion (up 18 percent), followed by $26 billion for retail games and $19 billion for free-to-play online games. New categories such as virtual reality, esports, and gaming video content were small in size, but they are growing fast and holding promise for 2017, SuperData said. Mobile gaming was driven by blockbuster hits like Pokemon Go and Clash Royale. The mobile games market has started to mature and now more closely resembles traditional games publishing, requiring ever higher production values and marketing spend. Monster Strike was the No. 1 mobile game, with $1.3 billion in revenue. VR grew to $2.7 billion in 2016. Gaming video reached $4.4 billion, up 34 percent. Consumers increasingly download games directly to their consoles, spending $6.6 billion on digital downloads in 2016. PC gaming continues to do well, earning $34 billion (up 6.7 percent) and driven largely by free-to-play online titles and downloadable games. Incumbents like League of Legends together with newcomers like Overwatch are driving the growth in PC games. PC gamers also saw a big improvement with the release of a new generation of graphics cards, offering a 40 percent increase in graphics power and a 20 percent reduction of power consumption.
Businesses

The Loyalty To AMD's GPU Product Among AMD CPU Buyers Is Decreasing (parsec.tv) 157

An anonymous reader shares a report: Data from the builds on PCPartPicker show an interesting trend among the buyers of AMD CPUs. Of the 25,780 builds on PCPartPicker from the last 31 months with a price point between $450 and $5,000, 19% included an AMD CPU. This is in line with the Steam Hardware Surveys, but things have changed recently. Builds with AMD CPUs tend to be much less expensive than those with Intel CPUs. The builds with an AMD CPU were $967 on average versus the Intel CPU builds, which were on average $1,570. In the last 31 months, brand loyalty to AMD seemed to push AMD CPU builders to choose AMD graphics cards at a much higher rate than Intel CPU builders. 55% of machines with an AMD CPU also had an AMD GPU, whereas only 19% of builds with an Intel CPU included an AMD GPU. In the last six months, AMD has started to lose even more ground to Intel and to Nvidia. On the CPU builds, only 10% of gamers building on PCPartPicker were opting to buy an AMD CPU. Among these, the percentage that decided to pair their AMD CPU with an AMD GPU dropped to 51%. The challenges that AMD is seeing in the overall GPU market are being felt even amongst their loyal supporters.
Nintendo

Nintendo Switch Uses Nvidia Tegra X1 SoC, Clock Speeds Outed (arstechnica.com) 105

The Nintendo Switch -- the hybrid portable games console/tablet due for release in March 2017 -- will be powered by Nvidia's older Tegra X1 SoC and not its upcoming Tegra X2 "Parker" SoC as initially rumored. From a report on ArsTechnica: The use of Tegra X1, which also powers the Nvidia Shield Android TV, means the graphics hardware inside the Switch is based on Nvidia's older second-generation Maxwell architecture, rather than the latest Pascal architecture. While the two architectures share a very similar design, the Switch will miss out on some of the smaller performance improvements made in Pascal. When docked, the Switch's GPU runs at 768MHz, already lower than the 1GHz of the Shield Android TV. When used as a portable, the Switch downclocks the GPU to 307.2MHz -- just 40 percent of the clock speed when docked. Given the Switch is highly likely to use a 720p screen rather than 1080p -- this is currently assumed to be a 6.2-inch IPS LCD with 10-point multi-touch support -- there is some overhead to run games at 1080p when docked. However, it's questionable how many developers will go to the effort of creating games that make use of the extra horsepower when docked, rather than simply opting to program for the slower overall GPU clock speed. While GPU performance is variable, the rest of the Switch's specs remain static. Its four ARM A57 CPU cores are purported to run at 1020MHz regardless of whether the console is docked or undocked, while the memory controller can run at either 1600MHz or 1331MHz in either mode.
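Those clock figures translate into rough throughput numbers. The 256-core count below is Nvidia's published Tegra X1 spec (not from the article), and the usual 2 FLOPs per core per clock assumes fused multiply-add:

```python
# Rough FP32 throughput for the Switch's GPU in each mode, assuming the
# Tegra X1's 256 Maxwell CUDA cores (per Nvidia's public X1 specs) and
# 2 FLOPs per core per clock (fused multiply-add).
CORES = 256
FLOPS_PER_CORE_PER_CLOCK = 2

def gflops(clock_mhz):
    return CORES * FLOPS_PER_CORE_PER_CLOCK * clock_mhz * 1e6 / 1e9

docked = gflops(768)      # ~393 GFLOPS
portable = gflops(307.2)  # ~157 GFLOPS
print(f"Docked: {docked:.0f} GFLOPS, portable: {portable:.0f} GFLOPS")
print(f"Portable/docked ratio: {portable / docked:.0%}")  # 40%
```

Since GPU throughput scales linearly with clock at fixed core count, the 40 percent figure in the article applies to shader performance as well as raw clock speed.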
Open Source

3D Freeciv-Web (Beta) Released (freeciv.org) 68

It's the open source web version of the classic Linux strategy game, and now Slashdot reader Andreas(R) -- one of its developers -- has an announcement: the developers are working on bringing the game to the modern era with 3D WebGL graphics, and a beta of the 3D WebGL version of Freeciv has been released today. The game will work on any device with a browser with HTML5 and WebGL support, and three gigabytes of RAM... It's a volunteer community development project and anyone is welcome to contribute. Have fun and remember to sleep!
The developers of Freeciv-web are now also working on a VR version using Google Cardboard, according to the site, while the original Freeciv itself has still been maintained for over 20 years -- and apparently even has its own dedicated port number.
AMD

AMD Introduces Radeon Instinct Machine Intelligence Accelerators (hothardware.com) 55

Reader MojoKid writes: AMD is announcing a new series of Radeon-branded products today, targeted at machine intelligence and deep learning enterprise applications, called Radeon Instinct. As its name suggests, the new Radeon Instinct line of products comprises GPU-based solutions for deep learning, inference and training. The new GPUs are also complemented by a free, open-source library and framework for GPU accelerators, dubbed MIOpen. MIOpen is architected for high-performance machine intelligence applications and is optimized for the deep learning frameworks in AMD's ROCm software suite. The first products in the lineup consist of the Radeon Instinct MI6, the MI8, and the MI25. The 150W Radeon Instinct MI6 accelerator is powered by a Polaris-based GPU, packs 16GB of memory (224GB/s peak bandwidth), and will offer up to 5.7 TFLOPS of peak FP16 performance. Next up in the stack is the Fiji-based Radeon Instinct MI8. Like the Radeon R9 Nano, the Radeon Instinct MI8 features 4GB of High-Bandwidth Memory (HBM) with peak bandwidth of 512GB/s. The MI8 will offer up to 8.2 TFLOPS of peak FP16 compute performance, with a board power that typically falls below 175W. The Radeon Instinct MI25 accelerator will leverage AMD's next-generation Vega GPU architecture and has a board power of approximately 300W. All of the Radeon Instinct accelerators are passively cooled, but when installed into a server chassis you can bet there will be plenty of air flow. Like the recently released Radeon Pro WX series of professional graphics cards for workstations, Radeon Instinct accelerators will be built by AMD. All of the Radeon Instinct cards will also support AMD MultiGPU (MxGPU) hardware virtualization technology.
ISS

Japan Sends Its New Space Junk-Fighting Technology To The ISS (phys.org) 64

What floats 249 miles in the sky, stretches 2,300 feet, and took over 10 years to develop? An anonymous reader quotes Phys.org: Japan launched a cargo ship Friday bound for the International Space Station, carrying a "space junk" collector that was made with the help of a fishnet company... Researchers are using a so-called electrodynamic tether made from thin wires of stainless steel and aluminum... The electricity generated by the tether as it swings through the Earth's magnetic field is expected to have a slowing effect on the space junk, which should, scientists say, pull it into a lower and lower orbit. Eventually the detritus will enter the Earth's atmosphere, burning up harmlessly long before it has a chance to crash to the planet's surface.
Bloomberg has some interesting background: The experiment is part of an international cleanup effort planning to safeguard astronauts and about $900 billion worth of space stations, satellites and other infrastructure... Satellite collisions and testing of anti-satellite weapons have added thousands of debris fragments in the atmosphere since 2007, according to NASA... With debris traveling at up to 17,500 miles an hour, the impact of even a marble-size projectile can cause catastrophic damage.
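The tether's physics can be sketched with an order-of-magnitude estimate. All the assumed values below (orbital speed, field strength, current) are illustrative, not from the articles; only the ~700m length corresponds to the reported 2,300-foot tether:

```python
# Order-of-magnitude sketch of the electrodynamic-tether drag described
# above. Assumed values (not from the article): orbital speed ~7.7 km/s,
# Earth's magnetic field in LEO ~30 microtesla, and a 10 mA current.
v = 7_700   # orbital velocity, m/s
B = 30e-6   # magnetic flux density in low Earth orbit, tesla
L = 700     # tether length in meters (~the reported 2,300 ft)
I = 0.01    # induced current, amperes (assumed)

emf = v * B * L    # motional EMF along the tether: ~162 V
force = I * L * B  # Lorentz drag force on the wire: ~0.2 mN
print(f"Induced EMF: {emf:.0f} V, drag force: {force * 1e3:.2f} mN")
# Even a sub-millinewton drag, applied continuously for months, lowers
# the debris's orbit until atmospheric drag finishes the job -- the
# "lower and lower orbit" mechanism the researchers describe.
```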
Open Source

Linux Kernel 4.9 Officially Released (kernel.org) 80

"As expected, today, December 11, 2016, Linus Torvalds unleashed the final release of the highly anticipated Linux 4.9 kernel," reports Softpedia. prisoninmate shares their article: Linux kernel 4.9 entered development in mid-October, on the 15th, when Linus Torvalds decided to cut the merge window short by a day just to keep people on their toes, but also to prevent them from sending last-minute pull requests that might cause issues, as happened with the release of Linux kernel 4.8, which landed just two weeks before the first RC of Linux 4.9 hit the streets... There are many great new features implemented in Linux kernel 4.9, but by far the most exciting one is the experimental support for older AMD Radeon graphics cards from the Southern Islands/GCN 1.0 family, which was added to the open-source AMDGPU graphics driver...

There are also various interesting improvements for modern AMD Radeon GPUs, such as virtual display support and better reset support, both of which are implemented in the AMDGPU driver. For Intel GPU users, there's DMA-BUF implicit fencing, and some Intel Atom processors got a P-State performance boost. Intel Skylake improvements are also present in Linux kernel 4.9.

There's also dynamic thread-tracing, according to Linux Today. (And hopefully they fixed the "buggy crap" that made it into Linux 4.8.) LWN.net calls this "by far the busiest cycle in the history of the kernel project."
The Almighty Buck

Every US Taxpayer Has Effectively Paid Apple At Least $6 in Recent Years (arstechnica.com) 267

An anonymous reader shares an ArsTechnica report: Apple has received at least $6 per American taxpayer over the last five years in the form of interest payments on billions' worth of United States Treasury bonds, according to a report by Bloomberg. Citing Apple's regulatory filings and unnamed sources, the business publication found "the Treasury Department paid Apple at least $600 million and possibly much more over the past five years in the form of interest." By taking advantage of a provision in the American tax code, Bloomberg says that Apple has "stashed much of its foreign earnings -- tax-free -- right here in the US, in part by purchasing government bonds." As The Wall Street Journal reported in September, American companies are believed to be holding approximately $2 trillion in cash overseas that is shielded from US taxes. Under American law, companies must pay a 35-percent corporate tax rate on global profits when that money is brought home -- so there is an incentive to keep as much of that money overseas as possible.
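The headline arithmetic is straightforward; the "$6 per taxpayer" figure implies a base of roughly 100 million filers (an assumption on our part; actual filer counts vary by year and aren't given in the article):

```python
# The headline arithmetic: at least $600 million in Treasury interest
# spread over the US taxpayer base. A "$6 per taxpayer" result implies
# about 100 million filers (assumed; not stated in the article).
total_interest = 600e6  # dollars, "at least", per Bloomberg
taxpayers = 100e6       # assumed number of taxpayers

per_taxpayer = total_interest / taxpayers
print(f"${per_taxpayer:.2f} per taxpayer")  # $6.00
```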
AMD

AMD's Major Radeon Software Graphics Driver Update Goes Live With Gameplay Capture, More (venturebeat.com) 98

Advanced Micro Devices (AMD) is launching an update for its Radeon graphics drivers that will help PC gamers enjoy more power-efficient gameplay during the holiday season. Radeon Software Crimson ReLive Edition offers high-performance gaming and better stability for consumers, professionals, and developers. From a report on VentureBeat: The new edition enables power-efficient gameplay with Radeon Chill and seamless in-game screen capture and streaming with Radeon ReLive. For designers, content creators, and game developers, Radeon Pro Software Crimson ReLive Edition delivers productivity and stability with up to 30 percent performance improvements in key applications. With Radeon ReLive, gamers can "relive" their gameplay by capturing, streaming, and sharing recorded gaming sessions. Highly efficient with minimal impact to gameplay, Radeon ReLive enables seamless playback of ReLive recordings via an easily accessible in-game toolbar, and offers quick and convenient customizable settings, custom scene layouts, and more, AMD said. With Radeon ReLive, gamers now have a way to capture gaming highlights, and share their gaming exploits and conquests with online friends and competitors -- all integrated within Radeon Software.
