Bitcoin

Cryptocurrency Miners Are Using Old Tires to Power Their Rigs (vice.com) 79

Christopher Malmo, writing for Motherboard: An entrepreneurial cryptocurrency mining company has just announced an unusual deal: it has partnered with a tire-based waste-to-energy company in the United States to power its mining computers. Standard American Mining and PRTI, a tire "thermal demanufacturing" company based in North Carolina, are powering graphics card-based mining equipment to earn a range of alternative cryptocurrencies like Ethereum. Basically, they take used tires and heat them to a precise temperature, resulting in components like steel (from belted tires), carbon black, and a burnable fuel. That fuel is the energy source driving turbines to make electricity, which powers an onsite cryptocurrency mining farm. Taking advantage of an underutilized electricity source to run computers isn't groundbreaking, but the unusual setup shows that cryptocurrency mining is now profitable enough to justify seeking out quite unconventional sources of cheap or new energy generation.
AMD

AMD Quietly Made Some Radeon RX 560 Graphics Cards Worse (pcworld.com) 40

Brad Chacos: When the Radeon RX 560 launched in April, it was the only RX 500-series card with a meaningful under-the-hood tech boost compared to the RX 400-series. The graphics processor in the older RX 460 cards packed 14 compute units and 896 stream processors; the upgraded Radeon RX 560 bumped that to 16 CUs and 1,024 SPs. Now, some -- but not all -- of the Radeon RX 560s you'll find online have specs that match the older 460 cards, and some run at lower clock speeds to boot. AMD's Radeon RX 560 page was also quietly altered at some point to include the new configurations, Heise.de discovered. The last snapshot of the page by the Internet Archive's Wayback Machine occurred on July 7 and lists only the full-fat 16 CU version of the card, so the introduction of the nerfed 896 SP model likely occurred sometime after that. Sifting through all of the available Radeon RX 560s on Newegg this morning reveals a fairly even split between the two configurations, all of which are being sold under the same RX 560 name. In a statement, AMD acknowledged the existence of 14 Compute Unit (896 stream processor) and 16 Compute Unit (1,024 stream processor) versions of the Radeon RX 560. "We introduced the 14CU version this summer to provide AIBs and the market with more RX 500 series options. It's come to our attention that on certain AIB and etail websites there's no clear delineation between the two variants. We're taking immediate steps to remedy this: we're working with all AIB and channel partners to make sure the product descriptions and names clarify the CU count, so that gamers and consumers know exactly what they're buying. We apologize for the confusion this may have caused."
Graphics

HDMI 2.1 Is Here With 10K and Dynamic HDR Support (engadget.com) 176

Swapna Krishna reports via Engadget: Back in January, the HDMI Forum unveiled its new specification for the HDMI connector, called HDMI 2.1. Now, that HDMI specification is available to all HDMI 2.0 adopters. It's backwards compatible with all previous HDMI specifications. The focus of HDMI 2.1 is higher video bandwidth; it supports 48 gigabits per second (Gbps) over a new backwards-compatible Ultra High Speed HDMI cable. It also supports faster refresh rates at high video resolutions -- 60 Hz for 8K and 120 Hz for 4K. The standard also supports Dynamic HDR and resolutions up to 10K for commercial and specialty use. This new version of the HDMI specification also introduces refresh-rate features that gamers will appreciate. VRR, or Variable Refresh Rate, reduces, or in some cases eliminates, lag for smoother gameplay, while Quick Frame Transport (QFT) reduces latency. Quick Media Switching, or QMS, reduces the amount of blank-screen wait time while switching media. HDMI 2.1 also includes Auto Low Latency Mode (ALLM), which automatically sets the ideal latency for the smoothest viewing experience.
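For a rough sense of why that 48Gbps ceiling matters, here is a back-of-the-envelope sketch in Python. It computes raw pixel data rates only, assumes 10-bit color and 4:4:4 chroma, and ignores blanking intervals and link-encoding overhead, so real link requirements run somewhat higher; the 10K resolution used is one common definition, not something the spec summary above states.

```python
# Raw (uncompressed) video data rates for the modes HDMI 2.1 targets.
# Assumptions: 10-bit color, 3 channels (4:4:4), no blanking/encoding
# overhead -- actual link requirements are somewhat higher.

def video_gbps(width, height, fps, bits_per_channel=10, channels=3):
    """Raw pixel data rate in gigabits per second."""
    return width * height * fps * bits_per_channel * channels / 1e9

modes = [("4K @ 120 Hz", 3840, 2160, 120),
         ("8K @ 60 Hz", 7680, 4320, 60),
         ("10K @ 60 Hz", 10240, 4320, 60)]   # 10K resolution assumed

for name, w, h, fps in modes:
    print(f"{name}: {video_gbps(w, h, fps):.1f} Gbps raw")
```

At 10 bits per channel, 8K60 already exceeds 48Gbps raw, which is why the higher modes lean on chroma subsampling or the Display Stream Compression support the new spec includes.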
Bitcoin

Tesla Owners Are Mining Bitcoins With Free Power From Charging Stations (vice.com) 141

dmoberhaus writes: Someone claims to have used their Tesla to power a cryptocurrency mine, taking advantage of the free charging given to some Tesla owners. But even with free energy, does this scheme make sense? Motherboard ran the numbers.

From the report: "...If we assume that each of the GPUs in this rig draws around 150 watts, then the 16 GPUs have a total power draw of 2.4 kilowatts, or 57.6 kilowatt-hours per day if they ran for a full 24 hours. According to Green Car Reports, a Tesla Model S gets about 3 miles per kilowatt-hour, meaning that running this mining rig for a full day is the equivalent of driving nearly 173 miles in the Tesla. According to the Federal Highway Administration, the average American drives around 260 miles a week. In other words, running this cryptocurrency mine out of the trunk of a Tesla for a day and a half would use as much energy as driving that Tesla for a full week, on average. Moreover, drivers who are not a part of Tesla's unlimited free energy program are limited to 400 kilowatt-hours of free electricity per year, meaning they could only run their rig for just under 7 days on free energy.

Okay, but how about the cost? Let's assume that this person is mining Ethereum with their GPUs. Out of the box, an average GPU can do about 20 megahashes per second on the Ethereum network (that is, performing a math problem known as hashing 20 million times per second). This Tesla rig, then, would have a total hashrate of about 320 megahashes per second. According to the Cryptocompare profitability calculator, if the Tesla rig were used to mine Ethereum on free electricity, it would yield about 0.05 Ether per day -- equivalent to nearly $23, going by prices at the time of writing. In a month, this would result in $675 in profit, or about the monthly lease for a Tesla Model S. So the Tesla would pay for itself, assuming the owner never drove it or used it for anything other than mining Ethereum, Ethereum doesn't drop in value below $450, and the Tesla owner gets all of their energy for free."
Motherboard also notes that this conclusion "doesn't take into account the price of each of the mining rigs, which likely cost about $1,000 each, depending on the quality of the GPUs used." TL;DR: Mining cryptocurrency out of your electric car is not worth it.
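For anyone who wants to replay Motherboard's math, here's a minimal sketch; every input below is an assumption taken from the article (per-GPU wattage, Model S efficiency, hashrate, Ether price), not a measured value.

```python
# Re-running Motherboard's back-of-the-envelope numbers. All inputs
# are the article's assumptions, not measurements.

GPUS = 16
WATTS_PER_GPU = 150       # assumed draw per card
MILES_PER_KWH = 3         # Green Car Reports' Model S estimate
FREE_KWH_PER_YEAR = 400   # Tesla's cap for non-unlimited owners
ETH_PER_DAY = 0.05        # Cryptocompare estimate at ~320 MH/s
ETH_PRICE_USD = 450       # price floor the article assumes

kw = GPUS * WATTS_PER_GPU / 1000   # 2.4 kW continuous draw
kwh_per_day = kw * 24              # 57.6 kWh per day

print(f"Driving equivalent: {kwh_per_day * MILES_PER_KWH:.0f} miles/day")
print(f"Days on the 400 kWh free tier: {FREE_KWH_PER_YEAR / kwh_per_day:.1f}")
print(f"Revenue: ${ETH_PER_DAY * ETH_PRICE_USD:.2f}/day, "
      f"${ETH_PER_DAY * ETH_PRICE_USD * 30:.0f}/month")
```

Running it reproduces the article's figures: about 173 miles of driving per mining day, roughly 7 days on the free tier, and $675 per month in gross revenue.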
AMD

First AMD Ryzen Mobile Laptop Tested Shows Strong Zen-Vega Performance (hothardware.com) 85

MojoKid writes: AMD Ryzen Mobile processors are arriving now in retail laptops from the likes of HP, Lenovo and Acer. For the first CPUs to hit the market, AMD took a quad-core Ryzen CPU and coupled it with a Radeon Vega GPU with 8 or 10 compute units on a single piece of silicon, in an effort to deliver strong Ryzen CPU performance along with significantly better integrated graphics performance than Intel's current 8th Gen Kaby Lake laptop chips. AMD Ryzen 7 2700U and Ryzen 5 2500U chips have 4MB of shared L3 cache each, but differ with respect to top-end CPU boost clock speeds, number of integrated Radeon Vega Compute Units (CUs), and the GPU's top-end clocks. Ryzen 7 2700U is more powerful with 10 Radeon Vega CUs, while Ryzen 5 2500U sports 8. Ryzen 7 2700U also boosts to 3.8GHz, while Ryzen 5 2500U tops out at 3.6GHz. In the benchmarks, Ryzen Mobile looks strong, competing well with Intel quad-core 8th Gen laptop CPUs while offering north of 60 percent better performance in graphics and gaming. Battery life is still a question mark, however, as some of the very first models to hit the market from HP have inefficient displays and hard drives instead of SSDs. As more premium configurations hit the market in the next few weeks, hopefully we'll get a better picture of Ryzen Mobile battery life in more optimized laptop builds.
Windows

Microsoft Confirms Surface Book 2 Can't Stay Charged During Gaming Sessions (engadget.com) 138

The Verge mentioned in its review that the Surface Book 2's power supply can't charge the battery fast enough to prevent it from draining in some cases. Microsoft has since confirmed that "in some intense, prolonged gaming scenarios with Power Mode Slider set to 'best performance' the battery may discharge while connected to the power supply." Engadget reports: To let you choose between performance and battery life, the Surface Book has a range of power settings. If you're doing video editing or other GPU-intensive tasks, you can crank it up to "best performance" to activate the NVIDIA GPU and get more speed. Battery drain is normally not an issue with graphics apps because the chip only kicks in when needed. You'll also need the "best performance" setting for GPU-intensive games, as they'll slow down or drop frames otherwise. The problem is that select titles like Destiny 2 use the NVIDIA chip nearly continuously, pulling up to 70 watts of power on top of the 35-watt CPU. Unfortunately, the Surface Book comes with a 102-watt charger, and only about 95 watts of that reaches the device, The Verge points out. Microsoft says that the power management system will prevent the battery from draining completely, even during intense gaming, but it would certainly cut a Destiny 2 session short. It also notes that the machine is intended for designers, developers and engineers, with the subtext that it's not exactly marketed as a gaming rig.
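The arithmetic behind the drain is straightforward. Here's an illustrative sketch: the wattages come from the reports above, but the combined battery capacity is a placeholder assumption, so the runtime it prints is illustrative only.

```python
# Why the Surface Book 2 can discharge while plugged in. GPU/CPU/charger
# figures are from the reports above; the battery capacity is an assumed
# placeholder. Display, storage, and RAM draw are ignored, which makes
# this a best case -- the real shortfall is larger.

GPU_W = 70          # GPU draw in Destiny 2, per the report
CPU_W = 35          # CPU draw under load
DELIVERED_W = 95    # usable output of the 102-watt charger, per The Verge

BATTERY_WH = 70     # assumed combined capacity (hypothetical)

deficit_w = GPU_W + CPU_W - DELIVERED_W   # 10 W shortfall
print(f"Shortfall: {deficit_w} W; battery empties in "
      f"~{BATTERY_WH / deficit_w:.0f} h of sustained play")
```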
Graphics

Google Cloud Platform Cuts the Price of GPUs By Up To 36 Percent (techcrunch.com) 28

In a blog post, Google product manager Chris Kleban announced that the company is cutting the price of using Nvidia's Tesla GPUs through its Compute Engine by up to 36 percent. The older K80 GPUs will now cost $0.45 per hour while the more powerful P100 machines will cost $1.46 per hour (all with per-second billing). TechCrunch reports: The company is also dropping the prices for preemptible local SSDs by almost 40 percent. "Preemptible local SSDs" refers to local SSDs attached to Google's preemptible VMs. You can't attach GPUs to preemptible instances, though, so this is a nice little bonus announcement -- but it isn't going to directly benefit GPU users. As for the new GPU pricing, it's clear that Google is aiming this feature at developers who want to run their own machine learning workloads on its cloud, though there are also a number of other applications -- including physical simulations and molecular modeling -- that greatly benefit from the hundreds of cores that are now available on these GPUs. The P100, which is officially still in beta on the Google Cloud Platform, features 3,584 cores, for example. Developers can attach up to four P100 and eight K80 dies to each instance. Like regular VMs, GPU users will also receive sustained-use discounts, though most users probably don't keep their GPUs running for a full month.
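With per-second billing, job cost is simply rate times runtime. Here's a small sketch using the prices quoted above; it ignores sustained-use discounts, and the example job shape is made up for illustration.

```python
# GPU job cost under per-second billing, using the quoted prices.
# Sustained-use discounts ignored; the example job is hypothetical.

K80_PER_HOUR = 0.45
P100_PER_HOUR = 1.46

def job_cost(seconds, hourly_rate, gpus=1):
    """Cost in USD of a job billed per second."""
    return seconds / 3600 * hourly_rate * gpus

# e.g. a 90-minute training run on four P100s
print(f"${job_cost(90 * 60, P100_PER_HOUR, gpus=4):.2f}")  # $8.76
```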
Graphics

NVIDIA Launches Modded Collector's Edition Star Wars Titan Xp Graphics Card (hothardware.com) 45

MojoKid writes: NVIDIA just launched its fastest graphics card yet, and this GPU is targeted squarely at Star Wars fans. In concert with EA's official launch today of Star Wars Battlefront II, NVIDIA unveiled the new Star Wars Titan Xp Collector's Edition graphics card for enthusiast gamers. There are two versions of the card available -- a Galactic Empire version and a Jedi Order version. Both feature customized coolers, shrouds, and lighting designed to mimic the look of a lightsaber. They also ship in specialized packaging that can be used to showcase the cards if they're not installed in a system. The GPU powering the Titan Xp Collector's Edition has a base clock of 1,481MHz and a boost clock of 1,582MHz. It's packing a fully-enabled NVIDIA GP102 GPU with 3,840 cores and 12GB of GDDR5X memory clocked at 5.7GHz for an effective data rate of 11.4Gbps, resulting in 547.2GB/s of peak memory bandwidth. At those clocks, the card also offers a peak texture fillrate of 379.75 GigaTexels/s and 12.1 TFLOPs of FP32 compute performance, significantly higher than a GeForce GTX 1080 Ti. In the benchmarks, it's the fastest GPU out there right now (it better be, for $1,200), but this card is more about nostalgia and the design customizations NVIDIA made, which should appeal to gamers and Star Wars fans alike.
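Those headline figures fall straight out of the base specs. A quick sketch of the derivation (the 384-bit memory bus is GP102's, which the summary above doesn't state):

```python
# Deriving the Titan Xp Collector's Edition headline numbers from its
# base specs. The 384-bit memory bus is GP102's, not stated above.

CORES = 3840
BOOST_GHZ = 1.582
MEM_GBPS_PER_PIN = 11.4   # effective GDDR5X data rate
BUS_WIDTH_BITS = 384

fp32_tflops = 2 * CORES * BOOST_GHZ / 1000   # 2 FLOPs/core/clock (FMA)
bandwidth_gbs = MEM_GBPS_PER_PIN * BUS_WIDTH_BITS / 8

print(f"{fp32_tflops:.1f} TFLOPs FP32")                   # ~12.1
print(f"{bandwidth_gbs:.1f} GB/s peak memory bandwidth")  # 547.2
```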
United States

America's 'Retail Apocalypse' Is Really Just Beginning (bloomberg.com) 398

An anonymous reader quotes a report from Bloomberg: The so-called retail apocalypse has become so ingrained in the U.S. that it now has the distinction of its own Wikipedia entry. The industry's response to that kind of doomsday description has included blaming the media for hyping the troubles of a few well-known chains as proof of a systemic meltdown. There is some truth to that. In the U.S., retailers announced more than 3,000 store openings in the first three quarters of this year. But chains also said 6,800 stores would close. And this comes at a time of sky-high consumer confidence, historically low unemployment, and a growing U.S. economy. Those are normally all ingredients for a retail boom, yet more chains are filing for bankruptcy and being rated as distressed than during the financial crisis. That has caused an increase in delinquent loan payments by malls and shopping centers. The reason isn't as simple as Amazon.com Inc. taking market share or twenty-somethings spending more on experiences than things. The root cause is that many of these long-standing chains are overloaded with debt -- often from leveraged buyouts led by private equity firms. There are billions in borrowings on the balance sheets of troubled retailers, and sustaining that load is only going to become harder -- even for healthy chains. The debt coming due, along with America's over-stored suburbs and the continued gains of online shopping, has all the makings of a disaster. The spillover will likely flow far and wide across the U.S. economy: displaced low-income workers, shrinking local tax bases, and investor losses on stocks, bonds and real estate. If today is considered a retail apocalypse, then what's coming next could truly be scary.
Intel

Intel Recruits AMD RTG Exec Raja Koduri To Head New Visual Computing Group (hothardware.com) 58

MojoKid writes: Intel just announced that former AMD Radeon Technologies Group SVP Raja Koduri will be joining the company to head up a newly formed Core and Visual Computing Group, serving as general manager of a new initiative to drive edge and client visual computing solutions. With Koduri's help, Intel plans to unify and expand its IP across multiple segments, including core computing, graphics, media, imaging and machine learning capabilities for the client and data center segments, artificial intelligence, and emerging opportunities. Intel also explicitly stated that it would expand its strategy to develop and deliver high-end, discrete graphics solutions. This announcement comes just after Intel revealed it would be employing AMD's Vega GPU architecture in a new mobile processor that will drive high-end graphics performance into smaller, slimmer, and sleeker mobile form factors. With AMD essentially spinning the Radeon Technologies Group into its own entity, Intel now leveraging AMD graphics technology, and a top-level executive like Koduri responsible for said graphics tech switching teams, we have to wonder how the relationship between Intel and AMD's RTG will evolve.
Intel

Arch-rivals Intel and AMD Team Up on PC Chips To Battle Nvidia (pcworld.com) 169

Intel and AMD, arch-rivals for decades, are teaming up to thwart a common competitor, Nvidia. On Monday, the two companies said they are co-designing an Intel Core microprocessor with a custom AMD Radeon graphics core inside the processor package. The chip is intended for laptops that are thin and lightweight but powerful enough to run high-end videogames, the companies said. From a report: Executives from both AMD and Intel told PCWorld that the combined AMD-Intel chip will be an "evolution" of Intel's 8th-generation, H-series Core chips, with the ability to power-manage the entire module to preserve battery life. It's scheduled to ship as early as the first quarter of 2018. Though both companies helped engineer the new chip, this is Intel's project -- Intel first approached AMD, both companies confirmed. AMD, for its part, is treating the Radeon core as a single, semi-custom design, in the same vein as the chips it supplies to consoles like the Microsoft Xbox One X and Sony PlayStation 4. Some specifics, though, remain undisclosed: Intel refers to it as a single product, though it seems possible that it could eventually be offered at a range of clock speeds. [...] Shaking hands on this partnership represents a rare moment of harmony in an often bitter rivalry that began when AMD reverse-engineered the Intel 8080 microchip in 1975.
AMD

AMD, Which Lost Over $2.8 Billion In 5 Years, Takes a Hit After New Report (arstechnica.com) 91

An anonymous reader quotes a report from Ars Technica: On Monday, AMD's stock price plunged nearly 9 percent after a report by Morgan Stanley, a major investment bank, which found that "microprocessor momentum" has slowed. According to CNBC, a new report by analyst Joseph Moore found that "cryptocurrency mining driven sales for AMD's graphics chips will decline by 50 percent next year or a $250 million decline in revenue. He also forecasts video game console demand will decline by 5.5 percent in 2018." As per AMD's own SEC filings, the company lost over $2.8 billion from 2012 through 2016. However, new releases from AMD suggest that it may be on something of a resurgent track. As Ars reported last month, AMD's Ryzen and Threadripper processors re-established AMD's chips as competitive with Intel's.
Networking

PCIe 4.0 Specs Revealed: 16 GT/s Rate and Not Just For Graphics Cards Anymore (tomshardware.com) 62

Freshly Exhumed writes: PCI-SIG has released the specifications for version 4.0 of the PCIe (Peripheral Component Interconnect Express) bus, which, according to Chairman Al Yanes, promises data transfer rates of 16 GT/s (gigatransfers per second) per lane, extended tags and credits for service devices, reduced system latency, lane margining, superior RAS capabilities, scalability for added lanes and bandwidth, improved I/O virtualization, and better platform integration. Tom's Hardware has posted a slide deck of the new version's specifications.
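As a sanity check on what 16 GT/s means in practice, per-lane throughput after PCIe's 128b/130b line coding works out as follows; higher-layer protocol overhead (TLP headers, flow control) is ignored in this sketch.

```python
# Usable PCIe 4.0 bandwidth per lane: 16 GT/s with 128b/130b encoding.
# Higher-layer protocol overhead is ignored.

TRANSFERS_PER_S = 16e9
ENCODING = 128 / 130    # payload bits per transmitted bit

lane_gb_s = TRANSFERS_PER_S * ENCODING / 8 / 1e9  # GB/s, one direction
for lanes in (1, 4, 16):
    print(f"x{lanes}: {lane_gb_s * lanes:.1f} GB/s per direction")
```

That comes to roughly 2 GB/s per lane and about 31.5 GB/s for an x16 slot, double PCIe 3.0's rate.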
AMD

AMD Unveils Ryzen Mobile Processors Combining Zen Cores and Vega Graphics (hothardware.com) 41

MojoKid writes: AMD is officially launching a processor family today, previously known by the code name Raven Ridge but now referred to as Ryzen Mobile. The architecture combines AMD's new Zen CPU cores with an integrated RX Vega GPU on a single chip for laptops. There are two initial chips in the mobile processor family AMD is announcing today: the Ryzen 5 2500U and the Ryzen 7 2700U. Both processors feature four cores capable of executing eight threads with SMT. However, there are differences with respect to processor clocks and GPU specs. AMD's Ryzen 5 2500U has a base clock of 2GHz and a boost clock of 3.6GHz, while Ryzen 7 2700U cranks up another 200MHz on both of those figures. Ryzen 5 2500U features 8 Radeon Vega graphics CUs (Compute Units) and a GPU clock of 1.1GHz, compared to 10 Radeon Vega CUs and a GPU clock of 1.3GHz for the higher-end Ryzen 7 2700U. AMD is making rather ambitious claims for the new processors and promises some impressive gains over their 7th-generation Bristol Ridge predecessors. According to AMD, CPU and GPU performance will see 200 percent and 128 percent uplifts, respectively. AMD is also showcasing benchmark numbers that have the new CPUs outgunning Intel's new quad-core Kaby Lake R chips in spots, along with significant performance advantages in gaming and graphics, on par with discrete, entry-level laptop GPUs like NVIDIA's GeForce GTX 950M. Thin-and-light laptops from HP, Lenovo and Acer powered by Ryzen Mobile are expected to ship in Q4 this year.
Transportation

Chipmaker Nvidia's CEO Sees Fully Autonomous Cars Within 4 Years (reuters.com) 77

An anonymous reader shares a report: Nvidia chief executive Jensen Huang said on Thursday artificial intelligence would enable fully automated cars within 4 years, but sought to tamp down expectations for a surge in demand for its chips from cryptocurrency miners. Nvidia came to prominence in the gaming industry for designing graphics-processing chips, but in recent years has been expanding into newer technologies including high-performance computing, artificial intelligence, and self-driving cars. Its expansion has been richly rewarded with a 170 percent stock surge over the past year, boosting its market value to $116 billion. "It will take no more than 4 years to have fully autonomous cars on the road. How long it takes for the vast majority of cars on the road to become that, it really just depends," Huang told media after a company event in Taipei.
Desktops (Apple)

Tim Cook Confirms the Mac Mini Isn't Dead (macrumors.com) 191

Apple has refreshed just about every Mac product within the last couple of years -- except for the Mac mini. Naturally, this has left many analysts questioning whether the company would be phasing out the Mini to focus more on its mobile devices. A MacRumors reader decided to email Apple CEO Tim Cook for an update on the Mac mini, and he received a response. Cook said it was "not time to share any details," but he confirmed that the Mac mini will be an important part of the company's product lineup in the future. MacRumors reports: Cook's response echoes a similar statement from Apple marketing chief Phil Schiller, who commented on the Mac mini when Apple's plans for a new Mac Pro were unveiled. "The Mac mini is an important product in our lineup and we weren't bringing it up because it's more of a mix of consumer with some pro use," he said. Positioned as a "bring your own peripherals" machine that comes without a mouse, keyboard, or display, the Mac mini is Apple's most affordable desktop machine. The current version is woefully outdated, though, still using Haswell processors and integrated Intel HD 5000/Intel Iris graphics. It's not clear when Apple will introduce a new Mac mini; aside from a single rumor hinting at a high-end redesign that "won't be so mini anymore," we've heard nothing about work on a possible Mac mini refresh.
Intel

Intel Aims To Take on Nvidia With a Processor Specially Designed for AI (fastcompany.com) 43

An anonymous reader shares a report: In what looks like a repeat of its loss to Qualcomm on smartphones, Intel has lagged graphics chip (GPU) maker Nvidia in the artificial intelligence revolution. Today Intel announced that its first AI chip, the Nervana Neural Network Processor, will roll out of factories by year's end. Originally called Lake Crest, the chip gets its name from Nervana, a company Intel purchased in August 2016, taking on the CEO, Naveen Rao, as Intel's AI guru. Nervana is designed from the ground up for machine learning, Rao tells me. You can't play Call of Duty with it. Rao claims that ditching the GPU heritage made room for optimizations like super-fast data interconnections allowing a bunch of Nervanas to act together like one giant chip. They also do away with the caches that hold data the processor might need to work on next. "In neural networks... you know ahead of time where the data's coming from, what operation you're going to apply to that data, and where the output is going to," says Rao.
Microsoft

Microsoft Surface Book 2 Puts Desktop Brains in a Laptop Body (wired.com) 141

David Pierce, writing for Wired: As Microsoft went to create the Surface Book 2, the company once again tried to bust categories. The result is the most combinatory device Microsoft's made yet. It's a laptop (screens measure 13 or 15 inches; there's a keyboard and trackpad) -- and it's also a tablet (the screen detaches, you can use a pen, everything's touch-friendly), and it's also a desktop. A stupendously powerful one, at that: It runs on Intel's new eighth-generation quad-core processors, in either a Core i5 or Core i7 version. The higher-end models come with Nvidia GeForce discrete graphics, up to 16 gigs of RAM, and as much as 1 terabyte of solid-state storage. All that in a fanless body that gets up to 17 hours of battery life and weighs about 3.5 pounds for the smaller model or 4.2 pounds for the larger. What does all that mean? Microsoft claims the smaller model is three times more powerful than the last Surface Book, and the 15-inch runs five times as fast. Those are meaningless comparisons, but the point holds: This thing screams. More useful are the comparisons to Apple's latest MacBook Pros: Microsoft claims up to 70 percent more battery life and double the performance of Apple's laptops.
Transportation

Nvidia Introduces a Computer For Level 5 Autonomous Cars (engadget.com) 175

From a report: At the center of many of the semi-autonomous cars currently on the road is NVIDIA hardware. Once automakers realized that GPUs could power their latest features, the chipmaker, best known for the graphics cards that make your games look outstanding, became the darling of the car world. But while automakers are still dropping level 2 and sometimes level 3 vehicles into the market, NVIDIA's latest AI computer, the NVIDIA Drive PX Pegasus, is apparently capable of level 5 autonomy. That means no pedals, no steering wheel, no need for anyone to ever take control. The new computer delivers 320 trillion operations per second, 10 times more than its predecessor. Before you start squirreling away cash for your own self-driving car, though, NVIDIA's senior director of automotive, Danny Shapiro, notes that it's likely going to be robotaxis that drive us around. In fact, the company said that over 25 of its partners are already working on fully autonomous taxis. The goal with this smaller, more powerful computer is to replace the huge computer arrays that sit in the prototype vehicles of OEMs, startups and any other company that's trying to crack the autonomous car nut.
Iphone

Apple Doesn't Deliberately Slow Down Older Devices According To Benchmark Analysis (macrumors.com) 163

According to software company Futuremark, Apple doesn't intentionally slow down older iPhones when it releases new software updates as a way to encourage its customers to buy new devices. MacRumors reports: Starting in 2016, Futuremark collected over 100,000 benchmark results for seven different iPhone models across three versions of iOS, using that data to create performance comparison charts to determine whether there have been performance drops in iOS 9, iOS 10, and iOS 11. The first device tested was the iPhone 5s, as it's the oldest device capable of running iOS 11. The iPhone 5s, released in 2013, was the first iPhone to get a 64-bit A7 chip, and iOS 11 is limited to 64-bit devices. Futuremark used the 3DMark Sling Shot Extreme Graphics test and averaged all benchmark scores from the iPhone 5s across a given month to make its comparison. In the resulting charts, the higher the bar, the better the performance, and based on the testing, GPU performance on the iPhone 5s has remained constant from iOS 9 to iOS 11, with just minor variations that Futuremark says "fall well within normal levels." iPhone 5s CPU performance over time was measured using the 3DMark Sling Shot Extreme Physics test, and again, results were largely consistent. CPU performance across those three versions of iOS has dropped slightly, something Futuremark attributes to "minor iOS updates or other factors."
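Futuremark's method, as described, boils down to grouping scores by month and comparing the averages across iOS versions. A minimal sketch of that aggregation; the data file and its column names are hypothetical stand-ins, not Futuremark's actual dataset.

```python
# Sketch of the aggregation described above: average benchmark scores
# per month, split by iOS version. The CSV and its columns
# (date, ios_version, gpu_score) are hypothetical.

import pandas as pd

df = pd.read_csv("slingshot_results.csv")
df["month"] = pd.to_datetime(df["date"]).dt.to_period("M")
monthly = df.groupby(["month", "ios_version"])["gpu_score"].mean()

# Flat rows across months and versions would indicate no slowdown.
print(monthly.unstack("ios_version"))
```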
