AMD

AMD Unveils 'EPYC' Server CPUs, Ryzen Mobile, Threadripper CPU and Radeon Vega Frontier Edition GPU (hothardware.com) 76

MojoKid writes: Today, at its financial analyst day, AMD lifted the veil on a number of new products based on the company's Zen CPU architecture and next-generation Vega GPU architecture. AMD CEO Lisa Su lifted a very large server chip in the air that the company has now branded EPYC. AMD is going for the jugular when it comes to comparisons with Intel's Xeon family, providing up to 128 PCI Express 3.0 lanes, which Su says "allows you to connect more GPUs directly to the CPU than any other solution in the industry." EPYC currently scales to 32 cores/64 threads per socket and supports up to 8-channel DDR4 memory (16 DIMMs per CPU, up to 4TB total memory support). AMD also confirmed the previously rumored Threadripper CPU, a 16-core/32-thread beast of a chip for the enthusiast desktop PC space. AMD's Raja Koduri, Senior Vice President and Chief Architect for Radeon Technologies Group, also unveiled Radeon Vega Frontier Edition, a workstation and pro graphics card targeted at VR content creation, visualization and machine learning. Radeon Vega Frontier Edition offers 13 TFLOPS of FP32 throughput, 25 TFLOPS of FP16 performance and is powered by 64 compute units and 16GB of HBM2 memory for about 480GB/sec of memory bandwidth. The cards are expected to ship in June but there was no word just yet on when consumer versions of Vega will hit. Finally, AMD also shared info on Ryzen Mobile, which will incorporate both the Zen CPU architecture and an integrated Vega GPU core. Compared to AMD's 7th-generation APUs, AMD claims Ryzen Mobile will up CPU performance by 50 percent while offering 40 percent better graphics performance. AMD also claimed those gains will not come at the expense of battery life, with a 50 percent reduction in power consumption, which reportedly will pave the way for faster, longer-lasting premium notebooks and 2-in-1 devices.
AI

NVIDIA Unveils Tesla V100 AI Accelerator Powered By 5120 CUDA Core Volta GPU (hothardware.com) 37

MojoKid writes: NVIDIA CEO Jen-Hsun Huang just offered the first public unveiling of a product based on the company's next-generation GPU architecture, codenamed Volta. NVIDIA just announced its new Tesla V100 accelerator that's designed for AI and machine learning applications, and at the heart of the Tesla V100 is NVIDIA's Volta GV100 GPU. The chip features 21.1 billion transistors on a die that measures 815mm2 (compared to 12 billion transistors and 610mm2 respectively for the previous-gen Pascal GP100). The GV100 is built on a 12nm FinFET manufacturing process by TSMC. It comprises 5,120 CUDA cores with a boost clock of 1455MHz, compared to 3,584 CUDA cores for the GeForce GTX 1080 Ti and previous-gen Tesla P100 AI accelerator, for example. The new Volta GPU delivers 15 TFLOPS of FP32 compute performance and 7.5 TFLOPS of FP64 compute performance. Also on board are 16MB of cache and 16GB of second-generation High Bandwidth Memory (HBM2) with 900GB/sec of bandwidth via a 4096-bit interface. The GV100 also has dedicated Tensor cores (640 in total) for accelerating AI workloads. NVIDIA notes the dedicated Tensor cores also allow for a 12x uplift in deep learning performance compared to Pascal, which relies solely on its CUDA cores. NVIDIA is targeting a Q3 2017 release for the Tesla V100 with Volta, but the timetable for a GeForce-derivative family of consumer graphics cards has not been disclosed.
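The headline compute figures follow directly from the core count and clock: each CUDA core can retire one fused multiply-add (two floating-point operations) per cycle, so peak FP32 throughput is roughly 2 x cores x clock, with FP64 at half that rate on GV100. A minimal sketch of that arithmetic (the FMA-per-core-per-cycle assumption is ours, not stated in the summary):

    #include <stdio.h>

    /* Peak throughput, assuming one fused multiply-add (2 FLOPs) per
     * CUDA core per cycle at the quoted boost clock; FP64 runs at half
     * the FP32 rate on GV100. */
    int main(void)
    {
        const double cuda_cores = 5120.0;
        const double boost_ghz  = 1.455;  /* 1455MHz boost clock */
        const double fp32_tflops = 2.0 * cuda_cores * boost_ghz / 1000.0;

        printf("Peak FP32: %.1f TFLOPS\n", fp32_tflops);       /* ~14.9, quoted as 15 */
        printf("Peak FP64: %.1f TFLOPS\n", fp32_tflops / 2.0); /* ~7.5 */
        return 0;
    }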
Google

Google's Upcoming 'Fuchsia' Smartphone OS Dumps Linux, Has a Wild New UI (arstechnica.com) 219

More details have emerged about Fuchsia, the new mobile OS Google has been working on. Ars Technica reports that Fuchsia is not based on Linux (unlike Android and Chrome OS). Instead, the OS uses a new, Google-developed microkernel called "Magenta." From the article: With Fuchsia, Google would not only be dumping the Linux kernel, but also the GPL: the OS is licensed under a mix of BSD 3-clause, MIT, and Apache 2.0. Dumping Linux might come as a bit of a shock, but the Android ecosystem seems to have no desire to keep up with upstream Linux releases. Even the Google Pixel is still stuck on Linux kernel 3.18, which was first released at the end of 2014. [...] The interface and apps are written using Google's Flutter SDK, a project that actually produces cross-platform code that runs on Android and iOS. Flutter apps are written in Dart, Google's reboot of JavaScript, which, on mobile, has a focus on high-performance, 120fps apps. It also has a Vulkan-based graphics renderer called "Escher" that lists "Volumetric soft shadows" as one of its features, which seems custom-built to run Google's shadow-heavy "Material Design" interface guidelines. The publication put the Flutter SDK to the test on an Android device to get a sneak peek into the user interface of Fuchsia. "The home screen is a giant vertically scrolling list. In the center you'll see a (placeholder) profile picture, the date, a city name, and a battery icon," the author wrote. "Above that are 'Story' cards -- basically Recent Apps -- and below it is a scrolling list of suggestions, sort of like a Google Now placeholder. Leave the main screen and you'll see a Fuchsia 'home' button pop up on the bottom of the screen, which is just a single white circle."
Businesses

Splitting Up With Apple is a Chipmaker's Nightmare (engadget.com) 98

Apple is such a powerful company that it's hard for third-party suppliers not to become reliant on the cash it pays them. Engadget adds: But when Apple says that it's done, choosing to move whatever technology you provide in house, the results can be really painful. Imagination Technologies is one such supplier, famously designing the iPhone's PowerVR graphics as well as pushing MIPS, a rival to ARM. But back in March, Imagination publicly announced that Apple was ditching it in favor of its own graphics silicon. Now, Imagination has revealed that it's going to take Apple to dispute resolution, maintaining that the iPhone maker used Imagination's IP without permission. It's the second chipmaker in recent months to conclude that Apple isn't playing fair, with Qualcomm counter-suing Apple in its own licensing dispute. Separately, Imagination is going to have to sell off MIPS and Ensigma, two parts of its business that aren't as profitable as PowerVR. Gamers with long memories will remember that MIPS designed the CPUs that lurked inside the PlayStation, PS2 and Nintendo 64.
Open Source

Linux Kernel 4.11 Officially Released (softpedia.com) 55

prisoninmate quotes Softpedia: Linux kernel 4.11 has been in development for the past two months, since very early March, when the first Release Candidate arrived for public testing. Eight RCs later, we're now able to download and compile the final release of Linux 4.11 on our favorite GNU/Linux distributions and enjoy its new features. Prominent ones include scalable swapping for SSDs, a brand-new perf ftrace tool, support for OPAL drives, support for the SMC-R (Shared Memory Communications-RDMA) protocol, journalling support for MD RAID5, an all-new statx() system call to replace stat(2), and persistent scrollback buffers for VGA consoles... The Linux 4.11 kernel also introduces initial support for Intel Gemini Lake chips, an Atom-based, low-cost processor family developed using Intel's 14-nanometer technology, and better power management for AMD Radeon GPUs when the AMDGPU open-source graphics driver is used.
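Of the new features, statx() is the most developer-visible: unlike stat(2), it lets callers request only the fields they need and exposes additional information such as a file's birth time. Here's a minimal sketch of calling it on a 4.11 kernel -- it goes through syscall(2) because glibc had no statx() wrapper at the time, and it assumes kernel headers new enough to define struct statx and SYS_statx:

    /* Minimal statx(2) sketch: invoked via syscall(2) because glibc had
     * no wrapper when 4.11 shipped; assumes recent kernel headers. */
    #define _GNU_SOURCE
    #include <stdio.h>
    #include <unistd.h>
    #include <fcntl.h>          /* AT_FDCWD */
    #include <sys/syscall.h>    /* SYS_statx */
    #include <linux/stat.h>     /* struct statx, STATX_* masks */

    int main(int argc, char **argv)
    {
        const char *path = argc > 1 ? argv[1] : ".";
        struct statx stx;

        /* Unlike stat(2), callers ask only for the fields they need. */
        if (syscall(SYS_statx, AT_FDCWD, path, 0,
                    STATX_SIZE | STATX_BTIME, &stx) != 0) {
            perror("statx");
            return 1;
        }

        printf("%s: %llu bytes\n", path, (unsigned long long)stx.stx_size);
        if (stx.stx_mask & STATX_BTIME)  /* birth time is optional per filesystem */
            printf("created (epoch secs): %lld\n", (long long)stx.stx_btime.tv_sec);
        return 0;
    }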
Programming

Stack Overflow Reveals Which Programming Languages Are Most Used At Night (stackoverflow.blog) 99

Stack Overflow data scientist David Robinson recently calculated when people visit the popular programming question-and-answer site, and then examined whether those results differed by programming language. Quoting his results:
  • "C# programmers start and stop their day earlier, and tend to use the language less in the evenings. This might be because C# is often used at finance and enterprise software companies, which often start earlier and have rigid schedules."
  • "C programmers start the day a bit later, keep using the language in the evening, and stay up the longest. This suggests C may be particularly popular among hobbyist programmers who code during their free time (or perhaps among summer school students doing homework)."
  • "Python and JavaScript are somewhere in between: Python and JavaScript developers start and end the day a little later than C# users, and are a little less likely than C programmers to work in the evening."

The site also released an interactive app which lets users see how the results for other languages compare to C#, JavaScript, Python, and C, though of those four, "C# would count as the 'most nine-to-five,' and C as the least."

And they've also calculated the technologies used most between 9 and 5 (which "include many Microsoft technologies, such as SQL Server, Excel, VBA, and Internet Explorer, as well as technologies like SVN and Oracle that are frequently used at enterprise software companies.") Meanwhile, the technologies most often used outside the 9-5 workday "include web frameworks like Firebase, Meteor, and Express, as well as graphics libraries like OpenGL and Unity. The functional language Haskell is the tag most visited outside of the workday; only half of its visits happen between 9 and 5."


The Internet

Newest Firefox Browser Bashes Crashes (cnet.com) 134

Nobody likes it when a web browser bombs instead of opening up a website. Mozilla is addressing that in the newly released v53 of its Firefox browser, which it claims crashes 10 percent less often. CNET adds: The improvement marks the first major debut of a piece of Project Quantum, an effort launched in 2016 to beef up and speed up Firefox. To improve stability, Firefox 53 on Windows machines isolates software called a compositor that's in charge of painting elements of a website onto your screen. Moving the compositor into a separate process cuts down on trouble spots that can occur when Firefox uses the computer's graphics chip, Mozilla said.
Android

Benchmarks Show Galaxy S8 With Snapdragon 835 Is a Much Faster Android Handset (hothardware.com) 82

MojoKid writes: Samsung recently launched the Galaxy S8 series of Android smartphones to much fanfare, but only recently did the handsets begin to arrive on the market for testing and review. Though the high-polish styling of the Galaxy S8 and Galaxy S8+ may or may not appeal to you, few would argue with Samsung's claims of significant performance gains and improved battery life. As it turns out, in deep-dive testing and benchmarking, the Galaxy S8 series is significantly faster than any other Android handset on the market currently, especially when it comes to graphics and gaming workloads. The Qualcomm Snapdragon 835 processor on board the GS8 is currently a Samsung exclusive, though it's expected to arrive in other handsets later this year. The Adreno 540 graphics engine on board the new Snapdragon chip is roughly 25 percent faster than the previous-generation Snapdragon 820/821 series, though the chip is only about 10 percent faster in standard CPU-intensive tasks. Regardless, these are appreciable gains, especially in light of the fact that the new Galaxy S8 also has much better battery life than the previous-generation Galaxy S7 series. The Samsung Galaxy S8 (5.8-inch) and Galaxy S8+ (6.2-inch) are expected to arrive at retail this week and, though pricing is carrier-dependent, list for roughly $720 and $850 off contract, respectively.
Desktops (Apple)

StarCraft Is Now Free, Nearly 20 Years After Its Release (techcrunch.com) 239

An anonymous reader quotes a report from TechCrunch: Nearly two decades after its 1998 release, StarCraft is now free. Legally! Blizzard has just released the original game -- plus the Brood War expansion -- for free for both PC and Mac. You can find it here. Up until a few weeks ago, getting the game with its expansion would've cost $10-15. The company says it has also used this opportunity to improve the game's anti-cheat system, add "improved compatibility" with Windows 7, 8.1, and 10, and fix a few long-standing bugs. So why now? The company is about to release a remastered version of the game in just a few months, with its graphics and audio overhauled for modern systems. Once that version hits, the original will probably look a bit ancient by comparison -- so they might as well use it to win over a few new fans, right?
AMD

AMD Launches Higher Performance Radeon RX 580 and RX 570 Polaris Graphics Cards (hothardware.com) 93

Reader MojoKid writes: In preparation for the impending launch of AMD's next-generation Vega GPU architecture, which will eventually reside at the top of the company's graphics product stack, the company unveiled a refresh of its mainstream graphics card line-up with more powerful Polaris-based GPUs. The new AMD Radeon RX 580 and RX 570 are built around AMD's Polaris 20 GPU, which is an updated revision of Polaris 10. The Radeon RX 580 features 36 Compute Units, with a total of 2,304 shader processors and boost/base GPU clocks of 1340MHz and 1257MHz, respectively, along with 8GB of GDDR5 over a 256-bit interface. The Radeon RX 580 offers up a total of 6.17 TFLOPS of compute performance with up to 256GB/s of peak memory bandwidth. Though based on the same chip, the Radeon RX 570 has only 32 active CUs and 2,048 shader processors. Boost and base reference clocks are 1244MHz and 1168MHz, respectively, with 4GB of GDDR5 memory also connected over a 256-bit interface. At reference clocks, the peak compute performance of the Radeon RX 570 is 5.1 TFLOPS with 224GB/s of memory bandwidth. In the benchmarks, the AMD Radeon RX 580 clearly outpaced AMD's previous-gen Radeon RX 480, and was faster than an NVIDIA GeForce GTX 1060 Founders Edition card more often than not. It was more evenly matched with factory-overclocked OEM GeForce GTX 1060 cards, however. Expected retail price points are around $245 and $175 for 8GB Radeon RX 580 and 4GB RX 570 cards, respectively, though more affordable options will also be available.
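The quoted 256GB/s peak bandwidth is simply bus width times effective data rate. A small sketch of that arithmetic (the 8 Gbps-per-pin GDDR5 rate is the reference RX 580 spec and is our assumption; the summary only states the bus width and the resulting bandwidth):

    #include <stdio.h>

    /* Peak memory bandwidth = (bus width in bytes) x (effective data rate).
     * The 8 Gbps-per-pin GDDR5 rate is assumed from the reference RX 580 spec. */
    int main(void)
    {
        const double bus_width_bits = 256.0;
        const double data_rate_gbps = 8.0;   /* effective transfers per pin, Gbps */
        const double bandwidth_gbs  = bus_width_bits / 8.0 * data_rate_gbps;

        printf("Peak bandwidth: %.0f GB/s\n", bandwidth_gbs);  /* 256 GB/s */
        return 0;
    }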
United States

Steve Ballmer's New Project: Find Out How the Government Spends Your Money (theverge.com) 251

Former Microsoft CEO Steve Ballmer isn't satisfied with owning the Los Angeles Clippers and teaching at Stanford and USC. On Tuesday, the billionaire announced USAFacts, his new startup that aims to improve political discourse by making government financial data easier to access. A small "army" of economists, professors and other professionals will be gathering and publishing data structured similarly to the 10-K filings companies issue each year -- expenses, revenues and key metrics pulled from dozens of government data sources and compiled into a single massive collection of tables. From a report on The Verge: The nonpartisan site traces $5.4 trillion in government spending under four categories derived from language in the US Constitution. Defense spending, for example, is categorized under the header "provide for the common defense," while education spending is under "secure the blessings of liberty to ourselves and our posterity." Spending allocation and revenue sources are each mapped out in blue and pink graphics, with detailed breakdowns along federal, state and local lines. Users can also search for specific datasets, such as airport revenue or crime rates, and the site includes a report of "risk factors" that could inhibit economic growth. The New York Times has the story on how this startup came to be.
Movies

Slashdot Asks: What's Your Favorite Sci-Fi Movie? 1222

Many say this is the golden age of science fiction cinema, and rightly so: every month we get a couple of movies that bend the rules of science to explore possibilities, sometimes making us seriously wonder whether the things we see on the big screen could actually be true. Thanks to advances in graphics and ever-increasing video resolution, we increasingly leave the theater with visually stunning memories. That said, there are plenty of movies made back in the day that are in no danger of being displaced by the reboot spree Hollywood is currently on. With readers suggesting this question every week, we think it's time we finally asked: what's your favorite science-fiction movie? Also, what are some sci-fi movies you really enjoyed that you think haven't received enough praise or even much acknowledgement?

Editor's note: the story has been moved up on the front page due to its popularity.
Hardware

Ask Slashdot: What Was Your First Home Computer? 857

We've recently seen stories about old computers and sysops resurrecting 1980s BBSes, but now an anonymous reader has a question for all Slashdot readers: Whenever I meet geeks, there's one question that always gets a reaction: Do you remember your first home computer? This usually provokes a flood of fond memories about primitive specs -- limited RAM, bad graphics, and early versions of long-since-abandoned operating systems. Now I'd like to pose the same question to Slashdot's readers.

Use the comments to share details about your own first home computer. Was it a back-to-school present from your parents? Did it come with a modem? Did you lovingly upgrade its hardware for years to come? Was it a Commodore 64 or a BeBox?

It seems like there should be some good stories, so leave your best answers in the comments. What was your first home computer?
Classic Games (Games)

Celebrating '21 Things We Miss About Old Computers' (denofgeek.com) 467

"Today, we look back at the classic era of home computing that existed alongside the dreariness of business computing and the heart-pounding noise and colour of the arcades," writes the site Den of Geek. An anonymous reader reports: The article remembers the days of dial-up modems, obscure computer magazines, and the forgotten phenomenon of computer clubs. ("There was a time when if you wanted to ask a question about something computer related, or see something in action, you'd have to venture outside and into another building to go and see it.") Gamers grappled with old school controllers, games distributed on cassette tapes, low-resolution graphics and the "playground piracy" of warez boards -- when they weren't playing the original side-scrolling platformers like Mario Bros and Donkey Kong at video arcades.

In a world where people published fanzines on 16-bit computers, shared demo programs, and even played text adventures, primitive hardware may have inspired future coders, since "Old computers typically presented you with a command prompt as soon as you switched them on, meaning that they were practically begging to be programmed on." Home computers "mesmerised us, educated us, and in many cases, bankrupted us," the article remembers -- until they were replaced by more powerful hardware. "You move on, but you never fully get over your first love," it concludes -- while also adding that "what came next was pretty amazing."

Does this bring back any memories for anybody -- or provoke any wistful nostalgia for a bygone era? Either way, I really liked the way that the article ended. "The most exciting chapter of all, my geeky friends? The future!"
Hardware

Nvidia Titan Xp Introduced as 'the World's Most Powerful Graphics Card' (pcgamer.com) 69

Nvidia has unveiled its new Titan, the Xp. It features 3,840 CUDA cores running at 1.6GHz and 12GB of GDDR5X memory. The card runs on Nvidia's Pascal architecture and comes with a suitably titanic price tag of $1,200. From a report: "They made 1080 Ti so fast that they need a new top-tier Titan," says PC Gamer hardware expert Jarred Walton. "It's the full GP102 chip, so just like we had GTX 780, the Titan, the 780 Ti and the Titan Black, we're getting the 1080, Titan X (Pascal), 1080 Ti, and Titan Xp."
Businesses

Apple To Develop Its Own GPU, UK Chip Designer Imagination Reveals In 'Bombshell' PR (anandtech.com) 148

From a report on AnandTech: In a bombshell of a press release issued this morning, Imagination has announced that Apple has informed its long-time GPU partner that it will be winding down its use of Imagination's IP. Specifically, Apple expects that it will no longer be using Imagination's IP in 15 to 24 months. Furthermore, the GPU design that replaces Imagination's designs will be, according to Imagination, "a separate, independent graphics design." In other words, Apple is developing its own GPU, and when that is ready, it will be dropping Imagination's GPU designs entirely. This alone would be big news, but the story doesn't stop there. As Apple's long-time GPU partner and the provider of the basis for all of Apple's SoCs going back to the very first iPhone, Imagination is also making a case to investors (and the public) that while Apple may be dropping Imagination's GPU designs for a custom design, Apple can't develop a new GPU in isolation -- that any GPU developed by the company would still infringe on some of Imagination's IP. As a result, the company is continuing to sit down with Apple to discuss alternative licensing arrangements, with the intent of defending its IP rights.
Emulation (Games)

Ask Slashdot: Can Linux Run a GPU-Computing Application Written For Windows? 117

dryriver writes: I have been told that Linux can run Windows software using Wine or perhaps a VM. What happens if that Windows software is a GPU-computing application -- accessing the GPU through HLSL/GLSL/CUDA/OpenCL or similar interfaces? Can Wine or other solutions run that software at a decent speed under Linux? Or is GPU-computing software written for the Windows platform unsuitable for use -- emulated or otherwise -- under Linux? This sounds like one of those cases where there's a theoretical answer and then your own real-world experiences. So leave your best answers in the comments. Can Linux run a GPU-computing application that's written for Windows?
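Whatever route you try, a sensible first step is confirming that native Linux actually exposes the GPU to the compute API in question; if OpenCL can't see the card natively, Wine or a VM certainly won't help. A minimal device-enumeration sketch (plain Linux C against the standard OpenCL headers, linked with -lOpenCL; this is our suggested sanity check, not something from the submitter's question):

    #include <stdio.h>
    #include <CL/cl.h>

    /* List OpenCL platforms and their GPU devices, to verify that a
     * GPU-compute driver is installed and visible before worrying
     * about Wine or virtualization layers. */
    int main(void)
    {
        cl_platform_id platforms[8];
        cl_uint num_platforms = 0;

        if (clGetPlatformIDs(8, platforms, &num_platforms) != CL_SUCCESS || num_platforms == 0) {
            fprintf(stderr, "No OpenCL platforms found\n");
            return 1;
        }

        for (cl_uint p = 0; p < num_platforms; p++) {
            cl_device_id devices[8];
            cl_uint num_devices = 0;
            char name[256] = {0};

            clGetPlatformInfo(platforms[p], CL_PLATFORM_NAME, sizeof(name), name, NULL);
            printf("Platform: %s\n", name);

            if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_GPU, 8, devices, &num_devices) != CL_SUCCESS)
                continue;  /* platform has no GPU devices */

            for (cl_uint d = 0; d < num_devices; d++) {
                clGetDeviceInfo(devices[d], CL_DEVICE_NAME, sizeof(name), name, NULL);
                printf("  GPU device: %s\n", name);
            }
        }
        return 0;
    }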
Data Storage

Next-Generation DDR5 RAM Will Double the Speed of DDR4 In 2018 (arstechnica.com) 77

An anonymous reader quotes a report from Ars Technica: You may have upgraded your computer to DDR4 only recently, or you may still be using DDR3, but in either case, nothing stays new forever. JEDEC, the organization in charge of defining new standards for computer memory, says that it will be demoing the next-generation DDR5 standard in June of this year and finalizing the standard sometime in 2018. DDR5 promises double the memory bandwidth and density of DDR4, and JEDEC says it will also be more power-efficient, though the organization didn't release any specific numbers or targets. Like DDR4 back when it was announced, it will still be several years before any of us have DDR5 RAM in our systems. That's partly because the memory controllers in processors and SoCs need to be updated to support DDR5, and these chips normally take two or three years to design from start to finish. DDR4 RAM was finalized in 2012, but it didn't begin to go mainstream until 2015, when consumer processors from Intel and others added support for it. DDR5 has no relation to GDDR5, a separate decade-old memory standard used for graphics cards and game consoles.
AMD

AMD Ryzen Game Patch Optimizations Show Significant Gains On Zen Architecture (hothardware.com) 121

MojoKid writes: AMD got the attention of PC performance enthusiasts everywhere with the recent launch of its Ryzen 7 series processors. The trio of 8-core chips competitively take on Intel's Core i7 series at the high end of the product stack. However, with the extra attention AMD garnered came significant scrutiny as well. With any entirely new platform architecture, there are bound to be a few performance anomalies -- as was the case with Ryzen's now-infamous lower "1080p gaming" performance. In a recent status update, AMD noted it was already working with developers to implement "simple changes" that improve a game engine's understanding of the AMD Zen core topology and would likely provide an additional performance uplift on Ryzen. Today, we have some early proof positive of that, as Oxide Games, in concert with AMD, released a patch for its game Ashes of the Singularity. Ashes has been a "poster child" game engine of sorts for AMD Radeon graphics over the years (especially with respect to DX12), and it was one that ironically showed some of the worst variations in Ryzen CPU performance versus Intel. With this new patch now public, however, AMD claims to have regained significant ground in benchmark results at all resolutions. In the 1080p benchmarks with powerful GPUs, a Ryzen 7 1800X shows an approximately 20 percent performance improvement with the latest version of Ashes, closing the gap significantly versus Intel. This appears to be an early sign that AMD can indeed work with game and other app developers to tune for the Ryzen architecture and wring out additional performance.
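The "core topology" at issue is Zen's arrangement of its eight cores as two four-core CCX clusters, each with its own L3 slice; keeping threads that share data within one CCX avoids the slower cross-CCX hop, which is the kind of awareness such patches reportedly add. As a rough illustration of how software can discover that layout, here is a minimal Linux-side sketch that groups logical CPUs by shared L3 cache (the sysfs paths are standard, but the example is ours and is not drawn from AMD's or Oxide's patch):

    #include <stdio.h>

    /* Group logical CPUs by shared L3 cache (one group per CCX on Zen).
     * Reads the standard sysfs cache topology; index3 is the L3 level
     * on typical x86 systems. */
    int main(void)
    {
        for (int cpu = 0; ; cpu++) {
            char path[128], shared[256];
            snprintf(path, sizeof(path),
                     "/sys/devices/system/cpu/cpu%d/cache/index3/shared_cpu_list", cpu);

            FILE *f = fopen(path, "r");
            if (!f)
                break;  /* no more CPUs (or no L3 topology exposed) */

            if (fgets(shared, sizeof(shared), f))
                printf("cpu%-3d shares L3 with: %s", cpu, shared);
            fclose(f);
        }
        return 0;
    }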
IBM

A 21st-Century Version Of OS/2 Warp May Be Released Soon (arcanoae.com) 232

dryriver writes: A company named Arca Noae is working on a new release of the x86 OS/2 operating system, codenamed "Blue Lion" and likely to be called ArcaOS 5 in its final release. Blue Lion aims to be a modern, 21st-century OS/2 Warp, with support for the latest hardware and networking standards, a modern accelerated graphics driver, support for new cryptographic security standards, full backward compatibility with legacy OS/2, DOS and Windows 3.1 applications, suitability for use in mission-critical applications, and also, it appears, the ability to run "ported Linux applications." Blue Lion, which appears to be in closed beta with March 31, 2017 cited as the target release date, will come with an up-to-date Firefox browser and Thunderbird mail client, Apache OpenOffice, other productivity tools, a new package manager, and a software update and support subscription to ensure system stability. It is unclear from the information provided whether Blue Lion will be able to run modern Windows applications.
