AMD

AMD Launches Higher Performance Radeon RX 580 and RX 570 Polaris Graphics Cards (hothardware.com) 93

Reader MojoKid writes: In preparation for the impending launch of its next-generation Vega GPU architecture, which will eventually sit at the top of the company's graphics product stack, AMD has unveiled a refresh of its mainstream graphics card line-up with more powerful Polaris-based GPUs. The new AMD Radeon RX 580 and RX 570 are built around AMD's Polaris 20 GPU, an updated revision of Polaris 10. The Radeon RX 580 features 36 Compute Units, for a total of 2,304 shader processors, boost/base GPU clocks of 1340MHz and 1257MHz, respectively, and 8GB of GDDR5 over a 256-bit interface. That works out to 6.17 TFLOPs of peak compute performance with up to 256GB/s of peak memory bandwidth. Though based on the same chip, the Radeon RX 570 has only 32 active CUs and 2,048 shader processors. Boost and base reference clocks are 1244MHz and 1168MHz, respectively, with 4GB of GDDR5 memory also connected over a 256-bit interface. At reference clocks, the peak compute performance of the Radeon RX 570 is 5.1 TFLOPs with 224GB/s of memory bandwidth. In the benchmarks, the AMD Radeon RX 580 clearly outpaced AMD's previous-gen Radeon RX 480, and was faster than an NVIDIA GeForce GTX 1060 Founders Edition card more often than not. It was more evenly matched against factory-overclocked OEM GeForce GTX 1060 cards, however. Expected retail price points are around $245 for the 8GB Radeon RX 580 and $175 for the 4GB Radeon RX 570, though more affordable options will also be available.
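Both headline figures fall out of simple arithmetic: peak FP32 throughput is shader count x 2 ops per clock (fused multiply-add) x boost clock, and peak bandwidth is bus width x effective per-pin data rate. A quick sketch in C, for illustration only (the 8Gbps and 7Gbps per-pin GDDR5 rates are inferred from the quoted bandwidth figures, not stated in the summary):

    #include <stdio.h>

    /* Peak FP32 throughput: shaders x 2 ops/clock (fused multiply-add) x boost clock */
    static double peak_tflops(int shaders, double boost_ghz) {
        return shaders * 2.0 * boost_ghz / 1000.0;  /* GFLOPs -> TFLOPs */
    }

    /* Peak bandwidth: bus width in bytes x effective per-pin data rate (Gbps) */
    static double peak_gbs(int bus_bits, double gbps_per_pin) {
        return bus_bits / 8.0 * gbps_per_pin;
    }

    int main(void) {
        printf("RX 580: %.2f TFLOPs, %.0f GB/s\n",
               peak_tflops(2304, 1.340), peak_gbs(256, 8.0));
        printf("RX 570: %.2f TFLOPs, %.0f GB/s\n",
               peak_tflops(2048, 1.244), peak_gbs(256, 7.0));
        return 0;
    }

This reproduces the 6.17/5.1 TFLOPs and 256/224 GB/s figures quoted above.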
United States

Steve Ballmer's New Project: Find Out How the Government Spends Your Money (theverge.com) 251

Former Microsoft CEO Steve Ballmer isn't satisfied with owning the Los Angeles Clippers and teaching at Stanford and USC. On Tuesday, the billionaire announced USAFacts, his new startup that aims to improve political discourse by making government financial data easier to access. A small "army" of economists, professors and other professionals will be digging into and publishing data structured similarly to the 10-K filings companies issue each year -- expenses, revenues and key metrics pulled from dozens of government data sources and compiled into a single massive collection of tables. From a report on The Verge: The nonpartisan site traces $5.4 trillion in government spending under four categories derived from language in the US Constitution. Defense spending, for example, is categorized under the header "provide for the common defense," while education spending falls under "secure the blessings of liberty to ourselves and our posterity." Spending allocation and revenue sources are each mapped out in blue and pink graphics, with detailed breakdowns along federal, state and local lines. Users can also search for specific datasets, such as airport revenue or crime rates, and the site includes a report of "risk factors" that could inhibit economic growth. The New York Times has the story of how this startup came to be.
Movies

Slashdot Asks: What's Your Favorite Sci-Fi Movie? 1222

Many say this is the golden age of science fiction cinema. And rightly so: every month brings a couple of movies that bend the rules of science to explore possibilities that sometimes make us seriously consider whether the things we see on the big screen could actually be true. Thanks to advances in graphics and ever-increasing video resolution, we're leaving theaters with more visually stunning memories than ever. That said, plenty of movies made back in the day are far from being displaced by the reboot spree Hollywood is currently embarked on. With readers suggesting this question every week, we think it's time we finally asked: what's your favorite science-fiction movie? Also, what are some other sci-fi movies that you really enjoyed but think haven't received enough praise, or even much acknowledgment?

Editor's note: the story has been moved up on the front page due to its popularity.
Hardware

Ask Slashdot: What Was Your First Home Computer? 857

We've recently seen stories about old computers and sysops resurrecting 1980s BBSes, but now an anonymous reader has a question for all Slashdot readers: Whenever I meet geeks, there's one question that always gets a reaction: do you remember your first home computer? This usually provokes a flood of fond memories about primitive specs -- limited RAM, bad graphics, and early versions of long-since-abandoned operating systems. Now I'd like to pose the same question to Slashdot's readers.

Use the comments to share details about your own first home computer. Was it a back-to-school present from your parents? Did it come with a modem? Did you lovingly upgrade its hardware for years to come? Was it a Commodore 64 or a BeBox?

It seems like there should be some good stories, so leave your best answers in the comments. What was your first home computer?
Classic Games (Games)

Celebrating '21 Things We Miss About Old Computers' (denofgeek.com) 467

"Today, we look back at the classic era of home computing that existed alongside the dreariness of business computing and the heart-pounding noise and colour of the arcades," writes the site Den of Geek. An anonymous reader reports: The article remembers the days of dial-up modems, obscure computer magazines, and the forgotten phenomenon of computer clubs. ("There was a time when if you wanted to ask a question about something computer related, or see something in action, you'd have to venture outside and into another building to go and see it.") Gamers grappled with old school controllers, games distributed on cassette tapes, low-resolution graphics and the "playground piracy" of warez boards -- when they weren't playing the original side-scrolling platformers like Mario Bros and Donkey Kong at video arcades.

In a world where people published fanzines on 16-bit computers, shared demo programs, and even played text adventures, primitive hardware may have inspired future coders, since "Old computers typically presented you with a command prompt as soon as you switched them on, meaning that they were practically begging to be programmed on." Home computers "mesmerised us, educated us, and in many cases, bankrupted us," the article remembers -- until they were replaced by more powerful hardware. "You move on, but you never fully get over your first love," it concludes -- while also adding that "what came next was pretty amazing."

Does this bring back any memories for anybody -- or provoke any wistful nostalgia for a bygone era? Either way, I really liked the way the article ended: "The most exciting chapter of all, my geeky friends? The future!"
Hardware

Nvidia Titan Xp Introduced as 'the World's Most Powerful Graphics Card' (pcgamer.com) 69

Nvidia has unveiled its new Titan, the Xp. It features 3,840 CUDA cores running at 1.6GHz and 12GB of GDDR5X memory. The card runs on Nvidia's Pascal architecture and comes with a suitably titanic price tag of $1,200. From a report: "They made 1080 Ti so fast that they need a new top-tier Titan," says PC Gamer hardware expert Jarred Walton. "It's the full GP102 chip, so just like we had GTX 780, the Titan, the 780 Ti and the Titan Black, we're getting the 1080, Titan X (Pascal), 1080 Ti, and Titan Xp."
Businesses

Apple To Develop Its Own GPU, UK Chip Designer Imagination Reveals In 'Bombshell' PR (anandtech.com) 148

From a report on AnandTech: In a bombshell of a press release issued this morning, Imagination has announced that Apple has informed their long-time GPU partner that they will be winding down their use of Imagination's IP. Specifically, Apple expects that they will no longer be using Imagination's IP in 15 to 24 months. Furthermore, the GPU design that replaces Imagination's designs will be, according to Imagination, "a separate, independent graphics design." In other words, Apple is developing their own GPU, and when that is ready, they will be dropping Imagination's GPU designs entirely. This alone would be big news; however, the story doesn't stop there. As Apple's long-time GPU partner and the provider of the basis of all of Apple's SoCs going back to the very first iPhone, Imagination is also making a case to investors (and the public) that while Apple may be dropping Imagination's GPU designs for a custom design, Apple can't develop a new GPU in isolation -- any GPU developed by the company would still infringe on some of Imagination's IP. As a result, the company is continuing to sit down with Apple to discuss alternative licensing arrangements, with the intent of defending their IP rights.
Emulation (Games)

Ask Slashdot: Can Linux Run a GPU-Computing Application Written For Windows? 117

dryriver writes: I have been told that Linux can run Windows software using Wine or perhaps a VM. What happens if that Windows software is a GPU-computing application -- accessing the GPU through HLSL/GLSL/CUDA/OpenCL or similar interfaces? Can Wine or other solutions run that software at a decent speed under Linux? Or is GPU-computing software written for the Windows platform unsuitable for use -- emulated or otherwise -- under Linux? This sounds like one of those cases where there's a theoretical answer and then there's your own real-world experience. So leave your best answers in the comments: can Linux run a GPU-computing application that's written for Windows?
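Whatever the execution route, a useful first diagnostic is confirming that the Linux host itself exposes the GPU to a compute API before blaming Wine or the VM. A minimal OpenCL enumeration sketch (assumes the OpenCL headers and a vendor ICD/driver are installed; build with gcc probe.c -lOpenCL):

    #include <stdio.h>
    #include <CL/cl.h>

    int main(void) {
        cl_platform_id platforms[8];
        cl_uint nplat = 0;
        if (clGetPlatformIDs(8, platforms, &nplat) != CL_SUCCESS || nplat == 0) {
            printf("No OpenCL platforms found -- driver/ICD missing?\n");
            return 1;
        }
        for (cl_uint i = 0; i < nplat; i++) {
            char name[256];
            clGetPlatformInfo(platforms[i], CL_PLATFORM_NAME, sizeof(name), name, NULL);

            cl_device_id devs[8];
            cl_uint ndev = 0;  /* stays 0 if no GPU devices on this platform */
            clGetDeviceIDs(platforms[i], CL_DEVICE_TYPE_GPU, 8, devs, &ndev);
            printf("Platform: %s (%u GPU device(s))\n", name, ndev);

            for (cl_uint j = 0; j < ndev; j++) {
                char dname[256];
                clGetDeviceInfo(devs[j], CL_DEVICE_NAME, sizeof(dname), dname, NULL);
                printf("  GPU: %s\n", dname);
            }
        }
        return 0;
    }

If this sees your GPU natively, Wine's thin OpenCL wrapper can generally forward a Windows application's compute calls to the host library at close to native speed; Direct3D/HLSL compute paths go through translation layers instead and vary far more.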
Data Storage

Next-Generation DDR5 RAM Will Double the Speed of DDR4 In 2018 (arstechnica.com) 77

An anonymous reader quotes a report from Ars Technica: You may have upgraded your computer to DDR4 only recently, or you may still be using DDR3; in either case, nothing stays new forever. JEDEC, the organization in charge of defining new standards for computer memory, says that it will be demoing the next-generation DDR5 standard in June of this year and finalizing the standard sometime in 2018. DDR5 promises double the memory bandwidth and density of DDR4, and JEDEC says it will also be more power-efficient, though the organization didn't release any specific numbers or targets. As with DDR4 back when it was announced, it will still be several years before any of us have DDR5 RAM in our systems. That's partly because the memory controllers in processors and SoCs need to be updated to support DDR5, and these chips normally take two or three years to design from start to finish. DDR4 RAM was finalized in 2012, but it didn't begin to go mainstream until 2015, when consumer processors from Intel and others added support for it. DDR5 has no relation to GDDR5, a separate decade-old memory standard used for graphics cards and game consoles.
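For scale: a DIMM's peak bandwidth is its transfer rate times its 64-bit (8-byte) bus. A toy comparison, illustration only (the DDR5-6400 grade is a hypothetical "doubled" part; JEDEC hasn't published final speed bins):

    #include <stdio.h>

    /* Peak DIMM bandwidth: transfer rate (MT/s) x 8 bytes per transfer (64-bit bus) */
    static double dimm_gbs(int mts) {
        return mts * 8.0 / 1000.0;  /* MB/s -> GB/s */
    }

    int main(void) {
        printf("DDR4-3200: %.1f GB/s per module\n", dimm_gbs(3200));
        /* Hypothetical doubled DDR5 grade -- final JEDEC speeds TBD */
        printf("DDR5-6400: %.1f GB/s per module\n", dimm_gbs(6400));
        return 0;
    }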
AMD

AMD Ryzen Game Patch Optimizations Show Significant Gains On Zen Architecture (hothardware.com) 121

MojoKid writes: AMD got the attention of PC performance enthusiasts everywhere with the recent launch of its Ryzen 7 series processors, a trio of 8-core chips that competitively take on Intel's Core i7 series at the high end of its product stack. With the extra attention AMD garnered, however, came significant scrutiny as well. With any entirely new platform architecture, there are bound to be a few performance anomalies -- as was the case with the now-infamous lower "1080p gaming" performance situation with Ryzen. In a recent status update, AMD noted it was already working with developers to implement "simple changes" that improve a game engine's understanding of the AMD Zen core topology and would likely provide an additional performance uplift with Ryzen. Today, we have some early proof positive of that, as Oxide Games, in concert with AMD, released a patch for its game Ashes Of The Singularity. Ashes has been a "poster child" game engine of sorts for AMD Radeon graphics over the years (especially with respect to DX12), and it was one that ironically showed some of the worst variations in Ryzen CPU performance versus Intel. With this new patch now public, however, AMD claims to have regained significant ground in benchmark results at all resolutions. In 1080p benchmarks with powerful GPUs, a Ryzen 7 1800X shows an approximate 20% performance improvement with the latest version of Ashes, closing the gap significantly versus Intel. This appears to be at least an early sign that AMD can indeed work with game and other app developers to tune for the Ryzen architecture and wring out additional performance.
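For context, the "simple changes" largely amount to topology-aware thread placement: Ryzen's 8-core parts are built from two 4-core CCX modules, and threads that share data pay a latency penalty when scheduled on opposite CCXes. A hedged Linux sketch of the idea (the CPU-to-CCX numbering here is an assumption; real code should query the actual topology, e.g. via hwloc or /proc/cpuinfo):

    #define _GNU_SOURCE
    #include <sched.h>
    #include <stdio.h>

    /* Pin the calling thread to one four-core CCX so threads that share
     * data avoid cross-CCX cache traffic. Assumes (hypothetically) that
     * logical CPUs 0-3 all live on CCX0 -- verify before relying on it. */
    static int pin_to_ccx0(void) {
        cpu_set_t set;
        CPU_ZERO(&set);
        for (int cpu = 0; cpu < 4; cpu++)
            CPU_SET(cpu, &set);
        return sched_setaffinity(0, sizeof(set), &set);  /* 0 = this thread */
    }

    int main(void) {
        if (pin_to_ccx0() != 0) {
            perror("sched_setaffinity");
            return 1;
        }
        printf("Worker pinned to CCX0's cores\n");
        return 0;
    }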
IBM

A 21st-Century Version Of OS/2 Warp May Be Released Soon (arcanoae.com) 232

dryriver writes: A company named Arca Noae is working on a new release of the x86 OS/2 operating system, code-named "Blue Lion" and likely to be called ArcaOS 5 in its final release. Blue Lion aims to be a modern, 21st-century OS/2 Warp, with support for the latest hardware and networking standards, a modern accelerated graphics driver, support for new cryptographic security standards, full backward compatibility with legacy OS/2, DOS and Windows 3.1 applications, suitability for use in mission-critical applications, and also, it appears, the ability to run "ported Linux applications." Blue Lion, which appears to be in closed beta with March 31, 2017 cited as the target release date, will come with an up-to-date Firefox browser and Thunderbird mail client, Apache OpenOffice, other productivity tools, a new package manager, and a software update and support subscription to ensure system stability. It is unclear from the information provided whether Blue Lion will be able to run modern Windows applications.
Government

After Healthcare Defeat, Can The Trump Administration Fix America's H-1B Visa Program? (bloomberg.com) 566

On Friday the Trump administration suffered a political setback when divisions in the president's party halted a move to repeal healthcare policies passed in 2010. But if Trump hopes to turn his attention to how America's H-1B visa program is affecting technology workers, "time is running out," writes Slashdot reader pteddy. Bloomberg reports: [T]he application deadline for the most controversial visa program is the first week of April, which means new rules have to be in place for that batch of applicants or another year's worth of visas will be handed out under the existing guidelines... There probably isn't enough time to pass legislation on such a contentious issue. But Trump could sign an executive order with some changes. The article points out that under the current system, one outsourcing firm was granted 6.5 times as many U.S. visas as Amazon. There's also an interesting map showing which countries' workers received the most H-1B visas in 2015 -- 69.4% went to workers from India, with another 10.5% going to China -- and a chart showing which positions are most in demand, indicating that two-thirds of the visa applications are for tech workers.
Patents

Apple Explores Using An iPhone, iPad To Power a Laptop (appleinsider.com) 76

According to the U.S. Patent and Trademark Office, Apple has filed a patent for an "Electronic accessory device." It describes a "thin" accessory that contains traditional laptop hardware like a large display, physical keyboard, GPU, ports and more -- all of which is powered by an iPhone or iPad. The device powering the hardware would fit into a slot built into the accessory. AppleInsider reports: While the accessory can take many forms, the document for the most part remains limited in scope to housings that mimic laptop form factors. In some embodiments, for example, the accessory includes a port shaped to accommodate a host iPhone or iPad. Located in the base portion, this slot might also incorporate a communications interface and a means of power transfer, perhaps Lightning or a Smart Connector. Alternatively, a host device might transfer data and commands to the accessory via Wi-Fi, Bluetooth or other wireless protocol. Onboard memory modules would further extend an iOS device's capabilities. Though the document fails to delve into details, accessory memory would presumably allow an iPhone or iPad to write and read app data. In other cases, a secondary operating system or firmware might be installed to imitate a laptop environment or store laptop-ready versions of iOS apps. In addition to crunching numbers, a host device might also double as a touch input device. For example, an iPhone positioned below the accessory's keyboard can serve as the unit's multitouch touchpad, complete with Force Touch input and haptic feedback. Coincidentally, the surface area of a 5.5-inch iPhone 7 Plus is very similar to that of the enlarged trackpad on Apple's new MacBook Pro models. Some embodiments also allow for the accessory to carry an internal GPU, helping a host device power the larger display or facilitate graphics rendering not possible on iPhone or iPad alone. Since the accessory is technically powered by iOS, its built-in display is touch-capable, an oft-requested feature for the Mac. Alternatively, certain embodiments have an iPad serving as the accessory's screen, with keyboard, memory, GPU and other operating guts located in the attached base portion. This latter design resembles a beefed-up version of Apple's Smart Case for iPad.
Government

US Federal Budget Proposal Cuts Science Funding (washingtonpost.com) 649

hey! writes: The U.S. Office of Management and Budget has released a budget "blueprint" which outlines substantial cuts in both basic research and applied technology funding. The proposal includes a whopping 18% reduction in National Institutes of Health medical research. NIH does get a new $500 million fund to track emerging infectious agents like Zika in the U.S., but loses its funding to monitor those agents overseas. The Department of Energy's research programs also get an 18% cut in research, potentially affecting basic physics research, high energy physics, fusion research, and supercomputing. Advanced Research Projects Agency (ARPA-E) gets the ax, as does the Advanced Technology Vehicle Manufacturing Program, which enabled Tesla to manufacture its Model S sedan. EPA loses all climate research funding, and about half the research funding targeted at human health impacts of pollution. The Energy Star program is eliminated; Superfund funding is drastically reduced. The Chesapeake Bay and Great Lakes cleanup programs are also eliminated, as is all screening of pesticides for endocrine disruption. In the Department of Commerce, Sea Grant is eliminated, along with all coastal zone research funding. Existing weather satellites GOES and JPSS continue funding, but JPSS-3 and -4 appear to be getting the ax. Support for transfer of federally funded research and technology to small and mid-sized manufacturers is eliminated. NASA gets a slight trim, and a new focus on deep space exploration paid for by an elimination of Earth Science programs. You can read more about this "blueprint" in Nature, Science, and the Washington Post, which broke the story. The Environmental Protection Agency, the State Department and Agriculture Department took the hardest hits, while the Defense Department, Department of Homeland Security, and Department of Veterans Affairs have seen their budgets grow.
Operating Systems

NetBSD 7.1 Released (netbsd.org) 45

New submitter fisted writes: The NetBSD Project is pleased to announce NetBSD 7.1, the first feature update of the NetBSD 7 release branch. It represents a selected subset of fixes deemed important for security or stability reasons, as well as new features and enhancements. Some highlights of the 7.1 release are:

-Support for Raspberry Pi Zero.
-Initial DRM/KMS support for NVIDIA graphics cards via nouveau (Disabled by default. Uncomment nouveau and nouveaufb in your kernel config to test).
-vioscsi, a driver for the Google Compute Engine disk.
-Linux compatibility improvements, allowing, e.g., the use of Adobe Flash Player 24.
-wm(4): C2000 KX and 2.5G support; Wake-on-LAN support; 82575 and newer SERDES-based systems now work.
-ODROID-C1 Ethernet now works.
-Numerous bug fixes and stability improvements.

NetBSD is free. All of the code is under non-restrictive licenses, and may be used without paying royalties to anyone. Free support services are available via our mailing lists and website. Commercial support is available from a variety of sources. More extensive information on NetBSD is available from http://www.NetBSD.org.
You can download NetBSD 7.1 from one of these mirror sites.
Firefox

Will WebAssembly Replace JavaScript? (medium.com) 235

On Tuesday, Firefox 52 became the first browser to support WebAssembly, a new standard "to enable near-native performance for web applications" without a plug-in by pre-compiling code into low-level, machine-ready instructions. Mozilla engineer Lin Clark sees this as an inflection point where the speed of browser-based applications increases dramatically. An anonymous reader quotes David Bryant, the head of platform engineering at Mozilla: This new standard will enable amazing video games and high-performance web apps for things like computer-aided design, video and image editing, and scientific visualization... Over time, many existing productivity apps (e.g. email, social networks, word processing) and JavaScript frameworks will likely use WebAssembly to significantly reduce load times while simultaneously improving runtime performance... developers can integrate WebAssembly libraries for CPU-intensive calculations (e.g. compression, face detection, physics) into existing web apps that use JavaScript for less intensive work... In some ways, WebAssembly changes what it means to be a web developer, as well as the fundamental abilities of the web.
Mozilla celebrated with a demo video of the high-resolution graphics of Zen Garden, and while right now WebAssembly supports compilation from C and C++ (plus some preliminary support for Rust), "We expect that, as WebAssembly continues to evolve, you'll also be able to use it with programming languages often used for mobile apps, like Java, Swift, and C#."
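For the C path, the toolchain today is Emscripten: mark a function for export, compile it to a .wasm module, and call it from the generated JavaScript glue. A minimal sketch (the file names and fib example are illustrative; the flags follow Emscripten's documented usage at the time of writing):

    #include <emscripten.h>

    /* A toy CPU-bound kernel exported to JavaScript.
     * Build (illustrative): emcc fib.c -O3 -s WASM=1 -o fib.js
     * Browser side: the generated glue exposes it as Module._fib(n). */
    EMSCRIPTEN_KEEPALIVE
    int fib(int n) {
        int a = 0, b = 1;
        while (n-- > 0) {
            int t = a + b;
            a = b;
            b = t;
        }
        return a;
    }

The division of labor Bryant describes falls out naturally: hot loops like this live in the compiled module, while the surrounding UI logic stays in ordinary JavaScript.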
Graphics

NVIDIA Lifts Veil On GeForce GTX 1080 Ti Performance Reviews, Which Show Faster Speeds Than Titan X (hothardware.com) 51

MojoKid writes from a report via HotHardware: NVIDIA is officially launching its most powerful gaming graphics card today, the GeForce GTX 1080 Ti. It was announced last week at the Game Developers Conference and pre-orders began shortly thereafter. The cards begin shipping today, and NVIDIA has lifted the veil on performance reviews. Though its memory complement and a few blocks within the GPU are reduced versus NVIDIA's previous top-end card, the Titan X, the GeForce GTX 1080 Ti makes up for its shortcomings with a combination of refinement and brute force: higher memory clocks based on new and improved Micron GDDR5X memory, faster core clocks and an improved cooler. For gamers, the good news is that the 1080 Ti retails for $699, versus $1,200 for the Titan X, and it is in fact faster, for the most part. Throughout a battery of game tests and benchmarks, regardless of the resolution or settings used, the GeForce GTX 1080 Ti performed on par with or slightly faster than the NVIDIA Titan X, and roughly 30-35% better than the standard GeForce GTX 1080 Founders Edition. Versus AMD's current flagship GPU, the Radeon R9 Fury X, there is no competition; the GeForce GTX 1080 Ti was nearly 2x faster than the Fury X in some cases.
Operating Systems

Dell Doubles Down On High-End Ubuntu Linux Laptops (zdnet.com) 128

Dell became the first major OEM to offer a laptop with Linux pre-installed in 2007. Ten years later, the company says it is more committed than ever to offering Linux-powered machines to users. From a report on ZDNet: The best known of these is the Dell XPS 13 developer edition, but it's not the only Linux laptop Dell offers. In a blog post, Barton George, senior principal engineer at Dell's Office of the CTO, announced "the next generation of our Ubuntu-based Precision mobile workstation line." All of these systems boast Ubuntu 16.04 long-term support (LTS), 7th-generation Intel Core or Intel Xeon processors, and Thunderbolt 3 (40Gbps over USB-C) ports. As the Xeon processor option shows, these are top-of-the-line laptops for professionals. It took longer than expected for Dell to get this new set of five Ubuntu-powered Precision mobile workstations out the door. The Precision 5520 and 3520 are now available. The 3520, the entry-level workstation, starts with a 2.5GHz quad-core Intel Core i5-7300HQ processor with Intel HD Graphics 630. From there, you can upgrade it all the way to a 3GHz Intel Xeon E3-1505M v6 processor with Nvidia Quadro M620 graphics.
Graphics

Ask Slashdot: Why Are There No Huge Leaps Forward In CPU/GPU Power? 474

dryriver writes: We all know that CPUs and GPUs and other electronic chips get a little faster with each generation produced. But one thing never seems to happen -- a CPU/GPU manufacturer suddenly announcing a next generation chip that is, say, 4-8 times faster than the fastest model they had 2 years ago. There are moderate leaps forward all the time, but seemingly never a HUGE leap forward due to, say, someone clever in R&D discovering a much faster way to process computing instructions. Is this because huge leaps forward in computing power are technically or physically impossible/improbable? Or is nobody in R&D looking for that huge leap forward, and rather focused on delivering a moderate leap forward every 2 years? Maybe striving for that "rare huge leap forward in computing power" is simply too expensive for chip manufacturers? Precisely what is the reason that there is never a next-gen CPU or GPU that is, say, advertised as being 16 times faster than the one that came 2 years before it due to some major breakthrough in chip engineering and manufacturing?
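Part of the answer is compounding: the "moderate" generational gains the question dismisses reach the 4-8x range within four or five generations. Purely illustrative arithmetic (the per-generation rates are assumptions, not measured data):

    #include <stdio.h>
    #include <math.h>

    /* Cumulative speedup from a steady per-generation gain.
     * Build: cc gains.c -lm */
    int main(void) {
        double gains[] = { 1.15, 1.30, 1.40 };  /* 15%, 30%, 40% per generation */
        for (int i = 0; i < 3; i++)
            printf("%.0f%%/gen -> %.1fx after 5 generations\n",
                   (gains[i] - 1.0) * 100.0, pow(gains[i], 5.0));
        return 0;
    }

At a steady 40% per generation, five generations compound to roughly 5.4x, which is why the industry can deliver large multiples over time without any single release ever looking like a giant leap.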
Graphics

NVIDIA Unveils Its $700 Top of the Line GeForce GTX 1080 Ti Graphics Card (hothardware.com) 151

MojoKid writes from a report via HotHardware: NVIDIA just lifted the veil on its latest monster graphics card for gamers -- the long-rumored GeForce GTX 1080 Ti -- at an event this evening in San Francisco during the Game Developers Conference (GDC). The card will sit at the top of NVIDIA's GeForce offering, alongside the Titan X and GeForce GTX 1080 in NVIDIA's Pascal-powered product stack, promising significant performance gains over the GTX 1080 and faster-than-Titan X performance at a much lower price of $699. The card's 12-billion-transistor NVIDIA GP102 GPU has 3,584 CUDA cores, which is actually the same as NVIDIA's Titan X. However, the GeForce GTX 1080 Ti has fewer ROP units at 88, versus 96 in the Titan X. The 1080 Ti will, however, come equipped with 11GB of premium GDDR5X memory from Micron clocked at 11,000MHz for an effective 11Gbps data rate. Peak compute throughput of the GeForce GTX 1080 Ti is slightly higher than the Titan X's due to the Ti's higher boost clock. Memory bandwidth over its narrower 352-bit GDDR5X memory interface is 484GB/s, also slightly higher than the Titan X's. NVIDIA also noted that peak overclocks on the core should hit 2GHz or higher with minimal coaxing. As a result, the GeForce GTX 1080 Ti will be faster than the Titan X out of the box, and faster still when overclocked.
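The quoted memory bandwidth checks out with the same width-times-rate arithmetic used for the Polaris cards earlier on this page; a one-line sanity check, for illustration:

    #include <stdio.h>

    int main(void) {
        /* 352-bit bus / 8 bits-per-byte x 11 Gbps effective data rate */
        printf("GTX 1080 Ti: %.0f GB/s\n", 352.0 / 8.0 * 11.0);  /* 484 GB/s */
        return 0;
    }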
