AMD

AMD Wants To Hear From GPU Resellers and Partners Bullied By Nvidia (forbes.com) 127

An anonymous reader quotes a report from Forbes: Nvidia may not be talking about its GeForce Partner Program, but AMD has gone from silent to proactive in less than 24 hours. Hours ago Scott Herkelman, Corporate VP and General Manager of AMD Radeon Gaming, addressed AMD resellers via Twitter, not only acknowledging the anti-competitive tactics Nvidia has leveraged against them, but inviting others to share their stories. The series of tweets coincides with an AMD sales event held in London this week. This was preceded by an impassioned blog post from Herkelman yesterday where he comes out swinging against Nvidia's GeForce Partner Program, and references other closed, proprietary technologies like G-Sync and GameWorks.

AMD's new mantra is "Freedom of Choice," a tagline clearly chosen to combat Nvidia's new program which is slowly taking gaming GPU brands from companies like MSI and Gigabyte, and locking them exclusively under the GeForce banner. The GeForce Partner Program also seems to threaten the business of board partners who are not aligned with the program. Here's what Herkelman -- himself a former GeForce marketing executive at Nvidia -- had to say on Twitter: "I wanted to personally thank all of our resellers who are attending our AMD sales event in London this week, it was a pleasure catching up with you and thank you for your support. Many of you told me how our competition tries to use funding and allocation to restrict or block [...] your ability to market and sell Radeon based products in the manner you and your customers desire. I want to let you know that your voices have been heard and that I welcome any others who have encountered similar experiences to reach out to me..."
The report adds that Kyle Bennett of HardOCP, the author who broke the original GPP story, "says that Nvidia is beginning a disinformation campaign against him, claiming that he was paid handsomely for publishing the story."
Government

Palantir Knows Everything About You (bloomberg.com) 111

Palantir, a data-mining company created by Peter Thiel, is helping government agencies track American citizens using tools honed in the War on Terror, Bloomberg reports. From the report: The company's engineers and products don't do any spying themselves; they're more like a spy's brain, collecting and analyzing information that's fed in from the hands, eyes, nose, and ears. The software combs through disparate data sources -- financial documents, airline reservations, cellphone records, social media postings -- and searches for connections that human analysts might miss. It then presents the linkages in colorful, easy-to-interpret graphics that look like spider webs.

[...] The U.S. Department of Health and Human Services uses Palantir to detect Medicare fraud. The FBI uses it in criminal probes. The Department of Homeland Security deploys it to screen air travelers and keep tabs on immigrants. Police and sheriff's departments in New York, New Orleans, Chicago, and Los Angeles have also used it, frequently ensnaring in the digital dragnet people who aren't suspected of committing any crime.

Operating Systems

'Fuchsia Is Not Linux': Google Publishes Documentation Explaining Their New OS (xda-developers.com) 245

An anonymous reader quotes a report from XDA Developers: You've probably seen mentions of the Fuchsia operating system here and there since it has been in development for almost 2 years. It's Google's not-so-secretive operating system which many speculate will eventually replace Android. We've seen it grow from a barely functional mock-up UI in an app form to a version that actually boots on existing hardware. We've seen how much importance Google places on the project as veteran Android project managers are starting to work on it. But after all of this time, we've never once had either an official announcement from Google about the project or any documentation about it -- all of the information thus far has come as a result of people digging into the source code.

Now, that appears to be changing as Google has published a documentation page called "The Book." The page aims to explain what Fuchsia, the "modular, capability-based operating system," is and is not. The most prominent text on that page is a large section explaining that Fuchsia is NOT Linux, in case that wasn't clear already. Above that are several readme pages explaining Fuchsia's file systems, boot sequence, core libraries, sandboxing, and more. The rest of the page has sections explaining what the Zircon micro-kernel is and how the framework, storage, networking, graphics, media, user interface, and more are implemented.

Graphics

Intel Reportedly Designing Arctic Sound Discrete GPU For Gaming, Pro Graphics (hothardware.com) 68

MojoKid shares a report from HotHardware: When AMD's former graphics boss Raja Koduri landed at Intel after taking a much-earned hiatus from the company, it was seen as a major coup for the Santa Clara chip outfit, one that seemed to signal that Intel might be aiming to compete in the discrete graphics card market. While nothing has been announced in that regard, some analysts are claiming that there will indeed be a gaming variant of Intel's upcoming discrete "Arctic Sound" GPU. According to reports, Intel originally planned to build Arctic Sound graphics chips mainly for video streaming chores and data center activities. However, claims are surfacing that the company has since decided to build out a gaming variant at the behest of Koduri, who wants to "enter the market with a bang." Certainly a gaming GPU that could compete with AMD and NVIDIA would accomplish that goal. Reportedly, Intel could pull together two different versions of Arctic Sound. One would be an integrated chip package, like the Core i7-8809G (Kaby Lake-G) but with Intel's own discrete graphics, as well as a standalone chip that will end up in traditional graphics cards. Likely both of those will have variants designed for gaming, just as AMD and NVIDIA build GPUs for professional use and gaming as well.
Media

Ask Slashdot: How Do You Stream/Capture Video? 155

A user writes: I am starting to look at capturing and streaming video, specifically video games in 4K at 60 frames per second. I have a Windows 10 box with a 6GB GTX 1060 GPU and a modern AMD octa-core CPU recording with Nvidia ShadowPlay. This works flawlessly, even in 4K at 60 fps. ShadowPlay produces MP4 files which play nice locally but seem to take a long time to upload to YouTube -- a 15-minute 4K 60fps video took almost three hours. Which tools are you fellow Slashdotters using to create, edit, and upload video in the most efficient manner?
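For a sense of scale, the three-hour upload is roughly what the arithmetic predicts once recording bitrate is compared to a typical home uplink. A back-of-the-envelope sketch -- the ~130 Mbps ShadowPlay 4K bitrate and the 10 Mbps uplink below are assumed figures, not numbers from the submission:

```python
# Rough estimate of why a 15-minute 4K60 recording uploads slowly.
# Assumed numbers: ShadowPlay's 4K60 bitrate (~130 Mbps) and a ~10 Mbps
# home uplink -- both are assumptions, not figures from the story.
RECORD_MBPS = 130          # assumed ShadowPlay 4K60 recording bitrate
UPLINK_MBPS = 10           # assumed upstream bandwidth
minutes = 15

file_gb = RECORD_MBPS * minutes * 60 / 8 / 1000        # megabits -> gigabytes
upload_hours = (file_gb * 8000 / UPLINK_MBPS) / 3600   # gigabytes -> hours

print(f"{file_gb:.1f} GB recorded, ~{upload_hours:.2f} h to upload")
```

Under those assumptions the file is about 14.6 GB and the upload takes a bit over three hours, which matches the submitter's experience; re-encoding to a lower bitrate before uploading is the usual workaround.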
Bitcoin

GPU Prices Soar as Bitcoin Miners Buy Up Hardware To Build Rigs (computerworld.com) 157

"Bitcoin and other cryptocurrency miners have created a dearth of mid-range and high-end GPU cards that are selling for twice as much as suggested retail," reports Computerworld. "The reason: miners are setting up server farms with the cards." Lucas123 writes: GPU prices have more than doubled in some cases... Some of the most popular GPUs can't even be found anymore as they've sold out due to demand. Meanwhile, some retailers are pushing back against bitcoin miners by showing favoritism to their traditional gamer customers, allowing them to purchase GPUs at manufacturer's suggested retail price. Earlier this year, NVIDIA asked retailers of its hardware to prioritize sales to gamers over cryptocurrency miners.
Graphics

Programmer Unveils OpenGL Bindings for Bash (opensource.com) 47

Slashdot reader silverdirk writes: Compiled languages have long provided access to the OpenGL API, and even most scripting languages have had OpenGL bindings for a decade or more. But, one significant language missing from the list is our old friend/nemesis Bash. But worry no longer! Now you can create your dazzling 3D visuals right from the comfort of your command line!
"You'll need a system with both Bash and OpenGL support to experience it firsthand," explains software engineer Michael Conrad, who created the first version 13 years ago as "the sixth in a series of 'Abuse of Technology' projects," after "having my technical sensibilities offended that someone had written a real-time video game in Perl.

"Back then, my primary language was C++, and I was studying OpenGL for video game purposes. I declared to my friends that the only thing worse would be if it had been 3D and written in Bash. Having said the idea out loud, it kept prodding me, and I eventually decided to give it a try to one-up the 'awfulness'..."
Anime

Animation Legend Isao Takahata, Co-founder of Studio Ghibli, Dies at 82 (nbcnews.com) 27

Isao Takahata, co-founder of the prestigious Japanese animation house Studio Ghibli, which stuck to a hand-drawn "manga" look in the face of digital filmmaking, has died. He was 82. From a report: Takahata started Ghibli with Oscar-winning animator Hayao Miyazaki in 1985, hoping to create Japan's Disney. He directed "Grave of the Fireflies," a tragic tale about wartime childhood, and produced some of the studio's films, including Miyazaki's 1984 "Nausicaa of the Valley of the Wind," which tells the horror of environmental disaster through a story about a princess. Takahata died Thursday of lung cancer at a Tokyo hospital, the studio said in a statement Friday.

He was fully aware of how the floating sumie-brush sketches of faint pastel in his works stood as a stylistic challenge to Hollywood's computer-graphics cartoons. In a 2015 interview with The Associated Press, Takahata talked about how Edo-era woodblock-print artists like Hokusai had the understanding of Western-style perspective and the use of light, but they purposely chose to depict reality with lines, and in a flat way, with minimal shading.
"Pom Poko," a movie released in 1994, is often considered Takahata's best work. The New York Times described it as "a comic allegory about battling packs of tanuki (Japanese raccoon dogs) joining forces to fight human real estate developers. It's earthy and rollicking in a way that his co-founder's films aren't." In a 2015 interview with Wired, Takahata was asked what he felt about people regarding him as the heart of Studio Ghibli: "Now you've both finished your final films, what are your feelings on Ghibli's legacy and reputation?" the interviewer asked. Takahata said, "I'm not sure I can respond in any meaningful way. What Hayao Miyazaki has built up is the greatest contribution. The existence of that thick trunk has allowed leaves to unfurl and flowers to bloom to become the fruitful tree that is Studio Ghibli."

Further reading: Isao Takahata's stark world of reality (The Japan Times).
Displays

Latest macOS Update Disables DisplayLink, Rendering Thousands of Monitors Dead (displaylink.com) 331

rh2600 writes: Four days ago, Apple's latest macOS 10.13.4 update broke DisplayLink protocol support (perhaps permanently), turning what may be hundreds of thousands of external monitors connected to MacBook Pros via DisplayLink into paperweights. Some days in, DisplayLink has yet to announce any solution, and most worryingly there are indications that this is a permanent change to macOS moving forward. Mac Rumors is reporting that "users of the popular Mac desktop extension app Duet Display are being advised not to update to macOS 10.13.4, due to 'critical bugs' that prevent the software from communicating with connected iOS devices used as extra displays." Users of other desktop extensions apps like Air Display and iDisplay are also reporting incompatibility with the latest version of macOS.
Intel

Intel Unveils New Coffee Lake 8th Gen Core Line-Up With First Core i9 Mobile CPU (hothardware.com) 73

MojoKid writes: Intel is announcing a big update to its processor families today, with new 8th Gen Coffee Lake-based Core chips for both mobile and desktop platforms. On the mobile side of the equation, the most interesting processors are no doubt Intel's new six-core Coffee Lake parts, starting with the Core i7-8750H. This processor comes with base/max single-core turbo boost clocks of 2.2GHz and 4.2GHz respectively, while the Core i7-8850H bumps those clocks to 2.6GHz and 4.3GHz respectively. Both processors have six cores (12 threads), a TDP of 45 watts and 9MB of shared Smart Cache. However, the new flagship processor is without question the Intel Core i9-8950HK, which is the first Core i9-branded mobile processor. It retains the 6/12 (core/thread) count of the lower-end parts, but features base and turbo clocks of 2.9GHz and 4.8GHz respectively. The chip also comes unlocked since it caters to gaming enthusiasts and bumps the amount of Smart Cache to 12MB. Intel is also announcing a number of lower-powered Coffee Lake-U series chips for thin and light notebooks, some of which have on-board Iris Plus integrated graphics with 128MB of on-chip eDRAM, along with some lower-powered six-core and quad-core desktop chips that support the company's Optane memory in Intel's new 300 series chipset platform.
Graphics

Ask Slashdot: Should CPU, GPU Name-Numbering Indicate Real World Performance? 184

dryriver writes: Anyone who has built a PC in recent years knows how confusing the letters and numbers that trail modern CPU and GPU names can be because they do not necessarily tell you how fast one electronic part is compared to another electronic part. A Zoomdaahl Core C-5 7780 is not necessarily faster than a Boomberg ElectronRipper V-6 6220 -- the number at the end, unlike a GFLOPS or TFLOPS number for example, tells you very little about the real-world performance of the part. It is not easy to create one unified, standardized performance benchmark that could change this. One part may be great for 3D gaming, a competing part may smoke the first part in a database server application, and a third part may compress 4K HEVC video 11% faster. So creating something like, say, a Standardized Real-World Application Performance Score (SRWAPS) and putting that score next to the part name, letters, or series number will probably never happen. A lot of competing companies would have to agree to a particular type of benchmark, make sure all benchmarking is done fairly and accurately, and so on and so forth.

But how are the average consumers just trying to buy the right home laptop or gaming PC for their kids supposed to cope with the "letters and numbers salad" that follows CPU, GPU and other computer part names? If you are computer literate, you can dive right into the different performance benchmarks for a certain part on a typical tech site that benchmarks parts. But what if you are "Computer Buyer Joe" or "Jane Average" and you just want to glean quickly which two products -- two budget priced laptops listed on Amazon.com for example -- have the better performance overall? Is there no way to create some kind of rough numeric indicator of real-world performance and put it into a product's specs for quick comparison?
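A composite score of the kind the submitter imagines is usually built as a geometric mean of per-workload speedups against a reference part -- the approach benchmark suites like SPEC take, since the geometric mean doesn't change depending on which part you pick as the baseline. A minimal sketch, with invented workload names and numbers:

```python
from math import prod

def composite_score(results, baseline):
    """Geometric mean of per-workload speedups vs. a reference part.

    Unlike an arithmetic mean of ratios, the geometric mean gives the
    same relative ranking regardless of which part is the baseline,
    which is why suites like SPEC use it for their headline numbers.
    """
    ratios = [results[w] / baseline[w] for w in baseline]
    return prod(ratios) ** (1 / len(ratios))

# Invented example workloads and scores (higher is better in each):
baseline = {"3d_game_fps": 60.0, "db_tps": 1000.0, "hevc_fps": 30.0}
part_a   = {"3d_game_fps": 90.0, "db_tps": 1100.0, "hevc_fps": 33.0}

score = composite_score(part_a, baseline)  # > 1.0 means faster overall
```

Even so, a single number hides exactly the trade-offs the submitter describes: the hypothetical part above is 50% faster at gaming but only 10% faster at the other two workloads, and the composite flattens that into one figure.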
Cloud

Move Over Moore's Law, Make Way For Huang's Law (ieee.org) 55

Tekla Perry writes: Are graphics processors a law unto themselves? Nvidia's Jensen Huang says a 25-times speedup over five years is evidence that they are. He calls this the 'supercharged law,' and says it's time to start counting advances on multiple fronts, including architecture, interconnects, memory technology, and algorithms, not just circuits on a chip.
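Assuming smooth compounding, the claimed 25-times speedup over five years works out to roughly 1.9x per year, versus about 1.41x per year for the classic doubling-every-two-years Moore's-law pace:

```python
# Implied annual improvement factors (simple compound-growth arithmetic).
huang_annual = 25 ** (1 / 5)    # 25x over 5 years
moore_annual = 2 ** (1 / 2)     # 2x every 2 years (classic Moore's-law pace)

print(f"Huang: ~{huang_annual:.2f}x/year, Moore: ~{moore_annual:.2f}x/year")
```

The gap is the substance of Huang's argument: the extra factor comes from architecture, interconnects, memory, and algorithms compounding on top of process scaling, not from transistors alone.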
Operating Systems

macOS 10.13.4 Enables Support for External GPU (engadget.com) 53

With the latest release of macOS High Sierra, Apple has officially delivered on a couple of items in the works since WWDC 2017 last June. macOS 10.13.4 brings the external GPU (eGPU) support that lets developers, VR users, gamers, and anyone else in need of some extra oomph plug in a more powerful graphics card via Thunderbolt 3. From a report: While that may not make every underpowered laptop VR ready, it certainly makes staying macOS-only more palatable for some power users. Another notable addition is Business Chat in Messages for users in the US. Twitter, Facebook, WhatsApp and others have tweaked their services to enable customer service linkups and now Apple has its version available on the desktop. With it, you can interact with business representatives or even make purchases. Other tweaks include waiting for the user to select login fields before autofilling password information in Safari, a smoke cloud wallpaper that had previously been restricted to the iMac Pro and a Safari shortcut for jumping to the rightmost tab by pressing Command-9. Further reading: Gizmodo.
AI

NVIDIA Unveils 2 Petaflop DGX-2 AI Supercomputer With 32GB Tesla V100, NVSwitch Tech 41

bigwophh writes from a report via HotHardware: NVIDIA CEO Jensen Huang took to the stage at GTC today to unveil a number of GPU-powered innovations for machine learning, including a new AI supercomputer and an updated version of the company's powerful Tesla V100 GPU that now sports a hefty 32GB of on-board HBM2 memory. A follow-on to last year's DGX-1 AI supercomputer, the new NVIDIA DGX-2 can be equipped with double the number of Tesla V100 processing modules for double the GPU horsepower. The DGX-2 can also have four times the available memory space, thanks to the updated Tesla V100's larger 32GB of memory. NVIDIA's new NVSwitch technology is a fully connected crossbar GPU interconnect fabric that allows NVIDIA's platform to scale up to 16 GPUs and utilize their memory space contiguously, where the previous DGX-1 NVIDIA platform was limited to 8 total GPU complexes and associated memory. NVIDIA claims NVSwitch is five times faster than the fastest PCI Express switch and offers an aggregate 2.4TB per second of bandwidth. A new Quadro card was also announced. Called the Quadro GV100, it too is being powered by Volta. The Quadro GV100 packs 32GB of memory and supports NVIDIA's recently announced RTX real-time ray tracing technology.
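The headline figures are consistent with straightforward multiplication -- a quick check using NVIDIA's published ~125 TFLOPS per-V100 tensor throughput (a figure from NVIDIA's V100 specifications, not from this story):

```python
# Sanity-checking the DGX-2 headline numbers with simple multiplication.
gpus = 16                       # max V100s in a DGX-2 via NVSwitch
v100_tensor_tflops = 125        # NVIDIA's quoted per-GPU tensor throughput
v100_hbm2_gb = 32               # the updated V100's on-board HBM2

total_pflops = gpus * v100_tensor_tflops / 1000   # "2 petaflop" claim
total_memory_gb = gpus * v100_hbm2_gb             # contiguous memory pool
```

Sixteen GPUs at 125 tensor TFLOPS each yields the advertised 2 petaflops, and 16 x 32GB gives the 512GB of memory that NVSwitch presents as one contiguous space -- four times the 128GB ceiling of an 8-GPU, 16GB-per-card DGX-1.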
Graphics

Ask Slashdot: How Did Real-Time Ray Tracing Become Possible With Today's Technology? 145

dryriver writes: There are occasions where multiple big tech manufacturers all announce the exact same innovation at the same time -- e.g. 4K UHD TVs. Everybody in broadcasting and audiovisual content creation knew that 4K/8K UHD and high dynamic range (HDR) were coming years in advance, and that all the big TV and screen manufacturers were preparing 4K UHD HDR product lines because FHD was beginning to bore consumers. It came as no surprise when everybody had a 4K UHD product announcement and demo ready at the same time. Something very unusual happened this year at GDC 2018 however. Multiple graphics and GPU companies, like Microsoft, Nvidia, and AMD, as well as game developers and game engine makers, all announced that real-time ray tracing is coming to their mass-market products, and by extension, to computer games, VR content and other realtime 3D applications.

Why is this odd? Because for many years any mention of 30+ FPS real-time ray tracing was thought to be utterly impossible with today's hardware technology. It was deemed far too computationally intensive for today's GPU technology and far too expensive for anything mass market. Gamers weren't screaming for the technology. Technologists didn't think it was doable at this point in time. Raster 3D graphics -- what we have in DirectX, OpenGL and game consoles today -- was very, very profitable and could easily have evolved further the way it has for another 7 to 8 years. And suddenly there it was: everybody announced at the same time that real-time ray tracing is not only technically possible, but also coming to your home gaming PC much sooner than anybody thought. Working tech demos were shown. What happened? How did real-time ray tracing, which only a few 3D graphics nerds and researchers in the field talked about until recently, suddenly become so technically possible, economically feasible, and so guaranteed-to-be-profitable that everybody announced this year that they are doing it?
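For readers wondering what the expensive inner loop actually is: at its core, a ray tracer solves an intersection test for every ray cast into the scene, millions of times per frame. A minimal ray-sphere test below is illustrative only -- production engines add acceleration structures, shading, and (in the GDC demos) AI denoising to make the ray budget affordable:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Distance along a normalized ray to the nearest sphere hit, or None.

    Solves |o + t*d - c|^2 = r^2, a quadratic in t; with a unit-length
    direction the leading coefficient is 1.
    """
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None                     # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2      # nearer of the two roots
    return t if t >= 0 else None        # ignore hits behind the origin

# Camera ray straight down -z toward a unit sphere 5 units away:
t = ray_sphere_hit((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0)  # -> 4.0
```

Multiply that by millions of pixels, several bounces per pixel, and dozens of rays per bounce for soft shadows and reflections, and it becomes clear why dedicated hardware support and denoising -- not raster-style tricks -- were the missing pieces.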
Graphics

A New Era For Linux's Low-level Graphics (collabora.com) 61

Slashdot reader mfilion writes: Over the past couple of years, Linux's low-level graphics infrastructure has undergone a quiet revolution. Since experimental core support for the atomic modesetting framework landed a couple of years ago, the DRM subsystem in the kernel has seen roughly 300,000 lines of code changed and 300,000 new lines added, when the new AMD driver (~2.5m lines) is excluded. Lately Weston has undergone the same revolution, albeit on a much smaller scale. Here, Daniel Stone, Graphics Lead at Collabora, puts the spotlight on the latest enhancements to Linux's low-level graphics infrastructure, including Atomic modesetting, Weston 4.0, and buffer modifiers.
AMD

Linux Mint Ditches AMD For Intel With New Mintbox Mini 2 (betanews.com) 46

An anonymous reader writes: Makers of the Mintbox, a diminutive desktop which runs Linux Mint (an Ubuntu-based OS), on Friday announced the Mintbox Mini 2. While the new model has several new aspects, the most significant is that the Linux Mint Team has switched from AMD to Intel (the original Mini used an A4-Micro 6400T). For $299, the Mintbox Mini 2 comes with a quad-core Intel Celeron J3455 processor, 4GB of RAM, and a 60GB SSD. For $50 more you can opt for the "Pro" model which doubles the RAM to 8GB and increases the SSD capacity to 120GB. Graphics are fairly anemic, as it uses integrated Intel HD 500, but come on -- you shouldn't expect to game with this thing. For video connectivity, you get both HDMI and Mini DisplayPort. Both can push 4K, and while the Mini DisplayPort can do 60Hz, the HDMI is limited to 30.
Graphics

NVIDIA RTX Technology To Usher In Real-Time Ray Tracing Holy Grail of Gaming Graphics (hothardware.com) 159

HotHardware writes: NVIDIA has been dabbling in real-time ray tracing for over a decade. However, the company just introduced NVIDIA RTX, which is its latest effort to deliver real-time ray tracing to game developers and content creators for implementation in actual game engines. Historically, the computational horsepower required for real-time ray tracing has been too great to be practical in actual games, but NVIDIA hopes to change that with its new Volta GPU architecture and the help of Microsoft's new DirectX Raytracing (DXR) API enhancements. Ray tracing is a method by which images are enhanced by tracing rays or paths of light as they bounce in and around an object (or objects) in a scene. Under optimum conditions, ray tracing delivers photorealistic imagery with shadows that are correctly cast; water effects that show proper reflections and coloring; and scenes that are cast with realistic lighting effects. NVIDIA RTX is a combination of software (the company's GameWorks SDK, now with ray tracing support) and next-generation GPU hardware. NVIDIA notes its Volta architecture has specific hardware support for real-time ray tracing, including offload via its Tensor core engines. To show what's possible with the technology, developers including Epic, 4A Games and Remedy Entertainment will be showcasing their own game engine demonstrations this week at the Game Developers Conference. NVIDIA expects the ramp to be slow at first, but believes most game developers will eventually adopt real-time ray tracing.
Businesses

How Amazon Became Corporate America's Nightmare (bloomberg.com) 243

Zorro shares a report from Bloomberg that details Amazon's rapid growth in the last three years: Amazon makes no sense. It's the most befuddling, illogically sprawling, and -- to a growing sea of competitors -- flat-out terrifying company in the world. It sells soap and produces televised soap operas. It sells complex computing horsepower to the U.S. government and will dispatch a courier to deliver cold medicine on Christmas Eve. It's the third-most-valuable company on Earth, with smaller annual profits than Southwest Airlines Co., which as of this writing ranks 426th. Chief Executive Officer Jeff Bezos is the world's richest person, his fortune built on labor conditions that critics say resemble a Dickens novel with robots, yet he has enough mainstream appeal to play himself in a Super Bowl commercial. Amazon was born in cyberspace, but it occupies warehouses, grocery stores, and other physical real estate equivalent to 90 Empire State Buildings, with a little left over. The company has grown so large and difficult to comprehend that it's worth taking stock of why and how it's left corporate America so thoroughly freaked out. Executives at the biggest U.S. companies mentioned Amazon thousands of times during investor calls last year, according to transcripts -- more than President Trump and almost as often as taxes. Other companies become verbs because of their products: to Google or to Xerox. Amazon became a verb because of the damage it can inflict on other companies. To be Amazoned means to have your business crushed because the company got into your industry. And fear of being Amazoned has become such a defining feature of commerce, it's easy to forget the phenomenon has arisen mostly in about three years.
Graphics

Vulkan Graphics is Coming To macOS and iOS, Will Enable Faster Games and Apps (anandtech.com) 94

The Khronos Group, a consortium of hardware and software companies, has announced that the Vulkan graphics technology is coming to Apple's platforms, allowing games and apps to run at faster performance levels on Macs and iOS devices. From a report: In collaboration with Valve, LunarG, and The Brenwill Workshop, this free open-source collection includes the full 1.0 release of the previously-commercial MoltenVK, a library for translating Vulkan API calls to Apple's Metal 1 and 2 calls, as well as LunarG's new Vulkan SDK for macOS. Funding the costs of open-sourcing, Valve has been utilizing these tools on their applications, noting performance gains over native OpenGL drivers with Vulkan DOTA 2 on macOS as a production-load example. Altogether, this forms the next step in Khronos' Vulkan Portability Initiative, which was first announced at GDC 2017 as their "3D Portability Initiative," and later refined as the "Vulkan Portability Initiative" last summer. Spurred by industry demand, Khronos is striving for a cross-platform API portability solution, where an appropriate subset of Vulkan can act as a 'meta-API'-esque layer to map to DirectX 12 and Metal; the holy grail being that developers can craft a single Vulkan portable application or engine that can be seamlessly deployed across Vulkan, DX12, and Metal supporting platforms.
