Desktops (Apple)

Apple Announces All-New Redesigned Mac Pro, Starting at $5,999 (theverge.com) 317

The long-awaited Mac Pro is here. From a report: The new Intel Xeon processor inside the Mac Pro will have up to 28 cores, with up to 300W of power and heavy-duty cooling, "so it can run unconstrained at full power at all times." System memory can be maxed out at an eyebrow-raising 1.5TB, says Apple. There are eight internal PCI Express slots, with four of them being double-wide. Two USB-C and two USB-A ports will grace the front of the system, which is at least one more USB-C port than you'll find on a majority of desktop PC systems and cases today. With this Mac Pro, Apple is launching a custom expansion module it calls an MPX Module. This is a giant quad-wide PCIe card that fits two graphics cards, has its own dedicated heatsink, and also has a Thunderbolt 3 connector on the bottom for extra bandwidth / power / display connectivity. Apple says you can spec that out with AMD's Radeon Pro Vega 2 or Radeon Pro Vega 2 Duo, the latter of which would get you four GPUs in total. The power supply of the new Mac Pro maxes out at 1.4kW. Three large fans sit at the front, just behind the new aluminum grille, blowing air across the system at a rate of 300 cubic feet per minute. It starts at $5,999.
AMD

Samsung and AMD Announce Multi-Year Strategic Graphics IP Licensing Deal For SLSI Mobile GPUs (anandtech.com) 17

Samsung and AMD announced today a new multi-year strategic partnership between the two companies, under which Samsung SLSI will license graphics IP from AMD for use in mobile GPUs. From a report: The announcement is a major disruptive move for the mobile graphics landscape, as it signifies that Samsung is going forward with the productization of its own in-house GPU architecture in future Exynos chipsets. Samsung is said to have started work on its own "S-GPU" at its research division around 2012, with the company handing the new IP over to a new division called "ACL," or Advanced Computing Lab, in San Jose, which has a joint charter with SARC (Samsung Austin R&D Center, where Samsung currently designs its custom mobile CPU and memory controller IP). With today's announced partnership, Samsung will license "custom graphics IP" from AMD. What this IP covers is a bit unclear from the press release, but we have some strong pointers on what it might be.

Samsung's own GPU architecture is already quite far along, having seen seven years of development and already being integrated into test silicon. Unless the deal was signed years ago and only publicly announced today, this suggests that the IP in question is a patent deal, rather than new architectural IP from AMD that Samsung would integrate into its own designs. Samsung's new GPU IP is the first from-scratch design in over a decade, in an industry whose very old incumbents hold massive patent pools. Thus today's announcement likely means that Samsung is buying a patent chest from AMD in order to protect itself from possible litigation by other industry players.

Security

Apple Still Has Problems With Stopping Synthetic Clicks (zdnet.com) 22

Synthetic events remain a big security hole for macOS in spite of Apple's recent efforts to prevent malicious applications from abusing this feature. From a report: Speaking at the second edition of the Objective by the Sea security conference that was held in Monaco over the weekend, Patrick Wardle, a well-known Apple security expert, has revealed a zero-day impacting Apple's macOS operating system, including the new version launched today. The zero-day is a bypass of the security protections that Apple has put in place to prevent unauthorized access to synthetic events. Synthetic events are a macOS mechanism that allows applications to automate mouse clicks and keyboard input. It was created for the sake of automation and can be used via either the Core Graphics framework or the AppleScript scripting language. [...]

For almost two years now, Wardle has been looking at Apple's countermeasures aimed at preventing the abuse of synthetic events. He previously demonstrated two methods [1, 2] of bypassing Apple's synthetic-event protections, which prompted Apple last year to block access to synthetic events by default. But over the weekend, Wardle disclosed a new way of bypassing these latest protections, once again. "It's the gift that keeps giving," Wardle told ZDNet via email. "And actually gets more and more valuable as Apple adds more protections (privacy and security mechanisms) that can be 'allowed' by a single synthetic click." The new technique is possible because of the Transparency, Consent, and Control (TCC) system. Wardle says the TCC contains a compatibility database in the form of a file named AllowApplications.plist. This file lists apps and app versions that are allowed to access various privacy and security features, including synthetic events.
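Apple's actual AllowApplications.plist schema is undocumented, but as a rough illustration of how such a compatibility allow-list can be read, here is a sketch using Python's standard plistlib. The bundle IDs, key names, and structure below are entirely hypothetical, not Apple's real format.

```python
import plistlib

# Hypothetical allow-list resembling a TCC compatibility database;
# the real AllowApplications.plist schema is undocumented and may differ.
raw = b"""<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
 "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<array>
  <dict>
    <key>BundleID</key><string>com.example.legacyapp</string>
    <key>AllowedServices</key>
    <array><string>SyntheticClicks</string></array>
  </dict>
</array>
</plist>
"""

entries = plistlib.loads(raw)
allowed = [e["BundleID"] for e in entries
           if "SyntheticClicks" in e.get("AllowedServices", [])]
print(allowed)  # bundle IDs permitted to post synthetic events
```

The security concern Wardle raises follows directly from this pattern: any app whose identifier appears in such a list is pre-authorized, so impersonating a listed app sidesteps the user-consent prompt.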

Open Source

NLNet Funds Development of a Libre RISC-V 3D CPU (crowdsupply.com) 75

The NLNet Foundation is a non-profit supporting privacy, security, and the "open internet". Now the group has approved funding for the hybrid Libre RISC-V CPU/VPU/GPU, which will "pay for full-time engineering work to be carried out over the next year, and to pay for bounty-style tasks."

Long-time Slashdot reader lkcl explains why that's significant: High security software is irrelevant if the hardware is fundamentally compromised, for example with the Intel spying backdoor co-processor known as the Management Engine. The Libre RISCV SoC was begun as a way for users to regain trust and ownership of the hardware that they legitimately purchase.

This processor will be the first of its kind: a commercial SoC designed to give users the hardware and software source code of the 3D GPU, video decoder, main processor, boot process, and the OS.

Shockingly, in the year 2019, whilst there are dozens of SoCs with full source code that are missing either a VPU or a GPU (such as the TI OMAP Series and Xilinx ZYNQ7000s), there does not exist a single commercial embedded SoC which has full source code for the bootloader, CPU, VPU and GPU. The iMX6 for example has etnaviv support for its GPU however the VPU is proprietary, and all of Rockchip and Allwinner's offerings use either MALI or PowerVR yet their VPUs have full source (reverse engineered in the case of Allwinner).

This processor, which will be a quad-core, dual-issue 800MHz RV64GC capable of running full GNU/Linux SMP OSes, with 720p video playback and embedded-level 25fps 3D performance in around 2.5 watts at 28nm, is designed to address that imbalance. Links and details are on the Libre RISC-V SoC wiki.

The real question is: why is this project the only one of its kind, and why has no well funded existing Fabless Semiconductor Company tried something like this before? The benefits to businesses of having full source code are already well-known.

Graphics

Ask Slashdot: Why Is 3D Technology Stagnating So Badly? 188

dryriver writes: If you had asked someone doing 3D graphics seriously back in 2000 what 3D technology would look like two decades later, in 2019, they might have said: "Most internet websites will have realtime 3D content embedded or will be completely in 3D. 3D Games will look as good as movies or reality. Everyone will have a cheap handheld 3D scanner to capture 3D models with. High-end VR headsets, gloves, bodysuits and haptics devices will be sold in electronics stores. Still and video cameras will be able to capture true holographic 3D images and video of the real world. TVs and broadcast TV content will be in holographic 3D. 3D stuff you create on a PC will be realtime -- no more waiting for images to slowly render thanks to really advanced new 3D hardware. 3D content creation software will be incredibly advanced and fast to work with in 2019. Many new types of 3D input devices will be available that make working in 3D a snap."

Except of course that in the real 2019, none of this has come true at all, and the entire 3D field has been stagnating very, very badly since around 2010. It almost seems like a small army of 3D technology geniuses pushed and pushed 3D software and hardware hard during the '80s, '90s, and 2000s, then retired or dropped off the face of the earth completely around 10 years ago. Why is this? Are consumers only interested in Facebook, YouTube, cartoony PlayStation graphics, and smartphones now? Are we never going to see another major 3D technology innovation push again?
Operating Systems

Dell Begins Pre-Installing Linux On Range of Precision Laptops (phoronix.com) 139

"While Linux-preloaded laptops have been available for years from smaller companies, and have represented a fraction of their own sales with the much-admired XPS 13 developer model, Dell now offers a range of Precision models pre-installed with Ubuntu Linux," writes Slashdot reader Freshly Exhumed. Phoronix reports: At the start of May Dell announced an Ubuntu Linux option for their entry-level ~$700 Precision laptop, while now they are closing out May by offering up Ubuntu 18.04 LTS on their higher-tier Precision laptop models. Ubuntu Linux has landed for the rest of Dell's current-generation Precision mobile workstation line-up, with support for the Precision 5540, 7540, and 7740. The Precision 5540 offers options of Xeon E or 9th Gen Core CPUs with up to 64GB of RAM and options for an NVIDIA Quadro T2000. The Precision 7540/7740, meanwhile, are more powerful mobile workstations supporting up to 128GB of ECC RAM and the latest generation of processors. The Precision 7740 model can also accommodate NVIDIA Quadro RTX 5000 series graphics. Additional details can be found via this blog post by Dell's Barton George.
Graphics

Intel Graphics Division Shares Wild Futuristic GPU Concept Cards (hothardware.com) 70

MojoKid writes: What do you think graphics cards will look like in the next decade and a half? Intel wanted to know that as well, so it commissioned designer Cristiano Siqueira to give us a taste of what graphics cards might look like in the year 2035. Siqueira, the talented designer who produced the first set of Intel Odyssey GPU renders not long ago, focused primarily on the fan/shroud designs and what innovations could be fostered in the coming years. He was tasked with thinking far beyond the design conventions, materials, and cooling technologies of current-gen graphics cards, and with envisioning new designs that could employ technologies and materials not even invented yet. One concept, called Gemini, shows an ionic cooling system that isn't too far beyond the realm of feasibility. Another design, called Prometheus, showcases a top edge-mounted display readout that could be employed fairly easily with flexible OLED display technology. Intel also launched a new Graphics Command Center driver package today, which offers more customization, better control of power and cooling, and one-click game optimization for Intel GPU-enabled systems.
Intel

Intel Boldly Claims Its 'Ice Lake' Integrated Graphics Are As Good as AMD's (pcworld.com) 147

While Intel is expected to detail its upcoming 10nm processor, Ice Lake, during its Tuesday keynote here at Computex, the company is already making one bold claim -- that Ice Lake's integrated Gen11 graphics engine is on par with or better than AMD's current Ryzen 7 graphics. From a report: It's a bold claim, and one that Ryan Shrout, a former journalist and now the chief performance strategist for Intel, said Intel doesn't make lightly. "I don't think we can overstate how important this is for us, to make this claim and this statement about the one area that people railed on us for in the mobile space," Shrout said shortly before Computex began. Though Intel actually supplies the largest number of integrated graphics chipsets in the PC space, it does so on the strength of its CPU performance (and also thanks to strong relationships with laptop makers). Historically, AMD has leveraged its Radeon "Vega" GPUs to attract buyers seeking a more powerful integrated graphics solution. But what Intel is trying to do now, with its Xe discrete graphics on the horizon, is let its GPUs stand on their own merits. Referencing a series of benchmarks and games from the 3DMark Sky Diver test to Fortnite to Overwatch, Intel claims performance that's 3 to 15 percent faster than the Ryzen 7. Intel's argument is based on a comparison of a U-series Ice Lake part at 25 watts versus a Ryzen 7 3700U, also at 25 watts.
AMD

AMD Unveils the 12-Core Ryzen 9 3900X, at Half the Price of Intel's Competing Core i9 9920X Chipset (techcrunch.com) 261

AMD CEO Lisa Su today unveiled news about its chips and graphics processors that will increase pressure on competitors Intel and Nvidia, both in terms of pricing and performance. From a report: All new third-generation Ryzen CPUs, the first 7-nanometer desktop chips, will go on sale on July 7. The showstopper of Su's keynote was the announcement of AMD's 12-core, 24-thread Ryzen 9 3900X chip, the flagship of its third-generation Ryzen family. It will retail starting at $499, half the price of Intel's competing Core i9-9920X, which is priced at $1,189 and up. The 3900X has a 4.6GHz boost speed and 70MB of total cache, and uses 105 watts of thermal design power (versus the i9-9920X's 165 watts), making it more efficient. AMD says that in a Blender demo against the Intel Core i9-9920X, the 3900X finished about 18 percent more quickly. Starting prices for other chips in the family are $199 for the 6-core, 12-thread 3600; $329 for the 8-core, 16-thread Ryzen 3700X (with a 4.4GHz boost, 36MB of total cache, and a 65W TDP); and $399 for the 8-core, 16-thread Ryzen 3800X (4.5GHz, 32MB cache, 105W).
Hardware

Nvidia Unveils RTX Studio For Desktop-Style Performance on Laptops (venturebeat.com) 47

Nvidia today unveiled the tech behind new RTX Studio laptops, which can provide desktop-level computing performance for laptop users. From a report: Aimed at creators, the machines are targeted at independent artists who are fueling growing fields like social media, digital advertising, and 3D development. Nvidia says these laptops can deliver up to seven times the performance of a MacBook Pro. There are 17 new laptop models from seven manufacturers, powered by a range of Nvidia GeForce and Quadro RTX graphics processing units (GPUs). They promise ultra-long battery life and stability when using the newly developed Nvidia Studio Drivers.

Laptops that meet the highest qualifications for creative capabilities will be badged RTX Studio. That will help creators easily identify the right hardware to meet their demands. These Quadro and GeForce RTX-based laptops are purpose-built for GPU-accelerated content creation. The laptops feature the new Quadro RTX 5000 mobile GPU and GeForce RTX 2080, 2070, and 2060 GPUs. Quadro RTX 5000-based laptops are the world's first with 16GB of graphics memory, enabling advanced multi-app creative workflows and use of large 3D models that previously were not possible while on the go, Nvidia said. [...] RTX Studio laptops will be available starting in June from top computer makers, including Acer, Asus, Dell, Gigabyte, HP, MSI, and Razer. Pricing starts at $1,600 and will vary based on partner designs, features, and region.

Microsoft

Sony and Microsoft Set Rivalry Aside For Streaming Alliance (nikkei.com) 33

Sony and Microsoft, bitter rivals in the video game console wars, will team up in on-demand gaming to better compete with newcomers like Google as the industry's main battlefield looks poised to shift to the cloud, news outlet Nikkei reported Thursday. From a report: Sony President and CEO Kenichiro Yoshida has signed a memorandum of understanding with Microsoft CEO Satya Nadella on a strategic tie-up. While details have yet to be hammered out, the partnership will center on artificial intelligence and the cloud. The latter category includes plans for joint development of cloud gaming technology. While this market is expected to grow as ultrafast fifth-generation wireless gains traction, such services require much processing power on the provider's end to deliver games with high-quality graphics and minimal lag. Sony and Microsoft plan to leverage the American computing behemoth's data centers for this purpose. The two companies, along with Nintendo, long dominated the gaming landscape. But the rise of mobile gaming has brought competition from such other players as China's Tencent Holdings, which publishes the mobile version of the wildly popular PlayerUnknown's Battlegrounds (PUBG). Press release: Sony and Microsoft to explore strategic partnership.
NES (Games)

28 Years Later, Hacker Fixes Rampant Slowdown of SNES' Gradius III (arstechnica.com) 58

Ars Technica's Kyle Orland reports that Brazilian ROM hacker Vitor Vilela has released a ROM patch for the hit arcade game Gradius III, creating a new, slowdown-free version of the game for play on SNES emulators and standard hardware. "In magazine screenshots, the game's huge, colorful sprites were a sight to behold, comparable to the 1989 arcade original," writes Orland. "In action, though, any scene with more than a handful of enemies would slow to a nearly unplayable crawl on the underpowered SNES hardware." From the report: The key to Vilela's efforts is the SA-1 chip, an enhancement co-processor that was found in some late-era SNES cartridges like Super Mario RPG and Kirby Super Star. Besides sporting a faster clock speed than the standard SNES CPU (up to 10.74MHz versus 3.58MHz), SA-1 also opens up faster mathematical functions, improved graphics manipulation, and parallel processing capabilities for SNES programmers.

The result, as is apparent in the comparison videos embedded here, is a version of Gradius III that Vilela says runs two to three times faster than the original. It also keeps its silky smooth frame rate no matter how many detailed, screen-filling sprites clutter the scene. That's even true in the game's notorious, bubble-filled Stage 2, which is transformed from a jittery slide show to an amazing showcase of the SNES' enhanced power. As if that wasn't enough, the patch even slashes the game's loading times, cutting a full 3.25 seconds from the notably slow startup animation.
Vilela notes that the lack of slowdown "makes it incredibly super difficult" and even suggests that "some arcade segments of the game do not look RTA (real-time action) viable with SA-1. But we shouldn't underestimate the human capabilities."
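The raw clock figures alone line up with the reported two-to-three-times speedup:

```python
sa1_mhz = 10.74   # SA-1 co-processor's maximum clock speed
snes_mhz = 3.58   # stock SNES CPU clock speed

# Clock speed alone gives roughly a 3x advantage, consistent with the
# observed "two to three times faster" result (real gains also depend
# on the SA-1's faster math and parallel execution).
ratio = sa1_mhz / snes_mhz
print(f"{ratio:.2f}x")
```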
Intel

Apple's Tim Cook and Luca Maestri on Intel (daringfireball.net) 174

Tim Cook and Luca Maestri's remarks on Apple's quarterly analyst call earlier this week: CEO Tim Cook: "For our Mac business overall, we faced some processor constraints in the March quarter, leading to a 5 percent revenue decline compared to last year. But we believe that our Mac revenue would have been up compared to last year without those constraints, and don't believe this challenge will have a significant impact on our Q3 results."

CFO Luca Maestri: "Next I'd like to talk about the Mac. Revenue was 5.5 billion compared to 5.8 billion a year ago, with the decline driven primarily by processor constraints on certain popular models."
Apple commentator John Gruber adds: "I asked an Apple source last fall why it took so long for Apple to release the new MacBook Air. Their one-word answer: 'Intel.' One of the big questions for next month's WWDC is whether this is the year Apple announces Macs with Apple's own ARM processors (and graphics?)."
Software

Blender Developers Find Old Linux Drivers Are Better Maintained Than Windows (phoronix.com) 151

Unsurprisingly, compared to the world of proprietary graphics drivers on Windows, where driver releases stop once support is retired, old open-source Linux OpenGL drivers are found to be better maintained. From a report: Blender developers working on shipping Blender 2.80 this July, the big update to this open-source 3D modeling software, today rolled out the Linux GPU requirements for the next release. The requirements themselves aren't too surprising: they cover NVIDIA GPUs released in the last ten years, AMD GCN for best support, and Intel Haswell graphics or newer. In the case of NVIDIA graphics, the company tends to do a good job maintaining its legacy driver branches. With AMD Radeon and Intel graphics, Blender developers acknowledge older hardware may work better on Linux.
AMD

AMD Gained Market Share For 6th Straight Quarter, CEO Says (venturebeat.com) 123

Advanced Micro Devices CEO Lisa Su said during her remarks on AMD's first quarter earnings conference call with analysts today that she was confident about the state of competition with rivals like Intel and Nvidia in processors and graphics chips. She also pointed out that the company gained market share in processors for the 6th straight quarter. From a report: AMD's revenue was $1.27 billion for the first quarter, down 23% from the same quarter a year ago. But Su noted that Ryzen and Epyc processor and datacenter graphics processing units (GPUs) revenue more than doubled year-over-year, helping expand the gross margin by 5 percentage points. If there was a lag in the quarter, it was due to softness in the graphics channel and lower semi-custom revenue (which includes game console chips). Su said AMD's unit shipments increased significantly and the company's new products drove a higher client average selling price (ASP).
Graphics

Ask Slashdot: Why Are 3D Games, VR/AR Still Rendered Using Polygons In 2019? 230

dryriver writes: A lot of people seem to believe that computers somehow need polygons, NURBS surfaces, voxels or point clouds "to be able to define and render 3D models to the screen at all." This isn't really true. All a computer needs to light, shade, and display a 3D model is the answer to the question "is there a surface point at coordinate XYZ or not." Many different mathematical structures or descriptors can be dreamed up to tell a computer whether there is indeed a 3D model surface point at coordinate XYZ or behind a given screen pixel XY. Polygons/triangles are a very old approach to 3D graphics, designed primarily to avoid overstressing the very limited CPU and RAM resources of the first computers capable of displaying raster 3D graphics. The brains who invented the technique back in the late 1960s probably figured that by the 1990s at the latest, their method would be replaced by something better and more clever. Yet here we are in 2019, buying pricey Nvidia, AMD, and other GPUs that are primarily polygon/triangle accelerators.

Why is this? Creating good-looking polygon models is still a slow, difficult, iterative, and money-intensive task in 2019. A good chunk of the $60 you pay for an AAA PC or console game covers the sheer amount of time, manpower, and effort required to make everything in a 15-hour-long game experience using unwieldy triangles and polygons. So why still use polygons at all? Why not dream up a completely new "there is a surface point here" technique that makes good 3D models easier to create and may render much, much faster than polygons/triangles on modern hardware to boot? Why use a 50-year-old approach to 3D graphics when new, better approaches could be pioneered?
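One well-known family of "is there a surface point at coordinate XYZ" descriptors that the submitter alludes to is the signed distance function (SDF), the basis of ray marching / sphere tracing renderers. A minimal sketch in Python, purely illustrative rather than how any particular engine works: a model is just a function that returns negative values inside the surface, zero on it, and positive outside, with no polygon mesh anywhere.

```python
import math

def sd_sphere(p, center, radius):
    """Signed distance from point p to a sphere's surface."""
    return math.dist(p, center) - radius

def sd_box(p, half_extents):
    """Signed distance from point p to an axis-aligned box at the origin."""
    q = [abs(c) - h for c, h in zip(p, half_extents)]
    outside = math.sqrt(sum(max(c, 0.0) ** 2 for c in q))
    inside = min(max(q[0], max(q[1], q[2])), 0.0)
    return outside + inside

def scene(p):
    """Union of two shapes: the nearest surface wins."""
    return min(sd_sphere(p, (1.5, 0.0, 0.0), 1.0),
               sd_box(p, (0.5, 0.5, 0.5)))

# "Is there a surface point at XYZ?" becomes "is the distance ~zero?"
print(scene((2.5, 0.0, 0.0)))   # 0.0  -> on the sphere's surface
print(scene((0.0, 0.0, 0.0)))   # -0.5 -> inside the box
```

A ray marcher renders such a scene by stepping each camera ray forward by the returned distance until it hits (near-)zero, so the model never needs to be triangulated at all.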
Hardware

NVIDIA Launches GeForce GTX 16 Series Turing Gaming Laptop GPUs (hothardware.com) 22

MojoKid writes: NVIDIA has launched a new family of more budget-friendly Turing graphics chips for gaming laptops, called the GeForce GTX 1650, GeForce GTX 1660, and GeForce GTX 1660 Ti. The new GPUs will power roughly 80 different OEM mainstream gaming notebook designs, starting in the $799 price range. Compared to a 4-year-old gaming laptop with a GeForce GTX 960M, NVIDIA says that a modern counterpart equipped with a GeForce GTX 1660 Ti can deliver 4x the performance in today's battle royale-style games like Apex Legends, Fortnite, and PUBG. As for the GeForce GTX 1650, NVIDIA is promising a 2.5x performance advantage compared to the GTX 950M and a 1.7x advantage compared to the previous-generation GTX 1050. Gamers should expect consistent 60 fps performance in the above-mentioned gaming titles at 1080p, though the company didn't specifically mention GTX 1660 vs 1060 performance comparisons. According to NVIDIA, every major OEM will be releasing GeForce GTX 16 Series laptops, including well-known brands like ASUS, Dell/Alienware, Acer, Hewlett-Packard, and Lenovo (among others).
PlayStation (Games)

Sony Cracks Down On Sexually Explicit Content In Games (engadget.com) 299

Slashdot reader xavdeman writes: Hot on the heels of its announcement of the specifications of the next PlayStation, Sony has revealed a new crackdown on explicit content. Citing "the rise of the #MeToo movement" and a concern of "legal and social action" in the USA, Sony has said it wants to address concerns about the depiction of women in video games playable on its platform as well as protect children's "sound growth and development." The new rules were reportedly already responsible for puritan cutscene alterations in the Western PS4 release of the multi-platform title Devil May Cry 5, where lens flares were used to cover up partial nudity.
Emulation (Games)

HD Emulation Mod Makes 'Mode 7' SNES Games Look Like New (arstechnica.com) 44

An anonymous reader quotes a report from Ars Technica: Gamers of a certain age probably remember being wowed by the quick, smooth scaling and rotation effects of the Super Nintendo's much-ballyhooed "Mode 7" graphics. Looking back, though, those gamers might also notice how chunky and pixelated those background transformations could end up looking, especially when viewed on today's high-end screens. Emulation to the rescue. A modder going by the handle DerKoun has released an "HD Mode 7" patch for the accuracy-focused SNES emulator bsnes. In their own words, the patch "performs Mode 7 transformations... at up to 4 times the horizontal and vertical resolution" of the original hardware.

The results, as you can see in the above gallery and the below YouTube video, are practically miraculous. Pieces of Mode 7 maps that used to be boxy smears of color far in the distance are now sharp, straight lines with distinct borders and distinguishable features. It's like looking at a brand-new game. Perhaps the most impressive thing about these effects is that they take place on original SNES ROM and graphics files; DerKoun has said that "no artwork has been modified" in the games since the project was just a proof of concept a month ago. That makes this project different from upscaling emulation efforts for the N64 and other retro consoles, which often require hand-drawn HD texture packs to make old art look good at higher resolutions.
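At heart, Mode 7 maps each screen pixel back into a background texture with an affine transform; rendering the same transform on a finer pixel grid is what yields the extra detail. A simplified sketch of the idea, assuming one matrix for the whole screen rather than the per-scanline register updates the real hardware uses:

```python
import math

def mode7_sample_coords(width, height, angle, scale, scroll=(0.0, 0.0)):
    """Map each screen pixel (x, y) back to background-layer coordinates
    via an affine transform (rotation + scale + scroll), as Mode 7 does.
    Simplified: the SNES can change the matrix on every scanline."""
    a, b = math.cos(angle) / scale, -math.sin(angle) / scale
    c, d = math.sin(angle) / scale, math.cos(angle) / scale
    cx, cy = width / 2, height / 2
    coords = {}
    for y in range(height):
        for x in range(width):
            dx, dy = x - cx, y - cy
            coords[(x, y)] = (a * dx + b * dy + scroll[0],
                              c * dx + d * dy + scroll[1])
    return coords

# Rendering at 4x resolution samples the SAME transform on a finer grid,
# so regions that collapsed into one chunky pixel get distinct texels.
low = mode7_sample_coords(8, 8, angle=0.0, scale=1.0)
high = mode7_sample_coords(32, 32, angle=0.0, scale=4.0)
print(len(high) / len(low))  # 16.0 -> 4x horizontal and 4x vertical samples
```

This is why no artwork needs to change: the original tiles already contain the detail, and the higher-resolution transform simply stops averaging it away.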

Hardware

Qualcomm's Snapdragon 665, 730, and 730G Target AI and Gaming (venturebeat.com) 13

Today at its annual AI Day conference in San Francisco, chipmaker Qualcomm revamped the midrange products in its system-on-chip portfolio with faster graphics, more power-efficient cores, and other silicon accouterments. From a report: The Snapdragon 670 gained a counterpart in the Snapdragon 665, and the Snapdragon 700 series has two new SKUs in the long-rumored Snapdragon 730 and a gaming-optimized variant dubbed Snapdragon 730G. "In the last several years, we've had a few different technologies that we've [explored]," said vice president of product management Kedar Kondap during a press briefing. "One is obviously [the] camera. Secondly, AI, and ... gaming ... [We've] focused on ... power, [making] sure we drive very high performance." The 11-nm Snapdragon 665 packs Kryo 260 cores and offers up to 20 percent power savings with the Adreno 610 GPU. The 8-nm Snapdragon 730 has Kryo 470 cores inside.
