Space

Help NASA Choose the Name For Its Next Mars Rover (nasa.gov) 80

Slashdot reader DevNull127 writes: NASA will launch a new rover to Mars this July — and 28,000 American schoolchildren wrote essays with suggestions for what NASA should name it.

NASA has now selected the top nine finalists, which they'll let the public vote on through Monday on a special web page where they're also displaying the schoolchildren's essays. "Scientists are tenacious," wrote one student who suggested the name Tenacity. "It is what keeps them thinking and experimenting... When scientists make mistakes they see what they did wrong and then try again.

"If they didn't have tenacity, Mars rovers wouldn't be a thing."

The new rover will also be carrying the names of 10,932,295 earthlings, etched onto a microchip.

Bloomberg points out that because Mars and Earth are unusually close in July and August -- a mere 39 million miles -- another rover will also be launched by the international ExoMars programme (led by the European Space Agency and the Russian Roscosmos State Corporation), while the United Arab Emirates will also try sending an orbiter to Mars, and China will deploy "an orbiter to circle the planet and a rover to land on it."
Desktops (Apple)

36 Years Ago Today, Steve Jobs Unveiled the First Macintosh (macrumors.com) 108

An anonymous reader quotes a report from MacRumors: On January 24, 1984, former Apple CEO Steve Jobs introduced the first Macintosh at Apple's annual shareholders meeting in Cupertino, California, debuting the new computer equipped with a 9-inch black-and-white display, an 8MHz Motorola 68000 processor, 128KB of RAM, a 3.5-inch floppy drive, and a price tag of $2,495. The now-iconic machine weighed in at a whopping 17 pounds and was advertised as offering a word processing program, a graphics package, and a mouse. At the time it was introduced, the Macintosh was seen as Apple's last chance to overcome IBM's domination of the personal computer market and remain a major player in the personal computer industry. Despite the high price at the time, equivalent to around $6,000 today, the Macintosh sold well, with Apple hitting 70,000 units sold by May 1984. The iconic "1984" Super Bowl ad that Apple invested in and debuted days before the Macintosh was unveiled may have helped bolster sales.
AMD

AMD Launches Navi-Based Radeon RX 5600XT To Battle GeForce RTX 2060 Under $300 (hothardware.com) 57

MojoKid writes: Today AMD launched its latest midrange graphics card based on the company's all-new Navi architecture. The AMD Radeon RX 5600 XT slots in under $300 ($279 MSRP) and is based on the same Navi 10 GPU as AMD's current high-end Radeon RX 5700 series cards. The Radeon RX 5600 XT is outfitted with 36 compute units, for a total of 2,304 stream processors, and is essentially a Radeon RX 5700-spec GPU with 2GB less GDDR6 memory (6GB total) and a narrower 192-bit interface, versus the Radeon RX 5700's 8GB, 256-bit config. HotHardware took a Sapphire Pulse Radeon RX 5600 XT around the benchmark track; this card has an on-board BIOS switch that toggles between performance and silent/quiet modes. In performance mode, the card has a 160W power target, a 14Gbps memory data rate, a Boost Clock of 1,750MHz, and a Game Clock of 1,615MHz. In silent/quiet mode, things are a bit more tame, with a 135W power target, 12Gbps memory, and 1,620MHz/1,460MHz Boost and Game Clocks, respectively. In the gaming benchmarks, the new Radeon RX 5600 XT is generally faster than NVIDIA's GeForce RTX 2060 overall, with the exception of a few more heavily NVIDIA-optimized titles and some VR tests. Though it lacks hardware-accelerated ray tracing, the new AMD Radeon RX 5600 XT comes in at $20 to $30 less than NVIDIA's closest competitor and offers similar if not better performance.
Wine

Wine 5.0 Released (bleepingcomputer.com) 60

An anonymous reader quotes a report from BleepingComputer: Wine 5.0 has been released today, containing over 7,400 bug fixes and numerous audio and graphics improvements that should increase gaming performance on Linux. With this release, WineHQ hopes to resolve many long-standing compatibility issues, with the main improvements being:

-Builtin modules in PE format: To make games think Wine is a real Windows environment, most Wine 5.0 modules have been converted to the PE format rather than ELF binaries. The hope is that this will keep copy-protection and anti-cheat programs from flagging games running under Wine as modified.
-Multi-monitor support: Multiple display adapters and multi-monitor configurations are now supported under Wine.
-XAudio2 reimplementation: XAudio2 libraries have been added back to Wine and will use the FAudio library for better compatibility.
-Vulkan 1.1 support: "The Vulkan driver supports up to version 1.1.126 of the Vulkan spec."
Here are the release notes, download locations for the binary packages (when available) and source.
Intel

Intel's First Discrete GPU is Built For Developers (engadget.com) 50

At its CES 2020 keynote, Intel showed off its upcoming Xe discrete graphics chip and today, we're seeing exactly how that's going to be implemented. From a report: First off, Intel unveiled a standalone DG1 "software development vehicle" card that will allow developers to optimize apps for the new graphics system. It didn't reveal any performance details for the card, but did show it running the Warframe game. It also noted that it's now "sampling to ISVs (independent software vendors) worldwide... enabling developers to optimize for Xe." As far as we know right now, Intel's discrete graphics will be chips (not cards) installed together with the CPUs on a single package. However, it's interesting to see Intel graphics in the form of a standalone PCIe card, even one that will never be sold to consumers.
AMD

AMD Unveils Ryzen 4000 Mobile CPUs Claiming Big Gains, 64-Core Threadripper (hothardware.com) 71

MojoKid writes: Yesterday, AMD launched its new Ryzen 4000 Series mobile processors for laptops at CES 2020, along with a monstrous 64-core/128-thread third-generation Ryzen Threadripper workstation desktop CPU. On the graphics front, the oft-leaked Radeon RX 5600 XT, which targets 1080p gamers in the sweet spot of the GPU market, was also made official. In CPU news, AMD claims Ryzen 4000 series mobile processors offer 20% lower SoC power, 2X perf-per-watt, 5X faster power state switching, and significantly improved iGPU performance versus its previous-gen mobile Ryzen 3000 products. AMD's U-Series flagship, the Ryzen 7 4800U, is an 8-core/16-thread processor with a max turbo frequency of 4.2GHz and an integrated eight-core Vega-derived GPU.

Along with architectural enhancements and the frequency benefits of producing the chips at 7nm, AMD is underscoring up to 59% improved performance per graphics core. AMD is also claiming superior single-thread CPU performance versus current Intel processors and significantly better multi-threaded performance. The initial Ryzen 4000 U-Series line-up consists of five processors, starting with the 4-core/4-thread Ryzen 5 4300U and topping off with the aforementioned Ryzen 7 4800U. On the other end of the spectrum, AMD revealed some new information regarding its 64-core/128-thread Ryzen Threadripper 3990X processor. The beast of a chip will have a base clock of 2.9GHz and a boost clock of 4.3GHz with a whopping 288MB of cache. The chip will drop into existing TRX40 motherboards and be available on February 7th for $3,990. AMD showcased the chip against a dual-socket Intel Xeon Platinum system in the V-Ray 3D rendering benchmark, beating the Xeon system by almost 30 minutes on a roughly 90-minute workload, even though the Intel system retails for around $20K.

AI

MIT's New Tool Predicts How Fast a Chip Can Run Your Code (thenextweb.com) 13

Folks at the Massachusetts Institute of Technology (MIT) have developed a new machine learning-based tool that will tell you how fast code can run on various chips, helping developers tune their applications for specific processor architectures. From a report: Traditionally, developers gauged a chip's performance using the performance models built into compilers, simulating the execution of basic blocks -- straight-line sequences of machine-level instructions with no branches -- but those performance models are rarely validated against real-life processor behavior. MIT researchers developed an AI model called Ithemal by training it to predict how fast a chip can run previously unseen basic blocks. It is backed by a database called BHive containing 300,000 basic blocks drawn from specialized fields such as machine learning, cryptography, and graphics. The team presented a paper [PDF] at the NeurIPS conference in December describing a new technique for measuring code performance on various processors. The paper also describes Vemal, a new automatic vectorization algorithm that can be used to generate compiler optimizations.
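
To make the "basic block" idea concrete, here is a minimal, hypothetical sketch of a learned throughput predictor in the spirit described above: tokenize a branch-free run of machine instructions, feed it to a small sequence model, and regress to a predicted cycle count. This is not Ithemal's actual architecture or code; every name, layer size, and instruction below is invented for illustration.

```python
# Hypothetical sketch of a learned basic-block throughput predictor.
# Not Ithemal's code; vocabulary, sizes, and instructions are invented.
import torch
import torch.nn as nn

# A basic block: straight-line machine instructions, no branches in or out.
basic_block = [
    "mov rax, [rbx+8]",
    "add rax, rcx",
    "imul rdx, rax",
    "mov [rbx+16], rdx",
]

vocab = {}
def encode(block):
    """Toy tokenizer: one integer id per whitespace-separated token."""
    ids = []
    for insn in block:
        for tok in insn.replace(",", " ").split():
            ids.append(vocab.setdefault(tok, len(vocab) + 1))
    return torch.tensor(ids, dtype=torch.long).unsqueeze(0)  # shape (1, seq_len)

class ThroughputPredictor(nn.Module):
    """Maps a tokenized basic block to a predicted cycle count (regression)."""
    def __init__(self, vocab_size=1024, embed_dim=64, hidden=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.lstm = nn.LSTM(embed_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, token_ids):
        x = self.embed(token_ids)
        _, (h, _) = self.lstm(x)              # final hidden state summarizes the block
        return self.head(h[-1]).squeeze(-1)   # predicted cycles

model = ThroughputPredictor()
# Training would regress predictions against measured cycle counts for
# hundreds of thousands of blocks (the role a dataset like BHive plays).
print(float(model(encode(basic_block))))
```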
Graphics

Apple Reunites With iPhone Graphics Chip Partner To License Technology (theverge.com) 28

Apple will once again license technology from Imagination Technologies, the chip designer that used to provide graphics processors for the iPhone and iPad, the UK-based company announced today. The Verge reports: In a short statement posted on its website, Imagination said that it had entered into a multiyear license agreement with Apple, under which the Cupertino, California-based firm will have access to "a wider range of Imagination's intellectual property in exchange for license fees." Apple announced its split from Imagination back in April 2017 when it said that it would start designing its own graphics chips, and it would stop licensing the company's technology within two years. After the split was announced, Imagination expressed skepticism that Apple could design its own chips "without violating Imagination's patents, intellectual property, and confidential information."
China

One-Quarter of World's Pigs Died In a Year Due To Swine Fever In China (nytimes.com) 104

An anonymous reader quotes a report from The New York Times: The [African swine fever disease] was first reported in Shenyang, Liaoning Province, in early August 2018. By the end of August 2019, the entire pig population of China had dropped by about 40 percent. China accounted for more than half of the global pig population in 2018, and the epidemic there alone has killed nearly one-quarter of all the world's pigs. By late September, the disease had caused economic losses of one trillion yuan (about $141 billion), according to Li Defa, dean of the College of Animal Science and Technology at China Agricultural University in Beijing. Qiu Huaji, a leading Chinese expert on porcine infectious diseases, has said that African swine fever has been no less devastating "than a war" -- in terms of "its effects on the national interest and people's livelihoods and its political, economic and social impact."

Much like severe acute respiratory syndrome, or SARS, exposed the shortcomings of China's public health system when it became an epidemic in 2002-3, swine fever today exposes the weaknesses of the country's animal-disease prevention and control. But it also reveals something much more fundamental: namely, the perverse effects that even sound regulations can have when they are deployed within a system of governance as unsound as China's. According to Yu Kangzhen, a deputy minister of agriculture, the localities that struggled to control the spread of African swine fever were also those that lacked staff, funding or other resources for animal-epidemic prevention. Yet that alone cannot explain the breadth of the epidemic or the speed with which it swept across China...

AI

Researchers Detail AI that De-hazes and Colorizes Underwater Photos (venturebeat.com) 25

Kyle Wiggers, writing for VentureBeat: Ever notice that underwater images tend to be blurry and somewhat distorted? That's because phenomena like light attenuation and back-scattering adversely affect visibility. To remedy this, researchers at Harbin Engineering University in China devised a machine learning algorithm that generates realistic underwater images, along with a second algorithm that trains on those images to both restore natural color and reduce haze. They say that their approach qualitatively and quantitatively matches the state of the art, and that it's able to process upwards of 125 frames per second running on a single graphics card. The team notes that most underwater image enhancement algorithms (such as those that adjust white balance) aren't based on physical imaging models, making them poorly suited to the task. By contrast, this approach taps a generative adversarial network (GAN) -- an AI model consisting of a generator that attempts to fool a discriminator into classifying synthetic samples as real-world samples -- to produce a set of images of specific survey sites that are fed into a second algorithm, called U-Net.
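
As a rough illustration of the two-stage pipeline summarized above (a GAN that synthesizes realistic underwater-style degradations, and a U-Net-style network trained to undo them), here is a minimal PyTorch sketch. It is a toy with invented layer sizes and names, not the Harbin Engineering University researchers' model, and training is only described in comments.

```python
# Toy sketch: a GAN-style generator degrades clean images into "underwater"
# ones; a tiny U-Net-like network learns to restore them. Invented layers.
import torch
import torch.nn as nn

class Degrader(nn.Module):
    """Generator: turns a clean image into a hazy, color-shifted one."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1), nn.Sigmoid(),
        )
    def forward(self, clean):
        return self.net(clean)

class Discriminator(nn.Module):
    """Tries to tell real underwater photos from synthesized ones."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(32, 1, 4, stride=2, padding=1),
        )
    def forward(self, img):
        return self.net(img).mean(dim=(1, 2, 3))  # one realism score per image

class MiniUNet(nn.Module):
    """Tiny encoder-decoder with a skip connection; restores the clean image."""
    def __init__(self):
        super().__init__()
        self.down = nn.Sequential(nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU())
        self.up = nn.Sequential(nn.ConvTranspose2d(32, 32, 4, stride=2, padding=1), nn.ReLU())
        self.out = nn.Conv2d(32 + 3, 3, 3, padding=1)  # skip: concatenate the input back in
    def forward(self, hazy):
        feat = self.up(self.down(hazy))
        return torch.sigmoid(self.out(torch.cat([feat, hazy], dim=1)))

# Training would alternate: (1) fit Degrader/Discriminator adversarially on real
# underwater photos, (2) train MiniUNet on (degraded, clean) pairs with a
# reconstruction loss. Only a forward pass is shown here.
clean = torch.rand(1, 3, 64, 64)
hazy = Degrader()(clean)
restored = MiniUNet()(hazy)
print(hazy.shape, restored.shape)
```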
Privacy

Ask Slashdot: What Will the 2020s Bring Us? 207

dryriver writes: The 2010s were not necessarily the greatest decade to live through. AAA computer games were not only DRM'd and internet-tethered to death but became increasingly formulaic and pay-to-win driven, and poor-quality console ports pissed off PC gamers. Forced software subscriptions for major software products you could previously buy outright became a thing. Personal privacy went out the window in ways too numerous to list, with lawmakers failing on many levels to regulate the tech, data-mining and internet advertising companies in any meaningful way. Severe security vulnerabilities were found in hundreds of different tech products, from Intel CPUs to baby monitors and internet-connected doorbells. Thousands of tech products shipped with integrated microphones, cameras, and internet connectivity that couldn't be switched off with an actual hardware switch. Many electronics products became harder or impossible to repair yourself. Printed manuals shipping with tech products became almost non-existent. Hackers, scammers, ransomware operators and identity thieves caused more mayhem than ever before. Troll farms, click farms and fake news factories damaged the integrity of the internet as an information source. Tech companies and media companies became afraid of pissing off the Chinese government.

Windows turned into a big piece of spyware. Intel couldn't be bothered to innovate until AMD Ryzen came along. Nvidia somehow took a full decade to make really basic realtime raytracing happen, even though smaller GPU maker Imagination had done it years earlier with a fraction of the budget, and in a mobile GPU to boot. Top-of-the-line smartphones became seriously expensive. Censorship and shadow banning on the formerly more open internet became a thing. Easily triggered people trying to muzzle other people on social media became a thing. The quality of popular music and music videos went steadily downhill. Star Wars went to shit after Disney bought it, as did the Star Trek films. And mainstream cinema turned into an endless fest of VFX-heavy comic book movies, remakes/reboots, and horror flicks. In many ways, television was the biggest winner of the 2010s, with many new TV shows with film-like production values being made. The second winner may be computer hardware, which delivered more storage/memory/performance per dollar than ever before.

To the question: What, dear Slashdotters, will the 2020s bring us? Will things get better in tech and other things relevant to nerds, or will they get worse?
Hardware

Atari's Home Computers Turn 40 (fastcompany.com) 86

harrymcc writes: Atari's first home computers, the 400 and 800, were announced at Winter CES in January 1979. But they didn't ship until late in the year -- so over at Fast Company, Benj Edwards has marked their 40th anniversary with a look at their rise and fall. Though Atari ultimately had trouble competing with Apple and other entrenched PC makers, it produced machines with dazzling graphics and sound and the best games of their era, making its computers landmarks from both a technological and cultural standpoint.
Television

The BBC's 1992 TV Show About VR, 3D TVs With Glasses, and Holographic 3D Screens (youtu.be) 54

dryriver writes: 27 years ago, the BBC's "Tomorrow's World" show broadcast this little gem of a program [currently available on YouTube]. After showing old red-cyan anaglyph movies, Victorian stereoscopes, lenticular-printed holograms and a monochrome laser hologram projected into a sheet of glass, the presenter shows off a stereoscopic 3D CRT computer display with active shutter glasses. The program then takes us to a laboratory at the Massachusetts Institute of Technology, where a supercomputer is feeding 3D wireframe graphics into the world's first glasses-free holographic 3D display prototype, which uses a tellurium dioxide crystal. One of the researchers at the lab predicts that "years from now, advances in LCD technology may make this kind of display cheap enough to use in the home."

A presenter then shows a bulky plastic VR headset larger than an Oculus Rift and explains how VR will let you experience completely computer-generated worlds as if you are there. The presenter notes that 1992 VR headsets may be "too bulky" for the average user, and shows a mockup of much smaller VR glasses about the size of Magic Leap's AR glasses, noting that "these are already in development." What is astonishing about watching this 27-year-old TV broadcast is a) the realization that much of today's stereoscopic 3D tech was already around in some form or another in the early 1990s; b) that VR headsets took an incredibly long time to reach the consumer and are still too bulky; and c) that almost three decades later, MIT's prototype glasses-free holographic 3D display technology never made its way into consumer hands or households.

Portables (Apple)

Walt Mossberg: Tim Cook's Apple Had a Great Decade But No New Blockbusters (theverge.com) 59

Veteran tech columnist Walt Mossberg, who retired two years ago, returns with one story to cap the end of the decade: Apple hasn't said how many Watches and AirPods it's sold, but they're widely believed to be the dominant players in each of their categories and, in the grand Apple tradition, the envy of competitors that scramble to ape them. Neither of these hardware successes has matched the impact or scale of Jobs' greatest hits. Even the iPad, despite annual unit sales that are sharply down from its heyday, generated almost as much revenue by itself in fiscal 2019 as the entire category of "wearables, home and accessories" where the Apple Watch and AirPods are slotted by Apple. [...] Cook does bear the responsibility for a series of actions that screwed up the Macintosh for years. The beloved mainstream MacBook Air was ignored for five years. At the other end of the scale, the Mac Pro, the mainstay of professional audio, graphics, and video producers, was first neglected, then reissued in 2013 in a way that put form so far ahead of function that it enraged its customer base. Some insiders think Cook allowed Ive's design team far too much power and that the balance Jobs was able to strike between the designers and the engineers was gone, at least until Ive left the company earlier this year.

The design-first culture that took root under Cook struck again with the MacBook Pro, yielding new laptops so thin their keyboards were awful and featuring USB-C ports that required sleek Macs to be used with ugly dongles. Apple has only recently retreated back to decent keyboards on the latest MacBook Pro, and it issued a much more promising Mac Pro. But dongles are still a part of the Apple experience across its product lines. Cook's other success this decade was to nurture the iPhone along as smartphone sales first plateaued and then began to decline. The biggest change he made came in 2014, before the dip, when Apple introduced two new iPhone 6 models, which belatedly adopted big screens that Android phones had pioneered. Sales took off like a rocket, and there's been a big iPhone option every year since.

Graphics

Qualcomm To Offer GPU Driver Updates On Google Play Store For Some Snapdragon Chips (hothardware.com) 8

MojoKid writes: At its Snapdragon Summit in Maui, Hawaii this week, Qualcomm unveiled the new Snapdragon 865 Mobile Platform, which will enable next year's flagship 5G Android phones with more performance, a stronger tensor-based AI processor, and an interesting forthcoming feature not yet offered on any smartphone platform. The company announced that it will eventually start delivering driver updates for the Adreno GPU engines on board the Snapdragon 865 as downloadable packages via the Google Play Store. This is big news for smartphones, as GPU drivers are rarely updated out of band, if ever, and typically have to wait for the next major Android release. Even then, many OEMs don't bother putting in the effort to ensure that mobile GPUs are running the most current graphics drivers from Qualcomm. The process, which would have to be pre-qualified by major OEMs as well, will be akin to what the PC 3D graphics driver ecosystem has long benefited from, for maximum performance and compatibility. Unfortunately, GPU driver update support is currently limited to the Adreno 650 core on board the new Snapdragon 865. Here's hoping this program is met with success and Qualcomm will begin to enable the feature for legacy and new midrange Snapdragon platforms as well.
Graphics

Ask Slashdot: How Much Faster Is an ASIC Than a Programmable GPU? 63

dryriver writes: When you run a real-time video processing algorithm on a GPU, you notice that some math functions execute very quickly on the GPU and some math functions take up a lot more processing time or cycles, slowing down the algorithm. If you were to implement that exact GPU algorithm as a dedicated ASIC hardware chip or perhaps on a beefy FPGA, what kind of speedup -- if any -- could you expect over a midrange GPU like a GTX 1070? Would hardwiring the same math operations as ASIC circuitry lead to a massive execution time speedup as some people claim -- e.g. 5x or 10x faster than a general purpose Nvidia GPU -- or are GPUs and ASICs close to each other in execution speed?

Bonus question: Is there a way to calculate the speed of an algorithm implemented as an ASIC chip without having an actual physical ASIC chip produced? Could you port the algorithm to, say, Verilog or similar languages and then use a software tool to calculate or predict how fast it would run if implemented as an ASIC with certain properties (clock speed, core count, manufacturing process... )?
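
Not an answer, but one crude way to frame the first question is a back-of-the-envelope throughput comparison like the sketch below. Every figure in it is a placeholder assumption to be replaced with numbers for the reader's own algorithm, GPU, and target process, and it deliberately ignores memory bandwidth, which often decides the real outcome.

```python
# Back-of-the-envelope comparison between a GPU and a fully pipelined ASIC
# datapath. All numbers are placeholder assumptions, not measurements.

def gpu_throughput(ops_per_result, usable_ops_per_sec):
    """Results/sec if the GPU kernel is limited by its usable arithmetic rate."""
    return usable_ops_per_sec / ops_per_result

def asic_throughput(clock_hz, pipelines, results_per_cycle=1):
    """Results/sec for a fully pipelined datapath: each pipeline delivers one
    result per clock once full (memory bandwidth limits are ignored here)."""
    return clock_hz * pipelines * results_per_cycle

ops_per_pixel = 500                                 # assumed arithmetic ops per output pixel
gpu = gpu_throughput(ops_per_pixel,
                     usable_ops_per_sec=1e12)       # assume ~1 Tops/s actually usable
asic = asic_throughput(clock_hz=600e6,              # assume a 600 MHz ASIC clock
                       pipelines=8)                 # assume 8 parallel pipelines

print(f"GPU : {gpu:.2e} results/s")
print(f"ASIC: {asic:.2e} results/s (~{asic / gpu:.1f}x the GPU under these assumptions)")
```

On the bonus question: yes, in principle. RTL written in Verilog or VHDL can be run through logic synthesis and static timing analysis for a chosen process, which report an estimated maximum clock frequency and area long before any silicon is produced; combined with a cycle count from simulation, that gives a rough pre-fabrication performance estimate.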
Cloud

Google Addresses Complaints of Sub-4K Image Quality On Stadia (arstechnica.com) 44

An anonymous reader quotes a report from Ars Technica: Since March, Google has been promising that its streaming Stadia platform would be capable of full 4K, 60fps gameplay (for users with a robust Internet connection and $10/month Stadia Pro subscription). But technical analyses since launch have shown that some of the service's highest profile games aren't hitting that mark. A Digital Foundry analysis of Red Dead Redemption 2 on Stadia, for instance, found that the game actually runs at a native 2560x1440 resolution, which is then upscaled to the 4K standard of 4096x2160 via the Chromecast Ultra. And a Bungie representative said that the Stadia version of Destiny 2 runs at the PC equivalent of "medium" graphics settings and that the game will "render at a native 1080p and then upsample [to 4K] and apply a variety of techniques to increase the overall quality of effect."
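
For a sense of scale, a quick pixel-count comparison using the resolutions quoted above shows that native 1440p rendering covers only about 42 percent of the pixels of the stated 4K output:

```python
# Pixel-count comparison using the resolutions quoted above.
native_1440p = 2560 * 1440   # reported native render resolution
target_4k = 4096 * 2160      # the 4K figure cited in the analysis
print(f"native : {native_1440p:,} pixels")
print(f"target : {target_4k:,} pixels")
print(f"ratio  : {native_1440p / target_4k:.0%}")   # roughly 42%
```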

Over the weekend, Google issued a statement to 9to5Google that essentially places the blame for this situation on Stadia developers themselves (emphasis added): "Stadia streams at 4K and 60fps -- and that includes all aspects of our graphics pipeline from game to screen: GPU, encoder, and Chromecast Ultra all outputting at 4K to 4K TVs, with the appropriate Internet connection. Developers making Stadia games work hard to deliver the best streaming experience for every game. Like you see on all platforms, this includes a variety of techniques to achieve the best overall quality. We give developers the freedom of how to achieve the best image quality and frame rate on Stadia, and we are impressed with what they have been able to achieve for day one. We expect that many developers can, and in most cases will, continue to improve their games on Stadia. And because Stadia lives in our data centers, developers are able to innovate quickly while delivering even better experiences directly to you without the need for game patches or downloads."

Transportation

Analysts, Gamers, and Blade Runner's Artistic Director React To The Look of Tesla's Cybertruck (businessinsider.com) 293

Syd Mead, the artistic director on Blade Runner, says Tesla's new Cybertruck "has completely changed the vocabulary of the personal truck market design."

Or, for another perspective, "Tesla's Cybertruck looks weird... like, really weird," wrote Toni Sacconaghi, a senior equity research analyst at the global asset management firm AllianceBernstein. "Add a little bit of dirt, and you could even say it gives off a retro-future vibe a la Mad Max."

That's from a Market Insider article citing Wall Street analysts who it says "aren't buying the futuristic design of Tesla's new electric pickup truck." For example, Dan Levy of Credit Suisse wrote that "amid the radical design for Cybertruck, it's somewhat unclear to us who the core buyer will be." "We do not see this vehicle in its current form being a success," Jeffrey Osborne of Cowen wrote in a note on Friday, adding that he doesn't see the Tesla brand or the Cybertruck design "resonating with existing pickup truck owners...."

Still, the Cybertruck's design wasn't unanimously disliked by Wall Street. The design "will be a hit with the company's fanatic EV installed base globally as Musk & Co. are clearly thinking way out of the box on this model design," Dan Ives of Wedbush wrote in a Friday note....

[And] "While styling will always be subjective, we believe the unique and futuristic design will resonate with consumers, leading to solid demand," Jed Dorsheimer of Canaccord Genuity wrote in a Friday note.

The article also quotes Toni Sacconaghi of Bernstein as saying that the "really futuristic, like cyberpunk Blade Runner" design "is too bad, because its on-paper specs are insane."

But IGN reports there's another group commenting enthusiastically on the Cybertruck's looks: gamers. Unlike anything else we've seen from Musk's line of vehicles before, the Tesla truck resembles something you'd see in an old video game set in the future or sci-fi flick from the late '90s to the early 2000s.

Of course, gamers all over the internet couldn't help themselves from sharing images, making memes, and drawing comparisons to look-alikes we've seen in games, TV shows, and movies... According to the internet, the Tesla Cybertruck either hasn't finished rendering yet or is made of some very dated graphics. Either way, it takes us back to the days when we got to experience the famous low-poly Lara Croft.

Open Source

System76 Will Start Designing and Building Its Own Linux Laptops Beginning January 2020 (forbes.com) 24

An anonymous reader quotes a report from Forbes: Denver-based PC manufacturer and Pop!_OS Linux developer System76 plans to follow up its custom Thelio desktop PC with an in-house laptop beginning next year, according to founder and CEO Carl Richell. During a recent interview, Richell was quick to emphasize that the entire process of designing, prototyping and iterating the final product could take two to three years. But the company is eager to break into this market and put the same signature "stamp" on its laptop hardware that graces its custom-built Thelio desktop.

System76 sells an extensive lineup of laptops, but the machines are designed by the likes of Sager and Clevo. The company doesn't merely buy a chassis and slap Pop!_OS on it, but Richell tells me he's confident that with the experience gained from developing Thelio -- and the recent investment into a factory at the company's Denver headquarters -- System76 is capable of building a laptop from the ground up that meets market needs and carries a unique value proposition. Richell says the company's first priority is locking down the aesthetic of the laptop and how various materials look and feel. It will simultaneously begin working on the supply chain aspects and speaking with various display and component manufacturers. System76 will design and build a U-class laptop first (basically an Ultrabook form factor like the existing Darter and Galago) and then evaluate what it might do with higher-end gaming and workstation notebooks with dedicated graphics.

Intel

Intel Unveils 7nm Ponte Vecchio GPU Architecture For Supercomputers and AI (hothardware.com) 28

MojoKid writes: Intel has unveiled its first discrete GPU solution that will hit the market in 2020, code-named Ponte Vecchio. Based on 7nm silicon manufacturing and a stacked chiplet design with Intel's Foveros tech, Ponte Vecchio will target HPC markets for supercomputers and AI training in the datacenter. According to HotHardware, Ponte Vecchio will employ a combination of both its Foveros 3D packaging and EMIB (Embedded Multi-die Interconnect Bridge) technologies, along with High Bandwidth Memory (HBM) and Compute Express Link (CXL), which will operate over the newly ratified PCIe 5.0 interface and serve as Ponte Vecchio's high-speed switch fabric connecting all GPU resources. Intel is billing Ponte Vecchio as its first exascale GPU, proving its mettle in the U.S. Department of Energy's (DOE) Aurora supercomputer. Aurora will employ a topology of six Ponte Vecchio GPUs and two Intel Xeon Scalable processors based on Intel's next-generation Sapphire Rapids architecture, along with Optane DC Persistent Memory, on a single blade. The new supercomputer is scheduled to arrive sometime in 2021.
