AI

NVIDIA's Latest AI Software Turns Rough Doodles Into Realistic Landscapes (theverge.com) 35

An anonymous reader quotes a report from The Verge: AI is going to be huge for artists, and the latest demonstration comes from Nvidia, which has built prototype software that turns doodles into realistic landscapes. Using a type of AI model known as a generative adversarial network (GAN), the software gives users what Nvidia is calling a "smart paint brush." This means someone can make a very basic outline of a scene (drawing, say, a tree on a hill) before filling in their rough sketch with natural textures like grass, clouds, forests, or rocks. The results are not quite photorealistic, but they're impressive all the same. The software generates AI landscapes instantly, and it's surprisingly intuitive. For example, when a user draws a tree and then a pool of water underneath it, the model adds the tree's reflection to the pool. Nvidia didn't say if it has any plans to turn the software into an actual product, but it suggests that tools like this could help "everyone from architects and urban planners to landscape designers and game developers" in the future. The company has published a video showing off the imagery it handles particularly well.
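The input to a model like this is typically a semantic label map: a grid where each cell names a material class rather than a color, which the GAN then translates into imagery. A minimal sketch of that input format, with hypothetical label names chosen purely for illustration:

```python
# Sketch of the semantic-label-map input a landscape GAN consumes.
# Label names and layout are hypothetical, for illustration only.
LABELS = {"sky": 0, "tree": 1, "grass": 2, "water": 3}

def make_label_map(width, height):
    """Build a rough 'doodle': sky on top, grass below, a tree and a pool."""
    grid = [[LABELS["sky"]] * width for _ in range(height)]
    for y in range(height // 2, height):          # lower half is grass
        for x in range(width):
            grid[y][x] = LABELS["grass"]
    for y in range(height // 3, height // 2):     # a crude tree trunk
        grid[y][2] = LABELS["tree"]
    for x in range(1, 4):                         # a pool of water below it
        grid[height - 2][x] = LABELS["water"]
    return grid

doodle = make_label_map(8, 8)
# A real GAN generator would map this class grid to an RGB landscape;
# here we only build and inspect the doodle itself.
```

The generator's job is then to fill each labeled region with plausible texture, which is where effects like the reflected tree come from.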
Graphics

Crytek Shows 4K 30 FPS Ray Tracing On Non-RTX AMD and NVIDIA GPUs (techspot.com) 140

dryriver writes: Crytek has published a video showing an ordinary AMD Vega 56 GPU -- which has no raytracing specific circuitry and only costs around $450 -- real-time ray tracing a complex 3D city environment at 4K 30 FPS. Crytek says that the technology demo runs fine on most normal NVIDIA and AMD gaming GPUs. As if this wasn't impressive already, the software real-time ray tracing technology is still in development and not even final. The framerates achieved may thus go up further, raising the question of precisely what the benefits of owning a super-expensive NVIDIA RTX 20xx series GPU are. Nvidia has claimed over and over again that without its amazing new RTX cores and AI denoiser, GPUs will choke on real-time ray tracing tasks in games. Crytek appears to have proven already that with some intelligently written code, bog ordinary GPU cores can handle real-time ray tracing just fine -- no RTX cores, AI denoiser or anything else NVIDIA touts as necessary.
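Ray tracing itself requires no special hardware; at its heart it is just ray-geometry intersection math that any GPU shader core, or even a CPU, can execute. A minimal CPU-side sketch of the central operation, ray-sphere intersection via the quadratic formula:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return distance along the ray to the nearest intersection,
    or None if the ray misses. direction must be normalized."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    b = 2 * (direction[0] * ox + direction[1] * oy + direction[2] * oz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * c          # discriminant of the quadratic
    if disc < 0:
        return None               # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2
    return t if t >= 0 else None  # intersections behind the ray don't count

# Ray from the origin along +z toward a unit sphere centered at z=5
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1))  # → 4.0
```

Dedicated RT cores accelerate exactly this kind of test (plus scene traversal); Crytek's demo shows the same math can be made fast enough in ordinary shader code.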
iMac

Apple Finally Updates the iMac With Significantly More Powerful CPU and GPU Options (arstechnica.com) 143

Today, Apple will finally begin taking orders for newly refreshed 21- and 27-inch iMacs. The new versions don't change the basic design or add major new features, but they offer substantially faster configuration options for the CPU and GPU. From a report: The 21.5-inch iMac now has a 6-core, 8th-gen Intel CPU option -- up from a maximum of four cores before. The 27-inch now has six cores as the standard configuration, with an optional upgrade to a 3.6GHz, 9th-gen, 8-core Intel Core i9 CPU that Apple claims will double performance over the previous 27-inch iMac. The base 27-inch model has a 3GHz 6-core Intel Core i5 CPU, with intermediate configurations at 3.1GHz and 3.7GHz (both Core i5). The big news is arguably that both sizes now offer high-end, workstation-class Vega-graphics options for the first time. Apple added a similar upgrade option to the 15-inch MacBook Pro late last year. In this case, the 21.5-inch iMac has an option for the 20-compute-unit version of Vega with 4GB of HBM2 video memory. That's the same as the top-end 15-inch MacBook Pro option.

The 27-inch iMac can now be configured with the Radeon Pro Vega 48 with 8GB of HBM2. For reference, the much pricier iMac Pro has Vega 56 and Vega 64 options. Apple claims the Vega 48 will net a 50-percent performance improvement over the Radeon Pro 580, the previous top configuration. Speaking of the previous top configuration, the non-Vega GPU options are the same as what was available yesterday. The only difference is that they now have an "X" affixed to the numbers in their names, per AMD branding conventions -- i.e., Radeon Pro 580X instead of 580. RAM options are the same in terms of volume (up to 32GB for the 21.5-inch and 64GB for the 27-inch), but the DDR4 RAM is slightly faster now, at 2666MHz.

Graphics

NVIDIA's Ray Tracing Tech Will Soon Run On Older GTX Cards (engadget.com) 98

NVIDIA's older GeForce GTX 10-series cards will be getting the company's new ray-tracing tech in April. The technology, which is currently only available on its new RTX cards, "will work on GPUs from the 1060 and up, albeit with some serious caveats," reports Engadget. "Some games like Battlefield V will run just fine and deliver better visuals, but other games, like the freshly released Metro Exodus, will run at just 18 fps at 1440p -- obviously an unplayable frame-rate." From the report: What games you'll be able to play with ray-tracing tech (also known as DXR) on NVIDIA GTX cards depends entirely on how it's implemented. In Battlefield V, for instance, the tech is only used for things like reflections. On top of that, you can dial down the strength of the effect so that it consumes less computing horsepower. Metro Exodus, on the other hand, uses ray tracing to create highly realistic "global illumination" effects, simulating lighting from the real world. It's the first game that really showed the potential of RTX cards and actually generated some excitement about the tech. However, because it's so computationally intensive, GTX cards (which don't have the RTX tensor cores) will effectively be too slow to run it.

NVIDIA explained that when it was first developing the next-gen RTX tech, it found chips using Pascal tech would be "monster" sized and consume up to 650 watts. That's because the older cards lack both the integer cores and tensor cores found on the RTX cards. They get particularly stuck on ray-tracing, running about four times slower than the RTX cards on Metro Exodus. Since Metro Exodus is so heavily ray-traced, the RTX cards run it three times quicker than older GTX 10-series cards. However, that falls to two times for Shadow of the Tomb Raider, and 1.6 times for Battlefield V, because both of those games use ray tracing less. The latest GTX 1660 and 1660 Ti GPUs, which don't have RT cores but do have integer cores, will run ray-traced games moderately better than last-gen 10-series GPUs.
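Taking the quoted speedup factors at face value, one can roughly back out what an RTX frame rate implies for a GTX 10-series card. The RTX baseline frame rates below are hypothetical placeholders, not benchmark results; only the speedup ratios come from the article:

```python
# RTX-over-GTX speedup factors per game, as quoted in the article.
speedups = {"Metro Exodus": 3.0, "Shadow of the Tomb Raider": 2.0, "Battlefield V": 1.6}
# Hypothetical RTX frame rates, chosen only to illustrate the arithmetic.
rtx_fps = {"Metro Exodus": 54, "Shadow of the Tomb Raider": 70, "Battlefield V": 80}

for game, factor in speedups.items():
    gtx_fps = rtx_fps[game] / factor
    print(f"{game}: ~{gtx_fps:.0f} fps on a GTX 10-series card")
# With a 54 fps RTX baseline, Metro Exodus works out to ~18 fps on GTX,
# matching the unplayable 1440p figure quoted above.
```

The pattern is clear: the more heavily a game leans on ray tracing, the larger the penalty for cards without RT cores.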
NVIDIA also announced that Unity and Unreal Engine now support ray-tracing, allowing developers to implement the tech into their games. Developers can use NVIDIA's new set of tools called GameWorks RTX to achieve this.

"It includes the RTX Denoiser SDK that enables real-time ray-tracing through techniques that reduce the required ray count and number of samples per pixel," adds Engadget. "It will support ray-traced effects like area light shadows, glossy reflections, ambient occlusion and diffuse global illumination (the latter is used in Metro Exodus). Suffice it to say, all of those things will make games look a lot prettier."
Displays

VR Company Co-Founder Spends an Entire Week in a VR Headset (pcgamer.com) 39

An anonymous reader quotes PC Gamer: Not too long into a 168-hour VR marathon session, Jak Wilmot admits the monotony got to him. Wilmot, who is the co-founder of Disrupt VR, also says this experiment is "quite possibly the dumbest thing" he's ever done. So, why do it? For science, of course. I can't imagine immersing myself in a virtual world for a full week, nonstop night and day. Wilmot did it, though, for the most part -- he allowed himself 30 seconds to switch VR headsets when needed, and 30 seconds without a headset on to eat, if required. Other than those small breaks, he spent every other moment in VR...

There doesn't seem to be some big takeaway from this experiment (aside from, perhaps, don't drink coffee while playing VR), though one thing I also found interesting was his integration back into the real world when the experiment was over. "I have never appreciated the smell of outside air so much. One thing we cannot replicate is nature. We can do it visually and auditorily, but there is something about the energy of outside that is amazing," Wilmot observed.

PC Gamer calls it "probably at least partially a publicity stunt. But it's still interesting to see how donning a VR headset for an extended period of time and essentially living in virtual worlds can mess with the mind." Wilmot wore VR gear while working -- and even while showering (with the VR gear protected by plastic), blacking out his windows so he couldn't tell day from night, calling it "a week in the future..."

"I almost feel like I'm in my own 500-square-foot spaceship," he says at one point, "and I'm really missing earth, and I'm missing nature." Early on he also reported some mild claustrophobia.

You can watch the moment when, after seven days, he removes the headset and returns to conventional reality, joking "Oh my gosh, the graphics are so good." He reports a slight disorientation as his eyes catch up with real life, and says it changed his perspective on people in the real world, seeing them as "individuals in one collection, one environment -- as avatars."
Patents

Apple Dealt Legal Blow as Jury Awards Qualcomm $31 Million (cnet.com) 47

Apple violated three Qualcomm patents and should pay the chipmaker $31 million for infringing on its technology, a jury decided Thursday, giving Qualcomm momentum as it heads into another legal skirmish with the iPhone maker next month. From a report: Qualcomm, which filed the suit in July 2017, alleged Apple had used its technology without permission in some versions of its popular iPhone. The jury awarded Qualcomm the full amount it requested at the start of the two-week trial, which was held in San Diego. One disputed Qualcomm patent covers technology that lets a smartphone quickly connect to the internet once the device is turned on. Another deals with graphics processing and battery life. The third addresses technology that shifts traffic between a phone's apps processor and modem. The $31 million in damages -- or $1.41 per infringing iPhone -- is a drop in the bucket for Apple, a company that briefly became a $1 trillion company last year. But it marks an important victory for Qualcomm, burnishing its reputation as a mobile components innovator. The win also lends credibility to the notion that much of the company's innovation is reflected in iPhones.
Graphics

NVIDIA Launches New $219 Turing-Powered GeForce GTX 1660 (hothardware.com) 101

MojoKid writes: NVIDIA took the wraps off yet another lower cost Turing-based graphics card today, dubbed the GeForce GTX 1660. For a $219 MSRP, the card offers a cut-down NVIDIA TU116 GPU with 1,408 CUDA cores, a 1785MHz boost clock, and 6GB of GDDR5 RAM with 192GB/s of bandwidth. Generally speaking, the new GeForce GTX 1660 is 15% to 30% faster than NVIDIA's previous generation GeForce GTX 1060 but doesn't support the new ray tracing and DLSS features that the majority of NVIDIA's new Turing cards support. Performance-wise, the GeForce GTX 1660 is generally faster than an AMD Radeon RX 590 overall. Boards from various OEM partners should be in the channel for purchase this week.
Businesses

Amazon Lobbied More Government Entities Than Any Other Public US Company Last Year (fortune.com) 41

Amazon lobbied more government entities last year than any other public U.S. company, covering issues like healthcare, transportation, defense, and labor regulation. "Across 2018, Amazon contacted 40 different federal entities on 21 different general issue areas," reports Fortune, citing a report from Axios. "The only tech giant to lobby on more issues than Amazon was Google's Alphabet." From the report: In terms of money spent, Amazon's $14.4 million is topped only by Alphabet's $21 million, says Bloomberg. While the tech industry overall spent less than half of the $280 million from pharmaceutical and healthcare products companies in Washington, Amazon has increased spending 460% since 2012, growing quickly within its trade. According to Axios, Amazon lobbied on self-driving car and drone issues, hinting at new methods of delivery. It supported a law allowing pharmacists to tell patients when using their insurance is actually more expensive, aiding Amazon's new investment in PillPack. It also covered the labeling of bioengineered food and a pilot program allowing online shoppers to use the Supplemental Nutritional Assistance Program -- signs of Amazon's emerging grocery business.
Graphics

Microsoft Brings DirectX 12 To Windows 7 (anandtech.com) 119

Microsoft has announced a form of DirectX 12 that will support Windows 7. "Now before you get too excited, this is currently only enabled for World of Warcraft; and indeed it's not slated to be a general-purpose solution like DX12 on Win10," reports AnandTech. "Instead, Microsoft has stated that they are working with a few other developers to bring their DX12 games/backends to Windows 7 as well. As a consumer it's great to see them supporting their product ten years after it launched, but with the entire OS being put out to pasture in nine months, it seems like an odd time to be dedicating resources to bringing it new features." From the report: For some background, Microsoft's latest DirectX API was created to remove some of the CPU bottlenecks for gaming by allowing developers to use low-level programming conventions to shift some of the pressure points away from the CPU. This was a response to single-threaded CPU performance plateauing, making complex graphical workloads increasingly CPU-bound. There are many advantages to using this API over traditional DX11, especially for threading and draw calls. But Microsoft made the decision long ago to only support DirectX 12 on Windows 10, with its WDDM 2.0 driver stack.

Today's announcement is a pretty big surprise on a number of levels. If Microsoft had wanted to back-port DX12 to Windows 7, you would have thought they'd have done it before Windows 7 entered its long-term servicing state. As it is, even free security patches for Windows 7 are set to end on January 14, 2020, which is well under a year away, and the company is actively trying to migrate users to Windows 10 to avoid having a huge swath of machines sitting in an unpatched state. In fact, they are about to add a pop-up notification to Windows 7 to let users know that they are running out of support very soon. So adding a big feature like DX12 now not only risks undermining their own efforts to migrate people away from Windows 7, but also means delivering a new feature well after Windows 7 entered long-term support. It's just bizarre.

First Person Shooters (Games)

Study Shows Gamers At High FPS Have Better Kill-To-Death Ratios In Battle Royale Games (hothardware.com) 149

MojoKid writes: Gaming enthusiasts and pro gamers have long believed that playing on high-refresh-rate displays with high frame rates offers a competitive edge in fast-action games like PUBG, Fortnite and Apex Legends. The premise is that the faster the display can update the action for you, the more every saved millisecond counts when it comes to tracking targets and reaction times. This sounds logical, but there has never been specific data tabulated to back this theory up and prove it. NVIDIA, however, just took it upon itself to use its GeForce Experience tool to compile anonymous data on gamers by hours played per week, panel refresh rate and graphics card type. Though this data obviously speaks only to NVIDIA GPU users, the numbers do speak for themselves.

Generally speaking, the more powerful the GPU, the higher the frame rate, and the higher the panel refresh rate, the higher the kill-to-death ratio (K/D) for the gamers that were profiled. In fact, it really didn't matter how many hours per week were played. Casual gamers and heavy-duty daily players alike could see anywhere from about a 50 to 150 percent increase in K/D ratio for significantly better overall player performance. It should be underscored that it really doesn't matter what GPU is at play; gamers with AMD graphics cards that can push high frame rates at 1080p or similar can see similar K/D gains. However, the performance sweet spot seems to be around 144Hz/144FPS: the closer your system can get to that, the better off you'll be, and even higher frame rates and refresh rates help further still.
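The per-frame latency difference behind this premise is easy to quantify: at a steady frame rate, the time between displayed frames is simply 1000 ms divided by FPS. A quick illustration:

```python
def frame_time_ms(fps):
    """Milliseconds between displayed frames at a steady frame rate."""
    return 1000.0 / fps

for hz in (60, 144, 240):
    print(f"{hz} Hz -> {frame_time_ms(hz):.1f} ms per frame")
# 60 Hz -> 16.7 ms, 144 Hz -> 6.9 ms: roughly 10 ms less display
# latency per frame when moving from a 60 Hz to a 144 Hz setup.
```

Whether that ~10 ms fully explains the observed K/D gains is a separate question, but it is the mechanism the theory rests on.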

Open Source

Linux 5.0 Released (phoronix.com) 107

An anonymous reader writes: Linus Torvalds has released Linux 5.0, kicking off the kernel's 28th year of development. Linux 5.0 features include AMD FreeSync support, open-source NVIDIA Turing GPU support, Intel Icelake graphics, Intel VT-d scalable mode, Spectre Variant Two mitigations for NXP PowerPC processors, and countless other additions. eWeek adds: Among the new features that have landed in Linux 5.0 is support for the Adiantum encryption system, developed by Google for low power devices. Google's Android mobile operating system and ChromeOS desktop operating system both rely on the Linux kernel. "Storage encryption protects your data if your phone falls into someone else's hands," Paul Crowley and Eric Biggers of the Android Security and Privacy Team at Google wrote in a blog post. "Adiantum is an innovation in cryptography designed to make storage encryption more efficient for devices without cryptographic acceleration, to ensure that all devices can be encrypted." Memory management in Linux also gets a boost in the 5.0 kernel with a series of improvements designed to help prevent memory fragmentation, which can reduce performance.
Games

The New 'Red Dead Redemption' Reveals the Biggest Problem With Marquee Games Today: They're Boring as Hell. (theoutline.com) 211

An anonymous reader shares a column: Everything about "Red Dead Redemption 2" is big. The latest open-world western, released in October by Rockstar Games, constantly reminds you of this. It takes roughly 15 minutes for its bland everycowboy star, Arthur Morgan, to gallop across the 29-square-mile map. It has 200 species of animals, including grizzly bears, alligators, and a surprising number of birds. It takes about 45.5 hours to play through the main quest, and 150-plus hours to reach 100 percent completion. There are more than 50 weapons to choose from, such as a double-barreled shotgun and a rusty hatchet. It's big, big, big.

[...] On top of all the bigness, "Red Dead Redemption 2" is also incredibly dull. I've been playing it off and on since it was released, and I'm still waiting for it to get fun. I'm not alone in thinking so -- Mark Brown of Game Maker's Toolkit called it "quite boring" and Mashable said it's a "monumental disappointment." There is a glut of Reddit posts from people complaining about how slow the game feels, usually with a tone of extreme self-consciousness. Unless you're a real a**hole, it's not exactly fun to stray from popular consensus. Perhaps the general hesitancy to criticize the game is due to the fact that it's not technically bad. Its graphics and scale really are impressive. It is designed to please.

And yet "RDR2" seems to exemplify a certain kind of hollowness that's now standard among Triple-A titles. It's very big, with only tedium inside. Call it a Real World Game. The main problem with "RDR2" is that it consists almost entirely of tedious, mandatory chores. It always feels like it's stalling for time, trying to juke the number of hours it takes to complete it.

The Military

Microsoft CEO Defends Pentagon Contract Following Employee Outcry (theverge.com) 221

Microsoft CEO Satya Nadella is defending the company's $479 million contract with the Pentagon to supply augmented reality headsets to the U.S. military. "We made a principled decision that we're not going to withhold technology from institutions that we have elected in democracies to protect the freedoms we enjoy," he told CNN Business at Mobile World Congress. "We were very transparent about that decision and we'll continue to have that dialogue [with employees]," he added during the exclusive interview. From the report: Microsoft was awarded the contract to supply "Integrated Visual Augmentation System" prototypes to the U.S. military in November. The company could eventually deliver over 100,000 headsets under the contract. Microsoft's HoloLens augmented reality technology allows users to see the world around them, but with virtual graphics overlaid. The Israeli military, which has taken delivery of some HoloLens headsets, says the technology can be used to help commanders visualize the battlefield and field medics to consult doctors. According to procurement documents, the U.S. military is seeking a single platform that provides its forces with "increased lethality, mobility and situational awareness" in combat. Microsoft employees have recently circulated a letter addressed to Nadella and Brad Smith, the company's president and chief legal officer, arguing that the company should not supply its HoloLens technology to the U.S. military. "It's not about taking arbitrary action by a single company, it's not about 50 people or 100 people or even 100,000 people in a company," he said. "It's really about being a responsible corporate citizen in a democracy."
Graphics

NVIDIA Turing-Based GeForce GTX 1660 Ti Launched At $279 (hothardware.com) 94

MojoKid writes: NVIDIA has launched yet another graphics card today based on the company's new Turing GPU architecture. This latest GPU, however, doesn't support NVIDIA's RTX ray-tracing technology or its DLSS (Deep Learning Super Sampling) image quality tech. The new GeForce GTX 1660 Ti does, however, bring with it all of the other GPU architecture improvements NVIDIA Turing offers. The TU116 GPU on board the GeForce GTX 1660 Ti supports concurrent integer and floating point instructions (rather than serializing integer and FP instructions), and it also has a redesigned cache structure with double the amount of L2 cache versus its predecessor, while its L1 cache has been outfitted with a wider memory bus that ultimately doubles the bandwidth. NVIDIA's TU116 has 1,536 active CUDA cores, which is a decent uptick from the GTX 1060, but less than the current-gen RTX 2060. Cards will also come equipped with 6GB of GDDR6 memory at 12 Gbps for 288GB/s of bandwidth. Performance-wise, the new GeForce GTX 1660 Ti is typically slightly faster than a previous-gen GeForce GTX 1070, and much faster than a GTX 1060. Cards should be available at retail in the next few days, starting at $279.
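The quoted 288GB/s figure follows directly from the memory specs: per-pin data rate times bus width, divided by eight bits per byte. A quick sanity check, assuming the card's publicly listed 192-bit memory bus:

```python
def gddr_bandwidth_gbs(data_rate_gbps, bus_width_bits):
    """Peak memory bandwidth in GB/s from per-pin data rate and bus width."""
    return data_rate_gbps * bus_width_bits / 8

# GeForce GTX 1660 Ti: 12 Gbps GDDR6 on a 192-bit bus
print(gddr_bandwidth_gbs(12, 192))  # → 288.0
```

The same formula recovers the bandwidth figure for any GDDR-equipped card given its pin rate and bus width.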
Intel

Intel Starts Publishing Open-Source Linux Driver Code For Discrete GPUs (phoronix.com) 43

fstack writes: Intel is still a year out from releasing their first discrete graphics processors, but the company has begun publishing their open-source Linux GPU driver code. This week they began by publishing patches on top of their existing Intel Linux driver for supporting device local memory for dedicated video memory as part of their restructuring effort to support discrete graphics cards. Intel later confirmed this is the start of their open-source driver support for discrete graphics solutions. They have also begun working on Linux driver support for Adaptive-Sync and better reset recovery.
Hardware

Nvidia CEO Foresees a Great Year for PC Gaming Laptops (venturebeat.com) 36

Nvidia has predicted that the year ahead will be a good one for the company, with demand for laptop gaming gear remaining strong. From a report: Looking forward, CEO Jensen Huang said it would be a big year for gaming laptops, as Nvidia knows that more than 40 Turing-based gaming laptops (based on the GeForce RTX 2060) are poised to launch during the year. Those laptops use mid-range RTX cards based on graphics processing units (GPUs) using Nvidia's new Turing architecture -- the GeForce RTX graphics cards that can do real-time ray tracing -- that are battery efficient.

Huang acknowledged that visibility is limited. I asked him if cloud gaming would be a disruptive force during the year. But he noted that Nvidia had been providing its own cloud gaming solution, GeForce Now, with relatively little impact on the market for three years. So he said it remains to be seen if cloud gaming and the "Netflix of games" would make an impact on the market. In the meantime, he said that gaming laptops would launch.

Graphics

AMD Radeon VII Graphics Card Launched, Benchmarks Versus NVIDIA GeForce RTX (hothardware.com) 73

MojoKid writes: AMD officially launched its new Radeon VII flagship graphics card today, based on the company's 7nm second-generation Vega architecture. In addition to core GPU optimizations, Radeon VII provides 2X the graphics memory at 16GB and 2.1X the memory bandwidth at a full 1TB/s, compared to AMD's previous generation Radeon RX Vega 64. The move to 7nm allowed AMD to shrink the Vega 20 GPU die down to 331 square millimeters. This shrink and the subsequent silicon die area saving is what allowed them to add an additional two stacks of HBM2 memory and increase the high-bandwidth cache (frame buffer) capacity to 16GB. The GPU on board the Radeon VII has 60CUs and a total of 3,840 active stream processors with a board power TDP of 300 Watts. As you might expect, it's a beast in the benchmarks that's able to pull ahead of NVIDIA's GeForce RTX 2080 in spots but ultimately lands somewhere in between the performance of an RTX 2070 and 2080 overall. AMD Radeon VII cards will be available in a matter of days at an MSRP of $699 with custom boards from third-party partners showing up shortly as well.
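The 1TB/s figure is a direct consequence of HBM2's very wide bus: four stacks of 1024 bits each gives a 4096-bit interface, and at the roughly 2 Gbps per-pin rate publicly reported for Radeon VII (an assumption beyond the article's text), the arithmetic works out as follows:

```python
def hbm_bandwidth_tbs(bus_width_bits, data_rate_gbps):
    """Peak bandwidth in TB/s; HBM trades per-pin speed for bus width."""
    return bus_width_bits * data_rate_gbps / 8 / 1000

# Radeon VII: four HBM2 stacks of 1024 bits each (4096-bit total),
# at ~2 Gbps per pin (reported figure, assumed here)
print(hbm_bandwidth_tbs(4096, 2))  # → 1.024
```

This is also why dropping to two stacks, as on the Vega 64's 2048-bit bus, halves the bandwidth at the same pin rate.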
AI

The World's Fastest Supercomputer Breaks an AI Record (wired.com) 66

Along America's west coast, the world's most valuable companies are racing to make artificial intelligence smarter. Google and Facebook have boasted of experiments using billions of photos and thousands of high-powered processors. But late last year, a project in eastern Tennessee quietly exceeded the scale of any corporate AI lab. It was run by the US government. From a report: The record-setting project involved the world's most powerful supercomputer, Summit, at Oak Ridge National Lab. The machine captured that crown in June last year, reclaiming the title for the US after five years of China topping the list. As part of a climate research project, the giant computer booted up a machine-learning experiment that ran faster than any before. Summit, which occupies an area equivalent to two tennis courts, used more than 27,000 powerful graphics processors in the project. It tapped their power to train deep-learning algorithms, the technology driving AI's frontier, chewing through the exercise at a rate of a billion billion operations per second, a pace known in supercomputing circles as an exaflop.
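The exaflop figure can be sanity-checked against the GPU count. Dividing the aggregate rate by the number of processors gives the implied per-GPU throughput; the 27,648 figure used below is Summit's widely reported full GPU complement, an assumption beyond the article's "more than 27,000":

```python
exaflop = 1e18   # one billion billion operations per second
gpus = 27_648    # Summit's full GPU count (assumed; article says "more than 27,000")

per_gpu_tflops = exaflop / gpus / 1e12
print(f"~{per_gpu_tflops:.0f} TFLOPS per GPU")
# ~36 TFLOPS per GPU: plausible for mixed-precision tensor work
# on data-center-class graphics processors, which is what makes
# the aggregate exaflop figure believable rather than a typo.
```
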

"Deep learning has never been scaled to such levels of performance before," says Prabhat, who leads a research group at the National Energy Research Scientific Computing Center at Lawrence Berkeley National Lab. His group collaborated with researchers at Summit's home base, Oak Ridge National Lab. Fittingly, the world's most powerful computer's AI workout was focused on one of the world's largest problems: climate change. Tech companies train algorithms to recognize faces or road signs; the government scientists trained theirs to detect weather patterns like cyclones in the copious output from climate simulations that spool out a century's worth of three-hour forecasts for Earth's atmosphere.

Ubuntu

System76 Unveils 'Darter Pro' Linux Laptop With Choice of Ubuntu or Pop!_OS (betanews.com) 86

An anonymous reader writes: Today, System76 unveiled its latest laptop -- the 15.6-inch (full-HD) "Darter Pro." It is thin, but not overly so -- it still has USB-A ports (thankfully). The computer is quite modern, however, as it also has a USB-C/Thunderbolt 3 port. It ships with a choice of Pop!_OS 18.04 LTS (64-bit), Pop!_OS 18.10 (64-bit), or Ubuntu 18.04 LTS (64-bit). It comes in two variants, with the following processor options: an 8th Gen Intel Core i5-8265U (1.6 up to 3.90 GHz, 6MB cache, 4 cores, 8 threads) or an 8th Gen Intel Core i7-8565U (1.8 up to 4.60 GHz, 8MB cache, 4 cores, 8 threads), either coupled with an Intel UHD Graphics 620 GPU, up to 32GB of dual-channel DDR4 @ 2400 MHz, and an M.2 SATA or PCIe NVMe SSD for storage. As for ports, there is one USB 3.1 Type-C with Thunderbolt 3, two USB 3.0 Type-A, one USB 2.0, and an SD card reader. The company says it will announce pricing at a later stage.
AI

DeepMind AI AlphaStar Wins 10-1 Against 'StarCraft II' Pros (newscientist.com) 103

In a series of matches streamed on YouTube and Twitch, DeepMind AI AlphaStar defeated two top-ranked professionals 10-1 at the real-time strategy game StarCraft II. "This is of course an exciting moment for us," said David Silver at DeepMind in a live stream watched by more than 55,000 people. "For the first time we saw an AI that was able to defeat a professional player." New Scientist reports: DeepMind created five versions of their AI, called AlphaStar, and trained them on footage of human games. The different AIs then played against each other in a league, with the leading AI accumulating the equivalent of 200 years of game experience. With this, AlphaStar beat professional players Dario Wunsch and Grzegorz Komincz -- ranked 44th and 13th in the world respectively. AlphaStar's success came with some caveats: the AI played only on a single map, and using a single kind of player (there are three in the game). The professionals also had to contend with playing different versions of AlphaStar from match to match. While AlphaStar was playing on a single graphics processing unit, a computer chip found in many gaming computers, it was trained on 16 tensor processing units hosted in the Google cloud -- processing power beyond the realms of many.
