Graphics

Nvidia's GPU Technology Conference Will Now Be Online Only Due To Coronavirus Concerns (theverge.com) 40

Nvidia has announced it has shifted its GPU Technology Conference (GTC) to an online-only event due to concerns over the coronavirus outbreak. People who had registered for the event, which was originally set to be held in San Jose, California, from March 22nd to March 26th, will get a full refund. The Verge reports: "This decision to move the event online instead of at the San Jose Convention Center reflects our top priority: the health and safety of our employees, our partners and our customers," the company said in a statement. Nvidia says company founder and CEO Jensen Huang will still deliver a keynote, and that it's working with speakers to begin publishing talks online. The online version of the event will still take place from March 22nd to March 26th. Other tech events cancelled due to the coronavirus include Mobile World Congress, Facebook's F8 developer conference, and the Game Developers Conference.
Businesses

Stealth Startup Plans Fundamentally New Kind of Computer with Circuit-Rearranging Processor (zdnet.com) 107

VCs have given nearly half a billion dollars to a stealth startup called SambaNova Systems to build "a new kind of computer to replace the typical Von Neumann machines expressed in processors from Intel and AMD, and graphics chips from Nvidia."

ZDNet reports: The last thirty years in computing, said CEO Rodrigo Liang, have been "focused on instructions and operations, in terms of what you optimize for. The next five, to ten, to twenty years, large amounts of data and how it flows through a system is really what's going to drive performance." It's not just a novel computer chip, said Liang, rather, "we are focused on the complete system," he told ZDNet. "To really provide a fundamental shift in computing, you have to obviously provide a new piece of silicon at the core, but you have to build the entire system, to integrate across several layers of hardware and software...."

[One approach to training neural networks with very little labeled data] is part of the shift of computer programming from hard-coded to differentiable, in which code is learned on the fly, commonly referred to as "software 2.0." Liang's co-founders include Stanford professor Kunle Olukotun, who says a programmable logic device similar to a field-programmable gate array could change its shape over and over to align its circuitry [to] that differentiated program, with the help of a smart compiler such as Spatial. [Spatial is "a computing language that can take programs and de-compose them into operations that can be run in parallel, for the purpose of making chips that can be 'reconfigurable,' able to change their circuitry on the fly."]

In an interview in his office last spring, Olukotun laid out a sketch of how all that might come together. In what he refers to as a "data flow," the computing paradigm is turned inside-out. Rather than stuffing a program's instructions into a fixed set of logic gates permanently etched into the processor, the processor re-arranges its circuits, perhaps every clock cycle, to variably manipulate large amounts of data that "flows" through the chip.... Today's chips execute instructions in an instruction "pipeline" that is fixed, he observed, "whereas in this reconfigurable data-flow architecture, it's not instructions that are flowing down the pipeline, it's data that's flowing down the pipeline, and the instructions are the configuration of the hardware that exists in place."
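For readers who want a concrete picture of the "data flow" idea, here is a minimal Python sketch of a computation expressed as a graph that data flows through, rather than as an instruction stream. It is purely illustrative and has nothing to do with SambaNova's actual Spatial compiler or hardware:

```python
# Illustrative sketch only: a toy dataflow graph, not SambaNova's Spatial
# language or silicon. The "program" is the graph wiring; execution is data
# flowing through nodes, which a reconfigurable fabric could in principle map
# spatially instead of pushing instructions down a fixed pipeline.
from collections import deque

class Node:
    def __init__(self, name, op, inputs=()):
        self.name, self.op, self.inputs = name, op, list(inputs)

def run_dataflow(nodes, feeds):
    """Evaluate each node once all of its inputs have 'flowed' in."""
    values = dict(feeds)                      # data entering the graph
    pending = deque(nodes)
    while pending:
        node = pending.popleft()
        if all(i in values for i in node.inputs):
            values[node.name] = node.op(*(values[i] for i in node.inputs))
        else:
            pending.append(node)              # wait for upstream data
    return values

# y = (a + b) * (a - b), wired as a graph rather than an instruction stream
graph = [
    Node("sum",  lambda a, b: a + b, ["a", "b"]),
    Node("diff", lambda a, b: a - b, ["a", "b"]),
    Node("y",    lambda s, d: s * d, ["sum", "diff"]),
]
print(run_dataflow(graph, {"a": 7, "b": 3})["y"])   # -> 40
```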

Hardware

ZX Spectrum Next, An Advanced Version of the Original 8-Bit Home Computer, Has Been Released 95

hackertourist shares an update on the status of the "ZX Spectrum Next" Kickstarter campaign: In 2017, a Kickstarter campaign was started to design and build "an updated and enhanced version of the ZX Spectrum totally compatible with the original, featuring the major hardware developments of the past many years packed inside a simple (and beautiful) design by the original designer, Rick Dickinson, inspired by his seminal work at Sinclair Research."

They didn't quite make their original planned delivery date (2018), but they made good on their promise in the end: the first machine was delivered on February 6 of this year. The Spectrum Next contains a Z80 processor on an FPGA, 1MB of RAM expandable to 2MB, hardware sprites, 256 colors, RGB/VGA/HDMI video output, and three AY-3-8912 audio chips. A Raspberry Pi Zero can be added as an expansion board. The computer can emulate any of the original Spectrum variants, but it also supports add-ons that have been designed by the Spectrum community over the years, such as games loaded onto SD cards, better processors and more memory, and improved graphics.
XBox (Games)

Microsoft Reveals More Xbox Series X Specs (polygon.com) 49

Microsoft revealed new details on its next-generation console, the Xbox Series X, on Monday morning, confirming specifications on what the company calls its "superior balance of power and speed" for its new hardware. From a report: The next-gen Xbox, Microsoft said, will be four times as powerful as the original Xbox One. The Xbox Series X "next-generation custom processor" will employ AMD's Zen 2 and RDNA 2 architecture, head of Xbox Phil Spencer wrote on the Xbox website. "Delivering four times the processing power of an Xbox One and enabling developers to leverage 12 [teraflops] of GPU (Graphics Processing Unit) performance -- twice that of an Xbox One X and more than eight times the original Xbox One," Spencer said. He called the next-generation Xbox's processing and graphics power "a true generational leap," offering higher frame rates -- with support for up to 120 fps -- and more sophisticated game worlds.

That 12 teraflops claim is twice that of what Microsoft promised with the Xbox One X (then known as Project Scorpio) when it revealed the mid-generation console update back in 2016. Spencer also outlined the Xbox Series X's variable rate shading, saying, "Rather than spending GPU cycles uniformly to every single pixel on the screen, they can prioritize individual effects on specific game characters or important environmental objects. This technique results in more stable frame rates and higher resolution, with no impact on the final image quality." He also promised hardware-accelerated DirectX ray tracing, with "true-to-life lighting, accurate reflections and realistic acoustics in real time." Microsoft also reconfirmed features like SSD storage, which promise faster loading times, as well as new ones, like Quick Resume, for Xbox Series X.
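As a rough sanity check of those multipliers (using the commonly cited peak-GPU figures of roughly 1.3 teraflops for the original Xbox One and 6 teraflops for the Xbox One X, which are approximations rather than numbers from Microsoft's post):

```python
# Rough sanity check of the quoted GPU multipliers. The 1.31 and 6.0 TFLOPS
# figures for the older consoles are widely cited approximations, not numbers
# taken from Microsoft's announcement.
xbox_one, xbox_one_x, series_x = 1.31, 6.0, 12.0   # peak GPU TFLOPS (approx.)

print(f"vs Xbox One X: {series_x / xbox_one_x:.1f}x")  # ~2.0x, "twice that"
print(f"vs Xbox One:   {series_x / xbox_one:.1f}x")    # ~9.2x, "more than eight times"
```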

Cloud

Nvidia's GeForce Now Is Losing All Activision Blizzard Games (theverge.com) 75

Nvidia's GeForce Now is a cloud gaming service that lets you play games stored on dedicated GeForce graphics-enabled PCs across a wide array of devices. While it lets you play PC games you already own, the game publisher must allow it on the service. "Today, Nvidia is revealing that Activision Blizzard is no longer playing ball, pulling down its catalog of games including Overwatch, WoW, and the Call of Duty series," reports The Verge. From the report: That means one of the service's biggest publishers, as well as its Battle.net catalog of games, will no longer be available just a week after the service's formal launch -- a launch that was already missing many games from Capcom, EA, Konami, Remedy, Rockstar and Square Enix, all of which seemed to have pulled out after Nvidia's beta period ended. Nvidia wouldn't tell us why this is happening now, but it's strange, because Nvidia previously told us it was contacting every publisher ahead of launch to make sure they were OK with their games staying available with the service. Did Activision Blizzard renege on a deal, or did Nvidia fail to get permission? We're waiting to hear back from Nvidia; Activision Blizzard didn't respond to a request for comment.

In a statement, Nvidia says it hopes to work with Activision Blizzard to bring the games back, but the company confirmed to us that things are pretty cut-and-dried for now -- you shouldn't expect them to magically reappear after a few days (or even a few weeks) thanks to a deal. Nvidia also declined to tell us whether it'd be open to sharing a slice of its subscription fees with publishers, citing the quiet period before its earnings. It's true that Blizzard, at least, has an EULA that specifically prevents users from playing a game on cloud gaming services, but that doesn't seem to explain this move. Activision's EULA doesn't contain anything of the sort, and again, Activision Blizzard didn't seem to have any problem with it during the GeForce Now beta.

United States

The CIA Secretly Bought a Company That Sold Encryption Devices Across the World. Then, Its Spies Read Everything. (washingtonpost.com) 277

Greg Miller, reporting for Washington Post: For more than half a century, governments all over the world trusted a single company to keep the communications of their spies, soldiers and diplomats secret. The company, Crypto AG, got its first break with a contract to build code-making machines for U.S. troops during World War II. Flush with cash, it became a dominant maker of encryption devices for decades, navigating waves of technology from mechanical gears to electronic circuits and, finally, silicon chips and software. The Swiss firm made millions of dollars selling equipment to more than 120 countries well into the 21st century. Its clients included Iran, military juntas in Latin America, nuclear rivals India and Pakistan, and even the Vatican.

But what none of its customers ever knew was that Crypto AG was secretly owned by the CIA in a highly classified partnership with West German intelligence. These spy agencies rigged the company's devices so they could easily break the codes that countries used to send encrypted messages. The decades-long arrangement, among the most closely guarded secrets of the Cold War, is laid bare in a classified, comprehensive CIA history of the operation obtained by The Washington Post and ZDF, a German public broadcaster, in a joint reporting project. The account identifies the CIA officers who ran the program and the company executives entrusted to execute it. It traces the origin of the venture as well as the internal conflicts that nearly derailed it. It describes how the United States and its allies exploited other nations' gullibility for years, taking their money and stealing their secrets. The operation, known first by the code name "Thesaurus" and later "Rubicon," ranks among the most audacious in CIA history.

Ubuntu

Ubuntu vs Windows 10: Performance Tests on a Walmart Laptop (phoronix.com) 147

Phoronix's Michael Larabel is doing some performance testing on Walmart's $199 Motile-branded M141 laptop (which has an AMD Ryzen 3 3200U processor, Vega 3 graphics, 4GB of RAM, and a 14-inch 1080p display).

But first he compared the performance of its pre-installed Windows 10 OS against the forthcoming Ubuntu 20.04 LTS Linux distribution.

Some highlights: - Java text rendering performance did come out much faster on Ubuntu 20.04 with this Ryzen 3 3200U laptop...

- The GraphicsMagick imaging program tended to run much better on Linux, which we've seen on other systems in the past as well.

- Intel's Embree path-tracer was running faster on Ubuntu...

- Various video benchmarks were generally favoring Ubuntu for better performance though I wouldn't recommend much in the way of video encoding from such a low-end device...

- The GIMP image editing software was running much faster on Ubuntu 20.04 in its development state than GIMP 2.10 on Windows 10...

- Python 3 performance is still much faster on Linux than Windows.

- If planning to do any web/LAMP development from the budget laptop and testing PHP scripts locally, Ubuntu's PHP7 performance continues running much stronger than Windows 10.

- Git also continues running much faster on Linux.

Their conclusion? "Out of 63 tests ran on both operating systems, Ubuntu 20.04 was the fastest... coming in front 60% of the time." (This sounds like 38 wins for Ubuntu versus 25 wins for Windows 10.)

"If taking the geometric mean of all 63 tests, the Motile $199 laptop with Ryzen 3 3200U was 15% faster on Ubuntu Linux over Windows 10."
Windows

iPad Launch Blindsided Windows Team, Reveals Former Microsoft Executive (twitter.com) 109

The launch of the iPad ten years ago was a big surprise to everyone in the industry -- including Microsoft executives. Steven Sinofsky, the former President of the Windows Division at Microsoft, shares Microsoft's perspective as well as those of other industry figures and the press on the iPad: The announcement 10 years ago today of the "magical" iPad was clearly a milestone in computing. It was billed to be the "next" computer. For me, managing Windows, just weeks after the launch of Microsoft's "latest creation" Windows 7, it was as much a challenge as magical. Given that Star Trek had tablets, it was inevitable that the form factor would make it to computing (yes, the Dynabook...). Microsoft had been working for more than 10 years starting with "WinPad" through Tablet PC. We were fixated on Win32, Pen, and more. The success of the iPhone (140K apps & 3B downloads announced that day) blinded us at Microsoft as to where Apple was heading. Endless rumors of Apple's tablet *obviously* meant a pen computer based on the Mac. Why not? The industry chased this for 20 years. That was our context. The press, however, was fixated on Apple lacking an "answer" (pundits seem to demand answers) to Netbooks -- those small, cheap, Windows laptops sweeping the world. Over 40 million sold. "What would Apple's response be?" We worried -- a cheap, pen-based Mac. Sorry Harry!

Jobs said that a new computer needed to be better at some things, better than an iPhone/iPod and better than a laptop. Then he just went right at Netbooks, answering what could be better at these things. "Some people have thought that that's a Netbook." (The audience joined in a round of laughter.) Then he said, "The problem is ... Netbooks aren't better at anything ... They're slow. They have low quality displays ... and they run clunky old PC software ... They're just cheap laptops." "Cheap laptops" ... from my perch that was a good thing. I mean inexpensive was a better word. But we knew that Netbooks (and ATOM) were really just a way to make use of the struggling efforts to make low-power, fanless Intel chips for phones. A brutal takedown of 40M units. Sitting in a Le Corbusier chair, he showed the "extraordinary" things his new device did, from browsing to email to photos and videos and more. The real kicker was that it achieved 10 hours of battery life -- unachievable in PCs struggling for 4 hours with their whirring fans.

There was no stylus ... no pen. How could one input or be PRODUCTIVE? PC brains were so wedded to a keyboard, mouse, and pen alternative that the idea of being productive without those seemed fanciful. Also instant standby, no viruses, rotate-able, maintained quality over time... As if to emphasize the point, Schiller showed "rewritten" versions of Apple's iWork apps for the iPad. The iPad would have a word processor, spreadsheet, and presentation graphics. Rounding out the demonstration, the iPad would also sync settings with iTunes -- content too. This was still early in the travails of iCloud but really a game changer Windows completely lacked except in enterprise with crazy server infrastructure or "consumer" Live apps. iPad had a 3G modem BECAUSE it was built on the iPhone. If you could figure out the device drivers and software for a PC, you'd need a multi-hundred dollar USB modem and a $60/month fee at best. The iPad made this a $29.99 option on AT&T and a slight uptick in purchase price. Starting at $499, the iPad was a shot right across the bow of the consumer laptop. Consumer laptops were selling over 100 million units a year! Pundits were shocked at the price. I ordered mine arriving in 60/90 days.

At CES weeks earlier, there were the earliest tablets -- made with no help from Google, a few fringe Chinese ODMs were shopping hacky tablets called "Mobile Internet Devices" or "Media Tablets". Samsung's Galaxy was 9 months away. Android support (for 4:3 screens) was still a ways away. The first looks and reviews a bit later were just endless (and now tiresome) commentary on how the iPad was really for "consumption" and not productivity. There were no files. No keyboard. No mouse. No overlapping windows. Can't write code! In a classically defined case of disruption, the iPad didn't do those things, but what it did, it did so much better that not only did people prefer it, they changed what they did in order to use it. Besides, email was the most used tool and the iPad was great for that. In its first year, 2010-2011, Apple sold 20 million iPads. That same year would turn out to be a historical high-water mark for PCs (365M, ~180M laptops). Analysts who had forecast more than 500M PCs were now rapidly increasing tablet forecasts to hundreds of millions and cutting their PC numbers. The iPad and iPhone were existential threats to Microsoft's core platform business.

Without a platform Microsoft controlled that developers sought out, the soul of the company was "missing." The PC had been overrun by browsers, a change 10 years in the making. PC OEMs were deeply concerned about the rise of Android and loved the Android model (no PC maker would ultimately be a major Android OEM, however). Even Windows Server was eclipsed by Linux and Open Source. The kicker for me, though, was that keyboard stand for the iPad. It was such a hack. Such an obvious "objection handler." But it was critically important because it was a clear reminder that the underlying operating system was "real" ...it was not a "phone OS". Knowing the iPhone and now iPad ran a robust OS under the hood, with a totally different "shell", interface model (touch), and app model (APIs and architecture) had massive implications for being the leading platform provider for computers. That was my Jan 27, 2010.
Further reading: The iPad's original software designer and program lead look back on the device's first 10 years.
Space

Help NASA Choose the Name For Its Next Mars Rover (nasa.gov) 80

Slashdot reader DevNull127 writes: NASA will launch a new rover to Mars this July — and 28,000 American schoolchildren wrote essays with suggestions for what NASA should name it.

NASA has now selected the top nine finalists, which they'll let the public vote on through Monday on a special web page where they're also displaying the schoolchildren's essays. "Scientists are tenacious," wrote one student who suggested the name Tenacity. "It is what keeps them thinking and experimenting... When scientists make mistakes they see what they did wrong and then try again.

"If they didn't have tenacity, Mars rovers wouldn't be a thing."

The new rover will also be carrying the names of 10,932,295 earthlings, etched onto a microchip.

Bloomberg points out that because Mars and Earth are unusually close in July and August -- a mere 39 million miles -- another rover will also be launched by the international ExoMars programme (led by the European Space Agency and the Russian Roscosmos State Corporation), while the United Arab Emirates will also try sending an orbiter to Mars, and China will deploy "an orbiter to circle the planet and a rover to land on it."
Desktops (Apple)

36 Years Ago Today, Steve Jobs Unveiled the First Macintosh (macrumors.com) 108

An anonymous reader quotes a report from MacRumors: On January 24, 1984, former Apple CEO Steve Jobs introduced the first Macintosh at Apple's annual shareholder's meeting in Cupertino, California, debuting the new computer equipped with a 9-inch black and white display, an 8MHz Motorola 68000 processor, 128KB of RAM, a 3.5-inch floppy drive, and a price tag of $2,495. The now iconic machine weighed in at a whopping 17 pounds and was advertised as offering a word processing program, a graphics package, and a mouse. At the time it was introduced, the Macintosh was seen as Apple's last chance to overcome IBM's domination of the personal computer market and remain a major player in the personal computer industry. Despite the high price at the time, which was equivalent to around $6,000 today, the Macintosh sold well, with Apple hitting 70,000 units sold by May 1984. The now iconic "1984" Super Bowl ad that Apple invested in and debuted days before the Macintosh was unveiled may have helped bolster sales.
AMD

AMD Launches Navi-Based Radeon RX 5600XT To Battle GeForce RTX 2060 Under $300 (hothardware.com) 57

MojoKid writes: Today AMD launched its latest midrange graphics card based on the company's all-new Navi architecture. The AMD Radeon RX 5600 XT slots in under $300 ($279 MSRP) and is based on the same Navi 10 GPU as AMD's current high-end Radeon RX 5700 series cards. AMD's Radeon RX 5600 XT is outfitted with 36 compute units, with a total of 2,304 stream processors and is essentially a Radeon 5700 spec GPU with 2GB less GDDR6 memory (6GB total) and a narrower 192-bit interface, versus Radeon RX 5700's 8GB, 256-bit config. HotHardware took a Sapphire Pulse Radeon RX 5600 XT around the benchmark track and this card has a BIOS switch on-board that toggles between performance and silent/quiet modes. In performance mode, the card has a 160W power target, 14Gbps memory data rate, a Boost Clock of 1,750MHz and a Game Clock of 1,615MHz. In silent/quiet mode, things are a bit more tame with a 135W power target, 12Gbps memory, and 1,620MHz/1,460MHz Boost and Game Clocks, respectively. In the gaming benchmarks, the new Radeon RX 5600 XT is generally faster than NVIDIA's GeForce RTX 2060 overall, with the exception of a few titles that are more NVIDIA-optimized and in VR. Though it lacks the capability for hardware-accelerated ray tracing, the new AMD Radeon RX 5600 XT weighs in at $20-$30 less than NVIDIA's closest competitor and offers similar if not better performance.
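Those memory specs translate directly into peak bandwidth: bus width in bits times data rate in Gbps, divided by 8. A quick back-of-the-envelope comparison of the two BIOS modes and the RX 5700's wider bus:

```python
# Peak GDDR6 bandwidth = bus width (bits) * data rate (Gbps) / 8 bits-per-byte.
def peak_bandwidth_gb_per_s(bus_bits, data_rate_gbps):
    return bus_bits * data_rate_gbps / 8

print(peak_bandwidth_gb_per_s(192, 14))  # RX 5600 XT performance mode: 336.0 GB/s
print(peak_bandwidth_gb_per_s(192, 12))  # RX 5600 XT silent/quiet mode: 288.0 GB/s
print(peak_bandwidth_gb_per_s(256, 14))  # RX 5700's wider 256-bit bus:  448.0 GB/s
```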
Wine

Wine 5.0 Released (bleepingcomputer.com) 60

An anonymous reader quotes a report from BleepingComputer: Wine 5.0 has been released today and contains over 7,400 bug fixes and numerous audio and graphics improvements that will increase performance in gaming on Linux. With this release, WineHQ hopes to resolve many long-standing compatibility issues, with the main improvements being:

-Builtin modules in PE format: To make games think Wine is a real Windows environment, most Wine 5.0 modules have been converted into the PE format rather than ELF binaries. It is hoped that this will allow copy-protection and anti-cheat programs to not flag games running under Wine as being modified.
-Multi-monitor support: Multiple display adapters and multi-monitor configurations are now supported under Wine.
-XAudio2 reimplementation: XAudio2 libraries have been added back to Wine and will use the FAudio library for better compatibility.
-Vulkan 1.1 support: "The Vulkan driver supports up to version 1.1.126 of the Vulkan spec."
Here are the release notes, download locations for the binary packages (when available) and source.
Intel

Intel's First Discrete GPU is Built For Developers (engadget.com) 50

At its CES 2020 keynote, Intel showed off its upcoming Xe discrete graphics chip and today, we're seeing exactly how that's going to be implemented. From a report: First off, Intel unveiled a standalone DG1 "software development vehicle" card that will allow developers to optimize apps for the new graphics system. It didn't reveal any performance details for the card, but did show it running the Warframe game. It also noted that it's now "sampling to ISVs (independent software vendors) worldwide... enabling developers to optimize for Xe." As far as we know right now, Intel's discrete graphics will be chips (not cards) installed together with the CPUs on a single package. However, it's interesting to see Intel graphics in the form of a standalone PCIe card, even one that will never be sold to consumers.
AMD

AMD Unveils Ryzen 4000 Mobile CPUs Claiming Big Gains, 64-Core Threadripper (hothardware.com) 71

MojoKid writes: Yesterday, AMD launched its new Ryzen 4000 Series mobile processors for laptops at CES 2020, along with a monstrous 64-core/128-thread third-generation Ryzen Threadripper workstation desktop CPU. In addition to the new processors, on the graphics front, the oft-leaked Radeon RX 5600 XT that targets 1080p gamers in the sweet spot of the GPU market was also made official. In CPU news, AMD claims Ryzen 4000 series mobile processors offer 20% lower SOC power, 2X perf-per-watt, 5X faster power state switching, and significantly improved iGPU performance versus its previous-gen mobile Ryzen 3000 products. AMD's U-Series flagship, the Ryzen 7 4800U, is an 8-core/16-thread processor with a max turbo frequency of 4.2GHz and an integrated Vega-derived 8-core GPU.

Along with architectural enhancements and the frequency benefits of producing the chips at 7nm, AMD is underscoring up to 59% improved performance per graphics core as well. AMD is also claiming superior single-thread CPU performance versus current Intel processors and significantly better multi-threaded performance. The initial Ryzen 4000 U-Series line-up consists of five processors, starting with the 4-core/4-thread Ryzen 5 4300U, and topping off with the aforementioned Ryzen 7 4800U. On the other end of the spectrum, AMD revealed some new information regarding its 64-core/128-thread Ryzen Threadripper 3990X processor. The beast chip will have a base clock of 2.9GHz and a boost clock of 4.3GHz with a whopping 288MB of cache. The chip will drop into existing TRX40 motherboards and be available on February 7th for $3990. AMD showcased the chip versus a dual-socket Intel Xeon Platinum in the V-Ray 3D rendering benchmark, beating the Xeon system by almost 30 minutes in a 90-minute workload, though the Intel system retails for around $20K.
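A hedged back-of-the-envelope on that rendering demo, taking the quoted figures at face value (a roughly 90-minute Xeon run, an almost-30-minute margin, and the $3,990 vs. ~$20K CPU prices):

```python
# Back-of-the-envelope from the figures quoted above (all approximate).
xeon_minutes, margin = 90, 30          # "almost 30 minutes" faster in a 90-minute workload
threadripper_minutes = xeon_minutes - margin

print(f"speedup: ~{xeon_minutes / threadripper_minutes:.2f}x")   # ~1.5x
print(f"CPU price ratio: ~{20000 / 3990:.1f}x")                  # dual Xeon costs ~5x more
```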

AI

MIT's New Tool Predicts How Fast a Chip Can Run Your Code (thenextweb.com) 13

Folks at the Massachusetts Institute of Technology (MIT) have developed a new machine learning-based tool that will tell you how fast code can run on various chips. This will help developers tune their applications for specific processor architectures. From a report: Traditionally, developers used compilers' performance models, through simulation, to run basic blocks -- fundamental sequences of computer instructions at the machine level -- in order to gauge the performance of a chip. However, these performance models are not often validated against real-life processor performance. MIT researchers developed an AI model called Ithemal by training it to predict how fast a chip can run unknown basic blocks. Later, it was supported by a database called BHive with 300,000 basic blocks from specialized fields such as machine learning, cryptography, and graphics. The team of researchers presented a paper [PDF] at the NeurIPS conference in December to describe a new technique to measure code performance on various processors. The paper also describes Vemal, a new algorithm for automatically generating compiler optimizations.
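The core idea -- learn a mapping from a basic block to a measured cycle count instead of hand-maintaining a compiler cost model -- can be sketched in a few lines of Python. This is only an illustrative stand-in (a tiny bag-of-opcodes linear model fit on made-up data), not the recurrent-network design MIT actually used:

```python
# Illustrative stand-in only: a tiny bag-of-opcodes linear model, NOT MIT's
# recurrent-network tool. The shared idea: learn cycles-per-basic-block from
# measurements rather than from a hand-tuned compiler cost model.
import numpy as np

OPCODES = ["mov", "add", "mul", "load", "store"]

def featurize(block):
    """Count how often each opcode appears in a basic block."""
    return [sum(1 for insn in block if insn == op) for op in OPCODES]

# Hypothetical training data: (basic block, measured cycle count).
training = [
    (["mov", "add", "add"], 4.0),
    (["load", "mul", "store"], 12.0),
    (["mov", "load", "add", "store"], 11.0),
    (["mul", "mul", "add"], 9.0),
]
X = np.array([featurize(block) for block, _ in training], dtype=float)
y = np.array([cycles for _, cycles in training])

weights, *_ = np.linalg.lstsq(X, y, rcond=None)   # fit a per-opcode "cost"

def predict(block):
    return float(np.array(featurize(block), dtype=float) @ weights)

print(round(predict(["load", "add", "add", "store"]), 1))  # predicted cycles for an unseen block
```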
Graphics

Apple Reunites With iPhone Graphics Chip Partner To License Technology (theverge.com) 28

Apple will once again license technology from Imagination Technologies, the chip designer that used to provide graphics processors for the iPhone and iPad, the UK-based company announced today. The Verge reports: In a short statement posted on its website, Imagination said that it had entered into a multiyear license agreement with Apple, under which the Cupertino, California-based firm will have access to "a wider range of Imagination's intellectual property in exchange for license fees." Apple announced its split from Imagination back in April 2017 when it said that it would start designing its own graphics chips, and it would stop licensing the company's technology within two years. After the split was announced, Imagination expressed skepticism that Apple could design its own chips "without violating Imagination's patents, intellectual property, and confidential information."
China

One-Quarter of World's Pigs Died In a Year Due To Swine Fever In China (nytimes.com) 104

An anonymous reader quotes a report from The New York Times: The [African swine fever disease] was first reported in Shenyang, Liaoning Province, in early August 2018. By the end of August 2019, the entire pig population of China had dropped by about 40 percent. China accounted for more than half of the global pig population in 2018, and the epidemic there alone has killed nearly one-quarter of all the world's pigs. By late September, the disease had cost economic losses of one trillion yuan (about $141 billion), according to Li Defa, dean of the College of Animal Science and Technology at China Agricultural University in Beijing. Qiu Huaji, a leading Chinese expert on porcine infectious diseases, has said that African swine fever has been no less devastating "than a war" -- in terms of "its effects on the national interest and people's livelihoods and its political, economic and social impact."

Much like severe acute respiratory syndrome, or SARS, exposed the shortcomings of China's public health system when it became an epidemic in 2002-3, swine fever today exposes the weaknesses of the country's animal-disease prevention and control. But it also reveals something much more fundamental: notably, the perverse effects that even sound regulations can have when they are deployed within a system of governance as unsound as China's. According to Yu Kangzhen, a deputy minister of agriculture, the localities that struggled to control the spread of African swine fever were also those that lacked staff, funding or other resources in animal-epidemic prevention. Yet that alone cannot explain the breadth of the epidemic or the speed with which it swept across China...

AI

Researchers Detail AI that De-hazes and Colorizes Underwater Photos (venturebeat.com) 25

Kyle Wiggers, writing for VentureBeat: Ever notice that underwater images tend to be blurry and somewhat distorted? That's because phenomena like light attenuation and back-scattering adversely affect visibility. To remedy this, researchers at Harbin Engineering University in China devised a machine learning algorithm that generates realistic underwater images, along with a second algorithm that trains on those images to both restore natural color and reduce haze. They say that their approach qualitatively and quantitatively matches the state of the art, and that it's able to process upwards of 125 frames per second running on a single graphics card. The team notes that most underwater image enhancement algorithms (such as those that adjust white balance) aren't based on physical imaging models, making them poorly suited to the task. By contrast, this approach taps a generative adversarial network (GAN) -- an AI model consisting of a generator that attempts to fool a discriminator into classifying synthetic samples as real-world samples -- to produce a set of images of specific survey sites that are fed into a second algorithm, called U-Net.
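For a sense of what the restoration half of such a pipeline looks like, here is a heavily simplified sketch assuming PyTorch is available: a toy U-Net-style encoder/decoder with a single skip connection that maps a hazy RGB frame to a corrected one. The researchers' actual generator, discriminator, and training setup are far more elaborate:

```python
# Minimal sketch, assuming PyTorch is installed: a toy U-Net-style
# encoder/decoder with one skip connection. The paper's real models and
# GAN-based training data are much larger; this only shows the
# image-to-image "restore color / reduce haze" structure.
import torch
import torch.nn as nn

class TinyUNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.enc = nn.Sequential(nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU())
        self.dec = nn.Sequential(
            nn.ConvTranspose2d(16, 16, 4, stride=2, padding=1), nn.ReLU())
        # Skip connection: concatenate decoder output with the raw input frame
        self.out = nn.Conv2d(16 + 3, 3, 3, padding=1)

    def forward(self, hazy):
        features = self.enc(hazy)               # downsample, extract features
        upsampled = self.dec(features)          # back to input resolution
        merged = torch.cat([upsampled, hazy], dim=1)
        return torch.sigmoid(self.out(merged))  # restored RGB in [0, 1]

frame = torch.rand(1, 3, 64, 64)                # stand-in "underwater" frame
restored = TinyUNet()(frame)
print(restored.shape)                           # torch.Size([1, 3, 64, 64])
```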
Privacy

Ask Slashdot: What Will the 2020s Bring Us? 207

dryriver writes: The 2010s were not necessarily the greatest decade to live through. AAA computer games were not only DRM'd and internet tethered to death but became increasingly formulaic and pay-to-win driven, and poor quality console ports pissed off PC gamers. Forced software subscriptions for major software products you could previously buy became a thing. Personal privacy went out the window in ways too numerous to list, with lawmakers failing on many levels to regulate the tech, data-mining and internet advertising companies in any meaningful way. Severe security vulnerabilities were found in hundreds of different tech products, from Intel CPUs to baby monitors and internet-connected doorbells. Thousands of tech products shipped with microphones, cameras, and internet connectivity integration that couldn't be switched off with an actual hardware switch. Many electronics products became harder or impossible to repair yourself. Printed manuals coming with tech products became almost non-existent. Hackers, scammers, ransomwarers and identity thieves caused more mayhem than ever before. Troll farms, click farms and fake news factories damaged the integrity of the internet as an information source. Tech companies and media companies became afraid of pissing off the Chinese government.

Windows turned into a big piece of spyware. Intel couldn't be bothered to innovate until AMD Ryzen came along. Nvidia somehow took a full decade to make really basic realtime raytracing happen, even though smaller GPU maker Imagination had done it years earlier with a fraction of the budget, and in a mobile GPU to boot. Top-of-the-line smartphones became seriously expensive. Censorship and shadow banning on the once-more-open internet became a thing. Easily-triggered people trying to muzzle other people on social media became a thing. The quality of popular music and music videos went steadily downhill. Star Wars went to shit after Disney bought it, as did the Star Trek films. And mainstream cinema turned into an endless fest of VFX-heavy comic book movies, remakes/reboots and horror movies. In many ways, television was the biggest winner of the 2010s, with many new TV shows with film-like production values being made. The second winner may be computer hardware that delivered more storage/memory/performance per dollar than ever before.

To the question: What, dear Slashdotters, will the 2020s bring us? Will things get better in tech and other things relevant to nerds, or will they get worse?
Hardware

Atari's Home Computers Turn 40 (fastcompany.com) 86

harrymcc writes: Atari's first home computers, the 400 and 800, were announced at Winter CES in January 1979. But they didn't ship until late in the year -- so over at Fast Company, Benj Edwards has marked their 40th anniversary with a look at their rise and fall. Though Atari ultimately had trouble competing with Apple and other entrenched PC makers, it produced machines with dazzling graphics and sound and the best games of their era, making its computers landmarks from both a technological and cultural standpoint.
