China

One-Quarter of World's Pigs Died In a Year Due To Swine Fever In China (nytimes.com) 104

An anonymous reader quotes a report from The New York Times: The [African swine fever disease] was first reported in Shenyang, Liaoning Province, in early August 2018. By the end of August 2019, the entire pig population of China had dropped by about 40 percent. China accounted for more than half of the global pig population in 2018, and the epidemic there alone has killed nearly one-quarter of all the world's pigs. By late September, the disease had caused economic losses of one trillion yuan (about $141 billion), according to Li Defa, dean of the College of Animal Science and Technology at China Agricultural University in Beijing. Qiu Huaji, a leading Chinese expert on porcine infectious diseases, has said that African swine fever has been no less devastating "than a war" -- in terms of "its effects on the national interest and people's livelihoods and its political, economic and social impact."

Much like severe acute respiratory syndrome, or SARS, exposed the shortcomings of China's public health system when it became an epidemic in 2002-3, swine fever today exposes the weaknesses of the country's animal-disease prevention and control. But it also reveals something much more fundamental: namely, the perverse effects that even sound regulations can have when they are deployed within a system of governance as unsound as China's. According to Yu Kangzhen, a deputy minister of agriculture, the localities that struggled to control the spread of African swine fever were also those that lacked the staff, funding or other resources for animal-epidemic prevention. Yet that alone cannot explain the breadth of the epidemic or the speed with which it swept across China...

AI

Researchers Detail AI that De-hazes and Colorizes Underwater Photos (venturebeat.com) 25

Kyle Wiggers, writing for VentureBeat: Ever notice that underwater images tend to be blurry and somewhat distorted? That's because phenomena like light attenuation and back-scattering adversely affect visibility. To remedy this, researchers at Harbin Engineering University in China devised a machine learning algorithm that generates realistic underwater images, along with a second algorithm that trains on those images to both restore natural color and reduce haze. They say that their approach qualitatively and quantitatively matches the state of the art, and that it's able to process upwards of 125 frames per second running on a single graphics card. The team notes that most underwater image enhancement algorithms (such as those that adjust white balance) aren't based on physical imaging models, making them poorly suited to the task. By contrast, this approach taps a generative adversarial network (GAN) -- an AI model consisting of a generator that attempts to fool a discriminator into classifying synthetic samples as real-world samples -- to produce a set of images of specific survey sites that are fed into a second algorithm, called U-Net.
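
Neither the summary nor the VentureBeat piece spells out the architecture, so the sketch below is only a rough illustration of the two-stage idea: a hand-written degradation function stands in for the learned GAN generator that synthesizes underwater-looking training pairs, and a miniature U-Net-style encoder-decoder learns to undo it. Every layer size and the degradation formula are assumptions made for brevity, not the researchers' design.

```python
# Toy two-stage sketch (illustrative only): synthesize "underwater" pairs,
# then train a tiny U-Net-style network to restore them.
import torch
import torch.nn as nn
import torch.nn.functional as F

def synthesize_underwater(clean):
    """Stand-in for the GAN generator: attenuate red, add uniform haze."""
    attenuation = torch.tensor([0.4, 0.8, 0.95]).view(1, 3, 1, 1)  # per-channel (R, G, B)
    haze = 0.2 * torch.ones_like(clean)
    return (clean * attenuation + haze).clamp(0, 1)

class TinyUNet(nn.Module):
    """Two-level encoder-decoder with one skip connection (a U-Net in miniature)."""
    def __init__(self):
        super().__init__()
        self.enc1 = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU())
        self.enc2 = nn.Sequential(nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU())
        self.dec1 = nn.Sequential(nn.ConvTranspose2d(32, 16, 2, stride=2), nn.ReLU())
        self.out = nn.Conv2d(32, 3, 3, padding=1)  # 32 = 16 decoded + 16 skipped channels

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(e1)
        d1 = self.dec1(e2)
        return torch.sigmoid(self.out(torch.cat([d1, e1], dim=1)))

model = TinyUNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(100):                      # toy training loop on random "clean" frames
    clean = torch.rand(4, 3, 64, 64)
    degraded = synthesize_underwater(clean)
    loss = F.l1_loss(model(degraded), clean)
    opt.zero_grad(); loss.backward(); opt.step()
```

The real system replaces the hand-written degradation with a trained generator and a much deeper restoration network, but the training relationship (synthetic degraded input, clean target) is the same shape as above.
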
Privacy

Ask Slashdot: What Will the 2020s Bring Us? 207

dryriver writes: The 2010s were not necessarily the greatest decade to live through. AAA computer games were not only DRM'd and internet tethered to death but became increasingly formulaic and pay-to-win driven, and poor quality console ports pissed off PC gamers. Forced software subscriptions for major software products you could previously buy became a thing. Personal privacy went out the window in ways too numerous to list, with lawmakers failing on many levels to regulate the tech, data-mining and internet advertising companies in any meaningful way. Severe security vulnerabilities were found in hundreds of different tech products, from Intel CPUs to baby monitors and internet-connected doorbells. Thousands of tech products shipped with microphones, cameras, and internet connectivity integration that couldn't be switched off with an actual hardware switch. Many electronics products became harder or impossible to repair yourself. Printed manuals coming with tech products became almost non-existent. Hackers, scammers, ransomwarers and identity thieves caused more mayhem than ever before. Troll farms, click farms and fake news factories damaged the integrity of the internet as an information source. Tech companies and media companies became afraid of pissing off the Chinese government.

Windows turned into a big piece of spyware. Intel couldn't be bothered to innovate until AMD Ryzen came along. Nvidia somehow took a full decade to make really basic realtime raytracing happen, even though smaller GPU maker Imagination had done it years earlier with a fraction of the budget, and in a mobile GPU to boot. Top-of-the-line smartphones became seriously expensive. Censorship and shadow banning on the once-more-open internet became a thing. Easily-triggered people trying to muzzle other people on social media became a thing. The quality of popular music and music videos went steadily downhill. Star Wars went to shit after Disney bought it, as did the Star Trek films. And mainstream cinema turned into an endless fest of VFX-heavy comic book movies, remakes/reboots and horror movies. In many ways, television was the biggest winner of the 2010s, with many new TV shows with film-like production values being made. The second winner may be computer hardware that delivered more storage/memory/performance per dollar than ever before.

To the question: What, dear Slashdotters, will the 2020s bring us? Will things get better in tech and other things relevant to nerds, or will they get worse?
Hardware

Atari's Home Computers Turn 40 (fastcompany.com) 86

harrymcc writes: Atari's first home computers, the 400 and 800, were announced at Winter CES in January 1979. But they didn't ship until late that year -- so over at Fast Company, Benj Edwards has marked their 40th anniversary with a look at their rise and fall. Though Atari ultimately had trouble competing with Apple and other entrenched PC makers, it produced machines with dazzling graphics and sound and the best games of their era, making its computers landmarks from both a technological and cultural standpoint.
Television

The BBC's 1992 TV Show About VR, 3D TVs With Glasses, and Holographic 3D Screens (youtu.be) 54

dryriver writes: 27 years ago, the BBC's "Tomorrow's World" show broadcast this little gem of a program [currently available on YouTube]. After showing old Red-Cyan Anaglyph movies, Victorian Stereoscopes, lenticular-printed holograms and a monochrome laser hologram projected into a sheet of glass, the presenter shows off a stereoscopic 3D CRT computer display with active shutter glasses. The program then takes us to a laboratory at the Massachusetts Institute of Technology, where a supercomputer is feeding 3D wireframe graphics into the world's first glasses-free holographic 3D display prototype using a tellurium dioxide crystal. One of the researchers at the lab predicts that "years from now, advances in LCD technology may make this kind of display cheap enough to use in the home."

A presenter then shows a bulky plastic VR headset larger than an Oculus Rift and explains how VR will let you experience completely computer-generated worlds as if you are there. The presenter notes that 1992 VR headsets may be "too bulky" for the average user, and shows a mockup of much smaller VR glasses about the size of Magic Leap's AR glasses, noting that "these are already in development." What is astonishing about watching this 27-year-old TV broadcast is a) the realization that much of today's stereo 3D tech was already around in some form or another in the early 1990s; b) VR headsets took an incredibly long time to reach the consumer and are still too bulky; and c) almost three decades later, MIT's prototype holographic glasses-free 3D display technology never made its way into consumer hands or households.

Portables (Apple)

Walt Mossberg: Tim Cook's Apple Had a Great Decade But No New Blockbusters (theverge.com) 59

Veteran tech columnist Walt Mossberg, who retired two years ago, returns with one story to cap the end of the decade: Apple hasn't said how many Watches and AirPods it's sold, but they're widely believed to be the dominant players in each of their categories and, in the grand Apple tradition, the envy of competitors that scramble to ape them. Neither of these hardware successes has matched the impact or scale of Jobs' greatest hits. Even the iPad, despite annual unit sales that are sharply down from its heyday, generated almost as much revenue by itself in fiscal 2019 as the entire category of "wearables, home and accessories" where the Apple Watch and AirPods are slotted by Apple. [...] Cook does bear the responsibility for a series of actions that screwed up the Macintosh for years. The beloved mainstream MacBook Air was ignored for five years. At the other end of the scale, the Mac Pro, the mainstay of professional audio, graphics, and video producers, was first neglected, then reissued in 2013 in a way that put form so far ahead of function that it enraged its customer base. Some insiders think Cook allowed Ive's design team far too much power and that the balance Jobs was able to strike between the designers and the engineers was gone, at least until Ive left the company earlier this year.

The design-first culture that took root under Cook struck again with the MacBook Pro, yielding new laptops so thin their keyboards were awful and featuring USB-C ports that required sleek Macs to be used with ugly dongles. Apple has only recently returned to decent keyboards on the latest MacBook Pro, and it issued a much more promising Mac Pro. But dongles are still a part of the Apple experience across its product lines. Cook's other success this decade was to nurture the iPhone along as smartphone sales first plateaued and then began to decline. The biggest change he made came in 2014, before the dip, when Apple introduced two new iPhone 6 models, which belatedly adopted the big screens that Android phones had pioneered. Sales took off like a rocket, and there's been a big iPhone option every year since.

Graphics

Qualcomm To Offer GPU Driver Updates On Google Play Store For Some Snapdragon Chips (hothardware.com) 8

MojoKid writes: At its Snapdragon Summit in Maui, Hawaii this week, Qualcomm unveiled the new Snapdragon 865 Mobile Platform, which will power next year's flagship 5G Android phones with more performance, a stronger Tensor-based AI processor, and a very interesting forthcoming feature not yet offered on any smartphone platform. The company announced that it will eventually start delivering driver updates for its Adreno GPU engines on board the Snapdragon 865 as downloadable packages via the Google Play Store. This is big news for smartphones, as GPU drivers are rarely updated out of band, if ever, and typically have to wait for the next major Android release. And even then, many OEMs don't bother putting in the effort to ensure that mobile GPUs are running the most current graphics drivers from Qualcomm. The process, which would have to be pre-qualified by major OEMs as well, will be akin to what the PC GPU 3D graphics driver ecosystem has benefited from for a long time, for maximum performance and compatibility. Unfortunately, at least currently, GPU driver update support is limited to the Adreno 650 GPU on board the new Snapdragon 865. Here's hoping this program is met with success and Qualcomm will begin to enable the feature for legacy and new midrange Snapdragon platforms as well.
Graphics

Ask Slashdot: How Much Faster Is an ASIC Than a Programmable GPU? 63

dryriver writes: When you run a real-time video processing algorithm on a GPU, you notice that some math functions execute very quickly on the GPU and some math functions take up a lot more processing time or cycles, slowing down the algorithm. If you were to implement that exact GPU algorithm as a dedicated ASIC hardware chip or perhaps on a beefy FPGA, what kind of speedup -- if any -- could you expect over a midrange GPU like a GTX 1070? Would hardwiring the same math operations as ASIC circuitry lead to a massive execution time speedup as some people claim -- e.g. 5x or 10x faster than a general purpose Nvidia GPU -- or are GPUs and ASICs close to each other in execution speed?

Bonus question: Is there a way to calculate the speed of an algorithm implemented as an ASIC chip without having an actual physical ASIC chip produced? Could you port the algorithm to, say, Verilog or similar languages and then use a software tool to calculate or predict how fast it would run if implemented as an ASIC with certain properties (clock speed, core count, manufacturing process... )?
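
There is no general answer without the actual design, but the usual back-of-envelope reasoning can be sketched in a few lines: a synthesis and static-timing tool for a Verilog port reports an achievable clock and how often the pipeline produces a result, and throughput follows from that. The sketch below is only an illustration of that arithmetic; every figure in it is a made-up placeholder, not a measurement of any real chip.

```python
# Back-of-envelope throughput comparison. All numbers are illustrative placeholders:
# a synthesis/STA tool would supply the ASIC clock and initiation interval, and a
# profiler would supply the GPU's average cycles per output.

def pipeline_throughput(clock_hz, lanes, initiation_interval):
    """Results per second: one result per lane every `initiation_interval`
    cycles once the pipeline is full."""
    return clock_hz * lanes / initiation_interval

# Hypothetical fixed-function ASIC: 64 parallel lanes at 800 MHz, a result every cycle.
asic = pipeline_throughput(clock_hz=800e6, lanes=64, initiation_interval=1)

# Hypothetical GPU running the same kernel: 1920 lanes at 1.7 GHz, but the full
# algorithm averages ~40 cycles per output.
gpu = pipeline_throughput(clock_hz=1.7e9, lanes=1920, initiation_interval=40)

print(f"ASIC ~{asic/1e9:.1f} Gresults/s, GPU ~{gpu/1e9:.1f} Gresults/s, "
      f"ratio ~{asic/gpu:.1f}x")
```

Whether a real design lands at 2x or 20x depends heavily on how memory-bound the algorithm is, which is why the honest answer to the headline question is "it depends."
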
Cloud

Google Addresses Complaints of Sub-4K Image Quality On Stadia (arstechnica.com) 44

An anonymous reader quotes a report from Ars Technica: Since March, Google has been promising that its streaming Stadia platform would be capable of full 4K, 60fps gameplay (for users with a robust Internet connection and $10/month Stadia Pro subscription). But technical analyses since launch have shown that some of the service's highest profile games aren't hitting that mark. A Digital Foundry analysis of Red Dead Redemption 2 on Stadia, for instance, found that the game actually runs at a native 2560x1440 resolution, which is then upscaled to the 4K standard of 3840x2160 via the Chromecast Ultra. And a Bungie representative said that the Stadia version of Destiny 2 runs at the PC equivalent of "medium" graphics settings and that the game will "render at a native 1080p and then upsample [to 4K] and apply a variety of techniques to increase the overall quality of effect."

Over the weekend, Google issued a statement to 9to5Google that essentially places the blame for this situation on Stadia developers themselves (emphasis added): "Stadia streams at 4K and 60fps -- and that includes all aspects of our graphics pipeline from game to screen: GPU, encoder, and Chromecast Ultra all outputting at 4K to 4K TVs, with the appropriate Internet connection. Developers making Stadia games work hard to deliver the best streaming experience for every game. Like you see on all platforms, this includes a variety of techniques to achieve the best overall quality. We give developers the freedom of how to achieve the best image quality and frame rate on Stadia, and we are impressed with what they have been able to achieve for day one. We expect that many developers can, and in most cases will, continue to improve their games on Stadia. And because Stadia lives in our data centers, developers are able to innovate quickly while delivering even better experiences directly to you without the need for game patches or downloads."
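
For readers wondering what "render at 1440p and upsample to 4K" means in practice, here is a trivial sketch. It is only an illustration: the plain bilinear filter and the frame sizes are assumptions, and Stadia's actual pipeline upscales in the encoder/display chain with additional filtering, not in Python.

```python
# Illustration of 1440p -> 4K upscaling: more pixels, but no added detail.
import torch
import torch.nn.functional as F

frame_1440p = torch.rand(1, 3, 1440, 2560)   # stand-in for one rendered RGB frame
frame_4k = F.interpolate(frame_1440p, size=(2160, 3840),
                         mode="bilinear", align_corners=False)
print(tuple(frame_1440p.shape), "->", tuple(frame_4k.shape))  # 2.25x the pixel count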

Transportation

Analysts, Gamers, and Blade Runner's Artistic Director React To The Look of Tesla's Cybertruck (businessinsider.com) 293

Syd Mead, the artistic director on Blade Runner, says Tesla's new Cybertruck "has completely changed the vocabulary of the personal truck market design."

Or, for another perspective, "Tesla's Cybertruck looks weird... like, really weird," wrote Toni Sacconaghi, a senior equity research analyst at the global asset management firm AllianceBernstein. "Add a little bit of dirt, and you could even say it gives off a retro-future vibe a la Mad Max."

That's from a Markets Insider article citing Wall Street analysts who, it says, "aren't buying the futuristic design of Tesla's new electric pickup truck." For example, Dan Levy of Credit Suisse wrote that "amid the radical design for Cybertruck, it's somewhat unclear to us who the core buyer will be." "We do not see this vehicle in its current form being a success," Jeffrey Osborne of Cowen wrote in a note on Friday, adding that he doesn't see the Tesla brand or the Cybertruck design "resonating with existing pickup truck owners...."

Still, the Cybertruck's design wasn't unanimously disliked by Wall Street. The design "will be a hit with the company's fanatic EV installed base globally as Musk & Co. are clearly thinking way out of the box on this model design," Dan Ives of Wedbush wrote in a Friday note....

[And] "While styling will always be subjective, we believe the unique and futuristic design will resonate with consumers, leading to solid demand," Jed Dorsheimer of Canaccord Genuity wrote in a Friday note.

The article also quotes Toni Sacconaghi of Bernstein as saying that the "really futuristic, like cyberpunk Blade Runner" design "is too bad, because its on-paper specs are insane."

But IGN reports there's another group commenting enthusiastically on the Cybertruck's looks: gamers. Unlike anything else we've seen from Musk's line of vehicles before, the Tesla truck resembles something you'd see in an old video game set in the future or a sci-fi flick from the late '90s to the early 2000s.

Of course, gamers all over the internet couldn't help themselves from sharing images, making memes, and drawing comparisons to look-alikes we've seen in games, TV shows, and movies... According to the internet, the Tesla Cybertruck either hasn't finished rendering yet or is made of some very dated graphics. Either way, it takes us back to the days when we got to experience the famous low-poly Lara Croft.

Open Source

System76 Will Start Designing and Building Its Own Linux Laptops Beginning January 2020 (forbes.com) 24

An anonymous reader quotes a report from Forbes: Denver-based PC manufacturer and Pop!_OS Linux developer System76 plans to follow up its custom Thelio desktop PC with an in-house laptop beginning next year, according to founder and CEO Carl Richell. During a recent interview, Richell was quick to emphasize that the entire process of designing, prototyping and iterating the final product could take two to three years. But the company is eager to break into this market and put the same signature "stamp" on its laptop hardware that graces its custom-built Thelio desktop.

System76 sells an extensive lineup of laptops, but the machines are designed by the likes of Sager and Clevo. The company doesn't merely buy a chassis and slap Pop!_OS on it, but Richell tells me he's confident that with the experience gained from developing Thelio -- and the recent investment into a factory at the company's Denver headquarters -- System76 is capable of building a laptop from the ground up that meets market needs and carries a unique value proposition. Richell says the company's first priority is locking down the aesthetic of the laptop and how various materials look and feel. It will simultaneously begin working on the supply chain aspects and speaking with various display and component manufacturers. System76 will design and build a U-class laptop first (basically an Ultrabook form factor like the existing Darter and Galago) and then evaluate what it might do with higher-end gaming and workstation notebooks with dedicated graphics.

Intel

Intel Unveils 7nm Ponte Vecchio GPU Architecture For Supercomputers and AI (hothardware.com) 28

MojoKid writes: Intel has unveiled its first discrete GPU solution that will hit the market in 2020, code-named Ponte Vecchio. Based on 7nm silicon manufacturing and a stacked chiplet design using Intel's Foveros tech, Ponte Vecchio will target HPC markets for supercomputers and AI training in the datacenter. According to HotHardware, Ponte Vecchio will employ a combination of both its Foveros 3D packaging and EMIB (Embedded Multi-die Interconnect Bridge) technologies, along with High Bandwidth Memory (HBM) and Compute Express Link (CXL), which will operate over the newly ratified PCIe 5.0 interface and serve as Ponte Vecchio's high-speed switch fabric connecting all GPU resources. Intel is billing Ponte Vecchio as its first exascale GPU, proving its mettle in the U.S. Department of Energy's (DOE) Aurora supercomputer. Aurora will employ a topology of six Ponte Vecchio GPUs and two Intel Xeon Scalable processors based on Intel's next generation Sapphire Rapids architecture, along with Optane DC Persistent Memory, on a single blade. The new supercomputer is scheduled to arrive sometime in 2021.
Intel

Intel To Remove Old Drivers and BIOS Updates From Its Site (zdnet.com) 130

By Friday this week, Intel plans to remove old drivers and BIOS updates from its official website. From a report: "This download, BIOS Update [BLH6710H.86A] 0163, will no longer be available after November 22, 2019 and will not be supported with any additional functional, security, or other updates," reads a message posted to the download page of one of the impacted components. "Intel recommends that users of BIOS Update [BLH6710H.86A] 0163 uninstall and/or discontinue use as soon as possible," the message continues. The downloads are drivers and BIOS updates for Intel desktop components and motherboards the company released in the '90s and early-to-mid 2000s. Downloads for hundreds of components are believed to have been impacted, from motherboards to NICs and graphics cards. Most of the drivers are for Windows versions like 98, ME, XP, and older Windows Server editions -- old Windows OS versions that have themselves reached end-of-life (EOL). All of the affected components and motherboards reached EOL years ago, and Intel stopped delivering firmware updates as a result. Its website was merely hosting the older files for convenience.
Transportation

Tesla Owners Say Autopilot Makes Them Feel Safer (bloomberg.com) 135

"Bloomberg has conducted a survey of Tesla Model 3 owners," writes Slashdot reader Thelasko. "Some of the most interesting data are responses to questions about Autopilot." Here's an excerpt from the report: We asked 5,000 Model 3 owners about their experience with the electric sedan that Tesla Chief Executive Officer Elon Musk says will lead the world into a new era of driverless transportation. [...] Six drivers claimed that Autopilot actually contributed to a collision, while nine people in the Bloomberg survey went so far as to credit the system with saving their lives. Hundreds of owners recalled dangerous behaviors, such as phantom braking, veering or failing to stop for a road hazard. But even those who reported shortcomings gave Autopilot high overall ratings. More than 90% of owners said driving with Autopilot makes them safer -- including most of the respondents who simultaneously faulted the software for creating dangerous situations. Bloomberg also asked Model 3 owners about the quality and reliability of their vehicles, as well as the service and charging.
Graphics

Adobe and Twitter Are Designing a System For Permanently Attaching Artists' Names To Pictures (theverge.com) 62

Adobe, Twitter, and The New York Times Company have announced a new system for adding attribution to photos and other content. A tool will record who created a piece of content and whether it's been modified by someone else, then let other people and platforms check that data. The Verge reports: The overall project is called the Content Authenticity Initiative, and its participants will hold a summit on the system in the next few months. Based on what Adobe has announced, the attribution tool is a piece of metadata that can be attached to a file. Adobe doesn't describe precisely how it will keep the tag secure or prevent someone from copying the content in a way that strips it out. Adobe chief product officer Scott Belsky said that some technical details will be worked out at the summit. Adobe described this system as a way to verify "authenticity" online. And The New York Times Company's research and development head, Marc Lavallee, suggested it could fight misinformation by helping people discern "trusted news" on digital media platforms.

But the most obvious uses include identifying a photo's source and making sure artists get credit for their work. Many photos and webcomics circulate anonymously on platforms like Twitter, and an attribution tag would help trace those images back to their creator. This depends entirely on how well the CAI system works, however. Tags wouldn't be very useful if they could be easily altered or removed, but if the system preserves security by tightly controlling how people can interact with the image, it could have the same downsides as other digital rights management (DRM) systems.
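Adobe has not published how the CAI tag will actually be secured, so the following is only a guess at the general shape of such a scheme: bind the creator's name to a hash of the image bytes and sign the result, so that editing the pixels invalidates the tag. The function names, the demo key, and the use of an HMAC (a real provenance system would use public-key signatures and certified identities) are all illustrative assumptions.

```python
# Sketch of one possible tamper-evident attribution tag (an assumption for
# illustration -- not the CAI's published mechanism).
import hashlib, hmac, json

def make_attribution_tag(image_bytes: bytes, creator: str, key: bytes) -> dict:
    manifest = {"creator": creator,
                "image_sha256": hashlib.sha256(image_bytes).hexdigest()}
    payload = json.dumps(manifest, sort_keys=True).encode()
    manifest["signature"] = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return manifest

def verify_attribution_tag(image_bytes: bytes, manifest: dict, key: bytes) -> bool:
    claimed = dict(manifest)
    signature = claimed.pop("signature")
    payload = json.dumps(claimed, sort_keys=True).encode()
    ok_sig = hmac.compare_digest(
        signature, hmac.new(key, payload, hashlib.sha256).hexdigest())
    ok_img = claimed["image_sha256"] == hashlib.sha256(image_bytes).hexdigest()
    return ok_sig and ok_img

key = b"demo-key"                      # placeholder; a real system would use PKI
photo = b"...raw image bytes..."
tag = make_attribution_tag(photo, "Jane Photographer", key)
assert verify_attribution_tag(photo, tag, key)
assert not verify_attribution_tag(photo + b"edited", tag, key)  # edits break the tag
```

The hard problems the summit will have to address are exactly the ones this sketch glosses over: who holds the keys, how the tag survives re-encoding and platform processing, and how to stop someone from simply stripping it off, which is where the DRM-like downsides come in.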

Graphics

Helvetica's Evil Twin, Hellvetica, Will Haunt Your Nightmares (fastcompany.com) 47

Freshly Exhumed shares a report from Fast Company: Hold your favorite graphic design tome close. We now know what the classic typeface Helvetica would look like if it came from the underworld. Yes, it will keep type enthusiasts up at night. The design darling Helvetica -- that ubiquitous sans-serif typeface developed by Max Miedinger in 1957, representative of the crisp Swiss design aesthetic of that period, and star of its own documentary by the same name -- has made a deal with the kerning devil. The results aren't pretty. They're not meant to be.

Zack Roif and Matthew Woodward, both associate creative directors at the international advertising agency R/GA, have released a new typeface, Hellvetica, available free to download, and it will make all your worst kerning nightmares come true. While each character has the same form as the classic typeface it's riffing on, Hellvetica uses inconsistent, variable spacing between letterforms to give the overall effect that something has gone terribly astray. Nope, that wasn't a mistake. You might just say it was intentionally erroneous.
The project is a study in playfulness and rule-breaking, "an exercise in going against the 'designer instincts' to fix up that awful kerning. Hundred percent break the rules," says Woodward. "Don't listen to your gut. Forget your training... and make that logo kern in hell!"
Intel

Intel Launches Core i9-9900KS 8-Core CPU At 5GHz Across All Cores (hothardware.com) 89

MojoKid writes: As the "S" in its name implies, the new Intel Core i9-9900KS that launched today is something akin to a Special Edition version of the company's existing Core i9-9900K 8-core CPU. The processors are built from the same slab of silicon -- an 8-core, Coffee Lake-refresh based die and packaged up for Intel's LGA1151 socket. What makes the Core i9-9900KS different from its predecessor are its base and turbo boost clocks, which are rated for 4GHz and 5GHz across all cores, respectively, with enhanced binning of the chips to meet its performance criteria. The Core i9-9900KS is arguably the fastest processor available right now for single and lightly-threaded workloads, and offers the highest performance in gaming and graphics tests. In more heavily-threaded workloads that can leverage all of the additional processing resources available in a 12-core CPU like the Ryzen 9 3900X, however, the 8-core Intel Core i9-9900KS doesn't fare as well. It did catch AMD's 12-core Threadripper 2920X, which is based on the previous-gen Zen+ architecture, on a couple of occasions, however. Intel's new Core i9-9900KS desktop processor is available starting today at $513 MSRP.
Graphics

Nvidia Launches GeForce GTX 1660 Super For Faster 1080p Gaming (hothardware.com) 14

MojoKid writes: NVIDIA is expanding its GeForce GTX family of GPUs today with a pair of new Turing-based graphics cards, the GeForce GTX 1660 Super and GeForce GTX 1650 Super. As their model numbers suggest, these new cards reside above their "non-Super" branded counterparts in NVIDIA's line-up, but a notch below the GeForce GTX Ti variants. The GeForce GTX 1660 Super is something of a cross between the standard GTX 1660 and the GTX 1660 Ti: it has a similar number of CUDA cores and texture units to the vanilla GTX 1660, but with faster 14 Gbps GDDR6 memory, and it also has higher GPU and memory clocks than the GeForce GTX 1660 Ti. In its price range, the GeForce GTX 1660 Super appears to be a solid value. Its gaming performance is strong, especially at 1080p resolution, and generally faster than AMD's Radeon RX 590. The cards are also power-efficient, cool and quiet, and the GPUs are significantly overclockable as well. GeForce GTX 1660 Super cards should be available at retail today for around $230, with the GTX 1650 Super on its way late next month.
Android

Nvidia Revamps Shield Android TV Streamer, Introduces New $150 Model (variety.com) 47

Nvidia today introduced a revamped version of its Android TV streaming device, complete with a faster processor and a new remote control. From a report: In addition to a $200 Pro model, Nvidia is also introducing a new $150 version with a home theater-friendly form factor that is meant to cater to a mainstream audience interested in great picture quality and sound. Nvidia Shield has long been a favorite with Android enthusiasts looking for the most advanced streaming hardware. First introduced in 2015, the device doubles as a console for both local and cloud-based video games, and thanks to a powerful processor, it's capable of running a whole bunch of software that wouldn't work on your average Roku or Fire TV streamer. A Shield can be turned into a DVR to record over-the-air television, a hub to control smart home devices and even a Plex server to manage expansive media collections.

The flip side of this has always been the price: The 2017 version of the device started at $180. You could buy 3 Rokus for the same amount, and still have money left over. Still, for its 2019 revamp, the company decided to stay the course, and still aim for the upper end of the market. "We don't look to do easy, cheap products," said Nvidia Shield director of product management Chris Daniel during a recent interview with Variety. Nvidia's goal was to use its expertise in graphics and AI to push the limits of what a streaming device can do, Daniel said. To stay true to that mission, Nvidia based its new generation of Shield streamers on its Tegra X1+ processor, which promises to be 25% faster than the processor used in previous-generation Shield devices.

Earth

Facing Unbearable Heat, Qatar Has Begun To Air-Condition the Outdoors (washingtonpost.com) 183

It was 116 degrees Fahrenheit in the shade outside the new Al Janoub soccer stadium, and the air felt to air-conditioning expert Saud Ghani as if God had pointed "a giant hair dryer" at Qatar. From a report: Yet inside the open-air stadium, a cool breeze was blowing. Beneath each of the 40,000 seats, small grates adorned with Arabic-style patterns were pushing out cool air at ankle level. And since cool air sinks, waves of it rolled gently down to the grassy playing field. Vents the size of soccer balls fed more cold air onto the field. Ghani, an engineering professor at Qatar University, designed the system at Al Janoub, one of eight stadiums that the tiny but fabulously rich Qatar must get in shape for the 2022 World Cup. His breakthrough realization was that he had to cool only people, not the upper reaches of the stadium -- a graceful structure designed by the famed Zaha Hadid Architects and inspired by traditional boats known as dhows. "I don't need to cool the birds," Ghani said.

Qatar, the world's leading exporter of liquefied natural gas, may be able to cool its stadiums, but it cannot cool the entire country. Fears that the hundreds of thousands of soccer fans might wilt or even die while shuttling between stadiums and metros and hotels in the unforgiving summer heat prompted the decision to delay the World Cup by five months. It is now scheduled for November, during Qatar's milder winter. The change in the World Cup date is a symptom of a larger problem -- climate change. Already one of the hottest places on Earth, Qatar has seen average temperatures rise more than 2 degrees Celsius (3.6 F) above preindustrial times, the current international goal for limiting the damage of global warming. The 2015 Paris climate summit said it would be better to keep temperatures "well below" that, ideally to no more than 1.5 degrees Celsius (2.7 F).

[...] To survive the summer heat, Qatar not only air-conditions its soccer stadiums, but also the outdoors -- in markets, along sidewalks, even at outdoor malls so people can window shop with a cool breeze. "If you turn off air conditioners, it will be unbearable. You cannot function effectively," says Yousef al-Horr, founder of the Gulf Organization for Research and Development. Yet outdoor air conditioning is part of a vicious cycle. Carbon emissions create global warming, which creates the desire for air conditioning, which creates the need for burning fuels that emit more carbon dioxide. In Qatar, total cooling capacity is expected to nearly double from 2016 to 2030, according to the International District Cooling & Heating Conference. And it's going to get hotter.
