Google

'Google is Getting Left Behind Due To Horrible UI/UX' (danielmiessler.com) 269

Daniel Miessler, a widely respected infosec professional in San Francisco, writes about design and user experience choices Google has made across its services in recent years: I've been writing for probably a decade about how bad Google's GUI is for Google Analytics, Google Apps, and countless of their other properties -- not to mention their multiple social media network attempts, like Google+ and Wave. Back then it was super annoying, but kind of ok. They're a hardcore engineering group, and their backend services are without equal. But lately it's just becoming too much.

1. Even Gmail is a cesspool at this point. Nobody would ever design a webmail interface like that, starting from scratch.
2. What happened to Google Docs? Why does it not look and behave more like Notion, or Quip, or any of the other alternatives that made progress in the last 5-10 years?
3. What college course do I take to manage a Google Analytics property?
4. Google just rolled out Google Analytics 4 -- I think -- and the internet is full of people asking the same question I am. "Is this a real rollout?"

[...] My questions are simple:
1. How the hell is this possible? I get it 10 years ago. But then they came out with the new design language. Materialize, or whatever it was. Cool story, and cool visuals. But it's not about the graphics, it's about the experience.
2. How can you be sitting on billions of dollars and be unable to hire product managers that can create usable interfaces?
3. How can you run Gmail on an interface that's tangibly worse than anything else out there?
4. How can you let Google Docs get completely obsoleted by startups?

I've heard people say that Google has become the new Microsoft, or the new Oracle, but damn -- at least Microsoft is innovating. At least Oracle has a sailing team, or whatever else they do. I'm being emotional at this point.

Google, you are made out of money. Fix your fucking interfaces. Focus on the experience. Focus on simplicity. And use navigation language that's similar across your various properties, so that I'll know what to do whether I'm managing my Apps account, or my domains, or my Analytics. You guys are awesome at so many things. Make the commitment to fix how we interact with them.

GUI

Creator of DirectX Dies at Age 55 (livemint.com) 94

The Wall Street Journal looks back to the days when Windows was "a loser in the world of computer games." But to change that, Eric Engstrom and his cohorts "secretly hired programmers to get the work done, and they had to do an end run around partners like Intel," remembers VentureBeat.

Long-time Slashdot reader whh3 shares The Wall Street Journal's report: Windows inserted itself between game programs and the computer hardware in a way that slowed down graphics and animation. Game developers vastly preferred the DOS operating system, which didn't gum up their special effects. That created an opportunity for three Microsoft misfits — Eric Engstrom, Alex St. John and Craig Eisler. Mr. Engstrom, who died Dec. 1 at the age of 55, and his pals formed one of several factions within Microsoft trying to solve the game problem. Openly contemptuous of colleagues who didn't share their ideas, they were so obnoxious that Brad Silverberg, who ran the Windows business, dubbed them the Beastie Boys. He had to fend off frequent demands for their dismissal.

Yet the solution they developed, DirectX, beat anything else on offer inside Microsoft. DirectX software recognized games and allowed them direct access to the computer's graphical capabilities, allowing a richer game experience than DOS could. "It was brilliant," Mr. Silverberg said. Launched in 1995, DirectX wowed game developers and led to a flood of new games for computers loaded with Windows. That success emboldened Microsoft to plunge deeper into the lucrative gaming market by developing the Xbox console.

Microsoft's game business produced $11.6 billion of revenue in the year ended June 30...

"He thought things were possible that nobody else on the planet thought would be possible," said Ben G. Wolff, a friend who runs a robotics company, "and sometimes he'd be right."

"DirectX remains the foundation for many games on Windows 10 and the Xbox Series X," writes Engadget, "and it's likely to remain relevant for years to come."

And VentureBeat shared this remark from Alex St. John at a memorial service for Engstrom. "He had huge dreams and huge fantasies, and he always took us all with him."
Games

Do Games Made Under Crunch Conditions Deserve 'Best Direction' Awards? (kotaku.com) 146

The annual Game Awards ceremony gave this year's "Best Direction" award to Naughty Dog's The Last of Us Part II, provoking a strong reaction from Kotaku's staff writer.

"I think it's pretty obvious that no game that required its developers to crunch, like The Last of Us Part II did, should be given a Best Direction award." It's no secret that Naughty Dog subjected its workers to unbelievable levels of crunch to get The Last of Us Part II out the door, but that's hardly an innovation when it comes to Naughty Dog or game development in general. Over the years, the studio has seen constant employee turnover as developers crunch on games like The Last of Us and Uncharted, burn out, and throw in the towel. Relentless overtime, missed weekends, long stretches of time without seeing your family — these things take a toll on even the most passionate artist.

"This can't be something that's continuing over and over for each game, because it is unsustainable," one The Last of Us Part II developer told Kotaku earlier this year. "At a certain point you realize, 'I can't keep doing this. I'm getting older. I can't stay and work all night.'"

Let's be clear: the existence of crunch indicates a failure in leadership. It's up to game directors and producers to ensure workloads are being managed properly and goals are being met. If workers are being forced to crunch, explicitly or otherwise, it means the managers themselves have fallen short somewhere, either in straining the limits of their existing staff, fostering an environment where overtime is an implied (if unspoken) requirement, or both. And as ambitious as The Last of Us Part II director Neil Druckmann and his projects may be, "questionable experiments in the realm of pushing human limits" are not required to make a great game...

I feel like the industry, now more than ever, is willing to discuss the dangers of crunch culture and solutions to eradicate it. But lavishing praise on the way The Last of Us Part II was directed feels like a tacit endorsement of crunch and only serves to push that conversation to the backburner again. A popular online statement, first coined by Fanbyte podcast producer Jordan Mallory, says, "I want shorter games with worse graphics made by people who are paid more to work less and I'm not kidding." The message from all those who share it is clear: No game, not even industry darling The Last of Us Part II, is worth destroying lives to create.

Graphics

NVIDIA Apologizes, 'Walks Back' Threat to Withhold GPUs From Reviewer (techspot.com) 111

This week NVIDIA threatened to stop providing GeForce Founders Edition review units to reviewer Steven Walton, who runs the YouTube channel Hardware Unboxed (and is also an editor/reviewer at TechSpot). NVIDIA had complained "your GPU reviews and recommendations have continued to focus singularly on rasterization performance, and you have largely discounted all of the other technologies we offer gamers. It is very clear from your community commentary that you do not see things the same way that we, gamers, and the rest of the industry do."

NVIDIA's email to Walton had said that henceforward their review products would instead be allocated to other media outlets "that recognize the changing landscape of gaming and the features that are important to gamers and anyone buying a GPU today, be it for gaming, content creation, or studio and stream."

But TechSpot reports tonight that "Less than 48 hours later, Steve received the good news. Nvidia apologized and walked everything back." Great news indeed, but let's be clear: this wouldn't have happened if not for the support of the community at large and key people in the tech space who have such an enormous influence that it was too much for Nvidia to ignore. Linus from LinusTechTips (his angry rant on the WAN Show embedded above is pure gold) and Steve from Gamers Nexus were two of those people.
And unfortunately, by then TechSpot had already composed a scathing takedown of NVIDIA's email: As a corporation, it's Nvidia's prerogative to decide on the reviewers it chooses to collaborate with. However, this and other related incidents raise serious questions around journalistic independence and what they are expecting of reviewers when they are sent products for an unbiased opinion...

In today's dynamic graphics hardware space, with 350W flagships, hardware ray tracing, and exotic cooling solutions, there's a wide range of data points Hardware Unboxed looks at. But at the end of the day, there's only one real question every GPU buyer wants to know: how well do games run on a particular piece of hardware? Considering that 99 percent of Steam games feature raster-only rendering pipelines, rasterization performance was, is, and will be, a key point that Steve considers in GPU reviews...

[M]ost games (including almost all RTX titles) are built on raster renderers. A hypothetical graphics card with most of its die space reserved for ray tracing would run Quake II RTX great and... not much else. Ray tracing absolutely deserves a place in modern GPU reviews. But there's simply not enough of it in enough games for any responsible reviewer to put it center-stage, in place of raster performance. It wouldn't do justice to consumers, who will primarily be running raster workloads. This is why Nvidia's complaint is so puzzling.

Games

Inside the Obsessive World of Miniature Arcade Machine Makers (wired.co.uk) 25

The success of Nintendo's diminutive gadget led to a flurry of copycats, from a tiny Commodore 64 to a miniaturised Sony PlayStation. Some were good; many were flawed, with the play experience only being surface deep. Fortunately, some companies wanted to go further than fashioning yet another miniature plug-and-play TV console. From a report: One, the ZX Spectrum Next, brought into being a machine from an alternate universe in which Sinclair was never sold to Amstrad and instead built a computer to take on the might of the Amiga and Atari ST. Two other companies headed further back into gaming's past and set themselves an equally ambitious challenge: recreating the exciting, noisy, visually arresting classic cabinets you once found in arcades. "I always saw them as more than just a game, with their unique shapes, art, sounds and lights acting together to lure money from your pocket," explains Matt Precious, managing partner at Quarter Arcades creators Numskull Designs. "I was disappointed you couldn't purchase models of these machines during a time when physical items like LPs were booming in an increasingly sterile world of digital downloads."

Quarter Arcades was subsequently born as a project "trying to capture a piece of gaming history" in quarter-scale cabinets. The machines are in exact scale, including the controls, and play the original arcade ROMs. But look closer and there's an obsessive level of detail: the rough texture of the control panel art; mimicking an original cab's acoustics by careful speaker positioning; recreating the Space Invaders 'Pepper's ghost' illusion effect where graphics 'float' above an illuminated backdrop -- all realised by dismantling and reverse-engineering original cabs.

AI

'Cyberpunk 2077' Finally Shows What DLSS Is Good For (vice.com) 69

An anonymous reader shares a report: More recent Nvidia graphics cards have a proprietary feature called Deep Learning Super Sampling (DLSS), and while it's often been touted as a powerful new rendering tool, the results have sometimes been underwhelming. Some of this is down to the oddly mixed messaging around how DLSS was rolled out: it only works on more recent Nvidia cards that are still near the cutting edge of PC graphics hardware... but DLSS is designed to render images at lower resolutions and display them as if they were rendered natively at a higher resolution. If you had just gotten a new Nvidia card and were excited to see what kind of framerates and detail levels it could sustain, what DLSS actually did sounded counterintuitive. Even games like Control, whose support of DLSS was especially praised, left me scratching my head about why I would want to use the feature. On my 4K TV, Control looked and ran identically well with and without DLSS, so why wouldn't I just max out my native graphics settings rather than use a fancy upscaler? Intellectually, I understood that DLSS could produce similarly great-looking images without taxing my hardware as much, but I neither fully believed it nor had I seen a game where the performance gain was meaningful.

Cyberpunk 2077 converted me. DLSS is a miracle, and without it there's probably no way I would ever have been happy with my graphics settings or the game's performance. I have a pretty powerful video card, an RTX 2080 TI, but my CPU is an old i5 overclocked to about 3.9 GHz and it's a definite bottleneck on a lot of games. Without DLSS, Cyberpunk 2077 was very hard to get running smoothly. The busiest street scenes would look fine if I were in a static position, but a quick pan with my mouse would cause the whole world to stutter. If I was walking around Night City, I would get routine slow-downs. Likewise, sneaking around and picking off guards during encounters was all well and good but the minute the bullets started flying, with grenades exploding everywhere and positions changing rapidly, my framerate would crater to the point where the game verged on unplayable. To handle these peaks of activity, I had to lower my detail settings way below what I wanted, and what my hardware could support for about 80 percent of my time with the game. Without DLSS, I never found a balance I was totally happy with. The game neither looked particularly great, nor did it run very well. DLSS basically solved this problem for me. With it active, I could run Cyberpunk at max settings, with stable framerates in all but the busiest scenes.
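For readers wondering why rendering at a lower internal resolution helps so much, here is a rough back-of-the-envelope sketch in Python. The per-mode scale factors are commonly cited figures for DLSS quality presets, not numbers from the article, so treat them as assumptions:

# Why DLSS-style upscaling reduces GPU load: the game shades far fewer pixels
# internally, then an upscaler reconstructs the output resolution.
# The per-mode scale factors below are commonly cited figures, not official specs.
OUTPUT = (3840, 2160)  # 4K output, as in the Cyberpunk 2077 example above

DLSS_SCALE = {          # assumed fraction of the output resolution on each axis
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 1 / 2,
    "Ultra Performance": 1 / 3,
}

for mode, s in DLSS_SCALE.items():
    w, h = int(OUTPUT[0] * s), int(OUTPUT[1] * s)
    saved = 1 - (w * h) / (OUTPUT[0] * OUTPUT[1])
    print(f"{mode:17s} renders {w}x{h} internally, ~{saved:.0%} fewer pixels shaded")

At 4K, the "Quality" preset shades roughly 56 percent fewer pixels per frame, which is where the framerate headroom described above comes from.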

PlayStation (Games)

Is Sony Developing a Dual-GPU PS5 Pro? (collider.com) 60

According to a Sony patent spotted by T3, the console maker may be working on a new PlayStation 5 with two graphics cards. From the report: The patent describes a "scalable game console" where "a second GPU [is] communicatively coupled to the first GPU" and that the system is for "home console and cloud gaming" usage. To us here at T3 that suggests a next-gen PlayStation console, most likely a new PS5 Pro flagship, supercharged with two graphics cards instead of just one. These would both come in the APU (accelerated processing unit) format of the PlayStation 5's system-on-a-chip (SoC), with two custom-made AMD APUs working together to deliver enhanced gaming performance and cloud streaming.

The official Sony patent notes that "plural SoCs may be used to provide a 'high-end' version of the console with greater processing and storage capability," while "the 'high end' system can also contain more memory such as random-access memory (RAM) and other features and may also be used for a cloud-optimized version using the same game console chip with more performance." And, with the PlayStation 5 console only marginally weaker on paper than the Xbox Series X (the PS5 delivers 10.28 teraflops compared to the Xbox Series X's 12 teraflops), a new PS5 Pro console that comes with two APUs rather than one, improving local gaming performance as well as cloud gaming, would no doubt be a death blow to the Xbox Series X's reign as king of the next-gen consoles.

The cloud gaming part of the patent is particularly interesting, too, as it seems to suggest that this technology could not just find itself in a new flagship PS5 Pro console, but also in more streamlined cloud-based hardware. An upgraded PS5 Digital Edition seems a smart bet, as does the much-rumored PSP 5G. [...] Will we see a PS5 Pro anytime soon? Here at T3 we think absolutely not -- we imagine we'll get at least two straight years of PS5 before we see anything at all. As for a cloud-based next-gen PSP 5G, though...
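For context on the teraflop figures quoted above, here is a rough sketch of how those numbers fall out of compute-unit counts and clocks on RDNA 2 GPUs. The CU counts and boost clocks are the publicly reported console specs, and the 64-ALU-per-CU figure is a standard RDNA assumption rather than anything from the patent:

# Back-of-the-envelope check of the quoted teraflop figures.
# Peak FP32 = compute units x 64 shader ALUs x 2 ops per clock (fused multiply-add) x clock.
def rdna2_tflops(compute_units, boost_clock_ghz, alus_per_cu=64, ops_per_clock=2):
    return compute_units * alus_per_cu * ops_per_clock * boost_clock_ghz / 1000

print(f"PS5 (36 CUs @ 2.23GHz):            ~{rdna2_tflops(36, 2.23):.2f} TFLOPS")   # ~10.28
print(f"Xbox Series X (52 CUs @ 1.825GHz): ~{rdna2_tflops(52, 1.825):.2f} TFLOPS")  # ~12.15
# A hypothetical dual-APU PS5 Pro would not simply double that figure in practice;
# splitting a frame across two GPUs adds scheduling and memory-coherency overhead.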

Entertainment

'Code Switch' From NPR Is Apple's Podcast of the Year (engadget.com) 48

Apple has picked "Code Switch" as the best audio show of the year, marking the first time the company has recognized a single podcast in this way. Engadget reports: Code Switch is NPR's weekly discussion on race. While the series has been on the air for the better part of seven years, it became significantly more popular over the summer as people across the US took to the streets to protest the death of George Floyd and other instances of racial injustice.

As in past years, the company also shared a selection of the most popular audio shows people listened to through Apple Podcasts. Few surprises here as old favorites like Stuff You Should Know, This American Life and The Daily came out as the most popular shows in the US. When it comes to shows new to 2020, Unlocking Us, Nice White Parents and CounterClock made the top three for the year. Apple's editorial team had their say as well. They picked California Love, Canary by the Washington Post and Dying for Sex as their favorites of 2020. If you're looking for something new to listen to, all three lists are a good place to start.

Hardware

NVIDIA Launches GeForce RTX 3060 Ti, Sets a New Gaming Performance Bar At $399 (hothardware.com) 70

MojoKid writes: NVIDIA expanded its line-up of Ampere-based graphics cards today with a new, lower-cost GeForce RTX 3060 Ti. As its name suggests, the new $399 NVIDIA GPU supplants the previous-gen GeForce RTX 2060 / RTX 2060 Super, and slots in just behind the recently-released GeForce RTX 3070. The GeForce RTX 3060 Ti features 38 SMs with 128 CUDA cores each (4,864 in total), 4 third-gen Tensor cores per SM (152 total), and 38 second-gen RT cores. The GPU has a typical boost clock of 1,665MHz and is linked to 8GB of standard GDDR6 memory (not the GDDR6X of the RTX 3080/3090) via a 256-bit memory interface that offers up to 448GB/s of peak bandwidth. In terms of overall performance, the RTX 3060 Ti lands in the neighborhood of the GeForce RTX 2080 Super, and well ahead of cards like AMD's Radeon RX 5700 XT. The GeForce RTX 3060 Ti's 8GB frame buffer may give some users pause, but for 1080p and 1440p gaming, it shouldn't be a problem for the overwhelming majority of titles. It's also par for the course in this $399 price band. Cards are reported to be shipping at retail tomorrow.
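As a quick sanity check, the core totals and memory bandwidth quoted above hang together; the 14Gbps GDDR6 data rate in the sketch below is an assumption implied by the quoted 448GB/s figure rather than something stated in the submission:

# Deriving the RTX 3060 Ti figures quoted above.
sms = 38                      # streaming multiprocessors
cuda_cores = sms * 128        # 128 FP32 cores per SM
tensor_cores = sms * 4        # 4 third-gen Tensor cores per SM
bus_width_bits = 256
gddr6_gbps = 14               # assumed per-pin data rate implied by the 448GB/s figure
bandwidth_gb_s = bus_width_bits // 8 * gddr6_gbps

print(f"CUDA cores: {cuda_cores}, Tensor cores: {tensor_cores}, "
      f"peak bandwidth: {bandwidth_gb_s} GB/s")   # 4864, 152, 448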
Graphics

Cerebras' Wafer-Size Chip Is 10,000 Times Faster Than a GPU (venturebeat.com) 123

An anonymous reader quotes a report from VentureBeat: Cerebras Systems and the federal Department of Energy's National Energy Technology Laboratory today announced that the company's CS-1 system is more than 10,000 times faster than a graphics processing unit (GPU). On a practical level, this means AI neural networks that previously took months to train can now train in minutes on the Cerebras system.

Cerebras makes the world's largest computer chip, the WSE. Chipmakers normally slice a wafer from a 12-inch-diameter ingot of silicon to process in a chip factory. Once processed, the wafer is sliced into hundreds of separate chips that can be used in electronic hardware. But Cerebras, started by SeaMicro founder Andrew Feldman, takes that wafer and makes a single, massive chip out of it. Each piece of the chip, dubbed a core, is interconnected in a sophisticated way to other cores. The interconnections are designed to keep all the cores functioning at high speeds so the transistors can work together as one. [...] A single Cerebras CS-1 is 26 inches tall, fits in one-third of a rack, and is powered by the industry's only wafer-scale processing engine, Cerebras' WSE. It combines memory performance with massive bandwidth, low latency interprocessor communication, and an architecture optimized for high bandwidth computing.

Cerebras's CS-1 system uses the WSE wafer-size chip, which has 1.2 trillion transistors, the basic on-off electronic switches that are the building blocks of silicon chips. Intel's first 4004 processor in 1971 had 2,300 transistors, and the Nvidia A100 80GB chip, announced yesterday, has 54 billion transistors. Feldman said in an interview with VentureBeat that the CS-1 was also 200 times faster than the Joule Supercomputer, which is No. 82 on a list of the top 500 supercomputers in the world. [...] In this demo, the Joule Supercomputer used 16,384 cores, and the Cerebras computer was 200 times faster, according to energy lab director Brian Anderson. Cerebras costs several million dollars and uses 20 kilowatts of power.
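Putting the transistor counts quoted above side by side gives a sense of scale; this is just arithmetic on the figures in the report:

wse   = 1.2e12   # Cerebras WSE transistor count
a100  = 54e9     # Nvidia A100 80GB
i4004 = 2300     # Intel 4004 (1971)

print(f"WSE vs A100: ~{wse / a100:.0f}x the transistors")     # ~22x
print(f"WSE vs 4004: ~{wse / i4004:,.0f}x the transistors")   # ~521,739,130x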

Graphics

Radeon RX 6800 and 6800 XT Performance Marks AMD's Return To High-End Graphics (hothardware.com) 62

MojoKid writes: AMD officially launched its Radeon RX 6800 and Radeon RX 6800 XT graphics cards today, previously known in the PC gaming community as Big Navi. The company claimed these high-end GPUs would compete with NVIDIA's best GeForce RTX 30 series and it appears AMD made good on its claims. AMD's new Radeon RX 6800 XT and Radeon 6800 are based on the company's RDNA 2 GPU architecture, with the former sporting 72 Compute Units (CUs) and a 2250MHz boost clock, while the RX 6800 sports 60 CUs at a 2105MHz boost clock. Both cards come equipped with 16GB of GDDR6 memory and 128MB of on-die cache AMD calls Infinity Cache, which sits in front of the 256-bit GDDR6 memory interface and improves effective bandwidth and latency when feeding the GPU.

In the benchmarks, it is fair to say the Radeon RX 6800 is typically faster than an NVIDIA GeForce RTX 3070, just as AMD suggested. Things are not as cut and dried for the Radeon RX 6800 XT though, as the GeForce RTX 3080 and Radeon RX 6800 XT trade victories depending on the game title or workload, but the RTX 3080 has an edge overall. In DXR Ray Tracing performance, NVIDIA has a distinct advantage at the high-end. Though the Radeon RX 6800 wasn't too far behind an RTX 3070, neither the Radeon RX 6800 XT nor the 6800 came close to the GeForce RTX 3080. Pricing is set at $649 and $579 for the AMD Radeon RX 6800 XT and Radeon RX 6800, respectively, and the cards are on sale as of today. However, demand is likely to be fierce as this new crop of high-end graphics cards from both companies has been quickly evaporating from retail shelves.
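For reference, the compute-unit counts and boost clocks quoted above imply the following theoretical peak FP32 throughput. The 64-shader-per-CU figure is a standard RDNA 2 assumption, and these ceilings don't predict the benchmark results discussed:

# Peak FP32 = CUs x 64 shader ALUs x 2 ops per clock (FMA) x clock.
def rdna2_tflops(cus, boost_ghz):
    return cus * 64 * 2 * boost_ghz / 1000

print(f"Radeon RX 6800 XT: ~{rdna2_tflops(72, 2.250):.1f} TFLOPS peak")  # ~20.7
print(f"Radeon RX 6800:    ~{rdna2_tflops(60, 2.105):.1f} TFLOPS peak")  # ~16.2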

Desktops (Apple)

Apple's M1 Is Exceeding Expectations (extremetech.com) 274

Reviews are starting to pour in of Apple's MacBook Pro, MacBook Air and Mac Mini featuring the new M1 ARM-based processor -- and they're overwhelmingly positive. "As with the Air, the Pro's performance exceeds expectations," writes Nilay Patel via The Verge.

"Apple's next chapter offers strong performance gains, great battery and starts at $999," says Brian Heater via TechCrunch.

"When Apple said it would start producing Macs with its own system-on-chip processors, custom CPU and GPU silicon (and a bunch of other stuff) to replace parts from Intel and AMD, we figured it would be good. I never expected it would be this good," says Jason Cross in his review of the MacBook Air M1.

"The M1 is a serious, serious contender for one of the all-time most efficient and highest-performing architectures we've ever seen deploy," says ExtremeTech's Joel Hruska.

"Spending a few days with the 2020 Mac mini has shown me that it's a barnburner of a miniature desktop PC," writes Chris Welch via The Verge. "It outperforms most Intel Macs in several benchmarks, runs apps reliably, and offers a fantastic day-to-day experience whether you're using it for web browsing and email or for creative editing and professional work. That potential will only grow when Apple inevitably raises the RAM ceiling and (hopefully) brings back those missing USB ports..."

"Quibbling about massively parallel workloads -- which the M1 wasn't designed for -- aside, Apple has clearly broken the ice on high-performance ARM desktop and laptop designs," writes Jim Salter via Ars Technica. "Yes, you can build an ARM system that competes strongly with x86, even at very high performance levels."

"The M1-equipped MacBook Air now packs far better performance than its predecessors, rivaling at times the M1-based MacBook Pro. At $999, it's the best value among macOS laptops," concludes PCMag.

"For developers, the Apple Silicon Macs also represent the very first full-fledged Arm machines on the market that have few-to-no compromises. This is a massive boost not just for Apple, but for the larger Arm ecosystem and the growing Arm cloud-computing business," writes Andrei Frumusanu via AnandTech. "Overall, Apple hit it out of the park with the M1."

Intel

No, the New MacBook Air is Not Faster Than 98% of PC Laptops (pcworld.com) 249

Gordon Mah Ung, writing at PC World: Let me just say it outloud, OK? Apple is full of it. I'm referring to Apple's claim that its fanless, Arm-based MacBook Air is "faster than 98 percent of PC laptops." Yes, you read that correctly: Apple officials literally claimed that the new MacBook Air using Apple's custom M1 chip is faster than 98 percent of all PC laptops sold this year. Typically, when a company makes such a claim, it publishes a benchmark, a performance test or actual details on what it's basing that marketing claim on. This to prevent lawyers from launching out of missile silos across the world. Apple's website restates the claim by stating: "M1 is faster than the chips in 98 percent of PC laptops sold in the past year." The site also includes a detail note that states: "Testing conducted by Apple in October 2020 using preproduction 13-inch MacBook Pro systems with Apple M1 chip and 16GB of RAM. Performance measured using select industry-standard benchmarks. PC configurations from publicly available sales data over the last 12 months. Performance tests are conducted using specific computer systems and reflect approximate performance of MacBook Pro."

So, not only does Apple not say what tests it's basing its claims on, it doesn't even say where it sources the comparable laptops. Does that mean the new fanless MacBook Air is faster than, say, Asus' stupidly fast Ryzen 4000-based, GeForce RTX 2060-equipped Zephyrus G14? Does it mean the MacBook Air is faster than Alienware's updated Area 51M? The answer, I'm going to guess, is "no." Not at all. Is it faster than the miniLED-based MSI Creator 17? Probably not, either. And what is that "performance" claim hinged on? CPU performance? GPU performance? Performance running Windows? Is it using the same application running on both platforms? Is it experiential? Is this running Red Dead Redemption II or Call of Duty: Black Ops Cold War? Is it running CyberLink's PowerDirector? Is it running Fortnite? While I have absolutely no idea what Apple is basing its claims on, I can tell you that I am 98 percent sure that any of the laptops listed above will wreck the MacBook Air doing any of the tasks I just named.

When Apple makes its claims, my guess is they are comparing the new M1 to Intel-based processors ranging from Atom to Celeron N to Core i3 and up, all with integrated graphics. But by not defining the word "performance," all this becomes just pure marketing spin. And is it really fair to compare a $999 MacBook to one that costs $150? Because $150 PCs are included in the 98 percent of laptops sold. Maybe Apple should compare its own $150 MacBook Air against a $150 Chromebook or Windows-based laptop. Of course, that would mean Apple would have to sell a product that most people can afford. I have no doubt the M1 will be impressive, but do I think it's going to compare to 8-cores of Ryzen 4000 performance or a GeForce RTX 2060? No.

Graphics

Apple's New M1 Macs Won't Work With External GPUs (engadget.com) 103

Today, Apple showed off the first Macs powered by its new M1 CPU, delivering impressive performance and excellent battery life; however, they don't come without compromises. According to Engadget, citing Paul Gerhardt's tweet, "tech spec pages for the new machines reveal that none of them are compatible with external GPUs that connect via Thunderbolt." From the report: Only some people would require add-on oomph in any case, but Apple's support for external graphics cards gave it some extra gaming cachet and informed creative professionals that their needs would continue to be met. Now, they'll have to wait and see if things change for higher-end models as Apple Silicon spreads throughout the company's PC lineup.

There's also been some focus on the fact that the 13-inch MacBook Pro M1 models only include two USB-C ports onboard instead of four, but whether or not you think that's enough ports, it's consistent with the cheaper Intel models it replaces. A more striking limitation is the one we've already noted, that the MBP is limited to 16GB of RAM -- if you think you'll need 32GB then you'll have to opt for an Intel-powered model.

Crime

Microsoft Engineer Gets Nine Years For Stealing $10 Million From Microsoft (arstechnica.com) 41

A former Microsoft software engineer from Ukraine has been sentenced to nine years in prison for stealing more than $10 million in store credit from Microsoft's online store. Ars Technica reports: From 2016 to 2018, Volodymyr Kvashuk worked for Microsoft as a tester, placing mock online orders to make sure everything was working smoothly. The software automatically prevented shipment of physical products to testers like Kvashuk. But in a crucial oversight, it didn't block the purchase of virtual gift cards. So the 26-year-old Kvashuk discovered that he could use his test account to buy real store credit and then use the credit to buy real products.

At first, Kvashuk bought an Office subscription and a couple of graphics cards. But when no one objected to those small purchases, he grew much bolder. In late 2017 and early 2018, he stole millions of dollars worth of Microsoft store credit and resold it online for bitcoin, which he then cashed out using Coinbase. US prosecutors say he netted at least $2.8 million, which he used to buy a $160,000 Tesla and a $1.6 million waterfront home (his proceeds were less than the value of the stolen credit because he had to sell at a steep discount).

Kvashuk made little effort to cover his tracks for his earliest purchases. But as his thefts got bigger, he took more precautions. For his later thefts, he used test accounts that had been created by colleagues. This was easy to do because the testers kept track of test account credentials in a shared online document. He used throwaway email addresses and began using a virtual private networking service. Before cashing out the bitcoins, he sent them to a mixing service in an attempt to hide their origins. Kvashuk reported the bitcoin windfall to the IRS but claimed the bitcoins had been a gift from his father.

Portables (Apple)

Apple Unveils New M1 Apple Silicon-powered MacBook Air, Mac Mini, and MacBook Pro (zdnet.com) 112

Apple announced three Macs today that are powered by the company's new M1 chip. They are: MacBook Air: The first Mac that will be powered by the M1 chip is the MacBook Air. According to Apple, the new Air is 3.5x faster, with up to 5x the graphics performance of the previous generation, thanks to the M1 processor. The new MacBook Air doesn't have a fan, so it'll be completely quiet at all times. It has up to 18 hours of total battery life when watching videos or 15 hours when browsing the web. You can get it with up to 2TB of storage and 16GB of memory, with the price still starting at $999.

Mac Mini: Additionally, Apple will release an Apple Silicon-powered Mac Mini. It's the same design Apple used for the DTK, but with the M1 processor. The new Mac Mini starts at $699, a drop in the price of $100, and supports up to a 6K display via USB-C Thunderbolt ports with USB-4 support.

MacBook Pro: Lastly, Apple is updating the 13-inch MacBook Pro with the M1 chip. Again, Apple touted performance gains, citing 2.8x CPU and 5x GPU improvements thanks to the M1. It keeps its cooling system but now gets 17 hours of battery life when browsing the web, or 20 hours when watching video. Apple kept the MacBook Pro's starting price at $1,299.

Apple

Apple Introduces M1 Chip To Power Its New Arm-Based Macs (theverge.com) 155

Apple has introduced the new M1 chip that will power its new generation of Arm-based Macs. It's a 5nm processor, just like the A14 Bionic powering its latest iPhones. From a report: Apple says the new processor will focus on combining power efficiency with performance. It has an eight-core CPU, which Apple says offers the world's best performance per watt of any CPU. Apple says it delivers the same peak performance as a typical laptop CPU at a quarter of the power draw. It says this has four of the world's fastest CPU cores, paired with four high-efficiency cores. It pairs this with up to an eight-core GPU, which Apple claims offers the world's fastest integrated graphics, and a 16-core Neural Engine. In addition, the M1 processor has a unified memory architecture, a USB 4 controller, media encode and decode engines, and a host of security features. These include hardware-verified secure boot, encryption, and run-time protections.
Earth

A Biden Victory Positions America For a 180-Degree Turn On Climate Change (seattletimes.com) 251

"Joe Biden, the projected winner of the U.S. presidency, will move to restore dozens of environmental safeguards President Donald Trump abolished," reports the Washington Post, "and launch the boldest climate change plan of any president in history."

destinyland shares their report: While some of Biden's most sweeping programs will encounter stiff resistance from Senate Republicans and conservative attorneys general, the United States is poised to make a 180-degree turn on climate change and conservation policy. Biden's team already has plans on how it will restrict oil and gas drilling on public lands and waters; ratchet up federal mileage standards for cars and SUVs; block pipelines that transport fossil fuels across the country; provide federal incentives to develop renewable power; and mobilize other nations to make deeper cuts in their own carbon emissions... Biden has vowed to eliminate carbon emissions from the electric sector by 2035 and spend $2 trillion on investments ranging from weatherizing homes to developing a nationwide network of charging stations for electric vehicles.

That massive investment plan stands a chance only if his party wins two Senate runoff races in Georgia in January; otherwise, he would have to rely on a combination of executive actions and more-modest congressional deals to advance his agenda.

Still, a number of factors make it easier to enact more-ambitious climate policies than even four years ago. Roughly 10% of the globe has warmed by 2 degrees Celsius (3.6 degrees Fahrenheit), a temperature rise the world has pledged to avoid. The price of solar and wind power has dropped, the coal industry has shrunk, and Americans increasingly connect the disasters they're experiencing in real time — including more-intense wildfires, hurricanes and droughts — with global warming. Biden has made the argument that curbing carbon will produce high-paying jobs while protecting the planet...

Some of the new administration's rules could be challenged in federal court, which have a number of Trump appointees on the bench. But even some conservative activists said that Biden could enact enduring policies, whether by partnering with Congress or through regulation... The new administration may be able to broker compromises with key industries that have experienced regulatory whiplash in the past decade, including the auto industry and power sector, while offering tax breaks for renewable energy that remain popular with both parties. And Biden can rebuild diplomatic alliances that will spur foreign countries to pursue more-ambitious carbon reductions...

Biden's advisers have said that they plan to elevate climate change as a priority in departments that have not always treated it as one, including the Transportation, State and Treasury departments. It will influence key appointments, affecting everything from overseas banking and military bases to domestic roads and farms.... Biden's pledge to achieve a carbon-free U.S. power sector within 15 years would mean the closing or revamping of nearly every coal- and gas-fired power plant around the country, and the construction of an unprecedented number of new wind turbines and solar farms. On top of that, engineers still need to devise a better way of storing energy when the sun is not shining or the wind is not blowing.

"If I were advising Biden on energy, my first three priorities would be storage, storage and storage," said Sen. Angus King, I-Maine, who worked in the alternative energy businesses before running for office.

Facebook

How Ex-Facebook Data Experts Spent $75 Million On Targeted Anti-Trump Ads (fastcompany.com) 78

The night before America's election, Fast Company reported: On the internet, we're subject to hidden A/B tests all the time, but this one was also part of a political weapon: a multimillion-dollar tool kit built by a team of Facebook vets, data nerds, and computational social scientists determined to defeat Donald Trump. The goal is to use microtargeted ads, follow-up surveys, and an unparalleled data set to win over key electorates in a few critical states: the low-education voters who unexpectedly came out in droves or stayed home last time, the voters who could decide another monumental election. By this spring, the project, code named Barometer, appeared to be paying off. During a two-month period, the data scientists found that showing certain Facebook ads to certain possible Trump voters lowered their approval of the president by 3.6%...
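For readers unfamiliar with how an effect like "lowered their approval of the president by 3.6%" gets measured, here is an illustrative-only sketch of the basic A/B comparison: survey a group shown the ads (treatment) and a held-out control group, then compare approval rates. The numbers and group sizes below are invented for illustration; the actual Barometer tooling is not public:

import random

random.seed(0)
# hypothetical underlying approval rates for the two groups
control   = [random.random() < 0.430 for _ in range(20_000)]  # not shown the ads
treatment = [random.random() < 0.394 for _ in range(20_000)]  # shown the ads

def rate(group):
    return sum(group) / len(group)

print(f"control approval:   {rate(control):.1%}")
print(f"treatment approval: {rate(treatment):.1%}")
print(f"estimated effect:   {rate(treatment) - rate(control):+.1%}")  # roughly -3.6 points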

"We've been able to really understand how to communicate with folks who have lower levels of political knowledge, who tend to be ignored by the political process," says James Barnes, a data and ads expert at the all-digital progressive nonprofit Acronym, who helped build Barometer. This is familiar territory: Barnes spent years on Facebook's ads team, and in 2016 was the "embed" who helped the Trump campaign take Facebook by storm. Last year, he left Facebook and resolved to use his battle-tested tactics to take down his former client. "We have found ways to find the right news to put in front of them, and we found ways to understand what works and doesn't," Barnes says. "And if you combine all those things together, you get a really effective approach, and that's what we're doing...."

By the election it promises to have spent $75 million on Facebook, Google, Instagram, Snapchat, Hulu, Roku, Viacom, Pandora, and anywhere else valuable voters might be found... Barnes had been a Republican all his life, but he did not like Trump; he says he ended up voting for Clinton. The election, and his role in it, left him unsettled, and he left Facebook's political ads team to work with the company's commercial clients... In the wake of Trump's election and its aftermath, Barnes helped Facebook develop some of its election integrity initiatives (one of Facebook's moves was to stop embedding employees like him inside campaigns) and even sat down for lengthy interviews with the Securities and Exchange Commission and with then-Special Counsel Robert Mueller. Last year, after some soul-searching, some of it in Peru, Barnes registered as a Democrat, left Facebook, and began working on a way to fight Trump... Acronym and a political action committee, Pacronym, were founded in 2017 by Democratic strategist Tara McGowan, in an effort to counter Trump's online spending advantage and what The New Yorker called his Facebook juggernaut...

For Barnes, Acronym's aggressive approach to Facebook, and Barometer's very existence, isn't just personal, but relates to his former employer: Facebook hasn't only failed to effectively police misinformation and disinformation, but helped accelerate it... But while Barnes is using some of the weapons that helped Trump, he's at pains to emphasize that, unlike the other side, Acronym's artillery is simply "the facts."

The PAC's donors include Laurene Powell Jobs, Steven Spielberg, venture capitalists Reid Hoffman and Michael Moritz, and (according to the Wall Street Journal) Facebook's former chief product officer, Chris Cox (who is also an informal adviser).

But in addition, the group "can access an unprecedented pool of state voter files and personal information: everything from your purchasing patterns to your social media posts to your church, layered with AI-built scores that predict your traits..."
Apple

A14X Bionic Allegedly Benchmarked Days Before Apple Silicon Mac Event (appleinsider.com) 88

The chip expected to be at the core of the first Apple Silicon Mac -- the "A14X" -- may have been benchmarked just days before the next Apple event. From a report: The alleged CPU benchmarks for the "A14X" show a 1.80GHz processor capable of turbo-boosting to 3.10GHz, making it the first custom Apple Silicon chip to ever clock above 3GHz. It is an 8-core processor with a big.LITTLE arrangement. The GPU results show 8GB of RAM will be included with the processor. The single-core benchmark for the "A14X" scored 1634 vs the A12Z at 1118. The A14 scored 1,583 points for single-core tests, which is expected as single-core results shouldn't change much between the regular and "X" models. The multi-core benchmark for the "A14X" scored 7220 vs the A12Z at 4657. The A14 scored 4198 for multi-core, which means the "A14X" delivers a marked increase in performance in the sorts of environments that the Geekbench test suite focuses on. The additional RAM and graphics capabilities boost this result much higher than the standard iPhone processor. For comparison, a 16-inch MacBook Pro with the Intel Core i9 processor scores 1096 for single and 6869 for multi-core tests. This means the alleged "A14X" outperforms the existing MacBook Pro lineup by a notable margin.
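Doing the arithmetic on the Geekbench figures quoted above gives a quick sense of the relative gaps; this is just a ratio calculation over the reported scores:

scores = {                      # (single-core, multi-core) as quoted above
    "A12Z":                (1118, 4657),
    "A14":                 (1583, 4198),
    "16-inch MBP Core i9": (1096, 6869),
}
a14x = (1634, 7220)

for name, (single, multi) in scores.items():
    print(f"A14X vs {name}: {a14x[0] / single:.2f}x single-core, "
          f"{a14x[1] / multi:.2f}x multi-core")
# ~1.46x/1.55x over the A12Z, and ~1.05x the Core i9's multi-core score.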
