Programming

Hundreds of Tech, Business and Nonprofit Leaders Urge States To Boost CS Education 49

theodp writes: In partnership with tech-bankrolled nonprofit Code.org, over 500 of the nation's business, education and nonprofit leaders issued a letter calling for state governments and education leaders to bring more Computer Science to K-12 students across the U.S. The signatories include a who's who of tech leaders, including Bill Gates, Jeff Bezos, Satya Nadella, Steve Ballmer, Tim Cook, Sundar Pichai, and Mark Zuckerberg. A new website -- CEOs for CS -- was launched in conjunction with the campaign. "The United States leads the world in technology, yet only 5% of our high school students study computer science. How is this acceptable?" the CEOs demand to know in their letter addressed "To the Governors and Education Leaders of the United States of America." They add, "Nearly two-thirds of high-skilled immigration is for computer scientists, and every state is an importer of this strategic talent. The USA has over 700,000 open computing jobs but only 80,000 computer science graduates a year. We must educate American students as a matter of national competitiveness."

A press release explains that the announcement "coincides with the culmination of the National Governors Association Chairman's Initiative for K-12 computer science, led by Arkansas Gov. Asa Hutchinson." Hutchinson is a founding Governor of the Code.org-led advocacy group Govs for CS, which launched in anticipation of President Obama's tech-supported but never-materialized $4 billion CS for All initiative. Hutchinson was a signatory of an earlier, Code.org-organized 2016 letter from Governors, business, education, and nonprofit leaders that implored Congress to make CS education for K-12 students a priority.
Security

PyPI Is Rolling Out 2FA For Critical Projects, Giving Away 4,000 Security Keys (zdnet.com) 19

PyPI, or the Python Package Index, is giving away 4,000 Google Titan security keys as part of its move to mandatory two-factor authentication (2FA) for critical projects built in the Python programming language. ZDNet reports: PyPI, which is managed by the Python Software Foundation, is the main repository where Python developers can get third-party open-source packages for their projects. [...] One way developers can protect themselves from stolen credentials is by using two-factor authentication, and the PSF is now making it mandatory for developers behind "critical projects" to use 2FA in the coming months. PyPI hasn't declared a specific date for the requirement. "We've begun rolling out a 2FA requirement: soon, maintainers of critical projects must have 2FA enabled to publish, update, or modify them," the PSF said on its PyPI Twitter account.
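For background, the one-time codes that authenticator apps (and hardware keys in TOTP mode) generate come from the standard TOTP algorithm of RFC 6238: a shared secret plus the current 30-second time window is run through HMAC and truncated to a short code. A minimal sketch in Python, purely illustrative and not PyPI's code:

```python
import hashlib
import hmac
import struct

def totp(secret: bytes, unix_time: int, step: int = 30, digits: int = 6) -> str:
    """Compute a time-based one-time password per RFC 6238 (HMAC-SHA1)."""
    counter = unix_time // step                      # 30-second time window
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: secret "12345678901234567890", time 59 -> "94287082"
print(totp(b"12345678901234567890", 59, digits=8))
```

Both the app and the server derive the same code from the shared secret, so the server can verify it without the code ever being reusable outside its time window.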

As part of the security drive, it is giving away 4,000 Google Titan hardware security keys to project maintainers, gifted by Google's open source security team. "In order to improve the general security of the Python ecosystem, PyPI has begun implementing a two-factor authentication (2FA) requirement for critical projects. This requirement will go into effect in the coming months," PSF said in a statement. "To ensure that maintainers of critical projects have the ability to implement strong 2FA with security keys, the Google Open Source Security Team, a sponsor of the Python Software Foundation, has provided a limited number of security keys to distribute to critical project maintainers."

PSF says it deems any project in the top 1% of downloads over the prior six months as critical. Presently, there are more than 350,000 projects on PyPI, meaning that more than 3,500 projects are rated as critical. PyPI recalculates this daily, so the Titan giveaway should cover a good chunk of key maintainers, though not all of them. In the name of transparency, PyPI is also publishing 2FA account metrics. There are currently 28,336 users with 2FA enabled, with nearly 27,000 of them using a 2FA app like Microsoft Authenticator. There are over 3,800 projects rated as "critical" and 8,241 PyPI users in this group. The critical group is also likely to grow, since projects designated as critical remain so indefinitely while new projects are added to mandatory 2FA over time. The 2FA rule applies to both project maintainers and owners.
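The "top 1% of downloads" rule is easy to picture as a plain ranking. Here is a sketch with made-up download counts; PyPI's actual implementation (which works over six months of real download data) may differ:

```python
def critical_projects(downloads: dict[str, int], top_fraction: float = 0.01) -> set[str]:
    """Return the projects whose download counts put them in the top fraction."""
    ranked = sorted(downloads, key=downloads.get, reverse=True)
    cutoff = max(1, int(len(ranked) * top_fraction))  # always at least one project
    return set(ranked[:cutoff])

# With 200 hypothetical projects, the top 1% is just the 2 most-downloaded.
counts = {f"pkg{i}": i * 1000 for i in range(200)}
print(critical_projects(counts))
```

Because the ranking is recomputed daily, a project can newly cross the cutoff at any time, which is why the critical set only grows: once designated, a project stays designated.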

Programming

Meet Bun, a Speedy New JavaScript Runtime (bun.sh) 121

Bun is "a modern JavaScript runtime like Node or Deno," according to its newly-launched web site, "built from scratch to focus on three main things."

- Start fast (it has the edge in mind).
- New levels of performance (extending JavaScriptCore, the engine).
- Being a great and complete tool (bundler, transpiler, package manager).

Bun is designed as a drop-in replacement for your current JavaScript & TypeScript apps or scripts — on your local computer, server or on the edge. Bun natively implements hundreds of Node.js and Web APIs, including ~90% of Node-API functions (native modules), fs, path, Buffer and more. [And Bun also implements Node.js' module resolution algorithm, so you can use npm packages in bun.js]

The goal of Bun is to run most of the world's JavaScript outside of browsers, bringing performance and complexity enhancements to your future infrastructure, as well as developer productivity through better, simpler tooling.... Why is Bun fast? An enormous amount of time spent profiling, benchmarking and optimizing things. The answer is different for every part of Bun, but one general theme: [it's written in Zig.] Zig's low-level control over memory and lack of hidden control flow makes it much simpler to write fast software.

An infographic on the site claims its server-side rendering of React is more than three times faster than Node or Deno. And Bun.js can even automatically load environment variables from .env files, according to the site. No more require("dotenv").load()
Hackaday describes it as "a performant all-in-one approach," including "bundling, transpiling, module resolution, and a fantastic foreign-function interface." Many Javascript projects have a bundling and transpiling step that takes the source and packages it together in a more standard format. Typescript needs to be packaged into javascript, and modules need to be resolved. Bun bakes all this in. Typescript and JSX "just work." This dramatically simplifies many projects as much of the build infrastructure is part of Bun itself, lowering cognitive load when trying to understand a project... Some web-specific APIs, such as fetch and Websockets, are also built-in.
"What's even wilder is that Bun is written by one person, Jared Sumner," the article points out — adding that all the code is available on GitHub under the MIT License ("excluding dependencies which have various licenses.")
Crime

What Happened to the Teen Who Stole $23.8M in Cryptocurrency? (rollingstone.com) 67

15-year-old Ellis Pinsky stole $23.8 million worth of cryptocurrency — and his life was never the same. For example, Rolling Stone reports, in his last year of high school, "Four men wearing ski masks and gloves, armed with knives, rope, brass knuckles, and a fake 9 mm," crept around the back of his home in the suburbs: Two weeks before the break-in, a lawsuit had been filed against him, and news stories had circulated connecting him to the hack. He knew that the thieves wanted this money, the millions and millions of dollars he had stolen. He also knew that he couldn't give it to them. He didn't have it. Not anymore.
The magazine paints the portrait of "an anxious young man in Invisalign braces" who describes the revelation he'd had at the age of 13. "The internet held such secrets. All he had to do was uncover them." As he soon found, there were plenty of people working to uncover them all the time, and willing to share their methods — for a price.... Realizing that a lot of the information social engineers used came from hacked databases, he began teaching himself to program, particularly to do the Structured Query Language injections and cross-site scripting that allowed him to attack companies' database architecture. The terabyte upon terabyte of databases he extracted, traded, and hoarded made him valuable to OGUsers as well as to others, like the Russian hackers he was able to converse with thanks to his fluency with his mother's native language... By the time he was 14, he tells me, "I think it's fair to say I had the capabilities to hack anyone."
The article describes him as "attending high school by day and extracting the source code of major corporations by night.... He was 14 years old and taken with the thrill of possessing a hidden superpower, of spending his nights secretly tapping into an underground world where he was esteemed and even feared. And then, in the morning, being called downstairs to breakfast." He wrote a Python script to comb through social media networks and seek out any mentions of working for a [cellphone] carrier. Then he'd reach out with an offer of compensation for helping him with a task. Every fifth or sixth person — underpaid and often working a short-term contract — would say they were game, as Pinsky tells it. For a couple hundred dollars' worth of bitcoin, they'd be willing to do a SIM swap, no questions asked. Eventually, Pinsky says, he had employees at every major carrier also working for him. Then the stakes got even higher. It was only a matter of time before OG hackers, known to each other as "the Community," realized that if they could use the SIM-swapping method to steal usernames, they could just as easily use it to steal cryptocurrency...
In one massive heist Pinsky stole 10% of all the Trigger altcoins on the market from crypto impresario Michael Terpin. ("As Pinsky's money launderers were converting it, the market was crashing in real time.") Pinsky recruited a crew to launder the money — at least one of whom simply kept it — but even with all the conversion fees, he still made off with millions. And then... For a while, he half-expected the FBI to knock on his door at any moment, just like in the movies; but as time passed, he grew less anxious.... He says he moved on to learning different types of programming. He ran a sneaker business that used bots and scripts to snap up limited pairs then flip them... He went to soccer practice. He and his friends had started hanging out with girls on the weekend, driving down to the docks where you could see the glowing lights from the Tappan Zee Bridge.
Until Terpin figured out it was Pinsky who'd robbed him: Pinsky and his legal team preempted his arrest by contacting the U.S. attorney directly and offering his cooperation. In February 2020, he voluntarily returned every last thing he says he got from the Terpin heist: 562 bitcoins, the Patek watch, and the cash he'd stored in the safe under his bed.... When I ask if he has also worked with the FBI to help bring down other hackers, he blinks quickly and then changes the subject.
Pinsky has not been criminally charged — partly because he was a minor, but also because of his cooperation with law enforcement. But Terpin has filed a civil suit seeking triple the amount stolen, arguing that the teenager who robbed him was running an organized crime racket and should be heavily punished to set an example.

Rolling Stone's article raises the question: what should happen next?
Programming

Vim 9.0 Released (vim.org) 81

After many years of gradual improvement, Vim now takes a big step with a major release. Besides many small additions, the spotlight is on a new incarnation of the Vim script language: Vim9 script. Why Vim9 script: A new script language; what is that needed for? Vim script has been growing over time, while preserving backwards compatibility. That means bad choices from the past often can't be changed, and compatibility with Vi restricts possible solutions. Execution is quite slow; each line is parsed every time it is executed.

The main goal of Vim9 script is to drastically improve performance. This is accomplished by compiling commands into instructions that can be efficiently executed. An increase in execution speed of 10 to 100 times can be expected. A secondary goal is to avoid Vim-specific constructs and get closer to commonly used programming languages, such as JavaScript, TypeScript and Java.

The performance improvements can only be achieved by not being 100% backwards compatible. For example, making function arguments available by creating an "a:" dictionary involves quite a lot of overhead. In a Vim9 function this dictionary is not available. Other differences are more subtle, such as how errors are handled. For those with a large collection of legacy scripts: Not to worry! They will keep working as before. There are no plans to drop support for legacy script. No drama like with the deprecation of Python 2.

Programming

The Really Important Job Interview Questions Engineers Should Ask (But Don't) (posthog.com) 185

James Hawkins: Since we started PostHog, our team has interviewed 725 people. What's one thing I've taken from this? It's normal for candidates not to ask harder questions about our company, so they usually miss out on a chance to (i) de-risk our company's performance and (ii) to increase the chances they'll like working here.

Does the company have product-market fit? This is the single most important thing a company can do to survive and grow.
"Do you ever question if you have product-market fit?"
"When did you reach product-market fit? How did you know?"
"What do you need to do to get to product-market fit?"
"What's your revenue? What was it a year ago?"
"How many daily active users do you have?"

It's ok if these answers show you the founder doesn't have product market fit. In this case, figure out if they will get to a yes. Unless you want to join a sinking ship, of course! Early stage founders are (or should be) super-mega-extra-desperately keen to have product-market fit -- it's all that really matters. The ones that will succeed are those that are honest about this (or those that have it already) and are prioritizing it. Many will think or say (intentionally or through self-delusion) that they have it when they don't. Low user or revenue numbers and vague answers to the example questions above are a sign that it isn't there. Product-market fit is very obvious.

Google

Google Launches Advanced API Security To Protect APIs From Growing Threats (techcrunch.com) 6

Google today announced a preview of Advanced API Security, a new product headed to Google Cloud that's designed to detect security threats as they relate to APIs. TechCrunch reports: Built on Apigee, Google's platform for API management, the company says that customers can request access starting today. Short for "application programming interface," APIs are documented connections between computers or between computer programs. API usage is on the rise, with one survey finding that 61.6% of developers relied on APIs more in 2021 than in 2020. But they're also increasingly becoming the target of attacks. According to a 2018 report commissioned by cybersecurity vendor Imperva, two-thirds of organizations are exposing unsecured APIs to the public and partners.

Advanced API Security specializes in two tasks: identifying API misconfigurations and detecting bots. The service regularly assesses managed APIs and provides recommended actions when it detects configuration issues, and it uses preconfigured rules to provide a way to identify malicious bots within API traffic. Each rule represents a different type of unusual traffic from a single IP address; if an API traffic pattern meets any of the rules, Advanced API Security reports it as a bot. [...] With the launch of Advanced API Security, Google is evidently seeking to bolster its security offerings under Apigee, which it acquired in 2016 for over half a billion dollars. But the company is also responding to increased competition in the API security segment.
"Misconfigured APIs are one of the leading reasons for API security incidents. While identifying and resolving API misconfigurations is a top priority for many organizations, the configuration management process is time consuming and requires considerable resources," Vikas Ananda, head of product at Google Cloud, said in a blog post shared with TechCrunch ahead of the announcement. "Advanced API Security makes it easier for API teams to identify API proxies that do not conform to security standards... Additionally, Advanced API Security speeds up the process of identifying data breaches by identifying bots that successfully resulted in the HTTP 200 OK success status response code."
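The per-IP rule matching described above can be sketched in a few lines. The rule names and thresholds below are invented for illustration; they are not Apigee's actual preconfigured rules, which Google has not detailed here:

```python
from typing import Callable

# Hypothetical rules: each maps a per-IP traffic summary to True if it looks bot-like.
RULES: dict[str, Callable[[dict], bool]] = {
    "excessive_errors": lambda s: s.get("http_404", 0) > 50,
    "login_hammering": lambda s: s.get("login_attempts", 0) > 20,
    "scraper_volume": lambda s: s.get("requests_per_min", 0) > 600,
}

def flag_bots(per_ip_stats: dict[str, dict]) -> dict[str, list[str]]:
    """Report each IP whose traffic pattern matches at least one rule."""
    flagged = {}
    for ip, stats in per_ip_stats.items():
        hits = [name for name, rule in RULES.items() if rule(stats)]
        if hits:
            flagged[ip] = hits
    return flagged

traffic = {
    "203.0.113.7": {"http_404": 120, "requests_per_min": 900},
    "198.51.100.2": {"http_404": 3},
}
print(flag_bots(traffic))
```

The design point is that each rule is independent and keyed to a single IP's pattern, matching the article's description that any single rule firing is enough to report the traffic as a bot.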
Businesses

FBI Says People Are Using Deepfakes To Apply To Remote Jobs (gizmodo.com) 47

An anonymous reader quotes a report from Gizmodo: The FBI wrote to its Internet Crime Complaint Center Tuesday that it has received multiple complaints of people using stolen information and deepfaked video and voice to apply to remote tech jobs. According to the FBI's announcement, more companies have been reporting people applying to jobs using video, images, or recordings that are manipulated to look and sound like somebody else. These fakers are also using personally identifiable information from other people -- stolen identities -- to apply to jobs at IT, programming, database, and software firms. The report noted that many of these open positions had access to sensitive customer or employee data, as well as financial and proprietary company info, implying the imposters could have a desire to steal sensitive information as well as a bent to cash a fraudulent paycheck.

What isn't clear is how many of these fake attempts at getting a job were successful versus how many were caught and reported. Or, in a more nefarious hypothetical, whether someone secured an offer, took a paycheck, and then got caught. These applicants were apparently using voice spoofing techniques during online interviews, where lip movement did not match what was being said during video calls, according to the announcement. Apparently, the jig was up in some of these cases when the interviewee coughed or sneezed, which wasn't picked up by the video spoofing software.
Companies who suspect a fake applicant can report it to the complaint center site.
Open Source

Linus Torvalds Is Cautiously Optimistic About Bringing Rust Into Linux Kernel's Next Release (zdnet.com) 123

slack_justyb shares a report from ZDNet: For over three decades, Linux has been written in the C programming language. Indeed, Linux is C's most outstanding accomplishment. But the last few years have seen a growing momentum to make the Rust programming language Linux's second language. At the recent Open Source Summit in Austin, Texas, Linux creator Linus Torvalds said he could see Rust making it into the Linux kernel as soon as the next major release. "I'd like to see the Rust infrastructure merging to be started in the next release, but we'll see," Torvalds said after the summit. "I won't force it, and it's not like it's going to be doing anything really meaningful at that point -- it would basically be the starting point. So, no promises."

Now, you may ask: "Why are they adding Rust at all?" Rust lends itself more easily to writing secure software. Samartha Chandrashekar, an AWS product manager, said it "helps ensure thread safety and prevent memory-related errors, such as buffer overflows that can lead to security vulnerabilities." Many other developers agree with Chandrashekar. Torvalds also agrees and likes that Rust is more memory-safe. "There are real technical reasons like memory safety and why Rust is good to get in the kernel." Mind you, no one is going to be rewriting the entire 30 or so million lines of the Linux kernel into Rust. As Linux developer Nelson Elhage said in his summary of the 2020 Linux Plumber's meeting on Rust in Linux: "They're not proposing a rewrite of the Linux kernel into Rust; they are focused only on moving toward a world where new code may be written in Rust." The three areas of potential concern for Rust support are making use of the existing APIs in the kernel, architecture support, and dealing with application binary interface (ABI) compatibility between Rust and C.

Programming

Svelte Origins: a JavaScript Documentary (youtube.com) 48

Svelte Origins: The Documentary tells the story of how Svelte came to be, what makes Svelte different, and how it changes the game as a JavaScript framework. From the description of the documentary, which was recommended by several Slashdot readers: Filmed in locations throughout Europe and the US, it features Svelte's creator Rich Harris and members from the core community who contributed to making Svelte what it is today. Svelte Origins was filmed in late 2021, produced by OfferZen and directed by Dewald Brand, with shoots in the USA, the UK, Ireland, Sweden and Germany.
Programming

Are Today's Programmers Leaving Too Much Code Bloat? (positech.co.uk) 296

Long-time Slashdot reader Artem S. Tashkinov shares a blog post from an indie game programmer who complains "The special upload tool I had to use today was a total of 230MB of client files, and involved 2,700 different files to manage this process." Oh and BTW it gives error messages and right now, it doesn't work. sigh.

I've seen coders do this. I know how this happens. It happens because not only are the coders not doing low-level, efficient code to achieve their goal, they have never even SEEN low level, efficient, well written code. How can we expect them to do anything better when they do not even understand that it is possible...? It's what they learned. They have no idea what high performance or constraint-based development is....

Computers are so fast these days that you should be able to consider them absolute magic. Everything that you could possibly imagine should happen between the 60ths of a second of the refresh rate. And yet, when I click the volume icon on my microsoft surface laptop (pretty new), there is a VISIBLE DELAY as the machine gradually builds up a new user interface element, and eventually works out what icons to draw and has them pop-in and they go live. It takes ACTUAL TIME. I suspect a half second, which in CPU time, is like a billion fucking years....

All I'm doing is typing this blog post. Windows has 102 background processes running. My nvidia graphics card currently has 6 of them, and some of those have sub tasks. To do what? I'm not running a game right now, I'm using about the same feature set from a video card driver as I would have done TWENTY years ago, but 6 processes are required. Microsoft edge web view has 6 processes too, as does Microsoft edge too. I don't even use Microsoft edge. I think I opened an SVG file in it yesterday, and here we are, another 12 useless pieces of code wasting memory, and probably polling the cpu as well.

This is utter, utter madness. It's why nothing seems to work, why everything is slow, why you need a new phone every year, and a new TV to load those bloated streaming apps, that also must be running code this bad. I honestly think it's only going to get worse, because the big dumb, useless tech companies like facebook, twitter, reddit, etc are the worst possible examples of this trend....

There was a golden age of programming, back when you had actual limitations on memory and CPU. Now we just live in an ultra-wasteful pit of inefficiency. It's just sad.

Long-time Slashdot reader Z00L00K left a comment arguing that "All this is because everyone today programs on huge frameworks that have everything including two full size kitchen sinks, one for right handed people and one for left handed." But in another comment Slashdot reader youn blames code generators, cut-and-paste programming, and the need to support multiple platforms.

But youn adds that even with that said, "In the old days, there were a lot more blue screens of death... Sure, it still happens, but how often do you restart your computer these days?" And they also submitted this list arguing "There's a lot more functionality than before."
  • Some software has been around a long time. Even though the /. crowd likes to bash Windows, you got to admit backward compatibility is outstanding
  • A lot of things like security were not taken into consideration
  • It's a different computing environment.... multi tasking, internet, GPUs
  • In the old days, there was one task running all the time. Today, a lot of error handling, soft failures if the app is put to sleep
  • A lot of code is due to software interacting with one another, and compatibility with standards
  • Shiny technology like microservices allows scaling, heterogeneous integration

So who's right and who's wrong? Leave your own best answers in the comments.

And are today's programmers leaving too much code bloat?


Programming

Stack Overflow Survey Finds Developers Like Rust, Python, JavaScript and Remote Work (infoworld.com) 97

For Stack Overflow's annual survey, "Over 73,000 developers from 180 countries each spent roughly 15 minutes answering our questions," a blog post announces: The top five languages for professional developers haven't changed: JavaScript is still the most used, and Rust is the most loved for a seventh year. The big surprise came in the most loved web framework category. Showing how fast web technologies change, newcomer Phoenix took the most loved spot from Svelte, itself a new entry last year.... Check out the full results from this year's Developer Survey here.
In fact, 87% of Rust developers said that they want to continue using Rust, notes SD Times' summary of the results: Rust also tied with Python as the most wanted technology in this year's report, with TypeScript and Go following closely behind. The distinction between most loved and most wanted is that most wanted includes only developers who are not currently developing with the language, but have an interest in developing with it.
Slashdot reader logankilpatrick writes, "It should come as no surprise to those following the growth and expansion of the Julia Programming Language ecosystem that in this year's Stack Overflow developer survey, Julia ranked in the top 5 for the most loved languages (above Python — 6th, MatLab — Last, and R — 33rd)."

And the Register shares more highlights: Also notable in the 71,547 responses regarding programming languages was a switch again between Python and SQL. In 2021, Python pushed out SQL to be the third most commonly used language. This year SQL regained third place, just behind second-placed HTML/CSS.

And the most hated...

Unsurprisingly, developers still dread that tap on the shoulder from the finance department for a tweak to that bit of code upon which the entire company depends. Visual Basic for Applications and COBOL still lurk within the top three most dreaded technologies.

The operating system rankings were little changed: Windows won out for personal and professional use, although for professional use Linux passed macOS to take second place with 40 percent of responses compared to Apple's 33 percent. Most notable was the growth of Windows Subsystem for Linux, which now accounts for 14 percent of personal use compared with a barely registering 3 percent in 2021.

But SD Times noted what may be the most interesting statistic: Only 15% of developers work on-site full time. Forty-three percent are fully remote and 42% are hybrid. Smaller organizations with 2-19 employees are more likely to be in-person, while large organizations with over 10k employees are more likely to be hybrid, according to the survey.
InfoWorld delves into what this means: "The world has made the decision to go hybrid and remote, I have a lot of confidence given the data I have seen that that is a one-way train that has left the station," Prashanth Chandrasekar, CEO of Stack Overflow told InfoWorld.

Chandrasekar says that flexibility and the tech stack developers get to work with are the most important contributors to overall happiness at work. "Many developers drop out of the hiring process because of the tech stack they will be working with," he said... Organizational culture is also shifting, and cloud-native techniques have taken hold among Stack Overflow survey respondents. Most professional developers (70%) now use some form of CI/CD and 60% have a dedicated devops function....

Lastly, Web3 still has software developers torn, with 32% of respondents favorable, 31% unfavorable, and 26% indifferent. Web3 refers to the emerging idea of a decentralized web where data and content are registered on blockchains, tokenized, or managed and accessed on peer-to-peer distributed networks.

IT

Are 'Google Programmers' the New 'Next-Next-Finish Programmers'? (pvs-studio.com) 203

Long-time Slashdot reader theodp writes: Back in 1998, Ellen Ullman wrote in Salon about The dumbing-down of programming: "My programming tools were full of wizards. Little dialog boxes waiting for me to click "Next" and "Next" and "Finish." Click and drag and shazzam! — thousands of lines of working code. No need to get into the "hassle" of remembering the language. No need to even learn it. It is a powerful siren-song lure: You can make your program do all these wonderful and complicated things, and you don't really need to understand."

Twenty-four years later, PVS-Studio has published a translation of Ivan Belokamentsev's cautionary tale of how modernizing his interviewing process from coding on paper to a computer led him to inadvertently hire 'Google Programmers', who dazzled him in interviews and initially on the job, but soon reached a plateau in productivity that puzzled him until he had a gobsmacking realization.

From their article: It was like somebody hit me on the head with a sack of flour. It took me about two days to process it. How is it really possible? The beautiful, well-optimized code they showed me at the first interview was from the Internet. The explosive growth of productivity in the first months was due to the solutions that they found on the Internet. Those answers to user questions after the magic "We'll call you back" from these guys — were found on the Internet. They were coding without understanding the basic constructs. No, they didn't write code — they downloaded it. No, that's not it, either. To download the code is like running "npm i", it's ok. They copy-pasted the code. Without knowing how to write it.

That's what angered me — what the...? Well, I understand when you surf the net to figure out how a new technology works. Or when you need to use some exotic feature and not to bloat your head with unnecessary information. But basic things! How can you copy-paste basic things from the Internet?!

The article meditates on the mindset of "Google" programmers. Rather than learning about basic objects, types, and the constructs of a programming language, "Any information is available to them, always and everywhere. They've learned how to find this information quickly — whether it's the address of a store with cookies, pants on sale or generating a query."

But long-time Slashdot reader AmiMoJo now pushes back: This is dumb. Not everyone has a great memory, and these days there are so many different tools and frameworks that nobody can remember them all anyway. Back in the day when it was all C, you could reasonably write useful code on paper. These days most of that code will probably be interacting with libraries that you have not committed to memory.

If your developers are not progressing, help them. Give them training or mentoring. Challenge them.

And there's also this advice from Slashdot reader Iamthecheese: "Stop selecting for low ethics in your hiring process." There is a stupid, stupid idea out there among the pointy hair types that it's possible to hire top tier candidates for peanuts. This idea has been put into their heads by massively over-promising companies selling HR solutions of all shapes... They're actively selecting people with just enough ability to pass these specific tests and who are unwilling to show their true levels of ability by hashing it out on their own. So you have these untrained people who look for easy ways past problems, but you were expecting "rock stars".
Their suggested solution? "Stop looking for easy, cheap, already-trained people and start looking for trainable people." And then, "show them a little loyalty. That way you'll have people to train new hires, who also know what they're doing on the job."
AI

AI-Powered GitHub Copilot Leaves Preview, Now Costs $100 a Year (techcrunch.com) 36

It was June 29th of 2021 that Microsoft-owned GitHub first announced its AI-powered autocompletion tool for programmers — trained on GitHub repositories and other publicly-available source code.

But after a year in "technical preview," GitHub Copilot has reached a new milestone, reports InfoQ: you'll now have to pay to use it after a 60-day trial: The transition to general availability mostly means that Copilot ceases to be available for free. Interested developers will have to pay $10 per month or $100 per year to use the service, with a 60-day free trial.... According to GitHub, while not frequent, there is definitely a possibility that Copilot outputs code snippets that match those in the training set.
InfoQ also cites GitHub stats showing that over 1.2 million developers used Copilot in the last 12 months, with Copilot writing a shocking 40% of the code in files where it is enabled. That's up from 35% earlier in the year, reports TechCrunch -- which has more info on the rollout: It'll be free for students as well as "verified" open source contributors -- starting with roughly 60,000 developers selected from the community and students in the GitHub Education program... One new feature coinciding with the general release of Copilot is Copilot Explain, which translates code into natural-language descriptions. Described as a research project, its goal is to help novice developers or those working with an unfamiliar codebase.

Ryan J. Salva, VP of product at GitHub, told TechCrunch via email... "As an example of the impact we've observed, it's worth sharing early results from a study we are conducting. In the experiment, we are asking developers to write an HTTP server — half using Copilot and half without. Preliminary data suggests that developers are not only more likely to complete their task when using Copilot, but they also do it in roughly half the time."

Owing to the complicated nature of AI models, Copilot remains an imperfect system. GitHub said that it's implemented filters to block email addresses when shown in standard formats, as well as offensive words, and that it's in the process of building a filter to help detect and suppress code that's repeated from public repositories. But the company acknowledges that Copilot can produce insecure coding patterns, bugs, references to outdated APIs, or idioms reflecting the less-than-perfect code in its training data.
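GitHub hasn't published how its repetition filter works, but a minimal sketch of one common approach -- fingerprinting hashed token n-grams of known public code and flagging completions that mostly match -- might look like this (all function names here are hypothetical, for illustration only):

```python
import hashlib

def token_ngrams(code: str, n: int = 8):
    """Yield n-token windows from whitespace-normalized code."""
    tokens = code.split()
    for i in range(len(tokens) - n + 1):
        yield " ".join(tokens[i:i + n])

def fingerprint(snippet: str) -> str:
    """Stable hash of one normalized n-gram window."""
    return hashlib.sha256(snippet.encode()).hexdigest()

def build_index(public_sources):
    """Index every n-gram fingerprint seen in the public corpus."""
    return {fingerprint(g) for src in public_sources for g in token_ngrams(src)}

def looks_repeated(candidate, index, threshold: float = 0.5) -> bool:
    """Flag a completion if most of its n-grams appear verbatim in the corpus."""
    grams = list(token_ngrams(candidate))
    if not grams:
        return False
    hits = sum(1 for g in grams if fingerprint(g) in index)
    return hits / len(grams) >= threshold
```

A real filter would have to normalize identifiers and formatting far more aggressively; this only catches verbatim or near-verbatim repeats.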

The Verge ponders where this is going — and how we got here: "Just like the rise of compilers and open source, we believe AI-assisted coding will fundamentally change the nature of software development, giving developers a new tool to write code easier and faster so they can be happier in their lives," says GitHub CEO Thomas Dohmke.

Microsoft's $1 billion investment into OpenAI, the research firm now led by former Y Combinator president Sam Altman, led to the creation of GitHub Copilot. It's built on OpenAI Codex, a descendant of OpenAI's flagship GPT-3 language-generating algorithm.

GitHub Copilot has been controversial, though. Just days after its preview launch, there were questions over the legality of Copilot being trained on publicly available code posted to GitHub. Copyright issues aside, one study also found that around 40 percent of Copilot's output contained security vulnerabilities.

AI

Amazon Launches CodeWhisperer, a GitHub Copilot-like AI Pair Programming Tool (techcrunch.com) 13

At its re:Mars conference, Amazon today announced the launch of CodeWhisperer, an AI pair programming tool similar to GitHub's Copilot that can autocomplete entire functions based on only a comment or a few keystrokes. From a report: The company trained the system, which currently supports Java, JavaScript and Python, on billions of lines of publicly available open-source code and its own codebase, as well as publicly available documentation and code on public forums. It's now available in preview as part of the AWS IDE Toolkit, which means developers can use it right inside their preferred IDEs, including Visual Studio Code, IntelliJ IDEA, PyCharm, WebStorm and Amazon's own AWS Cloud9. Support for the AWS Lambda Console is also coming soon. Ahead of today's announcement, Vasi Philomin, Amazon's VP in charge of its AI services, stressed that the company didn't simply create this in order to offer a copy of Copilot. He noted that with CodeGuru, its AI code reviewer and performance profiler, and DevOps Guru, its tool for finding operational issues, the company laid the groundwork for today's launch quite a few years ago.
Linux

Linus Torvalds Says Rust For The Kernel Could Possibly Be Merged For Linux 5.20 (phoronix.com) 157

Speaking this week at the Linux Foundation's Open-Source Summit, Linus Torvalds talked up the possibilities of Rust within the Linux kernel and said that it could be landing quite soon -- possibly even in the next kernel cycle. From a report: Linus Torvalds and Dirk Hohndel had their usual Open-Source Summit keynote/chat, where Linus commented on Rust programming language code within the kernel. Torvalds said that they expect to have the Rust infrastructure merged into the kernel very soon, possibly even in the next release -- meaning Linux 5.20. No Rust for Linux pull request has yet been sent in or merged, but things have begun settling down in the initial Rust enablement code for the kernel, with the basic infrastructure, a few basic sample drivers, etc. Last month saw the most recent Rust Linux kernel patches posted, which got more functionality into shape and completed additional reviews. As noted plenty of times before, this Rust support within the Linux kernel will remain optional when building the kernel, depending upon whether you want the support or any of the kernel features implemented just in Rust code.
Canada

Canada To Compel YouTube, TikTok and Streamers To Boost Domestic Content (wsj.com) 141

Canada approved legislation that targets what video- and audio-sharing platforms like YouTube and TikTok can broadcast to a Canadian audience, as the country follows in Europe's footsteps in imposing a heftier regulatory burden on the digital sector. From a report: This marks the second attempt in as many years by Canada's Liberal government to compel digital platforms, including streaming companies like Netflix, to prominently feature Canadian artists on their services when users with a Canadian internet-protocol address log in. As contemplated under the new measures, users who search for music, television programming, films or do-it-yourself video shorts would get results incorporating a certain quota of Canadian-made content.

YouTube, a unit of Alphabet, TikTok, and the big streaming companies, among them Netflix, as well as legal experts and some Canadian artists, have either opposed Canada's move or warned of unintended consequences -- such as hurting the people the new policy is intended to help. Countries like Canada are increasingly turning to regulatory changes to protect domestic interests in light of the big inroads the world's biggest digital companies have made in transforming how households watch programs, listen to music, conduct day-to-day business and consume news.

Programming

Researchers Claim Travis CI API Leaks 'Tens of Thousands' of User Tokens (arstechnica.com) 7

Ars Technica describes Travis CI as "a service that helps open source developers write and test software." They also wrote Monday that it's "leaking thousands of authentication tokens and other security-sensitive secrets."

"Many of these leaks allow hackers to access the private accounts of developers on Github, Docker, AWS, and other code repositories, security experts said in a new report." The availability of the third-party developer credentials from Travis CI has been an ongoing problem since at least 2015. At that time, security vulnerability service HackerOne reported that a Github account it used had been compromised when the service exposed an access token for one of the HackerOne developers. A similar leak presented itself again in 2019 and again last year.

The tokens give anyone with access to them the ability to read or modify the code stored in repositories that distribute an untold number of ongoing software applications and code libraries. The ability to gain unauthorized access to such projects opens the possibility of supply chain attacks, in which threat actors tamper with malware before it's distributed to users. The attackers can leverage their ability to tamper with the app to target huge numbers of projects that rely on the app in production servers.

Despite this being a known security concern, the leaks have continued, researchers on the Nautilus team at Aqua Security report. Two batches of data that the researchers accessed via the Travis CI programming interface yielded 4.28 million and 770 million logs, respectively, spanning 2013 through May 2022. After sampling a small percentage of the data, the researchers found what they believe are 73,000 tokens, secrets, and various credentials.

"These access keys and credentials are linked to popular cloud service providers, including GitHub, AWS, and Docker Hub," Aqua Security said. "Attackers can use this sensitive data to initiate massive cyberattacks and to move laterally in the cloud. Anyone who has ever used Travis CI is potentially exposed, so we recommend rotating your keys immediately."

Software

The Collapse of Complex Software 317

Nolan Lawson, writing in a blogpost: Anyone who's worked in the tech industry for long enough, especially at larger organizations, has seen it before. A legacy system exists: it's big, it's complex, and no one fully understands how it works. Architects are brought in to "fix" the system. They might wheel out a big whiteboard showing a lot of boxes and arrows pointing at other boxes, and inevitably, their solution is... to add more boxes and arrows. Nobody can subtract from the system; everyone just adds. This might go on for several years. At some point, though, an organizational shakeup probably occurs -- a merger, a reorg, the polite release of some senior executive to go focus on their painting hobby for a while. A new band of architects is brought in, and their solution to the "big diagram of boxes and arrows" problem is much simpler: draw a big red X through the whole thing. The old system is sunset or deprecated, the haggard veterans who worked on it either leave or are reshuffled to other projects, and a fresh-faced team is brought in to, blessedly, design a new system from scratch.

As disappointing as it may be for those of us who might aspire to write the kind of software that is timeless and enduring, you have to admit that this system works. For all its wastefulness, inefficiency, and pure mendacity ("The old code works fine!" "No wait, the old code is terrible!"), this is the model that has sustained a lot of software companies over the past few decades. Will this cycle go on forever, though? I'm not so sure. Right now, the software industry has been in a nearly two-decade economic boom (with some fits and starts), but the one sure thing in economics is that booms eventually turn to busts. During the boom, software companies can keep hiring new headcount to manage their existing software (i.e. more engineers to understand more boxes and arrows), but if their labor force is forced to contract, then that same system may become unmaintainable. A rapid and permanent reduction in complexity may be the only long-term solution.

One thing working in complexity's favor, though, is that engineers like complexity. Admit it: as much as we complain about other people's complexity, we love our own. We love sitting around and dreaming up new architectural diagrams that can comfortably sit inside our own heads -- it's only when these diagrams leave our heads, take shape in the real world, and outgrow the size of any one person's head that the problems begin. It takes a lot of discipline to resist complexity, to say "no" to new boxes and arrows. To say, "No, we won't solve that problem, because that will just introduce 10 new problems that we haven't imagined yet." Or to say, "Let's go with a much simpler design, even if it seems amateurish, because at least we can understand it." Or to just say, "Let's do less instead of more."
Programming

Museum Restores 21 Rare Videos from Legendary 1976 Computing Conference (computerhistory.org) 58

At Silicon Valley's Computer History Museum, the senior curator just announced the results of a multi-year recovery and restoration process: making available 21 never-before-seen video recordings of a legendary 1976 conference: For five summer days in 1976, the first generation of computer rock stars had its own Woodstock. Coming from around the world, dozens of computing's top engineers, scientists, and software pioneers got together to reflect upon the first 25 years of their discipline in the warm, sunny (and perhaps a bit unsettling) climes of the Los Alamos National Laboratories, birthplace of the atomic bomb.
Among the speakers:

- A young Donald Knuth on the early history of programming languages

- FORTRAN designer John Backus on programming in America in the 1950s — some personal perspectives

- Harvard's Richard Milton Bloch (who worked with Grace Hopper in 1944)

- Mathematician/nuclear physicist Stanislaw M. Ulam on the interaction of mathematics and computing

- Edsger W. Dijkstra on "a programmer's early memories."


The Computer History Museum teases some highlights: Typical of computers of this generation, the 1946 ENIAC, the earliest American large-scale electronic computer, had to be left powered up 24 hours a day to keep its 18,000 vacuum tubes healthy. Turning them on and off, like a light bulb, shortened their life dramatically. ENIAC co-inventor John Mauchly discusses this serious issue....

The Los Alamos peak moment was the brilliant lecture on the British WW II Colossus computing engines by computer scientist and historian of computing Brian Randell. Colossus machines were special-purpose computers used to decipher messages of the German High Command in WW II. Based in southern England at Bletchley Park, these giant codebreaking machines regularly provided life-saving intelligence to the allies. Their existence was a closely-held secret during the war and for decades after. Randell's lecture was — excuse me — a bombshell, one which prompted an immediate re-assessment of the entire history of computing. Observes conference attendee (and inventor of ASCII) IBM's Bob Bemer, "On stage came Prof. Brian Randell, asking if anyone had ever wondered what Alan Turing had done during World War II? From there he went on to tell the story of Colossus — that day at Los Alamos was close to the first time the British Official Secrets Act had permitted any disclosures. I have heard the expression many times about jaws dropping, but I had really never seen it happen before."

Publishing these original primary sources for the first time is part of CHM's mission to not only preserve computing history but to make it come alive. We hope you will enjoy seeing and hearing from these early pioneers of computing.
