Open Source

Linus Torvalds Is Cautiously Optimistic About Bringing Rust Into Linux Kernel's Next Release (zdnet.com) 123

slack_justyb shares a report from ZDNet: For over three decades, Linux has been written in the C programming language. Indeed, Linux is C's most outstanding accomplishment. But the last few years have seen growing momentum to make the Rust programming language Linux's second language. At the recent Open Source Summit in Austin, Texas, Linux creator Linus Torvalds said he could see Rust making it into the Linux kernel as soon as the next major release. "I'd like to see the Rust infrastructure merging to be started in the next release, but we'll see," Torvalds said after the summit. "I won't force it, and it's not like it's going to be doing anything really meaningful at that point -- it would basically be the starting point. So, no promises."

Now, you may ask: "Why are they adding Rust at all?" Rust lends itself more easily to writing secure software. Samartha Chandrashekar, an AWS product manager, said it "helps ensure thread safety and prevent memory-related errors, such as buffer overflows that can lead to security vulnerabilities." Many other developers agree with Chandrashekar. Torvalds also agrees and likes that Rust is more memory-safe: "There are real technical reasons like memory safety and why Rust is good to get in the kernel." Mind you, no one is going to be rewriting the entire 30 or so million lines of the Linux kernel into Rust. As Linux developer Nelson Elhage said in his summary of the 2020 Linux Plumbers Conference session on Rust in Linux: "They're not proposing a rewrite of the Linux kernel into Rust; they are focused only on moving toward a world where new code may be written in Rust." The three areas of potential concern for Rust support are making use of the existing APIs in the kernel, architecture support, and dealing with application binary interface (ABI) compatibility between Rust and C.
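The buffer-overflow point is easy to illustrate outside the kernel. A minimal sketch in ordinary userspace Rust (nothing kernel-specific, and not from the article): where C-style indexing past the end of a buffer silently reads or corrupts adjacent memory, Rust's checked access returns `None`, and plain indexing panics deterministically instead of overflowing.

```rust
// A minimal userspace sketch of the memory-safety point: out-of-bounds
// access is a checked error in Rust, not silent memory corruption.
fn main() {
    let buf = [10u8, 20, 30];

    // Checked access returns Option<&u8>; an overflow is impossible.
    assert_eq!(buf.get(1), Some(&20));
    assert_eq!(buf.get(99), None); // past the end: None, not a stray read

    // Plain indexing (buf[99]) would panic at runtime with a bounds
    // error rather than reading whatever memory happens to follow buf.
    println!("no overflow: {:?}", buf);
}
```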

Programming

Svelte Origins: a JavaScript Documentary (youtube.com) 48

Svelte Origins: The Documentary tells the story of how Svelte came to be, what makes Svelte different, and how it changes the game as a JavaScript framework. From the description of the documentary, which was recommended by several Slashdot readers: Filmed in locations throughout Europe and the US, it features Svelte's creator Rich Harris and members from the core community who contributed to making Svelte what it is today. Svelte Origins was filmed in late 2021, produced by OfferZen and directed by Dewald Brand, with shoots in the USA, the UK, Ireland, Sweden and Germany.
Programming

Are Today's Programmers Leaving Too Much Code Bloat? (positech.co.uk) 296

Long-time Slashdot reader Artem S. Tashkinov shares a blog post from an indie game programmer who complains "The special upload tool I had to use today was a total of 230MB of client files, and involved 2,700 different files to manage this process." Oh and BTW it gives error messages and right now, it doesn't work. sigh.

I've seen coders do this. I know how this happens. It happens because not only are the coders not doing low-level, efficient code to achieve their goal, they have never even SEEN low level, efficient, well written code. How can we expect them to do anything better when they do not even understand that it is possible...? It's what they learned. They have no idea what high performance or constraint-based development is....

Computers are so fast these days that you should be able to consider them absolute magic. Everything that you could possibly imagine should happen between the 60ths of a second of the refresh rate. And yet, when I click the volume icon on my microsoft surface laptop (pretty new), there is a VISIBLE DELAY as the machine gradually builds up a new user interface element, and eventually works out what icons to draw and has them pop-in and they go live. It takes ACTUAL TIME. I suspect a half second, which in CPU time, is like a billion fucking years....

All I'm doing is typing this blog post. Windows has 102 background processes running. My nvidia graphics card currently has 6 of them, and some of those have sub tasks. To do what? I'm not running a game right now, I'm using about the same feature set from a video card driver as I would have done TWENTY years ago, but 6 processes are required. Microsoft edge web view has 6 processes too, as does Microsoft edge too. I don't even use Microsoft edge. I think I opened an SVG file in it yesterday, and here we are, another 12 useless pieces of code wasting memory, and probably polling the cpu as well.

This is utter, utter madness. It's why nothing seems to work, why everything is slow, why you need a new phone every year, and a new TV to load those bloated streaming apps, that also must be running code this bad. I honestly think it's only going to get worse, because the big dumb, useless tech companies like facebook, twitter, reddit, etc are the worst possible examples of this trend....

There was a golden age of programming, back when you had actual limitations on memory and CPU. Now we just live in an ultra-wasteful pit of inefficiency. It's just sad.

Long-time Slashdot reader Z00L00K left a comment arguing that "All this is because everyone today programs on huge frameworks that have everything including two full size kitchen sinks, one for right handed people and one for left handed." But in another comment Slashdot reader youn blames code generators, cut-and-paste programming, and the need to support multiple platforms.

But youn adds that even with that said, "In the old days, there were a lot more blue screens of death... Sure it still happens, but how often do you restart your computer these days?" And they also submitted this list arguing "There's a lot more functionality than before."
  • Some software has been around a long time. Even though the /. crowd likes to bash Windows, you've got to admit backward compatibility is outstanding
  • A lot of things like security were not taken into consideration
  • It's a different computing environment.... multitasking, internet, GPUs
  • In the old days, there was one task running all the time. Today, a lot of error handling, soft failures if the app is put to sleep
  • A lot of code is due to software interacting with one another, compatibility with standards
  • Shiny technology like microservices allows scaling, heterogeneous integration

So who's right and who's wrong? Leave your own best answers in the comments.

And are today's programmers leaving too much code bloat?


Programming

Stack Overflow Survey Finds Developers Like Rust, Python, JavaScript and Remote Work (infoworld.com) 97

For Stack Overflow's annual survey, "Over 73,000 developers from 180 countries each spent roughly 15 minutes answering our questions," a blog post announces: The top five languages for professional developers haven't changed: JavaScript is still the most used, and Rust is the most loved for a seventh year. The big surprise came in the most loved web framework category. Showing how fast web technologies change, newcomer Phoenix took the most loved spot from Svelte, itself a new entry last year.... Check out the full results from this year's Developer Survey here.
In fact, 87% of Rust developers said that they want to continue using Rust, notes SD Times' summary of the results: Rust also tied with Python as the most wanted technology in this year's report, with TypeScript and Go following closely behind. The distinction between most loved and most wanted is that most wanted includes only developers who are not currently developing with the language, but have an interest in developing with it.
Slashdot reader logankilpatrick writes, "It should come as no surprise to those following the growth and expansion of the Julia Programming Language ecosystem that in this year's Stack Overflow developer survey, Julia ranked in the top 5 for the most loved languages (above Python — 6th, MATLAB — last, and R — 33rd)."

And the Register shares more highlights: Also notable in the 71,547 responses regarding programming languages was a switch again between Python and SQL. In 2021, Python pushed out SQL to be the third most commonly used language. This year SQL regained third place, just behind second-placed HTML/CSS.

And the most hated...

Unsurprisingly, developers still dread that tap on the shoulder from the finance department for a tweak to that bit of code upon which the entire company depends. Visual Basic for Applications and COBOL still lurk within the top three most dreaded technologies.

The operating system rankings were little changed: Windows won out for personal and professional use, although for professional use Linux passed macOS to take second place with 40 percent of responses compared to Apple's 33 percent. Most notable was the growth of Windows Subsystem for Linux, which now accounts for 14 percent of personal use compared with a barely registering 3 percent in 2021.

But SD Times noted what may be the most interesting statistic: Only 15% of developers work on-site full time. Forty-three percent are fully remote and 42% are hybrid. Smaller organizations with 2-19 employees are more likely to be in-person, while large organizations with over 10k employees are more likely to be hybrid, according to the survey.
InfoWorld delves into what this means: "The world has made the decision to go hybrid and remote. I have a lot of confidence, given the data I have seen, that that is a one-way train that has left the station," Prashanth Chandrasekar, CEO of Stack Overflow, told InfoWorld.

Chandrasekar says that flexibility and the tech stack developers get to work with are the most important contributors to overall happiness at work. "Many developers drop out of the hiring process because of the tech stack they will be working with," he said... Organizational culture is also shifting, and cloud-native techniques have taken hold among Stack Overflow survey respondents. Most professional developers (70%) now use some form of CI/CD and 60% have a dedicated devops function....

Lastly, Web3 still has software developers torn, with 32% of respondents favorable, 31% unfavorable, and 26% indifferent. Web3 refers to the emerging idea of a decentralized web where data and content are registered on blockchains, tokenized, or managed and accessed on peer-to-peer distributed networks.

IT

Are 'Google Programmers' the New 'Next-Next-Finish Programmers'? (pvs-studio.com) 203

Long-time Slashdot reader theodp writes: Back in 1998, Ellen Ullman wrote in Salon about The dumbing-down of programming: "My programming tools were full of wizards. Little dialog boxes waiting for me to click "Next" and "Next" and "Finish." Click and drag and shazzam! — thousands of lines of working code. No need to get into the "hassle" of remembering the language. No need to even learn it. It is a powerful siren-song lure: You can make your program do all these wonderful and complicated things, and you don't really need to understand."

Twenty-four years later, PVS-Studio has published a translation of Ivan Belokamentsev's cautionary tale of how modernizing his interviewing process from coding on paper to a computer led him to inadvertently hire 'Google Programmers', who dazzled him in interviews and initially on the job, but soon reached a plateau in productivity that puzzled him until he had a gobsmacking realization.

From their article: It was like somebody hit me on the head with a sack of flour. It took me about two days to process it. How is it really possible? The beautiful, well-optimized code they showed me at the first interview was from the Internet. The explosive growth of productivity in the first months was due to the solutions that they found on the Internet. Those answers to user questions after the magic "We'll call you back" from these guys — were found on the Internet. They were coding without understanding the basic constructs. No, they didn't write code — they downloaded it. No, that's not it, either. To download the code is like running "npm i", it's ok. They copy-pasted the code. Without knowing how to write it.

That's what angered me — what the...? Well, I understand when you surf the net to figure out how a new technology works. Or when you need to use some exotic feature and not to bloat your head with unnecessary information. But basic things! How can you copy-paste basic things from the Internet?!

The article meditates on the mindset of "Google" programmers. Rather than learning about basic objects, types, and the constructs of a programming language, "Any information is available to them, always and everywhere. They've learned how to find this information quickly — whether it's the address of a store with cookies, pants on sale or generating a query."

But long-time Slashdot reader AmiMoJo now pushes back: This is dumb. Not everyone has a great memory, and these days there are so many different tools and frameworks that nobody can remember them all anyway. Back in the day when it was all C, you could reasonably write useful code on paper. These days most of that code will probably be interacting with libraries that you have not committed to memory.

If your developers are not progressing, help them. Give them training or mentoring. Challenge them.

And there's also this advice from Slashdot reader Iamthecheese: "Stop selecting for low ethics in your hiring process." There is a stupid, stupid idea out there among the pointy hair types that it's possible to hire top tier candidates for peanuts. This idea has been put into their heads by massively over-promising companies selling HR solutions of all shapes... They're actively selecting people with just enough ability to pass these specific tests and who are unwilling to show their true levels of ability by hashing it out on their own. So you have these untrained people who look for easy ways past problems, but you were expecting "rock stars".
Their suggested solution? "Stop looking for easy, cheap, already-trained people and start looking for trainable people." And then, "show them a little loyalty. That way you'll have people to train new hires, who also know what they're doing on the job."
AI

AI-Powered GitHub Copilot Leaves Preview, Now Costs $100 a Year (techcrunch.com) 36

It was June 29th of 2021 that Microsoft-owned GitHub first announced its AI-powered autocompletion tool for programmers — trained on GitHub repositories and other publicly-available source code.

But after a year in "technical preview," GitHub Copilot has reached a new milestone, reports InfoQ: you'll now have to pay to use it after a 60-day trial. The transition to general availability mostly means that Copilot ceases to be available for free. Interested developers will have to pay $10 per month or $100 per year to use the service, with a 60-day free trial.... According to GitHub, while not frequent, there is definitely a possibility that Copilot outputs code snippets that match those in the training set.
InfoQ also cites GitHub stats showing over 1.2 million developers used Copilot in the last 12 months, "with a shocking 40% figure of code written by Copilot in files where it is enabled." That's up from 35% earlier in the year, reports TechCrunch — which has more info on the rollout: It'll be free for students as well as "verified" open source contributors — starting with roughly 60,000 developers selected from the community and students in the GitHub Education program... One new feature coinciding with the general release of Copilot is Copilot Explain, which translates code into natural-language descriptions. Described as a research project, the goal is to help novice developers or those working with an unfamiliar codebase.

Ryan J. Salva, VP of product at GitHub, told TechCrunch via email... "As an example of the impact we've observed, it's worth sharing early results from a study we are conducting. In the experiment, we are asking developers to write an HTTP server — half using Copilot and half without. Preliminary data suggests that developers are not only more likely to complete their task when using Copilot, but they also do it in roughly half the time."

Owing to the complicated nature of AI models, Copilot remains an imperfect system. GitHub said that it's implemented filters to block emails when shown in standard formats, and offensive words, and that it's in the process of building a filter to help detect and suppress code that's repeated from public repositories. But the company acknowledges that Copilot can produce insecure coding patterns, bugs and references to outdated APIs, or idioms reflecting the less-than-perfect code in its training data.

The Verge ponders where this is going — and how we got here: "Just like the rise of compilers and open source, we believe AI-assisted coding will fundamentally change the nature of software development, giving developers a new tool to write code easier and faster so they can be happier in their lives," says GitHub CEO Thomas Dohmke.

Microsoft's $1 billion investment into OpenAI, the research firm now led by former Y Combinator president Sam Altman, led to the creation of GitHub Copilot. It's built on OpenAI Codex, a descendant of OpenAI's flagship GPT-3 language-generating algorithm.

GitHub Copilot has been controversial, though. Just days after its preview launch, there were questions over the legality of Copilot being trained on publicly available code posted to GitHub. Copyright issues aside, one study also found that around 40 percent of Copilot's output contained security vulnerabilities.

AI

Amazon Launches CodeWhisperer, a GitHub Copilot-like AI Pair Programming Tool (techcrunch.com) 13

At its re:Mars conference, Amazon today announced the launch of CodeWhisperer, an AI pair-programming tool similar to GitHub's Copilot that can autocomplete entire functions based on only a comment or a few keystrokes. From a report: The company trained the system, which currently supports Java, JavaScript and Python, on billions of lines of publicly available open-source code and its own codebase, as well as publicly available documentation and code on public forums. It's now available in preview as part of the AWS IDE Toolkit, which means developers can use it right inside their preferred IDEs, including Visual Studio Code, IntelliJ IDEA, PyCharm, WebStorm and Amazon's own AWS Cloud9. Support for the AWS Lambda Console is also coming soon. Ahead of today's announcement, Vasi Philomin, Amazon's VP in charge of its AI services, stressed that the company didn't simply create this to offer a copy of Copilot. He noted that with CodeGuru, its AI code reviewer and performance profiler, and DevOps Guru, its tool for finding operational issues, the company laid the groundwork for today's launch quite a few years ago.
Linux

Linus Torvalds Says Rust For The Kernel Could Possibly Be Merged For Linux 5.20 (phoronix.com) 157

Speaking this week at the Linux Foundation's Open-Source Summit, Linus Torvalds talked up the possibilities of Rust within the Linux kernel and said it could be landing quite soon -- possibly even in the next kernel cycle. From a report: Linus Torvalds and Dirk Hohndel had their usual Open-Source Summit keynote/chat where Linus commented on Rust programming language code within the kernel. Torvalds said they expect to have the Rust infrastructure merged within the kernel real soon, possibly even for the next release -- meaning Linux 5.20. There hasn't been any Rust for Linux pull request sent in or merged yet, but things have begun settling down in the initial Rust enablement code for the kernel, with the basic infrastructure, a few basic sample drivers, etc. Last month saw the most recent Rust Linux kernel patches posted, which got more functionality into shape and had additional reviews completed. As noted plenty of times before, this Rust support within the Linux kernel will remain optional when building the kernel, depending upon whether you want the support or any of the kernel features implemented just in Rust code.
Canada

Canada To Compel YouTube, TikTok and Streamers To Boost Domestic Content (wsj.com) 141

Canada approved legislation that targets what video- and audio-sharing platforms like YouTube and TikTok can broadcast to a Canadian audience, as the country follows in Europe's footsteps in imposing a heftier regulatory burden on the digital sector. From a report: This marks the second attempt in as many years by Canada's Liberal government to compel digital platforms, including streaming companies like Netflix, to prominently feature Canadian artists on their services when users with a Canadian internet-protocol address log in. As contemplated under the new measures, users who search for music, television programming, films or do-it-yourself video shorts would get results incorporating a certain quota of Canadian-made content.

YouTube, a unit of Alphabet, TikTok, and the big streaming companies, among them Netflix, as well as legal experts and some Canadian artists, have either opposed Canada's move or warned of unintended consequences -- such as hurting the people the new policy is intended to help. Countries like Canada are increasingly turning to regulatory changes to protect domestic interests in light of the big inroads the world's biggest digital companies have made in transforming how households watch programs, listen to music, conduct day-to-day business and consume news.

Programming

Researchers Claim Travis CI API Leaks 'Tens of Thousands' of User Tokens (arstechnica.com) 7

Ars Technica describes Travis CI as "a service that helps open source developers write and test software." They also wrote Monday that it's "leaking thousands of authentication tokens and other security-sensitive secrets."

"Many of these leaks allow hackers to access the private accounts of developers on GitHub, Docker, AWS, and other code repositories," security experts said in a new report. The availability of the third-party developer credentials from Travis CI has been an ongoing problem since at least 2015. At that time, security vulnerability service HackerOne reported that a GitHub account it used had been compromised when the service exposed an access token for one of the HackerOne developers. A similar leak presented itself again in 2019 and again last year.

The tokens give anyone with access to them the ability to read or modify the code stored in repositories that distribute an untold number of ongoing software applications and code libraries. The ability to gain unauthorized access to such projects opens the possibility of supply chain attacks, in which threat actors tamper with malware before it's distributed to users. The attackers can leverage their ability to tamper with the app to target huge numbers of projects that rely on the app in production servers.

Despite this being a known security concern, the leaks have continued, researchers on the Nautilus team at the security firm Aqua Security report. Two batches of data the researchers accessed using the Travis CI programming interface yielded 4.28 million and 770 million logs from 2013 through May 2022. After sampling a small percentage of the data, the researchers found what they believe are 73,000 tokens, secrets, and various credentials.

"These access keys and credentials are linked to popular cloud service providers, including GitHub, AWS, and Docker Hub," Aqua Security said. "Attackers can use this sensitive data to initiate massive cyberattacks and to move laterally in the cloud. Anyone who has ever used Travis CI is potentially exposed, so we recommend rotating your keys immediately."
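The "rotate your keys immediately" advice follows from how trivially machine-findable leaked credentials are in log text. As a hypothetical sketch (the function name and log line are invented for illustration; the `AKIA` prefix and 20-character length are the documented shape of AWS access key IDs, and the sample value is AWS's own documentation example, not a real key), a scanner over CI logs might look like:

```rust
// Hypothetical secret scanner over CI log text. AWS access key IDs are
// 20 uppercase-alphanumeric characters beginning with "AKIA".
fn looks_like_aws_key(token: &str) -> bool {
    token.len() == 20
        && token.starts_with("AKIA")
        && token.chars().all(|c| c.is_ascii_uppercase() || c.is_ascii_digit())
}

fn main() {
    // Invented log excerpt; the key is AWS's documented example value.
    let log = "build ok\nexport AWS_ACCESS_KEY_ID=AKIAIOSFODNN7EXAMPLE\ndone\n";

    // Split on whitespace and '=' so `KEY=VALUE` lines yield bare tokens.
    let hits: Vec<&str> = log
        .split_whitespace()
        .flat_map(|word| word.split('='))
        .filter(|token| looks_like_aws_key(token))
        .collect();

    assert_eq!(hits, vec!["AKIAIOSFODNN7EXAMPLE"]);
    println!("found {} credential-shaped token(s)", hits.len());
}
```

Real scanners (and the attackers the report warns about) use a battery of such per-provider patterns, which is why any key that ever touched a public log should be treated as burned.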

Software

The Collapse of Complex Software 317

Nolan Lawson, writing in a blogpost: Anyone who's worked in the tech industry for long enough, especially at larger organizations, has seen it before. A legacy system exists: it's big, it's complex, and no one fully understands how it works. Architects are brought in to "fix" the system. They might wheel out a big whiteboard showing a lot of boxes and arrows pointing at other boxes, and inevitably, their solution is... to add more boxes and arrows. Nobody can subtract from the system; everyone just adds. This might go on for several years. At some point, though, an organizational shakeup probably occurs -- a merger, a reorg, the polite release of some senior executive to go focus on their painting hobby for a while. A new band of architects is brought in, and their solution to the "big diagram of boxes and arrows" problem is much simpler: draw a big red X through the whole thing. The old system is sunset or deprecated, the haggard veterans who worked on it either leave or are reshuffled to other projects, and a fresh-faced team is brought in to, blessedly, design a new system from scratch.

As disappointing as it may be for those of us who might aspire to write the kind of software that is timeless and enduring, you have to admit that this system works. For all its wastefulness, inefficiency, and pure mendacity ("The old code works fine!" "No wait, the old code is terrible!"), this is the model that has sustained a lot of software companies over the past few decades. Will this cycle go on forever, though? I'm not so sure. Right now, the software industry has been in a nearly two-decade economic boom (with some fits and starts), but the one sure thing in economics is that booms eventually turn to busts. During the boom, software companies can keep hiring new headcount to manage their existing software (i.e. more engineers to understand more boxes and arrows), but if their labor force is forced to contract, then that same system may become unmaintainable. A rapid and permanent reduction in complexity may be the only long-term solution.

One thing working in complexity's favor, though, is that engineers like complexity. Admit it: as much as we complain about other people's complexity, we love our own. We love sitting around and dreaming up new architectural diagrams that can comfortably sit inside our own heads -- it's only when these diagrams leave our heads, take shape in the real world, and outgrow the size of any one person's head that the problems begin. It takes a lot of discipline to resist complexity, to say "no" to new boxes and arrows. To say, "No, we won't solve that problem, because that will just introduce 10 new problems that we haven't imagined yet." Or to say, "Let's go with a much simpler design, even if it seems amateurish, because at least we can understand it." Or to just say, "Let's do less instead of more."
Programming

Museum Restores 21 Rare Videos from Legendary 1976 Computing Conference (computerhistory.org) 58

At Silicon Valley's Computer History Museum, the senior curator just announced the results of a multi-year recovery and restoration process: making available 21 never-before-seen video recordings of a legendary 1976 conference: For five summer days in 1976, the first generation of computer rock stars had its own Woodstock. Coming from around the world, dozens of computing's top engineers, scientists, and software pioneers got together to reflect upon the first 25 years of their discipline in the warm, sunny (and perhaps a bit unsettling) climes of the Los Alamos National Laboratories, birthplace of the atomic bomb.
Among the speakers:

- A young Donald Knuth on the early history of programming languages

- FORTRAN designer John Backus on programming in America in the 1950s — some personal perspectives

- Harvard's Richard Milton Bloch (who worked with Grace Hopper in 1944)

- Mathematician/nuclear physicist Stanislaw M. Ulam on the interaction of mathematics and computing

- Edsger W. Dijkstra on "a programmer's early memories."


The Computer History Museum teases some highlights: Typical of computers of this generation, the 1946 ENIAC, the earliest American large-scale electronic computer, had to be left powered up 24 hours a day to keep its 18,000 vacuum tubes healthy. Turning them on and off, like a light bulb, shortened their life dramatically. ENIAC co-inventor John Mauchly discusses this serious issue....

The Los Alamos peak moment was the brilliant lecture on the British WW II Colossus computing engines by computer scientist and historian of computing Brian Randell. Colossus machines were special-purpose computers used to decipher messages of the German High Command in WW II. Based in southern England at Bletchley Park, these giant codebreaking machines regularly provided life-saving intelligence to the allies. Their existence was a closely-held secret during the war and for decades after. Randell's lecture was — excuse me — a bombshell, one which prompted an immediate re-assessment of the entire history of computing. Observes conference attendee (and inventor of ASCII) IBM's Bob Bemer, "On stage came Prof. Brian Randell, asking if anyone had ever wondered what Alan Turing had done during World War II? From there he went on to tell the story of Colossus — that day at Los Alamos was close to the first time the British Official Secrets Act had permitted any disclosures. I have heard the expression many times about jaws dropping, but I had really never seen it happen before."

Publishing these original primary sources for the first time is part of CHM's mission to not only preserve computing history but to make it come alive. We hope you will enjoy seeing and hearing from these early pioneers of computing.

Programming

'Rust Is Hard, Or: The Misery of Mainstream Programming' (github.io) 123

Hirrolot's blog: When you use Rust, it is sometimes outright preposterous how much knowledge of the language, and how much programming ingenuity and curiosity, you need in order to accomplish the most trivial things. When you feel particularly desperate, you go to rust/issues and search for a solution to your problem. Suddenly, you find an issue with an explanation that it is theoretically impossible to design your API in this way, owing to some subtle language bug. The issue is Open and dated Apr 5, 2017.

I entered Rust four years ago. To this moment, I have co-authored teloxide and dptree, written several publications and translated a number of language release announcements. I also managed to write some production code in Rust, and had a chance to speak at one online meetup dedicated to Rust. Still, from time to time I find myself arguing with Rust's borrow checker and type system for no practical reason. Yes, I am no longer stupefied by such errors as cannot return reference to temporary value - over time, I developed multiple heuristic strategies to cope with lifetimes...

But one recent situation made me fail ignominiously. [...]
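For the curious, the error class the post names can be reproduced in a few lines. A sketch (the function names are invented): returning a `&str` borrowed from a `String` created inside the function is rejected by the borrow checker, and one of the usual coping heuristics is to return an owned `String` instead.

```rust
// The rejected pattern, for reference -- this does NOT compile:
//
//     fn first_word_upper(s: &str) -> &str {
//         let owned = s.to_uppercase();      // temporary owned String
//         owned.split(' ').next().unwrap()   // error: cannot return a
//     }                                      // reference to local `owned`
//
// A common coping strategy: hand the caller an owned value instead.
fn first_word_upper(s: &str) -> String {
    s.split(' ')
        .next()
        .unwrap_or("")
        .to_uppercase() // allocates; the caller owns the result
}

fn main() {
    assert_eq!(first_word_upper("hello world"), "HELLO");
    println!("{}", first_word_upper("hello world"));
}
```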

Programming

Google's Chrome Team Evaluates Retrofitting Temporal Memory Safety on C++ (googleblog.com) 49

"C++ allows for writing high-performance applications but this comes at a price: security..." So says Google's Chrome security team in a recent blog post, adding that in general, "While there is appetite for different languages than C++ with stronger memory safety guarantees, large codebases such as Chromium will use C++ for the foreseeable future."

So the post discusses "our journey of using heap scanning technologies to improve memory safety of C++." The basic idea is to put explicitly freed memory into quarantine and only make it available when a certain safety condition is reached. Microsoft has shipped versions of this mitigation in its browsers: MemoryProtector in Internet Explorer in 2014 and its successor MemGC in (pre-Chromium) Edge in 2015. In the Linux kernel a probabilistic approach was used where memory was eventually just recycled. And this approach has seen attention in academia in recent years with the MarkUs paper. The rest of this article summarizes our journey of experimenting with quarantines and heap scanning in Chrome.
In essence the C++ memory allocator (used by new and delete) is "intercepted." There are various hardening options which come with a performance cost:


- Overwrite the quarantined memory with special values (e.g. zero);

- Stop all application threads when the scan is running or scan the heap concurrently;

- Intercept memory writes (e.g. by page protection) to catch pointer updates;

- Scan memory word by word for possible pointers (conservative handling) or provide descriptors for objects (precise handling);

- Segregate application memory into safe and unsafe partitions to opt out objects that are either performance-sensitive or can be statically proven safe to skip;

- Scan the execution stack in addition to just scanning heap memory...


Running our basic version on Speedometer2 regresses the total score by 8%. Bummer...

To reduce the regression we implemented various optimizations that improve the raw scanning speed. Naturally, the fastest way to scan memory is to not scan it at all and so we partitioned the heap into two classes: memory that can contain pointers and memory that we can statically prove to not contain pointers, e.g. strings. We avoid scanning memory that cannot contain any pointers. Note that such memory is still part of the quarantine, it is just not scanned....

[That and other] optimizations helped to reduce the Speedometer2 regression from 8% down to 2%.
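The quarantine idea described above can be sketched in a few lines. This is a toy model, not Chrome's actual PartitionAlloc implementation, and the type and method names are ours: a "freed" block is poisoned (one of the hardening options listed, overwriting with a special value) and parked in a quarantine list instead of being recycled, then released once a byte budget is exceeded; a real implementation would heap-scan for dangling pointers before purging.

```rust
use std::collections::VecDeque;

// Toy model of quarantine-on-free: freed blocks are poisoned and held
// in quarantine so a stale pointer reads poison, not reallocated data.
struct QuarantineAllocator {
    quarantine: VecDeque<Vec<u8>>,
    quarantined_bytes: usize,
    budget: usize,
}

impl QuarantineAllocator {
    fn new(budget: usize) -> Self {
        Self { quarantine: VecDeque::new(), quarantined_bytes: 0, budget }
    }

    // "Free" a block: poison its contents, then quarantine rather than
    // recycle it immediately.
    fn quarantine_free(&mut self, mut block: Vec<u8>) {
        block.fill(0); // hardening option: overwrite with a special value
        self.quarantined_bytes += block.len();
        self.quarantine.push_back(block);
        if self.quarantined_bytes > self.budget {
            self.purge();
        }
    }

    // Release quarantined memory. In a real mitigation, a heap scan would
    // first verify no live pointers still reference these blocks.
    fn purge(&mut self) {
        self.quarantined_bytes = 0;
        self.quarantine.clear();
    }
}
```

The budget is the knob behind the performance trade-off the post measures: a larger quarantine catches more use-after-frees but holds more memory and makes each scan more expensive.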

Thanks to Slashdot reader Hari Pota for sharing the link.
Music

'Father of MIDI' Dave Smith Dies At 72 (billboard.com) 30

Sad news from long-time Slashdot reader NormalVisual: Synthtopia reports that Dave Smith, founder of the legendary synthesizer manufacturer Sequential Circuits and creator of the MIDI (Musical Instrument Digital Interface) standard, died this past Wednesday.

Some of Smith's notable creations include the Prophet 5, one of the first commercially available digitally-controlled polyphonic analog synthesizers, and the Prophet-600, the first available device to offer MIDI...

Smith, who held degrees in both computer science and electronic engineering from UC Berkeley, was scheduled to appear at this year's National Association of Music Merchants (NAMM) show, but died suddenly. No cause of death has yet been released.

Smith's Wikipedia entry calls his 1977 Prophet 5 "the world's first microprocessor-based musical instrument" and a crucial step forward as a programmable synthesizer.

And this week Billboard magazine hailed Smith as "a key figure in the development of synth technology in the 1970s and 1980s." With Sequential (originally known as Sequential Circuits), Smith released various sequencers and programmers to be used with the Moog and ARP synthesizers prevalent at the time, before designing his own release: the Prophet-5, the first polyphonic synth with programmable memory, allowing sounds to be stored and re-accessed at any time. The Prophet-5 quickly became the gold standard in its field, used in the recording of both epochal '80s blockbuster LPs like Michael Jackson's Thriller and Madonna's Like a Virgin and envelope-pushing scores by era composers like John Carpenter and Vangelis....

Smith's greatest legacy might be the introduction of MIDI to synth technology... Smith's invention (along with Roland pioneer Ikutaro Kakehashi and Sequential engineer Chet Wood) of MIDI allowed unprecedented levels of synchronization and communication between different instruments, computers and other recording equipment, which was previously incredibly difficult to achieve — particularly between equipment designed by separate manufacturers. The innovation of MIDI helped facilitate the explosion of forward-thinking programming and creativity throughout the industry of the '80s, essentially making the future of pop music accessible to all.
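Part of why MIDI made cross-vendor communication tractable is that every channel message is just a status byte plus fixed-meaning data bytes. A small sketch (the byte layout follows the MIDI 1.0 specification; the function itself is our own illustration) decoding a three-byte Note On message:

```rust
// Decode a three-byte MIDI 1.0 Note On channel-voice message.
// Status byte: high nibble 0x9 marks Note On, low nibble is the channel
// (0-15). The two data bytes, key number and velocity, must each be 0-127.
fn decode_note_on(msg: [u8; 3]) -> Option<(u8, u8, u8)> {
    let [status, key, velocity] = msg;
    if status & 0xF0 == 0x90 && key < 128 && velocity < 128 {
        Some((status & 0x0F, key, velocity)) // (channel, key, velocity)
    } else {
        None
    }
}
```

Because any compliant device agrees on this framing, a 1983 keyboard from one manufacturer can drive a sound module from another, which was exactly the interoperability problem Smith set out to solve.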

Smith would also develop the world's first computer synthesizer as president of Seer Systems in the '90s, and launched the company Dave Smith Instruments, an instrument manufacturer, in 2002. He won many awards for his work in the field of musical technology, including a Technical Grammy for MIDI's creation in 2013 (an honor he shared with Kakehashi).

Programming

Should IT Professionals Be Liable for Ransomware Attacks? (acm.org) 250

Denmark-based Poul-Henning Kamp describes himself as the "author of a lot of FreeBSD, most of Varnish and tons of other Open Source Software." And he shares this message in June's Communications of the ACM.

"The software industry is still the problem." If any science fiction author, famous or obscure, had submitted a story where the plot was "modern IT is a bunch of crap that organized crime exploits for extortion," it would have gotten nowhere, because (A) that is just not credible, and (B) yawn!

And yet, here we are.... As I write this, 200-plus corporations, including many retail chains, have inoperative IT because extortionists found a hole in some niche, third-party software product most of us have never heard of.

But he's also proposing a solution. In Denmark, 129 jobs are regulated by law. There are good and obvious reasons why it is illegal for any random Ken, Brian, or Dennis to install toilets or natural-gas furnaces, perform brain surgery, or certify a building is strong enough to be left outside during winter. It may be less obvious why the state cares who runs pet shops, inseminates cattle, or performs zoological taxidermy, but if you read the applicable laws, you will learn that animal welfare and protection of endangered species have many and obscure corner cases.

Notably absent, as in totally absent, on that list are any and all jobs related to IT; IT architecture, computers, computer networks, computer security, or protection of privacy in computer systems. People who have been legally barred and delicensed from every other possible trade — be it for incompetence, fraud, or both — are entirely free to enter the IT profession and become responsible for the IT architecture or cybersecurity of the IT system that controls nearly half the hydrocarbons to the Eastern Seaboard of the U.S....

With respect to gas, water, electricity, sewers, or building stability, the regulations do not care if a company is hundreds of years old or just started this morning; the rules are always the same: Stuff should just work, and only people who are licensed — because they know how to — are allowed to make it work, and they can be sued if they fail to do so.

The time is way overdue for IT engineers to be subject to professional liability, like almost every other engineering profession. Before you tell me that is impossible, please study how the very same thing happened with electricity, planes, cranes, trains, ships, automobiles, lifts, food processing, buildings, and, for that matter, driving a car.

As with software product liability, the astute reader is apt to exclaim, "This will be the end of IT as we know it!" Again, my considered response is, "Yes, please, that is precisely my point!"

NASA

NASA Programmer Remembers Debugging Lisp In Deep Space (thenewstack.io) 70

joshuark writes: NASA programmer/scientist Ron Garret shares his experience debugging LISP code from 150 million miles away on the robotic Mars rover Sojourner. Garret describes his experience in a recent episode of Adam Gordon Bell's Corecursive podcast, and later explains of NASA's next LISP effort, the New Millennium project: "And it didn't work..."

Like a professor said in LISP programming class: LISP -- getting it done is half DEFUN. Garret had written an essay in 2006, titled "How knowing LISP destroyed my programming career," available on the web archive. So much for LISPcraft, or the Little LISPer.

The Almighty Buck

Survey Finds Highest Developer Interest in Blockchain Apps, Cryptocurrencies, and NFTs (zdnet.com) 62

Charlotte Web writes: A recent survey of 20,000 developers found a third (34%) were learning about cryptocurrencies, ZDNet reports — and 16% even said they were actively working on crypto-related projects. (And 11% said they were actively working on NFT technology, while 32% said they were learning more about NFTs.)

30% also said they were learning about blockchain technologies other than cryptocurrencies (with 12% currently working on blockchain projects, just 1% higher than in a 2021 survey).

Citing the survey, ZDNet adds that "The next most popular technologies were the metaverse and AI-assisted software development: 28% of developers are learning about these technologies."

Programming

What Made Golang Become Popular? Its Creators Look Back (acm.org) 52

Created at Google in late 2007, the Go programming language was open sourced in late 2009, its creators recall, and "since then, it has operated as a public project, with contributions from thousands of individuals and dozens of companies."

In a joint essay in Communications of the ACM, five of the language's original creators explore what brought growing popularity to this "garbage-collected, statically compiled language for building systems" (with its self-contained binaries and easy cross-compilation). "The most important decisions made in the language's design...were the ones that made Go better for large-scale software engineering and helped us attract like-minded developers...." Although the design of most languages concentrates on innovations in syntax, semantics, or typing, Go is focused on the software development process itself. Go is efficient, easy to learn, and freely available, but we believe that what made it successful was the approach it took toward writing programs, particularly with multiple programmers working on a shared codebase. The principal unusual property of the language itself — concurrency — addressed problems that arose with the proliferation of multicore CPUs in the 2010s. But more significant was the early work that established fundamentals for packaging, dependencies, build, test, deployment, and other workaday tasks of the software development world, aspects that are not usually foremost in language design.

These ideas attracted like-minded developers who valued the result: easy concurrency, clear dependencies, scalable development and production, secure programs, simple deployment, automatic code formatting, tool-aided development, and more. Those early developers helped popularize Go and seeded the initial Go package ecosystem. They also drove the early growth of the language by, for example, porting the compiler and libraries to Windows and other operating systems (the original release supported only Linux and MacOS X). Not everyone was a fan — for instance, some people objected to the way the language omitted common features such as inheritance and generic types. But Go's development-focused philosophy was intriguing and effective enough that the community thrived while maintaining the core principles that drove Go's existence in the first place. Thanks in large part to that community and the technology it has built, Go is now a significant component of the modern cloud computing environment.

Since Go version 1 was released, the language has been all but frozen. The tooling, however, has expanded dramatically, with better compilers, more powerful build and testing tools, and improved dependency management, not to mention a huge collection of open source tools that support Go. Still, change is coming: Go 1.18, released in March 2022, includes the first version of a true change to the language, one that has been widely requested — the first cut at parametric polymorphism.... We considered a handful of designs during Go's first decade but only recently found one that we feel fits Go well. Making such a large language change while staying true to the principles of consistency, completeness, and community will be a severe test of the approach.

Programming

Developer Survey: JavaScript and Python Reign, but Rust is Rising (infoworld.com) 60

SlashData's "State of the Developer Nation" surveyed more than 20,000 developers in 166 countries, taken from December 2021 to February 2022, reports InfoWorld.

It found the most popular programming language is JavaScript — followed by Python (which apparently added 3.3 million new net developers in just the last six months). And Rust adoption nearly quadrupled over the last two years to 2.2 million developers.

InfoWorld summarizes other findings from the survey: Java continues to experience strong and steady growth. Nearly 5 million developers have joined the Java community since the beginning of 2021.

PHP has grown the least in the past six months, with an increase of 600,000 net new developers between Q3 2021 and Q1 2022. But PHP is the second-most-commonly used language in web applications after JavaScript.

Go and Ruby are important languages in back-end development, but Go has grown more than twice as fast in the past year. The Go community now numbers 3.3 million developers.

The Kotlin community has grown from 2.4 million developers in Q1 2021 to 5 million in Q1 2022. This is largely attributed to Google making Kotlin its preferred language for Android development.
