AI Hallucinations Lead To a New Cyber Threat: Slopsquatting 51

Researchers have uncovered a new supply chain attack called Slopsquatting, where threat actors exploit hallucinated, non-existent package names generated by AI coding tools like GPT-4 and CodeLlama. These believable yet fake packages, representing almost 20% of the samples tested, can be registered by attackers to distribute malicious code. CSO Online reports: Slopsquatting, as researchers are calling it, is a term first coined by Seth Larson, a security developer-in-residence at the Python Software Foundation (PSF), for its resemblance to the typosquatting technique. Instead of relying on a user's mistake, as in typosquats, threat actors rely on an AI model's mistake. A significant share of the packages recommended in test samples -- 19.7% (205,000 packages) -- were found to be fakes. Open-source models -- like DeepSeek and WizardCoder -- hallucinated more frequently, at 21.7% on average, compared to commercial ones like GPT-4 (5.2%). Researchers found CodeLlama (hallucinating over a third of its outputs) to be the worst offender, and GPT-4 Turbo (just 3.59% hallucinations) to be the best performer.

These package hallucinations are particularly dangerous as they were found to be persistent, repetitive, and believable. When researchers reran 500 prompts that had previously produced hallucinated packages, 43% of hallucinations reappeared every time in 10 successive re-runs, with 58% of them appearing in more than one run. The study concluded that this persistence indicates "that the majority of hallucinations are not just random noise, but repeatable artifacts of how the models respond to certain prompts." This increases their value to attackers, it added. Additionally, these hallucinated package names were observed to be "semantically convincing." Thirty-eight percent of them had moderate string similarity to real packages, suggesting a similar naming structure. "Only 13% of hallucinations were simple off-by-one typos," Socket added.
The research can be found in a paper on arXiv.org (PDF).
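The defensive takeaway from the study is straightforward: never install a package name straight out of an LLM's output without checking it against something you already trust. A minimal sketch of that idea (the package names and the allowlist below are hypothetical, and a real setup would source the allowlist from a vetted lockfile or internal index):

```python
def vet_suggestions(suggested, vetted):
    """Split AI-suggested package names into (approved, needs_review).

    `vetted` is a set of dependency names a human has already audited;
    anything outside it -- including a plausible-looking hallucination --
    gets flagged for review before it ever reaches `pip install`.
    """
    normalized = [s.strip().lower() for s in suggested]
    approved = [s for s in normalized if s in vetted]
    needs_review = [s for s in normalized if s not in vetted]
    return approved, needs_review


# Hypothetical example: one real dependency, one slop-looking name.
approved, flagged = vet_suggestions(
    ["requests", "Fastjson-Utils"],
    vetted={"requests", "numpy"},
)
```

Here `"Fastjson-Utils"` ends up in the review pile, which is exactly where a semantically convincing hallucination should land.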
  • by Brain-Fu ( 1274756 ) on Monday April 21, 2025 @10:24PM (#65322195) Homepage Journal

    You have to do your research and make sure the packages you are importing are legit. This is true whether or not the package was recommended by an AI.

    I guess sloth IS a risk. Vibe coders may get into the habit of just trusting whatever the LLM churns out. Could be a problem. But either way, it's still on you.

    • by martin-boundary ( 547041 ) on Monday April 21, 2025 @10:32PM (#65322201)
      I like how people have to buy tokens to receive the Wisdom of Superhuman Coding Overlord AIs that repeatedly tell them to use the same fake packages every time, but it's always the people who are responsible for following the bad advice they paid for.

      It's a great business model! Risk free! How can I invest in it?

      • by DarkOx ( 621550 )

        It's just a variation on Lucy's psychiatry booth in Peanuts. Get people to pay for advice while disclaiming any responsibility for it; anything that does not work out is a personal moral failing in the buyer. Same thing with 'self-driving' cars, etc.

      • by mysidia ( 191772 )

        How can I invest in it?

        Use Vibe coding to create your own AI-based service.

    • You could do that or you could use a service to scan your repo for you (even non-vibe coders can do it).
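      A low-effort version of such a scan is just pulling the bare names out of a requirements file so they can be checked against a package index or an internal allowlist. A rough sketch (this handles only common requirements.txt forms, not the full PEP 508 grammar):

```python
import re


def requirement_names(lines):
    """Extract bare package names from requirements.txt-style lines,
    skipping comments, blank lines, and pip options like '-r other.txt'.
    """
    names = []
    for line in lines:
        line = line.split("#", 1)[0].strip()  # drop trailing comments
        if not line or line.startswith("-"):  # skip blanks and pip flags
            continue
        # the name ends at the first version/extras/marker character
        match = re.match(r"[A-Za-z0-9._-]+", line)
        if match:
            names.append(match.group(0).lower())
    return names


# Example: parse a small hypothetical requirements file.
names = requirement_names([
    "requests>=2.0  # http client",
    "",
    "-r dev.txt",
    "numpy==1.26",
    "Flask[async]>=2",
])
```

Each extracted name could then be looked up against PyPI or a private mirror; anything unknown is a candidate slopsquat.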
    • In software development the trend has been to just use many open source dependencies, with people frowning at you if you write your own instead. But at my last job I suddenly saw the opposite trend: people acknowledged the downsides of depending on foreign dependencies.
      • Businesses sure have an incentive to encourage use of dependencies: it saves them a lot of money and gets their product to market sooner. They don't want to pay programmers a fortune to build something that has already been built, especially when they can just use it for free.

        There ARE long term consequences of course. Inherited bugs, inherited security vulnerabilities, and you have to wait for someone else to fix it on their schedule. You have to keep updating the packages either way and sometimes that

    • I am not a coder. If I want a simple program to read data from a humidity sensor, it involves Python downloading random packages and dependencies.

      • There is an easy solution to that: become a coder :-).

        And yes, read the sensor's specification page. It's actually quite fun to make one's own low-level library to communicate with it - and the joy when it works! It's priceless.
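        For what it's worth, the low-level part is often just a couple of bytes and a datasheet formula. A sketch assuming an SHT3x-class humidity sensor, whose datasheet gives RH = 100 * S_RH / (2^16 - 1) for the 16-bit raw reading (the bus-read step is omitted; only the conversion is shown):

```python
def sht3x_humidity(msb: int, lsb: int) -> float:
    """Convert a 16-bit raw humidity reading (two I2C bytes, most
    significant byte first) to percent relative humidity, using the
    SHT3x datasheet formula RH = 100 * S_RH / (2**16 - 1)."""
    raw = (msb << 8) | lsb      # reassemble the 16-bit raw value
    return 100.0 * raw / 65535.0


# Example: an all-ones reading corresponds to 100% RH.
rh = sht3x_humidity(0xFF, 0xFF)
```

No dependencies required beyond whatever reads the two bytes off the bus.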

      • by zlives ( 2009072 )

        as a non coder myself, i pay for valid code from the sensor manufacturer/supplier and hope the vendor didn't cut corners.
        or i can choose to become a coder and roll my own.
        security is so inconvenient, like all the time

    • "Do your research". On what? The Golang, Rust, NodeJS, JPackage, CPAN, Pypi, or other source repo hosted tools with dependencies scattered everywhere? One may as well rely on the label "Made in America", it's impossible _by design_ to see past the "layer of abstraction" and verify the whole code stack. This is one of the primary goals of "object oriented" software, to conceal anything but the snippet you are tasked to work on.

  • god damn it (Score:2, Funny)

    by Anonymous Coward
    This is what happened when EZ Pass made tollbooths obsolete. The morons of society could no longer work in tollbooths, so somehow they wormed their way into tech companies. Remember when you had to have a degree from Stanford, MIT, Berkeley, Cal Tech or another first-rate university to get a job at a major tech company unless you had a very impressive resume?
    • I am intrigued by your conspiracy theory and I wish to subscribe to your newsletter - provided it's not protected by a paywall. But maybe you are one of *them*... *puts tinfoil hat on*

    • by zlives ( 2009072 )

      your morons were so preoccupied with whether or not they could code, they didn't stop to think if they should.

  • Prior Art (Score:5, Informative)

    by Kunedog ( 1033226 ) on Monday April 21, 2025 @10:58PM (#65322219)
  • by tiqui ( 1024021 ) on Tuesday April 22, 2025 @12:14AM (#65322281)

    be doing all our coding in the future. "Who needs programmers anymore" seems to be the new mantra in the corporate corner offices.

    It was bad enough when incompetent human programmers used unallocated memory or freed memory they were still using, but now we'll get to see the effects of "AI hallucinations"... oh, joy...

    What could POSSIBLY go wrong? go wrong? go wrong? go wrong?...

    • I'd imagine that the folks who looked after horses in New York City in the late 1800s looked with similar disdain on those limited, buggy, undependable new streetcars. The difference is that these tools are improving far faster than the automobile did. They had several decades to come to terms with it. We don't.

      • I'd imagine that the folks who looked after horses in New York City in the late 1800s looked with similar disdain on those limited, buggy, undependable new streetcars. The difference is that these tools are improving far faster than the automobile did. They had several decades to come to terms with it. We don't.

        They were pretty much a minority. NYC (and others) were suffocating in horse shit. They were ready for motorized transportation big time. Must have smelled something awful. So they traded horseshit for diesel fumes.

        • by DarkOx ( 621550 )

          It wasn't just the excrement either, often the animals would die or have to be put down in the streets.

          How do you move a 1000-pound animal corpse? Right, you don't without help or equipment. So those decaying corpses would lie there on hot cobblestones or brick roads in the summer until they could be cleaned up.

      • buggy

        In that case, the buggy is the older more reliable tech.

    • be doing all our coding in the future. "Who needs programmers anymore" seems to be the new mantra in the corporate corner offices.

      It was bad enough when incompetent human programmers used unallocated memory or freed memory they were still using, but now we'll get to see the effects of "AI hallucinations"... oh, joy...

      What could POSSIBLY go wrong? go wrong? go wrong? go wrong?...

      As well, we need to reactivate 1970's nuclear reactors to serve up the AI referencing itself and delivering our malware. It takes serious power to do that!

  • by ihadafivedigituid ( 8391795 ) on Tuesday April 22, 2025 @12:22AM (#65322293)
    I swear, the future is weird as hell.
  • This is a known supply chain attack... but now they added the label "AI". Someone must get paid per advisory.

    • It's a "known supply chain attack" that is specifically applicable to LLMs (AI), since LLMs seem to have a pattern in their hallucinated packages.

      I wonder if you can get paid per stupid comment.
    • There is AI in supply chAIn!

    • by mysidia ( 191772 )

      This is a "future supply chain" attack, because they're attacking a supply chain that doesn't even exist yet.

      Thanks to vibe coding, your programming AIs are now deriving your software from dependencies that don't exist yet.

  • This isn't going to get major corporations who have internal AI. This is going to get the startup who has no real coders. Serves them right, I guess.

  • Remembering some package names I came across, I think one of the source problems is that everyone is trying to come up with an obscure, in-joke, oh-so-clever name for the extra nerd credz or "teh lulz"...
  • If these libraries are repeatedly hallucinated, do they represent gaps in functionality that should exist that human coders have missed? Do they represent open opportunities?

    • Like everything open source, it's probably been done a dozen times over already because someone didn't like that the original author started the project on a Tuesday and had to fork it.

  • Based purely on "this package name looks good."
  • Just because a machine wrote the code doesn't mean we can't use our standard static analysis tools on it to spot problems and enforce organizational coding inspections. This would necessarily include making sure your imports and includes are valid and allowed from license, performance, behavior, and sanity perspectives. Or are we just throwing everything out and letting an undergrad with a chatGPT account do it all now? Coding standards, tools, and inspections are more important when the input mechanism
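    A check like that can indeed be automated without any AI involvement; for Python, the standard-library ast module is enough to pull out a module's top-level imports and diff them against an allowlist. A minimal sketch (the allowlist and the fake package name are hypothetical):

```python
import ast


def unapproved_imports(source: str, allowed: set) -> set:
    """Parse Python source and return top-level imported package names
    that are not on the organization's allowlist."""
    tree = ast.parse(source)
    found = set()
    for node in ast.walk(tree):
        if isinstance(node, ast.Import):
            # 'import os.path' counts as importing the 'os' package
            found.update(alias.name.split(".")[0] for alias in node.names)
        elif isinstance(node, ast.ImportFrom) and node.module and node.level == 0:
            # skip relative imports (level > 0); they are local code
            found.add(node.module.split(".")[0])
    return found - allowed


# Example: a hallucinated-looking package slips into generated code.
code = (
    "import os\n"
    "import os.path\n"
    "from totally_real_pkg import helper\n"
    "from . import local\n"
)
flagged = unapproved_imports(code, allowed={"os", "sys"})
```

Running this in CI against generated code would catch a hallucinated import before anyone tries to install it.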
    • by zlives ( 2009072 )

      good news everyone, tomorrow you won't need that highly underpaid undergrad intern to write the code.

  • Only csoonline calls it slopsquatting.
    The scientific paper does not contain the word; it describes the attack class as "typosquatting, combosquatting, brandjacking, and similarity attacks."
