AI Security

AI Hallucinated a Dependency. So a Cybersecurity Researcher Built It as Proof-of-Concept Malware (theregister.com)

"Several big businesses have published source code that incorporates a software package previously hallucinated by generative AI," the Register reported Thursday

"Not only that but someone, having spotted this recurring hallucination, had turned that made-up dependency into a real one, which was subsequently downloaded and installed thousands of times by developers as a result of the AI's bad advice, we've learned." If the package had been laced with actual malware, rather than being a benign test, the results could have been disastrous.

According to Bar Lanyado, security researcher at Lasso Security, one of the businesses fooled by AI into incorporating the package is Alibaba, which at the time of writing still includes a pip command to download the Python package huggingface-cli in its GraphTranslator installation instructions. There is a legit huggingface-cli, installed using pip install -U "huggingface_hub[cli]". But the huggingface-cli distributed via the Python Package Index (PyPI) and required by Alibaba's GraphTranslator — installed using pip install huggingface-cli — is fake, imagined by AI and turned real by Lanyado as an experiment.

He created huggingface-cli in December after seeing it repeatedly hallucinated by generative AI; by February this year, Alibaba was referring to it in GraphTranslator's README instructions rather than the real Hugging Face CLI tool... huggingface-cli received more than 15,000 authentic downloads in the three months it has been available... "In addition, we conducted a search on GitHub to determine whether this package was utilized within other companies' repositories," Lanyado said in the write-up for his experiment. "Our findings revealed that several large companies either use or recommend this package in their repositories...."

Lanyado also said that there was a Hugging Face-owned project that incorporated the fake huggingface-cli, but that was removed after he alerted the biz.

"With GPT-4, 24.2 percent of question responses produced hallucinated packages, of which 19.6 percent were repetitive, according to Lanyado..."
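The attack hinges on nothing more than a plausible package name: nothing ties an AI-suggested name to a real, vetted project, and `pip install` will happily fetch whatever is registered under that name on PyPI. A minimal defensive sketch is to diff AI-suggested names against a vetted allowlist before installing anything; the lists below are illustrative assumptions (mirroring the article's example names), not code from the research:

```python
def flag_unvetted(suggested, allowlist):
    """Return the suggested package names that are not on the vetted list."""
    vetted = {name.lower() for name in allowlist}
    return [pkg for pkg in suggested if pkg.lower() not in vetted]

# The real CLI ships inside huggingface_hub; the bare "huggingface-cli"
# name was unclaimed on PyPI until the researcher registered it himself.
suggested = ["requests", "huggingface-cli"]
allowlist = ["requests", "huggingface_hub"]
print(flag_unvetted(suggested, allowlist))  # → ['huggingface-cli']
```

Note that a name's mere existence on PyPI proves nothing here: after Lanyado's experiment, the hallucinated `huggingface-cli` resolves and installs just fine.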

Thanks to long-time Slashdot reader schneidafunk for sharing the article.
  • by Kunedog ( 1033226 ) on Saturday March 30, 2024 @03:44PM (#64357162)

    Not only that but someone, having spotted this recurring hallucination, had turned that made-up dependency into a real one, which was subsequently downloaded and installed thousands of times by developers as a result of the AI's bad advice, we've learned.

    If you're married, then you know that sometimes this kind of thing can be worth it just to avoid starting another argument.

    • Re: (Score:3, Insightful)

      by war4peace ( 1628283 )

      If you're married to the wrong person, then you know that sometimes this kind of thing can be worth it just to avoid starting another argument.

      Fixed that for you.

      • We can tell you either aren't married or will not be for long. If your relationship isn't work then how do you know you or your spouse isn't just there because it's easy?
        • You can't tell shit, my friend :)
          Our marriage isn't work because we match each other very well, we both know when to compromise, and we deeply respect each other to name a few of the many aspects that make our relationship work perfectly.
          We both know we're the lucky few in a world full of wrong matches.

          • Yeah she's cheating on you
          • Just don't have kids or pets. They bring up all kinds of new different issues and if you aren't adapted to discussing them then you won't last long. I have known a couple couples that didn't fight... right up until the point they were divorced. One of you is backing down and making up for it some way outside of the relationship. It sounds like one of you is giving in for fear of arguing rather than compromising.
            • No kids, but plenty of pets.
              Maybe I should point out this is my second marriage (I learned a LOT from the first one).
              We are together 100% of the time. Working from home is a blessing.
              And yes, plenty of small compromises to go around, but they are all openly discussed and agreed; I'd say the count of compromises is 50/50.

              I'm aware this type of relationship is very rare, but I've seen it at my maternal grandparents before. 53 years of blissful marriage, through pretty rough times in part, until one of them sa

    • So the AI was not hallucinating - it was prophetic.
  • by MpVpRb ( 1423381 ) on Saturday March 30, 2024 @04:10PM (#64357230)

    ...for research and fun
    It should NOT be used for serious work

    • by gweihir ( 88907 )

      Indeed. The only time you can use it for serious work is if the task the AI is asked to do is significantly below your own level of expertise and you fully verify the results. Of course, that probably will cost you more time than doing it yourself from the start.

      • Of course, that probably will cost you more time than doing it yourself from the start.

        Not if you have AI do it for you.

        Indeed. The only time you can use it for serious work is if the task the AI is asked to do is significantly below your own level of expertise and you fully verify the results. Of course, that probably will cost you more time than doing it yourself from the start.

        Not so, depending on the task.

        The task probably should be at or below your level of expertise, sure. But I'm using it frequently as a time saver. And it works.

        "Given the following php code, add pagination functionality."

        Boom, done. Could I have done it? Sure. In 4 seconds? Nope.

    • It's also great for your employer to pretend they can replace you on a whim, so you'll be afraid to ask for a raise.
    • by hey! ( 33014 )

      A chainsaw is a great tool. Some fool chopping his leg off with one doesn't change that.

    • by Lumpy ( 12016 )

      But companies are expecting to replace all their programmers with AI.... What could possibly go wrong?

  • by ffkom ( 3519199 ) on Saturday March 30, 2024 @04:33PM (#64357268)
    ... what a dream team for malicious actors! Both on their own are already a guarantee for all kinds of security disasters, but combining the two is truly peak incompetence.
  • Doom Loop (Score:4, Interesting)

    by mspohr ( 589790 ) on Saturday March 30, 2024 @05:21PM (#64357374)

    AI is entering a doom loop where it hallucinates then incorporates the hallucination into subsequent versions.
    Eventually AI will be all hallucinations.

    • Is hallucinate the right word? Maybe it's realizing people can be fed b.s.
    • by Lumpy ( 12016 )

      When you feed it a giant cesspool of unvalidated data (the internet) you should not expect a single response to be accurate. None of these AIs are fed a carefully curated data set.

  • Needs a catchy name (Score:5, Interesting)

    by clawsoon ( 748629 ) on Saturday March 30, 2024 @05:32PM (#64357386)
    I propose the straightforward "hallucination attack".
  • by SuperKendall ( 25149 ) on Saturday March 30, 2024 @11:06PM (#64357862)

    I asked an LLM (doesn't really matter which one, they all fail in the same way) for advice on how to load a specific file type, and it gave me three possible packages to use to be able to load the file...

    But not one of them actually existed. When pressed further on one of the frameworks that didn't exist, it doubled down and gave me a website for the package - which also did not exist.

    And that led me to think, maybe I should build out that package. Not in order to create malware as described here, but because you already know some people in a similar situation will be directed right to a package of that specific name, without you having to do anything!

    So it's pre-made marketing just waiting for a product.

    • by Waccoon ( 1186667 ) on Sunday March 31, 2024 @11:02PM (#64360096)

      So now AI thinks up the ideas and writes the specs, and real people do all the work to make this crap work.

      That's it... AI has now graduated to being the new pointy-haired boss.

      • So now AI thinks up the ideas and writes the specs, and real people do all the work to make this crap work.

        It's more like AI opens a door, and you take advantage of the people walking through it.

        Sort of more like a force of nature than a boss.

        Or if you like, it is recognizing a shortcoming by pretending something does not exist when plainly it would be useful, and someone opts to fill that hole for the benefit of mankind. Although in truth that scenario feels a bit like AI is a boss. :-)

  • by simlox ( 6576120 ) on Sunday March 31, 2024 @04:40AM (#64358172)
    That reads answers, checks URLs, sees if code can compile, and checks references to see whether they state what is claimed in the answer?
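    A checker like the one described could start by simply harvesting the URLs an answer cites, then probing each one. A minimal sketch of the extraction step (the sample answer text is a made-up assumption):

    ```python
    import re

    # Rough pattern for http(s) URLs in free text; trailing punctuation is stripped.
    URL_RE = re.compile(r"https?://[^\s)\"'>]+")

    def extract_urls(answer: str) -> list[str]:
        """Pull candidate URLs out of an LLM answer for later verification."""
        return [u.rstrip(".,;") for u in URL_RE.findall(answer)]

    answer = "Install it from https://example.com/pkg and see https://pypi.org/project/requests/."
    print(extract_urls(answer))
    # → ['https://example.com/pkg', 'https://pypi.org/project/requests/']
    ```

    Each extracted URL could then be fetched to confirm it actually resolves, which would have caught the nonexistent package websites described in the story below.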
  • Ironically, that this kind of stuff would happen with a big probabilistic prediction engine was to be expected.

    Let the AI write code, they said.
    It's really good at it, they said.
    Look how fast it generates code when I put in a natural language prompt!
    What could possibly go wrong?

  • by ghoul ( 157158 ) on Sunday March 31, 2024 @05:42AM (#64358238)
    At this point it seems like there are vulnerabilities deliberately being built into AIs by programmers who are worried AI is coming for the jobs of their brethren. Apparently some folks at the AI company have empathy for the thousands of programmers AI is about to make redundant.
