'GitHub Actions' Artifacts Leak Tokens, Expose Cloud Services and Repositories (securityweek.com)

Security Week brings news about CI/CD workflows using GitHub Actions in build processes. Some workflows can generate artifacts that "may inadvertently leak tokens for third party cloud services and GitHub, exposing repositories and services to compromise, Palo Alto Networks warns." [The artifacts] function as a mechanism for persisting and sharing data across jobs within the workflow and ensure that data is available even after the workflow finishes. [The artifacts] are stored for up to 90 days and, in open source projects, are publicly available... The identified issue, a combination of misconfigurations and security defects, allows anyone with read access to a repository to consume the leaked tokens, and threat actors could exploit it to push malicious code or steal secrets from the repository. "It's important to note that these tokens weren't part of the repository code but were only found in repository-produced artifacts," Palo Alto Networks' Yaron Avital explains...

"The Super-Linter log file is often uploaded as a build artifact for reasons like debuggability and maintenance. But this practice exposed sensitive tokens of the repository." Super-Linter has been updated and no longer prints environment variables to log files.
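The failure mode is easy to reproduce. Below is a minimal Python sketch (the function names and log path are made up for illustration) of how an environment dump in a debug log carries a token into anything that later archives that log, plus the kind of pattern match that finds it:

```python
import re

# Hypothetical illustration: a CI step dumps its environment to a log file
# "for debuggability" -- the same pattern that leaked tokens from Super-Linter logs.
def dump_env_to_log(env: dict, log_path: str) -> None:
    with open(log_path, "w") as log:
        for key, value in env.items():
            log.write(f"{key}={value}\n")  # any secret in the env lands here verbatim

# Common GitHub token prefixes: ghp_ (personal), ghs_ (app/installation), gho_ (OAuth)
TOKEN_RE = re.compile(r"\bgh[pso]_[A-Za-z0-9]{36}\b")

def find_leaked_tokens(log_path: str) -> list:
    """Return every token-shaped string found in the log."""
    with open(log_path) as log:
        return TOKEN_RE.findall(log.read())
```

Once such a log is swept into an uploaded artifact, anyone with read access to the artifact can run the same scan.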

Avital was able to identify a leaked token that, unlike the GitHub token, would not expire as soon as the workflow job ends, and automated the process that downloads an artifact, extracts the token, and uses it to replace the artifact with a malicious one. Because subsequent workflow jobs would often use previously uploaded artifacts, an attacker could use this process to achieve remote code execution (RCE) on the job runner that uses the malicious artifact, potentially compromising workstations, Avital notes.
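The extraction half of that automation can be sketched offline (a hedged illustration, not Avital's actual tooling; the real attack would additionally use the GitHub API to download the artifact and re-upload a tampered one):

```python
import io
import re
import zipfile

TOKEN_RE = re.compile(r"\bgh[pso]_[A-Za-z0-9]{36}\b")

def extract_tokens_from_artifact(artifact_zip: bytes) -> set:
    """Scan every file inside a downloaded artifact archive for token-shaped strings."""
    tokens = set()
    with zipfile.ZipFile(io.BytesIO(artifact_zip)) as zf:
        for name in zf.namelist():
            text = zf.read(name).decode("utf-8", errors="ignore")
            tokens.update(TOKEN_RE.findall(text))
    return tokens
```

Any long-lived token found this way remains usable after the workflow ends, which is what made the artifact-replacement step possible.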

Avital's blog post notes other variations on the attack — and "The research laid out here allowed me to compromise dozens of projects maintained by well-known organizations, including firebase-js-sdk by Google, a JavaScript package directly referenced by 1.6 million public projects, according to GitHub. Another high-profile project involved adsys, a tool included in the Ubuntu distribution used by corporations for integration with Active Directory." (Avital says the issue even impacted projects from Microsoft, Red Hat, and AWS.) "All open-source projects I approached with this issue cooperated swiftly and patched their code. Some offered bounties and cool swag."

"This research was reported to GitHub's bug bounty program. They categorized the issue as informational, placing the onus on users to secure their uploaded artifacts." My aim in this article is to highlight the potential for unintentionally exposing sensitive information through artifacts in GitHub Actions workflows. To address the concern, I developed a proof of concept (PoC) custom action that safeguards against such leaks. The action uses the @actions/artifact package, which is also used by the upload-artifact GitHub action, adding a crucial security layer by using an open-source scanner to audit the source directory for secrets and blocking the artifact upload when risk of accidental secret exposure exists. This approach promotes a more secure workflow environment...
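The guard described there might look roughly like this (a sketch under assumptions: the secret patterns are illustrative, and the real PoC action calls @actions/artifact from JavaScript rather than this hypothetical `safe_upload` wrapper):

```python
import re
from pathlib import Path

# Illustrative patterns only; real scanners ship far larger rule sets.
SECRET_PATTERNS = [
    re.compile(r"\bgh[pso]_[A-Za-z0-9]{36}\b"),  # GitHub tokens
    re.compile(r"\bAKIA[0-9A-Z]{16}\b"),         # AWS access key IDs
]

def audit_artifact_dir(root: str) -> list:
    """Return (file, match) pairs for anything secret-shaped under root."""
    findings = []
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        text = path.read_text(errors="ignore")
        for pattern in SECRET_PATTERNS:
            for match in pattern.findall(text):
                findings.append((str(path), match))
    return findings

def safe_upload(root: str) -> bool:
    """Block the (hypothetical) artifact upload when the audit finds anything."""
    findings = audit_artifact_dir(root)
    if findings:
        for file, secret in findings:
            print(f"BLOCKED: possible secret in {file}: {secret[:8]}...")
        return False  # refuse the upload
    # ...a real action would now hand the directory to @actions/artifact...
    return True
```

The key design point is ordering: the audit runs on the source directory before anything is packaged, so a hit stops the artifact from ever existing.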

As this research shows, we have a gap in the current security conversation regarding artifact scanning. GitHub's deprecation of Artifacts V3 should prompt organizations using the artifacts mechanism to reevaluate the way they use it. Security defenders must adopt a holistic approach, meticulously scrutinizing every stage — from code to production — for potential vulnerabilities. Overlooked elements like build artifacts often become prime targets for attackers. Reduce workflow permissions of runner tokens according to least privilege and review artifact creation in your CI/CD pipelines. By implementing a proactive and vigilant approach to security, defenders can significantly strengthen their project's security posture.
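As a starting point for that review, even a rough lint over workflow files can flag the obvious permission problems (a simplistic sketch; it only does text matching against an assumed GitHub Actions YAML layout, not real policy enforcement):

```python
import re

def check_workflow_permissions(workflow_text: str) -> str:
    """Rough text-level lint of a GitHub Actions workflow file;
    not a YAML parser, and the messages are illustrative."""
    if re.search(r"^permissions:\s*write-all\s*$", workflow_text, re.M):
        return "write-all: broadest token scope, violates least privilege"
    if not re.search(r"^\s*permissions:", workflow_text, re.M):
        return "no permissions block: the runner token keeps the repo/org default scopes"
    return "permissions block present: verify each scope is actually needed"
```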

The blog post also notes protection and mitigation features from Palo Alto Networks....


  • by VeryFluffyBunny ( 5037285 ) on Monday August 19, 2024 @07:01AM (#64717622)
    ...to Microsoft's inefficiency that it's taken them 6 years to screw up Github's security.
    • by gweihir ( 88907 )

      I agree. The MS of old would have messed up this bad after no more than four years. I guess MS does not even get "messing up" right anymore. They are a total joke these days.

  • Two men can keep a secret only if one of them is dead.

    If you upload your secrets to a repo that is public, well, they are out there. Sure, there are some possibly surprising gotchas, like if you then make things private, some of the history remains forever public. Yeah, you'd have to RTFM to know that, and I'll grant you it might violate the principle of least surprise.

    That said, there is also the reality that if you ever made an authentication secret public, even briefly, the ONLY correct action to take is to change it immediately.

    • NOT on the repo (Score:4, Informative)

      by DrYak ( 748999 ) on Monday August 19, 2024 @09:14AM (#64717804) Homepage

      If you upload your secrets to repo that is public well they are out there.

      I know that not RTFA is popular on /. , but according to the summary:

      "It's important to note that these tokens weren't part of the repository code but were only found in repository-produced artifacts," Palo Alto Networks' Yaron Avital explains...

      The secrets WERE NOT UPLOADED to the public repo.

      But, as part of the CI/CD scripts:
      - some scripts need to connect to services. They do so using a security token.
      - some other scripts output a dump of the environment (for debugging purposes) as part of their log
      - some third script saves all the output of the CI/CD run in a ZIP (again, for debugging purposes).

      What happens is that:
      - for security reasons, the tokens aren't stored in the (public) repo; they are stored in a (supposedly) secure and restricted store and passed as specific environment variables to the scripts.
      - SuperLinter lists all environment variables in its log for debugging (which includes the special environment variable with the token)
      - Save Artifacts is configured to include the log files of SuperLinter, so it packs them in, including the dump of the environment, including the special variable
      - the ZIP file is available to download from the CI/CD pipeline summary page (in case a user or external author of a pull request wants to debug it), but it also means that there is now a downloadable ZIP file with the security tokens somewhere in it, even though the tokens were never in the repo itself.
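      That chain can be simulated end to end in a few lines (a hedged illustration with made-up names, not real CI code):

```python
import io
import zipfile

def simulate_leak_chain(secret_token: str) -> bytes:
    """Reproduce the chain: env-only token -> debug log -> artifact ZIP.
    Variable and file names are made up for illustration."""
    # 1. The token never touches the repo; it arrives via the environment.
    env = {"CI": "true", "SECRET_TOKEN": secret_token}
    # 2. The linter dumps the whole environment into its log for debugging.
    log = "\n".join(f"{k}={v}" for k, v in env.items())
    # 3. "Save artifacts" zips the log so it can be downloaded from the pipeline page.
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as zf:
        zf.writestr("super-linter.log", log)
    return buf.getvalue()
```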

      In short: the tokens were never published in the repo, but a chain of misconfigurations ends up with the tokens accidentally leaked from the "secure" (private) store into debug material available for download.

      TL;DR: It's not a clueless user actually pushing a token into a git commit, it's a clueless dev who misconfigured their CI/CD in a way that accidentally leaks the token into automatically generated ZIP files.

      • by DarkOx ( 621550 )

        It's not a clueless user actually pushing a token into a git commit, it's a clueless dev who misconfigured their CI/CD in a way that accidentally leaks the token into automatically generated ZIP files.

        That is a distinction without a difference.
        Uploading your secrets to a public repo, and uploading your CI/CD logs that embed your secrets to a public repo...

        • That is a distinction without a difference.

          Oh, that I don't agree. There's a key difference:

          if this was caused by a simple "git add secrets.yaml && git commit -m 'Enabling SuperLinter'",
          then it's a problem that is clearly covered by the "never check in files containing passwords into the source version control system" mantra, and you should water-spray the dev or hit them with a rolled-up newspaper until they get it and stop doing something that stupid.

          But here it is caused by a more complex chain of events:
          The dev knew they should never commit secrets

      • "It's not a clueless user actually pushing a token into a git commit, it's a clueless dev who misconfigured their CI/CD in a way that accidentally leaks the token into automatically generated ZIP files."

        Which were then published into the repo by that clueless user in exactly the way you say they weren't, specifically because they were clueless, to which you admitted. This remains user error. I would put the blame on the script author, though, which might be a different clueless user.

        • Mod this comment down too, troll.

        • I would put the blame on the script author, though, which might be a different clueless user.

          That's why I pointed out the difference: the chain of events that led to this leak is different, and thus the steps necessary to prevent it are different.

          If it was the usual, the only necessary step would be to teach the dev to stop doing shit like "git add secrets.yaml && git commit -m 'Enabling SuperLinter'".

          But here the dev was aware of that.

          This needs different steps:
          - "Always use fine-grained access" and "Always restrict providing the Token to only the steps that need it"

  • by Anonymous Coward
    So, after making me switch to MFA against my will, making me use it...

    they screwed up.

    Again.

  • There is no clearer sign that Microsoft cannot get security right. They mess it up, mess it up some more and then mess it up again. Why are we allowing these clowns to run anything important? (With apologies to all real clowns...)

    • by GoTeam ( 5042081 )

      There is no clearer sign that Microsoft cannot get security right. They mess it up, mess it up some more and then mess it up again. Why are we allowing these clowns to run anything important? (With apologies to all real clowns...)

      We share a similarity with the stereotypical "battered housewife". Whenever there is a security mistake by Microsoft that screws us over, we say it's our fault... then we give Microsoft another government contract. We just need to kick Microsoft in the balls and say "never again!"

      • by gweihir ( 88907 )

        Yes. Or collective Stockholm Syndrome.

        My guess is MS will finally get the end in flames they deserve when they cause some major, longer-term catastrophe.

    • by DarkOx ( 621550 )

      If your credentials end up on github though, multiple failures have to occur:

      You have to have secrets embedded in your code or devops pipeline artifacts to begin with, a best practice violation.

      You have to fail to scrub them before checking into github, a procedure violation.

      You have to be able to submit secrets to github, a DLP failure.

      You have to have had some of the repo public at some point, a policy, procedure, or id10t failure

      You would have needed to be ignorant of github's documented workings, a training failure.

      • by gweihir ( 88907 )

        Sure. In essence, you have to have processes and tool landscape so complex that you not only lost control, you lost sight of what is going on.

        Classical engineering deals with this by drastic simplification: having components with known and reliable characteristics, whether it is a screw, a pump or a whole power station. CS/IT is not there yet and will not be there for at the very least 50 years, but possibly much longer. Hence the only way to make IT systems reliable and secure at this time is to make them simple.

  • When will people learn "Don't trust Microsoft"
  • More like Security Weak... am I right? I'll see myself out.
