Encryption Security IT

Github Kills Search After Hundreds of Private Keys Exposed

mask.of.sanity writes "GitHub has killed its search function to safeguard users who were caught out storing keys and passwords in public repositories. 'Users found that quite a large number of people had added private keys to their repositories and then pushed the files up to GitHub. Searching on id_rsa, a file which contains the private key for SSH logins, returned over 600 results. Projects had live configuration files from cloud services such as Amazon Web Services and Azure with the encryption keys still included. Configuration and private key files are meant to be kept secret, since if they fall into the wrong hands, whoever holds them can impersonate the user (or at least, the user's machine) and easily connect to that remote machine.' Search links popped up throughout Twitter pointing to stored keys, including what were reportedly account credentials for the Google Chrome source code repository. The keys can still be found using search engines, so check your repos."
This discussion has been archived. No new comments can be posted.

  • At least... (Score:5, Funny)

    by Anonymous Coward on Friday January 25, 2013 @08:52AM (#42689973)

    they've been seen by 'many eyeballs'.

    That's good right?

  • by h4rr4r ( 612664 ) on Friday January 25, 2013 @08:55AM (#42689993)

    This is why developers are not sysadmins.

    Sites hosting these kinds of repositories need to learn that and not let these folks do this sort of thing. It would be simple to use a regex to filter out the posting of these sorts of files. Maybe devs should even be charged a couple of dollars to get a decent review of these things.
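    For what it's worth, a minimal sketch of that regex filter as a client-side Git pre-commit hook; the filename patterns and the "PRIVATE KEY" marker check are assumptions, so extend them to taste (and remember to chmod +x the hook):

        #!/bin/sh
        # .git/hooks/pre-commit -- refuse to commit files that look like private keys (sketch).
        blocked='(^|/)(id_rsa|id_dsa|id_ecdsa)$|\.pem$'
        if git diff --cached --name-only | grep -Eq "$blocked"; then
            echo "ERROR: refusing to commit what looks like a private key file." >&2
            exit 1
        fi
        # Also catch key material pasted into otherwise innocent files (added lines only).
        if git diff --cached | grep -q "^+.*BEGIN .*PRIVATE KEY"; then
            echo "ERROR: staged changes contain PRIVATE KEY material." >&2
            exit 1
        fi
        exit 0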

    • by Anonymous Coward on Friday January 25, 2013 @08:59AM (#42690021)

      No. This is actually completely absurd. A developer that cannot grasp the concept that private keys have to be kept private, cannot be trusted to do anything but screw up the most basic security provisions when writing code.

      They should get a kick in the ass, such as three months without any sort of commit privileges, and mandatory code review for a year. THAT should be enough to make it stick, and impress on them the real gravity of their failure. Otherwise, they will just chalk it up as "an annoyance done by those uninteresting people who should learn to code before they go pestering code-gods".

      • by h4rr4r ( 612664 ) on Friday January 25, 2013 @09:12AM (#42690145)

        Sysadmins should also know how to code. Nothing better than showing them their screwup and the solution to it.

        Plus, since all sysadmins (real ones, anyway) are already competent in several scripting languages, it is not that hard a skill to add if all you need to do is be better than bottom-of-the-barrel programmers.

        • I can't remember the last time a developer had a workable, secure solution to my problems.

          There's a reason you hear, 'fix your own code' a lot more than 'fix your servers' in a development environment.

          • by KingMotley ( 944240 ) on Friday January 25, 2013 @11:16AM (#42691593) Journal

            I dunno about that here. Ever since they rolled out Sophos Full Disk Encryption on every desktop and server here, it's contributed more to downtime than any virus/malware ever has. I think literally every person in this office has had to have their machine completely rebuilt after it got corrupted somehow, and that includes our testing servers as well.

            All I can say is, thank god our production servers are out of our company's control. They haven't had any issues, but then again, they also don't have Sophos malware on them either.

      • by 1s44c ( 552956 )

        No. This is actually completely absurd. A developer that cannot grasp the concept that private keys have to be kept private, cannot be trusted to do anything but screw up the most basic security provisions when writing code.

        They grasp the concept just fine. It isn't that they don't understand, it's that they don't see it as their problem.

    • I don't think that would be easy to implement. In git you add and commit your changes to a local repository, then push them back to GitHub. Somehow cutting them out later would give you really cool errors; they would need to be caught locally, at commit time. The real question should be why those keys were placed inside the project directory at all, and not somewhere like ~/.ssh/
      • by h4rr4r ( 612664 )

        Could they not supply a .gitignore? (See the sketch below.)
        Either way, it would be simple enough to have a find script run before you make it public. Basically, on every commit turn off public access, run your clean script, then turn it back on. If that causes errors, it still seems better than leaking keys.

        I admit at work we mostly use a combination of CVS and a 2x4 to hit developers with to avoid these issues while still having a nice simple repository.
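
        A minimal .gitignore along those lines might look like the sketch below; the specific filenames are assumptions, and a .gitignore only guards against an accidental 'git add', not against files added explicitly with -f:

            # .gitignore -- keep key material and live credentials out of the repo (sketch)
            id_rsa
            id_dsa
            *.pem
            *.key
            .ssh/
            .env
            config/secrets.yml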

      • by ArsenneLupin ( 766289 ) on Friday January 25, 2013 @09:17AM (#42690191)
        In some of these instances, all of ~/.ssh/ [github.com] did actually end up in the project directory. Or maybe they used their entire home directory as the project root? Stoopid, stoopid people.

        (Yes, there is also a nice ~/.ssh/config file, so that you also know which locks those keys fit...)

      • by robmv ( 855035 )

        You can launch a filter-like program from a hook that fires before the commit is saved on the remote repository, and abort with an error; this forces the careless developer to amend their commits.
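
        GitHub itself doesn't let users install arbitrary server-side hooks, but on a self-hosted Git server the idea looks roughly like this pre-receive sketch (the patterns are an assumption):

            #!/bin/sh
            # hooks/pre-receive on the server -- reject pushes that add likely private key files (sketch).
            zero=0000000000000000000000000000000000000000
            while read old new ref; do
                [ "$new" = "$zero" ] && continue                                   # branch deletion, nothing to scan
                [ "$old" = "$zero" ] && old=$(git hash-object -t tree /dev/null)   # new branch: diff against the empty tree
                if git diff --name-only "$old" "$new" | grep -Eq '(^|/)id_[rd]sa$|\.pem$'; then
                    echo "Push rejected: it appears to add a private key file." >&2
                    exit 1
                fi
            done
            exit 0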

        • by h4rr4r ( 612664 )

          If we could pair this with some sort of clue bat strike via IP that would be best.

          • If we could pair this with some sort of clue bat strike via IP that would be best.

            I hear a guy is working on an IP-based face-stabbing machine.

      • For new screwups the solution would be to just reject the push and let the developer sort it out.

        For existing screwups it's not so easy. One of the characteristics of hash-based dvcs systems like git is that they make it REALLY painful to change history. You could generate new commits and blacklist the old ones but doing so would tip off all users of the repository that something was up and those users would still have their copies of the original commits.
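
        If a key has already been pushed, that painful history rewrite looks roughly like the sketch below (the BFG Repo-Cleaner is a friendlier alternative); and since forks and clones keep the old commits anyway, the key has to be treated as compromised and revoked regardless:

            # Rewrite every commit to drop the offending file (destructive; coordinate with everyone who has a clone).
            git filter-branch --index-filter \
                'git rm --cached --ignore-unmatch id_rsa id_rsa.pub' \
                --prune-empty --tag-name-filter cat -- --all
            # Force-push the rewritten history; collaborators must re-clone or rebase onto it.
            git push origin --force --all
            git push origin --force --tags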

    • by Hatta ( 162192 )

      Or just leave the keys and let them learn their lessons the hard way.

    • by Anonymous Coward

      This is why developers shouting "give me full access now" should always be denied - there is a totally different mindset between developers and admins (or DevOps) when it comes to protecting things like SSH keys.

      Both groups have similar (or certainly overlapping) technical skill sets, but have very different motivations.

      ObXKCD: http://xkcd.com/705/

      • by h4rr4r ( 612664 )

        The best system I have seen is to not allow devs any access to production environments: all pushes are done by a sysadmin, and dev boxes are identical to production.

    • This happens both in private and publicly developed projects. All too often the developers do not grasp the fundamentals of security. If lucky, they grasp 'enable encryption' but it's exceptionally rare for them to understand things like mutual authentication and appropriate key management or even why a backdoor or fixed credential is very very bad news. The 'answer' in many companies is to tack on a 'security expert' to audit the code and do some penetration testing. While this is certainly not a bad i
    • The first thing you learn is that your private SSH keys are sacrosanct. Most developers seem to just go through a howto on how to generate an SSH key and don't think about anything after that. They're probably all using node.js or something.........
      • The first thing you learn is that your private SSH keys are sacrosanct. Most developers seem to just go through a howto on how to generate an SSH key and don't think about anything after that. They're probably all using node.js or something.........

        Followed by going through the git howto that tells them to
        git init
        git add .
        git commit -m "Initial Commit"

    • This is why key management should be part of the operating system, and every piece of software that doesn't use those APIs should be suspect.

      It's simply too big a subject to expect everyone who is in danger of falling prey to something similar (everyone who uses a computer) to manage on their own. If you know where every individual piece of software you run stores every single key, you are a very, very rare person. You're also probably mistaken.

      Even if we started down the path, it would take a long time, th

  • by Anonymous Coward

    'nuff said.

  • by Anonymous Coward

    Developers (using the term loosely) deserve whatever ill comes from checking in private keys. Public repo or otherwise

    • Re:Deserving (Score:5, Insightful)

      by GameboyRMH ( 1153867 ) <gameboyrmh.gmail@com> on Friday January 25, 2013 @09:04AM (#42690057) Journal

      Exactly, GitHub shouldn't disable a site feature to protect the stupid.

      • by Bengie ( 1121981 )
        Protecting the stupid from themselves is just enabling them to continue to be stupid. Let them learn from their mistakes or let Darwin take over.
  • Search engines (Score:5, Informative)

    by ArsenneLupin ( 766289 ) on Friday January 25, 2013 @09:02AM (#42690035)
    On google, the following search string still turns up a goldmine...:

    site:github.com inurl:id_dsa

    Idiots...
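
    The flip side of that search string: a quick way to check your own clone for key material anywhere in its history (the patterns are assumptions; the rev-list trick can be slow on large repos):

        # Likely key files in any commit, by name...
        git log --all --name-only --pretty=format: | sort -u | grep -E '(^|/)id_[rd]sa$|\.pem$'
        # ...and key material pasted into any tracked file, across all revisions.
        git grep -I "BEGIN .*PRIVATE KEY" $(git rev-list --all)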

  • ...that even (supposedly) smart people can be stupid.

    • I'm sorry, were you under the assumption that idiots can't write code?
      • by sdnoob ( 917382 )

        I thought most of those lived in or near Redmond, WA and Redwood City, CA.

        • Visual Basic monkeys live all over the world unfortunately. Only the jumping monkey lives in or near Redmond...
      • by Lisias ( 447563 )

        In an environment where idiots write code, you will never see one coder calling another an idiot.

        Been there, saw that.

        (I got fired, by the way - I wasn't idiot enough!)

  • by n1ywb ( 555767 )
    Thanks for taking away valuable functionality to protect idiots from themselves. O_o
  • by slashmydots ( 2189826 ) on Friday January 25, 2013 @09:04AM (#42690055)
    I was cruising eBay yesterday and saw that one of the laptops had its Windows license key exposed, readable, in the listing pictures. I poked around some more and found that isn't terribly uncommon. Some people just don't think, no matter what website it is.
    • by antdude ( 79039 )

      Same thing on computers' case labels like at work.

    • by tlhIngan ( 30335 )

      I was cruising eBay yesterday and saw that one of the laptops had its Windows license key exposed, readable, in the listing pictures. I poked around some more and found that isn't terribly uncommon. Some people just don't think, no matter what website it is.

      Those aren't actual working keys most of the time, though. Usually on machines from the big guys, they're nonworking keys, because the real activation key is built into the BIOS. For earlier (pre-Vista) versions of Windows, they would require manual act

    • by PRMan ( 959735 )
      Those keys only work with the DELL or HP key, don't they? They're not actual Windows keys that could be typed into a new installation.
  • overreaction? (Score:4, Insightful)

    by __aaltlg1547 ( 2541114 ) on Friday January 25, 2013 @09:10AM (#42690127)
    Seems like the wrong response. Instead of killing search, why not just erase the key files and lock out the accounts of the offending devs?
    • Re:overreaction? (Score:4, Insightful)

      by h4rr4r ( 612664 ) on Friday January 25, 2013 @09:19AM (#42690209)

      Because some of these might be test keys or place holders. If the key is not valid on any system and is just test data, it should not be a big deal to post publicly.

      • by Bogtha ( 906264 )

        Example: the keys for Vagrant [github.com]. Vagrant is a system for managing virtual machines for development purposes. The ssh keys are used to facilitate passwordless login. They aren't typically exposed to the outside world, and they are clearly labelled as insecure.

        • by h4rr4r ( 612664 )

          That thing should really generate new ones on install.

          That way there are no keys to expose to the world that anyone would know.
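
          For the record, generating a throwaway keypair at provision time is a one-liner; the output path and comment here are hypothetical:

              # Fresh, passphrase-less keypair for this one box (sketch); install the .pub half in the guest.
              ssh-keygen -t rsa -b 2048 -N "" -C "vagrant-local" -f ./machine_key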

          • by Bogtha ( 906264 )

            One of the things people do with it is build base boxes, which are preconfigured virtual machines, and share those base boxes for other people to build upon. In order to do so, the people who receive these base boxes need the private keys they are configured with.

            You could distribute the private keys with the base boxes, I suppose, but then you are stuck sharing multiple files instead of just one, and you can't install a base box by running one single command with a URL argument any more. It increases

      • I think the summary is wrong. status.github.com [github.com] seems to indicate that github's search cluster died, not that they took it down. More likely is that there was a flood of search requests for private keys at the same time and the search cluster buckled.

    • by KPU ( 118762 )

      Some projects deal with SSH keys and include them for testing purposes: https://github.com/trolldbois/sslsnoop/blob/master/test/id_dsa-1.key [github.com]

  • Stupid people... (Score:4, Insightful)

    by Lisias ( 447563 ) on Friday January 25, 2013 @09:18AM (#42690195) Homepage Journal

    These stupid people should have their accounts suspended.

    People should be accountable for their actions, and these idiots are potentially compromising third party data security!

    Didn't the ICO fine Sony for the information leak in that Anonymous attack? Why in hell should GitHub users be less accountable for things THEY ARE FSCKING COMMITTING in their accounts?

  • by Anonymous Coward

    According to their Twitter and status pages, the search is currently down due to problems with their search cluster. They recently released changes to their search including, I believe, a move to ElasticSearch. The linked article says as much, too, so yet another fail in a Slashdot summary.

  • by 140Mandak262Jamuna ( 970587 ) on Friday January 25, 2013 @09:25AM (#42690275) Journal
    Back in the days when I was the root (of all evil, according to my fellow grad students) of our lab, one of the constant problems was people blindly doing chmod 777 .* on their $HOME. They had a .emacs or .profile or .cshrc that was customized ages ago by some grad student, and they wanted to share it with a new student. Somehow they stumbled onto "chmod 777 .*" as the solution to all their file-sharing problems. Now this "magic command" was also being blindly passed around without anyone worrying about the security implications. Oh, yeah, and they think they are clever when they tape their login credentials to the underside of the keyboard and laugh at the secretaries who tape theirs to their monitors.

    Looks like these grad students have all grown up and are uploading it all to the cloud.
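
    For anyone still tempted by the "magic command": sharing a dotfile never needs more than read permission, e.g. (the paths are illustrative):

        # Hand over a copy instead of opening up your whole home directory...
        cp ~/.cshrc /tmp/cshrc-to-share && chmod 644 /tmp/cshrc-to-share
        # ...or, at most, make that one file world-readable. Never 777, and never "chmod 777 .*".
        chmod 644 ~/.cshrc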

    • by h4rr4r ( 612664 )

      When people did stuff like that in my sysadmin classes we were encouraged to teach them a lesson. Far better to edit their login script to log them right back out than delete their homedir contents, or change their path so they got other versions of common programs. Probably the meanest was to make it so instead of calling the work submission script it called rm on whatever they were trying to submit as their classwork.

      • by Frohboy ( 78614 )

        A subtler prank that I pulled on a friend who left himself logged in to one of the public undergrad labs (where there was the risk that an actual asshole would delete your stuff, send email as you, or something similarly cruel) was to add "echo 'sleep 1' >> .cshrc" to the end of his .cshrc before logging him out. I chuckled to myself, and then forgot about it.

        A week later, when it was 5 minutes before a submission deadline and he was yelling at the terminal to finish logging in (since it was taking 2-

      • I hate that attitude though. I told a friend once that I didn't use authentication with X windows because no one is ever going to bother to interfere and I didn't care if anyone was snooping. So he went out and decided to pop up random pictures on my screen and post messages until I relented. So I wasted a lot of time learning all about authentication and configuring it correctly, not to keep out adversaries but to keep out friends...

    • by WankersRevenge ( 452399 ) on Friday January 25, 2013 @10:10AM (#42690775)

      Yeah ... I was "that guy". The first time I installed Linux in 2000, I was annoyed that I needed "permission" to write to a directory outside of my home directory. I was coming from a Windows world, after all.

      I solved this "problem" by chmod-ing 777 the entire filesystem. Hah. Problem solved. Needless to say, I couldn't start the machine back up again. I'm guessing it killed itself from the sheer embarrassment. After that, I decided it might be in my best interest to read the manual.

      I'll do that one of these days :)

      • It's probably obvious and I'm just being stupid, but I can't think what you could possibly break by setting all perms to 777. Yeah, you'll mark a bunch of non-executable files as executable, but nothing should be trying to execute them anyway. There may be a few files (like /etc/passwd|shadow) which some components might refuse to use if they're world-readable, I suppose...

        Any idea what broke?

        BTW, my similar story: I purchased a NeXT machine in 1991. It came with a 110 MB hard drive, which wasn't a lot

        • It's probably obvious and I'm just being stupid, but I can't think what you could possibly break by setting all perms to 777.

          Anything with the sticky, setuid, or setgid bits set.

        • chmod'ing 777 kills the setuid bit.

          This means that programs like "su" or "ping" or "passwd" break.

          Also, some important daemons probably check whether their required files are world-writable and refuse to continue...
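
          Easy to demonstrate on a scratch copy (the file name is arbitrary):

              cp /bin/true demo
              chmod 4755 demo && ls -l demo   # -rwsr-xr-x : setuid bit set
              chmod 777  demo && ls -l demo   # -rwxrwxrwx : setuid bit silently cleared
              rm demo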

    • by xaxa ( 988988 ) on Friday January 25, 2013 @12:20PM (#42692473)

      Someone in my class installed a game in the officially-public network share. He was writing an AI for it, for a project. Other students found it, and played it.

      It had taken a lot of hacking to get the game to run on Linux, and he was annoyed that other students had played it without putting in that effort. So, he altered the 'start.sh' script to generate an SSH key, add the public part to the user's authorized_keys file, and move the private key somewhere obscure.

      He then got bored with the AI project.

      Some time later, while helping in a tutorial, I was showing a student how to set up an SSH key. The authorized_keys file already contained about 20 entries. The AI guy was sitting at the next computer, and told me what he'd done (I knew him quite well, but he hadn't told me until then). He found over 200 private keys in the obscure place. He deleted them, chmod -R go-rwx'd the game, and we thought that was the end of it...

      About a year later, Debian had that OpenSSL bug. The sysadmins ran a script across everyone's authorized_keys file, and removed any entries from keys generated by Debian OpenSSL. The email ended (I still have it):

      By the way: some of you have FAR TOO MANY authorized_keys ENTRIES
        and we seriously recommend that you radically shrink these down.
        As I said, we recommend kerberos tickets or ssh-agent instead!

      ...so I don't think they knew how they got there.
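
      Auditing your own file is easy enough; a small sketch (recent versions of ssh-keygen will fingerprint every entry in an authorized_keys file):

          # Roughly how many keys can log in as me, and what are they?
          grep -c '^ssh-' ~/.ssh/authorized_keys
          ssh-keygen -lf ~/.ssh/authorized_keys   # one fingerprint and comment per entry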

  • Security IQ test question 1: "Ensure all private keys are stored in a secured location."
    "Oh sh**..."
  • Not so many (Score:4, Insightful)

    by Shimbo ( 100005 ) on Friday January 25, 2013 @09:58AM (#42690609)

    Hundreds of keys from a million accounts; less than one in a thousand developers screwed up. Call a doctor at once! Then ask him about outliers in large populations.

    • Well, you have to consider quality over quantity. Fewer than one in a thousand developers may have screwed up, but what if the exposed keys belong to super-important servers, such as those that control the Google Chrome source code or some other big project?

    • But how many of those developers have projects that need private keys? None of mine currently do.
  • by NightHwk1 ( 172799 ) <jon&emptyflask,net> on Friday January 25, 2013 @10:06AM (#42690699) Homepage

    git rm id_rsa*; git commit -a -m "problem solved\!"

    Not quite. They're already out there. The keys are still in the revision history. People have forked and cloned it.

    Hopefully the developers who created these keys know that, besides removing them from the repo, the keys must never be used again. The corresponding public keys must be removed from every .ssh/authorized_keys file, from every service like GitHub that uses them for deploying code, etc.
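
    A rough remediation checklist, assuming the default file names; user@host.example.com and the 'old-key-comment' pattern are hypothetical placeholders:

        # 1. Generate a replacement keypair (use a passphrase this time).
        ssh-keygen -t rsa -b 4096 -f ~/.ssh/id_rsa_new
        # 2. Install the new public key on every host that trusted the leaked one...
        ssh-copy-id -i ~/.ssh/id_rsa_new.pub user@host.example.com
        # 3. ...then delete the leaked key's line from each host's authorized_keys.
        ssh user@host.example.com 'sed -i "/old-key-comment/d" ~/.ssh/authorized_keys'
        # 4. Revoke it everywhere else it was registered: GitHub, AWS, deploy keys, and so on.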

    • by rdnetto ( 955205 )

      Github is officially denying [github.com] that the search feature being killed has anything to do with the exposure of keys. They also have a link on the same page to information on how to purge keys from your repository. (Make of that what you will.)

  • This doesn't suggest github took anything down on purpose: https://status.github.com/messages [github.com].
    Seems to me they were just experiencing some technical difficulties from all the people sharing those search links and having a laugh at the stupids...
    I skimmed over the github site and didn't find anything that would suggest otherwise at least.
    Of course I didn't read the articles because they seem badly misinformed and confuse private keys with passwords.

  • Any wonder why I make factoring passwords and keys out of the code a coding standard?
  • Comment removed based on user account deletion
