Security Programming

Code Spaces Hosting Shutting Down After Attacker Deletes All Data

An anonymous reader writes: Code Spaces [a code hosting service] has been under DDoS attack since the beginning of the week, but a few hours ago the attacker managed to delete all their hosted customer data and most of the backups. They have announced that they are shutting down the business. From the announcement: An unauthorized person, who at this point is still unknown (all we can say is that we have no reason to think it's anyone who is or was employed with Code Spaces), had gained access to our Amazon EC2 control panel and had left a number of messages for us to contact them using a Hotmail address. Reaching out to the address started a chain of events that revolved around the person trying to extort a large fee in order to resolve the DDoS.

At this point we took action to regain control of our panel by changing passwords; however, the intruder had prepared for this and had already created a number of backup logins to the panel. Upon seeing us attempt to recover the account, he proceeded to randomly delete artifacts from the panel.
  • The cloud (Score:5, Insightful)

    by Anonymous Coward on Wednesday June 18, 2014 @10:31AM (#47263163)

    Good thing people hosted their stuff on the cloud...

    • Re:The cloud (Score:5, Interesting)

      by SQLGuru ( 980662 ) on Wednesday June 18, 2014 @10:34AM (#47263203) Homepage Journal

      Single account to rule them all......the best approach is the separation of concerns (user management, server management, backup / restore, etc.) so that it is a lot harder to compromise everything.
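
      A minimal sketch of that separation, assuming AWS and boto3 (the group names, policy names, and bucket ARN below are made up, not anything from the story): give server operators and backup operators separate IAM groups, so compromising one credential cannot wipe both the machines and the backups.

          # Hypothetical separation-of-duties sketch (boto3). Server ops can manage
          # EC2 but are explicitly denied S3 deletes; backup ops can only read/write
          # backup objects and cannot touch EC2 at all.
          import json
          import boto3

          iam = boto3.client("iam")

          def create_group_with_policy(group_name, policy_name, statements):
              """Create an IAM group with an inline policy limited to `statements`."""
              iam.create_group(GroupName=group_name)
              iam.put_group_policy(
                  GroupName=group_name,
                  PolicyName=policy_name,
                  PolicyDocument=json.dumps(
                      {"Version": "2012-10-17", "Statement": statements}),
              )

          create_group_with_policy("server-ops", "server-ops-policy", [
              {"Effect": "Allow", "Action": "ec2:*", "Resource": "*"},
              {"Effect": "Deny",
               "Action": ["s3:DeleteObject", "s3:DeleteBucket"],
               "Resource": "*"},
          ])

          create_group_with_policy("backup-ops", "backup-ops-policy", [
              {"Effect": "Allow",
               "Action": ["s3:PutObject", "s3:GetObject", "s3:ListBucket"],
               "Resource": ["arn:aws:s3:::example-backups",
                            "arn:aws:s3:::example-backups/*"]},
          ])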

      • Re:The cloud (Score:5, Insightful)

        by i kan reed ( 749298 ) on Wednesday June 18, 2014 @10:39AM (#47263243) Homepage Journal

        But that would have cost the company a little more money.

        • Re:The cloud (Score:5, Insightful)

          by NatasRevol ( 731260 ) on Wednesday June 18, 2014 @11:07AM (#47263531) Journal

          More likely, actual planning would have to be involved.

        • Re: (Score:2, Interesting)

          I don't think that was a money thing, rather it was an oversight of risk management. Hindsight is always 20/20.

          (Besides, where does this "blame the victim" attitude always come from? It's ridiculous. This is equal to saying that a woman who wears revealing clothing deserves to get raped.)

          • Re:The cloud (Score:5, Insightful)

            by Mister_Stoopid ( 1222674 ) on Wednesday June 18, 2014 @11:17AM (#47263659)

            Having an offline backup isn't 20/20 hindsight, it's the absolute basics of the basics.

            This is equal to saying that a woman who wears revealing clothing deserves to get raped.

            It's more like saying that a guy who dies in a car accident because he was street racing while drunk, high, and not wearing a seatbelt got what he deserved.

            • Re:The cloud (Score:5, Insightful)

              by Munchr ( 786041 ) on Wednesday June 18, 2014 @12:13PM (#47264151)
              Exactly this. They state in the article that they had off-site backups. What use are off-site backups if the "on-site" control panel has direct online access to them? "In summary, most of our data, backups, machine configurations and offsite backups were either partially or completely deleted."
              • I think the offsite backups were not their own offsite backups, but were managed by Amazon. Which is really not what I would consider an offsite backup.

                "where's your data"
                "in the cloud"
                "where do you keep your backups"
                "um, in the cloud..."

          • Re: (Score:2, Insightful)

            by pla ( 258480 )
            Besides, where does this "blame the victim" attitude always come from? It's ridiculous.

            Bad people exist. Plan accordingly, or don't come crying when you get hacked.

            Otherwise, I agree with you, this looks more like an oversight of risk management: When wandering around the park at 2am in a mini-dress... don't.
            • Re: (Score:3, Interesting)

              100% wrong. Maybe the company should have been better prepared, but the fact is they were attacked by a criminal who first hijacked and then destroyed possibly an enormous amount of value in people's data. He, she or they committed a horrible crime and should go to jail for a long time.
              • Re:The cloud (Score:5, Insightful)

                by Oligonicella ( 659917 ) on Wednesday June 18, 2014 @12:48PM (#47264471)

                And the company and its owners should have their asses sued off for failing to take normal precautions with the data they promised to protect. I have sympathy and pity for the owners of the data (although I have always thought "the cloud" was a stupid idea), but none for the company. Unconnected archiving is a universally recognized good practice. Why in hell don't the new guys understand this?

              • Re: (Score:3, Insightful)

                by pla ( 258480 )
                100% wrong. Maybe the company should have been better prepared, but the fact is they were attacked by a criminal who first hijacked and then destroyed possibly an enormous amount of value in people's data. He, she or they committed a horrible crime and should go to jail for a long time.

                You'll notice that at no point did I excuse the criminal. I agree with you completely that we as a society should dedicate the resources to hunting him down and punishing him.

                That doesn't change the fact that Code Space
            • Bad people exist. It doesn't matter if you cry; it does matter whether you seek justice or not.

              Changing your life to accommodate them is ill-advised.

          • Re:The cloud (Score:5, Interesting)

            by TheCarp ( 96830 ) <sjc AT carpanet DOT net> on Wednesday June 18, 2014 @12:28PM (#47264279) Homepage

            I see this come up a lot and honestly..... I mean.... is it really wrong to suggest that a person should think about self-protection?

            Do you lock the door to your house? Your car? I do. I generally won't even leave my phone in the locked car unless I expect I will not be out of view of the car for more than a minute; I even look around first when making such a decision. Why? Because people I know, including myself, have had shit stolen from their cars!

            And you know what.... I, the victim, was stupid for thinking it was going to be ok to leave my GPS on the cradle in the car overnight. The person who stole it is still an asshole, still deserves to be punished, but you know what....that doesn't make me smart for exposing myself to his actions.

            Should a woman be able to wear what she wants? Should she be able to walk down the street at night alone? Yes. Absolutely. However, when my wife clips a knife on her belt before going for walks at night, when she tells me what streets she avoids at night because she knows it's where a lot of the rapes are reported.... it makes me think I married a smart girl.

            But hey, maybe I am odd. I don't say "don't wear that"; I say "don't forget your knife."

            Because it's true, she shouldn't ever have to use it, and I hope she never does.... but if it ever happens, I hope she spills entrails on the sidewalk.

            • I see this come up a lot and honestly..... I mean.... is it really wrong to suggest that a person should think about self-protection?

              No, it is wrong to claim that they're expected to. See the difference? No?

              Why bloviate for dozens of words if you're going to fall on your face in the first sentence?

              You can't even tell the difference between prerogatives and coercion, so you have no moral or ethical foundation to build anything on. You have no points, because they're suspended in space and everybody else is on planet Earth.

              And yes, it is really "very" wrong to attempt to exercise other people's prerogatives. It is a less extreme example of

          • (Besides, where does this "blame the victim" attitude always come from? It's ridiculous.

            You obviously missed the comments I made to the same effect back in April and had folks respond that yes, the victim is partially to blame no matter what.

            Here, [slashdot.org] read the tortuous and twisted excuses people make trying to justify why the victim is to blame, whether it's a hacking event such as this or having your house broken into.
          • by Kjella ( 173770 )

            I don't think that was a money thing, rather it was an oversight of risk management. (...) Besides, where does this "blame the victim" attitude always come from?

            Because it's pretty hard to criticize/discuss/improve someone's risk management without at the same time assigning part of the blame to them. I mean, if I was entirely without fault, that means I did nothing wrong, which means I don't have to change my ways; yet here you are arguing I should take greater precautions, which means I did do something stupid, which means it's partly my own fault, right? It's pretty hard to say that you could and should avoid danger, yet it doesn't matter if you sought and exposed you

    • The cloud (Score:3, Funny)

      by Anonymous Coward

      Normally things form clouds AFTER going up in smoke. With the 'new technology' it is the opposite.

    • Re:The cloud (Score:5, Interesting)

      by Dishevel ( 1105119 ) on Wednesday June 18, 2014 @10:51AM (#47263389)
      The real problem was that they still had access to their stuff and never bothered to look at the number of accounts on the system before changing the password.

      The concept was good, but the people in charge were in way over their heads, and it suddenly became clear to them that they had no business securing other people's data. Good for them. At least they know what they suck at.
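
      As a rough illustration of the audit the parent describes, assuming the account is on AWS and the auditing credentials are still trustworthy (boto3; nothing here comes from the article): enumerate every IAM user, its access keys, and whether it has a console login before bothering with a password change.

          # List every IAM user, its access keys, and console-login status,
          # so backdoor accounts stand out before credentials are rotated.
          import boto3

          iam = boto3.client("iam")

          for page in iam.get_paginator("list_users").paginate():
              for user in page["Users"]:
                  name = user["UserName"]
                  keys = iam.list_access_keys(UserName=name)["AccessKeyMetadata"]
                  try:
                      iam.get_login_profile(UserName=name)
                      has_console = True
                  except iam.exceptions.NoSuchEntityException:
                      has_console = False
                  print(f"{name}: created {user['CreateDate']}, "
                        f"{len(keys)} access key(s), console login: {has_console}")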

      • Re:The cloud (Score:5, Informative)

        by Kagato ( 116051 ) on Wednesday June 18, 2014 @11:12AM (#47263595)

        AWS has one of the best security systems out there. IF you decide to enable the features. The production AWS configs I've used have mandated multi-factor auth (using the number generator on the phone) as well as source network restrictions. You can also set up a large number of ACLs to restrict things like the ability to create additional accounts.

        It's hard for me to feel bad for these guys.
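
        For what it's worth, a sketch of one such guardrail (boto3; the policy name and office CIDR are placeholders, not anything Code Spaces or AWS prescribes): a managed policy that denies user and credential creation unless the caller authenticated with MFA and is coming from a known network.

            # Hypothetical IAM guardrail: deny account/credential creation without MFA
            # or from outside a known source network.
            import json
            import boto3

            guardrail = {
                "Version": "2012-10-17",
                "Statement": [
                    {
                        "Sid": "DenyUserAdminWithoutMFA",
                        "Effect": "Deny",
                        "Action": ["iam:CreateUser", "iam:CreateAccessKey",
                                   "iam:CreateLoginProfile"],
                        "Resource": "*",
                        "Condition": {"BoolIfExists":
                                      {"aws:MultiFactorAuthPresent": "false"}},
                    },
                    {
                        "Sid": "DenyUserAdminOutsideOfficeNetwork",
                        "Effect": "Deny",
                        "Action": ["iam:CreateUser", "iam:CreateAccessKey",
                                   "iam:CreateLoginProfile"],
                        "Resource": "*",
                        "Condition": {"NotIpAddress":
                                      {"aws:SourceIp": "203.0.113.0/24"}},
                    },
                ],
            }

            boto3.client("iam").create_policy(
                PolicyName="example-admin-guardrail",
                PolicyDocument=json.dumps(guardrail),
            )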

      • by LWATCDR ( 28044 )

        Isn't the real problem the criminals that made the attack?

        • Sure, it is a real problem. The issue is that if you are going to wait for there to be no criminal behavior out there, nothing will ever get done.

          So you have to take some responsibility for the security of your users' data in spite of the fact that there are criminals out there.

    • and our admin password is "letmein"

    • Good thing people hosted their stuff on the cloud...

      No kidding. Their backups also, apparently.

      • You should always have an offline backup (even if slightly out of date).

        In this case, they could have used a separate "cloud" provider just for backups.

        Cloud or not, everything under one umbrella was the problem.

      • by Jawnn ( 445279 )
        Backups, accessible via the same system that made them, are not backups. A backup is a thing that lives elsewhere and is not affected by anything that might happen on the primary system. All they had were "copies".
    • Re:The cloud (Score:5, Interesting)

      by Penguinisto ( 415985 ) on Wednesday June 18, 2014 @10:56AM (#47263441) Journal

      Good thing people hosted their stuff on the cloud...

      I don't think their problem is necessarily because it was "on the cloud" - the same thing could have happened if someone penetrated a corporate network and got hold of a VM farm. A bigger obstacle to be sure, but if your corporation has partner/vendor access and a not-so-sharp security guy...

      One question I have though - instead of changing a password, why couldn't they have called Amazon, had the thing universally locked out for that company, replaced all root-level access with a new account, and sent the new username and p/w by phone back to the company?

      Also, why didn't they have an offline (think: off-cloud) backup of the stuff? Sure it costs time/money/skull-sweat to do that, but it's worth the time and trouble in the end. After all, if your family jewels are hanging out there, it always pays to have a DR plan for 'em...

      If nothing else, they could have set up a separate and distinct AWS account/rigging as a "DR" of sorts, with DB replication and the works feeding it as a warm DR site. That way if some jackass compromises the first, you only need to stop DB replication, turn on the rest of the DR servers, do a quick test, and shift your DNS to the backup site - 15 minutes later, you can delete the objects yourself in the original site if you want (while you set up yet a different site and build a new backup site to replace the one you just put into production.)

      We have a sizable AWS setup where I work, and first/foremost we back that shit up (the DB contents) to machinery that we control. We also have a means of re-deploying/rebuilding if necessary; sure it takes time, but it's better to have it and not need it...
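
      A rough sketch of the "shift your DNS to the backup site" step described above, assuming Route 53 via boto3 (the hosted zone ID, record name, and DR address are placeholders):

          # Point the public record at the warm DR site with a short TTL.
          import boto3

          boto3.client("route53").change_resource_record_sets(
              HostedZoneId="Z0000000EXAMPLE",
              ChangeBatch={
                  "Comment": "Fail over to warm DR site",
                  "Changes": [{
                      "Action": "UPSERT",
                      "ResourceRecordSet": {
                          "Name": "app.example.com.",
                          "Type": "A",
                          "TTL": 60,
                          "ResourceRecords": [{"Value": "198.51.100.25"}],  # DR address
                      },
                  }],
              },
          )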

      • Re:The cloud (Score:5, Insightful)

        by vux984 ( 928602 ) on Wednesday June 18, 2014 @11:15AM (#47263625)

        I don't think their problem is necessarily because it was "on the cloud"

        No. The cloud was a key part of the problem. They had as much access and control over the system as the hacker did with no physical fall back.

        A VM farm on an onsite rack or even a colo rack? You knock out the hacker by unplugging it from the router to the internet, and then audit and reset security to your heart's content.

        • Re:The cloud (Score:5, Informative)

          by Anonymous Coward on Wednesday June 18, 2014 @11:20AM (#47263683)
          With Amazon's service you can contact them and have all access blocked until there is time to sort things out, and authenticate the real admin with billing information or the root SSH key you're given, etc.
      • Our offsite backups are put in a metal box and taken offsite. Unless you plan on hijacking a truck, it's a lot harder to delete our data than it is through a nice control panel on the web.
    • Re:The cloud (Score:4, Insightful)

      by rwven ( 663186 ) on Wednesday June 18, 2014 @10:59AM (#47263473)

      It has nothing to do with the cloud. It could have been any un-managed hosting.

      The fact that they went with un-managed hosting in the first place is what really screwed them. If they had a real support team they could turn to, steps could have been taken to keep this from happening as soon as the DDOS started, and they would have had "offsite" or at least "offline" backups.

      This happened because it appears that Code Spaces had some knee-jerk reactions and didn't think through how they were handling this (like changing the password before making sure there weren't other methods of access already established). They should have straight-up called Amazon, explained what was going on, and paid for support to have Amazon put access to their account and instances on lockdown until the situation was resolved. Shoulda, woulda, coulda though...

    • Good thing people hosted their stuff on the cloud...

      Hosting stuff on the cloud wasn't the problem. It's really no different from hosting anywhere else. The problem was a lack of off-site backups.

      Something as simple as s3cmd and cron would have protected them. Or, if really necessary, they could have backed up servers to an independent S3 account (see the sketch after this comment).

      This is a simple case of someone keeping all their eggs in a single basket, breaking the fundamental rule of backups needing to be independent of their source.
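
      As a sketch of the "independent S3 account" idea above (boto3; the profile, bucket, and archive names are invented): push each archive into a bucket owned by a separate AWS account, using write-only credentials that never sit in the production control panel, and run it from cron as the comment suggests.

          # Upload the nightly archive to a bucket in a *different* AWS account,
          # via a named profile holding that account's write-only credentials.
          from datetime import date
          import boto3

          backup_session = boto3.Session(profile_name="offsite-backup-account")
          s3 = backup_session.client("s3")

          archive = f"/var/backups/repos-{date.today():%Y%m%d}.tar.gz"
          s3.upload_file(archive, "example-offsite-backups", archive.split("/")[-1])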

  • by QilessQi ( 2044624 ) on Wednesday June 18, 2014 @10:34AM (#47263199)

    ...doesn't seem to work so well.

  • by Anonymous Coward on Wednesday June 18, 2014 @10:35AM (#47263205)

    So you just unplug your server's network connection from the internet while you fix the damage... oh. cloud stuff needs constant internet connection? hm. well I guess that's it then. It was an honor to serve with you. BOOM!

    • by Anonymous Coward

      Well, it sounds like they first attempted to fix it themselves using their mad 1337 skills. The Amazon cloud is run by adults, and they must have a large staff of top-notch security experts. This might sound like Monday-morning quarterbacking, but if they really feared this threat, they should have called Amazon so that not only could they put their instance on ice, they might have gotten some help in hunting down the creep.

      • Who do you "call" with most cloud vendors? After all, sounds like whoever was doing the DDOS to extort Code Spaces could have also "called" Amazon to do any number of things, as whoever it was had the passwords, other accounts, etc.

        Unless you're one of Amazon EC3's largest customers (e.g. Netflix), you're one of thousands of low-paying customers with rudimentary authentication. Amazon should have an "oh shit" master key that relies on old-school technology, like a RSA number keyfob that the client's pres
        • by Penguinisto ( 415985 ) on Wednesday June 18, 2014 @11:01AM (#47263497) Journal

          Who do you "call" with most cloud vendors? After all, sounds like whoever was doing the DDOS to extort Code Spaces could have also "called" Amazon to do any number of things, as whoever it was had the passwords, other accounts, etc..

          I've actually worked with them once - sure someone could impersonate them, but you could just as easily call up, explain the situation, and then prove you're the rightful owner of the account (using info that most script kiddies aren't going to think of gathering in the first place, let alone spoof the original contact phone #.)

          To their credit, Amazon is actually fairly intelligent and responsive, even to small accounts.

          BTW - if you use/handle it right, each instance comes pre-made with a specific SSH auth keyset for root, and you're the only one with the private key (even Amazon doesn't have it) - store/use that as your proof by logging into an instance with one (it's something the script kiddie definitely won't have).

      • by ZeroPly ( 881915 )
        Have you worked with service providers? From the time you've dialed their number, what is your estimate of how long it takes to get someone on the line who can lock down an entire corporate account? Remember that there's a big authentication issue there - how do they know it's not a prank call?

        By comparison, I can get to our server center and completely isolate us and all our data from the Internet in under 10 minutes.
  • by Lab Rat Jason ( 2495638 ) on Wednesday June 18, 2014 @10:35AM (#47263207)
    for air gapped backups.
    • by Russ1642 ( 1087959 ) on Wednesday June 18, 2014 @10:39AM (#47263241)

      If your backups are sitting right next to your active files they aren't backups. They're just copies sitting there.

    • by Richy_T ( 111409 )

      There may be better ones but this is sufficient all on its own. As the poster above me says, if it's not offline, it's not a backup.

    • by CAIMLAS ( 41445 )

      Or for in-house networks.

      Pretty trivial to just pull the cable when your kit has been compromised and you're facing extortion.

    • by gsslay ( 807818 )

      Why isn't this standard procedure for all data repositories?

      Doesn't matter how efficient and secure you are, if one person can wipe absolutely everything from one control panel then you have a risk that is not being addressed. And one that isn't even difficult to address.

    • by Charliemopps ( 1157495 ) on Wednesday June 18, 2014 @10:59AM (#47263475)

      for air gapped backups.

      It has to be more than that. We had a policy of air gapped backups that everyone followed. But we had several different sites with several different admins. There was a large hurricane and we found some flaws in the system to say the least.

      In several cases, the backups were kept IN the drive... they were gone.
      In others, they removed the backups but put them on top of the server or in a desk drawer.... gone as well.

      In others, they actually removed the tapes from the site, but often they were taken home by the admin or other staff... in those cases we fared slightly better, because both the site and the staff's house would have to be under water. Hurricanes are big, however, so we had about a 50% success rate there.

      In some cases they had a safe on site. This proved marginally better... the tapes were safe in most cases. In one instance we had a rather brave admin fly across the country, take a cab out to the site, and then literally SWIM to get the tape. But in a lot of cases the tape was OK but the safe was under water, so we weren't able to retrieve it for days.

      The sites where local admins stored the tapes at local banks fared the best. So now that's our policy. Backups get stored off-site, in a vault. Technology is better now, so we also do remote backups across the net in case the bank is under water as well. But no matter what, we can always head to the bank vault. OK, I guess a meteor would ruin our day, but you can't plan for everything.

      • There was a large hurricane and we found some flaws in the system to say the least.

        That's why you have backups in different geographical areas.

        The sites where local admins stored the tapes at local banks fared the best.

        Have you considered a service like Iron Mountain? They'll send out a truck to pick up your backups every day, if you like, and store it in a very safe location.

        • There was a large hurricane and we found some flaws in the system to say the least.

          That's why you have backups in different geographical areas.

          The sites where local admins stored the tapes at local banks fared the best.

          Have you considered a service like Iron Mountain? They'll send out a truck to pick up your backups every day, if you like, and store it in a very safe location.

          Iron Mountain doesn't serve most of the areas involved. We have dozens of VERY rural sites. Like the top of a mountain, or out in the desert, or along the Mexican border kind of rural. One remote site on a mountain gets so much snow build-up on it that we have a local guy contracted to shovel snow off of it weekly so it doesn't overheat. Another is at the bottom of a canyon on an Indian reservation. The tech has to ride once a week on a helicopter to get to it. In the event of an outage he literally takes a mule down

      • IMHO:

        1) Backups that don't get done automatically often don't get done regularly, so they should be automatically performed via scripts.

        2) Offline isn't as important as offsite. Buildings catch fire, get flooded, disappear into sinkholes, get hit by falling jet airplanes.

        3) Security matters. Paranoia should be the order of the day.
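
        A minimal sketch of point 1, with placeholder paths (nothing here is from the story): a script small enough to run unattended from cron or a systemd timer, producing a dated archive that the offsite copy step can then pick up.

            # Create a dated .tar.gz of the data we are protecting, ready to ship offsite.
            import tarfile
            from datetime import date
            from pathlib import Path

            SOURCE = Path("/srv/repositories")   # what we are protecting
            DEST = Path("/var/backups")          # staging area before the offsite copy

            DEST.mkdir(parents=True, exist_ok=True)
            archive = DEST / f"repos-{date.today():%Y%m%d}.tar.gz"

            with tarfile.open(archive, "w:gz") as tar:
                tar.add(str(SOURCE), arcname=SOURCE.name)

            print(f"wrote {archive} ({archive.stat().st_size} bytes)")

        Scheduling something like this nightly from cron takes the human out of the loop, which is the point being made.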

  • by Anonymous Coward on Wednesday June 18, 2014 @10:35AM (#47263213)

    would you mind going into ebay.com & deleting my account?

    Ebay refuses to close it.

    • would you mind going into ebay.com & deleting my account?

      Ebay refuses to close it.

      Move to Europe and sue them under your new right to be forgotten.

  • by ACK!! ( 10229 ) on Wednesday June 18, 2014 @10:36AM (#47263215) Journal
    At least they had backups of their cloud data in a safe place where no random asshat could just go in and waste the data. That is a code hosting company you can trust with your stuff that is for sure!
  • They didn't have offline backups? Tapes? I'm not familiar with Code Spaces' service, but how come the backups could be deleted remotely?

    • by gstoddart ( 321705 ) on Wednesday June 18, 2014 @10:40AM (#47263269) Homepage

      No, because it was all in Amazon. Who needs tape when you have the cloud, right?

      So the stuff they had backed up from Amazon to Amazon was still controlled by the same logins (or the ones the hacker had created).

      So when he/she/they started deleting stuff, the backups also got deleted.

      Sounds like a brilliant strategy, and an epic demonstration of what can go wrong with the cloud.

      If you host your own stuff, you do your own backups. If you back up your cloud data to the cloud using the same stuff as the rest of it ... well, your backups are hardly secure, are they?

      So unless Amazon has offsite tape backups (which I highly doubt) ... they're pretty much screwed.

      I think this is about the same as backing up your hard drive to itself so you have a spare copy.

      • by Bengie ( 1121981 )

        No, because it was all in Amazon. Who needs tape when you have the cloud, right?

        A rule of thumb that I've heard was "It's not backed up until on at least 2 different media types, at least 2 different file systems, and stored in at least 2 different physical locations".

        • You have been short-changed.

          If it's worth money:

          Have three copies, on three media types, in three locations.

          Not so sure about file systems. If you have proprietary backup software, then you will never get the data when you really need it. tar loves you!
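
          In the spirit of "tar loves you", a small sketch (the archive path is a placeholder) of checking that a plain .tar.gz can actually be read back end to end before it is trusted as a backup:

              # Read every member of the archive to catch truncated or corrupt entries.
              import tarfile

              def verify_archive(path):
                  """Return the number of members successfully read from a .tar.gz."""
                  count = 0
                  with tarfile.open(path, "r:gz") as tar:
                      for member in tar:
                          if member.isfile():
                              tar.extractfile(member).read()
                          count += 1
                  return count

              print(verify_archive("/var/backups/repos-20140618.tar.gz"), "members verified")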

      • by Jeff Flanagan ( 2981883 ) on Wednesday June 18, 2014 @10:57AM (#47263453)
        >Sounds like a brilliant strategy, and an epic demonstration of what can go wrong with the cloud.

        No, it's just an example of what can happen to incompetent people. There's no reason to believe these people would have kept offline backups even if they had been running local servers. There was nothing to prevent them from keeping backups locally or on another cloud.

        Blaming cloud computing for this is completely idiotic, and about what I expect on the dumbed down Slashdot these days.
      • I don't think you necessarily need to backup to tapes yourself. If you backed up your Amazon stuff to Rackspace, for example, you would be protected both against someone gaining access to your Amazon account, as well as a systemic problem with Amazon. Just so long as there's nothing in your Amazon account that would allow an attacker to access your Rackspace account, that should be a pretty good solution.

        No solution is perfect. You're just looking for one that's extremely unlikely to break.

      • So now it's a double tragedy? Codespace doesn't have offsite backups AND Amazon doesn't have offsite backups? Shame on BOTH of them!
        • You know, I've actually heard people championing cloud stuff saying "we don't need to keep backups, it's in the cloud".

          People act like the cloud is full of unicorns and rainbows, and makes all problems go away, and then they do really stupid things like this and realize that isn't the case.

          The problem is that people buy into it, and then when they realize they've made poor decisions, it's too damned late.

          It sounds like Codespace more or less created their own mess, but it's their clients who are really get

  • by Anonymous Coward on Wednesday June 18, 2014 @10:41AM (#47263281)

    The guys behind Code Spaces should be issued a citation for Operating While Pwned. If you know admin access is compromised, shut it down out-of-band.

  • Presumably when they realized that the attacker had access to their control panel they shoulda coulda (yes, I know, I hate that too) called Amazon and had them shut everything down until order could be restored.
  • by Thanshin ( 1188877 ) on Wednesday June 18, 2014 @10:45AM (#47263321)

    I must be a cynic but my first reaction is to think:

    1 - Create cloud based system.
    2 - Sell subscriptions for hundreds of $.
    3 - Announce hacker attack!
    4 - Profit.

  • by Edrick ( 590522 ) on Wednesday June 18, 2014 @10:46AM (#47263337)

    If you're a hosted site with important data and your site is compromised, the first & best move is to cut the cord immediately. Contact Amazon (or whoever is hosting the data) and get all access shut down instantly and immediately, thereby ending the attacker's ability to do anything further. This will cause an outage, but at least everything is safe.

    Working with Amazon, they can create a new account, give it a strong password, and begin cleaning up the mess with the new account (which the hacker will be unaware of). Now they can, at their own leisure, change passwords, administer accounts, delete crap created by the hacker, etc... Trying to outpace a professional hacker at their own game is a gamble that isn't worth it---especially if no offsite backups exist!!!

    Lastly, they should be forwarding all of the email/attacker info to Amazon, Microsoft (Hotmail), and to the authorities. Whether they can be caught or not is up in the air, but odds are almost certain that this attacker has hit other sites and would eventually have different cases correlated to each other.

    Safety & security of data is #1, fixing damage caused is #2, and accountability is #3. Securing the site against future attacks is part of #3---there's no reason to put the site up (or leave it up) and risk further attacks, thereby risking data loss or a security breach.
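
    One hedged reading of "get all access shut down" on the account owner's side, while the provider is being contacted (boto3; deliberately destructive, a sketch rather than a runbook): freeze every IAM user's credentials at once.

        # Deactivate all access keys and remove console passwords for every IAM user.
        import boto3

        iam = boto3.client("iam")

        for page in iam.get_paginator("list_users").paginate():
            for user in page["Users"]:
                name = user["UserName"]
                for key in iam.list_access_keys(UserName=name)["AccessKeyMetadata"]:
                    iam.update_access_key(UserName=name,
                                          AccessKeyId=key["AccessKeyId"],
                                          Status="Inactive")
                try:
                    iam.delete_login_profile(UserName=name)  # drops the console password
                except iam.exceptions.NoSuchEntityException:
                    pass                                     # user had no console login
                print(f"froze credentials for {name}")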

    • Contact Amazon (or whoever is hosting the data) and get all access shut down instantly and immediately, thereby ending the attacker's ability to do anything further.

      But what if the attacker is the one contacting Amazon to shut down everything? Do you want your business shut down by random teenagers calling Amazon, telling them to shut everything down?

  • If someone has penetrated your system so that they have root or admin privileges over all your machines, you shut down immediately. In the physical world, you pull the plug. On Amazon, you immediately tell Amazon to lock things down, disable all passwords and administrative control, and then work back up to fixing things.

    • by tlhIngan ( 30335 )

      If someone has penetrated your system so that they have root or admin privileges over all your machines, you shut down immediately. In the physical world, you pull the plug. On Amazon, you immediately tell Amazon to lock things down, disable all passwords and administrative control, and then work back up to fixing things.

      But that's so 20th century! I mean, in the 21st century, if you can't do everything yourself without having to deal with another human being, then it's broken! Interacting with other humans

  • Facking Idiots (Score:5, Interesting)

    by l0ungeb0y ( 442022 ) on Wednesday June 18, 2014 @10:52AM (#47263393) Homepage Journal
    Not providing for your own OFFLINE BACKUPS is a reckless oversight of such magnitude that I am entirely incapable of having sympathy for these asshats. We need a few examples such as these to serve as cautionary tales for those who think the Cloud is the answer to everything.
    • Nothing to do with being cloud-based or not; proper attention to good systems operations practices was simply lacking.

      Even not doing the obvious and blocking all newly created accounts after a certain time is just incredibly irresponsible.

  • Git (Score:5, Interesting)

    by blackiner ( 2787381 ) on Wednesday June 18, 2014 @10:55AM (#47263429)
    This is why git is such an effective code hosting solution. Everyone who has cloned the repository is a potential backup copy.
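
    A sketch of making that property deliberate rather than accidental (repository URLs and paths are placeholders): keep a mirror clone and push it to an independent remote on a different provider on a schedule.

        # Maintain a full mirror of the primary repository and push it offsite.
        import subprocess
        from pathlib import Path

        PRIMARY = "git@primary.example.com:team/project.git"
        OFFSITE = "git@backup.example.org:mirrors/project.git"
        MIRROR = Path("/var/backups/project.git")

        if not MIRROR.exists():
            subprocess.run(["git", "clone", "--mirror", PRIMARY, str(MIRROR)], check=True)

        # Refresh all refs from the primary, then push the complete mirror offsite.
        subprocess.run(["git", "remote", "update", "--prune"], cwd=MIRROR, check=True)
        subprocess.run(["git", "push", "--mirror", OFFSITE], cwd=MIRROR, check=True)
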
    • by JigJag ( 2046772 )

      Only wimps use tape backup: real men just upload their important stuff on ftp, and let the rest of the world mirror it ;)

              Torvalds, Linus (1996-07-20). Message. linux-kernel mailing list. IU. Retrieved on 2014-04-26.

      I guess we should update that quote and replace ftp with git

    • Why git? Even ClearCase snapshot views are full copies of the repository. (Granted, snapshot views don't have history in themselves and the levels of rollback will be limited.) Almost all source control systems that clone the source repository create full backups. Of course git is much nicer and has replicated history as well.
  • Someone else mentioned having offline backups, so I won't belabor that. But once they knew they were compromised, perhaps a smarter thing to do would have been to contact the service provider and take countermeasures (ask for a snapshot of the site as it was, examine and disable accounts, change admin passwords, perhaps contact authorities) before reaching out to the perp. I'm not sure reaching out to the perp was a good idea in any case.

    For awhile I hosted a number of websites from a rental space, and I

  • Seriously.... no offline backups? Not a real business in that case.

  • Instead of trying to take back control themselves, shouldn't they have contacted Amazon and let them handle it? Perhaps they could have frozen the entire account, locking out both the rightful owner and the attacker, until things were sorted.
  • This is why distributed version control is important (git/mercurial), even if you think SVN is easier. Sometimes your remote server will disappear, whether it's hackers, fires, or someone forgetting to pay the bill.
  • by Culture20 ( 968837 ) on Wednesday June 18, 2014 @11:50AM (#47263919)
    This must be where the IRS stored backups of emails.
  • by grimmy ( 75458 ) on Wednesday June 18, 2014 @12:53PM (#47264515) Homepage

    ....oh never mind.

  • Nothing copied elsewhere or onto tape? - Guess not. The cloud is SOOOO secure...
