
Good: Companies Care About Data Privacy Bad: No Idea How To Protect It

Esther Schindler writes: Research performed by Dimensional Research demonstrated something most of us know: Just about every business cares about data privacy, and intends to do something to protect sensitive information. But when you cross-tabulate the results to look more closely at what organizations are actually doing to ensure that private data stays private, the results are sadly predictable: While smaller companies care about data privacy just as much as big ones do, they're ill-equipped to respond. What's different is not the perceived urgency of data privacy and other privacy/security matters. It's what companies are prepared (and funded) to do about it. For instance: "When it comes to training employees on data privacy, 82% of the largest organizations do tell the people who work for them the right way to handle personally identifiable data and other sensitive information. Similarly, 71% of the businesses with 1,000-5,000 employees offer such training. However, even though smaller companies are equally concerned about the subject, that concern does not trickle down to the employees quite so effectively. Half of the midsize businesses offer no such training; just 39% of organizations with under 100 employees regularly train employees on data privacy."
  • by Anonymous Coward

    "We care about your privacy."

    "Oh, by the way, we have TBs of data stolen 3 months ago, but we forgot to tell you until today."

  • by SuperKendall ( 25149 ) on Thursday April 23, 2015 @11:59PM (#49543237)

    Never collect it to begin with.

    • by dgatwood ( 11270 )

      Well, that's not always possible, but it's a good start. I'd suggest a more nuanced/layered approach:

      • To the maximum extent possible, don't collect it.
      • If you must collect it, don't retain it.
      • If you must retain it, use end-to-end encryption, so that you cannot access the data yourself.
      • If you must retain it and must be able to access it, use encryption correctly, use access controls to limit access as narrowly as possible, and audit the heck out of your code.
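
      As a rough sketch of that last point (hypothetical field names, assuming Python's third-party cryptography package): the application encrypts the sensitive field before it ever touches the main database, and only code holding the key can read it back.

      ```python
      # Rough sketch, not a vetted design: encrypt a sensitive field before storing it,
      # keeping the key material outside the application database.
      from cryptography.fernet import Fernet

      def make_key() -> bytes:
          # In a real deployment this key would live in a KMS or HSM, not on disk.
          return Fernet.generate_key()

      def encrypt_field(key: bytes, plaintext: str) -> bytes:
          return Fernet(key).encrypt(plaintext.encode("utf-8"))

      def decrypt_field(key: bytes, token: bytes) -> str:
          return Fernet(key).decrypt(token).decode("utf-8")

      key = make_key()
      stored = encrypt_field(key, "123-45-6789")  # what the database actually sees
      print(stored)                               # opaque ciphertext
      print(decrypt_field(key, stored))           # readable only with the key
      ```
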
      • I like all those layers.

        A simpler approach: all system developers dealing with personal data must place, alongside any stored personal data, their own SSN and login details for all of their banking and investment accounts, along with one embarrassing JPG.

        Then just let them do whatever comes naturally.

      • by Tool Man ( 9826 )

        I'm in the security industry, and this approach pretty much sums up what I try to get my clients to do. It differs, of course, from the piles of unprotected, unaudited, unmanaged fluff that some management wanker thought might be handy to keep around. Even a scope as constrained and specific as credit card data makes them blanch; I can't imagine them making the leap to more loosely guarded information without a business case.

        • by clark0r ( 925569 )
          "but we might need this stuff later, so i took a copy on my personal usb device and kept it in my glovebox because nobody would find it there"
        • by Jawnn ( 445279 )

          I can't imagine them making the leap to more loosely guarded information without a business case.

          The business case is already there, unless you do business only in one of the few remaining states without a law that makes it truly painful to suffer a breach. But I get what you mean, even the reality of ruinous penalties, lawsuits, and bad PR is just theoretical to many decision makers. They won't part with a dime to mitigate security issues without at least a good scare or two.

          • by BVis ( 267028 )

            And sometimes not even then. I was at a company when they had a breach involving financial info. It cost them hundreds of thousands of dollars to purchase credit protection for thousands of our customers. However, they just kept on operating the same way, storing credit card information in the clear because that's the way they've always done it, and upgrading the back-office accounting system to allow tokenization of transactions would have cost money. Nobody in upper management had the balls to go to t
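
            As a rough illustration of what tokenization buys you (toy code; the in-memory vault here is a stand-in for a real hardened, separately audited service): the accounting system keeps only an opaque token, and only the vault can map it back to the card number.

            ```python
            # Illustrative only: a toy tokenization vault.
            import secrets

            class TokenVault:
                def __init__(self):
                    self._tokens = {}  # token -> primary account number

                def tokenize(self, pan: str) -> str:
                    token = "tok_" + secrets.token_hex(16)
                    self._tokens[token] = pan
                    return token  # this is all the back office ever stores

                def detokenize(self, token: str) -> str:
                    # Only the payment path should be allowed to call this.
                    return self._tokens[token]

            vault = TokenVault()
            token = vault.tokenize("4111111111111111")
            print(token)                    # safe to keep in the accounting system
            print(vault.detokenize(token))  # possible only inside the vault
            ```
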

            • by Tool Man ( 9826 )

              I find that one of the more useful bits about PCI is that, at some point, somebody tells the company to get its house in order. Maybe not the whole thing, but there's some value in moving all of the CC data to the closet and locking THAT.
              My general security side says they should apply that principle elsewhere, but it's a harder sell when the rest isn't directly tied to cash flow.

            • by dgatwood ( 11270 )

              And sometimes not even then. I was at a company when they had a breach involving financial info. It cost them hundreds of thousands of dollars to purchase credit protection for thousands of our customers. However, they just kept on operating the same way, storing credit card information in the clear because that's the way they've always done it, and upgrading the back-office accounting system to allow tokenization of transactions would have cost money. Nobody in upper management had the balls to go to the C

              • by BVis ( 267028 )

                Don't know about that, and don't care. I left that shithole a year and a half ago.

      • All sounds good, however... For a large organization such rules become impractical. To get full security there will be so much administrative overhead in approving access to a given area for a limited time and then revoking it that, if you played by the rules, you wouldn't get your job done on time. So you end up with "black market" IT, where people store backups of the data in, say, Access or Excel files, and keep them hidden from the official system. Not because they have nefarious use of them, but because they will need to get their job done, and the official secure way is too impractical.

        • by clark0r ( 925569 )
          Sometimes it's a good idea to have an incident management process that overrides controls to achieve aims within agreed timeframes. Triggering of the process has to match defined criteria, e.g. "we got compromised".
        • "Not because they have nefarious use of them, but because they will need to get their job done, and the official secure way is too impractical."

          And by finding and using workarounds you are just making the problem bigger, since an undetected problem is a problem that won't get solved any time soon.

          If the policy in place is dumb, make it obviously so. This way it can be solved; if you don't do it, you are part of the problem.

          • by BVis ( 267028 )

            If the policy in place is dumb, make it obviously so. This way it can be solved; if you don't do it, you are part of the problem.

            In my experience, the dumbness of the policy is directly proportional to the difficulty in making anyone understand how dumb it is. It's also directly proportional to the likelihood that someone whose job title starts with "Chief" wrote the policy and will not change it, no matter what.

            It's also dumb to allow the CEO to have a non-expiring password that is the name of the company.

            • "In my experience, the dumbness of the policy is directly proportional to the difficulty in making anyone understand how dumb it is."

              Well, that's not exactly what we were talking about. If a policy is "just" dumb, or insecure, it's probably not your role to change it but, at most, to share your opinions with whomever is nominally responsible for that.

              Here we are talking about subverting the policies in order to be able to get your job done. No need to explain anything here, just follow the policies and le

              • by BVis ( 267028 )

                It's certainly your problem when they fire you for not doing it.

                • "It's certainly your problem when they fire you for not doing it."

                  Nobody is going to fire you for not doing something known not to be possible.

                  See... machines cause that effect. People get angry at people who don't do as commanded, even if it is a silly command, but they won't take it personally if it is a machine saying "no". That means the CEO cannot have a four-letter password, because it is not me disallowing it but a policy in a machine stating that everybody will have an 8-letter password, a

                  • by BVis ( 267028 )

                    "Change it or you're fired."

                    He got his non-compliant password.

                    Executives are immune from inconvenient policy.

      • The problem is step 4.

        Using encryption correctly with access controls is all but impossible with current OSes.

        Very few OSes have access controls set up to limit access properly. Most current forms of access control assume a greater and greater level of access with each level. That still creates accounts which can access everything. You don't want that.

        What is needed is an access level system that lets you install updates, maybe move files, but not read them. This way the system admin can't access your secure data, period.

        • "Most current forms of Access control assume a greater and greater level of access with each level. That still creates accounts which can access everything."

          Hey! we could put a name to that. I suggest, hummm... "discretionary access control". What about that?

          "What is needed is an access level system that lets you install updates, maybe move files, but not read them. This way the system admin can't access your secure data period."

          If only someone invented something we could call, say, "mandatory access cont
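
          As a toy illustration of that distinction (labels and roles invented for the example): under a mandatory policy the system, not the data owner or the admin, decides who may read what, so an operator can be allowed to manage an object without being able to read it.

          ```python
          # Toy mandatory-access-control check: the policy, not the file owner, decides.
          from dataclasses import dataclass

          LEVELS = {"public": 0, "internal": 1, "secret": 2}

          @dataclass
          class Subject:
              name: str
              clearance: str            # highest label this subject may read
              can_manage: bool = False  # may move/back up/patch, but not read contents

          @dataclass
          class Resource:
              name: str
              label: str

          def may_read(subject: Subject, res: Resource) -> bool:
              return LEVELS[subject.clearance] >= LEVELS[res.label]

          def may_manage(subject: Subject, res: Resource) -> bool:
              return subject.can_manage  # management does not imply read access

          admin = Subject("sysadmin", clearance="public", can_manage=True)
          payroll = Resource("payroll.db", label="secret")

          print(may_manage(admin, payroll))  # True: can move/patch/back it up
          print(may_read(admin, payroll))    # False: cannot read the contents
          ```
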

    • Ha! It's not like they don't actually want all that delicious, valuable customer data. That stuff is pure gold. They just want to be able to use it themselves, such as selling access to "interested third parties".

      My summation / interpretation of the article's premise: "We don't want a huge security breach that will embarrass us, but we don't actually want to spend a lot of money on the problem."

    • Re: (Score:2, Informative)

      by Anonymous Coward

      1) Stop using cloud infrastructure for storage.
      Essentially, the reason stuff gets stolen in the first place is that someone's client is compromised, which is a lot easier to do than hacking into the cloud storage itself.
      2) Stop using virtual machines on "the real network", because it's a lot easier to just pull a virtual machine image and run it on a "hostile" machine, all the while impersonating the hypervisor of the real machine. Why brute-force over the network when you can just patch the login process

    • by schwit1 ( 797399 )
      How am I supposed to know it's you unless I use your SSN as an ID number? But how are we supposed to send you targeted advertising if we don't know everything about you?
    • Sure, but then how are they going to put all our medical records online and use big data to analyze treatments and outcomes? Oh, you don't want that? No, sorry, there isn't an opt-out for that. I guess even though we know businesses are incapable of protecting privacy, we'll just have to be understanding that it's for our own good.

      Myself, I'm preparing certified letters for my pharmacy, insurer and doctor's office to let them know they do not have my consent to do any of that and that I'd like all eligible reco

  • by ArhcAngel ( 247594 ) on Friday April 24, 2015 @12:01AM (#49543241)
    You can't train an employee to care [wikipedia.org] about someone else's data. If you make them take the course, they will. They might even retain some of the message, but when it comes time to put it into action, it had better not be more complex than pressing a button, because something else more important is calling their name.
  • by mlts ( 1038732 ) on Friday April 24, 2015 @12:06AM (#49543251)

    Elaborating on the concept, the good thing is that businesses have a lot of security tools that are not too expensive:

    IDS/IPS.

    AD's innate protection and logging.

    Management and Alerting software like SolarWinds, SCOM/SCVMM/SCCM, or Splunk/Puppet/Chef/Webmin.

    Encapsulating network segments by offering access to data without the ability to fetch the raw items, which can be done with App-V, Remote Desktop, or Citrix.

    Disk encryption is in virtually every OS.

    Basic routing/firewalling/segmenting either via dedicated appliances or a general purpose PC with a routing OS.

    Virtualization/containers to separate applications from each other, as well as to completely revert the damage done by malware via snapshots.

    Backup servers. Even an SMB can buy an edition of Windows Server 2012 R2, enable the Essentials package, and back up a number of clients via a pull mechanism, which prevents malware on the target clients from being able to tamper with or modify stored data on the server. For larger installs, MS's SCDPM is one alternative; NetBackup, TSM, and other enterprise-tier utilities are others.

    Now the bad news:

    The tools we have are decent. However, it takes not just putting them together into a cohesive security structure, but also putting policies and procedures in place and dealing with the human element. Piss the employees off, and no amount of glued USB ports and draconian policies will keep them from slurping data offsite out of spite. This is where the expenses come in. It takes people who know what the heck they are doing and know each tool's uses and limits (for example, not thinking that BitLocker protects against threats over the network).

    A whitehat's job is hard. It requires a broad spectrum of product knowledge, as well as being able to configure things in a failsafe manner [1] so that if one security item fails, all isn't lost.

    Another problem is that there has been such a disincentive for so long for people interested in computer security. I have been told by managers at different companies, "Security has no ROI, and if we do get hacked, Tata/Infosys/Geek Squad can fix the problem with a phone call." Because security has been the hind teat of the IT world for so long, finding experienced people is hard and can be expensive.

    Maybe this will change, and if companies want security people, more people will start going that route, creating a positive feedback loop. However, I fear it is going to take a major event that causes loss of life before that ever happens [2].

    It may not have to be that expensive a fix... if Sony had had an alerting system to notify their SOC that someone was brute-forcing AD, the attack against them likely would have been far less widespread.

    [1]: For example, an anonymous FTP site would have the /pub directory NFS-mounted read-only with permissions squashing root, but allowing everyone to read that directory. That way, if the FTP server gets compromised, the data offered for public FTP can't be tampered with. Of course, the intruder can dismount /pub and put their own Trojaned downloads in its place, but security is about mitigation of attacks as well as prevention, and cleaning up a hacked FTP server can be as easy as rolling back to an earlier VMware snapshot.

    [2]: Before the term "cyber 9/11" was coined, it was termed the "Warhol event".
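
    As a rough sketch of that kind of alerting (hypothetical export format: one JSON object per line with "EventID", "IpAddress", and "TimeCreated" fields; 4625 is the Windows "account failed to log on" event):

    ```python
    # Rough sketch: flag likely brute-force attempts in exported security events.
    import json
    import sys
    from collections import defaultdict
    from datetime import datetime, timedelta

    FAILED_LOGON = 4625           # Windows "an account failed to log on"
    WINDOW = timedelta(minutes=5)
    THRESHOLD = 20                # failures from one source within the window

    def alert_on_bruteforce(lines):
        recent = defaultdict(list)  # source IP -> timestamps of recent failures
        for line in lines:
            event = json.loads(line)
            if event.get("EventID") != FAILED_LOGON:
                continue
            src = event.get("IpAddress", "unknown")
            ts = datetime.fromisoformat(event["TimeCreated"])
            recent[src] = [t for t in recent[src] if ts - t <= WINDOW] + [ts]
            if len(recent[src]) >= THRESHOLD:
                yield f"ALERT: {len(recent[src])} failed logons from {src} in {WINDOW}"

    if __name__ == "__main__":
        for msg in alert_on_bruteforce(sys.stdin):
            print(msg)
    ```
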

    • That's like.. a few hundred thousand dollars of software and hardware to support all that software.
      • by Tool Man ( 9826 )

        These companies seem convinced there is a financial reason to keep everyone else's data, and maybe there is. If so, it behooves them to do so correctly, according to the value of what they hold. If they think the data is worth less, a painful lawsuit judgment may change their minds. (See Ford and Pinto gas tanks.)

    • by mjwx ( 966435 )
      The big cost isn't in the software itself, but in the cost of the operation.

      Even a free tool has huge costs if it requires specialist knowledge. What stops a lot of companies from being more secure isn't that they don't have the tools; it's that they don't have the know-how.

      AD's innate protection and logging.

      Could you elaborate on this, specifically how to configure and use it? Any link would be appreciated.

      Although it won't be of much use in my current role (we have another system that does everything t

      • by clark0r ( 925569 )
        AD is useful for providing policy control over objects (e.g. for rolling out configurations to a particular set of devices) and for providing a directory with nested groups for granular access to network resources (file shares, web-based systems, desktops, networks, anything that integrates with AD). AD also provides a lot of logs so you can see wtf happened after an event, or when performing regular auditing. If you're running a Windows shop you can't deny that AD is your most powerful tool; I've not yet come acr
  • by Anonymous Coward

    If anybody has looked for a job lately, you know most companies are using some form of applicant tracking system, i.e. you don't send your info to the company; you enter it into a third-party web form. Mosey on down and read the privacy statement; they all say the same thing: we value your privacy... we will share all of your data with our "trusted" partners. Who are they? What are they doing with my data? Who are they sharing it with? What is their privacy policy? What control do we have? None obviou

  • by DougPaulson ( 4034537 ) on Friday April 24, 2015 @12:11AM (#49543261)
    How about encrypting the data and using PKI [techtarget.com] over VPN [techtarget.com] with a full, irrevocable audit trail, the keys being stored on a portable hardware token?
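
    A minimal sketch of the encryption half of that (hybrid encryption only; the VPN, hardware token, and audit trail are left out, and the third-party cryptography package is assumed): encrypt the record with a symmetric key, then wrap that key with the recipient's public key so only whoever holds the private key, ideally on the token, can recover it.

    ```python
    # Minimal hybrid-encryption sketch: symmetric key for the data, RSA-OAEP to wrap
    # the key. A real deployment would keep the private key on a hardware token.
    from cryptography.fernet import Fernet
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding

    # Recipient's key pair (in practice generated on, and never leaving, the token).
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    public_key = private_key.public_key()

    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)

    # Sender side: encrypt the record, then wrap the data key for the recipient.
    data_key = Fernet.generate_key()
    ciphertext = Fernet(data_key).encrypt(b"member_id=1234, dob=1970-01-01")
    wrapped_key = public_key.encrypt(data_key, oaep)

    # Recipient side: unwrap the data key, then decrypt the record.
    recovered_key = private_key.decrypt(wrapped_key, oaep)
    print(Fernet(recovered_key).decrypt(ciphertext))
    ```
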
  • Reality (Score:4, Insightful)

    by sublayer ( 2465650 ) on Friday April 24, 2015 @12:15AM (#49543277)
    TFA: Just about every business cares about data privacy

    Reality: Just about every business says they care about data privacy

    The first line of the typical company privacy policy is "we value your privacy", but the next ten pages list all the ways they are going to violate it.

    • by Anonymous Coward

      Exactly. And we all know many examples of even the largest companies flat-out refusing, often with the greatest arrogance, to do anything about even the simplest security/privacy holes... We have personal examples, and there are many articles about this on Slashdot too...

      Most *never* care in the least, even when it is exposed in the mass media and they lose/spend millions or more because of it. They *never* care. These people at the top have all the money they want for many lives. At best they care about even

  • by Anonymous Coward

    Of course it means their data privacy, not yours.

  • by Gravis Zero ( 934156 ) on Friday April 24, 2015 @12:31AM (#49543321)

    smaller companies care about data privacy just as much as big ones do

    So they care deeply until you ask them to spend money, at which point they will do the minimum needed to avoid being sued. Gotcha: they're directed by sociopaths.

    • ...or the small ones only hire people who believe the same things as the company. That is, if you're small (and you believe in privacy), you can only afford to hire people that also believe in privacy, have integrity and can keep a secret. Big companies cast their net much wider, and by the miracle of crap middle-management ensure that those people only do as they're told and don't think for themselves. Thus, those people need to be told to observe privacy through training courses.

      Ultimately, privacy is eit

      • "by the miracle of crap middle-management ensure that those people only do as they're told and don't think for themselves."

        Is not "crap middle-management" but "crap companies". In such companies, the moment middle-management start thinking for themselves, they are fired.

    • smaller companies care about data privacy just as much as big ones do

      So they care deeply until you ask them to spend money, at which point they will do the minimum needed to avoid being sued. Gotcha: they're directed by sociopaths.

      I think you spelled MBA wrong.

  • They care about it bad, man, bad!
  • Suppose a smaller company does care and wants to implement measures. These tools sound good, but like an auto parts store when you want a whole car, the integration is non-trivial. I guess the current solution is to hire a specialist, if you can find an appropriate one. Maybe the industry has to evolve a bit more.
  • Most everyone is commenting about better security software, firewalls, VPNs, encryption, and all that shit. Isn't the article about employee training?

    For example: call up a bank. Try to get the balance on someone's account. This is a task well within reason for the person on the other end of the phone, ASSUMING it is your account, right? That's the point of employee training. The human element is the weakest element of any security system. What training do these employees need in order to not leak out your

  • Enterprise security management is an ongoing process. The underlying trick is in understanding those vulnerable points that are exploitable, and in identifying the impact on end users and the business. With data infiltration and breaches taking place at an alarming rate, organizations need to build robust enterprise security management strategies. Read: 5 Key Aspects for a Robust Enterprise Security Management Strategy [cmsitservices.com]
  • by pr100 ( 653298 ) on Friday April 24, 2015 @06:16AM (#49544063)

    :/

  • Big companies spend time training so they can point to it when something happens? Training is mostly CYA, not real protection.

    Firing helps a lot; you would be amazed at the amount of stupid and lazy. We implemented simple SSN email filters. Watch how many people send things with full SSNs outside the company (something that should never happen), everything from not redacting after an idiot customer puts a full SSN in an email to automated reports getting sent to outside vendors without so much as requirin
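
    A "simple SSN email filter" really can be simple; here is a rough sketch (the pattern and domain check are illustrative, nowhere near a complete DLP rule):

    ```python
    # Illustrative outbound-mail check for unredacted SSNs. A real rule would also
    # catch SSNs without dashes, scan attachments, and handle obvious obfuscations.
    import re

    SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

    def hold_for_review(message_body: str, recipient_domain: str,
                        internal_domain: str = "example.com") -> bool:
        """Return True if the message leaves the company and contains an SSN."""
        leaving_company = recipient_domain.lower() != internal_domain
        return leaving_company and bool(SSN_PATTERN.search(message_body))

    print(hold_for_review("Customer SSN is 123-45-6789", "vendor.test"))  # True
    print(hold_for_review("Ticket #123-45", "example.com"))               # False
    ```
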

    • "Something like a SSN should be sitting in a well secure table that only verifies if it's a match since no human should ever need to do a customer to SSN lookup"

      And this, sir, shows where the problem lies: a lack of even basic understanding of what security is about.

      Why the hell should an IDENTIFIER be taken for a SECURITY TOKEN???

      SSNs should be damn public, because they are and should be nothing but a way for you to tell me who you are, just as your name is. Can you imagine your name being secret? Well, an

      • Unfortunately, the government tends to require that it be used. The credit industry got allowed to use it as an identifier. At this point it's out of the industry's hands; it would require an act of Congress to change things.

        • "The credit industry got allowed to use it as an identifier"

          That's good, since it *is* an identifier, a better one than the first name/surname combination since it offers fewer collisions. What it is not is an authenticity token.

          The problem is not the industry using SSNs as a means of identification, which should be OK, but that they are using them as passwords.

          Since they are private companies, it really doesn't take "an act of Congress" to change things, just people voting with their wallets. Would you put you

  • A comma between 'privacy' and 'bad' would have done wonders for clarifying the headline. It's kind of confusing.

  • If anyone knows that money solves all ills, it is an independent corporation. They just have to decide the right way to solve their ills with it.
  • If companies actually cared about data privacy, then they would know how to protect it. If they don't know how to protect it, then they only care about *appearing* to care about data privacy.

"I've finally learned what `upward compatible' means. It means we get to keep all our old mistakes." -- Dennie van Tassel

Working...