Calif. Attorney General: We Need To Crack Down On Companies That Don't Encrypt
tsamsoniw writes "California Attorney General Kamala Harris says her office will start cracking down on companies in the Golden State that don't encrypt customer data and fall victim to data breaches; she's also calling on the state to pass a law requiring companies to use encryption. That's just one of the recommendations in the state's newly released data breach report, which says 131 companies in California suffered data breaches in 2012, affecting 2.5 million residents."
Re:NSA (Score:5, Insightful)
Mozilla Firefox and other browsers treat self-signed certs as WORSE than unencrypted connections and put big scary messages up
I think that is a reasonable action for a certificate whose source you don't know. You can always add the certificate to your browser [mozilla.org] and avoid the error. The rationale for the pop-up is that an unknown self-signed certificate is as bad as no encryption - totally open to a man-in-the-middle attack - but people have a higher expectation of security from SSL.
Re:NSA (Score:5, Insightful)
Is "as bad as no encryption" a reason for yelling at the user and presenting it like the worst security problem ever? Even if I accept the premise that it is as bad as no encryption, the obvious conclusion is that the browser should present it the same as no encryption.
Actually, it is not as bad. It still keeps you safe from passive attacks (like your ISP collecting all data for a three-letter agency to analyse later).
Re:NSA (Score:5, Insightful)
people have a higher expectation of security from SSL.
I think the GP's point was that it does not have to be all-or-nothing - that you can have SSL with a self-signed cert without the error message and without giving any "expectation of [high] security" (to quote the GP: "no full secure icon").
The rationale for the pop-up is that an unknown self-signed certificate is as bad as no encryption
In light of the Snowden revelations and subsequent fallout, this rationale has very few legs to stand on. An unencrypted connection is less desirable than a self-signed one, not the other way around. The only argument I have seen in its favour is that people may be lulled into a false sense of security if they believe self-signed certs are as secure as CA-issued ones, falling for MITM attacks on their bank traffic etc. The counter to that is simple and sensible: no, they won't, if the browser does not try to tell them they have a fully secure connection - and instead treats it like a plain-text connection.
self-signed certificate is... totally open to a man-in-the-middle attack
The current SSL system is also totally open to man-in-the-middle attacks by state sponsors, as has been reported here various times. And yes, self-signed certs are also very vulnerable to the same attack - but the point here is to encrypt the majority of data. State sponsors can always target individuals, but with blanket always-on encryption they are unable to perform mass illegal capture and storage... that is the point of not raising an error message on self-signed certs.
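One mitigation the thread hints at is certificate pinning: record the self-signed cert's fingerprint on first contact, then compare it on later connections, which turns a silent MITM into a detectable mismatch. A minimal sketch, assuming hypothetical certificate bytes (in practice you would take the DER bytes from `getpeercert(binary_form=True)`):

```python
import hashlib
import hmac

# Hypothetical DER-encoded certificate bytes, standing in for what the
# server actually presented on first contact.
server_cert_der = b"---example self-signed certificate bytes---"

# Fingerprint recorded out-of-band (or on first contact) and "pinned".
pinned_fingerprint = hashlib.sha256(server_cert_der).hexdigest()

def cert_matches_pin(cert_der: bytes, pin: str) -> bool:
    """Compare the presented cert's SHA-256 fingerprint to the pinned value,
    using a timing-safe comparison."""
    fingerprint = hashlib.sha256(cert_der).hexdigest()
    return hmac.compare_digest(fingerprint, pin)

# Same cert on a later connection: fingerprint matches.
assert cert_matches_pin(server_cert_der, pinned_fingerprint)
# A MITM substituting its own cert produces a different fingerprint.
assert not cert_matches_pin(b"---attacker certificate bytes---", pinned_fingerprint)
```

This is how SSH already handles unknown hosts: trust on first use, scream only when the key changes.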
Any way I cut these arguments, browsers appear to be in the wrong on this one - throw in cosy relationships with CAs, state departments, etc., and we could have a conspiracy here.
Re:Encryption (Score:4, Insightful)
So instead of burning CPU cycles, you are burning crypto-processor cycles, plus you have the cost of buying the hardware in the first place and possibly the bus overhead of sending data to/from the device.
If the server gets compromised while it's running, the data is accessible, because the server needs access to the data in order to function.
If the server gets physically stolen, it's likely the crypto hardware will be stolen with it. If you store the key somewhere it can be automatically obtained and used, then the key can be stolen too; if you enter the key manually on bootup (i.e. as you would on a laptop), then you require physical intervention whenever the server reboots for any reason.
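The "enter the key manually on bootup" model can be sketched with Python's standard library: the disk key is derived from an operator passphrase plus a salt stored on disk, so the key itself never rests on the stolen hardware. The passphrase and iteration count here are illustrative:

```python
import hashlib
import os

# The salt is stored alongside the encrypted volume; it is not secret.
salt = os.urandom(16)

def derive_key(passphrase: str, salt: bytes) -> bytes:
    # PBKDF2-HMAC-SHA256; the iteration count slows brute-force attempts
    # against a stolen disk (200,000 is illustrative, tune for your hardware).
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 200_000)

# At boot, the operator types the passphrase and the key is re-derived.
key = derive_key("correct horse battery staple", salt)
assert derive_key("correct horse battery staple", salt) == key  # reproducible
assert derive_key("wrong passphrase", salt) != key
assert len(key) == 32  # suitable as an AES-256 key
```

The trade-off is exactly the one stated above: a thief who takes the box gets only ciphertext, but every reboot needs a human at the console.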
Encryption has its uses, but it's not a magic bullet, and poor/inappropriate use of encryption is damaging - not only does it waste resources unnecessarily, but it also brings a false sense of security and encourages lazy thinking... People will simply implement the bare minimum required to comply with the law, which will probably mean encrypting the data while leaving the key on the same box.
You will also end up with a "one size fits all" attitude, which is clearly ridiculous...
You need to consider *what* data you're storing, *why* you're storing it and *what* needs to access it.
You can segregate the data so that some is only accessible by those systems that need it.
You can tokenize the data, eg for repeat billing of a credit card you can store a token agreed only between you and your payment processor.
You can store rarely referenced data with public/private keys, leaving only the public key online and keeping the private offline for use when necessary.
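The tokenization option above can be sketched as follows; `TokenVault` is a hypothetical stand-in for the payment processor's side of the agreement, not any real API:

```python
import secrets

class TokenVault:
    """Toy model of a processor-side token vault: the merchant stores only
    the opaque token; the token -> card-number mapping lives solely with
    the processor."""

    def __init__(self):
        self._vault = {}

    def tokenize(self, card_number: str) -> str:
        # The token is random, so it carries no information about the card.
        token = secrets.token_urlsafe(16)
        self._vault[token] = card_number
        return token

    def detokenize(self, token: str) -> str:
        return self._vault[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")
# The merchant database stores only `token`; a breach of the merchant
# reveals nothing usable, yet repeat billing still works via the processor.
assert token != "4111111111111111"
assert vault.detokenize(token) == "4111111111111111"
```

This is the point of the list above: for repeat billing you never needed the card number on your own systems in the first place, so there is nothing there to encrypt.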
No, pushing a one-size-fits-all "encrypt your data" mandate is stupid and will only make things worse; each individual case needs to be designed by someone who understands the needs and is technically competent.
Dictate penalties and properties not methods (Score:5, Insightful)
Good laws of this sort are those which do not impose technical solutions but rather provide general systems level requirements.
The problem with "duh, use encryption" is that there is no guarantee of any kind that simply applying encryption makes a system more secure against a specific threat.
Every time you get into the weeds you are guaranteed to codify errors and hurt those who choose to innovate using different but better or equally valid approaches.