How Privacy-Enhancing Technologies Are Fulfilling Cryptography's Potential (theguardian.com)
Here's the Guardian's report on new cryptographic techniques where "you can share data while keeping that data private" — known by the umbrella term "privacy-enhancing technologies" (or "Pets").
They offer opportunities for data holders to pool their data in new and useful ways. In the health sector, for example, strict rules prohibit hospitals from sharing patients' medical data. Yet if hospitals were able to combine their data into larger datasets, doctors would have more information, which would enable them to make better decisions on treatments. Indeed, a project in Switzerland using Pets has since June allowed medical researchers at four independent teaching hospitals to conduct analysis on their combined data of about 250,000 patients, with no loss of privacy between institutions. Juan Troncoso, co-founder and CEO of Tune Insight, which runs the project, says: "The dream of personalised medicine relies on larger and higher-quality datasets. Pets can make this dream come true while complying with regulations and protecting people's privacy rights. This technology will be transformative for precision medicine and beyond."
The past couple of years have seen the emergence of dozens of Pet startups in advertising, insurance, marketing, machine learning, cybersecurity, fintech and cryptocurrencies. According to research firm Everest Group, the market for Pets was $2bn last year and will grow to more than $50bn in 2026. Governments are also getting interested. Last year, the United Nations launched its "Pet Lab", which was nothing to do with the welfare of domestic animals, but instead a forum for national statistical offices to find ways to share their data across borders while protecting the privacy of their citizens.
Jack Fitzsimons, founder of the UN Pet Lab, says: "Pets are one of the most important technologies of our generation. They have fundamentally changed the game, because they offer the promise that private data is only used for its intended purposes...." The emergence of applications has driven the theory, which is now sufficiently well developed to be commercially viable. Microsoft, for example, uses fully homomorphic encryption when you register a new password: the password is encrypted and then sent to a server who checks whether or not that password is in a list of passwords that have been discovered in data breaches, without the server being able to identify your password. Meta, Google and Apple have also over the last year or so been introducing similar tools to some of their products.
The article offers quick explanations of zero-knowledge proofs, secure multiparty computation, and fully homomorphic encryption (which allows the performance of analytics on data by a second party who never reads the data or learns the result).
And "In addition to new cryptographic techniques, Pets also include advances in computational statistics such as 'differential privacy', an idea from 2006 in which noise is added to results in order to preserve the privacy of individuals."
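The Laplace mechanism the quote alludes to can be sketched in a few lines. This is a minimal illustration, not a production implementation: the dataset, the `dp_count` helper, and the query are all made up for the example, and a real system would also track privacy budget across queries.

```python
import math, random

def dp_count(records, predicate, epsilon):
    """Answer a count query with Laplace noise (a count has sensitivity 1)."""
    true_count = sum(1 for r in records if predicate(r))
    scale = 1.0 / epsilon                      # Laplace scale b = sensitivity / epsilon
    u = random.random() - 0.5                  # uniform in [-0.5, 0.5)
    # Inverse-CDF sample from Laplace(0, scale)
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise

ages = [34, 29, 41, 52, 38, 61, 45]
noisy = dp_count(ages, lambda a: a > 40, epsilon=1.0)   # true answer is 4, plus noise
```

Smaller epsilon means more noise and stronger privacy; the analyst sees a perturbed count rather than the exact one, which limits what can be inferred about any single individual.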
PETs (Score:2)
It's PETs, not Pets; stop pretending it's some sort of fluffy thing that no one should object to.
It's incredibly annoying trying to figure out what "Pets" means without context (which happens often enough), whereas PETs at least makes it unambiguous.
Re:PETs (Score:5, Interesting)
i don't know about this new pets, but i was involved in a project for electronic voting that built heavily on homomorphic encryption and zero knowledge proofs, from 2010 to circa 2015. we never added noise as a specific step, though, we mixed stuff so it wasn't traceable but still verifiable.
this experience convinced me that universal online electronic voting is perfectly feasible in theory. life convinced me that, in practice, it doesn't matter.
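The voting scheme the parent describes can be illustrated with a toy additively homomorphic cryptosystem (Paillier): multiplying ciphertexts adds the underlying votes, so a teller can compute the total without opening any individual ballot. This is a sketch with deliberately tiny, insecure parameters, and it omits the zero-knowledge proofs and mixing the parent mentions.

```python
import math, random

# Toy Paillier cryptosystem: additively homomorphic, tiny insecure parameters.
p, q = 293, 433                        # toy primes; real deployments use ~2048-bit moduli
n, n2 = p * q, (p * q) ** 2
g = n + 1                              # standard generator choice g = n + 1
lam = (p - 1) * (q - 1) // math.gcd(p - 1, q - 1)   # lcm(p-1, q-1)

def L(u):
    return (u - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)    # modular inverse (Python 3.8+)

def encrypt(m):
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

# Tallying encrypted ballots: the product of ciphertexts decrypts to the sum of votes.
ballots = [1, 0, 1, 1, 0]              # 1 = yes, 0 = no
tally = 1
for b in ballots:
    tally = (tally * encrypt(b)) % n2
assert decrypt(tally) == 3             # three yes votes, no ballot ever opened
```

In a real election scheme each voter would also attach a zero-knowledge proof that their ciphertext encrypts 0 or 1, so a malicious client can't encrypt 1,000,000 "yes" votes.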
Re: (Score:2)
I did a project around 2000 for an electronic voting mechanism, and tested prototypes against all kinds of attacks, from physical capture to black box stuff. It convinced me that without some sort of verified voting mechanism, like Chaum's voter verifiable election system, it is still easily breakable and a well-heeled nation-state can easily tamper with results. Mainly because there are so many links and so many points of attack. Compromising the machines, compromising the storage, compromising the firmware...
Re: (Score:2)
Mainly because there are so many links and so many points of attack. Compromising the machines, compromising the storage, compromising the firmware, armed robbery (again, voting is high stakes, so for an organization which would benefit greatly if their guy is in office, I wouldn't put it past them)...
you need a proper protocol and chain of custody to manage all that. machines are tpm'd (the full software stack must be open source), critical systems are airgapped, tasks are split among machines, keys are split into shares among representatives, etc. it does work, and it is just as secure as paper (if not more) and in the end you can even exactly determine if even a single vote was tampered with, and if need be invalidate the election (with paper you could have trailers of votes be taken away and no one w
Re: (Score:2)
Also not to be confused with polyethylene terephthalate (PET), one of the most common types of plastic.
Wow, Microsoft (Score:1)
"Microsoft, for example, uses fully homomorphic encryption when you register a new password: the password is encrypted and then sent to a server who checks whether or not that password is in a list of passwords that have been discovered in data breaches, without the server being able to identify your password"
So Microsoft has finally discovered one-way hashed passwords?
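For contrast, here is a minimal sketch of the plain hash-based check the parent is describing (the breach corpus is made up for the example). Note the difference from the homomorphic scheme in the summary: with an ordinary hash lookup, whoever holds the breach list still sees the hash of your candidate password, whereas the homomorphic approach aims to hide even that.

```python
import hashlib

# Toy breach corpus, stored only as one-way SHA-256 hashes
breached = {hashlib.sha256(pw.encode()).hexdigest()
            for pw in ("123456", "password", "qwerty")}

def is_breached(candidate):
    """Check a candidate password against the hashed breach list."""
    return hashlib.sha256(candidate.encode()).hexdigest() in breached

assert is_breached("password")
assert not is_breached("correct horse battery staple")
```

A hash of a weak password is also cheap to brute-force offline, which is one reason a server-side check wants something stronger than a bare hash comparison.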
Re: (Score:2)
Good point, but why does it matter? If you enter a password into a client, the client can send it to the server unencrypted if it chooses; you have to trust the client/server not to keep your password. Sure, keeping the passwords hashed is a good idea so people can't figure them out, but not doing the initial check in clear text (over an encrypted channel) seems pointless. If you don't want the server to know your secret, never give it to them. Like only give them your public key and they can challenge-response your
Re: (Score:2)
Good job crypto Bros (Score:1)
I guess you did add a lot of computational overhead on top of an already existing process that dates back to... honestly, it's got to be at least the '70s. Seriously, people were doing this before computers.
I'm really getting sick and tired of people telling me that a giant linked list that everybody has a copy of is somehow a revolutionary technology. I get it, the Ponzi scammers are trying to make it seem like a legitimate thing. But it is kind of annoyi
Re: (Score:2)
It's become a fairly typical part of having a universal healthcare system. Universal healthcare is also transferable across the entire E
I don't have the author's faith (Score:4, Interesting)
In my mind, these sorts of protections - and the concept of differential privacy - really only work if an entity hasn't taken steps to control the whole end-to-end stack. Like Google. Or, for that matter, Apple.
In the past I've tended to have a bit more faith in Apple, just because they've put so much of their reputation behind the "we are the privacy company" idea - if they get caught intentionally working against that, it could destroy them. However my trust in them has been eroding. Not that I see them necessarily being intentionally deceptive, like Google; I just think over the past several years Apple has repeatedly demonstrated a lack of competence on the software side (which is almost as bad as being intentionally evil, from an end-user perspective).
I'm not giving Microsoft a pass on this; I just think they've largely failed in their attempts to this point.
Client trust (Score:2)
Wrong definition of fully homomorphic encryption. (Score:2)
Fully homomorphic encryption allows you to perform mathematical operations on encrypted data without decrypting it first.
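The core idea of computing on ciphertexts can be seen even in textbook RSA, which is multiplicatively homomorphic: multiplying two ciphertexts yields an encryption of the product of the plaintexts. To be clear, RSA is not *fully* homomorphic (it supports only multiplication, not addition too, as lattice-based FHE schemes do), and the parameters below are toy-sized and insecure; this is purely an illustration of the homomorphic property.

```python
# Textbook RSA is multiplicatively homomorphic: Enc(a) * Enc(b) decrypts to a * b.
# Toy, insecure parameters for illustration only.
p, q = 61, 53
n = p * q                           # 3233
e = 17
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent via modular inverse (Python 3.8+)

def enc(m):
    return pow(m, e, n)

def dec(c):
    return pow(c, d, n)

a, b = 7, 6
# The multiplication happens entirely on ciphertexts; only decryption reveals 42.
assert dec(enc(a) * enc(b) % n) == a * b
```

A fully homomorphic scheme extends this so that arbitrary circuits of additions and multiplications can be evaluated on encrypted inputs, which is what makes the analytics-on-private-data applications in the article possible.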