EU Chat Control Law Proposes Scanning Your Messages - Even Encrypted Ones (theverge.com) 136
The European Union is getting closer to passing new rules that would mandate the bulk scanning of digital messages -- including encrypted ones. On Thursday, EU governments will adopt a position on the proposed legislation, which is aimed at detecting child sexual abuse material (CSAM). The vote will determine whether the proposal has enough support to move forward in the EU's law-making process. From a report: The law, first introduced in 2022, would implement an "upload moderation" system that scans all your digital messages, including shared images, videos, and links. Each service required to install this "vetted" monitoring technology must also ask permission to scan your messages. If you don't agree, you won't be able to share images or URLs.
As if this doesn't seem wild enough, the proposed legislation appears to endorse and reject end-to-end encryption at the same time. At first, it highlights how end-to-end encryption "is a necessary means of protecting fundamental rights" but then goes on to say that encrypted messaging services could "inadvertently become secure zones where child sexual abuse material can be shared or disseminated."
It's bad, but (Score:4, Insightful)
All it constrains are URLs and images. If you send raw text, the regulation doesn't seem to apply.
Still kinda sucks that the EU is glomming onto CSAM purveyors as a way to break consumer-grade encryption. The CSAM peddlers will find ways around this, while regular people will risk undue scrutiny and violations of their privacy if they dare send encrypted images or HTML links.
Re: (Score:3, Insightful)
It is basically what Apple tried. A list of known illegal image and URL hashes. The image hashes are fuzzy, so they can detect things like resized or slightly modified images. All detection is done locally on the device, so end-to-end encryption is not affected.
As Apple discovered, the technology doesn't work. It's not at all difficult to create images that trigger a hash collision, but which are entirely different to the illegal image they are trying to block.
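A rough sketch of the kind of "fuzzy" (perceptual) hashing being described, in Python with Pillow. This is a simple difference hash, not Apple's NeuralHash, and the filenames are placeholders; near-duplicate images (resized, lightly recompressed) produce hashes that differ in only a few bits, so matching uses a Hamming-distance threshold rather than equality.

from PIL import Image

def dhash(path: str, size: int = 8) -> int:
    # Shrink to (size+1) x size grayscale and compare each pixel to its right neighbour.
    img = Image.open(path).convert("L").resize((size + 1, size))
    px = list(img.getdata())
    bits = 0
    for row in range(size):
        for col in range(size):
            left = px[row * (size + 1) + col]
            right = px[row * (size + 1) + col + 1]
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a: int, b: int) -> int:
    # Number of differing bits between two hashes.
    return bin(a ^ b).count("1")

# hamming(dhash("original.jpg"), dhash("resized_copy.jpg")) is small for near-duplicates,
# but adversarial images can be crafted to "match" while looking nothing like the original.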
It's not clear if the EU wants them to revisit
Re:It's bad, but (Score:5, Insightful)
Apple tried client side scanning, but stopped because it was a PR disaster. It's not about technical or economic problems; they've been doing it for mail for 5 years now after all (and Microsoft/Google a lot longer than that). There are huge costs to the human review, but the big boys can pay for it.
The Centre will simply force human review on the service providers. Send too many false positives to the Centre and you get fined, miss one of their test images in undercover tests and you get fined. Apple&Co can handle it themselves, smaller service providers will have to let Kutcher&Co handle it for them to keep costs under control. That much is clear, because it's the only way it can work.
I prophesize that eventually they are going to hand out some filtering software and for exceptional cases the software will report straight to the Centre without the service provider being allowed to look at the messages ... all for the children of course. At least that is how I would do it if I was a politician who wanted totalitarian control over citizens on the down low.
Re: It's bad, but (Score:2)
Are Apple, Google and Microsoft offering any end-to-end encrypted email or other messaging service?
Re: (Score:2)
Yes, at least Apple is. iMessage has been end to end encrypted since the beginning, and end to end encryption can be enabled in Apple Mail. https://support.apple.com/en-u... [apple.com]
Re: (Score:2)
The mail scanning is different to the tech that Apple tried to apply to iPhones. They are just using a simple file hash for that.
This will only apply to large companies. Small ones won't need to bother with it, won't need to buy in any software.
It's still dumb, but it's not the apocalyptic disaster that some people are making out.
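For contrast, a minimal sketch of the "simple file hash" approach mentioned for mail attachments: an exact SHA-256 digest checked against a set of known-bad digests (the set contents here are a placeholder). Unlike a perceptual hash, changing a single byte of the file defeats the match.

import hashlib

KNOWN_BAD_DIGESTS = {"0" * 64}  # placeholder; real lists come from clearinghouse databases

def sha256_of(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

def is_known_bad(path: str) -> bool:
    return sha256_of(path) in KNOWN_BAD_DIGESTS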
Re: (Score:2)
I prophesize that eventually they are going to hand out some filtering software and for exceptional cases the software will report straight to the Centre without the service provider being allowed to look at the messages ...
That is because they don't actually care about CSAM (what a fucked up acronym) itself. They want to identify documents generated by the US government or catch communications to/by people who want to hurt the US government. The same applies to other countries and unions. They are doing self-protection under the guise of saving the children. How can they lose? How can normal citizens win?
They can't. The fix is in. State level control over the tiniest aspects of our lives is on the plate in the future. I feel
Re: (Score:2)
The politicians involved don't give a shit about your privacy ... they'll just demand it be done server side. Unless the algorithm leaks, you can't really create collisions.
Re: It's bad, but (Score:2)
3, 2, 1... if you have nothing to hide, then why not let the government read all of your private communications? The U.S. used to have citizen privacy rights enshrined in its founding documents, but today under the current "rules", who knows anymore...
so stupid (Score:5, Insightful)
Kiddie porn is well over 99% produced by and for teens sexting each other. The fact that this is not just illegal but a nuclear kind of illegal - like a web page can have a hidden pixel that loads such a photo, and your cops find it for some other reason and then your life is ruined - is ridiculous. There is no value whatsoever in having such laws for such photos. The way to stop it is to educate teens that sexting can lead to bullying and social embarrassment, and if it happens, so what. It's just a photo of an average-looking person, nobody is going to care. Kids do stupid shit all the time and it's going to be yesterday's news by the end of the school year.
Second, the amount of actual kiddie porn is so small that the images that get traded around are specifically known to law enforcement, e-mail providers etc. Like, they show censored versions of the photos (or photos from the set that don't have nudity) to cops nationwide so they can recognize the photos when they see them. Isn't that a million times worse than just letting the photos languish in obscurity?
Third, the idea that data is somehow tainted by where it came from doesn't apply anywhere else. The "Napalm Girl" photo from the Vietnam war won a freaking Pulitzer Prize. The data from Nazi scientists, or the Tuskegee syphilis study, is in journals and books. There are websites full of car crash and suicide photos. Yet when it comes to a teen sexting another teen, my god, the horror -- time to ruin some lives.
Re: (Score:2, Informative)
Third, the idea that data is somehow tainted by where it came from doesn't apply anywhere else. The "Napalm Girl" photo from the Vietnam war won a freaking Pulitzer Prize. The data from Nazi scientists, or the Tuskegee syphilis study, is in journals and books.
The difference with those is that some good comes from their use. That photo helped end the war, illegal medical experiments help treat people who are suffering. Child porn has no beneficial use, unless you accept the idea that access to it prevents paedophiles from harming other children.
Re: so stupid (Score:2)
Re:It's bad, but (Score:5, Insightful)
This idiotic crap has been going on for a few decades now in the EU and especially Germany (which has a history of two (!) totalitarian states doing this to its citizens). There are just some defectives that cannot stand people having secrets and these make their way into power. Obviously, this will do nothing for the stated purpose and I am pretty sure the assholes behind it _know_ that. They just want to read everybody's regular messages.
Re: (Score:3)
This idiotic crap has been going on for a few decades now in the EU and especially Germany (which has a history of two (!) totalitarian states doing this to its citizens). There are just some defectives that cannot stand people having secrets and these make their way into power. Obviously, this will do nothing for the stated purpose and I am pretty sure the assholes behind it _know_ that. They just want to read everybody's regular messages.
More than likely what this really is is yet another way to shovel our data into some giant AI processing system. Because no human will be involved in looking through every message sent.
On the bright side, at least the EU is making this a big public display, rather than trying to shovel it through secretly. Which makes me wonder how much shit HAS been shoveled through secretly.
Re: (Score:2)
That is an excellent point.
Re: (Score:2)
This idiotic crap has been going on for a few decades now in the EU and especially Germany
Germany is one of the few countries that definitively said they would vote against this law on privacy grounds alongside the Netherlands, Luxembourg, Poland and Austria. Your understanding of the EU is incredibly lacking.
Re: (Score:2)
It's interesting that you mention Germany. Germany was one of those European countries where child porn was legal in some circumstances. There used to be magazines that published it. IIRC the rule was that the subject had to take the photo themselves, physically push the shutter button, because the law only made it illegal for other people to photograph children.
The Netherlands was another one that had very lax laws, also around bestiality, so used to produce that stuff commercially. After the laws changed
Re: (Score:2)
They just want to read everybody's regular messages.
And when they finally completely succeed, they will look around and cry and ask how they got into this position. The System is amoral, faceless, and can be used against you as surely as you use it against others. It will be a literal hell for everyone. I imagine Religion will become mandatory.
I don't see how that's possible (Score:4, Insightful)
Re: (Score:2)
The messaging client could detect URLs and especially images on-device, then send/flag those specific messages for review.
(I'm not a proponent for this law in the least, just giving an example of how it could be done.)
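A minimal sketch of that idea, with hypothetical helper names (this is not any real messenger's API): the client checks outgoing messages for URLs or image attachments and only flags those for the consent/scanning flow, leaving plain text alone.

import re

URL_RE = re.compile(r"https?://\S+", re.IGNORECASE)
IMAGE_EXTENSIONS = (".jpg", ".jpeg", ".png", ".gif", ".webp")

def needs_scanning(text: str, attachment_names: list[str]) -> bool:
    # Flag the message if it contains a link or an image attachment.
    has_url = bool(URL_RE.search(text))
    has_image = any(name.lower().endswith(IMAGE_EXTENSIONS) for name in attachment_names)
    return has_url or has_image

print(needs_scanning("see https://example.com", []))      # True: would be flagged
print(needs_scanning("just words, no links", ["a.txt"]))  # False: passes untouched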
Re: It's bad, but (Score:4, Insightful)
The service can't be called end to end encrypted if someone other than you has the ability to decrypt it, even partially.
Re: (Score:2)
Oh, marketing will give you any lie you want to hear and then some. Kind of like some parts of law enforcement.
Re: (Score:2)
Yes. To be fair, I worked on S/MIME a few decades ago for Mozilla. It proved too difficult for mere mortals to deploy. I managed to set it up with family members, but the moment their certificates expired, it was game over, unfortunately. Trying to fix it from 6000 miles away was just not going to happen.
PGP is not exactly simple to set up either, and has different trust issues than X.509-based systems.
Unfortunately, the level of security seems to be inversely proportional to ease of use.
All proper end-to-en
Re: (Score:2)
You don't need certificates for encryption, all you need is key exchange.
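A minimal sketch of that point using the pyca/cryptography package: two parties derive the same symmetric key from an ephemeral X25519 exchange with no certificates involved. (Certificates address a different problem, authenticating whose key you exchanged with.)

from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes

alice = X25519PrivateKey.generate()
bob = X25519PrivateKey.generate()

# Each side combines its own private key with the other's public key.
shared_a = alice.exchange(bob.public_key())
shared_b = bob.exchange(alice.public_key())
assert shared_a == shared_b

# Derive a symmetric session key from the shared secret.
session_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                   info=b"example-chat").derive(shared_a)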
Re: It's bad, but (Score:2)
You don't need to tell me. S/MIME however does rely on certificates.
And there was never a very good answer to public key discovery.
LDAP works at the organization level, but isn't deployed on the public Internet.
You can share your public key by signing an email, but that initial one won't be encrypted.
Doing the key exchange online rather than offline is easier. But doing it securely is another matter.
Re: (Score:2)
LDAP works at the organization level, but isn't deployed on the public Internet.
Whut!? LDAP is *not* encrypted. Why even mention this?
Re: It's bad, but (Score:2)
First of all, LDAP is encrypted. It's just called LDAPS.
Second, in the context of S/MIME e-mail, LDAP provides a way for the sender to find the recipient's certificate, which includes the public key. It's a way to solve the key exchange problem.
That works great in a private organization, because all members/employees are registered in the same directory, where they can publish their certificates.
It doesn't work on the public internet, because there is no global LDAP server to do the same, and AFAIK no means t
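A rough sketch of the lookup being described, using the ldap3 package; the hostname, base DN, and filter are hypothetical, and real directories typically require an authenticated bind.

from ldap3 import Server, Connection, ALL

server = Server("ldaps://ldap.example.com", use_ssl=True, get_info=ALL)
conn = Connection(server, auto_bind=True)  # anonymous bind here; use a bind DN/password in practice

# Look up the recipient's published S/MIME certificate by mail address.
conn.search("dc=example,dc=com",
            "(mail=alice@example.com)",
            attributes=["userCertificate;binary"])

if conn.entries:
    der_cert = conn.entries[0]["userCertificate;binary"].value  # DER-encoded X.509 cert containing the public key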
Re: (Score:2)
First of all, LDAP is encrypted. It's just called LDAPS.
Nope. It isn't. This is just LDAP over SSL - that's still plain text within the channel. So why even mention LDAP? LDAP doesn't even negotiate the channel!
LDAPS is a kludge put in place precisely *because* LDAP is plain-text garbage. Even then, it's still garbage. It's only encrypted inside the channel. It is still exposed at either end.
LDAP provides a way for the sender to find the recipient's certificate, which includes the public key.
OK. You have undermined your own claimed authority here in a way you simply cannot recover from. But thanks for the ... well, not exactly input but... something or other..
Re: It's bad, but (Score:2)
I have no need for the approval of a stranger, much less someone who gets off on putting people down.
What I described are perfectly standard uses of S/MIME, LDAP and e-mail. And some organizations still deploy them that way in 2024. Your criticisms should be directed at the protocol authors. But before you do that, ask yourself why they would still be deployed, if they were proven to be insecure.
Your earlier responses show that you might have missed what problem was being solved in the first place, for whi
Re: (Score:2)
Re: (Score:2)
Well, if you do not care who the other person is, sure.
Re: (Score:2)
"Can't?" What happens when you try to call it that? Facebook offers end-to-e~~NO CARRIER
Re: It's bad, but (Score:2)
Re: (Score:2)
New hallmark (Score:5, Insightful)
Re: (Score:2, Insightful)
Re: (Score:2, Insightful)
I don't trust people who think of the children.
Check their internet history, hard drives, and basement FIRST.
Jimmy Savile thought of the children all the time.
Re: (Score:3)
The Victim or the Crime? (Score:3)
Punitive laws make excellent weapons, and weapons don't need rationality, truth, or justice to inflict damage on anyone for any reason. Call it punishment of the guilty, victimization of the innocent, or institutional indifference towards any miscarriage of justice: who cares if the innocent are guilty and the guilty are innocent at a public hanging for an angry mob? Ethical purgatory and moral ambiguity weaponize hypocrisy and evil with righteous indignation and veng
"Save the children" (Score:5, Insightful)
>"which is aimed at detecting child sexual abuse material (CSAM)"
The *intent* ("aimed at" in this example) doesn't matter. It is what the *effect* will be, which is to destroy privacy and freedom. Throw in "save the children" and everything rational goes right out the window. I think a better law would be to hold parents and their agents responsible for not parenting.
>"The law, first introduced in 2022, would implement an "upload moderation" system that scans all your digital messages,"
How completely draconian. A list/algorithm, controlled by and mandated by the government. And could also be supplemented by mega corps. What a wonderful choke point for monitoring and controlling EVERYTHING. But I am sure they will pinky-promise it would never be abused by any existing or future government or third-party actors.
>"Each service required to install this "vetted" monitoring technology must also ask permission to scan your messages. If you don't agree, you won't be able to share images or URLs."
Oooh! We have a choice! We can agree or just not communicate. Such freedom and transparency!
>"encrypted messaging services could "inadvertently become secure zones where child sexual abuse material can be shared or disseminated."
You can have end-to-end encryption, which is actual privacy and security.... or not. There really is no in-between. CHOOSE. And please choose wisely.
Re: (Score:2)
> destroy privacy and freedom
Bullshit. In a dictatorship, perhaps, but in a western civilisation, nope. We have only had this tech for what, 10, 15 years? Before then, way back to when I was born in the 80s, we had just as much freedom and privacy.
> How completely draconian. A list/algorithm, controlled by and mandated by the government. And could also be supplemented by mega corps. What a wonderful choke point for monitoring and controlling EVERYTHING. But I am sure they will pinky-promise it wou
Re: (Score:2)
Bullshit. In a dictatorship, perhaps, but in a western civilisation, nope. We have only had this tech for what, 10, 15 years? Before then, way back to when I was born in the 80s, we had just as much freedom and privacy.
No one was reading every piece of mail sent or monitoring every phone call everyone made back then. So yeah, you had quite a bit of freedom and privacy.
Re: (Score:2)
lol
You are forgetting the key driver here: something MUST be done.
I think a better law would be to hold parents and their agents responsible for not parenting.
lol, that ain't gonna fly. But remember, something MUST be done...
But I am sure they will pinky-promise it would never be abused by any existing or future government or third-party actors.
Your concerns are noted; however, something MUST be done.
You have proposed nothing to solve this issue and yet something MUST be done. Since you do not have better ideas, we will implement full monitoring.
(are you seeing how this works now? find a hot button issue and keep pressing and pressing until something breaks, then you take advantage of that break and continue pressing
Re: Wait until (Score:2)
OTR/PGP (Score:5, Interesting)
And how exactly are they going to scan OTR/PGP messages on open protocol services?
Asking for a friend.
Re: (Score:2)
They are not.
(26) [...] Having regard to the availability of technologies that can be used to meet the requirements of this Regulation whilst still allowing for end-to-end encryption, nothing in this Regulation should be interpreted as prohibiting, requiring to disable, or making end-to-end encryption impossible. Providers should remain free to offer services using end-to-end encryption and should not be obliged by this Regulation to decrypt data or create access to end-to-end encrypted data.[...]
Re: (Score:2)
>"They are not." [going to try to prevent encryption]
Right. Which means there is no point at all in the whole regulation. People will just shift to encrypting things and nothing will have changed on the "save the children" front. So why bother with any of this in the first place?
A "We want to examine all communication for child porn"
B "But you can't examine anything encrypted, and if you try to force backdoors into encryption, you WILL break ALL security. And if you try to outlaw encryption, you WILL
Re: (Score:2)
I suppose you could catch some "stupid" people who would continue to do illegal things (like child porn) out in the open.
Yes I think that's the point of the legislation, with "the open" now explicitly extended to sharing an image taken from a phone into a messaging application, for example, or attaching a clear jpg to an email.
But that would dry up pretty quickly.
I'm not sure of that. New idiots are born every year.
Re: (Score:2)
Re: (Score:3)
You are missing that this is a first step to then they will require more and more and at some time end-to-end encryption will simply get outlawed because it is only an additional small step. These people _know_ what they are currently asking for makes no sense. It is not their real goal however.
What I expect (Score:2)
Re: (Score:2)
Which means there is no point at all in the whole regulation. People will just shift to encrypting things and nothing will have changed on the "save the children" front. So why bother with any of this in the first place?
People won't shift. People don't care. Individual criminals will care and make the switch, showing that this won't have much of an impact in the way the EU is saying it will. But that's not the real goal, is it. It's just another underhanded way to get the general population to have open communications; the network effect will cause the general population to keep using WhatsApp / iMessage / whatever.
Are the cops rational and honest? (Score:3)
This is the hard question. They assert that they can use the material that they harvest to good effect to catch offenders. What's needed here is evidence that people are being caught by that method. Indeed, a sunset clause requiring the legislation to be passed again in five years' time, in light of evidence that it has (or has not) been successful, might be a useful compromise.
Re: (Score:2)
Aren't both essentially dead now?
PGP isn't even implemented in secure corporate environments anymore; they use VPNs and isolated private networks for secure email.
OTR was neat but it got replaced a long time ago.
Anyone using such old tech is not even a target.
Re: OTR/PGP (Score:2)
Images within images (Score:2)
Reminder: It is not so hard to hide encrypted information (say images) in the noise bits of legit images.
Re: (Score:2)
I suspect AI systems are going to severely blunt the usefulness of Steganographic message hiding.
Re: Images within images (Score:2)
Don't overestimate the capabilities of AI. A properly encoded and encrypted steganographic payload is indistinguishable from random noise, and AI isn't going to change that or help in any way with detection. While steganography can be defeated, for example, by forcibly applying filters that do lossy compression, such approaches are orthogonal to AI.
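A minimal sketch of the lossy-compression defence mentioned here, using Pillow with made-up filenames: re-encoding an uploaded image as JPEG rewrites the low-order "noise" bits, which is exactly where naive LSB steganography stores its payload.

from PIL import Image

def scrub(path_in: str, path_out: str, quality: int = 85) -> None:
    # Lossy re-encode: pixel values are approximated, so LSB-plane data does not survive.
    img = Image.open(path_in).convert("RGB")
    img.save(path_out, format="JPEG", quality=quality)

scrub("uploaded.png", "uploaded_scrubbed.jpg")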
Not that people here need this explained, but (Score:5, Insightful)
The fundamental issue is - whether European or American, these lawmakers fundamentally don't understand how encryption works. They really think they can have it both ways, and their spy agencies intentionally mislead them into thinking so because *they* want access to the messages.
One thing I'll say for the Chinese - they don't try to pretend. They just come right come out and say "all your base are belongs to us, suck it".
Re:Not that people here need this explained, but (Score:5, Insightful)
By now I am convinced they understand exactly how encryption works and they could not care less about CSAM. They want to spy on regular people easily and cheaply. This is the first step and if they manage to get it, there will be next steps. At the end of this, saying something bad about the assholes in power will get you sent to a concentration camp. We have seen this often enough by now. And, bonus!, the Germans have had two totalitarian regimes doing this to excess and their politicians and police still want that capability again and as soon as possible. Some people are just defective and a threat to everybody.
Politicians are actually quite smart (Score:2)
They simply don't trust the public.
Imagine the political ad: "Minister Kraus voted against the kiddy-diddling act. Minister Kraus is in favor of kiddie diddling. Vote (fascist party of the specific nation)!"
Yes, the nuances of laws like this can be explained. But the old political truism: "If you're explaining, you're losing" applies. A lot of voters don't want to hear the truth.
I know some people want to think it's all a big conspiracy to view other people's foot-fetishes, but usually the truth is a lot s
Re: (Score:2)
I suspect you haven't had opportunity to read the proposed legislation -- I fully understand you, Slashdot makes a point in hiding the interesting URLs, and then the legislation itself is 200+ pages. However, I would suggest you have a look at article 26 and 26a from here https://netzpolitik.org/wp-upl... [netzpolitik.org] (page 15 of the pdf).
As far as I understand (this is my rewrite, each article is half a page of legalese)
* If the person is not using a publicly-available service (e.g. the person is running a private SMT
Re: (Score:2)
The thing is that this legislation is not an end-goal, it is a stepping-stone. Having failed before, they first divided the way to the total surveillance state into steps and then, when they failed again, they made the steps smaller. But that the end-goal is a surveillance state, there really cannot be any doubt anymore. And if the steps are small enough, they may well succeed.
Re: (Score:2)
I suspect you haven't had opportunity to read the proposed legislation -- I fully understand you, Slashdot makes a point in hiding the interesting URLs, and then the legislation itself is 200+ pages.
i have tried to digest these 200 pages of asinine gibberish (at least diagonally) but this document seems specifically written to be as obscure and obfuscated as possible, i think it should be rejected simply for that reason.
my takeaway: it seems to gravitate around "detection orders", not indiscriminate verification. these "detection orders" would be issued by (i understand) judicial authorities to providers against specific services, maybe specific parts of services, maybe specific channels, groups or acco
Re: (Score:2)
You can find a study on the implementation in the Impact Assessment report. https://op.europa.eu/en/public... [europa.eu]
Again? (Score:2)
Re: (Score:2)
Exactly this. It is beyond stupid. I, a law-abiding citizen, am being spied on, while some drug lord will ask his nephew to go to the library, read a book about encryption and develop an app so that he can chat discreetly with other criminals. It's that simple.
Re: (Score:2)
Re: (Score:2)
I mean.... They taught me in first year CS and it was a mighty big class. And not the only such class on campus, nor the only uni in Canada.
Re:Again? (Score:5, Insightful)
I wish some would teach them that encryption = maths so can't be technically stopped.
You're thinking technically, not socially. End users do not use encryption, they use common widely available popular platforms like WhatsApp. You don't need to ban encryption to have a horrendously chilling effect on privacy, you just need to legislate that the likes of WhatsApp aren't allowed to use it. The effect is the same.
Re: (Score:2)
Given that, you could say the law is pointless and the legislators are stupid. However, that assumes they really think such a law will work. It could equally be that they just want an easier way to spy on people who are not breaking the law. Either case is not a good look.
Re: (Score:2)
Actually by affecting supply it gives people an "in" to the criminal underworld. You can see that in effect the past few years. While WhatsApp is encrypted many criminals didn't trust it. But criminals are criminals, not IT geniuses. They don't go out and roll their own service and products, they rely on someone to do that.
Oh look, a fancy new encrypted platform designed with the criminal in mind? Perfect! Let's use that. Oh whoops, turns out it was an FBI honeypot https://www.engadget.com/fbi-e... [engadget.com]
Criminals
Re: (Score:2)
>"End users do not use encryption, they use common widely available popular platforms like WhatsApp"
They don't *now* because they might not feel they need to. But once they put in systems that spy on everything, that might quickly change.
>"You don't need to ban encryption to have a horrendously chilling effect on privacy, you just need to legislate that the likes of WhatsApp aren't allowed to use it. The effect is the same."
That is true, but only for a limited time. It won't take long for people to
Re: (Score:2)
No they won't. People are inherently unconcerned with their privacy, and very concerned with the network effect. Apps like Signal exist. WhatsApp is the most popular messaging service in many European countries, but if you ask people if they trust Meta and if their conversations are truly private the majority of them will say "fuck no" followed by "I don't care".
We are doomed. Not only are we not as smart as you make out, we also simply don't give a damn as a society.
Balance (Score:5, Insightful)
I don't want to see a single child abused, but having them all grow up in a surveillance state is also abuse. It may not be to the same degree, but it'll affect a hell of a lot more of them.
Any politician who thinks "the government should know everything you do or say just in case maybe you are committing a crime" is acceptable should have all their communications on a live public feed. 24/7, including a body cam.
Re: (Score:2)
> having them all grow up in a surveillance state is also abuse
The solution, which is being implemented in the UK thankfully, is to take the smartphones away from the kids.
Vetted monitoring technology (Score:5, Insightful)
Cheap (Score:2)
Re: Cheap (Score:2)
ok, now it's clear (Score:4, Interesting)
Re: (Score:2)
It's forbidden so policing agencies aren't overwhelmed by false positives from simulated child rape. Also, so images of simulated child rape aren't in circulation where a disinterested person will see them and become traumatized/aroused by them.
Re:ok, now it's clear (Score:5, Insightful)
The reduction in actual kids abused and the dissemination of real CSAM would be more than worth the occasional person "accidentally" stumbling across it.
Of course, the biggest problem as always remains getting over the hill of uncoupling "this is bad" and "this should be illegal". I've met people who admit they straight up don't give a damn if more kids are actually abused as a result, there's no way they'd ever want perverts looking at it to go unpunished, because the law for them is a morality enforcement tool.
Re: (Score:2)
Drawings or fake images depicting CSAM are illegal to produce and are covered.
Thus if A.I. were to produce such images, the owners/developers would be committing a crime.
Re: (Score:2)
That's nonsense. It would be harder to produce images that even a computer couldn't determine were fake than just to make images that were provably fake. If AI generated CSAM was explicitly legal, everyone would want their supply from providers whose images could be proven fake (by algorithm), because that would eliminate the risk of the serious felonies associated with even simple possession.
hand drawn art is obviously not real. no need for authentication at all... and yet, such drawings have been semi-illegal for 30 some odd years now and explicitly "illegal" (how can something that is Constitutionally protected actually be illegal?) for about 20 years in numerous jurisdictions.
Re: (Score:2)
No, it's forbidden as no normal human should be looking at that stuff, and if A.I. were to create such images then it would just feed the beast.
Any image of CSAM, whether real, drawn, or described in a text file, is already illegal, as those thoughts and desires are ILLEGAL.
I have a proposal too: Outlaw all communication. (Score:5, Insightful)
Criminals communicate to commit their criminal acts. WE MUST STOP THEM FROM COMMUNICATING! Therefore, we must propose a law that no communication at all may happen without the direct presence of law enforcement agents. If you are locked in your home with another human being, you may not speak unless you invite a law enforcement agent into your home to supervise and make sure you do not, in fact, communicate about illegal or illicit activity. Don't mind the notes they jot down every time you call a politician a tosser or a cunt. Nothing to worry about there. Just crossing 't's and dotting 'i's.
I mean, come on, people! THINK OF THE CHILDREN!
That's how much fucking sense these god damned proposals/laws make. No, you, the government, anywhere, do *NOT* need full access to all forms of communication to do your fucking jobs. 100% safety is impossible, and gang-raping your entire citizenry in the name of an impossible to achieve goal is just gang-raping your entire citizenry. It is not, in fact, making anything safer or more secure. Much like most of the bullshit that happened in the States after 9/11. This is about making sure there are no secure channels of communication. Period. It has as much to do with CSAM as writing in a notebook. It's just a good pre-argument for anyone who may fight it. It's bullshit that we all see through. Yet they keep doing it. And somehow? We keep letting them.
Re: (Score:2)
Re: (Score:2)
They don't care about kiddie porn! (Score:2)
Totalitarianism ahead (Score:2)
Go ahead, scan me (Score:2)
0BE673DD614DE035E884DFA4EB6667CDBB8A73A781C26C1D34B1E0F7AF50D12B
Next week (Score:2)
Next week the geniuses at the EU are going to legislate on how many cooler days are allowed in June, on which days thunderstorms are acceptable in August and September, which beers are best, if you should scratch then yawn or yawn then scratch, and whether it's a crime to put on your left sock before your right sock or vice versa, or both. Arseholes all. Thank God for Brexit.
the rise of fascism (Score:2)
The fascists will tell you that terrorists and pedophiles are our biggest worry, so we better give up our rights and privacy, else we're as bad as the terrorists and/or pedos. It's a psychological trick to get the public to accept the centralized control over people's lives that is the pillar of a fascist government. But if anyone bothers to look at the numbers, these are never the top concerns in everyday life. And even if they were, there are better ways to deal with crime than creating a police state.
Re: (Score:2)
> these are never the top concerns in every day life
Er, I think you will find they indeed are.
They never give up (Score:2)
We've seen this before, of course. In the 90s, 00s, 10s, every few years. In different disguises, always for the reason-of-the-time (terrorism, drugs, child porn, etc.)
There are usually the same people behind it. One thing to learn from politicians: They never, ever, give up on a topic they want to push through. If resistance against it is strong enough to defeat them, they will lie low for a while, then try again. And again, and again.
If you keep trying, sooner or later you succeed.
Good (Score:2)
I switched sides.
I used to be all about GPG encryption etc. But I realised that's an old-fashioned idea from a rose-tinted-spectacles view of the older internet, back when it was exciting and fun and way more innocent, probably because social media hadn't been invented yet, so you had to have a website, registered in your name (well, until GeoCities perhaps), and it was smarter people, and younger, smarter people, using it.
Once social media and the general public came on board it was only a matter
Er, what? (Score:2)
News update: Proposal indefinitely withdrawn (Score:2)
The vote on this controversial topic has been indefinitely suspended:
https://stackdiary.com/eu-coun... [stackdiary.com]
Now, I'm looking forward /s to how Hungary's presidency, starting 1st of July, will go about this. Orban is a far-right reactionary populist who has already severely curbed the press, for example, and is best friends with Putin. His slogan for their presidency? "Make Europe Great Again". Ring any bells?
Re: (Score:2)
Quit letting assholes ...
Your post shows an amazing lack of reality. There's always someone telling you what to do. It's why we have elections: to get rid of the assholes that go too far. Unfortunately, equally often, voters get rid of the leader who declares "it's your fault" and that you (the voters) need to be less selfish and greedy.
Re: (Score:2)
So what do you do when you have two anarchic groups disagreeing with each other?
Well, let's look at the past...
They go to war. The winner takes control of the losers, turns them into slaves and starts enacting more control to make sure there are no dissidents.
Then you have governments form.
Been there, got the t-shirt