EU Encryption Privacy

EU Chat Control Law Proposes Scanning Your Messages - Even Encrypted Ones (theverge.com) 136

The European Union is getting closer to passing new rules that would mandate the bulk scanning of digital messages -- including encrypted ones. On Thursday, EU governments will adopt a position on the proposed legislation, which is aimed at detecting child sexual abuse material (CSAM). The vote will determine whether the proposal has enough support to move forward in the EU's law-making process. From a report: The law, first introduced in 2022, would implement an "upload moderation" system that scans all your digital messages, including shared images, videos, and links. Each service required to install this "vetted" monitoring technology must also ask permission to scan your messages. If you don't agree, you won't be able to share images or URLs.

As if this doesn't seem wild enough, the proposed legislation appears to endorse and reject end-to-end encryption at the same time. At first, it highlights how end-to-end encryption "is a necessary means of protecting fundamental rights" but then goes on to say that encrypted messaging services could "inadvertently become secure zones where child sexual abuse material can be shared or disseminated."

This discussion has been archived. No new comments can be posted.


  • It's bad, but (Score:4, Insightful)

    by DrMrLordX ( 559371 ) on Wednesday June 19, 2024 @03:45PM (#64561887)

    All it constrains are URLs and images. If you send raw text, the regulation doesn't seem to apply.

    Still kinda sucks that the EU is glomming onto CSAM purveyors as a way to break consumer-grade encryption. The CSAM peddlers will find ways around this, while regular people will risk undue scrutiny and violations of their privacy if they dare send encrypted images or HTML links.

    • Re: (Score:3, Insightful)

      by AmiMoJo ( 196126 )

      It is basically what Apple tried: a list of known illegal image and URL hashes. The image hashes are fuzzy, so they can detect things like resized or slightly modified images. All detection is done locally on the device, so end-to-end encryption is not affected.

      As Apple discovered, the technology doesn't work. It's not at all difficult to create images that trigger a hash collision, but which are entirely different to the illegal image they are trying to block.

      It's not clear if the EU wants them to revisit
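      [Editor's aside: the "fuzzy" hashing described above can be sketched in a few lines. This is a toy average hash (aHash), not Apple's actual NeuralHash, which is a neural network; the toy version downsamples to an 8x8 grid and thresholds each pixel at the mean, so a small brightness shift leaves the 64-bit hash untouched while a different image flips many bits.]

```python
# Toy "average hash" (aHash) -- illustration only, not NeuralHash.

def average_hash(pixels):
    """pixels: 8x8 list of grayscale values (0-255). Returns a 64-bit int."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# A fake "image": bright left half, dark right half.
img = [[200] * 4 + [50] * 4 for _ in range(8)]
# A slightly brightened copy -- a "modified" image.
img2 = [[min(255, p + 10) for p in row] for row in img]
# A completely different image (halves swapped).
img3 = [[50] * 4 + [200] * 4 for _ in range(8)]

h1, h2, h3 = average_hash(img), average_hash(img2), average_hash(img3)
print(hamming(h1, h2))  # 0: detected as "the same" image despite the edit
print(hamming(h1, h3))  # 64: every bit differs, a different image
```

      The collision problem mentioned above follows directly: any image that lands on the same bit pattern after downsampling matches, whether or not it looks anything like the target.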

      • Re:It's bad, but (Score:5, Insightful)

        by Pinky's Brain ( 1158667 ) on Wednesday June 19, 2024 @04:51PM (#64562093)

        Apple tried client side scanning, but stopped because it was a PR disaster. It's not about technical/economic problems; they've been doing it for mail for 5 years now, after all (and Microsoft/Google a lot longer than that). There are huge costs to the human review, but the big boys can pay for it.

        The Centre will simply force human review on the service providers. Send too many false positives to the Centre and you get fined, miss one of their test images in undercover tests and you get fined. Apple&Co can handle it themselves, smaller service providers will have to let Kutcher&Co handle it for them to keep costs under control. That much is clear, because it's the only way it can work.

        I prophesy that eventually they are going to hand out some filtering software, and for exceptional cases the software will report straight to the Centre without the service provider being allowed to look at the messages ... all for the children, of course. At least that is how I would do it if I were a politician who wanted totalitarian control over citizens on the down low.

        • Are Apple, Google and Microsoft offering any end to end encrypted email or other messaging service ?

        • by AmiMoJo ( 196126 )

          The mail scanning is different to the tech that Apple tried to apply to iPhones. They are just using a simple file hash for that.

          This will only apply to large companies. Small ones won't need to bother with it, won't need to buy in any software.

          It's still dumb, but it's not the apocalyptic disaster that some people are making out.

        • I prophesy that eventually they are going to hand out some filtering software and for exceptional cases the software will report straight to the Centre without the service provider being allowed to look at the messages ...

          That is because they don't actually care about CSAM (what a fucked up acronym) itself. They want to identify documents generated by the US government or catch communications to/by people who want to hurt the US government. The same applies to other countries and unions. They are doing self-protection under the guise of saving the children. How can they lose? How can normal citizens win?

          They can't. The fix is in. State level control over the tiniest aspects of our lives is on the plate in the future. I feel

    • so stupid (Score:5, Insightful)

      by Anonymous Coward on Wednesday June 19, 2024 @03:54PM (#64561909)

      Kiddie porn is well over 99% produced by and for teens sexting each other. The fact that this is not just illegal but a nuclear kind of illegal - like a web page can have a hidden pixel that loads such a photo, and your cops find it for some other reason and then your life is ruined - is ridiculous. There is no value whatsoever in having such laws for such photos. The way to stop it is to educate teens that sexting can lead to bullying and social embarrassment, and if it happens, so what. It's just a photo of an average-looking person; nobody is going to care. Kids do stupid shit all the time and it's going to be yesterday's news by the end of the school year.

      Second, the amount of actual kiddie porn is so small that the images that get traded around are specifically known to law enforcement, e-mail providers etc. Like, they show censored versions of the photos (or photos from the set that don't have nudity) to cops nationwide so they can recognize the photos when they see them. Isn't that a million times worse than just letting the photos languish in obscurity?

      Third, the idea that data is somehow tainted by where it came from doesn't apply anywhere else. The "Napalm Girl" photo from the Vietnam war won a freaking Pulitzer Prize. The data from Nazi scientists, or the Tuskegee syphilis study, is in journals and books. There are websites full of car crash and suicide photos. Yet when it comes to a teen sexting another teen, my god, the horror -- time to ruin some lives.

      • Re: (Score:2, Informative)

        by AmiMoJo ( 196126 )

        Third, the idea that data is somehow tainted by where it came from doesn't apply anywhere else. The "Napalm Girl" photo from the Vietnam war won a freaking Pulitzer Prize. The data from Nazi scientists, or the Tuskegee syphilis study, is in journals and books.

        The difference with those is that some good comes from their use. That photo helped end the war, illegal medical experiments help treat people who are suffering. Child porn has no beneficial use, if you don't accept the idea that access to it prevents paedophiles from harming other children.

      • Well, the napalm girl didn't show her skin - it was burned off…
    • Re:It's bad, but (Score:5, Insightful)

      by gweihir ( 88907 ) on Wednesday June 19, 2024 @04:25PM (#64562019)

      This idiotic crap has been going on for a few decades now in the EU and especially Germany (which has a history of two (!) totalitarian states doing this to its citizens). There are just some defectives that cannot stand people having secrets, and these make their way into power. Obviously, this will do nothing for the stated purpose, and I am pretty sure the assholes behind it _know_ that. They just want to read everybody's regular messages.

      • This idiotic crap has been going on for a few decades now in the EU and especially Germany (which has a history of two (!) totalitarian states doing this to its citizens). There are just some defectives that cannot stand people having secrets, and these make their way into power. Obviously, this will do nothing for the stated purpose, and I am pretty sure the assholes behind it _know_ that. They just want to read everybody's regular messages.

        More than likely what this really is is yet another way to shovel our data into some giant AI processing system. Because no human will be involved in looking through every message sent.

        On the bright side, at least the EU is making this a big public display, rather than trying to shovel it through secretly. Which makes me wonder how much shit HAS been shoveled through secretly.

      • This idiotic crap has been going on for a few decades now in the EU and especially Germany

        Germany is one of the few countries that definitively said they would vote against this law on privacy grounds alongside the Netherlands, Luxembourg, Poland and Austria. Your understanding of the EU is incredibly lacking.

      • by AmiMoJo ( 196126 )

        It's interesting that you mention Germany. Germany was one of those European countries where child porn was legal in some circumstances. There used to be magazines that published it. IIRC the rule was that the subject had to take the photo themselves, physically push the shutter button, because the law only made it illegal for other people to photograph children.

        The Netherlands was another one that had very lax laws, also around bestiality, so used to produce that stuff commercially. After the laws changed

      • They just want to read everybody's regular messages.

        And when they finally completely succeed, they will look around and cry and ask how they got into this position. The System is amoral, faceless, and can be used against you as surely as you use it against others. It will be a literal hell for everyone. I imagine religion will become mandatory.

    • by rsilvergun ( 571051 ) on Wednesday June 19, 2024 @04:48PM (#64562083)
      If they can see pictures and URLs in your encrypted messages then they've broken the encryption enough to get to the rest of it. It's naive to think you can have just some encryption.
      • by Cinder6 ( 894572 )

        The messaging client could detect URLs and especially images on-device, then send/flag those specific messages for review.

        (I'm not a proponent for this law in the least, just giving an example of how it could be done.)
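        [Editor's aside: a minimal, hypothetical sketch of that idea (not any real messenger's code). Before encrypting, the client inspects the plaintext for URLs or image attachments and flags only those messages for scanning; the regex, function name, and extension list below are all invented for illustration.]

```python
import re

# Hypothetical on-device check: flag a message for scanning if it
# contains a URL or an image attachment, before it is encrypted.
URL_RE = re.compile(r"https?://\S+", re.IGNORECASE)

def needs_scan(text, attachments=()):
    """Return True if the outgoing message would trigger 'upload moderation'."""
    has_url = bool(URL_RE.search(text))
    has_image = any(name.lower().endswith((".jpg", ".jpeg", ".png", ".gif"))
                    for name in attachments)
    return has_url or has_image

print(needs_scan("see you at 5"))                      # False: plain text passes
print(needs_scan("look at https://example.com"))       # True: URL detected
print(needs_scan("photo attached", ["IMG_0001.jpg"]))  # True: image detected
```

        The point is only that such a check can run client-side, before encryption, which is why it technically leaves end-to-end encryption intact while still exposing flagged content.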

    • Re: It's bad, but (Score:4, Insightful)

      by madbrain ( 11432 ) on Wednesday June 19, 2024 @05:16PM (#64562151) Homepage Journal

      The service can't be called end to end encrypted if someone other than you has the ability to decrypt it, even partially.

      • by gweihir ( 88907 )

        Oh, marketing will give you any lie you want to hear and then some. Kind of like some parts of law enforcement.

        • by madbrain ( 11432 )

          Yes. To be fair, I worked on S/MIME a few decades ago for Mozilla. It proved too difficult for mere mortals to deploy. I managed to set it up with family members, but the moment their certificates expired, it was game over, unfortunately. Trying to fix it from 6000 miles away was just not going to happen.
          PGP is not exactly simple to setup either, and has different trust issues than X.509-based systems.
          Unfortunately, the level of security seems to be inversely proportional to ease of use.
          All proper end-to-en

          • You don't need certificates for encryption, all you need is key exchange.

            • You don't need to tell me. S/MIME however does rely on certificates.
              And there was never a very good answer to public key discovery.
              LDAP works at the organization level, but isn't deployed on the public Internet.
              You can share your public key by signing an email, but that initial one won't be encrypted.
              Doing the key exchange online rather than offline is easier. But doing it securely is another matter.
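              [Editor's aside: key agreement without certificates is textbook Diffie-Hellman, sketched below over a toy Mersenne prime for brevity. Real systems use vetted groups (e.g. RFC 3526) or X25519 via an audited library, and crucially must authenticate the exchange: unauthenticated DH is exactly where a man-in-the-middle sits, which is the gap that certificates or out-of-band fingerprint checks close.]

```python
import secrets

# Textbook Diffie-Hellman over a toy prime -- illustration only.
p = 2**127 - 1   # a Mersenne prime; real groups are much larger and vetted
g = 3

a = secrets.randbelow(p - 2) + 1   # Alice's private value, never sent
b = secrets.randbelow(p - 2) + 1   # Bob's private value, never sent
A = pow(g, a, p)                   # Alice's public value, sent in the clear
B = pow(g, b, p)                   # Bob's public value, sent in the clear

k_alice = pow(B, a, p)             # Alice computes g^(ab) mod p
k_bob = pow(A, b, p)               # Bob computes the same value
print(k_alice == k_bob)            # True: shared secret, no certificates needed
```

              This shows why "all you need is key exchange" is true for secrecy but not for identity: nothing above tells Alice that B really came from Bob rather than an interceptor.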

              • LDAP works at the organization level, but isn't deployed on the public Internet.

                Whut!? LDAP is *not* encrypted. Why even mention this?

                  • First of all, LDAP is encrypted. It's just called LDAPS.

                  Second, in the context of S/MIME e-mail, LDAP provides a way for the sender to find the recipient's certificate, which includes the public key. It's a way to solve the key exchange problem.

                  That works great in a private organization, because all members/employees are registered in the same directory, where they can publish their certificates.

                  It doesn't work on the public internet, because there is no global LDAP server to do the same, and AFAIK no means t

                    • First of all, LDAP is encrypted. It's just called LDAPS.

                    Nope. It isn't. This is just LDAP over SSL - that's still plain text within the channel. So why even mention LDAP? LDAP doesn't even negotiate the channel!

                    LDAPS is a kludge put in place precisely *because* LDAP is plain-text garbage. Even then, it's still garbage. It's only encrypted inside the channel. It is still exposed at either end.

                    LDAP provides a way for the sender to find the recipient's certificate, which includes the public key.

                    OK. You have undermined your own claimed authority here in a way you simply cannot recover from. But thanks for the ... well, not exactly input but... something or other..

                    • I have no need for the approval of a stranger, much less someone who gets off on putting people down.

                      What I described are perfectly standard uses of S/MIME, LDAP and e-mail. And some organizations still deploy them that way in 2024. Your criticisms should be directed at the protocol authors. But before you do that, ask yourself why they would still be deployed, if they were proven to be insecure.

                      Your earlier responses show that you might have missed what problem was being solved in the first place, for whi

                    • by bn-7bc ( 909819 )
                      So what? It accomplishes the same end goal, i.e. a random bad actor on the wire has to work rather hard to get at the data being transmitted. It's of course important that TLS gets started as early as possible. As to which algos can be used for S/MIME, ehm, I'll have to read up on that, and of course if they are weak you are sort of screwed. What screwed S/MIME over, as has been indicated by others, was the need for every party to get certificates, and the setup not being single click, not to mention secure syncing of private
            • by gweihir ( 88907 )

              Well, if you do not care who the other person is, sure.

      • by Sloppy ( 14984 )

        "Can't?" What happens when you try to call it that? Facebook offers end-to-e~~NO CARRIER

      • Oh, it is end to end encrypted alright- just with a few more ends than you wanted to have.
    • Uhhh... A URL *is* raw text... Did we switch to binary urls and I didn't notice?
  • New hallmark (Score:5, Insightful)

    by VeryFluffyBunny ( 5037285 ) on Wednesday June 19, 2024 @03:54PM (#64561913)
    Whenever a new law is proposed containing the exact words or phrases to the effect of, "aimed at detecting child sexual abuse material," that's the hallmark of trying to shoehorn dodgy surveillance laws & leave journalists, lawyers, campaigners, etc., who are exercising their democratic & constitutional rights but doing things that certain rich & powerful people don't like, vulnerable to govt & by extension private corporate "security agencies." Trying to unionise? Trying to expose political &/or corporate wrongdoing? Campaigning to keep/establish universal healthcare? Reporting war crimes & crimes against humanity? Trying to change unfair & immoral laws? Beware because these security agencies stoop to all kinds of illegal activities to undermine you, all in the name of, "aimed at detecting child sexual abuse material."
  • by markdavis ( 642305 ) on Wednesday June 19, 2024 @04:05PM (#64561941)

    >"which is aimed at detecting child sexual abuse material (CSAM)"

    The *intent* ("aimed at" in this example) doesn't matter. It is what the *effect* will be, which is to destroy privacy and freedom. Throw in "save the children" and everything rational goes right out the window. I think a better law would be to hold parents and their agents responsible for not parenting.

    >"The law, first introduced in 2022, would implement an "upload moderation" system that scans all your digital messages,"

    How completely draconian. A list/algorithm, controlled by and mandated by the government. And could also be supplemented by mega corps. What a wonderful choke point for monitoring and controlling EVERYTHING. But I am sure they will pinky-promise it would never be abused by any existing or future government or third-party actors.

    >"Each service required to install this "vetted" monitoring technology must also ask permission to scan your messages. If you don't agree, you won't be able to share images or URLs."

    Oooh! We have a choice! We can agree or just not communicate. Such freedom and transparency!

    >"encrypted messaging services could "inadvertently become secure zones where child sexual abuse material can be shared or disseminated."

    You can have end-to-end encryption, which is actual privacy and security.... or not. There really is no in-between. CHOOSE. And please choose wisely.

    • > destroy privacy and freedom

      Bullshit. In a dictatorship, perhaps, but in a western civilisation, nope. We have only had this tech for what, 10, 15 years? Before then, way back to when I was born in the 80's, we had just as much freedom and privacy.

      > How completely draconian. A list/algorithm, controlled by and mandated by the government. And could also be supplemented by mega corps. What a wonderful choke point for monitoring and controlling EVERYTHING. But I am sure they will pinky-promise it wou

      • by Altus ( 1034 )

        Bullshit. In a dictatorship, perhaps, but in a western civilisation, nope. We have only had this tech for what, 10, 15 years? Before them, way back to when I was born in the 80's, we had just as much freedom and privacy.

        No one was reading every piece of mail sent or monitoring every phone call everyone made back then. So yeah, you had quite a bit of freedom and privacy.

    • lol

      You are forgetting the key driver here: something MUST be done.

      I think a better law would be to hold parents and their agents responsible for not parenting.

      lol, that ain't gonna fly. But remember, something MUST be done...

      But I am sure they will pinky-promise it would never be abused by any existing or future government or third-party actors.

      Your concerns are noted; however, something MUST be done.

      You have proposed nothing to solve this issue and yet something MUST be done. Since you do not have better ideas, we will implement full monitoring.

      (are you seeing how this works now? find a hot button issue and keep pressing and pressing until something breaks, then you take advantage of that break and continue pressing

  • OTR/PGP (Score:5, Interesting)

    by pitch2cv ( 1473939 ) on Wednesday June 19, 2024 @04:09PM (#64561965)

    And how exactly are they going to scan OTR/PGP messages on open protocol services?

    Asking for a friend.

    • They are not.

      (26) [...] Having regard to the availability of technologies that can be used to meet the requirements of this Regulation whilst still allowing for end-to-end encryption, nothing in this Regulation should be interpreted as prohibiting, requiring to disable, or making end-to-end encryption impossible. Providers should remain free to offer services using end-to-end encryption and should not be obliged by this Regulation to decrypt data or create access to end-to-end encrypted data.[...]

      • >"They are not." [going to try to prevent encryption]

        Right. Which means there is no point at all in the whole regulation. People will just shift to encrypting things and nothing will have changed on the "save the children" front. So why bother with any of this in the first place?

        A "We want to examine all communication for child porn"
        B "But you can't examine anything encrypted, and if you try to force backdoors into encryption, you WILL break ALL security. And if you try to outlaw encryption, you WILL

        • I suppose you could catch some "stupid" people who would continue to do illegal things (like child porn) out in the open.

          Yes I think that's the point of the legislation, with "the open" now explicitly extended to sharing an image taken from a phone into a messaging application, for example, or attaching a clear jpg to an email.

          But that would dry up pretty quickly.

          I'm not sure of that. New idiots are born every year.

        • There's a very large number of very stupid people out there, which is good in a way I guess. Facebook and Twitter are two of the largest submitters of CSAM to NCMEC in the world. You read all the time about people sharing that crap on Discord and other completely insecure messaging apps. It's never going to dry up.
        • by gweihir ( 88907 )

          You are missing that this is a first step. Then they will require more and more, and at some point end-to-end encryption will simply get outlawed, because it is only an additional small step. These people _know_ that what they are currently asking for makes no sense. It is not their real goal, however.

        • - It will catch many who express "unpatriotic" thoughts or feelings
          - It will catch whistle blowers for major corporations (and mysterious deaths will follow)
          - It will catch people calling out the criminal actions of high up politicians (and mysterious deaths will follow)
          - It will catch people exposing atrocities happening in their own country (and not so mysterious deaths will follow)
          - It will catch people assembling protests against injustice
          - It might catch a few people engaging in illicit activities
        • Which means there is no point at all in the whole regulation. People will just shift to encrypting things and nothing will have changed on the "save the children" front. So why bother with any of this in the first place?

          People won't shift. People don't care. Individual criminals will care and make the switch, showing that this won't have much of an impact in the way the EU is saying it will. But that's not the real goal, is it? It's just another underhanded way to get the general population to have open communications; the network effect will cause the general population to keep using whatsapp / imessage / whatever.

          • This is the hard question. They assert that they can use the material that they harvest to good effect to catch offenders. What's needed here is evidence that people are being caught by that method. Indeed a sunset clause requiring the legislation to be passed again in five years time, allowing evidence that it has been successful - or not, might be a useful compromise.

    • Aren't both essentially dead now?

      PGP isn't even implemented in secure corporate environments anymore; they use VPNs and isolated private networks for secure email.

      OTR was neat but it got replaced a long time ago.

      Anyone using such old tech is not even a target.

    • Oooh, sieg heil to your friend. So easy.
  • Reminder: It is not so hard to hide encrypted information (say images) in the noise bits of legit images.

    • I suspect AI systems are going to severely blunt the usefulness of Steganographic message hiding.

      • Don't overestimate the capabilities of AI. A properly encoded and encrypted steganographic payload is indistinguishable from random noise, and AI isn't going to change that or help in any way with detection. While steganography can be defeated, for example, by forcibly applying filters that do lossy compression, such approaches are orthogonal to AI.
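      [Editor's aside: the grandparent's "noise bits" trick is least-significant-bit steganography, sketched below on a fake grayscale image. With an encrypted payload the low bits are indistinguishable from sensor noise, which is the detection problem described above; lossy re-compression destroys them, which is the non-AI countermeasure.]

```python
# Toy LSB steganography: hide bytes in the lowest bit of each pixel value.

def embed(pixels, payload):
    """Write payload bits (LSB-first per byte) into the low bit of each pixel."""
    bits = [(byte >> i) & 1 for byte in payload for i in range(8)]
    assert len(bits) <= len(pixels), "cover image too small"
    out = list(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & ~1) | bit
    return out

def extract(pixels, nbytes):
    """Read nbytes back out of the low bits."""
    data = bytearray()
    for j in range(nbytes):
        byte = 0
        for i in range(8):
            byte |= (pixels[j * 8 + i] & 1) << i
        data.append(byte)
    return bytes(data)

cover = list(range(64))      # stand-in for 64 grayscale pixel values
stego = embed(cover, b"hi")  # 2 bytes = 16 pixels' low bits touched
print(extract(stego, 2))     # b'hi'
print(max(abs(x - y) for x, y in zip(cover, stego)))  # 1: only low bits changed
```

      A real tool would spread the bits pseudo-randomly across the image and encrypt the payload first, but the capacity math is the same: one hidden bit per cover pixel.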

  • by 93 Escort Wagon ( 326346 ) on Wednesday June 19, 2024 @04:13PM (#64561983)

    The fundamental issue is - whether European or American, these lawmakers fundamentally don't understand how encryption works. They really think they can have it both ways, and their spy agencies intentionally mislead them into thinking so because *they* want access to the messages.

    One thing I'll say for the Chinese - they don't try to pretend. They just come right out and say "all your base are belong to us, suck it".

    • by gweihir ( 88907 ) on Wednesday June 19, 2024 @04:31PM (#64562045)

      By now I am convinced they understand exactly how encryption works and they could not care less about CSAM. They want to spy on regular people easily and cheaply. This is the first step, and if they manage to get it, there will be next steps. At the end of this, saying something bad about the assholes in power will get you sent to a concentration camp. We have seen this often enough by now. And, bonus!, the Germans have had two totalitarian regimes doing this to excess and their politicians and police still want that capability again, as soon as possible. Some people are just defective and a threat to everybody.

    • They simply don't trust the public.

      Imagine the political ad: "Minister Kraus voted against the kiddy-diddling act. Minister Kraus is in favor of kiddie diddling. Vote (fascist party of the specific nation)!"

      Yes, the nuances of laws like this can be explained. But the old political truism: "If you're explaining, you're losing" applies. A lot of voters don't want to hear the truth.

      I know some people want to think it's all a big conspiracy to view other people's foot-fetishes, but usually the truth is a lot s

    • I suspect you haven't had opportunity to read the proposed legislation -- I fully understand you, Slashdot makes a point in hiding the interesting URLs, and then the legislation itself is 200+ pages. However, I would suggest you have a look at article 26 and 26a from here https://netzpolitik.org/wp-upl... [netzpolitik.org] (page 15 of the pdf).

      As far as I understand (this is my rewrite, each article is half a page of legalese)
      * If the person is not using a publicly-available service (e.g. the person is running a private SMT

      • by gweihir ( 88907 )

        The thing is that this legislation is not an end-goal, it is a stepping-stone. As they failed before, they first divided the way to the total surveillance state into steps, and then, as they failed again, they made the steps smaller. But that the end-goal is that surveillance state, there really cannot be any doubt anymore. And if the steps are small enough, they may well succeed.

      • by znrt ( 2424692 )

        I suspect you haven't had opportunity to read the proposed legislation -- I fully understand you, Slashdot makes a point in hiding the interesting URLs, and then the legislation itself is 200+ pages.

        i have tried to digest these 200 pages of asinine gibberish (at least diagonally) but this document seems specifically written to be as obscure and obfuscated as possible, i think it should be rejected simply for that reason.

        my takeaway: it seems to gravitate around "detection orders", not indiscriminate verification. these "detection orders" would be issued by (i understand) judicial authorities to providers against specific services, maybe specific parts of services, maybe specific channels, groups or acco

  • I wish someone would teach them that encryption = maths, so it can't be technically stopped. So all you are doing is making the use of encryption illegal for people already committing crimes. If you are committing crimes, then sending encrypted messages would seem to be like issuing speeding tickets to people doing drive-by shootings. They really can't see the big picture.
    • Exactly this. It is beyond stupid. Me, a law-abiding citizen, am being spied on while some druglord will ask his nephew to go to the library, read a book about encryption and develop an app so that he can chat discreetly with other criminals. It's that simple.

      • by Rujiel ( 1632063 )
        I appreciate your point but "roll your own encryption" is really not a thing in practice, and if it was it would probably be easy for pros to tear apart
        • I mean.... They taught me in first year CS and it was a mighty big class. And not the only such class on campus, nor the only uni in Canada.

    • Re:Again? (Score:5, Insightful)

      by thegarbz ( 1787294 ) on Wednesday June 19, 2024 @05:05PM (#64562129)

      I wish someone would teach them that encryption = maths, so it can't be technically stopped.

      You're thinking technically, not socially. End users do not use encryption, they use common widely available popular platforms like WhatsApp. You don't need to ban encryption to have a horrendously chilling effect on privacy, you just need to legislate that the likes of WhatsApp aren't allowed to use it. The effect is the same.

      • by ukoda ( 537183 )
        True, but that statement should really be qualified by saying it only applies to law-abiding citizens. The effect is not the same for criminals, who will simply switch to using, or creating, illegal apps and continue business as usual.

        Given that, you could say the law is pointless and the legislators are stupid. However, that assumes they really think such a law will work. It could equally be that they just want an easier way to spy on people who are not breaking the law. Either case is not a good look.
        • Actually by affecting supply it gives people an "in" to the criminal underworld. You can see that in effect the past few years. While WhatsApp is encrypted many criminals didn't trust it. But criminals are criminals, not IT geniuses. They don't go out and roll their own service and products, they rely on someone to do that.

          Oh look, a fancy new encrypted platform designed with the criminal in mind? Perfect! Let's use that. Oh whoops, turns out it was an FBI honeypot https://www.engadget.com/fbi-e... [engadget.com]

          Criminals

      • >"End users do not use encryption, they use common widely available popular platforms like WhatsApp"

        They don't *now* because they might not feel they need to. But once they put in systems that spy on everything, that might quickly change.

        >"You don't need to ban encryption to have a horrendously chilling effect on privacy, you just need to legislate that the likes of WhatsApp aren't allowed to use it. The effect is the same."

        That is true, but only for a limited time. It won't take long for people to

        • No they won't. People are inherently unconcerned with their privacy, and very concerned with the network effect. Apps like Signal exist. WhatsApp is the most popular messaging service in many European countries, but if you ask people if they trust Meta and if their conversations are truly private the majority of them will say "fuck no" followed by "I don't care".

          We are doomed. Not only are we not as smart as you make out, we also simply don't give a damn as a society.

  • Balance (Score:5, Insightful)

    by Baron_Yam ( 643147 ) on Wednesday June 19, 2024 @04:26PM (#64562023)

    I don't want to see a single child abused, but having them all grow up in a surveillance state is also abuse. It may not be to the same degree, but it'll affect a hell of a lot more of them.

    Any politician who thinks "the government should know everything you do or say just in case maybe you are committing a crime" is acceptable should have all their communications on a live public feed. 24/7, including a body cam.

    • > having them all grow up in a surveillance state is also abuse

      The solution, which is being implemented in the UK thankfully, is to take the smartphones away from the kids.

  • by sinij ( 911942 ) on Wednesday June 19, 2024 @04:27PM (#64562029)
    Call things by a proper name - a mandatory backdoor for intelligence and law enforcement dragnets.
  • Cheaper to do it themselves than buy the data from the NSA?
  • ok, now it's clear (Score:4, Interesting)

    by jjaa ( 2041170 ) on Wednesday June 19, 2024 @04:38PM (#64562069)
    The reason why AI generated kid porn is still forbidden is because then no one would ever need to harm real babies, hence no reason to snoop in on all communications under blanket pretence of CSAM
    • ... is still forbidden ...

      It's forbidden so policing agencies aren't overwhelmed by false positives from simulated child rape. Also, so images of simulated child rape aren't in circulation where a disinterested person will see them and become traumatized/aroused by them.

      • by fafalone ( 633739 ) on Wednesday June 19, 2024 @07:39PM (#64562403)
        That's nonsense. It would be harder to produce images that even a computer couldn't determine were fake than just to make images that were provably fake. If AI generated CSAM was explicitly legal, everyone would want their supply from providers whose images could be proven fake (by algorithm), because that would eliminate the risk of the serious felonies associated with even simple possession. A legit market for provably fake stuff would then allow a presumption of legitimacy; anything *not* cryptographically established as fake could be presumed real, with a high burden to prove otherwise. Remember, "reasonable doubt" isn't "technically possible regardless of odds".
        The reduction in actual kids abused and the dissemination of real CSAM would be more than worth the occasional person "accidentally" stumbling across it.
        Of course, the biggest problem as always remains getting over the hill of uncoupling "this is bad" and "this should be illegal". I've met people who admit they straight up don't give a damn if more kids are actually abused as a result, there's no way they'd ever want perverts looking at it to go unpunished, because the law for them is a morality enforcement tool.
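        The "provably fake (by algorithm)" idea above amounts to the generator cryptographically tagging its output so anyone can later verify provenance. A minimal sketch of the concept, with everything hypothetical: a real scheme would use a public-key signature (e.g. Ed25519) so verification needs no secret, but stdlib HMAC stands in here to show the tag/verify shape.

        ```python
        import hmac
        import hashlib

        # Hypothetical generator key; a real provenance scheme would use an
        # asymmetric keypair so third parties can verify without this secret.
        GENERATOR_KEY = b"demo-key-for-illustration-only"

        def tag_output(image_bytes: bytes) -> str:
            """Produce a provenance tag binding the bytes to the generator."""
            return hmac.new(GENERATOR_KEY, image_bytes, hashlib.sha256).hexdigest()

        def verify_output(image_bytes: bytes, tag: str) -> bool:
            """Check that the bytes are the exact output the generator tagged."""
            return hmac.compare_digest(tag_output(image_bytes), tag)

        img = b"\x89PNG...synthetic image data..."
        tag = tag_output(img)

        print(verify_output(img, tag))                  # untampered: verifies
        print(verify_output(img + b"edited", tag))      # any alteration fails
        ```

        The point of the comment's argument is the asymmetry this creates: tagged output is cheaply provable as synthetic, so anything lacking a valid tag loses the presumption of being fake.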
        • Drawings or fake images depicting CSAM are illegal to produce and are covered.

          Thus if A.I. were to produce such images, the owners/developers would be committing a crime.

        • That's nonsense. It would be harder to produce images that even a computer couldn't determine were fake than just to make images that were provably fake. If AI generated CSAM was explicitly legal, everyone would want their supply from providers whose images could be proven fake (by algorithm), because that would eliminate the risk of the serious felonies associated with even simple possession.

          hand drawn art is obviously not real. no need for authentication at all... and yet, such drawings have been semi-illegal for 30 some odd years now and explicitly "illegal" (how can something that is Constitutionally protected actually be illegal?) for about 20 years in numerous jurisdictions.

      • No, it's forbidden as no normal human should be looking at that stuff, and if A.I. were to create such images then it would just feed the beast.

        Any image, real, drawn, described in a text file of CSAM is already illegal as those thoughts and desires are ILLEGAL.

  • by nightflameauto ( 6607976 ) on Wednesday June 19, 2024 @04:49PM (#64562087)

    Criminals communicate to commit their criminal acts. WE MUST STOP THEM FROM COMMUNICATING! Therefore, we must propose a law that no communication at all may happen without the direct presence of law enforcement agents. If you are locked in your home with another human being, you may not speak unless you invite a law enforcement agent into your home to supervise and make sure you do not, in fact, communicate about illegal or illicit activity. Don't mind the notes they jot down every time you call a politician a tosser or a cunt. Nothing to worry about there. Just crossing 't's and dotting 'i's.

    I mean, come on people! THINK OF THE CHILDREN!

    That's how much fucking sense these god damned proposals/laws make. No, you, the government, anywhere, do *NOT* need full access to all forms of communication to do your fucking jobs. 100% safety is impossible, and gang-raping your entire citizenship in the name of an impossible to achieve goal is just gang-raping your entire citizenship. It is not, in fact, making anything safer or more secure. Much like most of the bullshit that happened in the States after 9/11. This is about making sure there are no secure channels of communication. Period. It has as much to do with CSAM as writing in a notebook. It's just a good pre-argument for anyone who may fight it. It's bullshit that we all see through. Yet they keep doing it. And somehow? We keep letting them.

    • I bet real organized criminals already know the internet is not secure, and they just meet up in person somewhere noisy to communicate, leaving all their electronic gadgets at home. If a dummy like me can figure this out, I'm sure organized criminals already have.
  • The EU wants to read all communications between their citizens!
  • This is like the 4th anti-CSAM policy/legislative bill to appear in the last 10 years. The UK, EU and USA of the 21st century repeatedly demand everyone else prove their innocence.
  • 0BE673DD614DE035E884DFA4EB6667CDBB8A73A781C26C1D34B1E0F7AF50D12B

  • Next week the geniuses at the EU are going to legislate on how many cooler days are allowed in June, on which days thunderstorms are acceptable in August and September, which beers are best, if you should scratch then yawn or yawn then scratch, and whether it's a crime to put on your left sock before your right sock or vice versa, or both. Arseholes all. Thank God for Brexit.

  • The fascists will tell you that terrorists and pedophiles are our biggest worry. So we better give up our rights and privacy, else we're as bad as the terrorists and/or pedos. It's a psychological trick to get the public to accept the centralized control over people's lives that is the pillar of a fascist government. But if anyone bothers to look at the numbers, these are never the top concerns in everyday life. And even if they were, there are better ways to deal with crime than creating a police state.

  • We've seen this before, of course. In the 90s, 00s, 10s, every few years. In different disguises, always for the reason-of-the-time (terrorism, drugs, child porn, etc.)

    There are usually the same people behind it. One thing to learn from politicians: They never, ever, give up on a topic they want to push through. If resistance against it is strong enough to defeat them, they will lie low for a while, then try again. And again, and again.

    If you keep trying, sooner or later you succeed.

  • I switched sides.

    I used to be all about GPG encryption etc. But I realised that's an old-fashioned idea from a rose-tinted-spectacle view of the older internet, back when it was exciting and fun and way more innocent, probably because social media hadn't been invented yet, so you had to have a website registered in your name (well, until GeoCities perhaps), and it was smarter people, and younger, smarter people, using it.

    Once social media and the general public came on board it was only a matter

  • I thought that only evil and smelly Americans took such panopticon state measures.
  • The vote on this controversial topic has been indefinitely suspended:
    https://stackdiary.com/eu-coun... [stackdiary.com]

    Now, I'm looking forward /s to seeing how Hungary's presidency, starting 1st of July, will go about this. Orban is a far-right reactionary populist who has already severely curbed the press, for example, and is best friends with Putin. His slogan for their presidency? "Make Europe Great Again" Ring any bells?
