
UK Could Force E2E Encrypted Platforms To Do CSAM-Scanning (techcrunch.com) 106

The U.K. government has tabled an amendment (PDF) to the Online Safety Bill that could put it on a collision course with end-to-end encryption. TechCrunch reports: It's proposing to give the incoming internet regulator, Ofcom, new powers to force messaging platforms and other types of online services to implement content-scanning technologies, even if their platform is strongly encrypted -- meaning the service/company itself does not hold keys to decrypt and access user-generated content in the clear. The home secretary, Priti Patel, said today that the government wants the bill to have greater powers to tackle child sexual abuse.

"Child sexual abuse is a sickening crime. We must all work to ensure criminals are not allowed to run rampant online and technology companies must play their part and take responsibility for keeping our children safe," she said in a statement -- which also offers the (unsubstantiated) claim that: "Privacy and security are not mutually exclusive -- we need both, and we can have both and that is what this amendment delivers." The proposed amendment is also being targeted at terrorism content -- with the tabled clause referring to: "Notices to deal with terrorism content or CSEA [child sexual exploitation & abuse] content (or both)."

These notices would allow Ofcom to order a regulated service to use "accredited" technology to identify CSEA or terrorism content which is being publicly shared on their platform and "swiftly" remove it. But the proposed amendment goes further -- also allowing Ofcom to mandate that regulated services use accredited technical means to prevent users from encountering these types of (illegal) content -- whether it's being shared publicly or privately via the service, raising questions over what the power might mean for E2E encryption.

  • SAVE THE CHILDREN (Score:5, Insightful)

    by AutodidactLabrat ( 3506801 ) on Wednesday July 06, 2022 @09:08PM (#62679970)
    The inevitable cry of the Tyrant as the Camel's nose strips all rights away from the fools
    • by beepsky ( 6008348 ) on Wednesday July 06, 2022 @09:43PM (#62680014)
      Exactly. "Think of the children" is code for "I'm going to rape your human rights and get away with it by pretending it's about kids".
      • When the people of the UK gave the Tories a huge majority at the last election, they were well aware that Boris Johnson has been sacked from every job he has ever had for lying.
        They were also well aware that the Tories are sleazy and corrupt but they elected them anyway.
        The people of the UK (by which I mean England, because the Scots, the Welsh and the Northern Irish don't vote Conservative) have got exactly what they wanted.
        This sort of stuff is small beer for the Tories.
        • The problem with these laws is that they're forever. They'll be there long after Boris is dead and buried.

          • by gweihir ( 88907 )

            The problem with these laws is that they're forever. They'll be there long after Boris is dead and buried.

            Indeed. At least until the inevitable catastrophe at the end of every authoritarian regime. But as, for example, North Korea shows, it can take a long, long time until that collapse.

        • by gweihir ( 88907 )

          Well. I really do not know what is wrong with them, but it seems many conservatives want to vote themselves into an authoritarian hell. Of course even a brief look into human history conclusively demonstrates that this cannot go well, but somehow these people are unaware of that. If all the others were not in the picture, I would say let them do it to themselves.

      • Exactly. "Think of the children" is code for "I'm going to rape your human rights and get away with it by pretending it's about kids".

        Yep.

        The UK is already one huge honeypot for anybody searching for child porn. The police will be knocking on your door after just about anything related to searches for child porn or visits to certain web sites.

        That, combined with lists of contacts of the people they catch, ought to be enough.

        It's not what this is about though.

      • by UnknownSoldier ( 67820 ) on Thursday July 07, 2022 @09:38AM (#62681112)

        This is exactly why I hate modern politics. Almost every issue becomes trivialized with sound bites and fallacies of duality or false equivalence.

        * Fallacy of Duality: If you are against X then you must be for ~X.
        * False Equivalence: If you are for X then you must be for Y.

        e.g.

        Pragmatic Pete: Encryption is important for privacy such as banking.
        Asshole Alice: But think of the children! What are you trying to hide? Are you Pedophile Pete?
        Pragmatic Pete: I am FOR Encryption and AGAINST Pedophilia. You are trying to imply that being FOR Encryption means being FOR Pedophilia -- it does not.
        Asshole Alice: Listen up everyone, Pete wants privacy to send porn!
        Pragmatic Pete: **SIGH** (When did they stop teaching critical thinking in school? Did they ever start?)
        Pragmatic Pete: Without encryption hackers could listen in and steal your money. If you are AGAINST encryption does that mean you are FOR HACKERS? Listen up everyone, Alice wants hackers to drain everyone's bank accounts!

        ---
        Redditard: An asshole who downvotes someone simply for asking a question.

        • Fallacy of Duality: If you are against X then you must be for ~X.

          That's not a fallacy -- that's a tautology.

          • by xalqor ( 6762950 )

            No, a tautology would be something like "either you're X or you're not X", because that would be true for anything.

            The fallacy is saying "you're either for X or against X" because it's not true for everything. The world of "not for X" can include people who are "against X", but it can also include people who don't care, and people who are "for X" in general but feel very strongly that the proponents of X are going about it the wrong way and need to be opposed, because how we do things matters.

            In this case

    • by sg_oneill ( 159032 ) on Thursday July 07, 2022 @02:15AM (#62680376)

      It's always the same. Abuses of freedoms get prototyped on those who can't be defended, perfected, then let loose on the rest of us.

      Nobody wants to be THAT GUY who suggests actions that harm kiddy fiddlers are wrong. It's political suicide. Throw in "Terrorists" and it's even harder.

      The problem is, two steps removed from "Terrorism" is legitimate dissent.
      It's Terrorists (guys who blow shit up), then "Violent protesters" (rioters), then "Illegal protesters" (i.e. people who don't get permission to protest), then "protesters", and then "the opposition party", and finally everyone, and now we're all under the eye of Sauron.

      I'm happy if they throw the child molesters and the jihadis into the same room and let them hack each other to death. But that's not this. We are told it's all good, and we should trust the government to be responsible about reading our emails, and every time I see a low-level minister retort against some news story by practically doxxing some poor guy who dared to speak up, I'm reminded that no, in fact we should not trust the government.

      I refuse to trust the government.

      Because the government refuses to trust us.

      • by sg_oneill ( 159032 ) on Thursday July 07, 2022 @02:19AM (#62680388)

        The incident of doxxing I'm referring to was here in Australia, where we had a story about a woman who had received some abysmal treatment by the government's welfare program. So the government decided to respond by giving her file to a dodgy reporter. If anyone else in the government did that, it'd have the anti-corruption commission kicking in the door for a massive breach of privacy laws. But if a minister wants to do that? It's all fine and legal. This happened around the time the govt here was promising us it could be trusted with our health data in a new scheme designed to centralise medical records. An hour later I logged in and withdrew my consent from the system's website. Because fuck that.

    • Rather than the children, Boris is currently fighting a losing battle trying to save his government. I doubt this will go anywhere this time as a result since he currently does not even command a majority amongst his own MPs, let alone in parliament.
    • Agreed, this opens the door to all of your data being scanned; soon they will be searching houses.
  • by ewibble ( 1655195 ) on Wednesday July 06, 2022 @09:23PM (#62679986)

    Privacy and security are not mutually exclusive

    I assume they will be releasing all MI5 and MI6 files for public inspection after that, just after they give out all their bank passwords.

    • by dgatwood ( 11270 )

      Privacy and security are not mutually exclusive

      I assume they will be releasing all MI5 and MI6 files for public inspection after that, just after they give out all their bank passwords.

      My guess on the first one? Shortly after China uncovers their identities through these back doors that they insisted on adding.

  • Then, they came for "terrorism." Next, they'll come for "hate speech."

    See how this works?

    • by viperidaenz ( 2515578 ) on Wednesday July 06, 2022 @11:07PM (#62680146)

      Isn't terrorism anything the government says it is?

      • by Alypius ( 3606369 ) on Thursday July 07, 2022 @12:57AM (#62680256)
        Yup. Just like "hate speech" is "speech I don't like."
        • Nope
          Hate speech is terroristic threats.
          I don't like your kind is not hate speech.
          Kill all the wogs is hate speech
          See how easy?
          • You obviously haven't been paying attention the last few years.
            • I've not been plugged into the eternal whinefest of White Privileged terrorists upset that threats are not treated as free speech...because they're not going to be in a working state
              Get your head out of Bowtie Carlson's aperture and learn something about free speech, its limits and functions.
              If instead you are whining that tech companies are being responsible and deleting contrafactual LIES, remember Free Speech does not apply on someone else's property.
      • by gweihir ( 88907 )

        The original definition of "terrorism" is a form of government that governs by fear. Seems to be applicable in the UK if not full-blown yet.

    • Comment removed based on user account deletion
      • Terrorism is relatively new in the US, it's what most of us British Gen Xers and older grew up with in the UK.

        False, as always. Terrorism against THE MINORITY view, religion, race or sex is as old as the U.S. Lynching was formally legal for more than 160 years in the U.S., involving black men, women and occasionally Jews. Atheists have always been fair game in the U.S.A.

        Hate speech has been subject to numerous laws, including attempts to outlaw the stirring of racial hatred by Margaret Thatcher in 1986, a more general hate speech law in 1994 from John Major that doesn't specify protected categories, a religious hatred law in 2005/2006 from Tony Blair, and in 2008 protections for sexual minorities from Gordon Brown.

        Good. Speech intended to cause violence most certainly MUST be outlawed.
        None of those laws you mentioned ever forbade the declaration of opinion "I don't like THOSE people"
        It merely forbade declaring falsely against the rights of "THOSE people".

  • It is incredibly easy to write and deploy your own E2E encrypted communication platform these days. Second-year software development students get this sort of assignment as homework all the time. Do you think that dedicated criminals won't find a scrawny teenager or a rogue pro dev to build their own chat & share service for them? In the end, it will only hurt genuine users and make criminals laugh louder.

  • If the program is hosted outside GB, what can they do?

    • If the program is hosted outside GB, what can they do?

      Nothing, but that's not the point. The point is that they want mass-access to as much as possible.

      PS: Terrorists can use code words over open communications channels ("The teddy bear is in the woods" doesn't need encryption)

  • Funny how that same sentence worked for Epstein and his customers.

  • They want to censor and filter private messages, not just public content?

  • Sure would have been nice if the folks that designed networks, and subsequently the internet, had thought up a distributed network where anyone could setup a server and communicate with anyone else on said network, without using a central provider.

    --
    The Internet lives where anyone can access it. - Vint Cerf

  • I remember bureaucratic requirements mandating everything be scanned by their designated Windows virus scanner. The fact that we were using a Linux system meant nothing to them, so we scanned. It made them happy and we got on with our work.

    I'm sure some organizations will dutifully scan their encrypted traffic, the fact that the foregone conclusion is "nothing found" will be lost in the little check box.

    To be fair, successfully scanning the content of encrypted traffic has been done before; it's called bad crypto.

    • I don't support doing this. I think it's a bad idea.

      That said, it occurs to me that it's actually not hard for a service whose app does end-to-end encryption to scan anyway. Just have the app do the check before encrypting.

      If you're checking against a huge database, have the app compute a secure hash (or hash like thing) and query that. It reveals nothing if the content isn't already in the CP database.

      Again, I'm not saying this SHOULD be done. Just that it's not impossible as some would claim.
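      A minimal sketch of the check-before-encrypting idea (everything here is hypothetical: the function name and the local blocklist stand in for a provider-maintained hash database):

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests of known-bad content.
# A real deployment would query a remote, provider-maintained
# database rather than hold a local set like this.
BLOCKED_HASHES = {hashlib.sha256(b"known-bad-example").hexdigest()}

def ok_to_encrypt_and_send(payload: bytes) -> bool:
    """Hash the plaintext and check it BEFORE encryption.

    Only the hash is consulted, so content that is not already in
    the database reveals nothing to the checker.
    """
    return hashlib.sha256(payload).hexdigest() not in BLOCKED_HASHES
```

      Note that an exact hash like SHA-256 matches only byte-identical files; trivial re-encoding defeats it, which is why deployed scanners use perceptual hashes instead.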


      • IIRC, that's how Apple's implementation works. (iOS 15?) Very very few have access to the actual CSAM database; everything is matched by hashes. It still got Apple in hot water by privacy folks and it's been speculated that this is a result of LE being frustrated by encryption.
        • by cbm64 ( 9558787 )

          IIRC, that's how Apple's implementation works. (iOS 15?) Very very few have access to the actual CSAM database; everything is matched by hashes. It still got Apple in hot water by privacy folks and it's been speculated that this is a result of LE being frustrated by encryption.

          The reason Apple got in hot water with privacy advocates is that now you have a generic content scanning mechanism running on your device and reporting back to someone. They can claim it is only used for CSAM, but users can't know if Apple have included search for government-mandated material as well. Apple has shown in China that they do what authorities tell them to, and they state this too as official policy - that they follow "the law" in whatever country they operate in.

          • by dgatwood ( 11270 )

            IIRC, that's how Apple's implementation works. (iOS 15?) Very very few have access to the actual CSAM database; everything is matched by hashes. It still got Apple in hot water by privacy folks and it's been speculated that this is a result of LE being frustrated by encryption.

            The reason Apple got in hot water with privacy advocates is that now you have a generic content scanning mechanism running on your device and reporting back to someone. They can claim it is only used for CSAM, but users can't know if Apple have included search for government-mandated material as well. Apple has shown in China that they do what authorities tell them to, and they state this too as official policy - that they follow "the law" in whatever country they operate in.

            It's actually worse than that. It is possible to create adversarial images that cause neural networks to see things that no human would perceive as being there. Based on that, it is also presumably possible to create images that look normal to a human reviewer, but that would secondarily cause a neural network to throw a false match when it sees some other non-CSAM image that government spooks want to know about.

            And even if you try to prevent that by requiring a human being in the loop to verify that the
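            To make that concrete, here is a toy second-preimage attack on a simple difference-hash (dHash) style perceptual hash -- not NeuralHash or PhotoDNA, just an illustration of how little it takes to force a match when each hash bit depends on one local pixel comparison (gradient-based attacks on neural matchers are the real-world analogue):

```python
def dhash(pixels):
    """Toy difference hash: one bit per horizontally adjacent pixel pair."""
    return [1 if left < right else 0
            for row in pixels
            for left, right in zip(row, row[1:])]

def force_match(pixels, target_bits):
    """Greedily nudge pixel values so that dhash(result) == target_bits.

    Processing left to right, fixing bit c only ever touches the pixel
    to its right, so already-fixed bits stay fixed. This is a toy: the
    nudges here can be large, whereas a careful attacker would keep
    them visually imperceptible.
    """
    out = [row[:] for row in pixels]
    i = 0
    for row in out:
        for c in range(len(row) - 1):
            want = target_bits[i]
            have = 1 if row[c] < row[c + 1] else 0
            if have != want:
                # make the comparison come out the way the target wants
                row[c + 1] = row[c] + 1 if want else row[c]
            i += 1
    return out
```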

            • by cbm64 ( 9558787 )
              Interesting point. I saw a 'hacking ML' presentation once where they demonstrated how two photos completely identical to the human eye (much enlarged with high resolution) were recognized as two completely different things by the algorithm/dataset (I believe it was Google-based).
      • On an iPhone, this is practical. They have desktop-class CPUs and GPUs with a shitload of ASICs too.

        Try to run that kind of AI model on a $75 Android handset? Excuse me while I laugh you out of the room.
        • What are you smoking? Apple's own DESKTOPS don't use desktop-class CPUs and GPUs; all of the M1 and M2 stuff is mobile-class by design, and iPhones are far from powerhouses. Not that it matters, though, because hashing an image does not need a lot of compute power, nor does it need AI in any way. Slower phones would just take a few extra seconds before you could send that picture while the hash runs.

      • It's not correct to say this reveals nothing. It creates a database of unique hashes tied to specific accounts and timestamps. This metadata can be used to analyze and track data possession and movements between users. AV companies sell this metadata to governments today. If the government wants to know who owned a specific file and when, they can tie a hash to an IP (or account identifier) and timestamp and go knock down doors. The hashes don't need to be CSAM related.

        This can also work years later. They

        • > It creates a database of unique hashes tied to specific accounts and timestamps.

          Querying a database does not necessarily involve a user ID or create a log. You're making up a completely different system, then describing the attributes of what YOU designed, rather than addressing what I'm talking about.
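          For what it's worth, there is a well-known design aimed at exactly this: hash-prefix lookups in the spirit of Google's Safe Browsing protocol, where the client sends only a few bytes of the hash, so the server cannot tell which of the many full hashes sharing that prefix the client actually holds. A rough sketch (the blocklist and names are invented for illustration):

```python
import hashlib

# Hypothetical server-side data: full hashes of blocked content,
# plus an index of their first 4 bytes.
BLOCKED = {hashlib.sha256(b"blocked-example").digest()}
PREFIX_INDEX = {h[:4] for h in BLOCKED}

def server_lookup(prefix: bytes) -> set:
    """Server returns every full hash matching the 4-byte prefix."""
    return {h for h in BLOCKED if h.startswith(prefix)}

def client_check(payload: bytes) -> bool:
    """True if payload is blocked; only a 4-byte prefix leaves the client."""
    full = hashlib.sha256(payload).digest()
    if full[:4] not in PREFIX_INDEX:   # local prefix list, no round trip
        return False
    candidates = server_lookup(full[:4])
    return full in candidates          # final comparison stays client-side
```

          Whether the query creates a log is a deployment choice, not a property of the protocol, which is really the crux of the disagreement above.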

      • by Sloppy ( 14984 )

        That said, it occurs to me it's actually not hard for a service that uses an app that does end to end encryption to actually scan. Just have the app do the check before encrypting it.

        Ah, but notice the switcharoo? We were talking about services, and now you're suddenly talking about an app.

        Conflating those things sounds like the horror story of smartphones, not the internet we all know and love, where services and applications are usually separate things from separate (sometimes even adversarial) makers, bu

        • > Ah, but notice the switcharoo? We were talking about services, and now you're suddenly talking about an app.

          > Conflating those things sounds like the horror story of smartphones

          For better or worse, we live in a smartphone world. New commercial services have apps. They all have mobile apps, some may also have web apps. I'm not saying I like it that way, just that's how it is.

          Note here I recently designed, built, and launched an end-to-end encrypted file transfer for my company that uses a web page. T

      • While some databases are restricted to reliably verified CSAM, others will have large numbers of false positives. Someone gets caught with a bunch of actual CP, and everything they've got gets added to the DB if they look plausibly underage, even if they haven't been positively identified.
        The US has been caught prosecuting people for legal material before. In one particularly egregious case, a man was prosecuted for clearly commercial porn DVDs, complete with an "expert" pediatrician testifying the actress coul
  • I would imagine any controversial amendment like this is going to the bottom of the pile right now.

    The Johnson government will be swept away within months, if not weeks, if you follow the latest soap opera over the pond.

    The minister behind controversial amendments like these, one Priti Patel, is a ghastly creature - quite profoundly evil - and I very much doubt she'll survive a cabinet reshuffle under a new PM, as she is widely hated by the public over here.

    Both the US and the UK are quite horribly exciting places politically right now, as we watch far right thinking taking over. We don't know where it leads, but one thing is for sure, we desperately need a swing back toward the centre.

    • Are there any Labour Parties in Europe that are against surveillance?

    • Both the US and the UK are quite horribly exciting places politically right now, as we watch far right thinking taking over. We don't know where it leads, but one thing is for sure, we desperately need a swing back toward the centre.

      Well, in the US, what you are seeing is largely a move from the far left progressives back somewhat towards the center...which was last seen I guess about 20 years ago.

      • by dgatwood ( 11270 )

        Both the US and the UK are quite horribly exciting places politically right now, as we watch far right thinking taking over. We don't know where it leads, but one thing is for sure, we desperately need a swing back toward the centre.

        Well, in the US, what you are seeing is largely a move from the far left progressives back somewhat towards the center...which was last seen I guess about 20 years ago.

        You're kidding, right? The U.S. hasn't had a far left since pre-Reagan. The federal government consists of moderates (Democrats), conservatives (Blue Dog Democrats and a few of the Tea Party Republicans), ultra-conservatives (most other Republicans), radical right nationalists (Trump Republicans and most of the other Tea Party Republicans), and people who aren't smart enough to understand that you shouldn't pay for all of your expenses on credit (Reagan Republicans/neoconservatives).

        The people who call th

        • You're kidding, right? The U.S. hasn't had a far left since pre-Reagan. The federal government consists of moderates (Democrats), conservatives (Blue Dog Democrats and a few of the Tea Party Republicans), ultra-conservatives (most other Republicans),

          Nope.

          The conservatives..the majority of republicans and independents that identify as conservative, really haven't changed that much.

          Many of those actually have shifted left slightly on social issues.

          The democrats, the left, at least on party level, have bee

          • by dgatwood ( 11270 )

            You're kidding, right? The U.S. hasn't had a far left since pre-Reagan. The federal government consists of moderates (Democrats), conservatives (Blue Dog Democrats and a few of the Tea Party Republicans), ultra-conservatives (most other Republicans),

            Nope.

            The conservatives..the majority of republicans and independents that identify as conservative, really haven't changed that much.

            Many of those actually have shifted left slightly on social issues.

            The democrats, the left, at least on party level, have been drawn VERY far left in a very short period of time.

            The progressives...I don't believe they are the majority of Dems, but they are the loudest and seem to have the most pull over the whole Democrat party ship.

            I haven't seen very much even from what U.S. progressives call the "far left" that wouldn't be called centrist anywhere else in the world. Taking care of the poor, ensuring living wages, ensuring access to public education, etc. aren't left-leaning policies anywhere else. They're basic human decency. And the fact that the U.S. right wing is so far to the right that they disagree with such basic human decency puts them well outside acceptable societal norms in most of the world.

            About the only progressive

            • No, but the fact that you said that is basically conceding my point. We don't have a left wing. We have something that is just barely left of center even when it is at its most progressive, and usually right in the center.

              We have right and left in the US.

              They have NOTHING to do with right/left outside of the US, and therefore, when discussing swings within the US, there is no point in bringing in comparisons from the EU or other countries outside the US.

              You need to compare apples to apples.

              And as far as

              • by dgatwood ( 11270 )


                And as far as the progressive left in the US going FAR to the left of what only a few years ago was considered normal, I stick by that assertion.

                Even if you limit yourself only to what is normal by modern U.S. standards (which I would argue are pretty darn conservative even by some historical U.S. standards, e.g. "That whenever any Form of Government becomes destructive of these ends, it is the Right of the People to alter or to abolish it ..."), your original statement was ambiguous, which led to the two of us having two different arguments. :-)

                I interpreted your original statement to mean that the U.S. was moving from being controlled by far-left

  • by joe_frisch ( 1366229 ) on Thursday July 07, 2022 @12:04AM (#62680192)
    Yes, child sexual abuse is a terrible crime - but is it so terrible and so common that people are willing to give up their privacy? Can people not imagine what far worse horrors will happen if the government goes bad and has access to this vast trove of information - knows who might cause "trouble", who their friends are, what their secrets are? Is it worth increasing the risk of fraud and espionage?

    No matter what promises are made, technology that allows scanning for CSAM also allows scanning for all these other things, and anyone who believes that capability will never leave the hands of the government is naive. Anyone who believes the government itself will always use that data honestly is even more naive.
    • When having to decide whether I side with pedos or politicians, the pedos it is.

      I'm too old to get fucked by pedos, but you're never too old to get fucked by politicians.

      • by gweihir ( 88907 )

        On a related note, pedos (those that do not keep their urges in check, that is) can only do limited damage. Incidentally, the same applies to any common murderer or rapist or torturer. Even Jack the Ripper only made it to 5 victims and the average serial killer rarely gets to triple digits. Apparently only 7 serial killers ever managed to potentially get 100+ victims. The same does not apply to politicians. Politicians can scale their evil up to incredible numbers.

    • by AmiMoJo ( 196126 ) on Thursday July 07, 2022 @03:57AM (#62680548) Homepage Journal

      This doesn't need to violate anyone's privacy to work. All they need is a database of image hashes, provided by the police. Then if someone tries to post an image that matches the hash, the software simply declines. No need to even log it.

      You could argue that it's a slippery slope, that they will require logging eventually. However, there has been a system called CleanFeed in operation for decades. It's implemented at the ISP level and blocks known child abuse sites. It is not logged, or if it is the logs have never been used in any trial. Could be parallel construction I guess, but if so they have kept it very quiet.

      It could also be abused to prevent the spread of political messages like memes.

      I still think it's a bad idea, I'm just saying there is no technical reason why it would have to collect any data.

      • This doesn't need to violate anyone's privacy to work. All they need is a database of image hashes, provided by the police. Then if someone tries to post an image that matches the hash, the software simply declines. No need to even log it.

        I've got some experience of working with CSAM scanning and what you've described is exactly how it works.

        You get a free licence to use Microsoft's PhotoDNA algorithm (they also provide source code in common languages). PhotoDNA is like other common hashing methods, except that it can survive small amounts of photo manipulation (such as recolouring, rotating, cropping and so on).
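        PhotoDNA itself is proprietary, but the general idea of a manipulation-tolerant ("perceptual") hash can be sketched with the far simpler difference hash (dHash): derive bits from adjacent-pixel brightness comparisons, then compare two hashes by Hamming distance so that small edits still count as a match. Purely illustrative, and much weaker than PhotoDNA:

```python
def dhash(pixels):
    """Difference hash over a grayscale grid.

    `pixels` is a 2D list of brightness values already resampled to
    N rows by N + 1 columns (real code would resize the image first,
    e.g. with Pillow). Each bit records whether brightness increases
    between horizontal neighbours.
    """
    return [1 if left < right else 0
            for row in pixels
            for left, right in zip(row, row[1:])]

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return sum(x != y for x, y in zip(a, b))

# Small edits flip only a few comparisons, so an original and an
# edited copy stay within a small Hamming-distance threshold, while
# unrelated images differ in roughly half their bits.
```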

        • by dgatwood ( 11270 )

          This doesn't need to violate anyone's privacy to work. All they need is a database of image hashes, provided by the police. Then if someone tries to post an image that matches the hash, the software simply declines. No need to even log it.

          I've got some experience of working with CSAM scanning and what you've described is exactly how it works.

          No, it isn't. You missed a key implication in the GP post, which was that a privacy-protecting version of this may prevent the image from being posted, but must take no other reporting action, nor log the prevention in any way that could be detected later.

          You get a free licence to use Microsoft's PhotoDNA algorithm (they also provide source code in common languages). PhotoDNA is like other common hashing methods, except that it can survive small amounts of photo manipulation (such as recolouring, rotating, cropping and so on).

          You then subscribe to a hash list from the Internet Watch Foundation in the UK (or NCMEC in the US), hash the photo and compare to the list. If you have a match, then you need to come up with some process to investigate and, if necessary, report it.

          It is the "process to investigate and, if necessary, report it" part that makes it possible to coopt the process to detect other things. For example, create an adversarial image that, when reviewed by humans, appears to just be child porn, but actually c

      • There are two issues:

        One is that this requires prohibiting end-to-end encryption. That provides an opportunity for a variety of bad actors to cause a variety of problems - it's making communication less secure.

        The other is that we have seen the slippery slope in action. Way back for the London Olympics, a lot of surveillance equipment was installed around the city to prevent terrorism. In the end that same equipment was used to catch Olympic copyright violators. Scanning for CSAM seems reasonable. What
        • by AmiMoJo ( 196126 )

          It doesn't require prohibiting end to end encryption if the checks are done on the device. They just need to build it into the app.

            • Giving the government access to my device to update the database does not sound better to me. Also, which app? Do they want back doors like this in every communication app? Or every photo and video app? Every file sharing and cloud app? On every OS?
            • by AmiMoJo ( 196126 )

              The way the CleanFeed thing works is that all the "major" ISPs over a certain size are obliged to use it, and so far none have declined. The smaller ones sometimes don't bother.

              The updates would presumably be from the app developers, same as Chrome and Firefox update databases of known malware and phishing URLs.

      • by gweihir ( 88907 )

        You are right that it does not need to. But once the capability is there, it will be abused in short order. History is completely clear about this: Any surveillance and control mechanism will be abused to the maximum possible by the "authorities". And since this is material no civilian may see, there cannot be any civilian oversight. We already have enough cases of entirely different websites landing on block-lists that pretended to be about "protecting children". Again, no oversight, because if you complai

    • Yes child sexual abuse is a terrible crime - but is it so terrible and so common that people are willing to give up their privacy?

      The ordinary people I've talked to over this honestly believe they have "nothing to hide".

      The fact that they're indirectly condemning LGBTQ, dissidents, etc., by not fighting this doesn't even occur to them. In some cases they'll be condemning them to torture and even death.

      Democracy, too. If the government can mass-read people's communications/opinions in real time then it's the end of democracy. All you need is to know what's trending while the other party is in the dark and you can win any election.

    • by gweihir ( 88907 )

      Well, there is a statement by one victim organization in Germany. These people, who had all been abused as children, most certainly did not want to be used as a cheap excuse to establish a surveillance society. I think that makes the moral question pretty clear.

  • oh yeah also for "terrorist content"

  • by Orlando ( 12257 ) on Thursday July 07, 2022 @01:52AM (#62680318) Homepage

    The UK doesn't have a government at the moment.

    • And it's about bloody time. Maybe they should continue operating without a government for a year to catch their breath. It'll be better for the country than having the government they had, hell bent on destroying it.

    • by AmiMoJo ( 196126 )

      For those not familiar with UK politics, the current Prime Minister is a serial liar and has been getting into scandal after scandal. The latest one was apparently the straw that broke the camel's back, and triggered ~50 resignations in the last couple of days.

      The Prime Minister has now announced he will resign, eventually. Problem is the next Prime Minister gets picked by the ruling party, the Conservatives (also known as Tories). The summer recess is coming up too, and they have more important things to b

  • ... ensure criminals are not allowed to run rampant ...

    I agree totally: just give us the names of the criminals and we'll ensure they are not online. What? You don't know their names? So how will you stop them running "rampant online"?

    This is more of "prove you're innocent": it's the government's catch-cry in the war on drugs/copyright infringement/terrorism/CS-AM. Since they demand less privacy from us, we should demand less privacy from them, the politicians and bureaucrats.

    The cynic in me suggests a repeated cry of "We must do something" means a hidden agitator and I can think of one culprit: Google and Samsung cloud-services. Remember how your phone sends a copy of everything you do, to the cloud? When a school-girl takes a naked selfie, those cloud-services have just distributed kiddie-porn. Of course, their OS could check for CS-AM and quietly fail to upload it. When a school-girl attempts to sext someone, it could remind her that hackers/crackers might steal this photo.

    • by dgatwood ( 11270 )

      The cynic in me suggests a repeated cry of "We must do something" means a hidden agitator and I can think of one culprit: Google and Samsung cloud-services. Remember how your phone sends a copy of everything you do, to the cloud? When a school-girl takes a naked selfie, those cloud-services have just distributed kiddie-porn. Of course, their OS could check for CS-AM and quietly fail to upload it. When a school-girl attempts to sext someone, it could remind her that hackers/crackers might steal this photo.

      This would be a good idea regardless of the age of the person in question. Naked selfies need not be of underage girls to cause serious harm. Sometimes, I think people focus too much on CSAM and forget the bigger picture, which is exploitation of women (and, to a lesser extent, men) in general.

  • Re: "Child sexual abuse is a sickening crime. We must all work to ensure criminals are not allowed to run rampant online and technology companies must play their part and take responsibility for keeping our children safe,..." - Absolutely! And one of the most effective ways to track down & prosecute these criminals is to follow the money. Let's put an end to secretive banking practices that allow criminals to hide the profits of crime from the authorities, e.g. tax havens, off-shore banking, & shell
  • Comment removed based on user account deletion
    • by gweihir ( 88907 )

      Indeed. And it is basically impossible to find out what the hashes contain without the original data. That data is obviously not available to anybody that wants to do an outside verification. Too bad.
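An illustration of the point above: a hash database is one-way. Given only a digest, the sole way to learn what it matches is to already have a candidate file and hash it yourself; there is no inverse function. The filenames and strings here are made up for demonstration.

```python
import hashlib

# An outside auditor only ever sees digests like this one, never the source data.
published_digest = hashlib.sha256(b"some blocked file").hexdigest()

def matches(candidate: bytes, digest: str) -> bool:
    # Forward direction: trivially easy to check a guess.
    return hashlib.sha256(candidate).hexdigest() == digest

print(matches(b"some blocked file", published_digest))  # True: we held the original
print(matches(b"an innocent file", published_digest))   # False

# The reverse direction -- recovering the input from the 64-hex-character
# digest alone -- would require brute-forcing the entire input space,
# which is infeasible for any non-trivial content. Hence: no civilian
# oversight of what a hash blocklist actually contains.
```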

  • Governments will never stop trying to break encryption to gain access to all the data of their citizens and circumvent all privacy legislation. Time and time again these bills fail when challenged, and time and time again private parties collude with governments to try to force them through. The voting public need to be educated so the myth of helping the children, with these invasive laws which will do anything but, loses its emotional effect.

  • ...is paved with good intentions. At least for the useful idiots supporting this. The instigators do not care one bit about children; they want a surveillance state. Get this established and very soon it will be used for a lot of other things as well.

  • by Schoenlepel ( 1751646 ) on Thursday July 07, 2022 @09:43AM (#62681126)

    Try this instant messaging program.

    https://tox.chat/ [tox.chat]

  • by WCLPeter ( 202497 ) on Thursday July 07, 2022 @11:44AM (#62681484) Homepage

    We're in a Catch 22 situation here.

    In the pre-internet world, if you were suspected of committing the types of crimes being talked about here, law enforcement would gather evidence, go to a judge, and if approved they'd go through your physical mail or search your properties with a warrant. It didn't matter if you provided a key or not; the heavily armoured folks would show up with a battering ram to rip the door off its hinges.

    Because of checks and balances this was, mostly, fine since all the stuff was physical and you needed clear and actionable evidence to break someone's privacy rights.

    In a post-internet world though, you can have all the evidence in the world, and the necessary warrants to back it up, but without the encryption key you're likely not brute forcing that open with a "battering ram" until well past the heat death of the universe. It makes it extremely difficult to find, and prosecute, these people should they be savvy enough to use that level of protection. So, on the surface, I have zero problems letting law enforcement "bypass" that lock and batter their way in when needed.
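For scale, a back-of-the-envelope check of that "heat death of the universe" claim, assuming a 256-bit key (e.g. AES-256) and a wildly generous 10^18 guesses per second:

```python
# Exhausting a 256-bit keyspace at 10^18 guesses/second.
keyspace = 2 ** 256               # about 1.16e77 possible keys
guesses_per_second = 10 ** 18     # far beyond any real-world capability
seconds_per_year = 3.156e7

years = keyspace / guesses_per_second / seconds_per_year
print(f"{years:.2e} years")       # on the order of 10^51 years
```

The universe is roughly 1.4e10 years old, so even this absurdly optimistic attacker needs about 10^41 times its current age. That asymmetry is the whole point of the comment: the only practical way in is around the encryption, not through it.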

    But the same things which protect the criminal also protect me. If law enforcement can't crack it until after the heat death of the universe, then the hackers trying to rob me are going to have the same issue too - and I want them to have that issue. I don't want my encrypted communications to have a back door, because unlike a battering ram used on the wrong house, telling me someone screwed up, I often won't know I've been hacked until my bank is calling asking about the new house I just bought in Taipei.

    Then there's the issue that those same computers which can protect my privacy can also be used to quickly erode it. Unlike the physical world, which has checks and balances to prevent abuse, how soon would we enter a dystopian hellscape where law enforcement uses the backdoor and machine learning algorithms to monitor your stuff "just to be safe"? Especially after they've shown, time and time again, that they simply aren't trustworthy enough to have that level of power and trust.

    There's no easy way to put the protections we have on our privacy rights in the physical world into the digital one without deliberately leaving the door unlocked, or trusting that law enforcement is going to follow the rules and ask for permission before using the spare key they're asking everyone to give them. So it's a Catch 22, we can make stopping the criminals easier but we have to give up our privacy and freedom en-masse to do it.

    As much as I hate to say it, I think that old adage says it best "it is better that 10 guilty persons escape, than that 1 innocent suffer". Criminals are gonna criminal, we can't change that, but once we've given up our personal liberties they're gone and we aren't going to get them back. It's best to just not go there and find a different way since, eventually, the criminal activity is going to require something to happen in meat-space to keep it going.

  • Once again the brownshirts are tugging at people's heart strings, thinking up the nastiest scenarios involving kids so people will turn off their brains and not even consider the real reason why something like this is being foisted onto the public.

  • Child sexual abuse is a sickening crime.

    On that I think we can agree.

    But what if we also think computer voyeurism is a sickening crime? That is what the government seems to be advocating here; that is, they intend to normalize and institutionalize what most of their constituents also consider a sickening crime.

    If we can prosecute someone for having dirty thoughts about children, shouldn't we extend the same to politicians who have dirty thoughts about violating our rights? I mean, if someone could
