
Feds Bust Alaska Man With 10,000+ CSAM Images Despite His Many Encrypted Apps (arstechnica.com) 209

A recent indictment (PDF) of an Alaska man stands out due to the sophisticated use of multiple encrypted communication tools, privacy-focused apps, and dark web technology. "I've never seen anyone who, when arrested, had three Samsung Galaxy phones filled with 'tens of thousands of videos and images' depicting CSAM, all of it hidden behind a secrecy-focused, password-protected app called 'Calculator Photo Vault,'" writes Ars Technica's Nate Anderson. "Nor have I seen anyone arrested for CSAM having used all of the following: [Potato Chat, Enigma, nandbox, Telegram, TOR, Mega NZ, and web-based generative AI tools/chatbots]." An anonymous reader shares the report: According to the government, Seth Herrera not only used all of these tools to store and download CSAM, but he also created his own -- and in two disturbing varieties. First, he allegedly recorded nude minor children himself and later "zoomed in on and enhanced those images using AI-powered technology." Second, he took this imagery he had created and then "turned to AI chatbots to ensure these minor victims would be depicted as if they had engaged in the type of sexual contact he wanted to see." In other words, he created fake AI CSAM -- but using imagery of real kids.

The material was allegedly stored behind password protection on his phone(s) but also on Mega and on Telegram, where Herrera is said to have "created his own public Telegram group to store his CSAM." He also joined "multiple CSAM-related Enigma groups" and frequented dark websites with taglines like "The Only Child Porn Site you need!" Despite all the precautions, Herrera's home was searched and his phones were seized by Homeland Security Investigations; he was eventually arrested on August 23. In a court filing that day, a government attorney noted that Herrera "was arrested this morning with another smartphone -- the same make and model as one of his previously seized devices."

The government is cagey about how, exactly, this criminal activity was unearthed, noting only that Herrera "tried to access a link containing apparent CSAM." Presumably, this "apparent" CSAM was a government honeypot file or web-based redirect that logged the IP address and any other relevant information of anyone who clicked on it. In the end, given that fatal click, none of the "I'll hide it behind an encrypted app that looks like a calculator!" technical sophistication accomplished much. Forensic reviews of Herrera's three phones now form the primary basis for the charges against him, and Herrera himself allegedly "admitted to seeing CSAM online for the past year and a half" in an interview with the feds.
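For the curious, a logging redirect of this sort is trivial to build. Here is a minimal, purely illustrative sketch in Python (the port and redirect target are made up; this shows the general technique, not the government's actual setup):

```python
# Hypothetical sketch of a logging redirect: any request to the bait
# URL records the requester's IP, path, and User-Agent, then bounces
# the visitor onward so nothing looks amiss.
from http.server import BaseHTTPRequestHandler, HTTPServer
import logging

logging.basicConfig(filename="hits.log", level=logging.INFO)

class BaitHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # log who followed the link and what they asked for
        logging.info("%s %s %s", self.client_address[0], self.path,
                     self.headers.get("User-Agent", "-"))
        self.send_response(302)  # temporary redirect
        self.send_header("Location", "https://example.com/")
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8080), BaitHandler).serve_forever()
```

Anyone who clicks such a link without a proxy or Tor hands over their real IP address, which is all an investigator needs to start subpoenaing ISP records.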



Comments Filter:
  • by Anonymous Coward

    This is a very dangerous slope. Why not comic books? Or actual books like Lolita? The whole point of making kiddie porn illegal to possess is the theory that abuse occurred in order to make it; therefore, making it illegal to possess will prevent it from being made. If that goes out the window, then it's just ruining people's lives for no reason.

    Seems we should be encouraging people to make AI kiddie porn on the same basis that fake ivory is a thing.

    There was a case involving a woman named Little Lupe who actually h

    • by ArchieBunker ( 132337 ) on Thursday August 29, 2024 @10:55PM (#64747868)

      First, he allegedly recorded nude minor children himself and later "zoomed in on and enhanced those images using AI-powered technology."

      Nothing fake about that.

    • by billyswong ( 1858858 ) on Thursday August 29, 2024 @11:00PM (#64747876)

      According to the quoted report, the suspect fed imagery of nude children into an AI image generator.

      For an AI image generator to be 100% clean and legal, we need to know the full data source fed into it and be sure there are no copyright violations and no nude real children. Otherwise, any AI-generated CSAM will be suspected of having been generated with real child victims behind the scenes.

      • by AmiMoJo ( 196126 )

        Most of the large training data sets have been found to contain illegal images. They were just scraped from the web and image libraries without much oversight.

        Even the models that have been designed not to produce child abuse images are still creepy as hell. All the women look about 12 years old, but with massive boobs. This includes the big players like Midjourney, too.

    • by machineghost ( 622031 ) on Thursday August 29, 2024 @11:55PM (#64747948)

      >The whole point of making kiddie porn illegal to possess is the theory that abuse occurred in order to make it; therefore, making it illegal to possess will prevent it from being made.

      That is most certainly NOT the only rationale. There's also a very scary argument that if a potential molester indulges in kiddie porn, it will lead to them eventually molesting real children. If you buy that argument, then stopping even artificial child porn (which, as others have noted, is not all that this guy was involved in) stops kids from getting raped.

      That being said, there is a counter-argument that the availability of artificial child porn (e.g. child hentai in Japan) actually *reduces* the number of child molesters (see https://en.wikipedia.org/wiki/... [wikipedia.org]). According to that theory, potential molesters view kiddie porn instead of harming actual kids.

      Personally I don't know enough to say for sure who is right. The one thing I do know is that we should do everything possible to stop child molestation. That means listening to the science ... even if that science contradicts our gut feelings.

      • by haruchai ( 17472 )

        "That means listening to the science ... even if that science contradicts our gut feelings"
        There's about as much chance of that as me winning the Powerball without ever buying a ticket.

        • by gweihir ( 88907 )

          Yep. "Caveman wants to apply big club to problem!" That has never worked, but usually makes things worse. But people do not learn and have no insight.

        • If COVID has taught us anything, it's that a vocal minority of people don't give two cents about what science says, and somehow convince even intelligent people into ignoring it.

          Social sciences also teach us that it's cheaper to prevent crimes by providing social services than it is to be reactionary and incarcerate people, and yet...

      • It's acceptable for society to have standards. If the majority of Americans hold that sexualizing minors is undesirable behavior, there are legal avenues by which their elected officials may prohibit such sexualization.

        It is imperative that the people fully understand what powers they've granted to law enforcement, as well as how law enforcement chooses to exercise those powers. It's also acceptable to question elected officials and/or law enforcement regarding any of these matters.

        Ultimately, if the peo

        • by gweihir ( 88907 )

          These standards are not morally acceptable if they lead to more child abuse. In fact, one could easily argue that those setting such standards have a significant part in making the problem worse and hence are morally on a comparable level to what they pretend to "fight".

          Incidentally, you just justified making being gay a criminal act, as it was for a long and dark time. Well done.

          • by Veretax ( 872660 )
            Even though I agree that none of what has been described would fall into a morally acceptable category, this story leaves people caught between two competing moral concerns.

            The availability and ease of storing such reprehensible content, versus the right of a person to be secure in their persons and effects against an out-of-control surveillance state. It can't be a coincidence that some of this content was hidden behind encryption. Many of us have noticed that news stories seem to be dropping those bits in more and more as if
          • by DrMrLordX ( 559371 )

            You aren't going to convince many people that CSAM leads to less child abuse. You're also making the mistake of assessing how CSAM distracts pedophiles from potential rape victims without considering how CSAM affects those who aren't necessarily pedophiles (yet). Talk to any abuse counselor who treats victims of molestation and you'll discover that many victims become the abusers later in life. The same holds for exposing people to CSAM. Making it readily available and at least nominally acceptable is at

            • by gweihir ( 88907 )

              Spoken like one of those fanatics that do not mind doing utter evil, as long as their ideas of how the world should be (but not is) are met. Congratulations, you are part of the problem.

              • Generally I'm for drug decriminalization, and I support your point that "standards are not morally acceptable if they lead to more child abuse"; however, that's not an absolute. The science around sexual things tends to be horribly badly done. The grandparent is just saying that people aren't convinced by it. They also give a specific example of a scientific question that they claim isn't answered. I can add my own: even if the research shows that instantaneously today, fake CSA

            • Well, maybe not, but most people also used to be convinced that the Earth was flat and the center of the universe, that maggots and flies could spawn from meat unassisted, etc...

              You do have a point about considering the wider pool of CSAM viewers and the effects on them. It is NOT a well-understood or well-studied topic. Even plain pornography is barely studied, though the balance of results there is that access to porn in society generally reduces sexual assault and rape.

              That said, we have two oppos

        • It's acceptable for society to have standards.

          No. If someone's behavior isn't harming others, it's no one else's concern.

          No one is harmed during the creation of computer-generated kiddie porn. There is little evidence that it has a net effect of increasing actual sexual abuse, and some evidence that it has the reverse effect.

          Personal anecdote: I've known a few "porn addicts" who spent much of their spare time ogling sexually explicit photos of adult women. None of them had sex with real women or even attempted to.

          • by Targon ( 17348 )

            When the sources aren't computer generated, that makes the source material illegal to have. On top of that, encouraging others to view illegal things also ends up encouraging them to do more than just view, but to participate as well. The best way to stamp this stuff out is to keep people from making it in the first place; then it wouldn't be there to talk about, which would keep others who might be tempted from being influenced by it.

            Note that porn will be out there, but kiddie porn just isn't somethi

          • Reread your last sentence and then rethink your position.

        • It's acceptable for society to have standards. If the majority of Americans hold that sexualizing minors is undesirable behavior, there are legal avenues by which their elected officials may prohibit such sexualization.

          Society is organized around legitimacy, which requires broad consent of the governed to maintain. You can only use the legal system against outliers far beyond the threshold of a "majority of Americans" without eroding the state's legitimacy and, by doing so, inflicting collateral damage on society.

          The wars on hookers and blow are examples of the consequences when people lose sight of legitimacy. While most people may believe these things are bad, a staggering percentage, roughly 15%, have done them anyway regardless of

      • by gweihir ( 88907 ) on Friday August 30, 2024 @04:34AM (#64748190)

        That being said, there is a counter-argument that the availability of artificial child porn (e.g. child hentai in Japan) actually *reduces* the number of child molesters (see https://en.wikipedia.org/wiki/... [wikipedia.org]). According to that theory, potential molesters view kiddie porn instead of harming actual kids.

        Personally I don't know enough to say for sure who is right. The one thing I do know is that we should do everything possible to stop child molestation. That means listening to the science ... even if that science contradicts our gut feelings.

        That would be a good idea. Incidentally, research into the question is made basically impossible. You cannot get funding, getting permission is hard, and protecting interview data and the like is basically impossible. It almost looks like some groups have no interest in finding out what the actual facts are. As having the actual facts would allow a real reduction in child abuse, this is pretty evil. Of course, child abuse happening serves a lot of interests: law enforcement can claim more resources and claim great "victories", politicians can get even more invasive and abusive laws passed, fanatics of all kinds can point to child abuse to misdirect away from the evil they do, etc., ad nauseam. The whole thing is also eerily similar to the utterly failed "war on drugs".

        Incidentally, there is hard data from non-child porn: allow access, and rape rates drop. Not to zero, but significantly. The same, incidentally, goes for prostitution, although that clearly is not an option here, except maybe for some role-play approaches. But it shows that substitution is significantly beneficial overall. Not investigating whether these benefits could be had for child abuse as well puts those who prevent that from happening on the same moral level as the child abusers.

      • by e3m4n ( 947977 )

        Until they get into a relationship and have unfettered access to children of the age that excites them. I would argue that psilocybin would be a more productive treatment for the perversion.

      • ... lead to them eventually ...

        Yes, all criminals escalate: tobacco smokers turn to heroin, house burglars turn to serial murder, and lobbyists turn to bribing elected officials. Anyone have a problem with that line of thought? (Except the last one: we know there's no humanity in elected officials.)

        As so many victims explain, criminals don't suddenly appear: they learn to be criminals. Newspapers make a big deal when a criminal has a history. Why? Are they dishonestly demanding petty criminals spend their lives in a chain gang? Would

      • An alternative argument when it comes to CSAM involving real victims is that it continues the cycle of abuse of non-consenting minors.
      • Ignoring that the accused actually took photos of naked kids, thereby being wholly responsible for the wrath that followed, let's focus on this bullshit that looking at pictures of something makes you more likely to engage in that something. My question is this: Do violent video-games make people violent?
        • Do violent video-games make people violent?

          Repeated exposure to violent and horrific images desensitizes people to violence, even more so when it is an active, immersive training experience. Media does this, and video games are much more effective. (This effect happens with unrealistic cartoons like Bugs Bunny, but those are so far removed from reality that the effect is minimal and mostly ignorable.) But we're discussing highly realistic imagery here. This effect is due to neuroplasticity and the reward mechanisms of the brain. Do you disagree with the

    • Missing the point. Fundamentally, it is a thought crime. It's the thought that the authorities seek to extinguish, not actual abuse of minors.

  • by FeelGood314 ( 2516288 ) on Thursday August 29, 2024 @10:56PM (#64747872)
    I'm going to guess child porn, but maybe the article could actually write it out the first time they use a new acronym; otherwise I'm going to assume this is something the author thinks is really normal and common.
    • by LuniticusTheSane ( 1195389 ) on Thursday August 29, 2024 @11:00PM (#64747878)
      The first sentence of the article spells it out. "The rise in child sexual abuse material (CSAM) "
    • Child sex abuse material.

    • by alantus ( 882150 )
      Child porn. I don't know why we need new terms and abbreviations for this.
      • Some venues don't allow you to even use the word "porn" anymore. Try saying it in a YouTube vid without it getting flagged 18+.

      • by Bahbus ( 1180627 )

        It's a definition thing. First, not all sexual activities or obscenities are considered to be pornography, regardless of whether someone is masturbating to it. Second, theoretically, everyone in pornography should be of consenting legal age. Children are not of legal age, nor can they consent; thus it's sexual abuse, not porn.

      • by cstacy ( 534252 )

        Child porn. I don't know why we need new terms and abbreviations for this.

        I assumed it was because they wanted to make explicit that it was child "sexual abuse" material, which rules out fantasy material not involving any child. And it makes a nice code word, which is more polite.

        "What to you do for a living?"
        "I screen child porn."
        vs.
        "I audit materials related to the sexual abuse of children."

        I guess. I don't work in the field.

    • The /. poster could have spelled it out, more like it. This was an incredibly confusing post for most of us, methinks.

  • - creates public Telegram group for child porn.
    • Right? But also, does Telegram disclose the identity of their users? My understanding is that they do not, which is why that guy got arrested in France this week.

      I would also like to know how law enforcement accessed the content of this person's phone.

      • by haruchai ( 17472 )

        "I would also like to know how law enforcement accessed the content of this person's phone"
        Manufacturer backdoor which they'll never admit to.

      • My understanding is that they do not, which is why that guy got arrested in France this week.

        No. The guy who got arrested in France this week had an openly published sheet of charges from the prosecution and precisely none of them had to do with "disclosing identity of users".

    • What is even stupider: he clicked on the Feds' trap link without a proxy, or even the Tor he already had, and gave them his IP address. A pro in perversion, a lamer in computer security, despite his eclectic sh!tload of tools.

    • by gweihir ( 88907 )

      And then proceeds to use a zoo of tools, clearly without overall strategy or risk assessment. Nobody understands KISS these days.

    • - creates public Telegram group for child porn.

      Honestly, the headline is one of those scare-tactic clickbait bullshit ones. "Don't trust encryption! IT FAILED!" It's stupid, but I guess it got us to click. *SHRUG*

  • I don't want to sound like I'm on his side or anything, but it sounds like they shouldn't be able to know what's in all these places on his encrypted devices. It almost looks like they obtained all the evidence illegally, but the honeypot was their parallel construction to give them a reason to "search" his phones.

    If it didn't involve children, people would be more willing to question whether anyone's rights were violated.

    It's true that it sounds like the guy used every single encryption option available,

    • IANAL but it sounds reasonable to me. They didn't knock on his door and try to convince him to look at child porn that he otherwise would have avoided. He clearly and (seemingly) unambiguously went looking for it.

      Entrapment is dodgy. This doesn't seem to be that. Searching his devices seems correct because these are the tools used in commission of the offences.

      But what do I know? I'm an uninformed idiot with an opinion.

        • Yeah, not a lawyer either, but entrapment, specifically as I understand it, is someone coercing you into committing a crime that you would not otherwise commit, and that last part is the important one. If an undercover says "Hey, you wanna go do a crime?" and you agree without much convincing, that's not entrapment. It could be something else legally, though.

        • by AvitarX ( 172628 )

          You basically got it, but it also requires that one be capable of doing the crime.

          So if they say "hey, get me a kilo of cocaine, you'll be rich" and you go "boy golly, that sounds great" but you are a normal person that doesn't know how to go about getting a kilo of cocaine, they can't then have a guy walk up to you offering to sell you cocaine and bust you.

          • have a guy walk up to you offering to sell you cocaine

            I bet we know this as precedent because a PD tried exactly that at some point in time. "Hello fellow kids, can I interest you in some kyocaine?!"

    • We don't really wish to tolerate predatory behavior in our society when children are involved. Seems pretty natural to want to protect children from a likely threat before it strikes.

      Is it bordering on predicting thought crime? Maybe. But if there were leopards sleeping in my village I would be pretty concerned for the children even if they seemed like they haven't yet eaten any babies. Because being a predator is in their nature and it's bound to happen eventually.

      So it's no coincidence that we use the pre

    • It almost looks like they obtained all the evidence illegally, but the honeypot was their parallel construction to give them a reason to "search" his phones.

      Honeypots are not illegal, and there seems to be no evidence of parallel construction here.

    • This stuff actually happens quite a bit, it's just very, very rarely publicised because it helps law enforcement if people think their use of encryption is keeping them beyond the reach of the law. I've seen cases where perps used every trick in the book, deniable encryption (snort!), duress passwords, you name it, and it didn't help them in the end (hint: if you're using disk encryption software with "deniable encryption" as a major advertised feature then it's not entirely unlikely that the police know a
      • But Law Enforcement lobbies say we have to require encryption backdoors! Because encryption stops us from getting the pedophiles! Meanwhile, they still manage to fill up protective custody at every USP with chomos that used "encryption."
  • by hey! ( 33014 ) on Thursday August 29, 2024 @11:27PM (#64747926) Homepage Journal

    And he thought that would make it *harder* for the cops to get at his data?

    • by AvitarX ( 172628 )

      That was my first thought.

      It was a weird framing of the story that his opsec was good; it sounds like it was a mess to me. A Dunning-Kruger of opsec.

      • by gweihir ( 88907 )

        The interesting question is whether the framing is due more to Dunning-Kruger or to an intentional lie by misdirection.

    • I suspect the thought was: if the cops only find 1,000 of my images, I'll only get 1/10th of the punishment.

    • by gweihir ( 88907 )

      Apparently so. Complete IT security management failure. The idea of throwing (more) tools at the problem of IT security is found in professional settings as well. It hardly ever works out.

    • by AmiMoJo ( 196126 ) on Friday August 30, 2024 @05:05AM (#64748228) Homepage Journal

      It's hard to say what he was thinking, given he ran a Telegram channel with this stuff. Telegram channels are unencrypted, so at best you can just make them unlisted or invite only the people you want.

      • Telegram channels are unencrypted, so at best you can just make them unlisted or invite only the people you want.

        That's what you get for letting in FedMan69 asking "Sir, please grant expedited access to this telecommunications channel".

  • by bradley13 ( 1118935 ) on Friday August 30, 2024 @02:48AM (#64748102) Homepage

    It's just one case, but it goes to show that the feds don't need to break encryption to do their job.

    • by gweihir ( 88907 )

      Indeed. As always, a capability to easily break encryption serves one purpose: Cheap mass surveillance. Of course, the lies to obscure this fact vary.

  • CSAM means ... Criminal Sequential Access Method
    • Oracle VM VirtualBox contains a Code Scanning and Analysis Manager (CSAM), which disassembles guest code.

      AutoSys also has a CSAM component (no idea exactly what the abbreviation stands for there).
      • Also Autosys has a CSAM component too (no idea what exactly that abbreviation means here).

        ChatGPT comes to the rescue: In the context of AutoSys, which is a job scheduling and workload automation tool, CSAM stands for CA Secure Sockets Adapter Manager.

  • IT security is not a subject where you can just throw tools at the problems in order to get a good solution. Instead, you must do risk analysis and strive for complete coverage of all relevant risks. Have one hole in there that attackers can reasonably be expected to find and you are done for.

    In the case at hand, it sounds to me like this person had all these tools, but no understanding of how to turn them into an effective protection architecture.

  • Is he a drag queen? That's what inquiring minds want to know. Or is he like these folks [imgur.com]?

  • I've been working with volunteer groups that caught people like this in the early 2000s. It was good fun. A lot fewer restrictions on evidence and entrapment back then. But yeah, these people always think they're untouchable and get lazy and complacent. He thought he was behind 7 proxies, but he's actually behind bars. Suck it, pervs!
    • A friend and I were seriously considering monetizing online child predators before we realized that it was likely illegal. The techniques are roughly the same as the boiler room virtual kidnapping extortion scam. I'm all for moral flexibility when the net good or removal of suffering accomplished is significantly positive.
  • He was flagged for hitting the honeypot and targeted. Most likely they found out which phones he was using, installed a rootkit, and stole his passwords.

    How can they install rootkits? You can compel an "app store" to install one on a targeted phone. When people hear about CP, they fall all over themselves to cooperate.

    Harder: they physically confiscated his phone and either copied it to a hacked version of his phone's internal electronics (using the old phone shell) or replaced a few chips

    • Doesn't even need an app store. Almost every wireless carrier can inject apps, and possibly RCE, into any of the phones they sell for their network. Phones are easy pickings. BYOD of a very old, obscure, or foreign phone from another carrier might be harder to get into.
  • I wonder if he had enabled biometrics on the device, as well as notifications on the lock screen. If he didn't enable biometrics, had only a PIN/password/pattern on the device, and also didn't have notifications enabled on the lock screen, I wonder whether law enforcement can compel him to provide the PIN/password/pattern to unlock the device and possibly incriminate himself.

    Or perhaps they used some other means to get into the device, e.g. some unreported 0-day exploit.

  • Seems every article about child porn mentions CSPAN images, they must be showing a lot of it.
  • Despite His Many Encrypted Apps

    the operative word there is probably 'many'

    Operational security online is INCREDIBLY difficult. There are just so, so many ways you can get unmasked. Running multiple encrypted apps might even be one of them. "Oh, look at the box that is connecting to A, B, F, Q, Y, and Z 'secure communications tools' constantly. Hmm, I wonder if that is a mostly unique identifier..."
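    A minimal sketch of that fingerprinting idea in Python (the hosts and app names here are made up for illustration; real traffic analysis would work from netflow or DNS logs, but the principle is the same):

    ```python
    from collections import Counter

    # Hypothetical observations: which encrypted-app endpoints each
    # host was seen contacting. One common app is a weak signal; a
    # rare combination of many apps is close to a unique identifier.
    hosts = {
        "10.0.0.1": frozenset({"telegram"}),
        "10.0.0.2": frozenset({"telegram"}),
        "10.0.0.3": frozenset({"telegram", "signal"}),
        "10.0.0.4": frozenset({"telegram", "signal"}),
        "10.0.0.5": frozenset({"telegram", "enigma", "potato_chat",
                               "nandbox", "mega", "tor"}),
    }

    # count how many hosts share each exact combination of apps
    profile_counts = Counter(hosts.values())

    for host, apps in sorted(hosts.items()):
        # a combination seen on exactly one host singles that host out
        tag = "unique" if profile_counts[apps] == 1 else "shared"
        print(host, tag, sorted(apps))
    ```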

    If someone with enough resources and incentive wants to tie you to something online, they likely can, unless you are being incredibly careful.

  • (Setting aside how gross the particular alleged crimes are, obviously.)

    So many ways to fail:
    • — That "Calculator Photo Vault" app isn't anything like VeraCrypt, so that person was a fool (in more ways than one).
    • — The person didn't use e2e with encryption at rest
    • — They entered a military facility and used a secret (or higher) network that logs everything
    • — Their device didn't secure data and metadata appropriately
    • — They didn't configure their device correctly
    • — One of their too many apps le
