
Fraudsters Deepfake CEO's Voice To Trick Manager Into Transferring $243,000 (thenextweb.com) 39

An anonymous reader quotes a report from The Next Web: In March, criminals used commercially available voice-generating AI software to impersonate the boss of a German parent company that owns a UK-based energy firm. They then tricked the UK firm's chief executive into urgently wiring the funds to a Hungarian supplier within an hour, with assurances that the transfer would be reimbursed immediately. Hearing the familiar slight German accent and voice patterns of his boss, the CEO reportedly suspected nothing.

Not only was the money never reimbursed, but the fraudsters then posed as the German CEO again to request another urgent transfer. This time, however, the British CEO refused to make the payment. As it turns out, the funds transferred to Hungary were eventually moved on to Mexico and other locations. Authorities have yet to determine the culprits behind the operation. The firm was insured by Euler Hermes Group, which covered the entire cost of the payment. The names of the company and the parties involved were not disclosed.
According to The Wall Street Journal, which first reported the news, the voice fraud cost the company $243,000.
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by WoodstockJeff ( 568111 ) on Wednesday September 04, 2019 @09:01AM (#59156572) Homepage
    • Is /. deepfaking articles now?
      • by Megane ( 129182 )

        Oh no, not Deep Dupes!

        If this was ~50 years ago, they could probably make a great Monty Python skit from this. The scam, not the dupe. Just don't mention the war! (Yes, I know that's from Fawlty Towers.)

      • You'd think they could develop some AI algorithm to spot these dupes ahead of time. But it must be beyond all limits of current technology.

        • We are assuming it was a highly sophisticated technological attack using an AI-generated voice to deepfake the unsuspecting victim. But it was a real-time phone call used to social-engineer someone. In all probability it was a person on the other end who had spent a few weeks learning to mimic the voice of said CEO.
      • /. has had dupes for 20 years.

        The editors, and I use the term loosely, are a fucking joke. That's the running gag.

    • This is the second time in less than a week! Somebody do something!

    • It is Wednesday, my dupes. [insert wailing of 10,000 Budgett's frogs]
  • They did it every day the last week it seems, must be a new fashion.

    • Comment removed based on user account deletion
      • Re: (Score:2, Interesting)

        "DeepFaked "Whitewater" like scandal that never occurred, but where someone was politically ousted nonetheless."

        RussiaGate.

        "DeepFaked ID theft."

        They used to do that with rubber masks if you can believe old movies and TV shows.

        • Cue Mission Impossible theme. Even as a kid, I always thought that was over the top unbelievable, but the tech was cool.

          • by k6mfw ( 1182893 )

            Cue Mission Impossible theme. Even as a kid, I always thought that was over the top unbelievable, but the tech was cool.

            When I was a kid, I thought it was quite believable. Greg Morris had the coolest job with "The Phone Company" with all kinds of space age stuff. Looking back these days, yes over the top, but Morris played the role of the ultimate techie. Always cool and calm even in dire situations of near death or capture, and always able to make it all work. In many ways the IMF used deepfake tech and disguises to manipulate various governments around the world, topple their leaders. All in manners that they didn't know

      • Real "Whitewater" like scandal perpetrators claim it was deepfaked and nobody can prove otherwise, they stay in power.

        Someone does something bad and claims someone else stole their identity, and the cops can't prove otherwise to a court.

        Someone is being blackmailed over sexual content or, worse, a sexual crime; they claim it's fake, and everyone believes them.

      • Oh c'mon. In these times of Fake News? Quite the opposite is true! Now you can sleep around as you please and if you get caught on film, claim it's a deep fake and everyone believes you.

  • by the_skywise ( 189793 ) on Wednesday September 04, 2019 @09:37AM (#59156730)
    "Hello Smithers... You're very good at... turning... me... on"
  • First, the source article is behind a paywall and can't be readily reviewed. The editors never should have accepted this article based on a single pay-walled source.

    Second, how do we know that it was a deepfake? Seriously, nothing has been stated about anybody being caught or any software being found. Saying that this was a deepfake is nothing more than speculation. For all we know they simply practiced impersonating his voice until they got it to the point that they could use it to commit the fraud.

    Third,

    • Comment removed based on user account deletion
    • Third, Occam's razor says this doesn't make sense. Think about it: if you had software that could successfully replace someone's voice with another person's voice and accent, why would you bother using it for crime?

      You're assuming that the creator is the person or organization committing the theft. It's very possible, likely even, that someone else built the system and now makes money by renting it out. Very low risk in taking a cut of the proceeds, if the deal is transacted in a low-law-enforcement location.

  • AI deepfake or not (Score:5, Insightful)

    by Rosco P. Coltrane ( 209368 ) on Wednesday September 04, 2019 @09:56AM (#59156806)

    If the only proof the UK CEO needed to authorize a 243K payment was recognizing the voice of his boss, he deserves his pink slip. Falling for such an old social engineering trick is just pathetic when you're in charge.

    • The big problem here is that this indicates a lax attitude to compliance at both companies. If this sort of thing were completely out of the ordinary, the CEO could be expected to have been suspicious from the get-go. The fact that he was not indicates that both companies had poor procedures and policies and that this sort of nod-and-wink behaviour was the norm.

    • You obviously have never worked for a "do it or be fired" kind of boss who considers it a challenge to his authority if you so much as answer with a "but...".

      • No, but all my bosses have been working with a very explicit "fire me unjustly and get sued" employee (me). Here in Europe, unlike the US, employers take this seriously.

      • Anyone in a management position has learned to get a signed piece of paper authorizing the action as a CYA measure. So if things go sideways, they can simply show that piece of paper and escape responsibility, passing the blame onto someone else. Because at some point along your climb up the corporate ladder, probably several times, you'll encounter leeches who try to take credit for things they didn't do and shift blame onto you for things they did. I'm surprised this guy made it to CEO without learning this.
  • Comment removed based on user account deletion
    • by ranton ( 36917 )

      Or simply made sure certain executives cannot bypass proper controls around the management of money just because they have a fancy title.

    • I think that's highly optimistic. Executives (including the ones at insurance companies who ultimately foot the bill for this fraud) are never going to sign off on a system where some black box of IT nerdery can overrule a phone call from the CEO to the head of accounting.

      Higher insurance premiums will be "just a cost of doing business".
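
      For what it's worth, the control ranton describes above doesn't need a black box of IT nerdery; it's a plain policy check. A minimal sketch in Python, with every threshold and field name made up for illustration (nothing here comes from the article or from any real payment system):

      from dataclasses import dataclass
      from typing import Optional

      APPROVAL_THRESHOLD = 50_000   # hypothetical limit; larger transfers need dual control

      @dataclass
      class TransferRequest:
          requester: str                        # whoever asked, even "the CEO"
          amount: float
          beneficiary: str
          second_approver: Optional[str] = None
          callback_verified: bool = False       # confirmed via a known-good number, not the inbound call

      def may_execute(req: TransferRequest) -> bool:
          """Large transfers need a second approver who is not the requester,
          plus an out-of-band callback, no matter how senior the requester is."""
          if req.amount < APPROVAL_THRESHOLD:
              return True
          return (req.second_approver is not None
                  and req.second_approver != req.requester
                  and req.callback_verified)

      # The scenario from the summary would fail this check on its own:
      urgent = TransferRequest(requester="boss's voice on the phone",
                               amount=243_000, beneficiary="Hungarian supplier")
      assert not may_execute(urgent)

      A convincing voice then buys the attacker nothing by itself; they also have to defeat the callback and find a colluding second approver.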

    • This kind of impersonation has been happening in corporations for some time, but the method is new. At a former company, someone posing as the CEO emailed the head of HR wanting a copy of every single employee's W-2 in one large PDF within a few hours, for "data purposes." At first glance the email looked legitimate; however, it would have been extremely impractical to create a PDF with tens of thousands of pages in that short amount of time. Another aspect that caused suspicion was that the CEO was technical

      • by k6mfw ( 1182893 )

        but if HR hadn’t been more alert, they would have sent the PDF.

        All it takes is just one victim not being alert out of several attacks and the thief can get $243K.

  • Last year, Pindrop - a cybersecurity firm that designs anti-fraud voice software - reported a 350 percent jump in voice fraud from 2013 through 2017, with 1 in 638 calls reported to be synthetically created.

    Well, coincidentally, I sell condoms and I can report a 1,400 percent jump in STDs from 2013 through 2017, with 1 in 638 unprotected sexual acts resulting in an embarrassing visit to the family doctor.

    (Hint: you should wear condoms.)
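
    That 1-in-638 figure is also where base rates bite: even a detector that's right 99 percent of the time would flag mostly legitimate calls at that prevalence. A quick back-of-the-envelope check in Python (only the 1-in-638 prevalence comes from the quoted Pindrop figure; the accuracy numbers are assumed purely for illustration):

    prevalence = 1 / 638          # fraction of calls that are synthetic (quoted figure)
    sensitivity = 0.99            # hypothetical: share of fakes correctly flagged
    false_positive_rate = 0.01    # hypothetical: share of genuine calls wrongly flagged

    calls = 1_000_000
    fakes = calls * prevalence
    genuine = calls - fakes

    true_alarms = fakes * sensitivity
    false_alarms = genuine * false_positive_rate

    share_real = true_alarms / (true_alarms + false_alarms)
    print(f"{share_real:.0%} of alarms are actual synthetic calls")   # ~13% under these assumptions

    Under those assumed numbers only about one alarm in seven is a real synthetic call, which is why the prevalence matters at least as much as the detector.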

  • more more more!!!!!

    The keys to a quick change con (and most others) are "urgency" and distraction.

    https://en.wikipedia.org/wiki/... [wikipedia.org]

    MAYBE, just MAYBE, it's time to slow down just a little bit?

    "But OMG, my job!!!!!"

    I dunno... How much did this cost? I'd think it would be his/her job in any case, but what do I know?

  • A person's likeness should be automatically trademarked and wholly owned by that person. Using a person's likeness in any way without their explicit permission should be illegal.

    • Yeah. That's what's going to solve this. I mean, I was about to steal a quarter million, but making the way I planned to steal it illegal means I won't do it anymore. That might break the law!

    • There's so much wrong with this, but picking a few:

      - Would twins co-own their likeness?

      - If I happened to look like someone else, could they sue me for infringement? (Doubly important if that other person was someone rich or famous.)

      - A person's likeness changes over time. I definitely don't look like College Me. Do I continue to own all versions of me or just the current one?

      - Would public figures be able to sue comedians who impersonate them? For example, Alec Baldwin portrays Trump on SNL. Would Donald Trump be able to sue?

  • "At Vanguard, my voice is my password."

    Then Vanguard's software verifies that I am me. I guess I better hope that neither true voice synthesizers nor mimicry actors can fool Vanguard. I'm reasonably certain that voiceprint matching in software can be much more discerning than the human ear; whether a given algo meets that bar remains to be seen.
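
    Voiceprint systems like the one Vanguard advertises generally reduce each utterance to a fixed-length speaker embedding and compare it against an enrolled voiceprint; whether that beats a good synthesizer is the open question. A minimal sketch of just the comparison step, assuming some unspecified embedding model produces the vectors (Vanguard's actual pipeline isn't public, so this is purely illustrative):

    import numpy as np

    def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
        """Similarity between two fixed-length voice embeddings."""
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def verify_speaker(enrolled: np.ndarray, candidate: np.ndarray,
                       threshold: float = 0.75) -> bool:
        """Accept the caller only if their embedding sits close enough to the
        enrolled voiceprint; the threshold trades false accepts against false
        rejects and would be tuned on real data, not guessed like this one."""
        return cosine_similarity(enrolled, candidate) >= threshold

    # Toy check with random vectors standing in for real embeddings:
    rng = np.random.default_rng(0)
    enrolled = rng.normal(size=256)
    caller = enrolled + rng.normal(scale=0.1, size=256)    # a slightly noisy "same speaker"
    print(verify_speaker(enrolled, caller))                 # True

    The hard part is the embedding model itself, and that is exactly what a deepfake generator is trained to fool.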

"Ninety percent of baseball is half mental." -- Yogi Berra

Working...