Mother terrified by AI voice of her 15-year-old daughter used to stage fake kidnapping

    alf9872000


This is what happens when bad people use AI

AI tools such as ChatGPT and the myriad voice changers out there can be useful and even fun to play around with, but stories like this highlight the dangers of the technology. One Arizona mother was terrified into believing that her daughter had been kidnapped and was being held for ransom.

As reported by WKYT, Jennifer DeStefano was subjected to more than just a prank call. She received a call seemingly from her 15-year-old daughter, who was out of town on a ski trip.

“Mom, I messed up,” it began, followed by an unknown man saying, “Put your head back, lie down.”

It was every parent’s worst nightmare: the man went on to claim that he had Jennifer’s daughter hostage and made several horrific threats, saying that unless she paid him $1 million she would not see her daughter again. The whole time, the girl could be heard “going, ‘Help me, Mom. Please help me. Help me,’ and bawling.”

Thankfully, after calling her husband, Jennifer was able to confirm that her daughter was safe, sound, and completely unaware of what was going on. It had all been a horrible trick: the criminals had used AI to imitate her daughter’s voice exactly.

“It was completely her voice. It was her inflection. It was the way she would have cried,” she said. “I never doubted for one second it was her. That’s the freaky part that really got me to my core.”

This situation is horrible, but sadly not an isolated incident. Even high-profile figures have had words put into their mouths. Drake and The Weeknd were targeted when a fake song was released in their voices.

More unforgivably, seven-time Formula One World Champion Michael Schumacher, who suffered a tragic ski accident in 2013, appeared to give an exclusive interview to a German magazine earlier this month, sharing heartbreaking details about his family life. What a scoop, until it was revealed that it had all been faked, using an AI trained to produce quotes that sounded like him. Schumacher’s family is now taking the magazine to court.

    How can we avoid being faked by AI? 

Unfortunately, we now live in a world where our likeness and even short clips of our voice can be used against us. Public figures will likely struggle to escape such scams, but regular citizens can take a few measures to protect themselves.

Setting social media accounts to private is a great way to ensure that only people you trust see and hear your posts. Videos on Instagram, Facebook, and the like not only reveal what you look like; scammers need just a few seconds of audio to spoof your voice.

If you find yourself talking online or on the phone to someone you are wary of, ask them personal questions that the real friend or family member would know, and never send money to, or click on links from, untrusted sources.

    Source

