How an AI scam phone call has had a lasting effect on a family.

After explicit faked photos of Taylor Swift went around the world, US politicians have called for new laws to criminalise the creation of deepfake images. The term "deepfake" describes how artificial intelligence (AI) can be used to digitally alter pictures, audio or video and trick us into seeing or hearing something that is not real.

It is not just the famous who are being targeted. Host James Reynolds hears the story of how a daughter's voice was copied and used to make a scam phone call to her mother.

"She said, 'Mom, I messed up,' and all of a sudden a man said, 'Put your head back and lay down,' and that's when I started to get really concerned that she was either really hurt or something more was going on," Jennifer tells us. "And then she goes, 'Mom, mom, these bad men have me, help me, help me,' and she starts crying and sobbing."

Thankfully her daughter, Brianna, had not been kidnapped, but the call has had a lasting effect on the family.

Technology has long made it easier to adjust existing images, but artificial intelligence provides the means to create completely fake content from scratch. We bring together two women, one in the US and one in Australia, whose faces have been manipulated using AI to produce malicious pornographic images and videos.

A Boffin Media production in partnership with the OS team.

(Photo: Noelle Martin. Credit: Noelle Martin)

Available now

23 minutes

Last on

Sun 11 Feb 2024 12:06 GMT

Broadcasts

  • Fri 9 Feb 2024 20:06 GMT
  • Fri 9 Feb 2024 21:06 GMT
  • Sat 10 Feb 2024 09:06 GMT
  • Sat 10 Feb 2024 16:06 GMT
  • Sat 10 Feb 2024 19:06 GMT
  • Sun 11 Feb 2024 00:06 GMT
  • Sun 11 Feb 2024 12:06 GMT