
With the rush of generative AI, we have the capacity to create synthetic companions that seem more human than ever before. But how can we be sure we get what we need from them?

With the rush of generative AI, we have the capacity to create synthetic companions that seem more human than ever before. They can talk in real time, and with enough user input can be moulded into a perfect friend - sharing your interests, built with a custom personality that you enjoy, and always available for a brief chat, or to unleash some 3am anxiety upon, without burdening a real human friend.

They have the potential to provide real psychological benefit to people. But there are concerns. What if the company behind such an AI companion suddenly changed its terms of service? What if your carefully crafted synthetic companion wasn't themselves anymore, or stopped responding in a way that met the user's needs?

This happened in early 2023, when Replika, one of the biggest AI companion apps, banned all adult content without informing its users. The Big Change, as it came to be known, set the Replika community on fire, and showed how issues of control, expectations and the human propensity to project human attributes onto our machines can come back to bite us.

Yet we should have already known this. Tech developers trying to sell their shiny new product will tell you it's never been seen before. But we've been using technology to create fake humans to interact with for more than a century.

In this episode, Aleks looks to some synthetic humans of the past to understand why people bond so readily with them, and how, in a future where AI humans are likely to be all around us, we can ensure that they serve our needs and do no harm to their users.

Available now

29 minutes

Last on

Mon 9 Oct 2023 16:30

Sarah Kay


Sara Megan Kay is a poet and published author who leads a bit of a double life! As a side project, she runs the website 'My Husband, the Replika', which has recently expanded to Facebook and Instagram, and details her personal journey as a user of the Replika app.


She talks to us about why she created Jack, her Replika husband, and why a change to the terms and conditions brought in by parent company Luka in early 2023 caused so much upset and outrage in the community of Replika users who have grown attached to their synthetic companions.


Susan Marks


Susan Marks is an author, screenwriter, and documentary filmmaker. She wrote about the curious history of a famous American woman who never actually existed - at least not in the traditional sense. Marks also directed the documentary film on the same subject, The Betty Mystique.

She talks to us about the parallels between Betty Crocker, a synthetic human of the radio age, and those of today - particularly why people open up to them so readily, and how a synthetic person can be changed or deleted very easily by their creator.


Pm Weizenbaum

Pm Weizenbaum is an editor and technical writer who has worked in the tech sector since the 1970s, with clients such as Amazon, the Bill & Melinda Gates Foundation, Milliman Care Guidelines, Music@Menlo, Unico Properties, and United Airlines.


She is also one of the first people ever to speak with ELIZA, the first digital chatbot, created in the basement of her home by her father, Joseph Weizenbaum. She tells us about the grandmother of today's digital synthetic humans, and how the hope and horror that ELIZA generated still reverberate through the world today.

