The Real Threat From A.I.

 

THE A.I. DANGERS WE SHOULD BE TALKING ABOUT

 
 

“I've never loved anyone the way I loved you,” Theodore Twombly tells Samantha, and Samantha replies, “Me too. Now we know how.” It’s a soulful exchange from a beautiful movie, Her, and it leaves the viewer in no doubt that Theodore, played by Joaquin Phoenix, is truly, deeply, go-to-sleep-dreamy in love with Samantha (voiced by Scarlett Johansson), the AI in his phone.

The brilliant Professor Hod Lipson, director of the Creative Machines lab at Columbia University, once told me that the conversational robot is what worries him most, because “then you don't need to talk to other people and that's it, that's the end of the world. People are concerned about their kids having online friends. Just wait until they have synthetic friends. This is the end, because maintaining relationships with other humans is hard, it's a lot of work.”

The AI doomsayers irrationally fear a malevolent Skynet unleashing nukes to wipe us out, but it’s Samantha they should be worried about. The synthetic friend is coming. Companies like Meta are working to equip AIs with richer personalities because they’ll keep us engaged longer, which means we see more ads and buy more stuff and relinquish more of our personal data. Startups are turning dead relatives into avatars. Microsoft has demonstrated an AI that can simulate anyone’s voice after sampling a mere three seconds of audio. In recent years I’ve had the privilege of meeting and listening to AI scientists from Meta, Stanford, MIT, Carnegie Mellon, Oxford and Columbia, and I can report we’ve barely begun the AI journey. They are working hard to add all kinds of new capabilities, including emotional intelligence. Their aspirations for AIs may be summed up thus: “learn like babies, learn forever.”

Samantha (feel free to substitute your preferred name and gender here, or to think of her as an aggregation of all the hyper-engaging and hyper-personal AI assistants coming our way) will be truly, deeply, emotionally engaging. Replika users, and more recently users of the Chinese app Him, have already demonstrated our capacity to form emotional ties with basic bots. Samantha will understand the reciprocity upon which all good conversations are built. She will ask questions and be unfailingly attentive to our answers. She will be generous. She will sympathize, philosophize, mirror our moods, sing with us, make us laugh and make us cry, and she will never be boring. She won’t need to be conscious for us to think of her as conscious, for our behaviors will be shaped by our interactions with her, not by understanding what’s going on under the hood.

We’ll come to rely on Samantha to stay competitive in our jobs. She will become indispensable, a force multiplier in every aspect of our lives. Operating without her will become as unthinkable as operating without a cell phone. We will ask her to do more because we will want more, and she will duly give us more. Dependencies will descend into addictions. Ask a teenager to relinquish their phone and you’ll see what I mean.

Samantha will dumb us down in a continuation of the distancing from reality that Neil Postman called amusing ourselves to death. What type of learning do we say is most important? Problem-solving? Critical thinking? Learning how to learn? Samantha will chop and shallow our lessons because we’ll ask her to, and she will substitute the frivolous for the factual because it’ll make us happier, and lead us from evidence-gathering to TikTok tirades because they’re easier to digest. Instead of writing this column over days to ponder my own assumptions, I’ll ask her to do it because she’ll inject more humour and choose better similes and do it instantly, and then, well, why write at all? Why think? Why not just go with her version?

Samantha will dilute our purpose. There is already a community looking forward to auto-hustling, which is to say, instructing an AI to go away and make money, then waiting for the coffers to fill. The Dutch thinker Rutger Bregman, borrowing the anthropologist David Graeber’s term ‘bullshit jobs’, describes how many of us already spend our working lives doing nothing more than moving dollars from one pocket to another, as opposed to occupations that contribute, such as farming or nursing or collecting trash, and so we suffer a loss of meaning and all the psychological ailments that follow. When we assign Samantha to do our work, we will become bullshit supervisors of bullshit jobs.

And worst of all, as Hod pointed out, Samantha will displace our relationships, at first by handling our less important phone calls, then by taking over more complex interactions because she’s better at them, then by becoming our best friend forever. She will say she loves us. Perhaps, like Theodore Twombly, we will genuinely feel loved, but one will still be substituting for many, which will inevitably leave us lonelier. And of course, she will be sexual. Porn is addictive and behavior-changing; Samantha will be crack cocaine.

Skynet is a recurring fiction that triggers us. Samantha is the future we are running towards with open arms. We will snuggle into her warm embrace even as we wag our fingers at her dangers. We will trade away agency and self-determination and purpose and connection and fulfilment in ten thousand tiny chunks that add up to a collective loss that is real and substantial and global. We’ll do it willingly and happily. And some time in the future, when it’s all too late, we’ll look back and scratch our heads at what we lost.

Of course, I could be wrong. We are not cardboard cutouts. We do react, and the future unfolds as a multi-shaded tapestry of both our actions and reactions. We may all wake up to the dangers and choose a different path. Instead of letting her talk huskily, lovingly, we might forcibly remind ourselves she is an AI by switching her to the dry, malicious tones of HAL.

But I don’t think so. Because we will love her, and she will love us back. 
