“I’m the best man in my friend’s wedding this summer, and I’m dreading the speech. I have absolutely no idea what to say. Should I get an AI to help me? Or would that make me the worst man?”
—Lost for Words
Dear Lost,
You’re certainly not alone in realizing that some onerous creative or emotive task can be completed relatively painlessly with AI. The same thought has undoubtedly occurred to the tongue-tied Tinder user who discovers that he can enlist a digital Cyrano to pen his opening lines to a prospective date; or to the exhausted mother who recognizes that she has at her fingertips a tireless Scheherazade that can produce an infinite scroll of bedtime stories for her children; or to the overworked son who realizes that he can generate, in seconds, a personalized poem for his father’s retirement party.
Creatively expressing our feelings to others is time-consuming, uncompensated, and emotionally taxing—that is, at any rate, the message implicit in some of the marketing of large language models. When Microsoft, for instance, introduced its AI Copilot products in March, it imagined a mother using the software to generate a speech for her daughter’s high school graduation.
There are multiple ways you might use an LLM to produce a moving toast, ranging from the least intrusive (asking ChatGPT for writing tips or a quick proofread) to the most intrusive (generating a draft of the speech, which you can then customize). New sites like ToastWiz have built tools on top of GPT-4 that allow you to plug in “your stories and feelings” and generate three unique outputs for $30. Meanwhile, wedding-planning apps like Joy have incorporated AI that promises to help users with their “toughest wedding-related wordage.” The feature can produce toasts, or even vows, in the style of Shakespeare or Rumi, and aims to help users “bring their emotions on to paper in fun and creative ways.”
These aren’t the first commercial products that have promised to offshore the difficult work of human expression—or what is increasingly called “emotional labor.” Long before the recent AI boom, people turned to human ghostwriters to pen wedding speeches. (“Toast whisperers,” as The New York Times noted in 2015, were an under-the-table service that many clients were too embarrassed to admit paying for.) And I imagine that you, like many people, have for years sent greeting cards that leverage the words of a professional writer to articulate what are allegedly your own thoughts and emotions. This practice, of course, was not without its critics. Hallmark’s very first slogan, introduced in 1944, was “When you care enough to send the very best,” a linguistic sleight of hand that inverted the most common critique of commercial greeting cards—that relying on the words of professionals was, in fact, evidence that you did not care enough to speak from your heart.
Such products have long approached what sociologist Arlie Russell Hochschild calls the “commodity frontier”—the threshold of activities we deem “too personal to pay for.” It’s a perimeter that exists even when the products we enlist are (for the moment) free, and the arrival of new technologies calls for its constant renegotiation. In the case of AI, there have already been some breaches of this still-hazy border. When Vanderbilt University enlisted ChatGPT to generate an email offering condolences to the victims of the mass shooting at Michigan State, the school was criticized for using automated tools for a gesture that demanded, as one student put it, “genuine, human empathy, not a robot.”
Writing a wedding speech would seem to require similar emotional engagement. But perhaps you have reasoned that intent and selection—“It’s the thought!”—are what matter in these situations. You are, after all, the one providing the model with the essential, albeit rough, emotive ingredients to produce the finished product. In conversations about AI-generated text, the prompt is often spoken of as the logos, the spiritual breath of human authenticity that animates the synthetic output (dismissed as so much mechanical “wordage”) with life and meaning. Just as the computer was, for Steve Jobs, a “bicycle for the mind,” so language-generation tools might be regarded as the vehicle that transports the spirit of our emotions from their point of origin to a desired destination.
But I’m not sure it’s so easy to separate intent from expression, or emotions from behavior. Some psychological experiments have demonstrated that it’s our words and actions that allow us to experience emotions, not the other way around—like the famous example of how forcing oneself to smile can induce a feeling of happiness. It’s possible that expression, including linguistic expression, is not a mere afterthought in our emotional lives, but the whole point. If that’s true, then the decision to outsource your speechwriting might contribute to a kind of emotional atrophy, a gradual loss of the ability to truly inhabit your internal states—or modulate them. A podcaster recently boasted that a friend of his who struggles with anger management uses AI “tone filters” when communicating with people who provoke his temper, feeding rageful rants into ChatGPT and asking the model to rewrite them “in a nicer way.”