Perilous Tech

Occasional thoughts on technology and social issues

Chatbots have become the Bitcoin of 2023, but unlike the cryptocurrency craze, this time everyone seems to be on board, regardless of professional background or expertise. While we can debate the technical merits, capabilities, and future of these tools ad infinitum, one thing is for sure: everything is getting an assistant. In the mad rush to chatify products, differentiate, and get a return on investment, there are quite a few non-technical issues people just aren’t considering. One of them is the impact of the inevitable outsourcing of emotional connection to simulations.

Chatification of Communication

Here’s a quick question. Should we create a dystopian future where humans communicate with each other only through a layer of bots, where all emotional connection is simulated? Because this scenario is exactly what’s in our near-term future.

Regardless of the arguable capabilities of these tools, one thing is undeniable: they are good writing assistants. Since this is what they excel at, they will be plastered across all kinds of products we use to communicate, such as word processors, email clients, messaging systems, and social media. We are headed for the near-term chatification of many products.

On the surface, this may not seem like a bad thing. Why wouldn’t you want help writing a cover letter for your resume or drafting repetitive emails for work? It seems like a win-win, but I’d argue it’s a trap. Lurking beneath this seemingly costless efficiency gain is a minefield we casually stroll across.

In his song “The Boss,” James Brown repeats the line, “Paid the cost to be the boss.” At least James Brown knew there were costs to be paid. With frictionless access to these tools integrated into our communication apps, we may not even realize we’re paying a cost.

Emotional Outsourcing

I did an experiment. I imagined one of the most terrible situations a person can be in: a family member losing a child. So, of course, I asked ChatGPT to draft a sympathy letter. The result is below.

Prompt: Write a heartfelt email that expresses condolences to a family member on the loss of their child.
Response:
Subject: Our Deepest Condolences for the Loss of Your Precious Child

Dear [Family Member's Name],

I hope this message finds you in the gentle embrace of love and support during this incredibly difficult time. It is with a heavy heart that I write to offer my sincerest condolences for the unimaginable loss of your beloved child. There are no words that can truly express the depth of sorrow I feel for you and your family.

Your child was a beautiful soul who brought light and joy to everyone they encountered. The time we spent together, though brief, left an indelible mark on my heart. I will always cherish the memories of their laughter, their kindness, and their unwavering spirit. Your child was truly a gift to us all, and their absence leaves a void that can never be filled.

As you navigate the turbulent sea of grief, please know that you are not alone. Our entire family stands beside you, offering our love, support, and strength. We may be separated by miles, but our hearts are united in the shared pain of losing someone so special. If you need a shoulder to cry on, someone to talk to, or help with any practical matters, please don't hesitate to reach out. We are here for you, now and always.

In the days, weeks, and months to come, I encourage you to lean on the love and compassion of those around you. Share stories of your child, reminisce about their life, and celebrate their memory. Grieving is a deeply personal journey, and while I cannot presume to understand the depths of your pain, I hope that you find solace in knowing that your child's life touched so many others in profound and meaningful ways.

Please accept my deepest condolences once again, and know that my thoughts and prayers are with you and your family during this heart-wrenching time. May your cherished memories provide you with comfort, and may your child's spirit guide you through the darkness and into the light of healing.

With all my love and sympathy,

[Your Name]

If you skimmed this letter and said, “Wow, ChatGPT did a really good job,” then you missed the point entirely, but you landed where most people would, including the people who would use the tool for this task. Read it again, then think about the situation. Think about being the recipient of this message.

There are many obvious problems with this letter, starting with the wording itself. It uses the right words, but it’s completely over-baked and arguably too long. The message has no emotional connection despite being full of emotional language. It feels impersonal, like the templated rejection letter you get after applying for a job. A family member, hell, any human, would share some level of grief as well. As they wrote the letter, they’d be putting themselves in the recipient’s shoes, and that feeling would come through in the letter.

But whether tools like ChatGPT do a good job or not isn’t the issue. The issue is that we treated the person on the other side as just another task to shuffle off a pile, not the mourning family member they are.
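To make it concrete just how little effort is involved, the experiment above amounts to a single API call. Here is a minimal sketch using OpenAI’s Python client; the model name and setup are assumptions, and any chat model will produce a similarly over-baked letter.

```python
# A minimal sketch of the sympathy-letter experiment, using OpenAI's
# Python client (openai v1.x). The model name is an assumption; swap in
# whatever chat model is available. Requires OPENAI_API_KEY to be set.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # assumed model
    messages=[
        {
            "role": "user",
            "content": (
                "Write a heartfelt email that expresses condolences "
                "to a family member on the loss of their child."
            ),
        }
    ],
)

# The "heartfelt" condolences, generated in seconds by something that
# feels nothing.
print(response.choices[0].message.content)
```

That is the entire emotional investment: a few seconds of compute, and a grieving relative becomes a checked-off task.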

The Larger Issue

There’s an old saying, “It’s the thought that counts.” There’s a lot of truth to this, but the “thought” is exactly what we are outsourcing here. A sympathy letter isn’t about the words you use; it’s about letting someone know you are thinking about them.

Why would anyone read your sympathy letter if they knew an AI wrote all or part of it? I know I wouldn’t. My response would be, “You narcissistic asshole, you couldn’t even be bothered to take a couple of minutes out of your day to think about me and the tragedy that befell my family.” LLMs aren’t sorry. They don’t feel bad; they don’t feel anything.

You may think the sympathy letter was an extreme example, but I don’t think it is. Recall that last month, Vanderbilt University had to issue an apology after using ChatGPT to draft an email about a shooting at another school. At the time, I wasn’t sure whether I had a problem with it and figured I’d have to give it some thought. I’ve thought about it, and I have a problem with it. Even though the shooting didn’t happen at Vanderbilt and it was a one-to-many communication, the email simulated human emotion and, in effect, was trying to manipulate humans. Worse, if Vanderbilt hadn’t disclosed that it used ChatGPT to write the message, the fact probably wouldn’t have been noticed. That teaches exactly the wrong lesson: people learn not to reveal that they used an assistant.

Even in more mundane, less emotional communication with humans, there are still issues. We are headed for a near-term future where we treat humans like apps or API calls, with communication as just another task to check off a list. What does it say about us, and where we are headed, that we are so wrapped up in ourselves we can’t spare a few minutes of our day to think about others? It’s not a good place.

Some Tasks SHOULD Have Friction

Not every part of human life should be a target for efficiency gains or friction reduction. I’m not sure when the appification of humans started, but I first recognized it with Uber, where riders can effectively turn the driver “off” by requesting that they not talk, treating a human being like a self-driving car.

We started treating people differently when we communicated with them online, via social networks, than we did in person. That abstracted communication gave us license to dehumanize them, justifying our actions in our heads.

Will this lead to writing-assistant wars, where people’s bots battle it out on social media? It’s always hard to tell with these things. The fact is, we really don’t know what the impact will be.

Some things aren’t meant to be frictionless or don’t need an efficiency boost. Some tasks aren’t meant to be painless, but remember, it’s not about your pain. It’s about other people. Friction in human communication forces you to think, consider, compromise, and adjust.

The Impact

It’s early, and it’s always hard to predict how these things will play out because real life is far more complex than we give it credit for. From a psychological perspective, these tools will further accelerate our dehumanization of others, but there are practical issues as well.

We rarely type without thinking. Even in mundane replies to co-workers, I’m still thinking as I write, and sometimes I come up with new ideas and new solutions on the spot. One issue that seems to get lost in the LLM debates is writing itself. Writing isn’t just an act of communication, it’s an act of discovery. It’s one of the main reasons I wouldn’t use ChatGPT to write blog posts, books, works of fiction, or a whole host of other things. Even if ChatGPT made me a better writer (something I highly doubt), it would make me a worse thinker, and that is not a good tradeoff.

How do we learn to cooperate with others, consider their positions, compromise, build consensus, and do all of the other things we do as humans if we are letting our writing assistants battle it out? Who is the one being convinced?

As we outsource more communication and emotional connection to intermediary assistants, we are in for more miscommunication and less understanding, consideration, and compromise; the list goes on. Which raises the question: is it really reducing friction after all?

Conclusion

As humans, we need to decide whether we want technology to manipulate us. I’m firmly in the camp that doesn’t. Believe it or not, this is an unsettled issue that isn’t getting enough attention. But like it or not, it is happening, and there’s not much we can do about it. It’s one thing to say don’t use these tools, but they may become so tightly integrated that it’s almost impossible not to use them. We need to go out of our way to think about the people on the other side of our communication, even if we don’t like them.
