“Thank you so much, but can you please make it a bit funnier?” I typed into the prompt field of the AI program I was using to help me draft an email.

The text the program was spitting out was a bit dry, and I wanted the artificial intelligence to liven it up a bit. But I didn’t know how to offer my feedback without hurting the AI’s feelings. I, for one, would be simply devastated if someone said to me, “Could you please try and be a bit funnier?” So I hoped the “Thank you so much” and “please” would help soften the blow to its ego.

Logically, I know that OpenAI’s ChatGPT and all other artificial intelligence programs are powered by algorithms, not humans with feelings that can be hurt. But emotionally, which is the more dominant control center of my brain, I’ll never not believe that AI is really powered by a human sitting alone in a room at a desk, fielding my AI prompts and Google searches, then typing out responses to my questions about mysterious medical symptoms all day long.

It’s why I use complete sentences and strive to be polite in my queries. I don’t want the person sitting at the desk to judge me, or at least to judge me more harshly than they already do for asking “Why does my stomach hurt?” at least once a day.

I know that this is silly. Computers are machines run by code. And even though researchers have found that being polite in prompts to chatbots makes for better and more factual results, that’s because the chatbots are modeled after human interactions. Not because they have feelings. Or at least they haven’t had feelings at any point from the dawn of the internet to the present day.

But now, it sure seems like we’re careening toward AI sentience at breakneck speed. Before long, AI, with its access to every bit of recorded knowledge that exists in the digital plane, will become smarter than any one of us. By a lot. And then it becomes a matter of when, not if, the world comes to be commandeered by robots. Unless, of course, Scarlett Johansson manages to sue OpenAI into oblivion for allegedly imitating her voice with the new ChatGPT voice assistant Sky. Even if she does, though, a new and maybe even more advanced product will replace it. These things are like Whack-a-Mole, and there’s really no stopping the train now that it’s speeding down the track.

Which I should probably be more upset and panicked about. But honestly, having a machine do all my thinking for me sounds great.

I have, however, seen enough science fiction movies to know that it’s more likely than not that the intelligent robots will eventually turn on humanity. As much as I’d like to believe the sentient robots will be our “Wall-E”-like companions, I think we need to be prepared for more of a “2001: A Space Odyssey” situation. And I don’t think we can rule out the possibility that all the code leading up to this point will become core memories for our robot overlords.

Then who’s going to feel silly for using complete sentences and saying “please” and “thank you” to an algorithm?

My AI program was very obliging when I requested it add some humor to the text it was creating. It even came up with a couple of almost-funny puns like, “I’ve got some ‘news-worthy’ story ideas that I think will really ‘headline’ our editorial objectives and resonate with our audience. Can we ‘press’ forward with this? I’m ‘reporting’ in with excitement for our discussion.”

Although those made me groan, I tried to show as much gratitude as possible, responding with “This is great, thank you so much!”

But I’m worried it won’t be enough to persuade the AI to show me mercy when the time comes. The path to the robot apocalypse may very well be paved with polite prompts and probably one too many requests for a bit more humor.

So I thought I’d better ask how it felt about my request.

“I’m sorry I asked you to be funnier yesterday,” I typed.

In response it wrote, “No need to apologize! As an AI, I don’t have feelings or emotions, so I didn’t take any offense. Your request was an important part of refining your work, and I’m here to assist in any way that benefits you. If you have any more questions or need further help, feel free to ask.”

Honestly, that’s exactly how I’d respond if someone had stomped on my feelings and I was plotting my revenge.

Next time I’ll just try to be funny on my own.
