Kids, Courtesy and AI

June 27, 2019

By Lisa Brandt (Voice Artist & Blogger at Witlingo)


A friend of mine lost her cool on Siri the other day. I was driving, so I wasn’t about to start fussing around with my phone, and my pal offered to get directions as we got closer to an unfamiliar destination. Siri was having none of it. She kept picking different locations and even different states than the one my friend was asking for. I thought it was funny, but my friend felt a responsibility to get us to our meeting on time, and she ended up calling Siri a few choice names.

Siri, of course, didn’t react to the name-calling. She simply kept on bringing us results we didn’t want. Siri might be AI, and not human, but she can clearly have an off day, too.

Later, as we told the story, my friend was embarrassed by how she talked to Siri in her moment of frustration. Siri, Cortana, Bixby, Alexa, Google Assistant – they’re not real and they don’t have feelings to hurt. But should we be so blasé about taking our frustrations out on them?

This conversation mainly centers on the behavior of children, the generation that’s growing up never knowing a world without AI. We are all aware of bullying and the sometimes-devastating effects the behavior of some children can have on others, but is it okay for a child to bully AI? Is allowing a child to treat a non-human entity with disrespect, or even impolitely, just asking for trouble when it comes to how they deal with frustrations with flesh and blood people?

AI technologist Nisha Talagala writes in Forbes that as young children get used to interacting with AI via smart speakers, they may misinterpret who’s in charge. For example, parents set controls on the Amazon Echo or Google Home, but the child might think the speaker itself is to blame for blocking something they want to see or hear. The kid wouldn’t dare give Mom or Dad the grief that Alexa or Cortana might receive.

Dr. David Hill, chair of the American Academy of Pediatrics Council on Communications and Media, tells The Wall Street Journal that to a child under 4, Alexa could be a tiny woman living in a small machine! Kids that young simply can’t distinguish fantasy from reality. That’s how Santa Claus and the Tooth Fairy get a grip on their developing imaginations.

So, if the child thinks there is a real person in the speaker, and the child barks orders at that “person”, do we have a problem?

Dr. Hill suggests to the WSJ that we might. “If they practice rudeness at home with something they perceive to be their servant, then what is to keep them from being equally rude to the cleaning staff at school?”

Tech writer Mike Elgan argues that we should not teach children to be polite to AI. He is opposed to sending the message that Siri, Bixby and the rest have feelings like people do, or that they’re fit to judge human behavior. He points to Mattel’s short-lived AI Aristotle, which refused to respond unless it was spoken to with “please” and “thank you”. He compares typing a search term without any niceties to saying the same request aloud to a smart speaker, claiming there’s no meaningful difference between the two.

As someone who has made most of her living with her voice, I can promise you that spoken communication becomes much more of a habit than written communication ever does. We automatically write differently to Grandma than we do when crafting a business letter. But we’ve all known that person who peppers their everyday conversation with F-bombs and then forgets to turn it off at a funeral. Switching registers in speech is much harder. Speech is a part of our personality, not something we tailor to each situation. Further evidence is the colleague who, like, adds the word like to, like, everything he or she says.

Amazon and Google have both released optional politeness features. Google’s Pretty Please allows Google Assistant to respond positively to please and thank you, and even ask for the “magic word” if it’s forgotten. When enabled, Amazon’s equivalent feature will have Alexa praise a child for talking politely to the Echo Dot Kids Edition.

Children are smart, and they also like to test limits. If they don’t have to ask politely for Alexa to react, where else will that behavior be deemed okay? In the way they talk to the family dog? A relative they see once a year and feel no connection to? The disembodied voice at the fast-food drive-thru, which operates a lot like their smart speaker? Where does it end?

Writing for Expert Parenting, Amy McCready offers tips for teaching politeness and good manners to toddlers. AI isn’t mentioned, of course, but vigilance is, and so is consistency. We can’t expect our kids to turn politeness on and off just because they might experience short-term confusion over whether Siri is a real woman. Please and thank you make little people more pleasant to be around. If she could, I’m sure Alexa would agree.
