Should objects have feelings?
- Sara Gomez

- Oct 18, 2018
- 2 min read
A reflection of "Over the Shoulder" video exercise
For our Creativity and Design course we recently created a video that interpreted the possible emotions of a bag during the first day of school. Exploring this concept led me to consider how humans treat inanimate objects, and the complex relationships that could arise if these objects did indeed experience emotions.
I first thought of the clock and candlestick from Beauty and the Beast and the excitement, frustration and anger they experience throughout their elaborate scheme to become human again...
But more relevant to today's technological world, I thought of the Amazon Echo (Alexa), or any other voice-enabled device.
In the past, when requesting something from Alexa, I have felt compelled to be "nice" to her. I don't always say "thank you" or "please," but I do make requests in a polite and calm voice. Additionally, I admit to feeling uncomfortable when friends speak to Alexa in a harsh or demanding manner.
Is it strange that I feel uneasy when people are rude to Alexa? Is it valid for me to expect my family and friends to be kind to her?
I decided to explore this topic, and came across an interesting article on Fast Company, "The case against teaching kids to be polite to Alexa."
According to the article, parents are mortified when they witness their child speaking rudely to Alexa. Parents want their children to learn manners, and fear that what they perceive as disrespect toward Alexa could carry over into real-world conversations.
To address this fear, companies that make voice-enabled devices are creating politeness features that teach kids to say please and thank you when interacting with these objects.
Although at first glance this may seem like a great etiquette idea, it's important to consider the implications of teaching young children to be polite to inanimate objects.
If we teach kids to be polite to voice-enabled devices from a very young age, they may grow up believing that the devices actually have real feelings that need to be respected. Even once they are old enough to know the difference, how well will they actually be able to separate human feelings from AI assistant programming?
What are the potential consequences when a machine does not agree with the instructions given by a child? Will the child feel compelled to respect the wishes of the device?
Extending human social norms to technology products will expose the unwanted side effects of using voice-enabled devices. These effects will be most pronounced in young children, who are still in the early stages of navigating, and making sense of, the world.
Future generations need to be taught the sharp difference between real emotions + human connection, and AI programming + technology-enabled emotions.
I highly recommend reading the article! It was thought-provoking and insightful.
As for me, I no longer feel guilty being direct and, if needed, harsh with voice-enabled devices :)

Key Quote from Article:
"Being polite to a piece of technology may suggest that AI assistants are capable of feeling appreciated or unappreciated; that machines have rights; and that one of these rights is the right to refuse our requests."
