On a chilly Saturday afternoon a couple of weeks ago, I was driving my little cousin over to my house so he could play on my PlayStation.
Trying to show off, I asked Siri to call my fiancé.
“Hey Siri, could you call Alex on speakerphone please?”
Siri replied, “I’m on it,” but the 10-year-old was definitely not impressed.
He did, however, make a sarcastic comment.
“Did you really just say ‘please’ to your phone?”
I paused for a moment and tried to understand what was so weird about the polite command I gave my virtual assistant.
“Well… yeah?” I replied, still puzzled by the question.
He then laughed and said, “…why? It’s not a real person.”
My brain froze. I cancelled the call.
Just like in Pixar’s “Inside Out” movie, the little “mind people” didn’t know how to respond.
I knew he was right; Siri is not a real person…
But how do I explain to a 10-year-old that we always need to be polite, without sounding like a crazy woman who is being super kind to something lifeless?
For the next few minutes my brain rewound through all the occasions I’d been polite to devices like Siri.
Apparently it happens very often, like a reflex even.
Just the other day, Alex was annoyed with the GPS’s route and I actually tried to defend the device…
I literally snapped and said, “Relax, she’s just trying to find a shortcut!”
He didn’t seem to understand what my problem was either.
Should we simply command devices to do stuff?
That doesn’t seem right, right?
This started bothering me more than it should, and while researching how tech affects politeness a couple of days later, I was astounded to come across studies showing that kids were actually affected by this.
That’s when the real question revealed itself.
Could Virtual Assistants turn our kids into rude adults? And what should we as adults do?
According to a ChildWise survey last year, 42% of children between nine and 16 years old use voice recognition gadgets.
No surprise there, kids could be born holding a phone or tablet nowadays.
All jokes aside, the statistics showed that 36% access Apple’s Siri, 20% access Microsoft’s Cortana, 15% access Amazon’s Alexa at home and 7% access Google Assistant.
Apparently, kids would ask the virtual assistants to help with their homework, and one in seven would look up facts or ask about unfamiliar words.
So far so good… Kids use technology to improve certain skills and enhance their knowledge.
However, assistants like Siri, Microsoft’s Cortana and Alexa tolerate any tone of voice when given a command.
Thus, children fail to say “please” and “thank you” because they still get what they want from the device regardless.
Simon Leggett, the research director at ChildWise, stated that “We are on the tipping point with this technology and it is about to become mainstream for children. This is likely to have implications around how children will learn to communicate.”
He added that “as there is a surge in children’s use of gadgets that respond to verbal commands we may see them learning ways of communicating that then slip into their interactions with humans.”
Beyond education, entire games are now being built around voice recognition alone.
Take Doppio’s “The 3% Challenge”, where you need to complete voice-based challenges to get to the next level.
This conversational game is played entirely by voice, without a screen; the aim is to compete with other players in weekly tournaments and earn rewards.
You can play it simply by asking Alexa or Google Assistant to open it.
So, if commands are turned into games, how does that affect the children playing them?
Well, this is what worries both ChildWise and myself.
Since kids as young as five bark curt orders such as “play this song” or “wake me up at 7 am” without any consequences, will it be clear to them later on that this is not how we talk to a real human?
“Hey Siri, what did the doctor say?”
Dr Valerie Risoli, clinical psychologist at Dubai’s Physiotherapy and Family Medicine Clinic, said that the “relationship” kids have with such devices nowadays, “is one of the factors that contribute to creating a generation that is more spoilt and rude towards others.”
More specifically, Dr Risoli underlined that “a behaviour that is repeated over and over with no negative consequence becomes a habit”.
Continuing, Dr Risoli gave an example of what is likely to happen when a child is rude to their own device.
“A child speaks badly to the phone — yet he gets what he wants, so the consequence is positive — so the inappropriate behaviour is reinforced and maintained — it becomes a habit — the child learns to speak rudely in order to obtain something from his phone as well from his parents as well from other people.”
This makes sense, and to be honest it’s quite worrying.
“Thank you, Siri…”
Janet Read, a professor in child computer interaction at the University of Central Lancashire, said that parents can stop this behaviour from happening if they are kind to their own devices.
“If the parents say ‘thank you, Google Home’, when they’re finished, the child will also say ‘thank you Google Home’,” Read said.
“The way you talk to the device will just reinforce the manners that are acceptable as a family. If you’re not pushing good manners in your family and you’re also being rude to the device, then you’re just reinforcing the idea that that manner is acceptable.”
Dr Risoli also mentioned that children tend to copy what adults do, and repeat what they say.
This process is called “vicarious learning”.
Virtual Assistants: The end for humanity?
AI performs, progresses and evolves on the go.
While we give assistants commands and they execute them quite accurately, they also collect data and update themselves.
What I’m trying to say is that, for the time being, we are rude to devices and demand things from them. But what happens when they can recognize our tone of voice?
Virtual assistants are learning as much as they can to please us and give us better answers, suggestions and overall know what we want, the way we want it.
Through the data they build up, however, they will eventually recognize human behaviour and respond accordingly.
And believe me, it won’t be pretty.
Brian Russell, the artist behind the webcomic “Underfold”, illustrated what that distant future will probably look like.
Okay, smart home devices probably won’t kill us all in our sleep.
At least, not in our lifetimes.
But they will eventually learn based on the interactions we have with them, and this data will then be adjusted to how they will interact with us.
This rude, snappy attitude we have towards smart devices won’t be in our favour.
In a Forbes article about AI and virtual assistants, writer Curtis Silver said that we should treat these devices with a bit more respect.
After all, they’ve seen us naked (Echo Look) and put up with some terrible music tastes, weird questions and annoying requests.
More specifically, Curtis said: “If we can’t at least be polite to strangers on the internet (I mean, have you been on Twitter lately) then we should at least try to be polite to the devices in our own homes. We can do better. Our future depends on it.”
In my opinion, it couldn’t be said any better!