October 10, 2018 ᛫ 5 min
Rheganne Mooradian was sitting on her bed crying one day after having just quit her job, listening to music, when she said she heard a voice tell her, "It's going to be OK." The words might have been comforting had she not heard them from Alexa, Amazon.com Inc.'s voice assistant, which powers the Echo Dot speaker on her nightstand.
Recently, the Wall Street Journal published an article detailing cases of Amazon Alexa and Google Home units unexpectedly speaking to their owners and performing erroneous functions.
After the user in the above article excerpt heard Alexa, she turned the home assistant off and stuffed it into a drawer downstairs for several days.
It's reasonable to expect that voice assistants will encounter false activation signals and other software bugs as companies continue to develop these products. But I'm not writing about the functionality of home assistants. I am instead focusing on the way we respond to and interact with these conversational, human-like machines.
From its design to the very name it holds, the Amazon Alexa was built to emulate human behavior. This is a sensible product choice, for it makes interacting with Alexa an intuitive experience.
So whether by intention or by accident, people who own an Alexa have begun to refer to the machine as "her" rather than "it." This is a drastic social change in the way we perceive artificially intelligent systems. From an analytical perspective, we understand that Alexa is no more than a plastic cover containing sophisticated circuitry and computing components. But because Alexa uses current technology in a novel way—to communicate with people seamlessly and naturally—it is easy to assume that Alexa has developed some sort of cognitive functionality and personality.
Of course, a robot has no feelings, nor any ability to meaningfully think. We have nonetheless attributed living, organic traits such as gender and speech to an inherently mechanical thing, and this trend can only continue in the foreseeable future. In fact, earlier in 2018, Amazon added the ability for Alexa to reward good behavior, a feature aimed at teaching children manners.
The point is: should we treat Alexa with respect? If so, why?
Social etiquette is rooted in ensuring that human interactions do not infringe upon the rights of others. Of course, no currently known machine can grasp the concept of rights and fairness, let alone possess defined rights. Alexa's emotions are simulated attempts to be personable and friendly. Knowing this, why do people still choose to refer to Alexa as "her"?
If we choose to treat Alexa with contempt when we are frustrated, or with indifference when she responds well, what effect would this have on our interactions with those employed in the service industry? If children are raised without being taught the importance of treating Alexa with manners, how would those children treat their subordinates once they mature into the workforce? Because Alexa resembles humans and human interaction, we are obliged to treat her with respect, lest we lose it for ourselves.