
Why Alexa is Racist and Sexist

Andrew Prescott

In our interactions with computers, we are always transfixed by the interface. When I was a librarian, I was always astonished by how, when new systems were being tested, users were preoccupied with the positioning of buttons and the colour of banners rather than with how the search was constructed or the system's ability to retrieve the correct records. This is partly due to our preoccupation with the idea of computers as personal tools. We want to feel they will be pleasurable and easy to use. And the companies manufacturing the computer packages we use also want us to think about glossy exteriors rather than what is under the hood.

Increasingly, our engagement with artificial intelligence will be through the artificiality of commercially constructed personas. We have become familiar with Alexa and Siri, but they are just the beginning. In their default settings, Alexa and Siri are white American women.

As a recent article in The Atlantic by Ian Bogost observed:

If you survey the major voice assistants on the market—Alexa, Apple’s Siri, Microsoft’s Cortana, and Google Home’s unnamed character—three out of four have female-sounding names by default, and their voices sound female, too. Even before the user addresses Alexa, the robot has already established itself as an obedient female presence, eager to carry out tasks and requests on its user’s behalf.

Siri can be customised to communicate in languages including Chinese and Arabic. Siri can be a man. But there are still limitations: we cannot set up Siri to be a Jamaican man or a Haitian woman. Maybe this will change in the future, but the question still arises: why is a white American woman our default image of an artificially created intelligence like Siri or Alexa?

Alexa is a woman because a female voice is taken to suggest subservience. She is white because whiteness is taken to signify intelligence and efficiency; it also licenses her to talk back. Do we have some idea that a woman is always listening, and might be a good listener as well? Is Alexa a sexist and racist construct? Almost certainly.

There is also a continuing machine-like quality to her voice that seems reassuring to human users. Again, it is about distance and what we feel to be appropriate social relationships. We want to know that Alexa is a machine, albeit a racist and sexist one. We like to be reassured that Alexa is, at the end of the day, artificial.

It has become commonplace that, because of the training sets used, artificial intelligence is frequently sexist and racist. A study at Stanford showed how an internet-trained artificial intelligence associated European American names with positive words like ‘love’ and ‘care’, and African American names with negative words like ‘failure’. Software used by many police forces in the United States to predict who might be likely to commit crimes was biased against African Americans because of inaccurate data. Recently, a group of researchers including Joanna Bryson from the University of Bath have published a paper in Science demonstrating at length how ‘Meaning really is no more or less than how a word is used, so AI absorbs true meaning, including prejudice’.
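
The kind of bias the Science paper describes can be made concrete with a small word-embedding association test. The sketch below is only an illustration, not the researchers' actual method: it uses tiny made-up vectors (real experiments use pretrained embeddings such as GloVe or word2vec and full lists of names and attribute words) and measures whether a name sits closer to pleasant or unpleasant words, in the spirit of the association tests reported in the paper.

```python
# Minimal word-association sketch in the spirit of a WEAT-style test.
# The vectors below are toy values for illustration only; a real test
# would load pretrained embeddings (e.g. GloVe) over a large vocabulary.
import numpy as np

def cosine(a, b):
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

def association(vec, pleasant, unpleasant):
    """Mean similarity to pleasant words minus mean similarity to unpleasant ones."""
    return (np.mean([cosine(vec, p) for p in pleasant])
            - np.mean([cosine(vec, u) for u in unpleasant]))

# Hypothetical 3-dimensional vectors, purely for demonstration.
vectors = {
    "love":    np.array([0.9, 0.1, 0.0]),
    "care":    np.array([0.8, 0.2, 0.1]),
    "failure": np.array([-0.8, 0.2, 0.1]),
    "emily":   np.array([0.7, 0.3, 0.1]),   # name treated in such studies as European American
    "lakisha": np.array([-0.6, 0.4, 0.2]),  # name treated in such studies as African American
}

pleasant = [vectors["love"], vectors["care"]]
unpleasant = [vectors["failure"]]

for name in ("emily", "lakisha"):
    print(name, round(association(vectors[name], pleasant, unpleasant), 3))
# A consistently higher score for one group of names is the signature
# of the learned prejudice these studies report.
```

Run over real pretrained embeddings with the full word sets, the systematic gap between the scores of the two groups of names is exactly the absorbed prejudice the researchers measured.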

Behind the sexist and racist construction of Alexa as a personality lie the deeply embedded prejudices of AI itself. How do we construct an ethical voice AI? And how do we make an ethical AI that we would actually want to sit in a room with?