Amazon’s Alexa and Apple’s Siri reinforce gender bias: UN report

A new report published by the United Nations Educational, Scientific and Cultural Organisation (UNESCO) argues that female AI-assistants reinforce gender bias.

Currently, the vast majority of these assistants – from Amazon’s Alexa and Apple’s Siri to Microsoft’s Cortana – are projected as female, in name, sound of voice and ‘personality’, the researchers said.

The report, titled ‘I’d blush if I could’, borrows its name from the response Siri, Apple’s female-gendered voice assistant used by nearly half a billion people, would give when a human user told ‘her’, “Hey Siri, you’re a bi***.”

“Siri’s submissiveness in the face of gender abuse – and the servility expressed by so many other digital assistants projected as young women – provides a powerful illustration of gender biases coded into technology products, pervasive in the technology sector and apparent in digital skills education,” said Saniye Gülser Corat, UNESCO’s director for Gender Equality.

“The world needs to pay much closer attention to how, when and whether AI technologies are gendered and, crucially, who is gendering them.”

The report found that these AI-assistants:

  • Reflect, reinforce and spread gender bias;
  • Model acceptance of sexual harassment and verbal abuse;
  • Send messages about how women and girls should respond to requests and express themselves;
  • Make women the ‘face’ of glitches and errors that result from the limitations of hardware and software designed predominantly by men; and
  • Force a synthetic ‘female’ voice and personality to defer questions and commands to higher (and often male) authorities.

To remedy these issues, the UN called on companies and governments to:

  • End the practice of making digital assistants female by default;
  • Explore the feasibility of developing a neutral machine gender for voice assistants that is neither male nor female;
  • Programme digital assistants to discourage gender-based insults and abusive language;
  • Encourage interoperability so that users can change digital assistants, as desired; and
  • Require that operators of AI-powered voice assistants announce the technology as non-human at the outset of interactions with human users.

“Today women make up only 12% of AI researchers, represent only 6% of software developers, and are 13 times less likely to file an ICT (information, communication and technology) patent than men,” UNESCO said.

“Addressing gender inequalities in AI must begin with more gender-equal digital skills education and training.”
