'I'd Blush If I Could': AI assistants like Siri, Alexa promote sexist stereotypes, UN warns
A man uses Siri on the new iPhone 4S after being one of the first customers in the Apple store in Covent Garden on October 14, 2011 in London, England. (File photo/Getty Images)


Are the female voices behind Apple's Siri and Amazon's Alexa amplifying gender bias around the world?

The United Nations thinks so.

A report released Wednesday by UNESCO, the U.N.'s educational, scientific and cultural organization, raises concerns about what it describes as the "hardwired subservience" built into default female-voiced assistants operated by Apple, Amazon, Google and Microsoft.

The vast majority of assistants such as Apple's Siri, Amazon's Alexa and Microsoft's Cortana are designed to be seen as feminine, from their names to their voices and personalities, the study said.

The report takes its title, "I'd Blush If I Could," from an answer Siri once gave after hearing sexist insults from users. It argues that it is a problem that millions of people are becoming accustomed to commanding female-voiced assistants that are "servile, obedient and unfailingly polite," even when confronted with harassment from humans.

As an example of the issue, the study highlighted that Siri had previously been programmed to respond to users calling her a "bitch" by saying "I'd blush if I could."

"Siri's submissiveness in the face of gender abuse – and the servility expressed by so many other digital assistants projected as young women – provides a powerful illustration of gender biases coded into technology products," it said.

Apple, Amazon and Microsoft were not immediately available for comment.

A spokeswoman for Microsoft has previously said the company researched voice options for Cortana and found that "a female voice best supports our goal of creating a digital assistant."

Voice assistants have quickly become embedded in many people's everyday lives, and they now account for nearly one-fifth of all internet searches, said the report, which argued they can have a significant cultural impact.

As voice-powered technology reaches more communities worldwide, the feminization of digital assistants may help gender biases take hold and spread, the report added.

"The world needs to pay much closer attention to how, when and whether AI technologies are gendered and, crucially, who is gendering them," said Saniye Gülser Corat, UNESCO's director for gender equality.

The report called on companies to take action, including no longer making digital assistants female by default, exploring gender-neutral options and programming assistants to discourage gender-based insults and abusive language.

Earlier this year, a team of creatives developed the first gender-neutral digital assistant voice in an attempt to avoid reinforcing sexist stereotypes.

The UNESCO report was welcomed by women's groups, with Womankind spokeswoman Maria Vlahakis saying it gave "much needed attention" to gender bias in algorithms.

"These algorithms perpetuate gender stereotypes and sexist and misogynist behavior and are reflective of wider structural gender inequalities in technology," she said.