“I’d blush if I could” is the title of a report by the United Nations Educational, Scientific and Cultural Organization (UNESCO).

The report is named after a response Siri utters when receiving certain sexually explicit commands.

The report explores the effects of bias in AI research and product development, and the potential long-term harm of conditioning society, particularly children, to treat these digital voice assistants as unquestioning helpers that exist only to serve their owners.

AI-powered voice assistants with female voices are perpetuating harmful gender biases, according to a UN study.

These female helpers are portrayed as “obliging and eager to please”, reinforcing the idea that women are “subservient”, it finds.

Particularly worrying, it says, is how they often give “deflecting, lacklustre or apologetic responses” to insults.

The report calls for technology firms to stop making voice assistants female by default.

The report says:

“Companies like Apple and Amazon, staffed by overwhelmingly male engineering teams, have built AI systems that cause their feminised digital assistants to greet verbal abuse with catch-me-if-you-can flirtation.”

“Because the speech of most voice assistants is female, it sends a signal that women are… docile helpers, available at the touch of a button or with a blunt voice command like ‘hey’ or ‘OK’. The assistant holds no power of agency beyond what the commander asks of it. It honours commands and responds to queries regardless of their tone or hostility.”

“In many communities, this reinforces commonly held gender biases that women are subservient and tolerant of poor treatment.”

Research firm Canalys estimates that approximately 100 million smart speakers – the hardware that allows users to interact with voice assistants – were sold globally in 2018. And according to Gartner, by 2020 some people will have more conversations with voice assistants than with their spouses.

Tech companies have failed to build in proper safeguards.

Going further, the UN report argues that tech companies have failed to build in proper safeguards against hostile, abusive and gendered language. Instead, most assistants, like Siri, tend to deflect aggression or chime in with a sly joke. Ask Siri to make you a sandwich, for instance, and the voice assistant will respond: “I can’t. I don’t have any condiments.”

The report also highlights the digital skills gender gap, from the low rates of internet use among girls and women in sub-Saharan Africa and parts of South Asia to the declining number of girls taking up ICT studies in Europe.

According to the report, women make up just 12% of AI researchers.

Microsoft’s Cortana was named after a synthetic intelligence in the video game Halo that projects itself as a sensuous unclothed woman, while Apple’s Siri means “beautiful woman who leads you to victory” in Norse. While Google Assistant has a gender-neutral name, its default voice is female.

Apple did make a male Siri voice available in 2013, and it is the default voice in several languages, including British English, Arabic and French.

The report calls on developers to create a neutral machine gender for voice assistants, to programme them to discourage gender-based insults and to announce the technology as non-human at the outset of interactions with human users.

A group of linguists, technologists and sound designers is experimenting with a genderless digital voice, known as Q, created from recordings of real voices.

Yet, as the report says, these features and gender voice options don’t go far enough; the problem may be baked into the AI and tech industries themselves. The field of AI research is predominantly white and male, a separate report found last month. Eighty percent of AI academics are men, while just 15 percent of AI researchers at Facebook and 10 percent at Google are women.
