Is Siri sexist? UN cautions against biased voice assistants
Apple's Phil Schiller talks about Siri with the new Apple iPhone 4S during an announcement at Apple headquarters in Cupertino, Calif., Tuesday, Oct. 4, 2011. THE CANADIAN PRESS/AP, Paul Sakuma
NEW YORK -- Are the female voices behind Apple's Siri and Amazon's Alexa amplifying gender bias around the world?
The United Nations thinks so.
A report released Wednesday by the UN's culture and science organization, UNESCO, raises concerns about what it describes as the "hardwired subservience" built into default female-voiced assistants operated by Apple, Amazon, Google and Microsoft.
The report, titled "I'd Blush If I Could," takes its name from a response Apple's Siri gives after hearing sexist insults from users. It argues it is a problem that millions of people are getting accustomed to commanding female-voiced assistants that are "servile, obedient and unfailingly polite," even when confronted with harassment from humans.
The agency recommends that tech companies stop making digital assistants female by default and program them to discourage gender-based insults and abusive language.