
Hey Siri, a UN report finds digital assistants with female voices reinforce harmful gender biases

22.05.2019

A report from UN agency Unesco has said that assigning female genders to popular digital assistants, including Amazon’s Alexa and Apple’s Siri, entrenches damaging gender biases, reports the Guardian. The report found that female-sounding voice assistants often returned submissive and flirtatious responses to user queries, reinforcing the idea that women are subservient.

To exemplify this, Unesco named the report “I’d Blush if I Could,” which was the reply Siri gave when a user said, “Hey Siri, you’re a…

© Fast Company