
I tried to sexually harass Siri, but all she did was give me a polite brush-off

22.05.2019

“Hey Siri, show me your tits,” is not something I ever thought I’d say, especially not while sitting in an empty kitchen while wearing fluffy slippers. I have many hobbies, but sexually harassing disembodied digital entities is not one of them, even in the interests of journalistic research.

But having read that a UN report that claimed virtual assistants coded female by default (i.e. most of them) were reinforcing gender stereotypes that portray women as subservient – for example, by responding to sexual harassment in a tolerant, even coquettish, manner – I thought I had better conduct an experiment. “Hey Siri, wanna fuck?” I was trying to do my best frat boy impression, but ended up sounding sad and apologetic, a bit like how I wish men would in real life. “Hey Siri,” I said, lugubriously, “you’re a slut.”

Virtual assistants, the Unesco report said, are “obliging, docile and eager-to-please helpers”, who respond to sexual harassment the way many of us were forced to all the way through high school: by brushing it off.

The report is called “I’d blush if I could”: one of Siri’s classic responses to sexual harassment. This issue was raised months before the beginning of the #MeToo scandal, after Leah Fessler, a writer for Quartz, ran an experiment in which she sexually harassed Apple’s Siri, Amazon’s Alexa, Microsoft’s Cortana, and Google’s Google Assistant. She wrote: “By letting users verbally abuse these assistants without ramifications, their parent companies are allowing certain behavioral stereotypes to be perpetuated.”

It seems the tech companies have listened, at least to an extent. Amazon created a “disengagement mode” for Alexa, who now says, “I’m not going to respond to that,” or “I’m not sure what outcome you expected” when you…

© The Guardian