Apple made Siri deflect questions on feminism, leaked papers reveal
Source: The Guardian
Exclusive: voice assistant's responses were rewritten so it never says the word "feminism"
Alex Hern
@alexhern
Fri 6 Sep 2019 13.00 BST Last modified on Fri 6 Sep 2019 16.45 BST
An internal project to rewrite how Apple's Siri voice assistant handles "sensitive topics" such as feminism and the #MeToo movement advised developers to respond in one of three ways: "don't engage", "deflect" and finally "inform".
The project saw Siri's responses explicitly rewritten to ensure that the service would say it was in favour of "equality", but never say the word "feminism", even when asked direct questions about the topic.
Last updated in June 2018, the guidelines are part of a large tranche of internal documents leaked to the Guardian by a former Siri "grader", one of thousands of contracted workers who were employed to check the voice assistant's responses for accuracy until Apple ended the programme last month in response to privacy concerns raised by the Guardian.
In explaining why the service should deflect questions about feminism, Apple's guidelines explain that "Siri should be guarded when dealing with potentially controversial content". When questions are directed at Siri, "they can be deflected; however, care must be taken here to be neutral".
For those feminism-related questions where Siri does not reply with deflections about "treating humans equally", the document suggests the best outcome should be neutrally presenting the "feminism" entry in Siri's "knowledge graph", which pulls information from Wikipedia and the iPhone's dictionary.
-snip-
Read more: https://www.theguardian.com/technology/2019/sep/06/apple-rewrote-siri-to-deflect-questions-about-feminism