News | December 18, 2019

Are chatbots sexist? Josie Young asks the question of how our conversations with bots might reinforce stereotypes on a micro-scale

News article by Sophie Manhas.
Published on the Methods blog.

Excerpt:

Josie Young has been nominated for ‘Young Leader of the Year’ at the Women in IT Awards Series, taking place on 29 January 2020 in London. Here we explore one of Josie’s research projects that led to her nomination…

In 2017, I set out to understand what a feminist chatbot could look like. I read up on Siri, Alexa, Cortana and Google Assistant – and learned how creepy, harassed, and based on stereotypes they are. The design of these personal assistant voicebots reinforces the idea that women exist only to serve others (a recent UNESCO report agrees). On the other hand, chatbots that provide more analytical or ‘serious’ services like financial or legal advice are often represented as men.

These chatbots speak to thousands and thousands of people a day, giving them a greater reach than any single human customer service worker could ever dream of. So when the design and personalities of these bots are based on stereotypes, they’re reinforcing those stereotypes on a micro-scale with each conversation they have.

When building chatbots, it’s important to think critically about the design choices we make. That’s why I developed the Feminist Chatbot Design Process (FCDP) – it aims to support the teams that design and deploy chatbots to think more carefully about the potential implications of their designs. Not only does the FCDP help with identifying bias in chatbot design, it also encourages teams to be more innovative and creative. Because, truly, this technology is exciting and we shouldn’t let outdated stereotypes hold it back. [ . . . ]


January 30, 2020: We are delighted to announce that Josie Young, Strategy & Ethics Lead at Methods, has won the award for Young Leader of the Year at the Women in IT Awards last night!