Towards a Feminist Internet: Combatting Algorithmic Gender Biases
Words: Ellisha Walkden-Williams and Viktoria Bielawa
The Internet has often acted as a stage on which gender biases are played out. From the gender disparity in the tech industry, to algorithms trained on sexist datasets, to the blatant stereotyping built into subservient female chatbots and home artificial intelligence assistants, gender bias is prevalent across many aspects of the Internet. Humans can recognise sexism as a social problem that needs to be addressed, but AI systems have no such awareness unless it is deliberately built into them.
Algorithmic bias is the result of structural limitations in how these systems are designed. Their imagined objectivity is often conflated with pure, mathematical logic. AI programs rely on algorithms to function: an algorithm is a step-by-step procedure for solving a problem, and in modern AI these procedures are often produced through machine learning, which extracts patterns from large datasets to power everything from search engines to recommendation systems. AI is designed to measure, analyse, represent and predict human behaviour. The people who build these systems embed their own societal norms and values in them, and only 22% of these creators are women, which is why popular forms of misogyny may at times be automated within these systems and reflected back to users as unequivocal fact. A large part of the problem is language. Cultural biases are constituted through language, recorded in data, and transferred into ideological meaning. That meaning is coded into the very interfaces of algorithmically designed technologies, exposing and reinforcing existing social prejudices and identities.
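To see how biased language data becomes a system’s “fact”, consider a minimal sketch in Python. Everything in it is invented for illustration (the toy sentences, labels and choice of model are our assumptions, not anything used by a real product): a classifier trained on a tiny corpus where occupations co-occur with gendered pronouns ends up guessing gender from the occupation word alone.

```python
# Toy demonstration of how gendered patterns in training text become
# "facts" for a model. The corpus, labels and model are illustrative only.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny corpus: each sentence is labelled with the gender of its subject.
sentences = [
    "he fixed the server during his shift as an engineer",
    "he reviewed the code like any engineer would",
    "he presented the engineer's report to the board",
    "she comforted the patient during her shift as a nurse",
    "she updated the charts like any nurse would",
    "she presented the nurse's report to the ward",
]
labels = ["male", "male", "male", "female", "female", "female"]

model = make_pipeline(CountVectorizer(), LogisticRegression())
model.fit(sentences, labels)

# Neither test sentence contains a pronoun, so the model can only lean on
# the occupation words it saw, and it reproduces the stereotype.
print(model.predict(["the engineer debugged the program"]))  # likely ['male']
print(model.predict(["the nurse checked the chart"]))        # likely ['female']
```

Nothing in the code mentions gender stereotypes; the model simply learns whatever associations the data contains and reports them back with apparent neutrality.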
The tech world is starting to wake up to these problems. Feminist Internet was launched in 2017 by Charlotte Webb to challenge systemic gender bias on the Internet. Its projects include Hollabot, a prototype app that tackles online abuse, and F’xa, a chatbot that educates users about AI bias. The team has also run ‘Designing a Feminist Alexa’ workshops, in which students prototype and code alternatives to Alexa built on feminist standards. Each of these projects works towards a more equitable Internet for all users.
As part of Our Algorithmic Lives, Feminist Internet will be presenting the ‘In Your Shoes’ chatbot in collaboration with UAL MA students. The chatbot has been designed as a futuristic job-hunting tool that purposefully discriminates against marginalised subgroups. Imbued with humanistic qualities, the bot invites visitors to interact with it within the exhibition space, prompting them to reflect on their own attitudes and on the need for algorithmic accountability for the sake of future generations. We spoke to Feminist Internet’s founder to discuss how the movement is changing the conversation around AI and bias.
What prompted you to create Feminist Internet?
I had just finished my practice-based PhD looking at how the Internet changes the way we understand creative authorship, and was increasingly frustrated with inequalities I saw both in the digital creative industries and academia. I was invited to propose a research project for UAL looking at tech and gender, and it evolved into a 10-day course for UAL students, where we co-created the Feminist Internet Manifesto and prototyped creative responses to gender inequality relating to the Internet. Many of the participants wanted to keep our work going, and so Feminist Internet has evolved organically ever since then.
What are Feminist Internet’s key principles?
Our mission is to make the internet more equal through critical, creative practice. We address issues such as AI bias, online abuse and the environmental impact of technologies. Our approach is always to engage audiences with the complexities of these topics in a clear, accessible and playful way. As well as the points in our manifesto, we focus on bringing feminist approaches to technology development, encouraging the next generation of talent to foreground equality and inclusion in their practices so they can be a force for good in the digital (and wider) world.
What do you hope to achieve through this movement?
We hope to raise awareness amongst young people, particularly those interested in becoming designers, developers or social entrepreneurs, and to equip them with the practical, technical and conceptual tools to promote equality in techno-social systems. We also hope to contribute to the growing field of activists, academics, and developers who together are calling for tech companies to be more accountable for the negative social impacts their platforms and services can cause.
Is it the technology or the programmed algorithms of the platform that are non-inclusive, or is it the users shaping them?
There is a feedback loop between existing social biases, data sets, machine learning models, and people’s behaviours. We can’t separate these from each other since they all co-exist and tend to reflect each other. For example, take a healthcare algorithm that prioritises white patients over sicker black patients. The algorithm is designed to direct medical care to those most in need, but it makes its predictions based on data about people’s past health costs. Black patients historically receive less health care than white patients, so the system flags white patients more often because they have spent more on health care in the past. In this case, there is a problem with the design of the system (using cost as a proxy for health status), but there is also a problem with society! The reason black people spend less on healthcare is due to systemic social and economic inequalities.
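To make the proxy problem Webb describes concrete, here is a small, purely illustrative simulation in Python. The group labels, distributions and 10% threshold are our assumptions, not details of the real system she refers to: both groups have the same distribution of health need, but one has historically spent less on care for the same level of need, so ranking patients by the cost proxy flags fewer of that group’s genuinely high-need patients.

```python
# Illustrative simulation of "cost as a proxy for health need".
# The distributions, group labels and threshold are invented for this sketch.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

group = rng.integers(0, 2, size=n)   # 0 = group A, 1 = group B
need = rng.normal(size=n)            # true health need, same in both groups

# Historical spending tracks need, but group B spends less for the same
# level of need because of unequal access to care.
cost = need - 0.5 * group + rng.normal(scale=0.5, size=n)

# The system flags the top 10% of patients by the cost proxy for extra care.
flagged = cost >= np.quantile(cost, 0.90)

# Among genuinely high-need patients, group B is flagged far less often,
# even though the algorithm never looks at group membership directly.
high_need = need >= np.quantile(need, 0.90)
for g, name in [(0, "group A"), (1, "group B")]:
    share = flagged[high_need & (group == g)].mean()
    print(f"{name}: {share:.0%} of high-need patients flagged")
```

The point of the sketch is that the disparity emerges without the group variable ever being used as an input; the bias rides in on the proxy itself.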
With patriarchal and capitalist values still so deeply rooted in our society, do you think the world needs to change before the internet does?
The world and the internet can’t be separated; they are part of one ecosystem. As long as racism, sexism, white supremacy, capitalism and other oppressive systems exist, the internet will reproduce them. That’s not to say there isn’t great potential for the internet to be a liberating force, but that it reflects society, as society reflects it.