Developers can transform almost any complicated task into an algorithm that assists in decision-making. In fact, some algorithms aren't written by humans at all, but by other algorithms.
Because algorithms are a part of our daily lives, it's important that we understand their existing and potential biases.
Supervised learning algorithms are implemented in many applications we use daily. For example, the Instagram algorithm takes into consideration thousands of user actions to determine the order and frequency with which you see posts and stories.
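Instagram's actual ranking model is proprietary, but the idea of scoring content by weighted interaction signals can be sketched in a few lines. The signal names and weights below are illustrative assumptions, not the platform's real features:

```python
# Minimal sketch of engagement-weighted feed ranking.
# Feature names and weights are invented for illustration; a real
# system learns them from thousands of user actions.

def score_post(post, weights):
    """Weighted sum of a user's past interaction signals."""
    return sum(weights[k] * post[k] for k in weights)

# Hypothetical learned weights: comments matter more than likes.
weights = {"likes_from_you": 3.0, "comments_from_you": 5.0, "time_spent": 1.0}

posts = [
    {"id": "a", "likes_from_you": 2, "comments_from_you": 0, "time_spent": 4},
    {"id": "b", "likes_from_you": 0, "comments_from_you": 1, "time_spent": 1},
    {"id": "c", "likes_from_you": 1, "comments_from_you": 2, "time_spent": 9},
]

# Highest score first: the feed order you actually see.
ranked = sorted(posts, key=lambda p: score_post(p, weights), reverse=True)
print([p["id"] for p in ranked])  # → ['c', 'a', 'b']
```

The point is that the ordering is entirely a product of whatever signals and weights the platform chooses; change the weights and a different feed emerges.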
Even if you are careful online, the services you use collect data that can be used to identify you: data you did not agree to disclose, such as your habits, who you're connected to online, and how you interact with them.
What you see online is filtered through algorithms that have been customized based on what a company or platform thinks you want to see.
Racism and Sexism by the Numbers
We believe computers are neutral, but that is simply not the case.
Algorithms, although invisible, are programmed to reinforce and promote sexism and racism in our increasingly digitally-driven and informed society. Think of it as a sensory calibration problem: much of the data absorbed by algorithms comes in one way or another from interactions between individuals, and those interactions are, empirically and verifiably, influenced by biases.
When an algorithm learns from biased feedback, or makes decisions based on biased data, racism or sexism becomes inherent in the design.
The question of whether an algorithm is sexist or racist often provokes heated opinions, but the truth of the matter is that when you look at the code of an algorithm, it is simply a formula. And when biased data is fed into that formula, the algorithm not only becomes biased, but can amplify that bias over time.
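The amplification happens through feedback: the algorithm's output becomes part of its next round of input. A toy simulation, with numbers chosen purely for illustration, shows how a dataset that is only slightly skewed toward one group can end up heavily skewed once the system keeps boosting what it already shows:

```python
# Toy feedback loop: a small initial skew in the data compounds
# because the algorithm gives extra exposure to the group it
# already over-represents. All numbers are illustrative assumptions.

def update_share(share, rounds, boost=0.1):
    """Each round, group A's exposure is inflated by `boost`
    relative to its current share, and the new share is fed back in."""
    for _ in range(rounds):
        exposure_a = share * (1 + boost)  # favors what it already shows
        exposure_b = 1 - share
        share = exposure_a / (exposure_a + exposure_b)
    return share

start = 0.55  # dataset starts only slightly skewed toward group A
after = update_share(start, rounds=20)
print(round(start, 2), "->", round(after, 2))  # → 0.55 -> 0.89
```

A 55/45 split drifts toward roughly 90/10 after twenty rounds of feedback; nothing in the formula itself is prejudiced, yet the loop turns a small skew in the input into a large one in the output.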
Profit Over Ethics
Take, for example, how companies like Facebook profit directly from algorithms that promote sexism.
Recently, a group of men gang-raped a 15-year-old girl and streamed it on Facebook Live. If one woman sees it and reports it to Facebook while 30 other men are watching it at the same time and some of them link to it, the algorithm will align itself with the behavior of the 30 men.
When a company’s goal is to keep you engaged with content, there is an undeniable conflict of interest between that goal and helping you see content that is unbiased.