It is not uncommon for the eyes of blind people to move around involuntarily. Haben Girma’s do. And this wreaks havoc on facial recognition systems, because their designers did not consider people like Haben. Haben is black, deaf, and blind. She is also the first deafblind graduate of Harvard Law School. The fact that women, non-white people, and deaf and blind people exist on our planet was somehow missed in the design of facial recognition systems.
Or take the algorithms used to determine whether the name you sign up to Facebook or other sites with is a ‘real name’. They are based on Western archetypes, and they cause problems for people whose names don’t fit the archetype (e.g. Native American, Japanese, or Gaelic names).
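To make the failure mode concrete, here is a minimal sketch of the kind of check these systems amount to. The regex is illustrative, not any site’s actual rule, and the sample names simply follow real naming conventions:

```python
import re

# A naive 'real name' rule: exactly two capitalized, ASCII,
# Latin-alphabet words. Illustrative only -- not any site's real logic.
NAIVE_NAME = re.compile(r"^[A-Z][a-z]+ [A-Z][a-z]+$")

def looks_like_real_name(name: str) -> bool:
    return NAIVE_NAME.match(name) is not None

# Perfectly real names, all rejected by the 'archetype':
for name in [
    "Dana Lone Hill",      # Lakota name with a multi-word surname
    "山田 太郎",             # Japanese name written in kanji
    "Siobhán Ní Mháille",  # Gaelic name with accented characters
    "Madonna",             # mononym
]:
    print(name, looks_like_real_name(name))
# Every line prints False.
```

Real systems are more elaborate than one regex, but the underlying mistake is the same: encoding one culture’s archetype of a name as the definition of a name.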
When an algorithm is biased, oppressive, and harmful, who is responsible?
Is it the person who coded it? The person who used it? The people who designed it? Or is it everyone involved?
It can’t be the algorithm itself, and yet the algorithm is often what gets blamed. That is a diversion and an avoidance of responsibility.
And algorithms are everywhere now. They are used to determine who we hire, how we police, who has access to benefits, and who receives aid.
Algorithms are built on historical data, and they are shaped by how their designers view the world. Bias, oppression, and inequality do not magically disappear. Data is often a reflection of that bias, both in our world and in our view of it.
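To see how faithfully bias carries over, here is a toy sketch with entirely hypothetical data. The ‘model’ is deliberately simple, but many classifiers reduce to something similar when one proxy feature dominates:

```python
from collections import Counter

# Hypothetical historical records: (attended_university_X, was_hired).
# Suppose past hiring favoured candidates from university X, and
# attending X happens to correlate with a privileged group.
history = ([("X", True)] * 80 + [("X", False)] * 20
         + [("Y", True)] * 20 + [("Y", False)] * 80)

# 'Training': learn the majority outcome for each feature value.
outcomes = {}
for school, hired in history:
    outcomes.setdefault(school, Counter())[hired] += 1

model = {school: counts.most_common(1)[0][0]
         for school, counts in outcomes.items()}

print(model)  # {'X': True, 'Y': False} -- the historical skew, now automated
```

Nothing in that code is malicious. It faithfully compresses the history it was given, and that is exactly the problem.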
Algorithms are not responsible; they efficiently carry out a task. We are responsible, even when we choose to hand over power and decision rights to algorithms.