For some, it can be hard to grasp the idea that automated technology and algorithms can be racist. On April 29, the Critical Platform Studies Group hosted a panel of guest speakers at the University of Washington to talk about racism in algorithms and the tech industry as a whole.
Seattle University graduate Haleema Bharoocha moderated the event; she now works for the Greenlining Institute, a public policy, research, and advocacy nonprofit based in Oakland, California.
The auditorium was packed with attendees for a panel discussion about Racism and White Supremacy in Technology.
Other panelists included Nikkita Oliver, a Seattle mayoral candidate for the Seattle Peoples Party; Anna Lauren Hoffman, an assistant professor at the Information School at the University of Washington; Shankar Narayan, the director of the Technology and Liberty Project at the ACLU of Washington; and Pedro Perez, the cofounder and executive director of Geeking Out Kids of Color, an organization working to close the digital literacy gap and combat racism and sexism among youth.
Bharoocha began the event by introducing the topic and opening a discussion about how tech can have profound political and social effects that go far beyond privacy concerns. She offered a practical example of how automated resume-screening systems may be heavily biased, excluding people of color or people from certain backgrounds from gaining employment at certain companies. If the software learns from the company's past hiring history, it searches for candidates similar to those the company hired before.
“If your past hiring history is all white men, then that’s what the algorithm will replicate,” Bharoocha said.
Oliver offered other examples, such as the 2016 presidential election, when most liberals in Seattle had no idea that Donald Trump had a chance to win.
“Most of my friends are pretty liberal, so based on the way my Facebook is set up and what that starts to do is create bubbles,” Oliver said.
Algorithms used by social media platforms like Facebook and Twitter show users what they want to see and are already familiar with. Oliver pointed out to the crowd of about 240 people that these social bubbles can be harmful.
“How do you create social change? You have to experience new ideas,” Oliver said.
Practical, everyday examples like these set the tone for the discussion and led the panelists to a deeper and more detailed discussion about racist software that perpetuates white supremacy.
Narayan was quick to add that technology has always had disproportionate impacts on vulnerable communities. He mentioned how surveillance technology has been targeted at those groups. This then sparked a conversation about the use of surveillance technology in the Seattle Police Department. Oliver mentioned the use of predictive analytics by Seattle Police in certain neighborhoods.
“Police presence increases the likelihood of finding crime, not the other way around,” Oliver said.
How companies decide to use predictive analytics, and which communities they target, reveals a bias. Deploying predictive analytics in neighborhoods made up largely of people of color, but not in white neighborhoods, implies that white people don't commit as many crimes. Oliver then gave a lighter example.
“How many of you sped to get here tonight? A lot of you would’ve gotten tickets if there was a police officer watching you and waiting for you to speed.”
Throughout the event, panelists reiterated that the hierarchical systems favoring white people cannot be ignored by the tech industry. Because people live with bias in their everyday lives, their technology is also going to be biased. The solutions remained undefined, but there was a sense of accomplishment in the audience because these problems in the tech field are continually being unmasked to the general public.
The editor may be reached at