by Arturo Pineda
If an Internet user Googled “N**** House” in 2015, Google Maps directed them to the White House address. If someone searched “Black girls,” “Asian girls,” or “Latina girls” in 2011, the company’s algorithm produced pages of pornographic ads. Google classified both of these incidents as “glitches” in the algorithm. On April 10th, UCLA Information Studies professor Safiya Umoja Noble visited Yale University to deliver her lecture, “The Intersectional Internet,” and argue that these “glitches” are not accidents at all.
Noble, whose research analyzes how algorithms perpetuate bias against marginalized communities, asserted that search engines are social constructs embedded with social, economic, and political views. In the searches for women of color mentioned above, the algorithm had been designed to associate those women with pornography, a result that reflects the views of the people who developed it.
Search engines like Google also favor the opinions of the popular, wealthy, and powerful. Google features the results of paying advertisers, for example, by labeling them as sponsored and placing them at the top of search results. The results include websites for the corporations Google invests in and exclude their competitors, while opinions with no monetary tie to Google are buried in later pages. This design flaw is compounded by the trust people place in technology: approximately 70% of search engine users describe Google search results as “reliable” or “trustworthy,” meaning Internet users trust the results of a skewed algorithm.
Issues of bias extend to library database search engines as well. Artstor is one of the largest databases of digitized images of art and media. When Professor Noble searched “Black stereotypes,” only 66 results appeared; when she searched “white history racism,” only 3 did. The results highlight a hidden function of technology: erasure. Based on this data, it would appear that white people are not racist and that there are relatively few stereotypes of Black people. These search results show how algorithms can censor parts of history.
Sometimes technological bias produces fatal consequences. In 2015, Dylann Storm Roof, the white gunman who massacred nine Black worshippers at Emanuel African Methodist Episcopal Church in South Carolina, acted on flawed information. His manifesto reads:
“The event that truly awakened me was the Trayvon Martin case. It was obvious that Zimmerman was in the right. But more importantly this prompted me to type in the words ‘black on White crime’ into Google […] There were pages upon pages of these brutal black on White murders. I was in disbelief.”
What if the results of that Google search had been different? What if the links had included scholarly articles explaining that most murders are committed by someone of the same race as the victim? “In a society with more data and more injustice,” warned Professor Noble, “it has become vital for all people to reject notions of neutrality in technology.”