We have learned that algorithms are not as straightforward as they might seem. People write them to perform a certain task, but the results reflect the motivations and goals of the people who write them. Algorithms can also take on a life of their own when they work as designed but produce undesirable or unforeseen results.

There are many motivations behind the way people and companies write algorithms and the way they want them to work. As internet users, we want everything to be free, but the companies that provide the digital services we use need to make money (and evidently lots of it). So, as Dr. Otis and others have pointed out, when a service is free, the user is the product, subjected to data mining and advertising to “pay” for the free services they receive.

This system contributes to the inequity of search results on the internet. An algorithm shows you some of what you want to see, but certainly not all of it. Why is this? It may be the fault of the algorithm and the way it was written. Or people who have paid for ads, or who have the technical expertise to manipulate their ranking in web searches, will appear first in your results and bump down what you are actually looking for. Maybe the algorithm gives more weight to signals it shouldn't, with no human oversight or correction of the results it returns. It may also be that not all content on the internet is tagged or formatted well enough for search engines to find it. In short, many factors can contribute to search engines sometimes returning biased, useless, or even insulting content.
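To make the weighting point concrete, here is a minimal, hypothetical sketch in Python. The field names, weights, and scores are all invented for illustration; no real search engine works this simply or discloses its signals. The point is only that the choice of weights is a human decision, and that a large enough weight on paid promotion lets money, rather than relevance, decide what a user sees first.

```python
# Hypothetical ranking sketch: all values are invented for illustration.
# Real search engines use far more complex, largely undisclosed signals.

results = [
    {"title": "Community archive on local history", "relevance": 0.9, "ad_spend": 0.0},
    {"title": "Sponsored listing",                   "relevance": 0.4, "ad_spend": 1.0},
    {"title": "SEO-optimized aggregator page",       "relevance": 0.5, "ad_spend": 0.6},
]

# The weights are where the algorithm's "motivations" become explicit:
# raising AD_WEIGHT lets paid placement outrank genuine relevance.
RELEVANCE_WEIGHT = 1.0
AD_WEIGHT = 1.5  # deliberately large, to show the effect

def score(result):
    """Combine relevance and ad spend into a single ranking score."""
    return (RELEVANCE_WEIGHT * result["relevance"]
            + AD_WEIGHT * result["ad_spend"])

for r in sorted(results, key=score, reverse=True):
    print(f"{score(r):.2f}  {r['title']}")

# With these weights the sponsored listing ranks first (1.90),
# even though the community archive is the most relevant result (0.90).
```

Nothing in the code is biased in an obvious way; the skew comes entirely from a quiet numerical choice. That is why the readings keep returning to human oversight: someone has to be accountable for the weights.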

Safiya Umoja Noble pointed this out in her research on internet search results for Black women and other groups of women. While her story begins with racism in internet searches, she follows it deeper, to how people have been classified over centuries. Are the people writing algorithms building them with an unconscious, or sometimes conscious, bias toward the classifications and power structures that have developed in the world? Noble and others argue that the people in charge of archives and online content need to be educated in the history of marginalized people so they can empathize with and consider them when creating digital tools.

In “Open Data in Cultural Heritage Institutions: Can We Be Better Than Data Brokers?”, S.L. Ziegler discusses the exchange of information without consideration of the people it may affect. He writes that “The representation of one group by another group can range from obvious fiction to pretense of objective truth.” He quotes David Pilgrim, founding curator of the Jim Crow Museum:

“All groups tell stories, but some groups have the power to impose their stories on others, to label others, stigmatize others, paint others as undesirables, and to have these social labels presented as scientific fact, God’s will or wholesome entertainment.”

S.L. Ziegler, “Open Data in Cultural Heritage Institutions: Can We Be Better Than Data Brokers?,” Digital Humanities Quarterly 14, no. 2 (2020).

Bias can spill over into many areas of cultural heritage work. A common theme among our readings is the need for more human understanding, empathy, and oversight of digital tools and archival methods, so that we do not perpetuate disrespect and misunderstanding of groups of people.


2 Comments

  1. You make a great point about ensuring we are empathetic with our digital tools, as the tool itself is never neutral. But we must also remember that even having access to a tool, or learning how to use it, can be an act of exclusion in and of itself. The flow of information across digital spaces is one of exclusion and absence. While I think empathy is vital, as you said, understanding the accessibility and consequences of that flow is largely how certain perspectives become established and how research comes to be shaped by certain biases. So not only should we be empathetic, we must also be proactive about the absences of those who have others writing about them, or for them, in the digital sphere.

  2. Hello Julie,
    Before I took this class, I never thought about algorithms and the way that people use them. I had not realized that when we use a free digital service, we as users are subjected to data mining and advertising to “pay” for the service we receive. Like you said, this can lead to an inequity of search results, and search engines can therefore sometimes be biased.
