Algorithm: a process or set of rules to be followed in calculations or other problem-solving operations, especially by a computer.
If you think computing, technology, and the Internet are colorblind, you would be wrong. We are learning that race plays a big part in the way computers operate, because people are the ones who operate them.
Amazon is currently under fire because its algorithms told it not to offer Prime same-day delivery to minority areas of Boston, Atlanta, Chicago, Dallas, New York City, and Washington, D.C. The algorithm essentially determined that serving these areas would be unprofitable. According to Amazon, this was not a racist act but a simple business decision based on an algorithm. But the question remains: is this racism?
Algorithms can be racist and sexist. Algorithms decide which ads minorities and women see on websites. Studies have shown that marketing algorithms recognize patterns in the websites a person visits that indicate that person's race and gender. This is called bias. Because of these algorithms, women see ads for jobs that pay less than the jobs shown to men, and Black people see ads for certain neighborhoods but not others. The result is that certain information and opportunities are hidden or withheld from women and Black people.
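To make the mechanism concrete, here is a minimal sketch, in Python, of how a targeting rule that never sees race or gender can still act on them through correlated "proxy" signals. Everything in it, the groups, the sites, and the numbers, is invented for illustration.

```python
# Hypothetical sketch: an ad-targeting rule that never sees gender or race
# directly can still act on them through correlated "proxy" features.
# All data and the rule itself are invented for illustration.

import random

random.seed(0)

def make_user(group):
    """Simulate a browsing profile. The sites visited correlate with the
    user's demographic group, even though 'group' is never shown to the rule."""
    visits_fashion = random.random() < (0.8 if group == "A" else 0.2)
    visits_finance = random.random() < (0.3 if group == "A" else 0.7)
    return {"fashion": visits_fashion, "finance": visits_finance, "group": group}

users = [make_user(random.choice(["A", "B"])) for _ in range(10_000)]

def show_high_paying_job_ad(user):
    """A 'neutral' targeting rule: show the ad to people who browse finance
    sites. Demographics are never consulted."""
    return user["finance"]

shown = {"A": 0, "B": 0}
total = {"A": 0, "B": 0}
for u in users:
    total[u["group"]] += 1
    if show_high_paying_job_ad(u):
        shown[u["group"]] += 1

for g in ("A", "B"):
    print(f"group {g}: ad shown to {shown[g] / total[g]:.0%} of users")
# Typical output: group A sees the high-paying-job ad far less often than
# group B, even though the rule never looked at group membership.
```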
An analysis by Bloomberg reveals that Amazon has some explaining to do. Amazon claims that it does not care who shops at its online store. According to Craig Berman, Amazon's Vice President of Global Communications, "We don't know what you look like when you come into our store, which is vastly different than physical retail. We are ridiculously prideful about that. We offer every customer the same price. It doesn't matter where you live."
Amazon has been working to compete with brick-and-mortar retail by meeting the desire for immediate gratification: buying something and having it right then and there. Amazon's same-day delivery is meant to do just that. It promises same-day delivery on millions of products for Prime members in cities where the service is available. The service is available in 27 cities, with coverage in most areas within the city limits. However, Bloomberg's analysis shows the service is not available in predominantly Black neighborhoods in six major same-day delivery cities. African-Americans are about half as likely as white residents to live in neighborhoods with access to Amazon same-day delivery.
Amazon is not deliberately making decisions about where to deliver based on race. According to Berman, the ethnic makeup of neighborhoods isn't part of the data Amazon uses to decide where to deliver. "When it comes to same-day delivery, our goal is to serve as many people as we can, which we've proven in places like Los Angeles, Seattle, San Francisco, and Philadelphia. Demographics play no role in it. Zero."
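Berman may be telling the truth and the pattern can still appear, because the inputs the algorithm does use, such as existing Prime-member density or order volume, are themselves correlated with race. The sketch below, with invented ZIP codes, thresholds, and numbers, shows how a purely "business" cutoff can reproduce the Bloomberg pattern in miniature.

```python
# Hypothetical sketch: choosing delivery zones purely by existing Prime-member
# density can still track race, because membership density itself correlates
# with neighborhood demographics. All numbers are invented for illustration.

# (zip_code, prime_members_per_1000_residents, share_of_black_residents)
neighborhoods = [
    ("00001", 120, 0.05),
    ("00002", 95,  0.10),
    ("00003", 40,  0.65),
    ("00004", 110, 0.08),
    ("00005", 35,  0.70),
    ("00006", 55,  0.55),
]

DENSITY_THRESHOLD = 60  # serve a ZIP only if member density clears this bar

served = [n for n in neighborhoods if n[1] >= DENSITY_THRESHOLD]
skipped = [n for n in neighborhoods if n[1] < DENSITY_THRESHOLD]

def avg_black_share(group):
    """Average share of Black residents across a set of ZIP codes."""
    return sum(n[2] for n in group) / len(group)

print(f"served ZIPs:  avg Black share = {avg_black_share(served):.0%}")
print(f"skipped ZIPs: avg Black share = {avg_black_share(skipped):.0%}")
# The rule never looks at race, yet the skipped ZIPs end up being the
# predominantly Black ones: the Bloomberg pattern in miniature.
```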
After coming under fire for its algorithms, Amazon has changed some of its delivery areas. In Boston, it has expanded to the predominantly Black community of Roxbury. Amazon has also responded to complaints in the Bronx, New York, and in Chicago.
But there is clearly more to algorithms than just performing calculations. A recent discovery by an MBA student revealed that a Google search for unprofessional hairstyles returned hundreds of pictures of Black women with natural hairstyles, while a search for professional hairstyles returned pictures of white women. In many of the images the hairstyles were similar; the color of the woman's skin was the only difference. This is the algorithm at work.
But you have to ask: how does an algorithm recognize a Black-sounding name? If a person Googles a Black-sounding name, the ads that appear with the search results are often for services that let you look up arrest records. Another example occurred last July, when Google's image recognition software tagged photos of Black faces with the word gorilla. A Google engineer apologized and fixed the problem. But another search, using the words "nigger house" or "slave quarters," returned images of the White House. How does this happen?
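Part of the answer may be feedback from user clicks, and that is an assumption here rather than anything Google has confirmed: an ad system that simply maximizes click-through rate will keep pairing the arrest-record ad with whichever names it was clicked on most in the past. A toy sketch of that feedback loop, with invented names and counts:

```python
# Hypothetical sketch: a click-optimizing ad server has no concept of race,
# but if users historically clicked the "arrest record?" ad more often for
# some names, the server keeps pairing that ad with those names.
# All names, counts, and the mechanism itself are assumptions for illustration.

# (name, ad) -> (times_clicked, times_shown), an invented click history
history = {
    ("Name_A", "arrest_record_ad"): (90, 1000),
    ("Name_A", "contact_info_ad"):  (40, 1000),
    ("Name_B", "arrest_record_ad"): (20, 1000),
    ("Name_B", "contact_info_ad"):  (70, 1000),
}

def pick_ad(name):
    """Greedy click-through-rate maximizer: show whichever ad was clicked
    most often alongside this name in the past."""
    candidates = [(ad, clicks / shows)
                  for (n, ad), (clicks, shows) in history.items() if n == name]
    return max(candidates, key=lambda pair: pair[1])[0]

print(pick_ad("Name_A"))  # arrest_record_ad
print(pick_ad("Name_B"))  # contact_info_ad
# Past user behavior, not an explicit rule about race, ends up deciding
# which name gets paired with the arrest-record ad.
```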
Some researchers believe that search results revealing racial or gender bias are the result of the programmers who build the algorithms. Programmers are overwhelmingly white men. Could this be an extension of the lack of diversity in the technology industry? Researchers cite the disproportionate 2:1 male-to-female ratio among students seeking coding careers and an even more dismal share of Black programmers. This diversity gap, and the racial and gender bias it produces in algorithms, demands urgent attention and correction.