Forget Presidential Candidates. Are Our Computers Also Racist and Sexist? (Audio)
Austin, Texas is currently the place to be for music industry insiders, with a whole host of artists performing across a spectrum of genres. However, the South by Southwest (SXSW) Festival is not just a place to get put on to the next hottest artist; it’s also a well-established platform for showcases, panels, and product unveilings in the tech industry. Robots and virtual reality headsets are making a big splash in the techie world, but not everything is geared toward pushing hot new products. Much of what happens at SXSW takes a more cerebral, discussion-based approach to technology, particularly the moral, philosophical, and human implications of advancements in the field. For example, the relationship between human developers and the programs they create has been a prevalent theme in the tech world for generations, but a recent conversation at SXSW extended it to contemporary social ills. As NPR asks, “Can Computer Programs Be Racist And Sexist?”
Such a question has become more commonplace in the tech industry as more and more of our digital and online interactions rely on algorithms. And therein lies the rub. By definition, algorithms can’t be racist per se – they are, after all, sets of calculations that a computer carries out according to fixed rules. However, one cannot ignore the very human element involved in writing the rules that make up any given algorithm. As such, there have been instances in which a computer inadvertently does something racist or sexist, and as NPR’s Laura Sydell reports, it can have seriously harmful results. In a recent segment for “All Tech Considered,” Sydell speaks with a young man named Jacky Alciné, who first made headlines in 2015 when he brought to Google’s attention a horrifying error in how the company had classified an image he had uploaded. The photo featured him and a friend – both of whom are African American – and Google assigned the tag “gorilla” to his friend’s face, exposing a fatal flaw in the site’s algorithm. Without getting too technical, it meant that the qualities the software’s back end associated with gorillas mirrored some of the qualities it associated with Black people, making it impossible for the computer to recognize any differences between the two. That glitch has since been fixed, but the conversation has only just begun.
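To make the idea concrete, here is a minimal sketch of how a classifier trained on skewed data can produce exactly this kind of mislabeling. Everything here is hypothetical – the feature vectors, labels, and the nearest-neighbor approach are illustrative assumptions, not Google’s actual system:

```python
# Toy sketch (NOT Google's actual tagger): a 1-nearest-neighbor
# classifier whose training data poorly covers the range of human
# faces. All feature vectors and labels below are made up.
from math import dist

training = [
    ((0.90, 0.80), "gorilla"),
    ((0.85, 0.75), "gorilla"),
    ((0.20, 0.30), "person"),   # "person" examples cover only a
    ((0.25, 0.35), "person"),   # narrow region of feature space
]

def tag(features):
    """Return the label of the closest training example."""
    return min(training, key=lambda ex: dist(ex[0], features))[1]

# A face whose (made-up) features fall outside the region the
# "person" examples cover lands nearest the wrong label:
print(tag((0.70, 0.70)))  # -> gorilla
```

The code itself follows its rules perfectly; the bias lives in the training data, which is the point Sydell’s reporting drives at.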
“Alciné’s experience is one of many strange biases that turn up in computer algorithms, which sift through data for patterns,” writes Sydell. Beyond the more mundane, benign uses of algorithms – things like Netflix movie suggestions based on previous choices – these formulas can be applied to far more consequential decisions, which is where structural racism and sexism can rear their ugly heads. A “Harvard study found that when someone searched in Google for a name normally associated with a person of African-American descent, an ad for a company that finds criminal records was more likely to turn up,” and “other studies show that women are more likely to be shown lower-paying jobs than men in online ads,” she explains. Haverford College computer science professor Sorelle Friedler shares with Sydell some of the potentially dangerous implications of algorithms, noting that social scientists are putting them to new uses, including “testing the idea of using algorithms to suggest how to sentence a particular type of criminal,” and adding that “[s]ome companies are using them to narrow the pool of job applicants.”
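The job-ad finding above can be illustrated with another hypothetical sketch. Assume (purely for illustration – no real ad platform is being described) a system that shows each group whichever ad that group has clicked most in the past. Skewed historical data then locks in a skewed outcome:

```python
# Toy sketch (hypothetical): an ad selector driven only by past
# click counts per group. The groups, ads, and numbers are made up.
past_clicks = {
    ("group_a", "high_paying_job_ad"): 40,
    ("group_a", "low_paying_job_ad"): 10,
    ("group_b", "high_paying_job_ad"): 5,
    ("group_b", "low_paying_job_ad"): 20,
}

ADS = ["high_paying_job_ad", "low_paying_job_ad"]

def pick_ad(group):
    """Show the ad with the most recorded clicks for this group."""
    return max(ADS, key=lambda ad: past_clicks[(group, ad)])

# The algorithm is just "following the data", yet group_b never sees
# the higher-paying ad – and so never generates the clicks that
# could correct the imbalance. A feedback loop, not an intention.
print(pick_ad("group_a"))  # -> high_paying_job_ad
print(pick_ad("group_b"))  # -> low_paying_job_ad
```

No one wrote a rule saying “show one group worse jobs”; the disparity emerges from the pattern-matching itself, which is why these biases are so easy to miss.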
Algorithms are among the most fundamental tools we rely on as technology becomes ever more embedded in daily life, but their omnipresence is likely to produce more cases in which the human brain’s conceptual flexibility clashes with the rigid structure inherent in the mathematics of computer programming. Is there a happy medium between eliminating algorithms altogether and accepting that sometimes racist or sexist things may happen?