Technologist Mary Hodder's post yesterday on the blog TechCrunch, discussing the ethical issues raised by the Amazon 'glitch' that removed the rankings of gay content, made me think about analogies between the technical causes of the glitch and how unconscious bias can fuel structural racism.
Background on the Amazon ‘glitch’ issue from Wikipedia (edited):
Users on Twitter generated a firestorm of criticism after discovering that some erotic, lesbian, gay, bisexual, transgender, feminist and progressive books were being excluded from Amazon's sales rankings.
Various books and media were flagged as “adult content” (including children’s books, self-help books, non-fiction, and non-explicit fiction), with the result that works by established authors like E. M. Forster, Gore Vidal, Jeanette Winterson and D. H. Lawrence were now unranked.
The change first received publicity on the blog of author Mark R. Probst, who reproduced an e-mail from Amazon customer service describing a policy of de-ranking “adult” material.
However, Amazon later said that there was no policy of de-ranking LGBT material, and blamed the change first on a “glitch” and then on “an embarrassing and ham-fisted cataloging error” that had affected 57,310 books.
Here’s the meat of Mary Hodder’s TechCrunch post:
The ethical issue with algorithms and information systems generally is that they make choices about what information to use, or display or hide, and this makes them very powerful. These choices are never made in a vacuum and reflect both the conscious and subconscious assumptions and ideas of their creators.
The ethics bar in creating algorithms and classification systems should be very high. In fact I would suggest that companies with power in the marketplace, both commercial and ideas, should consider outside review of these systems’ assumptions and points of view so the results are fair.
Algorithms are often invisible, and difficult to detect by design, because technologies that use them are designed not to share the methods for providing information. This is mainly because users are focused on the tasks at hand in information systems, and given good information, they don't need to know everything under the system's hood, and because technology makers like to keep the "secret sauce" hidden from competitors, not to mention people who would game systems for their own devices such as spammers or other bad actors.
Several elements of Hodder's analysis describe dynamics analogous to structural discrimination and how it fosters racial discrimination.
Hodder writes about how algorithms are based on choices that reflect the "conscious and subconscious assumptions and ideas" of the programmers. The algorithms then determine which content is revealed, which is hidden, and which is ranked most prominently. (Think of how Google's search algorithm decides which results you see first.)
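To make that mechanic concrete, here is a minimal, hypothetical sketch in Python. This is not Amazon's actual system; the category names, book data, and filtering rule are all invented for illustration. It shows how a single upstream grouping decision, made once by a person, can silently de-rank an entire class of books:

```python
# Hypothetical illustration only -- not Amazon's actual code.
# The programmer's assumption is baked into this list: a non-explicit
# category ("gay & lesbian") is grouped with explicitly adult ones.
ADULT_CATEGORIES = {"erotica", "explicit", "gay & lesbian"}

# Invented sample catalog data.
books = [
    {"title": "Maurice", "categories": {"fiction", "gay & lesbian"}, "sales": 9200},
    {"title": "A Generic Thriller", "categories": {"fiction"}, "sales": 8100},
]

def is_rankable(book):
    # The rule itself contains no malice; the bias lives in the
    # category list it consults.
    return not (book["categories"] & ADULT_CATEGORIES)

# Only "rankable" books appear in the sales rankings, so a non-explicit
# literary novel vanishes alongside genuinely adult titles.
rankings = sorted(
    (b for b in books if is_rankable(b)),
    key=lambda b: b["sales"],
    reverse=True,
)
for rank, book in enumerate(rankings, start=1):
    print(rank, book["title"])
# Prints only: 1 A Generic Thriller
```

The point of the sketch is that the discriminatory outcome requires no discriminatory intent at run time; the choice was made once, upstream, when someone decided what belongs in the "adult" bucket.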
Similarly, human beings function within social systems, such as schools and workplaces, that others construct, set into motion, and maintain, producing outcomes guided by the "rules of the game."
In both the Amazon glitch and structurally biased social systems, the impact of system-driven automatic choices is hard to dispute: a category of books and a category of people each suffer discrimination that clearly damages their opportunity to succeed.
In both cases, the cause of the problem is a construct, one technological and one sociological: a human creation with no inherent malice that nonetheless produces discrimination because bias seeds the way the system makes choices.
Some of the reactions to Hodder's analysis also echo those we hear when we present the notion that unconscious bias, even in the absence of conscious discrimination, impedes opportunity.
A commenter on the post, identified as “AmillionBucks,” writes: “… discrimination made by an algorithm, whether it’s a reflection of its human parents or an accident, is more systematic than real human discrimination…”
An anonymous commenter wrote: "To suggest ethical (and/or moral) issues are at stake because of an inherent (prejudice) in the al-gə-rith-əm is just so stupid."
Another commenter supports Hodder's assertions:
“It was a human who made the choice, consciously or unconsciously, to place those terms in proximity and to weight those terms as something to filter out. This glitch (which I have no doubt it was) simply revealed the unconscious biases of the programmer,” wrote “Smart Back Jack.”
The post and the resulting exchange reveal that, even in another context (technology), we face incredible challenges in framing the relationship between human bias and structural discrimination.