Matt Artz looks at how algorithms limit the spread of new art and content
October 2021 – Last month, Drew University’s Matt Artz, a design anthropologist, discussed how capital and algorithmic bias contribute to inequality in the art world at the ninth annual Why the World Needs Anthropologists conference.
Artz, an adjunct assistant professor in Drew’s business program, based his talk on his market research with Artmatcher, a mobile app that aims to address access and inclusion issues in the art market.
“Algorithms are becoming increasingly responsible for mediating our social interactions, and bias within these systems has a tendency to reaffirm existing lines of inequality in society,” explained Artz, who has given TEDx and SXSW talks in recent years.
“This can be seen in policing, image classification, and recommender systems – the gatekeepers to information found in search engine results, social media interactions, jobs, academic publications, books, and media content.”
Artz explained that art and other content not favored by biased algorithms is effectively blocked from reaching large audiences, while work backed by the right mix of capital is promoted further, producing a rich-get-richer outcome.
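The rich-get-richer dynamic can be illustrated with a toy simulation. The sketch below uses a simple preferential-attachment model (a hypothetical illustration, not Artmatcher's actual algorithm or any system Artz describes): when a recommender shows viewers artworks in proportion to existing popularity, a few works end up dominating attention.

```python
import random

random.seed(42)

# Ten artworks, each starting with one view.
views = [1] * 10

# A toy "recommender" that surfaces artworks with probability
# proportional to their current popularity -- the bias that
# produces a rich-get-richer outcome.
for _ in range(10_000):
    idx = random.choices(range(len(views)), weights=views)[0]
    views[idx] += 1

top_share = max(views) / sum(views)
print(f"Most popular artwork captured {top_share:.0%} of all views")
```

Even though all ten artworks start out identical, small early differences in exposure compound, and the final distribution of views is highly unequal.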
Artz is currently teaching a course on human-centered design, which focuses on the people for whom products, services, and experiences are created, "both in the mainstream and those at the extremes."
“This perspective is important because it is often the disempowered people at the extremes who are absent when systems are designed, and they’re the ones who suffer the most from algorithmic bias.”
While he enjoys exploring why problems like algorithmic bias occur, Artz spends most of his class time discussing what can be done to reduce the risk of such issues.
“We discuss the value of empathy, research, and storytelling, and how those skills can help us understand and give voice to marginalized groups.”
The biggest lesson students can take away from the topic?
“They will learn that technology is not neutral.
“Technology is viewed as an unbiased, objective tool, but it is a social construction that mirrors the culture in which it was created. With this course, Drew is offering students the opportunity to learn about this perspective and, in doing so, is contributing to the next generation of leaders who will help address the problems of inequality and bias we face globally.”