Researchers find evidence of bias in recommender systems

In a new preprint study, researchers at the Eindhoven University of Technology, DePaul University, and the University of Colorado Boulder find evidence of bias in recommender systems like those surfacing movies on streaming websites. They say that as users act on recommendations and their actions are added to the systems (a process known as a feedback loop), biases become amplified, leading to other problems like declines in aggregate diversity, shifts in representations of taste, and homogenization of the user experience.

Collaborative filtering (CF) is a technique that generates personalized recommendations from historical data about interactions between users and items, such as users’ ratings of TV shows. But CF-based recommendations often exhibit bias against certain user or item groups, a problem that typically stems from biases in the input data and from algorithmic bias.
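
To make the technique concrete, here is a minimal sketch of user-based collaborative filtering on a toy ratings matrix. Everything in it (the data, the cosine-similarity scoring, the `recommend` helper) is illustrative and not the specific algorithms the researchers evaluated.

```python
import numpy as np

# Toy user-item rating matrix (rows: users, cols: shows); 0 = unrated.
# Hypothetical data for illustration only.
ratings = np.array([
    [5, 4, 0, 1],
    [4, 5, 1, 0],
    [1, 0, 5, 4],
    [0, 1, 4, 5],
], dtype=float)

def cosine_sim(a, b):
    """Cosine similarity between two rating vectors."""
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return a @ b / denom if denom else 0.0

def recommend(user, k=1):
    """Score unrated items for `user` via similarity-weighted ratings of other users."""
    sims = np.array([cosine_sim(ratings[user], ratings[v])
                     for v in range(len(ratings))])
    sims[user] = 0.0                      # exclude the user themself
    scores = sims @ ratings               # weighted sum of everyone's ratings
    scores[ratings[user] > 0] = -np.inf   # never re-recommend rated items
    return np.argsort(scores)[::-1][:k]

print(recommend(0))  # the unseen item user 0 is predicted to like
```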

It’s the researchers’ assertion that bias could intensify over time as users interact with the recommendations. To test this theory, they simulated the recommendation process by iteratively generating recommendation lists and updating users’ profiles, adding items from those lists according to an acceptance probability. Bias was measured with a function capturing the percentage increase in the popularity of recommended items relative to the popularity of the items users had actually rated.
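
That simulation loop can be sketched schematically as below. The population sizes, the fixed `ACCEPT_PROB`, the popularity-based recommender, and the exact bias measure are all assumptions made for illustration; the authors’ actual models and parameters may differ.

```python
import numpy as np

rng = np.random.default_rng(0)
n_users, n_items = 100, 50
ACCEPT_PROB = 0.2  # assumed fixed acceptance probability (illustrative)

# Each profile is the set of items a user has rated, seeded randomly here.
profiles = [set(rng.choice(n_items, size=5, replace=False)) for _ in range(n_users)]

def popularity():
    """Count how many profiles each item appears in."""
    counts = np.zeros(n_items)
    for p in profiles:
        for i in p:
            counts[i] += 1
    return counts

def recommend(user, k=10):
    """Toy recommender: the most popular items the user has not yet rated."""
    pop = popularity()
    pop[list(profiles[user])] = -1  # exclude already-rated items
    return np.argsort(pop)[::-1][:k]

for step in range(20):
    pop = popularity()
    rec_pops = []
    for u in range(n_users):
        recs = recommend(u)
        rec_pops.append(pop[recs].mean())
        for item in recs:
            if rng.random() < ACCEPT_PROB:  # user accepts the recommendation
                profiles[u].add(item)       # ...and it feeds back into the data
    rated_pop = np.mean([pop[list(p)].mean() for p in profiles])
    # Bias: percent increase of recommendation popularity over rated-item popularity.
    bias = (np.mean(rec_pops) - rated_pop) / rated_pop
    print(f"iteration {step}: popularity bias = {bias:.1%}")
```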


In experiments, the researchers analyzed the performance of recommender systems on the MovieLens data set, a corpus of over 1 million movie ratings collected by the GroupLens research group. Even a simple algorithm that recommended the most popular movies to everyone (excluding movies each user had already seen) deviated from users’ preferences over time as bias was amplified. The recommendations tended to be either more diverse than what users were interested in or over-concentrated on a few items. More problematically, the recommendations showed evidence of “strong” homogenization: because the MovieLens data set contains more ratings from male users than from female users, the algorithms gradually pulled female user profiles toward the male-dominated population, producing recommendations that deviated from female users’ preferences.
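
The paper uses its own metrics, but the gender drift it reports could be quantified along these lines: track the distance between the two groups’ average profiles across iterations. The `item_vectors` input (say, genre or embedding vectors per movie) is a placeholder assumption.

```python
import numpy as np

def group_centroid(group_profiles, item_vectors):
    """Mean feature vector over every item rated by a group of users."""
    vecs = [item_vectors[i] for profile in group_profiles for i in profile]
    return np.mean(vecs, axis=0)

def group_drift(female_profiles, male_profiles, item_vectors):
    """Cosine distance between the two groups' centroids. If this shrinks
    across the feedback loop's iterations, minority profiles are drifting
    toward the majority, i.e. the homogenization the study reports."""
    f = group_centroid(female_profiles, item_vectors)
    m = group_centroid(male_profiles, item_vectors)
    cos = f @ m / (np.linalg.norm(f) * np.linalg.norm(m))
    return 1.0 - cos
```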

Like the coauthors of another study on biased recommender systems, the researchers suggest potential mitigations. These include grouping users by average profile size and by the popularity of the items they rate, and adopting algorithms that explicitly control for popularity bias. They also advocate not restricting the re-rating of items already in users’ profiles, instead updating those ratings in each iteration.
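
One common way to control for popularity bias is a post-hoc re-ranking that trades relevance off against popularity. This is a generic sketch rather than the specific strategy the paper proposes, and `alpha` is an invented tuning knob.

```python
import numpy as np

def rerank(scores, popularity, alpha=0.3):
    """Re-rank candidates by trading relevance off against popularity.
    `alpha` (an assumed knob, not from the paper) controls how strongly
    popular items are penalized; higher values boost long-tail items."""
    pop_norm = popularity / popularity.max()  # scale popularity to [0, 1]
    adjusted = scores - alpha * pop_norm      # penalize popular items
    return np.argsort(adjusted)[::-1]

# Usage: three candidate items with relevance scores and raw popularity counts.
scores = np.array([0.9, 0.85, 0.4])
popularity = np.array([1000.0, 50.0, 10.0])
print(rerank(scores, popularity))  # the niche-but-relevant item moves up
```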

“The impact of feedback loop is generally stronger for the users who belong to the minority group,” the researchers wrote. “These results emphasize the importance of the algorithmic solutions to tackle popularity bias and increasing diversity in the recommendations since even a small bias in the current state of a recommender system could be greatly amplified over time if it is not addressed properly.”
