If you’ve ever felt like it’s hard to “untrain” YouTube’s algorithm from suggesting a certain type of video once it slips into your recommendations, you’re not alone. In fact, it can be even more difficult than you think to get YouTube to accurately understand your preferences. A major problem, according to research by Mozilla, is that controls in the YouTube app, such as the “dislike” button, are largely ineffective as tools for managing suggested content. According to the report, these buttons “prevent less than half of unwanted algorithmic recommendations.”

The Mozilla researchers used data collected from RegretsReporter, their browser extension that allows people to donate their recommendation data for use in studies like this one. In total, the report drew on millions of recommended videos, as well as anecdotal reports from thousands of people.

Mozilla tested the effectiveness of four different controls: the thumbs-down “dislike” button, “not interested,” “don’t recommend channel,” and “remove from watch history.” The researchers found that these had varying degrees of effectiveness, but that the overall impact was “small and inadequate.”

Of the four controls, “don’t recommend channel” was the most effective, preventing 43 percent of unwanted recommendations, while “not interested” was the least effective, preventing only about 11 percent of unwanted suggestions. The “dislike” button performed about the same at 12 percent, and “remove from watch history” prevented about 29 percent.

In their report, the Mozilla researchers noted the great lengths study participants said they would sometimes go to in order to avoid unwanted recommendations, such as watching videos while logged out or while connected to a VPN. The researchers say the study highlights the need for YouTube to better explain its controls to users and to give them more proactive ways to define what they want to watch.

“The way that YouTube and many platforms operate is that they rely on a lot of passive data collection to infer what your preferences are,” says Becca Ricks, a senior researcher at Mozilla and co-author of the report. “But it’s a bit of a paternalistic way of operating where you’re making decisions on behalf of people. You could ask people what they want to do on the platform instead of just looking at what they’re doing.”

Mozilla’s investigation comes amid a surge in calls for major platforms to make their algorithms more transparent. In the United States, lawmakers have proposed bills that would rein in recommendation algorithms and hold companies accountable for algorithmic bias. The European Union is even further ahead: the recently passed Digital Services Act will require platforms to explain how their recommendation algorithms work and to open them up to outside researchers.
