Mozilla wants your help to fix terrible YouTube recommendations - The Verge

YouTube’s recommendation algorithm can lead you down some very weird rabbit holes, suggesting videos that feel weirdly personal and off-target at the same time. Today, Mozilla is introducing a new browser extension, RegretsReporter, that aims to crowdsource research into users’ “regrettable recommendations,” both to help users better understand how YouTube’s recommendation algorithm works and to provide details about any patterns it discovers.

Mozilla started gathering stories from users last year about the videos that YouTube recommended to them; one user searched for videos about Vikings and was recommended content about white supremacy; another searched for “fail” videos and started getting recommendations for grisly videos of fatal car wrecks.

But there hasn’t really been a large-scale, independent effort to track YouTube’s recommendation algorithm to understand how it determines which videos to recommend, said Ashley Boyd, Mozilla’s vice president of advocacy and engagement.

“So much attention goes to Facebook — and deservedly so — when it comes to misinformation,” Boyd said. “But there are other elements in the digital ecosystem that have been under-attended to, and YouTube was one of those. We started to look at what YouTube said, how they curated content, and noticed that they responded to concerns about the algorithm and said they were making progress. But there was no way to verify their claims.”

A YouTube spokesperson said in a statement to The Verge that the company is always interested to see research on its recommendation system. “However it’s hard to draw broad conclusions from anecdotal examples and we update our recommendations systems on an ongoing basis, to improve the experience for users,” the spokesperson said, adding that over the past year, YouTube has launched “over 30 different changes to reduce recommendations of borderline content.”

The Google-owned video platform has promised on numerous occasions to tweak the algorithm, Boyd points out, even as company executives were aware that it was recommending videos containing hate speech and conspiracy theories.

The browser extension will send data to Mozilla about how often you use YouTube, but without collecting information about what you’re searching for or watching unless you specifically offer it. You can send a report via the extension to provide more detail about any “regrettable” video you encounter in the recommendations, which will allow Mozilla to collect information about the video you’re reporting and how you got there.
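As a rough illustration of how that kind of usage data can be counted without logging searches or watch history, here is a minimal WebExtension-style sketch. This is not Mozilla’s actual RegretsReporter code; the storage key and counting logic are assumptions made for illustration only.

```typescript
// Illustrative sketch only -- not Mozilla's RegretsReporter implementation.
// Tallies how often YouTube is opened without recording searches or video IDs.
const USAGE_KEY = "youtubeVisitCount"; // hypothetical storage key

chrome.tabs.onUpdated.addListener((tabId, changeInfo, tab) => {
  // React only when a page finishes loading, and skip private windows entirely.
  if (changeInfo.status !== "complete" || tab.incognito || !tab.url) return;

  // Inspect the hostname only; the path and query string (which carry the
  // video ID and search terms) are never read or stored.
  if (new URL(tab.url).hostname.endsWith("youtube.com")) {
    chrome.storage.local.get({ [USAGE_KEY]: 0 }, (items) => {
      chrome.storage.local.set({ [USAGE_KEY]: items[USAGE_KEY] + 1 });
    });
  }
});
```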

Mozilla is hoping the extension will make the “how” of YouTube’s recommendation algorithm more transparent: what types of recommended videos lead to racist, violent, or conspiratorial content, for instance, and whether there are patterns in how often harmful content is recommended.

“I would love for people to get more interested in how AI and in this case, recommendation systems, touch their lives,” Boyd said. “It doesn’t have to be mysterious, and we can be clearer about how you can control it.”

Boyd stressed that user privacy is protected throughout the process. The data Mozilla collects from the extension will be linked to a randomly-generated user ID, not to a user’s YouTube account, and only Mozilla will have access to the raw data. It will not collect data in private browser windows, and when Mozilla shares the results of its research, it will do so in a way that minimizes the risk of users being identified, Boyd said.
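For a sense of what keying reports to a random identifier rather than an account can look like, here is a minimal sketch; the function name and storage key are assumptions for illustration, not Mozilla’s code.

```typescript
// Illustrative sketch -- one way to tie reports to a random ID instead of an account.
async function getAnonymousId(): Promise<string> {
  const { anonymousId } = await chrome.storage.local.get("anonymousId");
  if (anonymousId) return anonymousId;

  // crypto.randomUUID() produces an identifier with no link to the YouTube account.
  const freshId = crypto.randomUUID();
  await chrome.storage.local.set({ anonymousId: freshId });
  return freshId;
}
```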

Mozilla does not have a formal arrangement with Google or YouTube for its research into the recommendation algorithm, but Boyd says Mozilla has been in communication with the company and is committed to sharing its findings.

YouTube, however, said the methodology Mozilla was proposing seemed “questionable,” adding that it wasn’t able to properly review how “regrettable” is defined, among other things.

Mozilla plans to spend six months collecting information from the extension, after which it will present its findings to users and to YouTube. “We believe they are committed to this issue,” Boyd said of YouTube. “We would love it if they could learn anything additional from our research, and make some viable changes to work toward building more trustworthy systems for recommending content.”
