
YouTube pushing harmful videos: Mozilla

YouTube's controversial algorithm is recommending videos considered disturbing and hateful that often violate the platform's very own content policies, a crowdsourced investigation by Mozilla has found.


New Delhi: Firefox browser developer Mozilla has claimed that Google-owned YouTube keeps pushing harmful videos and its algorithm is recommending videos with misinformation, violent content, hate speech and scams to its over two billion users.

The in-depth study also found that people in non-English speaking countries are far more likely to encounter videos they considered disturbing.


"YouTube's controversial algorithm is recommending videos considered disturbing and hateful that often violate the platform's very own content policies," according to a 10-month long, crowdsourced investigation released by Mozilla late on Wednesday.

YouTube told NBC News that videos promoted by the recommendation system result in more than 200 million views a day from its homepage, and that it pulls in more than 80 billion pieces of information.

"We constantly work to improve the experience on YouTube and over the past year alone, we've launched over 30 different changes to reduce recommendations of harmful content," the company said in a statement.

Mozilla conducted the research using RegretsReporter, an open-source browser extension that converted thousands of YouTube users into YouTube watchdogs.

People voluntarily donated their data, providing researchers access to a pool of YouTube's tightly-held recommendation data.

Research volunteers encountered a range of regrettable videos, reporting everything from Covid fear-mongering to political misinformation to wildly inappropriate "children's" cartoons.

"The non-English speaking world is most affected, with the rate of regrettable videos being 60 per cent higher in countries that do not have English as a primary language," the findings showed. Over 71 per cent of all videos that volunteers reported as regrettable were actively recommended by YouTube's very own algorithm.

Almost 200 videos that YouTube's algorithm recommended to volunteers have since been removed from YouTube -- including several that the platform deemed to violate its own policies.

These videos had a collective 160 million views before they were removed, said the Mozilla report.
