YouTube has deleted under one percent of flagged hate videos

YouTube has removed less than one percent of the nearly 15 million videos flagged to it as hateful or abusive, figures given to MPs have revealed.

Statistics provided to the UK Home Affairs Select Committee show that only 25,000 of nearly 15 million videos that were flagged as hateful or abusive between July and December last year have been removed.

That amounts to just 0.17 percent of the videos flagged.

The statistics were requested from YouTube as part of the parliamentary committee’s inquiry into hate crimes.

The revelation prompted accusations from the committee’s chair, Yvette Cooper, that YouTube and its parent company Google weren’t “taking any of this seriously enough”.

“We have raised the issue of hateful and extremist content with YouTube time and time again, yet they’ve repeatedly failed to act. Even worse than just hosting these channels, YouTube’s money-making algorithms are actually promoting them, pushing more and more extremist content at people with every click.

“We know what can happen when hateful content is allowed to proliferate online and yet YouTube and other companies continue to profit from pushing this poison.

“It’s just not good enough. Other social media companies are at least trying to tackle the problem but YouTube and Google aren’t taking any of this seriously enough. They should be accountable for the damage they are doing and the hatred and extremism they are helping to spread.”

YouTube says most of the videos were flagged by artificial intelligence rather than humans, and that computer programmes struggle with the "complex" area of hate speech.

The technology giant says the flags are often found to be inaccurate when reviewed by human moderators.

The UK select committee asked to see the statistics after Marco Pancini, YouTube's Director of Public Policy for Europe, the Middle East and Africa, appeared before MPs last month.

The committee is now waiting for answers from Google on a number of other questions posed in the hearing.

Among these is the question of why videos of the March 2019 Christchurch mosque shootings are still appearing on YouTube.

Google was strongly criticised after copies of the live stream posted by the shooter on Facebook were uploaded to YouTube thousands of times on the day of the terror attack.

A letter from the committee says one of the Christchurch videos had received more than 720,000 views.

UK Members of Parliament also want Google executives to explain why YouTube recommends videos of far-right figures such as Stephen Yaxley-Lennon (also known as Tommy Robinson) even to viewers who have never watched such content.
