Google will increase the number of staff working to root out violent extremist content on YouTube to 10,000.
“We will continue the growth of our teams, with the goal of bringing the total number of people across Google working to address content that might violate our policies to over 10,000 in 2018,” YouTube CEO Susan Wojcicki told Britain’s Daily Telegraph.
According to her, some “bad actors” have used the site to mislead, manipulate, harass and even harm other people. She added that YouTube has developed machine-learning technology that can quickly find extremist videos. The technology can also be used to remove other objectionable material, such as videos containing exploitative content involving children. According to Wojcicki, more than 150,000 videos have been removed and nearly two million videos have been reviewed for violent extremist content since June.
She also said that the company will take aggressive action on comments. Wojcicki explained that new comment moderation tools will be launched and that, in some cases, comments will be shut down altogether.
YouTube, like Twitter and Facebook, has faced calls to increase its oversight of possible extremist content. The latest announcement comes after British Prime Minister Theresa May pressed social media companies to weed out radical content following a series of terrorist attacks in her country this year, The Hill reports.
“The tech companies have made significant progress on this issue, but we need to go further and faster to reduce the time it takes to reduce terrorist content online,” May said in a speech to the United Nations in October.
She has often called for an end to the “safe spaces” terrorists enjoy online, the BBC reports. In March, the United Kingdom government pulled its adverts from YouTube over concerns that they were appearing next to inappropriate content.