How Facebook Uses Technology to Curb the Spread of Terror-Related Content

Facebook recently detailed new measures, including using artificial intelligence, to tackle the spread of terror-related content.

Social media giants are under pressure to block terrorist activity on their sites. Facebook's new initiative aims to identify content such as recruitment material and propaganda as early as possible in an effort to keep people safe, says Monika Bickert, the company's director of global policy management.

“We want to make sure that’s not on the site because we think that that could lead to real-world harm,” she tells NPR’s Steve Inskeep.

Bickert says the social network is using technology to identify people whose accounts have been removed for sharing terrorist propaganda in violation of its community standards and who then go on to open fake accounts.

She added that the company is using image-matching software to tell whether someone is trying to upload a known propaganda video, blocking it before it gets on the site, NPR reports.

“So let’s say that somebody uploads an ISIS formal propaganda video: Somebody reports that or somebody tells us about that, we look at that video, then we can use this software to create … a digital fingerprint of that video, so that if somebody else tries to upload that video in the future we would recognize it even before the video hits the site,” she says.
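The "digital fingerprint" Bickert describes can be illustrated with a perceptual hash: rather than an exact checksum, it is a compact signature that stays nearly identical when a video is re-encoded or slightly altered. The sketch below is purely hypothetical; Facebook's actual system is not public. It computes a simple average-hash over an 8x8 grayscale frame and matches re-uploads against a blocklist by Hamming distance:

```python
# Hypothetical sketch of fingerprint matching; all names and the hashing
# scheme are illustrative, not Facebook's actual implementation.

def average_hash(pixels):
    """64 grayscale values (an 8x8 frame thumbnail) -> 64-bit fingerprint.
    Each bit records whether a pixel is above the frame's average brightness."""
    avg = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p >= avg else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

def is_known_propaganda(fingerprint, blocklist, threshold=5):
    """Near-duplicate match: a small Hamming distance tolerates the noise
    introduced by re-encoding or minor edits."""
    return any(hamming(fingerprint, known) <= threshold for known in blocklist)

# A reported video's frame is fingerprinted once and added to the blocklist...
reported = average_hash([10 * i % 256 for i in range(64)])
blocklist = {reported}

# ...and a slightly altered re-upload still matches before it hits the site.
reupload = average_hash([10 * i % 256 + (3 if i % 16 == 0 else 0)
                         for i in range(64)])
print(is_known_propaganda(reupload, blocklist))  # True
```

The design choice matters: an exact hash such as SHA-256 would miss any re-encoded copy, so systems of this kind rely on perceptual similarity rather than byte identity.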

Content that would violate Facebook’s policies in any context, such as a beheading video, is removed outright. But for much content, context matters, and Facebook is hiring more people worldwide to review posts after the software has flagged them.

“If it’s terrorism propaganda, we’re going to remove it. If somebody is sharing it for news value or to condemn violence, we may leave it up,” Bickert says.
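The flow Bickert describes, where software flags a post but human-judged context decides the outcome, can be sketched roughly as follows. The field names and decision function are illustrative assumptions, not Facebook's actual API:

```python
# Hypothetical sketch of the flag-then-review flow described above.
# All names are illustrative, not Facebook's real system.

def review(post):
    """Return a moderation decision for a flagged post."""
    if post["is_beheading_video"]:
        # Violates policy regardless of context: removed outright.
        return "remove"
    if not post["matches_known_propaganda"]:
        return "leave_up"
    # Context matters: human reviewers judge the poster's intent.
    if post["intent"] in ("news_value", "condemnation"):
        return "leave_up"
    return "remove"

print(review({"is_beheading_video": False,
              "matches_known_propaganda": True,
              "intent": "condemnation"}))  # leave_up
```
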

The measures come in the wake of criticism of how Facebook handles content. Last year, for example, Facebook took down a post featuring the Pulitzer Prize-winning photo of a naked girl fleeing a napalm attack in Vietnam. The move upset users, and the post was eventually restored. Facebook has also been criticized for leaving a graphic video of a murder on the site for two hours.
