

How Facebook Uses Technology To Block Terrorist-Related Content


Facebook has created new tools aimed at keeping terrorist content off its site.

Social media companies are under pressure to block terrorist activity on their sites, and Facebook recently detailed new measures, including using artificial intelligence, to tackle the problem.

The measures are designed to identify terrorist content like recruitment and propaganda as early as possible in an effort to keep people safe, says Monika Bickert, the company's director of global policy management.

"We want to make sure that's not on the site because we think that that could lead to real-world harm," she tells NPR's Steve Inskeep.

Bickert says Facebook is using technology to identify people who were removed for violating its community standards on terrorism propaganda but then open fake accounts. The company also uses image-matching software to detect when someone tries to upload a known propaganda video, blocking it before it reaches the site.

"So let's say that somebody uploads an ISIS formal propaganda video: Somebody reports that or somebody tells us about that, we look at that video, then we can use this software to create ... a digital fingerprint of that video, so that if somebody else tries to upload that video in the future we would recognize it even before the video hits the site," she says.
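Bickert doesn't describe the software itself, but her account of a "digital fingerprint" that still matches when someone else re-uploads the video resembles perceptual hashing: reduce an image (or video frame) to a compact bit pattern that stays nearly identical under re-encoding, then compare new uploads against a blocklist of known fingerprints. The sketch below is purely illustrative and is not Facebook's actual system; the function names, the toy average-hash scheme, and the Hamming-distance threshold are all assumptions for the example.

```python
# Toy illustration of perceptual-hash fingerprint matching.
# NOT Facebook's implementation; a minimal sketch of the general idea.

def average_hash(pixels):
    """Fingerprint a tiny grayscale frame (flat list of 0-255 values):
    one bit per pixel, set if the pixel is brighter than the average."""
    avg = sum(pixels) / len(pixels)
    return sum(1 << i for i, p in enumerate(pixels) if p > avg)

def hamming_distance(a, b):
    """Count differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

def matches_known_propaganda(frame_pixels, blocklist, threshold=5):
    """True if the frame's fingerprint is within `threshold` bits of any
    fingerprint already flagged by human reviewers (hypothetical blocklist)."""
    h = average_hash(frame_pixels)
    return any(hamming_distance(h, known) <= threshold for known in blocklist)
```

Because the hash encodes coarse brightness structure rather than exact bytes, a re-compressed or slightly altered copy of the same frame still lands within a few bits of the original fingerprint, which is what lets a match fire "before the video hits the site."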

If it's content that would violate Facebook's policies no matter what, like a beheading video, then it would get removed. But for a lot of content, context matters, and Facebook is hiring more people worldwide to review posts after the software has flagged them.

"If it's terrorism propaganda, we're going to remove it. If somebody is sharing it for news value or to condemn violence, we may leave it up," Bickert says.

The measures come in the wake of criticism of how Facebook handles content. Last year, for example, Facebook took down a post of the Pulitzer Prize-winning photo of a naked girl in Vietnam running after a napalm attack. The move upset users, and the post was eventually restored. Facebook has also been criticized for keeping a graphic video of a murder on the site for two hours.

Morning Edition editor Jessica Smith and producer Maddalena Richards contributed to this report.

Copyright 2017 NPR. To see more, visit http://www.npr.org/.
