© 2024 WFIT
Public Radio for the Space Coast

Leading Anti-Terror Technologist Says Suspend Facebook Live Following Mosque Shootings

AILSA CHANG, HOST:

A leading expert in anti-terror technology is calling on Facebook to suspend live video in the wake of the New Zealand massacre. He says the company's failure to pull down footage of the tragedy is absolutely inexcusable. The suspect had streamed the shooting live on Facebook, and from there, it was shared hundreds of thousands of times, even after New Zealand police alerted the company. Here's NPR's Aarti Shahani.

AARTI SHAHANI, BYLINE: After Facebook removed the video, users attempted to upload it again in various forms about 1.5 million times. Of those attempts, 300,000 slipped through the cracks. That's a 1 in 5 failure rate.
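The failure rate cited there follows directly from the two figures in the report; a quick check of the arithmetic, using only the numbers stated above:

```python
# Figures from the report; this is just the arithmetic behind the
# "1 in 5" claim, not anything about Facebook's actual pipeline.
attempted_uploads = 1_500_000   # re-upload attempts after removal
slipped_through = 300_000       # copies that evaded the filters

failure_rate = slipped_through / attempted_uploads
print(failure_rate)  # 0.2 -- that is, 1 in 5
```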

HANY FARID: The repeated uploading is an absolute failure, and it is inexcusable because we have the technology to stop it.

SHAHANI: Hany Farid, a leading architect of that technology.

FARID: And if your technology isn't working, well, then you haven't innovated enough. You can't claim this is a hard problem. It's the same video. It's the same video. How can this be this hard of a problem? I simply don't buy that argument.

SHAHANI: Farid, an incoming professor at the University of California at Berkeley, worked with Microsoft 10 years ago to create PhotoDNA, a tool that tech giants rely on to fingerprint digital content. The algorithms have evolved, so a photo, video or audio clip can be fingerprinted and automatically blocked even when it's been modified.
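The core idea behind that kind of fingerprinting is perceptual hashing: reduce a piece of content to a compact signature that stays stable under small edits, then compare signatures rather than raw files. PhotoDNA itself is proprietary, so the sketch below is not its algorithm; it is a toy "average hash" over a grayscale image, with Hamming distance used to match near-duplicates. The image data here is made up for illustration.

```python
# Toy perceptual fingerprint: one bit per pixel, set by whether the
# pixel is brighter than the image's average. Lightly modified copies
# (recompression, small noise) produce the same or nearly the same
# bits, so they can be matched by Hamming distance.

def average_hash(pixels):
    """Fingerprint a grayscale image given as rows of 0-255 ints."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(a, b):
    """Count differing bits between two fingerprints."""
    return sum(x != y for x, y in zip(a, b))

original = [[10, 200], [220, 30]]
recompressed = [[12, 198], [221, 28]]   # slightly altered copy
unrelated = [[200, 10], [30, 220]]      # different content

# The altered copy still matches; unrelated content does not.
assert hamming(average_hash(original), average_hash(recompressed)) == 0
assert hamming(average_hash(original), average_hash(unrelated)) > 0
```

Real systems work on downscaled frames and set a distance threshold rather than demanding an exact match, which is why Farid argues that recut or rerecorded copies of "the same video" should still be catchable.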

Facebook says it used automated technology, but the video was recut and rerecorded into formats that made it harder to match copies. Farid says this excuse rings hollow. It's a common problem, and tech giants have had a decade to solve it.

FARID: Haven't figured out that problem yet, I think, says a lot about your priorities at these companies. It's simply not your priority.

SHAHANI: The U.S. Congress and European regulators have relied on Farid to fact-check the tech giants. He says political leaders should launch an inquiry into this recent Facebook failure and insist on honest answers. He compares it to another public safety debacle - Boeing.

FARID: There was a global outcry. We grounded planes. We stopped until we got answers to secure that.

SHAHANI: With investors, Facebook leaders talked up their ability to solve the hardest technical problems, like getting livestream videos to work for millions of people on smartphones - at the time, a really hard problem. Here's CEO Mark Zuckerberg in November 2016.

(SOUNDBITE OF ARCHIVED RECORDING)

MARK ZUCKERBERG: So there aren't that many companies that can do this at the scale that we're talking about, and this has been a big advantage for us.

SHAHANI: When it comes to security - building the guardrails - company leaders are much quieter. Facebook declined to say how many views the massacre footage got in total from the 300,000 re-uploads. The company also declined to respond to Farid's comments, which NPR shared in an email. Aarti Shahani, NPR News, Berkeley.

CHANG: And we should say Facebook is one of NPR's financial sponsors. Transcript provided by NPR, Copyright NPR.

NPR transcripts are created on a rush deadline by an NPR contractor. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.

Aarti Shahani is a correspondent for NPR. Based in Silicon Valley, she covers the biggest companies on earth. She is also an author. Her first book, Here We Are: American Dreams, American Nightmares (out Oct. 1, 2019), is about the extreme ups and downs her family encountered as immigrants in the U.S. Before journalism, Shahani was a community organizer in her native New York City, helping prisoners and families facing deportation. Even if it looks like she keeps changing careers, she's always doing the same thing: telling stories that matter.