
'The Wall Street Journal' Takes A Deep Dive Into The Facebook Files

NOEL KING, HOST:

Facebook reportedly knows that it exposes its users to misinformation, violent imagery, human trafficking, calls for violence against specific ethnic groups and a lot more. And Facebook has tools to stop all of this. But internal documents obtained by The Wall Street Journal suggest it's not using them. I should note here that Facebook is an NPR sponsor. Jeff Horwitz is a Wall Street Journal reporter. He's part of the team that conducted a four-part investigation.

Jeff, start by telling me about XCheck.

JEFF HORWITZ: So XCheck was the system Facebook created to try to make sure it didn't mess up on really high-profile user complaints. And it had to do this because its normal process for adjudicating content issues isn't reliable enough for what it considered to be VIP users. So what it ended up doing was putting 5.8 million or so of its nearly 3 billion users around the globe into this program that gave them special privileges and better treatment during enforcement. And in fact, for some users, it exempted them from the rules entirely, which meant that those users could basically just do whatever they wanted on the platform. This resulted in billions of views of violating content that Facebook knew it could've stopped and also allowed powerful, popular accounts, such as that of Neymar, the Brazilian soccer player, to show what Facebook deemed to be revenge porn...

KING: Oh.

HORWITZ: ...To more than 50 million people.

KING: Oh. OK. Well, that's a lot. But that's just one part of it. I mean, you also wrote about how Facebook rewards outrage content, the stuff that makes people really mad and then makes them engage. And Mark Zuckerberg, you write, had a chance to fix it. And what happened?

HORWITZ: So the company realized that promoting engagement - creating algorithms that promote engagement - ended up promoting really angry content. You could see internally that Facebook researchers were worried they were making politics and political discourse around the world much, much more contentious and vitriolic. They realized this but then didn't want to roll back the changes they'd made to their algorithm because that would hit their growth metrics.

KING: The fourth part of your series details horrific things that can be found on Facebook, including job postings that lure women into situations akin to slavery. Tell me about what you found and why that has been allowed to go on.

HORWITZ: So this is, I suppose, sort of a crisis of a lack of concern in some respects.

KING: Huh.

HORWITZ: Facebook was very much aware that it had a large-scale problem with human trafficking on its platform. And in fact, in some instances, it even allowed people to sell maids so long as they were doing it through, quote-unquote, "brick-and-mortar" establishments in the Gulf states. And they simply tolerated this until Apple - the, you know, maker of my iPhone - told them it was going to kick them off the App Store - so remove Facebook and Instagram from its App Store - unless they took care of it immediately. They did it then, but then they let the problem get back out of hand. And there's no question that they knew there were massive amounts of this stuff on their platform.

KING: What has Facebook's response been to your reporting?

HORWITZ: Facebook has said that they're working to get better. They're not denying any of this stuff.

KING: Hmm.

HORWITZ: This is drawn directly from their documents. And they're, you know, basically arguing that perhaps we're being a little bit negative and that they are good.

KING: Wall Street Journal reporter Jeff Horwitz. Again, this series is really great. Thanks, Jeff.

HORWITZ: Thank you.

(SOUNDBITE OF SUBLAB AND AZALEH'S "VIDURA")

Transcript provided by NPR, Copyright NPR.