From Andrew Marantz at The New Yorker:
In retrospect, it seems that the company’s strategy has never been to manage the problem of dangerous content, but rather to manage the public’s perception of the problem. In [former UK Liberal Democrat leader Nick] Clegg’s recent blog post, he wrote that Facebook takes a “zero tolerance approach” to hate speech, but that, “with so much content posted every day, rooting out the hate is like looking for a needle in a haystack.” This metaphor casts Zuckerberg as a hapless victim of fate: day after day, through no fault of his own, his haystack ends up mysteriously full of needles. A more honest metaphor would posit a powerful set of magnets at the center of the haystack—Facebook’s algorithms, which attract and elevate whatever content is most highly charged. If there are needles anywhere nearby—and, on the Internet, there always are—the magnets will pull them in. Remove as many as you want today; more will reappear tomorrow. This is how the system is designed to work.
“It’s an open secret,” Sophie Zhang, a former data scientist for the company, recently wrote, “that Facebook’s short-term decisions are largely motivated by PR and the potential for negative attention.” Zhang left Facebook in September. Before she did, she posted a scathing memo on Workplace. In the memo, which was obtained by BuzzFeed News, she alleged that she had witnessed “multiple blatant attempts by foreign national governments to abuse our platform on vast scales”; in some cases, however, “we simply didn’t care enough to stop them.” She suggested that this was because the abuses were occurring in countries that American news outlets were unlikely to cover.
Nothing in the article is surprising, but Marantz adds far more detail than most people have seen reported.