Facebook COO shirks fight against online hate
Facebook’s Chief Operating Officer, Sheryl Sandberg, spoke at the World Economic Forum in Davos yesterday and delivered an astonishing message: fight hate on Facebook by ‘liking’ hate pages and posting positive messages on them.
Given Facebook’s extremely poor track record in fighting online hatred, we expected that the company’s next announcement might be of technology to detect and remove common forms of content that incite and intimidate, such as Islamist or neo-Nazi links, memes and hashtags.
Instead, Sandberg’s ridiculous suggestion places the burden once again on Facebook’s users.
Already the company requires users to report “hate speech” such as antisemitic content, but Facebook then invariably tells users that the content they reported did not breach its “community standards”. We continue to see everything from modern-day antisemitic conspiracy myths that Jews are funding ISIS to mediaeval blood libels spread like wildfire on the social network, with Facebook’s reporting system failing abysmally to have any effect.
Sandberg’s new idea that users should spend their time ‘liking’ pages that incite hatred and trying to post positive content on them is the most ludicrous non-solution imaginable. ‘Liking’ or commenting on a page increases its visibility in users’ news feeds, thereby exposing more people to the page’s message. The idea that positive comments would have any effect is also nonsensical, as comments can be deleted faster than they can be written, which is why antisemitic comments posted on our own Facebook page rarely stay there longer than a few minutes.
The sole evidence Sandberg offered in support of her idea was a story about the ‘successful’ targeting of the Facebook page of the far-right German National Democratic Party (NPD). Extolling the benefits of what she called a “like attack”, Sandberg took Facebook users for fools, patronisingly saying: “Rather than scream and protest, they got 100,000 people to like the page, who did not like the page and put messages of tolerance on the page, so when you got to the page, it changed the content and what was a page filled with hatred and intolerance was then tolerance and messages of hope. The best antidote to bad speech is good speech and the best antidote to hate is tolerance.” Except that what really happened is that 100,000 people wasted their time: not long after the “like attack”, the NPD’s Facebook page was back to normal, only now it had a lot more ‘likes’ and appeared more often in Facebook users’ news feeds and higher in search results.
Instead of coming up with increasingly pointless ways of distracting users from its failure to robustly tackle incitement and intimidation, it is time for Facebook to innovate: the company must develop the means to detect and remove hatred in exactly the same way that it proactively removes child pornography and copyrighted music and video from its platform.