News
Facebook looking to open content moderation for external audit
Facebook is working to open its content moderation systems to external review, chief executive Mark Zuckerberg said in a comment piece in the Financial Times on Monday.
The tech giant has been criticised for being too slow to remove hate speech and terrorist propaganda from its platform, but has also been accused of suppressing right-wing voices.
In the article, Zuckerberg repeated his appeal for government regulation on issues such as electoral advertising, harmful content and data portability.
“Companies like mine need better oversight when we make decisions, which is why we’re creating an independent Oversight Board so people can appeal Facebook’s content decisions.
“We’re also looking at opening up our content moderation systems for external audit,” he added.
Facebook came in for intense criticism after the deadly 2019 attack on two mosques in Christchurch, New Zealand, which the attacker live-streamed on Facebook.
The company has also been criticised for allowing Russian-backed trolls to post ads aimed at influencing the 2016 U.S. presidential election, as well as over the Cambridge Analytica data harvesting scandal.
Zuckerberg said that his company was working with governments, including New Zealand’s, “on what regulation could look like.”
“I believe good regulation may hurt Facebook’s business in the near term but it will be better for everyone, including us, over the long term.
“If we don’t create standards that people feel are legitimate, they won’t trust institutions or technology,” Zuckerberg added.