Facebook is increasing the number of people working on safety and security issues around the world to 20,000 by the end of the year, and has dedicated teams working on all of the world’s upcoming elections – including Pakistan’s – to help detect and prevent malicious actors and abuse on Facebook.
Ahead of Pakistan’s elections, Facebook has ramped up its efforts to ensure it is doing everything it can to prevent abuse, working in collaboration with the Election Commission of Pakistan to better understand and address the specific challenges faced here.
Facebook has incredible potential to be a positive force for democracy around the world. It gives a voice to people of all ages and political beliefs, encourages debate and the healthy exchange of ideas, and makes leaders more accountable to their constituents.
We also know that misinformation is harmful to our community and makes the world less informed, particularly in the context of elections. We can’t combat false news alone – we believe it requires a concerted effort across industry, academia, civil society and government – but we are absolutely committed to playing our part. That’s why we are working hard to fight the spread of false news on Facebook through a combination of technology and human review, and by arming our community with the ability to recognise what might be false. Our efforts are focused on the following key areas:
Third-party fact-checking: As of this week, we will begin piloting our third-party fact-checking programme with our community in Pakistan, in partnership with AFP. Third-party fact-checking is one of the ways we are fighting misinformation; our partners are well-respected fact-checkers certified by Poynter’s non-partisan International Fact-Checking Network. Here’s how we work with our fact-checking partners, using a combination of technology and human review to detect and demote false news on Facebook: