The next US presidential election takes place in November 2020, and Facebook is already signaling what its stance will be during the campaign. The company says it will not fact-check politicians’ statements, nor apply its rules to them as strictly.
The announcement came from Facebook’s vice president of global affairs and communications, Nick Clegg. “It is not our role to intervene when politicians speak,” the executive said during a speech in Washington, DC, on Tuesday (24).
Clegg, who was the UK’s deputy prime minister between 2010 and 2015, said the company does not believe its role is to mediate political debates or to prevent certain speech from reaching voters and being subject to public scrutiny.
“That’s why Facebook exempts politicians from our fact-checking program,” he wrote on the company’s blog. The policy has been in place for a year and keeps politicians’ posts, whether organic or sponsored, from being sent to fact-checking partners.
The only exception, according to Clegg, is when a politician’s post includes links, videos or photos that have already been checked and debunked. In that case, the content is demoted in the News Feed, circulates alongside the fact-check and cannot be used in ads.
Facebook will be more lenient with political speech
The company also said that posts by politicians that violate its rules will not be removed automatically. Instead, its team will review each case and may keep the content up if the public interest outweighs the risk of harm.
“From now on, we will treat speech from politicians as newsworthy content that, as a general rule, should be seen and heard,” Clegg explained, referring to the so-called newsworthiness exemption. According to the executive, it will not apply to ads, which must still comply with the social network’s rules.
To assess the public interest, Facebook will consider factors specific to each country, such as whether an election is underway and whether the country has a free press. To assess the risk of harm, the company intends to weigh factors such as the content’s potential to incite violence.
In his speech, Clegg also outlined what the company has done to prevent foreign interference in elections. “It is no secret that Facebook made mistakes in 2016 and that Russia tried to use Facebook to interfere in the election, spreading division and disinformation,” he said.
According to him, the company has removed numerous fake accounts used to spread disinformation. In addition, it expanded its moderation team to 30,000 people and invested in artificial intelligence to detect harmful content on the platform.
The social network wants to act faster on deepfakes
Looking to 2020, deepfakes appear to be one of Facebook’s biggest challenges. Clegg admitted that the company needs to do more about manipulated videos, such as the one targeting US House Speaker Nancy Pelosi.
He recalled Mark Zuckerberg’s admission that the company took too long to act, which led many people to believe the doctored video of the congresswoman was real. After that case, the company announced a challenge for anyone who can develop methods of identifying deepfakes.
“We need to and will get better at identifying lightly manipulated content before it goes viral, and at providing users with much more compelling information when they see it,” Clegg promised.