Nova Scotia’s election season is ramping up. On social media, political advertisements and donation pleas from parties have been filling newsfeeds for the past few weeks, all leading up to election day, Tuesday, August 17.
But it’s not just social media managers who are keeping track of what gets posted to Facebook. The social media giant has its own fact checkers who review election-related posts. “If content does not violate our community standards but may contain disinformation, this is when our independent third-party fact-checking partners come in. When a fact checker rates a story as false, we show it lower in News Feed, significantly reducing its distribution,” says Rachel Curran, public policy manager for Facebook Canada, in a phone interview.
Globally, the company has over 35,000 people working in three main areas: political ad transparency, fighting misinformation and combating interference on Facebook. In Canada, those fact-checking positions are contracted out to people from Agence France-Presse and Radio-Canada.
“Content across Facebook and Instagram that has been rated false or partly false is prominently labeled,” Curran explains. “So people can better decide for themselves when to read, trust and share.”
Facebook says that since 2017, it's been involved in 200 elections around the world, including the Canadian federal election in 2019. It was also here in 2017 for the last Nova Scotia provincial election. “In 2017, our teams identified and removed over 150 covert influence operations for violating our policy against coordinated inauthentic behaviour,” Curran says.
These violations include everything from failing to disclose who paid for an ad to spreading misinformation about the voting process.
“A critical part of preventing abuse on our platforms around elections is removing content that misleads people on when or how to vote,” Curran says. “As we find material that violates our community standards, we not only remove that specific ad or photo, we also proactively search for other instances of the same bad content, and we remove that as well.”
Since our last provincial election, Facebook says it has strengthened its vetting process and tightened its transparency requirements to meet and exceed the requirements of the federal government’s Elections Modernization Act, Bill C-76.
“In Canada, we rolled out a transparency tool for advertisers in compliance with the obligation of the Elections Modernization Act in June,” Curran says. “Now, anyone who wants to run ads on social issues, elections or politics in Canada needs to first confirm their identity and location in Canada, and disclose who paid for the ad.”
As of the first quarter of 2021, Facebook says it has taken action against over 1.3 million fake accounts, many of which aren’t politically motivated but rather, Curran says, “financially motivated.”
“I won't mention them by name, but the business model is about generating outrage, spreading misinformation and getting donations through that kind of spread of misinformation,” she explains. “And there are certainly some in Canada as well.”
While Facebook executives are virtually unreachable for the average user, Facebook Canada has given direct lines of communication to each political party and to Elections Nova Scotia, in case any issues arise.
“We've conducted outreach to all Nova Scotia political party and candidate page administrators, reminding them about two-factor authentication and ensuring they have access to our cyber threat hotline,” Curran says. “As far as misinformation goes on specific Nova Scotia issues, we have not seen any samples of that so far, but we're always monitoring very actively.”
Curran says Facebook ads leading up to this year’s election should be transparent. “If you see any political ads on the platform, it will be evident who's running them, which political party is running them,” she explains. “And then of course, if you go in the ad library, you can search how much was spent on the ad, who it reached, all of the information about the ad.”
For political groups that violate Facebook’s rules or provincial laws, Curran says there will be consequences, and she urges users to report any suspicious activity.
“If these anti-mask groups are promoting events that are contrary to public health orders, we will remove that. If they're spreading misinformation, we will remove that,” Curran says. “Eventually, if they accumulate a certain number of strikes against their account, we may remove or ban them from the platform altogether. But it's not just sort of one strike against the group that will result in its removal. We're constantly improving our automated systems to pick up this stuff proactively, but it really helps to get user reports also.”