Foreign and domestic actors will almost certainly abuse digital media to try to manipulate voters and undermine the integrity of Canada's next federal election, experts are warning the Trudeau government.
But while there's no silver bullet to prevent it, the government is being urged to implement at least two measures that experts say will help make Canadians less susceptible to manipulation: require online political advertisements to be totally transparent and require digital media to disclose every automated account or "bot" deployed to amplify political messages.
"Those are simple things that we can begin to do to give people more transparency into how the (digital) platforms operate," says Ben Scott, policy adviser on innovation in former U.S. president Barack Obama's administration.
"And even if it doesn't solve even a significant amount of the problem, it begins to change the way people think about digital media and it sets the stage for more difficult interventions in market regulation, like data privacy regulation and competition policy."
Scott, now policy and advocacy director for the Omidyar Network, a foundation that invests in supporting democratic institutions, was one of three experts invited to talk to Prime Minister Justin Trudeau and his ministers during a cabinet retreat Wednesday.
The government sought the expert advice as it ponders ways to beef up Bill C-76, omnibus legislation introduced last spring and aimed, among other things, at preventing foreign interference in Canadian elections.
Democratic Institutions Minister Karina Gould confirmed earlier this week that the government is particularly interested in doing more to ensure foreign money is not involved in efforts to influence Canadian voters.
Scott doubted the government has time to tackle some of the bigger issues surrounding cyberthreats before the next election in October 2019. But he said it could tackle the "low-hanging fruit."
First, he said, every online political ad should be required to include a pop-up message that tells people, "'Here's who paid for this ad, here's how much they spent, here's how many people it reached and here's why you got it.'
"(It should say) 'you got it because they selected this demographic, this race, this gender, this geography, this profession, this income level,' so that you understand who's trying to influence you and why."
In addition, Scott said digital media should be required to disclose all automated accounts.
"People should know there's not a human being behind this account," he said.
Scott said the "disaster" that has befallen the American political system was accelerated by disinformation disseminated through digital media, accentuating extreme views and causing "a rapid and destructive polarization in American political culture." The same phenomenon has occurred in different ways across the democratic world and Canada will not be immune, he said.
"In my view, it's not a question of whether it will happen in Canada. It will happen in Canada. The only question is what will it look like and how well will we be prepared to mitigate the negative effects of that new reality of digital media?"
On that score, Scott declined to say how Bill C-76 measures up, stressing that he's not an expert in Canadian election law. Nevertheless, he said: "In my view, digital ad transparency and the labelling of automated accounts ought to be part of any comprehensive election reform package."
Another expert consulted by the cabinet, University of British Columbia digital media professor Taylor Owen, agreed that C-76 could use some added muscle.
"The single thing that they could do that would be most powerful would be more robust ad transparency so that we know, every political piece of content that's shown to us as an individual is identified: who purchased it, who they targeted, how much they spent and who it reached," he said.
"If we have that, then at least we'll get transparency in the system so that after an election we can hold actors accountable for who tried to influence whom."
Owen said he also believes that online giants like Facebook, Google and Twitter should be made legally liable for the content they allow on their platforms. There is no reason, he argued, that digital media should be able to publish hate speech that is illegal in the non-digital world.