Preventing social media from interfering in Canadian elections

Sara Bannerman, McMaster University

The clock is ticking.

Critical Canadian elections are being held in the coming months. The Ontario provincial election is this June, New Brunswick’s is in September, the Quebec election is being held in October and the federal election is just a year and seven months away, in October 2019.

Companies that use social media to manipulate attitudes and behaviours are again facing scrutiny due to revelations about lax privacy practices.

Social media propaganda produced by Russian operatives in key swing states during the 2016 presidential election in the United States accounted for 30 to 40 per cent of election-related tweets, according to a U.S. congressional study. Advertisers can sow duplicitous, inflammatory or false information, the report also found.

A Facebook ad linked to a Russian effort to interfere with the 2016 U.S. presidential election is shown in this February 2018 photo. (AP Photo/Jon Elswick)


Facebook has now suspended Cambridge Analytica and SCL Group, not for their long-known practice of amassing personal data from social media and crafting micro-targeted messages to manipulate elections, but for breaking promises to delete the data.

The U.S. Federal Trade Commission, the country’s privacy watchdog, has announced that the social media giant is now under investigation over Cambridge Analytica’s use of the personal data of 50 million Facebook users to help elect Donald Trump.

Canada’s privacy commissioner has also announced he’s launched a formal investigation into Facebook and will look into whether the personal information of Facebook users in Canada was affected.

The power of analytics companies — and of platforms whose reach is now greater than the most-watched broadcast programming, according to the U.S. Senate study — requires far greater public accountability.

Advertising has long been aimed at specific populations, whether it’s women, LGBTQ audiences or ethno-cultural communities. To some extent, traditional advertising is transparent: advertisements placed in publications are public, everyone sees the same ad, and everyone can plainly identify its target.

Ads are personalized and opaque

But online advertisements are now so individualized that this is no longer the case; only social media platforms know what information is being targeted to whom.

The data being harvested, and the inferences drawn from such data, permits manipulation on a massive scale that, currently, only social media companies are in a position to see. We’re no longer living in a marketplace of ideas, but in a fun house of infinity mirrors.


There’s one solution that would go a long way toward combating election manipulation and advertising manipulation more generally: A full public archive of all online ads.

Karina Gould, Canada’s minister of democratic institutions, has acknowledged the threats of cyber-attacks and foreign influence on Canadian elections, but in February dismissed many of the solutions that have been proposed to confront these problems. At the same time, she has committed to talking to social media companies about the issues, and wants to see results by around August 2018.

She recently emphasized the importance of algorithmic transparency, and Prime Minister Justin Trudeau has called on Facebook to fix its fake news problem.

Self-regulation not working

But calls for social media companies to regulate themselves are too little, too slow.

Canada’s political parties are making use of online advertising as much as anyone else. The founder of Steve Bannon’s “psychological warfare tool” once worked for both the Liberal party and Barack Obama’s campaign.

The Liberal party and the federal government have connections and partnerships with Facebook and other platforms.

Regardless, the Liberal government and all political parties should put in place regulations to help safeguard the integrity of our democracy by ensuring the actions of social media companies will withstand public scrutiny.

Self-regulation is simply not enough.

In October, Twitter promised a public archive of ads on its platform, but it has since missed its self-imposed deadline. Facebook has made limited information about its ads available to users in Canada, and claims to have plans for an archive of election-related ads.

Facebook’s so-called Election Integrity Initiative promises a “Cyber Threats Crisis Email Line” to report suspected cyber-interference.

But how will we know about interference on Facebook if we can’t examine Facebook ads? When, if ever, will social media companies’ plans for ad archives materialize? The promised self-regulatory initiatives fall far short of what is needed to ensure true online transparency.

Other countries are taking action.

U.S. pushing for public ad archive

In the United States, the Honest Ads Act would require online platforms to make a public archive of election ads, including a description of their target audience, searchable by the name of the purchaser of the ad, the name of the candidate that is the subject of the ad, the issue discussed in the ad, or by date.

Canada should pass legislation to require platforms to have a full, live public archive of all targeted online advertisements, displaying the ad itself, its source, the targeted audience, the amount spent, impressions delivered and the demographics of the audience reached.

In the case of election advertisements, the archive should also identify the candidate and issue that is the subject of the ad. The archive should be live so that problems can be identified immediately — not after the election is over.

A full public archive is the only mechanism that will allow individuals, journalists, politicians, corporations and academics to bring online advertising campaigns and strategies into public light.

Online platforms must not be left to self-regulate and produce half-measures that come too late.

Too much is at stake, including the very future of democracy itself.

Sara Bannerman, Associate Professor and Canada Research Chair in Communication Policy and Governance, McMaster University

This article was originally published on The Conversation. Read the original article.
