
Inside Internal Controls

News and discussion on implementing risk management


What others say may be held against you

The Canada Revenue Agency (CRA) recently issued an Advisory on Partisan Political Activities (the Advisory). It warned charities that the prohibition against partisan political statements also extends to Internet links to third-party statements and to user comments posted on charities’ websites, blogs or other social media. (See the Advisory here). Additionally, two recent cases before the Ontario and British Columbia courts are reminders that statements made by others on the Internet and social media could expose the individuals, charities or other entities that operate the platform to liability for defamation.

Partisan political activities

Charities are prohibited from devoting resources to partisan political activities. Partisan political activities refer to any direct or indirect support of, or opposition to, political parties at any time, whether during or outside of an election period. According to the Advisory, partisan political activities include making public statements endorsing or denouncing a candidate or party, and explicitly connecting the charity’s views on an issue to a political party or candidate. The Advisory explains the link between partisan political activities and social or Internet media:

“Charities that use the Internet or social media to post information should ensure the information does not contain partisan political statements. Also, the information should not link to statements made by a third party that support or oppose a candidate or political party. When a charity invites comments on its website, blogs, or on social media, it should monitor them for partisan political statements and remove, edit, or moderate such statements within a reasonable time.”

Elsewhere on its website, CRA reminds charities of acceptable political activities by providing the following example. A charity’s social programs may involve helping refugees to complete application forms for government assistance. If the charity determines that refugees are not applying for government assistance because the forms are unavailable in the language widely used by refugees, the charity may, for example, create petitions that publicly call for the government to provide forms in the desired language. Since the charity is advocating for changes to a government policy, this would be a political activity. However, the political activity would be “acceptable as long as the charity continues to meet all the requirements of the Income Tax Act (that is, the charity devotes only a limited amount of resources to the activity—generally no more than 10%, it remains non-partisan, and is connected and subordinate to the charity’s exclusively charitable purposes).” (See Distinguishing between charitable and political activities).


As the following two cases demonstrate, CRA is not the only party who may be interested in what others say on Internet or social media.

Brief overview of the law of defamation

A plaintiff alleging defamation must prove that: (i) the words in question were defamatory, in the sense that they would tend to lower the plaintiff’s reputation in the eyes of a reasonable person; (ii) the words referred to the plaintiff; and (iii) the words were published, in that they were conveyed in any manner or form to at least one other person.

If the plaintiff proves the above, the onus shifts to the defendant to advance a defence, for example a defence of “fair comment”, which involves, among other things, proving that the comment was on a matter of public interest. The fair comment defence is not available if the plaintiff shows that the defendant was motivated by malice.

Baglow v Smith, 2015 ONSC 1175 (CanLII)

Plaintiff Baglow operated a blog on which he posted public interest and left-wing political views and commentary. The Fourniers and Smith were co-defendants. The Fourniers are a married couple who moderated a message board called “Free Dominion”, a venue for conservative views. Co-defendant Smith posted commentary on Free Dominion, and in a series of back-and-forth postings about Omar Khadr, Smith labelled the plaintiff a “Taliban supporter”. The Plaintiff alleged that the comments were defamatory and asked the Fourniers to remove them. The Fourniers refused, hence the lawsuit against both the Fourniers as publishers and Smith, who posted the comments.

The Fourniers’ position was that Smith made and posted the comments and they had only a passive role as administrators and operators of the website, so they should not be liable if the comments were defamatory. The Fourniers argued that the message board’s software allowed users to register and post without their intervention, and as such they did not publish the comments. (Publication, of course, being a necessary element of proving defamation, as outlined above.) They characterised their involvement as akin to creating a hyperlink that merely indicated the existence and location of information.

The Judge disagreed, explaining that a message board is created precisely to provide content and the Fourniers were not passive bystanders. They sometimes posted comments and participated in threads. As administrators and moderators they could control the content on the message board by deleting comments. As a result both the Fourniers and Smith published the comments in question. The Judge also found that the words were defamatory, in that they would tend to lower the Plaintiff’s reputation in the eyes of a reasonable person.

However, the defendants successfully proved the elements of the “fair comment” defence, starting with the fact that the Omar Khadr debate was a matter of public interest. Malice did not defeat their defence of fair comment because, while the post demonstrated malice, malice was not the predominant motive for the publication.

In the final analysis, the post on the Fourniers’ message board was defamatory and although they did not make the comments, they would have been liable if their fair comment defence had failed. The lesson: administrators of websites or social media accounts may in certain circumstances be jointly liable for defamatory comments made by others.

Weaver v Corcoran, 2015 BCSC 165 (CanLII)

This case involved both traditional and social media. The National Post newspaper published articles about Plaintiff Weaver, a renowned professor in the climatology sphere. The articles appeared on several of the Post’s Internet sites as well as its print newspapers. The Plaintiff sued the journalists, the newspaper and others, for defamation based on the contents of the articles.

The Judge found that the articles were defamatory, as a reasonable person, after reading the articles, would conclude that the Plaintiff was incompetent, inept and unethical in his work on climate change and related issues. The defendants could not successfully prove all the elements needed for a “fair comment” defence.

One allegation in the lawsuit was that the defendants authorised republication by inviting readers to post comments, and to email, tweet or otherwise send the articles to friends. Defamation occurs with publication and every repetition or republication of a defamatory statement constitutes a new publication. The new and the original authors could be jointly and severally liable for defamation. The Judge found that the articles were republished on the Internet and found the defendants liable for the republication.

Another aspect of the lawsuit concerned readers’ comments on the newspaper’s own website. The Judge found that although some of these comments were defamatory, the newspaper and the journalists did not publish them, in the sense that the word “publish” is used in the law of defamation. The Judge explained that the National Post’s web traffic numbered many thousands of visits per month, and it would be unreasonable to expect the newspaper to pre-vet readers’ comments. But once the paper became aware of the comments, whether through its own internal review or through complaints it received, failure to take deliberate action would amount to approval or adoption of readers’ comments. Once the offending matter is brought to the newspaper’s attention, if immediate action is not taken to deal with the comments, the paper becomes a publisher of the comments as at that date. In this case, the National Post removed the offending materials within a day or two of receiving complaints. The Judge said this was all the newspaper could realistically do in the circumstances, and as such the defendants were not publishers of the readers’ defamatory comments.

The law of defamation in the Internet and social media context is evolving, and while the tests used by CRA to establish partisan political activities are different from the tests used in the law of defamation, what others say on Internet or social media platforms that you operate may be held against you. Liability will depend on the circumstances, but all organizations should monitor for, and prevent or immediately remove, readers’ defamatory posts. In addition, charities should not post, link to, or reference partisan political comments, and should prevent or immediately remove partisan posts, links or references made by others.

Not-for-Profit PolicyPro, published by First Reference, will help you understand the internal controls and policies needed in these situations.

Apolone Gentles, JD, CPA, CGA, FCCA, BSc (Hons)

Apolone Gentles is a CPA, CGA and Ontario lawyer and editor with over 20 years of business experience. She has held senior leadership roles in non-profit organizations, leading finance, human resources, information technology and facilities teams. She has also held senior roles in audit and assurance services at a “Big Four” audit firm. Apolone has also lectured in Auditing, Economics and Business at post-secondary schools. Read more here