Women’s Legal Education and Action Fund study makes 14 recommendations
Federal legislative reforms are needed to counteract the enormous harm done by digital platforms that provide new mechanisms for those who engage in abusive conduct to target groups and individuals, says a new report from the Women’s Legal Education and Action Fund.
In particular, the report says, federal laws need to protect women and those from marginalized and systemically oppressed communities.
“Online platforms such as social media networks, discussion forums, search engines, and video sharing websites have become central venues of our personal and professional lives,” the Deplatforming Misogyny report says. However, it adds that it comes as “no surprise” that online platforms are also central sites of what has become known as “Technology-Facilitated Gender-Based Violence, Abuse, and Harassment,” or TFGBV.
TFGBV has often been exacerbated by the action, or inaction, of the platforms themselves, the report says. For example, “Facebook has allowed pages glorifying intimate-partner violence to stand, while removing images of women breastfeeding,” and Twitter has been “quick to suspend users who are targets of online abuse, while frequently ignoring the activity of abusive users.”
Among the types of behaviour that the report points to as being a form of “online violence” are: hate speech, trolling, impersonation, voyeurism, online luring, doxing (publishing identifying and location information online) and “sextortion.”
These forms of abuse relegate women and girls to secondary status online and globally, the report says. “They are rendered unable to freely and fully participate in society and prevented from enjoying true or equal protection of their human rights and fundamental freedoms.”
The report goes on to say that the most common response to facing online abuse and harassment “is that women reduce their online activities, avoid specific social media platforms or conversations, withdraw from expressing their views, or self-censor if they continue to engage online.
“This curtails their ability to participate in the contemporary public sphere, including engaging in activism and advocacy, influencing public opinion, or mobilizing social, cultural, or political change. The current state of affairs amounts to a systemic democratic failure and must be addressed as such.”
The report calls for federal action, based on guiding principles such as:
• recognizing a need for legal reform to address TFGBV, including through platform regulation;
• acknowledging that Canadian constitutional law justifies imposing “proportionate limits” on freedom of expression to uphold and protect the rights to equality and freedom from discrimination, and also to give full effect to the core values underlying freedom of expression; and
• requiring transparency from platform companies regarding their content moderation policies and decisions, and the outcomes of such policies and decisions concerning TFGBV.
The report was researched and written by Cynthia Khoo, a technology and human rights lawyer and researcher who was also counsel at the Samuelson-Glushko Canadian Internet Policy and Public Interest Clinic.
“The purpose of this report was to bring attention to the fact that platform technology has facilitated a lot of gender-based violence – it’s a huge, pervasive problem in Canada,” she says.
“But it’s not just about individual users behaving badly. It’s a problem that ties into traditional systemic oppressions, misogyny, and of course, oppressions such as racism, colonialism, homophobia, and transphobia, and so forth.”
Khoo adds that the report calls for action, but for a “nuanced” response that gets to the heart of online violence in its many forms, not one that lumps it together with other matters such as the dissemination of terrorist content through digital platforms. While much of the report focuses on women, Khoo says much of it could also apply to other oppressed or marginalized groups subjected to similar online violence.
One type of platform that warrants attention, Khoo says, is the category of digital platforms that appear “deliberately designed to encourage and profit from such abuse.” Examples of such “purpose-built” platforms are “The Dirty” and sites dedicated to sharing images that constitute “revenge porn.”
There are Canadian laws that could, in the right circumstances, theoretically create platform liability for TFGBV committed by a platform’s users, Khoo says. Many of these laws, however, have not been tested in court and do not specifically address this kind of online violence.
The report states: “There appears to be a gap in Canadian law, in that there is no specific form of legal liability for platforms with respect to TFGBV.” However, some common principles emerge from current liability law and jurisprudence that inform how platform liability for TFGBV could be addressed.
Deplatforming Misogyny concludes with 14 recommendations for legislative change at the federal level. They are:
• Apply a principled human rights-based approach to platform regulation and platform liability, including giving full effect to the rights to equality and freedom from discrimination.
• Ensure that legislation addressing TFGBV integrates substantive equality considerations and guards against exploitation by members of dominant social groups to silence expression by historically marginalized groups.
• When pursuing legislative or other means of addressing these issues of online violence, consult with and consider the perspectives of victims and survivors.
• Establish a centralized expert regulator for TFGBV specifically, with a dual mandate to provide legal remedies and support to individuals impacted by TFGBV on digital platforms and provide training and education to the public, relevant stakeholders, and professionals.
• Enact one or more versions of the current ‘enabler’ provision in subsections of the Copyright Act, adapted to specifically address different forms of TFGBV, including ‘purpose-built’ platforms.
• Enact a law that allows for victims and survivors of TFGBV to obtain immediate removal of certain clearly defined kinds of content from a platform without a court order.
• Ensure that legislation to address TFGBV focuses solely on TFGBV and does not dilute, compromise, or jeopardize its constitutionality by ‘bundling’ TFGBV with other issues that the government may wish to also address through platform regulation.
• Require that platform companies provide users and non-users with easily accessible, plain-language complaint and abuse reporting mechanisms to address instances of TFGBV.
• For TFGBV-dedicated platforms, provide that an order to remove specific content on one platform will automatically apply to any of that platform’s parent, subsidiary, or sibling platform companies where the same content also appears.
• Require platform companies to undergo independent audits and publish comprehensive annual transparency reports.
• When determining legal obligations for digital platforms, account for the fact that platforms vary dramatically in size, nature, purpose, business model, and user base.
• Finance and mandate the provision of information on how to support those subjected to TFGBV.
• Fund frontline support workers and community-based organizations working to end these forms of online violence.
• Find research money for law and policy experts and organizations to study the impacts of emerging technologies on those subjected to TFGBV.
Khoo says that while “freedom of speech” has an important social, cultural and legal role in the Constitution of the United States, the Supreme Court of Canada has recognized that “we have a matrix of rights, all of which need to be equally respected and kept in balance with each other.”
The problem is that the digital platforms, often based in the U.S., have imported this “First Amendment absolutism” to Canada, Khoo says. She adds that the Supreme Court of Canada has said freedom of expression must advance three things: the pursuit of truth, individual self-fulfillment through self-expression, and participation in democracy.
“So when it comes to righting that balance, substantive equality has to be the North Star,” she says. “It has to be proportionate and constitutional in addressing gender-based violence online.”