It’s hard to sue a robot: product liability considerations and AI in Canada

Lisa R. Lifshitz

There is no question that artificial intelligence is surging in Canada. During the first quarter of 2018 alone, $89 million was invested in Canadian AI companies.

To date, commentators evaluating the legal impact of AI have largely focused on concerns related to the protection of individual privacy, security issues and, of course, ethics and bias. Other areas of legal concern associated with AI usage, including product liability, have received less airplay.

However, as the use of AI proliferates and as the systems themselves become more autonomous, the risk that they will cause harm to property or individuals naturally increases. It remains unclear whether and how the existing Canadian legal framework will apply to damages or losses resulting from AI use or operation.

For example, will software-based AI systems offered as a service be deemed to be products?  

Defective products laws could apply when AI systems do not perform as intended and cause damages or injuries to a person or property. In Canada, those seeking redress will look to the legal concepts of tort, contract and, in certain instances, consumer protection legislation.

Tort

Under Canadian tort principles, the three main causes of action arising from a defective product relate to negligent design, negligent manufacture and failure to warn. In a negligence-based cause of action (as opposed to one based on a strict liability standard), the plaintiff must prove not only that the defendant owed a duty of care to the plaintiff and breached the requisite standard of care but also that the product was defective. Successful plaintiffs must also establish a causal link between the defect, the negligent act and the harm suffered.

Liability for damages or injuries caused by a product may fall on any of the parties involved in the manufacture, sale or distribution of the product. All or a portion of that liability may also fall on the party that owns, uses or operates the product, should it be found to have misused or negligently operated it.

Manufacturers have a legal duty to create a product that is reasonably safe and to educate and inform the end user of any risks associated with its use. Manufacturers also have obligations to take reasonable care in the manufacture of their products, including the components of such products.

Identifying the “defect” in an AI system, and proving negligence on the part of an organization within the product’s supply chain, will be difficult. This is due not only to the autonomous nature of AI but also to the vast number of components required to operate it. Establishing liability becomes even more convoluted when dealing with black-box algorithms, where it may be unclear or unknown how the AI system actually operates. If the AI algorithm is autonomously making countless decisions based on the inputs it receives, it will be difficult to establish that the harm or damage was directly caused by the negligent actions of one particular party in the supply chain.

As AI systems improve and become more autonomous, the ultimate end users’ role in the operation of such AI systems will be dramatically reduced. Traditionally, when determining if the design of a product is dangerous or if a defendant has failed to adequately warn a plaintiff of the dangers of using the product, Canadian courts will consider whether the plaintiff's use of the product was reasonably foreseeable. Canadian courts have typically taken the position that defendants will not be found liable for damages or injury caused by a product if such damage or injury resulted from a plaintiff’s abnormal use of the product and the defendant could not be expected to reasonably foresee such use. Defendants in an action could rely on this defence of abnormal use to limit or absolve themselves of liability.

The increase in automation will reduce end users’ input into the use and operation of AI systems. This change may, in turn, shift liability from the end user to the manufacturer or to those parties involved in the distribution or sale of the AI system. For example, the responsibility of the driver of an autonomous vehicle for its operation will be reduced to the point where the driver is no longer in direct control of the vehicle. Who is liable when a machine commits or participates in a tort? In such circumstances, liability will likely be transferred from the driver to the car manufacturer and those parties involved in the distribution of the vehicle. Manufacturers and those parties involved in the distribution and sale of AI systems should look to address this shift in liability by negotiating exclusions of liability and indemnities in their contracts with suppliers and customers.

Statutory obligations

Canada’s Constitution Act grants Ottawa exclusive jurisdiction over certain matters, including the regulation of trade and commerce. Under that jurisdiction, the federal government has passed legislation regulating the manufacture of certain goods, such as hazardous products, automobiles and related products. Each provincial government has exclusive jurisdiction over property and civil rights within its province, and under that jurisdiction each province has passed legislation dealing with consumer protection, the sale of goods and product warranties.

Implied contractual warranties may also exist under common law. Such implied warranties have been codified in provincial statutes such as Ontario’s Sale of Goods Act, which includes implied warranties of fitness for purpose and of merchantable quality in contracts between buyers and sellers for the sale of goods. However, parties may, either by express agreement or through the course of their dealings, vary or exclude such implied warranties for goods sold in Ontario, except in the case of consumer goods. An organization should, therefore, determine whether it wants these statutory warranties to apply to an agreement for the sale of goods to which it is a party and, if not, confirm that its agreements expressly disclaim the application of such warranties, where permitted to do so.

Each province has also enacted its own consumer protection legislation governing sales and other transactions between organizations and consumers located in that province. In Ontario, the Consumer Protection Act applies to all consumer transactions (subject to some exceptions) where the consumer is located in Ontario and the transaction takes place in Ontario.

Ontario’s act sets out specific obligations on the part of the supplier for various types of transactions between businesses and consumers. For example, a supplier is deemed to warrant that any services supplied under a consumer agreement are of reasonably acceptable quality.

The Ontario Consumer Protection Act prohibits the varying, in contracts between consumers and suppliers, of the implied warranties of fitness for purpose and of merchantable quality provided under Ontario’s Sale of Goods Act.

Québec has some of the strongest consumer protection laws in Canada. These laws apply to agreements for services and may even apply to services made available free of charge. Under art. 3149 of Québec’s Civil Code, Québec courts have jurisdiction to hear an action involving a consumer contract if the consumer is a resident of Québec, even if the consumer has waived that jurisdiction by way of a choice of venue clause. Québec’s Consumer Protection Act provides that all consumer agreements shall be governed by the federal laws of Canada and the provincial laws of Québec and that any choice of law clause to the contrary is prohibited. Additionally, s. 10 prohibits organizations from excluding liability for their own acts or the acts of their representatives. And, under s. 19.1, any clause in a consumer contract that is not applicable in Québec must be immediately followed by a prominently presented statement to this effect.

Mandatory arbitration clauses are prohibited under consumer protection laws in certain Canadian provinces, such as Québec, Ontario and Alberta. For example, in Ontario, any term in a consumer agreement or related agreement that requires disputes arising from the agreement to be submitted to arbitration is invalid. Consumers in Ontario will, therefore, have the right to commence an action relating to such an agreement in the applicable Ontario court. Contracts with consumers in Canadian provinces should address this prohibition on mandatory arbitration.

Both provincial and federal laws may apply to AI systems, depending on the application or implementation. For example, if an autonomous vehicle is sold in Ontario, its production and operation fall within the scope of both federal and provincial legislation: Transport Canada has established regulations, policies and standards relating to vehicles, and Ontario consumer protection laws will also apply if the vehicle is purchased by a consumer. Organizations developing or selling AI systems within Canada should remain aware of the product liability-related statutory requirements at the federal level and in the provinces in which they operate in order to design and implement AI systems that comply with such legislation.

Contractual liability

Alternatively, if a contractual relationship exists between a customer and a vendor, liability may be decided under the terms of the contract. Warranty law principles, such as statutory warranties, may also apply, along with concurrent and independent tort law liability. Unlike claims in tort, contract claims are strict liability claims and do not require proof of negligence on the part of the breaching party.

A party may bring an action for breach of contract should a product be defective or fail to meet the standards outlined in the contract. Typically, warranties are expressly stated in the contract between the parties, but they may also arise during negotiations. Canadian courts have taken the position that there must be strong evidence in support of a claim that the parties intended there to be a collateral warranty with respect to a product, especially where such a warranty would vary the written terms of a contract.

Going forward, advancements in AI may lead to a transfer of liability from the end user to those parties within the product supply chain. To account for this shift in risk, organizations involved in the development or sale of AI systems, or those that acquire or incorporate AI systems into their products, should carefully review the terms of their contracts with both customers and suppliers to ascertain the scope of the responsibilities and obligations between them, and tailor those terms to the AI system being developed or sold. In doing so, AI developers may wish to limit their exposure to liability contractually or require customers or suppliers to indemnify them against certain liabilities that may otherwise fall on them. To account for this potential reallocation of liability, AI developers and users of AI systems should also review the terms of their existing insurance policies to confirm that they are adequately protected.

As the law of AI in Canada is in its infancy, questions remain as to whether Canada’s existing laws and regulatory framework can adequately deal with issues such as product liability or whether more specific legislation is required to adequately address the complex legal issues that this technology brings. In the interim, AI developers and users of AI systems operating in Canada must be mindful of and comply with present-day Canadian laws and regulations that currently impact such technology.

The author gratefully acknowledges the assistance of Myron Mallia-Dare in preparing this column.
