Policy Monitor

Canada – Moffatt v. Air Canada

What: Ruling

Impact score: 1

For whom: Deployers and users of chatbots

URL: https://www.canlii.org/en/...

Key takeaways for Flanders

Although this case was decided under Canadian law, European or Belgian courts could well reach a similar decision under European consumer and/or contract law. Companies operating in Europe should therefore take responsibility for ensuring that their chatbots provide accurate information to the persons interacting with them.

While booking a flight on the Air Canada website, the complainant used the chatbot to ask about bereavement fares. The chatbot gave an incorrect answer, although its response included links to pages on the website that contained the correct information. Trusting the chatbot's response, the complainant did not consult those pages and followed the incorrect procedure outlined by the bot.

Later, when he contacted the airline, he discovered that the procedure the chatbot had described was wrong, and his application for a bereavement fare was denied. The complainant consequently filed a claim for negligent misrepresentation with the Civil Resolution Tribunal of British Columbia.

Negligent misrepresentation occurs when a party fails to exercise reasonable care to ensure that its representations are accurate and not misleading. The tribunal ruled that the airline owed a duty of care to its website visitors, which included taking reasonable care that the information provided on its website was accurate.

Air Canada argued that it was not liable for negligent misrepresentation because the chatbot should be considered a “separate legal entity responsible for its own actions.” It also noted that the correct information was available elsewhere on the website.

The tribunal rejected this argument, ruling that it is reasonable for a website visitor to rely on the accuracy of a chatbot's responses. Because the chatbot was an integral part of Air Canada's website, the airline had failed to exercise reasonable care in ensuring its accuracy.

The ruling establishes that companies can be held liable for the information provided by their chatbots, as the chatbots are not considered separate entities but extensions of the company. This means that companies are obligated to keep their chatbots or other automated systems up-to-date and ensure they provide accurate answers, consistent with applicable company policies, practices, and procedures.

This is especially important given that many AI systems, particularly those relying on Large Language Models, still suffer from hallucinations: they can produce confident-sounding but incorrect answers. Before a company relies on a chatbot, it should therefore verify that the bot can accurately answer the questions it is likely to receive.

Remarkably, Air Canada did not explain the basis for its “separate legal entity” argument, nor did it address why the chatbot itself should not then be considered negligent.

It should be stressed that this case stems from a Canadian tribunal and that, so far, there has been no similar ruling in the European Union. However, a similar outcome could arguably be expected from a European court. In Belgium, a claim would most likely be based on pre-contractual liability: parties must act in good faith during the pre-contractual phase and must not provide misleading information, and a breach of this obligation can give rise to compensation.