
AI Chatbot flies solo and Air Canada foots the bill - Moffatt v. Air Canada

The Canadian decision of Moffatt v Air Canada, 2024 BCCRT 149 has provided an early example of the sorts of claims that may be brought when AI chatbots give out inaccurate information.

Chatbots are increasingly being used by people as a quick way to find information on company websites - last year, the Office for National Statistics reported that around one in three adults had used a chatbot in June 2023. As the prevalence of these tools increases, we can expect to see individuals relying on the information provided more frequently and in broader contexts.

Key Takeaways

  1. This case provides an early insight into the emerging issue of allocating liability in the context of chatbots as this technology is increasingly deployed in customer-facing environments. Whilst this was a small claim, it is easily foreseeable that the costs of misstatements by AI tools could be significantly larger (where the transaction is of a greater value or where a higher number of individuals are impacted).
  2. Companies using chatbots must prioritise ensuring their accuracy, particularly given the known risk of errors associated with AI. Companies should also consider whether there are specific topics that chatbots should avoid providing information on altogether, because the risk of misstatements is unacceptably high. 
  3. When contracting with third-party providers of chatbots and other AI tools, companies should consider how their contracts allocate risk and liability in situations where these tools produce incorrect information. 

Background

Mr Moffatt used a chatbot on Air Canada’s website to enquire about the airline’s bereavement fares policy for passengers travelling after the death of an immediate family member. The chatbot incorrectly informed Mr Moffatt that he could apply for a reduced bereavement fare retrospectively by completing the relevant form within 90 days of his ticket being issued. The chatbot also provided a link to an Air Canada webpage with additional information on bereavement fares. This webpage stated that the bereavement policy did not apply to requests made after travel had been completed.

Mr Moffatt did not click on the link to the webpage and, relying on the chatbot’s advice, booked his flights. He subsequently applied for the bereavement fare within the 90-day window. Air Canada conceded that the chatbot had provided “misleading words” but refused to grant the bereavement fare.

The Tribunal’s Decision

The Civil Resolution Tribunal found that Air Canada had negligently misrepresented the procedure for claiming its bereavement fares. It therefore ordered Air Canada to pay Mr Moffatt damages in the form of the difference between the fare he paid and the bereavement fare.

Air Canada argued that:

(i) Mr Moffatt failed to follow the correct procedure to claim the bereavement fare; and 

(ii) it could not be held liable for the information provided by one of its agents, servants, or representatives – including a chatbot. 

The Tribunal found that part (ii) of Air Canada’s argument was “a remarkable submission” and was tantamount to suggesting that the chatbot was a separate legal entity responsible for its own actions. It was immaterial whether the information was provided via a static page or a chatbot: the chatbot was merely a component of Air Canada’s website, for which Air Canada was entirely responsible. The Tribunal accepted that Mr Moffatt relied on the chatbot to provide accurate information and that there was no reason why he should know that one section of Air Canada’s website was accurate whilst another was not.

 

As the Tribunal put it:

“While a chatbot has an interactive component, it is still just a part of Air Canada’s website. It should be obvious to Air Canada that it is responsible for all the information on its website. It makes no difference whether the information comes from a static page or a chatbot.”
