Airline held liable for its chatbot giving passenger bad advice
When Air Canada's chatbot gave incorrect information to a traveller, the airline argued the chatbot was "responsible for its own actions"
Artificial intelligence is having a growing impact on the way we travel, and a remarkable new case shows what AI-powered chatbots can get wrong – and who should pay.
Key takeaways
- Air Canada's chatbot mistakenly told a passenger, who needed to book a full-fare flight to attend his grandmother's funeral, that he could apply for a discounted bereavement fare after travelling;
- However, when the passenger later tried to claim the discount, the airline denied it, stating the request should have been made before the flight. Air Canada argued the chatbot was a separate legal entity responsible for its own actions;
- The British Columbia Civil Resolution Tribunal disagreed, ruling that Air Canada is ultimately responsible for all information on its website, whether it comes from a static page or a chatbot.
Get the full story at BBC