Company claimed its chatbot ‘was responsible for its own actions’ when giving wrong information about bereavement fare

Canada’s largest airline has been ordered to pay compensation after its chatbot gave a customer inaccurate information, misleading him into buying a full-price ticket.

Air Canada came under further criticism for later attempting to distance itself from the error by claiming that the bot was “responsible for its own actions”.

Amid a broader push by companies to automate services, the case – the first of its kind in Canada – raises questions about the level of oversight companies have over the chat tools.

  • LufyCZ@lemmy.world · 9 months ago

    Well it’s true, to a certain extent.

    If an employee (or a chatbot, for that matter) promised an egregious sum for no reason, I don’t think the company should be liable either.

    Imagine getting hired to do support, having a friend open a chat, and promising to give him a million dollars. Makes no sense.

    But getting misled about ticket pricing, and the company then refusing to refund at least the part of the fare they promised would not be charged, is absolutely something they should be liable for.

    And lawyer fees plus some pocket money for wasting people’s time, if getting a refund entails more than an email or two.

    • hitmyspot@aussie.zone · 9 months ago

      Yes, that it was reasonable to believe was the point. I think what’s also interesting is that the bot referred them to the correct information, which was part of the defence. However, the ruling said that since both were provided by the company, the customer had no reason to believe the website gave more accurate information in one place than in another.