Automating Customer Service has its Risks; Just Ask Air Canada’s Chatbot

Posted On 19 Feb 2024
By : GeneZorkin

BRITISH COLUMBIA, Canada – At a time when businesses are embracing automated solutions in a customer service context, be they driven by sophisticated artificial intelligence or a simpler form of old-school “chatbot,” a cautionary tale has arisen out of Canada.

Thankfully for the company involved (Air Canada), the liability the company incurred thanks to its automated customer service representative was minor and inexpensive, but it doesn’t take much imagination to think up more dire scenarios flowing from similar circumstances.

At issue in the dispute between Air Canada and its customer Jake Moffatt: the company’s chatbot gave Moffatt inaccurate information regarding bereavement fares, most crucially telling him he could apply for a bereavement refund within 90 days of purchasing his ticket by completing an online form.

Air Canada didn’t dispute that the chatbot gave Moffatt the wrong information, but it did dispute whether Moffatt was owed a bereavement refund as a result of that misinformation.

Moffatt sued Air Canada for the difference between the fares, and the dispute landed in front of the Civil Resolution Tribunal, which functions as a small claims court within British Columbia’s public justice system.

Air Canada argued it wasn’t liable for the chatbot’s mistake, because “it cannot be held liable for information provided by one of its agents, servants, or representatives – including a chatbot,” as Tribunal member Christopher Rivers put it in his decision.

Rivers, perhaps unsurprisingly, was not swayed by the airline’s defense.

“(Air Canada) does not explain why it believes that is the case,” Rivers wrote. “In effect, Air Canada suggests the chatbot is a separate legal entity that is responsible for its own actions. This is a remarkable submission. While a chatbot has an interactive component, it is still just a part of Air Canada’s website. It should be obvious to Air Canada that it is responsible for all the information on its website. It makes no difference whether the information comes from a static page or a chatbot.”

Reading through the decision, it’s probably fair to say that Air Canada didn’t give much thought to how it argued against Moffatt’s claims – or to whether the negative publicity the dispute could generate might outweigh the roughly $870 that simply issuing the refund would have cost the company.

“In its boilerplate Dispute Response, Air Canada denies ‘each and every’ one of Mr. Moffatt’s allegations generally,” Rivers observed in his ruling. “However, it did not provide any evidence to the contrary.”

(Pro tip: When denying an opponent’s claims, providing evidence to the contrary is, generally speaking, not such a bad idea.)

“When a party fails to provide relevant evidence without sufficient explanation, an adjudicator is entitled to draw an adverse inference,” Rivers wrote. “An adverse inference is where an adjudicator assumes a party has failed to provide relevant evidence because the missing evidence would not support their case.”

Now, to my knowledge, there are no adult websites that offer “bereavement memberships,” but the point here isn’t that it’s bad to have chatbots offering misinformation about specific types of refunds. The point is it’s bad – and potentially liability-inducing – to have chatbots offering your customers incorrect information of any kind.

The $812.02 the company has been ordered to pay Moffatt obviously isn’t going to break Air Canada’s bank, but the hit to the company’s reputation (not least because it seems rather ghoulish to bicker with a customer over a legitimate bereavement refund in the first place) certainly feels like it would have been worth $812.02 to avoid.

So, if you use chatbots or automated systems of any other sort to assist with your customer service, what’s the moral of the story here?

First, it’s a good idea to occasionally put yourself in your customer’s shoes as literally as possible, by using the automated system yourself. Ask questions to which you know the correct answer – and to which the chatbot should also know the correct answer. If it gets the answer wrong, you know somebody has some coding to do.
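If you want to make that spot-check systematic rather than occasional, the idea is simple enough to sketch in a few lines of code: keep a list of questions with known-correct answers and compare the chatbot’s replies against it on a schedule. The sketch below is illustrative only – `ask_chatbot()` is a hypothetical stand-in for whatever interface your chatbot actually exposes, stubbed here with canned responses.

```python
# A minimal "known-answer" audit for a customer service chatbot.
# ask_chatbot() is a hypothetical stub; swap in a real call to your system.

GOLD_ANSWERS = {
    "What is the bereavement refund policy?":
        "Bereavement fares cannot be refunded after travel is completed.",
    "What is the fee for a first checked bag?":
        "The first checked bag is $30.",
}

def ask_chatbot(question: str) -> str:
    # Stubbed responses for illustration -- note the first one is wrong,
    # echoing the kind of error at issue in the Air Canada case.
    canned = {
        "What is the bereavement refund policy?":
            "You may apply for a bereavement refund within 90 days of purchase.",
        "What is the fee for a first checked bag?":
            "The first checked bag is $30.",
    }
    return canned[question]

def audit(gold: dict) -> list:
    """Return the questions the chatbot answered incorrectly."""
    return [q for q, expected in gold.items() if ask_chatbot(q) != expected]

if __name__ == "__main__":
    for question in audit(GOLD_ANSWERS):
        print(f"MISMATCH: {question}")
```

Run on a schedule, a check like this flags drift between what your chatbot says and what your policies actually are – before a customer (or a tribunal) flags it for you.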

Another consideration: sometimes the correct answer is subject to change. In that case, you need to be sure you’re updating your systems accordingly, so your customer service representatives, robot or otherwise, have the latest information to offer your customers.
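One cheap way to stay on top of changing answers is to stamp each entry in your chatbot’s knowledge base with a review date and periodically flag anything that’s gone too long unreviewed. The field names below (`answer`, `last_reviewed`) are assumptions made for illustration, not any particular product’s schema.

```python
# A minimal freshness check for chatbot knowledge-base entries.
# Field names are illustrative assumptions, not a real product schema.
from datetime import date, timedelta

KNOWLEDGE_BASE = {
    "bereavement_refunds": {
        "answer": "Bereavement fares cannot be refunded after travel.",
        "last_reviewed": date(2023, 2, 1),
    },
    "baggage_fees": {
        "answer": "The first checked bag is $30.",
        "last_reviewed": date(2024, 2, 1),
    },
}

def stale_entries(kb: dict, max_age: timedelta, today: date) -> list:
    """Return topics whose answers haven't been reviewed within max_age."""
    return [topic for topic, entry in kb.items()
            if today - entry["last_reviewed"] > max_age]

if __name__ == "__main__":
    overdue = stale_entries(KNOWLEDGE_BASE, timedelta(days=180),
                            date(2024, 2, 19))
    print(overdue)
```

A report like this won’t tell you what the new correct answer is, but it tells you which answers nobody has looked at lately – which is where stale misinformation tends to hide.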

Finally, if something does go awry because your system provided inaccurate information, whether that misinformation was offered by a human employee or a bot, simply own the mistake and do what’s needed to make things right with your customer(s). Fighting for every last nickel might seem fiscally responsible in the short term, but you might feel differently about things later, if your miserly ways wind up going viral.

 

Robot image by Pavel Danilyuk from Pexels

About the Author
Gene Zorkin has been covering legal and political issues for various adult publications (and under a variety of different pen names) since 2002.
