this post was submitted on 16 Feb 2024
750 points (99.5% liked)

Air Canada appears to have quietly killed its costly chatbot support.

[–] autotldr@lemmings.world -5 points 9 months ago (1 children)

This is the best summary I could come up with:


On the day Jake Moffatt's grandmother died, Moffatt immediately visited Air Canada's website to book a flight from Vancouver to Toronto.

In reality, Air Canada's policy explicitly stated that the airline will not provide refunds for bereavement travel after the flight is booked.

Experts told the Vancouver Sun that Moffatt's case appeared to be the first time a Canadian company tried to argue that it wasn't liable for information provided by its chatbot.

Last March, Air Canada's chief information officer Mel Crocker told the Globe and Mail that the airline had launched the chatbot as an AI "experiment."

“So in the case of a snowstorm, if you have not been issued your new boarding pass yet and you just want to confirm if you have a seat available on another flight, that’s the sort of thing we can easily handle with AI,” Crocker told the Globe and Mail.

It was worth it, Crocker said, because "the airline believes investing in automation and machine learning technology will lower its expenses" and "fundamentally" create "a better customer experience."


The original article contains 906 words, the summary contains 176 words. Saved 81%. I'm a bot and I'm open source!

[–] wjrii@lemmy.world 11 points 9 months ago

This is, uhhh, not good. Appropriate (or maybe ironic, if you're a Canadian singer-songwriter and You Can't Do That on Television alum) for an article about a bad chatbot.