Ashley MacIsaac, who is seeking $1.5m in a civil lawsuit, says inaccurate information led to concert cancellation
He’ll get a correction and maybe a few bucks. Shit happens all the fucking time with journalism.
-
This wasn’t journalism.
-
This does not happen all the time with journalism.
-
A correction doesn’t fix this. Unlike with actual journalism, the problem is with the AI and not with a specific output.
- Doesn’t matter. Same legal concept.
- 👌
- Yes, retraining and providing guided training data can absolutely reduce the problem to a trivial error rate that is just as bad as traditional search when dealing with a shitty input.
You people say the dumbest shit just ‘because ai’.
You had me until that whole “you people” fascist talk.
👌🤣 that’s definitely a new one.
And you’re weirdly invested in defending an AI that falsely claimed someone committed sexual assault.
Good look.
👌 sorry the reality take is weird.
-
Mr MacIsaac had similar slander before, like 20 years ago. I wouldn’t be surprised to find that lie fed this one.
I’m not sure it was slander? I think he admitted to a relationship with a younger guy (like 20?) while Ashley was in his 30s. It was kinda scandalous at the time, but I think it was as much the surprise that he was gay.
Two adults doing whatever they want how scandalous.
Hell, Canadian “scandals” back in the day were, and still are, fairly tame.
I remember when the fact that the band was simply called “Barenaked Ladies” was incredibly scandalous here in Canada. Or the fact that Alanis Morissette cussed on her album.
Not sure if this is wider than the UK, but the word “fiddler” is often used here to describe someone who sexually assaults children. AI may well have decided this to be the case here.
This is way beyond a bit of regional language differences.
Google’s AI Overview also wrongly stated that MacIsaac had been listed on the national sex offender registry for life, the lawsuit says.
My guess is that someone with a similar name is on that list, and the LLM confidently presented them as the same person.
AI didn’t decide squat; it’s auto-completing text, which is what leads to these kinds of mix-ups.
Which is an argument to shut chatbots down until there’s an actual AI capable of understanding the semantics of the text.
I agree, an AI language model that cannot understand context, region, or colloquialisms is a pretty low standard.
It does. Just not regional shit it hasn’t been trained on as much.
It does not.
It uses probability to predict words that belong near the other words in your prompt. It’s a more powerful version of the suggested words at the top of your phone keyboard.
You could write a sentence or two using that, but it will quickly descend into gibberish.
It does not “understand” what you wrote in any way. It just has a heuristic that shows the word “Mom” comes after the word “Your” with a high frequency, so that’s what it’s going to output.
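The “suggested words on your phone keyboard” comparison above can be sketched as a toy bigram model. To be clear, this is a hypothetical illustration of the basic idea, not how Google’s system actually works: real LLMs are neural networks over tokens with far longer context, but the core move is still picking a likely next word given what came before.

```python
# Toy next-word predictor from bigram counts (illustration only).
# The sample text is made up; any real model trains on vastly more data.
from collections import Counter, defaultdict

training_text = "your mom said your mom called your dad"
words = training_text.split()

# Count how often each word follows each other word.
bigrams = defaultdict(Counter)
for prev, nxt in zip(words, words[1:]):
    bigrams[prev][nxt] += 1

def predict(prev_word):
    """Return the most frequent follower of prev_word, or None if unseen."""
    followers = bigrams.get(prev_word)
    return followers.most_common(1)[0][0] if followers else None

print(predict("your"))  # → "mom" ("mom" follows "your" twice, "dad" once)
```

Nothing in there “understands” anything: “mom” wins purely because it followed “your” more often in the training text, which is exactly how a rare word like “fiddler” can get glued to the wrong statistical neighbours.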
Ah. You think ai “understands.”
Moron detected.
It bears repeating that LLMs are unable to “understand” anything at all. They are not knowledge-based. Instead, they autocomplete based on the word probabilities found in the text they were trained on.
Did the AI get the wrong end of the stick because of the word “fiddler”?
Sue for more than $1.5M bro
Sue for $1T.
Fuck em. Be as egregious as them.
At least in the US, you file suit for the maximum damages for every claim you make. Awards can be (and usually are) reduced from that, but cannot be increased beyond it. It’s unclear where he filed the lawsuit, so this may not apply here. I’m sure the $500k for each claim is the maximum allowed given the allegations.
Definitely top-tier strategy, going in and pissing off a judge over what amounts to a simple libel case that would normally just get you a retraction. Top legal minds don’t want you to know this one trick.
Dang, you’re a lawyer, too?!
Apparently you need to be a lawyer to know it’s a bad idea to piss off judges. til!