Taxi Driver Online

UK cab trade debate and advice
It is currently Thu Apr 30, 2026 8:44 am

All times are UTC [ DST ]




Post new topic Reply to topic  [ 24 posts ]  Go to page Previous  1, 2
Author Message
PostPosted: Sat Dec 13, 2025 5:43 am 
Offline

Joined: Wed May 16, 2012 6:33 am
Posts: 18523
Quote:
The gender-critical group Not All Gays Ireland was referred to as “Not for Gays” in Kemp’s ruling, prompting them to formally demand a correction and an apology.

If I had to guess from my own experiences, I'd say that was an autocorrect clanger.

But I've never used AI, so can't really comment about the more advanced stuff :?

Quote:
Kemp’s ruling also features numerous American spellings of words including “victimization” and “minimization”.

One senior legal figure said: “Sandy Kemp is an employment and discrimination lawyer by trade. The idea that he would spell victimisation with a z is unthinkable. That strongly suggests to me that it was written by AI.”

The default basic spellchecker that my laptop uses on here constantly suggests words like that should be spelt with a z, but obviously I just ignore it [-(

(Maybe there's a Windows setting somewhere that has my basic settings set to US rather than English spellings, but I've never bothered to look. The other wee laptop I use in the car doesn't have a default spellchecker, and I can never be bothered looking into that either. But if anyone thinks some of my posts are riddled with spelling errors, but others aren't, then that's the reason :lol:)

So maybe some minion who was tasked with spellchecking the judgment used it to change a lot of the spellings to a z rather than an s :lol:


Top
 Profile  
 
PostPosted: Sat Dec 13, 2025 9:08 pm 
Offline
User avatar

Joined: Wed Sep 03, 2003 7:30 pm
Posts: 57349
Location: 1066 Country
This one is going to run and run.

I get that grammar and spelling errors happen all the time, but this judgment has invented quotes.

Someone, or some people, are going to lose their jobs over this.

_________________
IDFIMH


PostPosted: Sun Dec 14, 2025 4:21 am 
Offline

Joined: Wed May 16, 2012 6:33 am
Posts: 18523
An opinion piece in today's Sunday Times. And it's the national edition, and not just the Scottish edition, as far as I can tell - it's by Dominic Lawson, who's Nigel's son. And if you know that particular fact, then like me you're probably showing your age a bit :lol:

Anyway, it was only in the last couple of days I noticed the word 'hallucination', which I hadn't seen before. At least, not in the context of AI :-o

But in the past few days, I've been seeing it all over the place, and not just in the context of the Sandie Peggie case.


Justice is meant to be blind, not to hallucinate

https://www.thetimes.com/comment/column ... -cqfsc627h

The judge in the Sandie Peggie tribunal is suspected of using AI. Both police and courts are befuddled by this technology: it is no laughing matter

Hallucination, they call it: the innocent term used to describe entirely fictitious answers given by generative artificial intelligence tools. This is generally regarded as a source of (instructive) amusement. Recently Jimmy Wales, co-founder of Wikipedia, explained what happened when he asked ChatGPT: “Who is Kate Garvey?” In the real world Garvey, who worked for Tony Blair in Downing Street, is married to Wales. But ChatGPT came up with a string of hallucinations, one of which was that “Kate Garvey is married to Peter Mandelson”. When Wales responded to ChatGPT: “Isn’t Peter Mandelson quite famously gay?”, the tool chided him for inappropriate speculation about the peer’s sexual orientation.

Wales has a dog in this fight. ChatGPT, to some astonishment, overtook Wikipedia in monthly visits this year. Still, I thought I’d try the same sort of query, asking it who Rosa Monckton is. As far as I know, she is my wife, and in the professional world launched Tiffany & Co in the UK. But on one of my visits ChatGPT informed me she was the “co-founder of Pimlico Carpets”. Er, no. Although to be fair to ChatGPT, she may once have bought a carpet in Pimlico. On another visit it asserted that “Rosa Monckton’s full name is Lady Bowes-Lyon (after her marriage)”. So, either my wife is bigamously married into the family of the late Queen Elizabeth the Queen Mother, or I am, without previously being informed of it, a scion of the Anglo-Scottish aristocracy.

But such hallucinations cease to be amusing when they pollute the legal system, as a result of laziness — or worse — on the part of professional practitioners. Over the past few days there were two apparent examples, in cases of high public importance. The Sunday Times had already discovered how West Midlands police misled parliament over the evidence for its decision to ban supporters of Maccabi Tel Aviv from attending last month’s match against Aston Villa. Now we learn that WMP’s dossier, revealed to MPs, included a reference to a match Maccabi Tel Aviv played “against West Ham in the Uefa Europa Conference League group stage on November 9, 2023 … at the London Stadium”.

No such match took place. The Birmingham-born MP Nick Timothy pointed out: “The match was a fiction. It was a hallucination by AI.” Not true, says the WMP chief constable, Craig Guildford. While admitting it was “completely wrong”, he said it was a result not of the use of AI but of “social media scraping”. However, he provided no source. In any case why should the police, of all organisations, require “social media scraping” to discover when Maccabi Tel Aviv last played a match in this country? And we know WMP are assiduous users of AI, from a press release they put out in September boasting how “we updated our technology to unlock the power of AI”.

Last week an employment tribunal in Scotland ruled on claims of harassment under the Equality Act 2010 by an NHS nurse, Sandie Peggie, who had objected to being expected to share a changing room with a trans woman — that is, a biological male — called Dr Beth Upton. Judge Alexander Kemp cited an earlier tribunal case involving Maya Forstater, which, he wrote, had stated that the Equality Act does not create “a hierarchy of protected characteristics”. Forstater, now head of the charity Sex Matters, tweeted, definitively: “This ‘quote’ from my judgment doesn’t come from my judgment. It is completely made up.”

Michael Foran, an expert in this field of the law at Oxford University, declared: “The [Peggie] judgment included supposed quotes from specific judgments that do not appear in those judgments. That in itself is extraordinary.”

The Scottish Courts and Tribunal Service has guidelines stating it is “not using or considering the use of AI in relation to any form of decision-making”, for the good reason that AI tools may, according to the UK Courts and Tribunals Judiciary, make up “fictitious cases, citations or quotes, or refer to legislation, articles or legal texts that do not exist”. Kemp was obliged to reissue his judgment, with the fake citations removed but with no admission that AI use was the cause, referring instead to “clerical mistake[s]”. Bull.

The starkest example of what is beginning to happen in the courts came a couple of months ago, when Judge Mark Blundell exposed in the upper tribunal the conduct of a barrister, Chowdhury Rahman, who had presented an appeal on behalf of clients turned down for asylum by the first-tier tribunal. Rahman claimed the original judge had made an error in law. When Blundell examined these submissions he found this lawyer had cited cases which were “entirely fictitious” and that Rahman “appeared to know nothing” about any of the authorities he had cited, some of which “did not exist”.

Blundell concluded: “It is overwhelmingly likely, in my judgment, that Mr Rahman used generative artificial intelligence to formulate the grounds of appeal in this case, and that he attempted to hide that fact from me during the hearing.” On his chambers’ website Rahman is described as having “a unique judicial mind”. One would hope his approach is unique, but I fear not.

In the US, the home of AI, there have been hundreds of cases of such legal misdemeanour (to put it politely). The US technology publication 404 Media has analysed many of those in which “lawyers have been caught using generative AI to … generate fictitious citations [and] false evidence … that ultimately threaten their careers”. It tried to talk to all the offending lawyers, with some difficulty. One claimed his misuse of ChatGPT came about because “a serious health challenge … causes me to be dizzy and experience bouts of vertigo and confusion”. Now why didn’t the West Midlands chief constable think of that excuse?

More seriously: we know that even the latest iterations of generative AI can have a hallucination rate of about 35 per cent. Rather like a hugely sophisticated version of auto-complete, this technology is designed to predict plausible text patterns, not to know facts. It is very useful if the person employing it is expert in the field under investigation, who would sense if the “answer” was fishy; it is treacherous and possibly career-ending in the hands of the ignorant or uncritical.

The astounding sums now being plunged into it by the Silicon Valley behemoths (not billions but trillions of dollars) have been compared to the biggest capital expenditure of the 19th century — the railways. This was described as the “railway mania”: there was a vast investment bubble, and although many fortunes were lost, it was also transformative.

However, while much of the track was built speculatively, it was never the case that a third of it, or more, took passengers to fictitious destinations.


PostPosted: Sun Dec 14, 2025 4:22 am 
Offline

Joined: Wed May 16, 2012 6:33 am
Posts: 18523
Dominic Lawson wrote:
More seriously: we know that even the latest iterations of generative AI can have a hallucination rate of about 35 per cent. Rather like a hugely sophisticated version of auto-complete, this technology is designed to predict plausible text patterns, not to know facts.

That's kind of what I was saying about auto-correct yesterday, which is a bit like auto-complete - I'm sure we've all come across it while texting and the like, and if you're in a hurry and don't notice then you end up with a couple of words of complete gibberish :-o
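The "sophisticated auto-complete" idea can be shown with a toy sketch. This is purely illustrative (the corpus and every word in it are made up for the example): a tiny bigram model that suggests the next word purely from which word most often followed the previous one, with no notion of whether the continuation is true.

```python
# A toy next-word predictor, to illustrate "predicting plausible text
# patterns" as opposed to knowing facts. Corpus is invented for the demo.
from collections import Counter, defaultdict

corpus = (
    "the judge cited the case . "
    "the judge cited the ruling . "
    "the judge cited the precedent ."
).split()

# Count which word follows which (a bigram frequency table).
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict(prev_word):
    """Return the statistically most plausible next word seen after prev_word."""
    candidates = following.get(prev_word)
    return candidates.most_common(1)[0][0] if candidates else None

# "the" is most often followed by "judge" in this corpus, so the model
# confidently continues with it - plausibility, not truth.
print(predict("the"))  # prints "judge"
```

Scale that idea up enormously and you get something that produces fluent, confident continuations - including a citation-shaped string where a citation "ought" to go, whether or not the case exists.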


PostPosted: Sun Dec 14, 2025 4:24 am 
Offline

Joined: Wed May 16, 2012 6:33 am
Posts: 18523
Dominic Lawson wrote:
It is very useful if the person employing it is expert in the field under investigation, who would sense if the “answer” was fishy; it is treacherous and possibly career-ending in the hands of the ignorant or uncritical.

Came across this the other day on a trade 'news' website which I've only read once or twice (no, not Taxi Point). Spot the clanger :-o

Quote:
For years, cross-border hiring has caused friction. Wolverhampton-licensed vehicles operating hundreds of miles away, London boroughs competing on licensing fees, and councils powerless to enforce rules against cars licensed elsewhere — all of this has created a fragmented system that frustrates both drivers and regulators.

I mean, whenever have London boroughs 'competed' on licensing fees? :-s

London boroughs have never licensed anything connected to the trade - correct me if I'm wrong. (The black cabs were formerly regulated by the Public Carriage Office, which was part of the Met Police, wasn't it? And once the minicabs were regulated as private hire 20+ years ago, that was always the remit of TfL. It's never been anything to do with the London 'boroughs'.)

So I wonder if that was maybe some kind of AI hallucination? :-o

And that, to paraphrase Dominic Lawson, the 'author' didn't sense that the 'answer' was 'fishy'.

(Which in turn reminds me of the Taxi Point survey - I doubt that was an AI clanger as such, but the likes of the proportion of PHs as WAVs :D and the popularity of Ola in Scotland :lol: should have been flagged up as 'fishy' by the authors of the survey document [-( )


PostPosted: Sun Dec 14, 2025 4:27 am 
Offline

Joined: Wed May 16, 2012 6:33 am
Posts: 18523
I fed some trade 'news' stuff on that other site through the AI detection tool thingy, and it came up with:

Quote:
Lightly edited by AI

We are moderately confident this text was originally human written and polished by AI

28% AI generated
72% mixed
0% human

There's also a 'hallucination detector' upgrade option, but I didn't bother with that 8-[


PostPosted: Sun Dec 14, 2025 11:56 am 
Offline
User avatar

Joined: Wed Sep 03, 2003 7:30 pm
Posts: 57349
Location: 1066 Country
I wonder if there are trade magazines/websites that take what ChatGPT issues and then briefly adapt it before publication.

In fact I’m certain most do.

But I love that we now have those cheat checkers.

_________________
IDFIMH


PostPosted: Wed Dec 24, 2025 9:58 pm 
Offline

Joined: Wed May 16, 2012 6:33 am
Posts: 18523
This was sneaked out yesterday afternoon, just as the political class, bureaucracies and media were largely shutting down for Christmas - a good time to bury bad news, as someone once said.

And some are saying that even some of the corrections are wrong, and that there will still be errors in the judgment :-o

Image


PostPosted: Wed Dec 24, 2025 10:01 pm 
Offline

Joined: Wed May 16, 2012 6:33 am
Posts: 18523
They certainly cocked up with the date - unless they're capable of time travel. Which I'd guess they're not :oops:

Maybe they'd been on the sherry before packing up for the Christmas break :lol:

Image

