
In the hallowed, often paper-strewn halls of UK accounting firms, a new and frustrating recurring character has emerged: the client who “did their own research” via a Large Language Model (LLM).
We’ve all seen it. A client walks in, digital transcript in hand, claiming they’ve found a revolutionary way to claim their Caribbean holiday as a “subsistence expense” because a chatbot told them it technically qualifies as a wellness retreat. It’s easy to chuckle at the absurdity, but new data suggests the joke is wearing thin and the financial punchline is hitting the bottom line of British businesses.
According to a sobering new report from Dext, which surveyed 500 accountants and bookkeepers across the UK, the honeymoon period with general-purpose AI is officially over. We are no longer talking about “hallucinations” as a tech quirk; we are talking about them as a liability.
The Hard Cost of “Artificial” Intelligence
The statistics are a wake-up call for any practitioner who has spent their Monday morning unpicking a botched VAT return. Half of UK accountants (50%) are now aware of businesses that have suffered direct financial losses due to incorrect AI advice.
This isn’t just a matter of a few pounds and pence. These losses manifest as:
- Overpayments: cash flow strangled by unnecessary tax payments.
- Missed allowances: R&D or capital allowances left on the table because the AI didn't know the specific nuances of the current Finance Act.
- HMRC penalties: the taxman, as we know, is rarely moved by the excuse of "the chatbot told me to do it."
Perhaps most frustrating for the profession is the “productivity drain.” The research finds that two-fifths of accountants are now losing between 4 and 10 hours every single week fixing mistakes generated by public AI tools like ChatGPT. In a world where we are constantly told AI will “save us time,” it currently seems to be doing the exact opposite for those of us on the front lines of compliance.
The “AI Slop” Infecting the Ledgers
What exactly is going wrong? Paul Lodder, VP of accounting product strategy at Dext, calls it a “fundamental difference” between specialist tools and general-purpose bots. A chatbot is a world-class mimic; it knows how a tax answer should sound, but it doesn’t actually know the tax code.
The report identifies the most common “AI slop” errors landing on accountants’ desks:
- Incorrect interpretation of business expenses (46%)
- Flawed VAT claims or charges (41%)
- Personal tax planning blunders (35%)
Take, for instance, a recent (and common) anecdotal case of a sole trader who asked a popular LLM about the VAT threshold. The AI, potentially pulling from outdated training data or US-centric sources, gave advice that led the trader to believe they didn’t need to register for another six months. By the time they sat down with a human professional, they were looking at a backdated bill and a failure-to-notify penalty.
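The VAT registration test the chatbot fumbled is mechanical: you must register once taxable turnover over any rolling 12-month period meets the threshold, not once turnover in a tax year does. A minimal sketch of that check, assuming the current UK threshold of £90,000 (verify the figure on GOV.UK before relying on it) and using made-up monthly figures:

```python
# Minimal sketch: rolling 12-month turnover vs. the UK VAT registration
# threshold. The £90,000 figure is assumed current at the time of writing;
# check GOV.UK. All turnover figures below are illustrative, not client data.
from collections import deque

VAT_THRESHOLD = 90_000  # GBP; assumed -- confirm against HMRC guidance


def months_breaching_threshold(monthly_turnover):
    """Return the (0-based) indices of months where the trailing
    12-month turnover meets or exceeds the threshold."""
    window = deque(maxlen=12)  # keeps only the last 12 months
    breaches = []
    for i, amount in enumerate(monthly_turnover):
        window.append(amount)
        if sum(window) >= VAT_THRESHOLD:
            breaches.append(i)
    return breaches


# 18 months of illustrative turnover for a growing sole trader
turnover = [5_000] * 6 + [8_000] * 6 + [12_000] * 6
print(months_breaching_threshold(turnover))  # first breach in month 13
```

The point of the sketch is the `deque(maxlen=12)`: the test is a sliding window, so a trader can breach mid-year even when no single tax year's turnover looks alarming, which is exactly the trap an LLM trained on annualised or US-centric material misses.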
2026: The Year of the “AI Insolvency”?
If 2025 was the year of experimentation, 2026 looks to be the year of the reckoning. The Dext research suggests a darkening horizon. A third (33%) of professionals warn that a continued reliance on public AI could trigger business failures and insolvencies next year.
Why the sudden escalation? Consider the timing. With Making Tax Digital for Income Tax (MTD for IT) looming, thousands of landlords and sole traders are about to be thrust into a more frequent, digital-first reporting cycle. The temptation to use a “free” chatbot as a proxy for a qualified accountant will be at an all-time high just as the compliance stakes are raised.
Furthermore, 43% of accountants expect a rise in fraudulent or inappropriate claims justified by AI outputs. It creates a "false confidence" loop: the business owner feels empowered by a professional-sounding bot, submits a claim, and then faces the full brunt of HMRC scrutiny, which 37% of respondents expect to intensify in the coming year.
A Call for Guardrails
The sentiment in the industry is shifting from curiosity to a demand for control. A staggering 92% of accountants believe that public AI tools should be regulated or restricted when providing financial or tax advice.
As practitioners, we know that tax is not a "pattern" to be predicted; it is a set of rules applied to a specific, messy human context. A chatbot doesn't know your client's long-term exit strategy, their family dynamics, or their specific risk appetite.
“AI has a powerful role to play in finance,” notes Lodder, “but general-purpose LLMs should not be mistaken for tax advisers.”
The Verdict
The message to our clients needs to be clear: AI is a powerful calculator, but a terrible architect. As we move into 2026, the value of the human accountant isn't just in "doing the books"; it's in providing the "human guardrail" that keeps a business from falling into the "AI slop" trap.
We aren’t just fighting for our fees; we’re fighting to keep British businesses solvent in an era of convincing-sounding misinformation.