In Felicity Harber v HMRC [2023] TC09101, ‘artificial intelligence’ (AI) failed to assist a taxpayer in her appeal against penalties after it emerged that the AI had invented the case citations she relied on in her defence. The First Tier Tribunal (FTT) noted that this was not the first time AI had created fake cases.
- The appeal centred on the taxpayer's failure to notify Capital Gains Tax (CGT) on the disposal of a residential property.
- HMRC assessed her under-declared gains after querying her receipt of undeclared rental income and charged Penalties for failure to notify.
- In making her Appeal to the FTT, she claimed poor mental health as her Reasonable excuse for her failure.
- It transpired that, in preparing her defence, she had relied on the output of AI.
First Tier Tribunal Judge Anne Redston, a former visiting professor of tax law at King's College London, was quick to spot the discrepancies in the defence: the AI had also made up the case names. As a result, the FTT considered the failings of large language model AI, such as ChatGPT, in some detail.
It transpires that ChatGPT's output has already been caught out in the same way. The FTT noted the US case of Mata v Avianca 22-cv-1461 (PKC), in which two lawyers sought to rely on fake cases generated by ChatGPT. They placed reliance on summaries of court decisions which had “some traits that are superficially consistent with actual judicial decisions”. When directed by the Judge to provide the full judgments, the lawyers went back to ChatGPT and asked “Can you show me the whole opinion?” and ChatGPT complied by inventing a much longer text. The lawyers filed those documents with the court on the basis that they were “copies…of the cases previously cited”. The Judge reviewed the purported judgments and identified “stylistic and reasoning flaws that do not generally appear in decisions issued by United States Courts of Appeals”.
In the FTT case, the AI did not 'know' that there was a difference between the offences of failure to notify and late filing, and it generated a mixed list of cases which Mrs Harber used in her appeal against tax penalties. The AI also used the US spelling of 'favor' in what were purportedly British judgments and suspiciously repeated certain phrases across the cases it cited.
After reviewing the cases and working through how the AI might have created such a work of fiction, the FTT found as a fact that Mrs Harber was not aware that the cases in her Response were fabricated and did not know how to locate or check case law authorities using the FTT website, BAILII or other legal websites.
It also found as a fact that the cases in the Response were not genuine FTT judgments but had been generated by an AI system such as ChatGPT.
Ultimately, the taxpayer did not provide sufficient evidence to persuade the FTT that she had a reasonable excuse for not declaring her CGT. Despite her health claims, the FTT found that she was leading a normal life, collecting rental income, and that nothing would have prevented her from seeking tax advice. She had no excuse for failing to declare her CGT and her appeal was dismissed.
Useful guides on this topic
Penalties: Failure to Notify
What tax penalties apply if you fail to notify HMRC that you are chargeable to tax? Can they be appealed or reduced?
Capital Gains: How to report
How do you report your capital gains? What return do you use? There are different ways for individuals to report capital gains depending on whether you are resident or non-resident, and whether you are in or out of Self Assessment.
How to appeal a tax penalty
What are the steps in making an appeal? What should your appeal cover? What does recent case law say on this topic?
External links
Felicity Harber v HMRC [2023] TC09101