AI Arguments Rejected in HICBC Appeal

AI-Generated Arguments Fall Short in Taxpayer's HICBC Appeal
A recent case before the First-tier Tribunal (FTT) demonstrates the limitations of relying on artificial intelligence (AI) for tax appeals in the UK. In Bodrul Zzaman v HMRC [2025] TC09520, Mr Zzaman’s use of AI to craft his legal arguments failed to persuade the tribunal, which dismissed his appeal against the High-Income Child Benefit Charge (HICBC).

Key Details of the Case
- Mr Zzaman’s adjusted net income for 2018-19 exceeded £50,000, and he, not his spouse, had the higher income.
- HMRC became aware in April 2021 that Mr Zzaman had not notified them of his HICBC liability.
- In January 2023, following correspondence, HMRC issued a discovery assessment of £2,501 to Mr Zzaman.
- Mr Zzaman appealed to the FTT, representing himself with a statement of case largely produced using AI tools.
Arguments Advanced by the Taxpayer

Mr Zzaman presented a range of arguments, including:

1. The case of HMRC v Jason Wilkes was relevant to his circumstances.
2. Retrospective assessment under the Finance Act 2022 was unfair.
3. Calculating Adjusted Net Income (ANI) is complex, especially for PAYE employees.
4. HMRC should have proactively notified him of his liability.
5. HICBC is unjust and breaches human rights.

The FTT noted that these points, many derived from AI-generated text, were either unclear or lacked sufficient substantiation during oral submissions.

Tribunal’s Findings

The tribunal addressed each of Mr Zzaman’s claims:

- HMRC’s assessment was upheld as timely and accurate.
- Retrospective legislation is lawful: no compelling argument was made against section 97 of the Finance Act 2022.
- Case law cited did not support the arguments: several references were irrelevant or could not be located. For example, Wilkes did not apply because Mr Zzaman missed the deadline for written appeals.
- Complexity of the ANI calculation is not grounds for relief: the tribunal acknowledged that ANI can be difficult to calculate, but this does not negate liability.
- HMRC has no general duty to notify: taxpayers are responsible for monitoring their personal tax obligations.
- Claims of unfairness or human rights breaches were not upheld: the laws, though seen as harsh by the taxpayer, were found to be proper and not in violation of his rights.

Emphasis on Human Oversight

Editor’s observations highlighted a crucial issue: while Mr Zzaman’s AI-generated arguments did not reference fictitious cases, as seen in the earlier Felicity Harber v HMRC [2023] case, the citations were often mistaken or irrelevant. This demonstrates the risks of relying solely on AI-generated legal materials.

"Human checks remain essential. AI tools may misinterpret questions and produce inaccurate legal references, which can undermine the credibility of an appeal."

The tribunal recommended measures to reduce risks when using AI for legal submissions, such as:

- Using precise and clear prompts.
- Requesting AI to reference specific paragraphs of authoritative sources for manual verification.
- Ensuring AI tools provide disclaimers or note uncertainties.
- Having users actively check that cited sources exist and support their arguments.

Takeaway for Taxpayers and Professionals

The Zzaman case reinforces the need for:

- Careful scrutiny of AI-generated legal arguments.
- Relying on verified case law and statutory authority in appeals.
- Recognising personal responsibility for notifying HMRC of tax liabilities.

Next Steps

If you are facing a similar situation:

- Consult a qualified tax adviser before submitting appeals.
- Use AI tools as a supplement, not a replacement, for professional advice.
- Double-check every case citation and legal argument before presenting it to a tribunal.

Explore further resources or seek professional support for complex tax matters to ensure compliance and robust legal representation.
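As background to the points about ANI and liability, the mechanics of the charge itself are simple arithmetic: for tax years up to 2023-24, HICBC claws back 1% of the child benefit received for every full £100 of adjusted net income above £50,000, reaching a full 100% clawback at £60,000. The sketch below illustrates this taper; the figures used are standard published rates and thresholds, not Mr Zzaman's actual numbers.

```python
def hicbc(adjusted_net_income: float, child_benefit_received: float) -> float:
    """High-Income Child Benefit Charge, as it applied up to 2023-24.

    The charge is 1% of the child benefit received for every full £100
    of adjusted net income (ANI) above £50,000, capped at 100% once
    ANI reaches £60,000.
    """
    if adjusted_net_income <= 50_000:
        return 0.0
    percent = min((adjusted_net_income - 50_000) // 100, 100)
    return round(child_benefit_received * percent / 100)


# Illustrative example: child benefit for two children at the 2018-19
# weekly rates (£20.70 for the eldest, £13.70 for the second) over 52 weeks.
benefit = (20.70 + 13.70) * 52   # roughly £1,788.80 for the year
charge = hicbc(55_000, benefit)  # ANI of £55,000 gives a 50% clawback
```

This also illustrates why the tribunal could acknowledge the calculation's difficulty for PAYE employees (establishing ANI requires aggregating all income and reliefs) while still holding that the liability, once ANI exceeds the threshold, follows mechanically.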