High Court Judge Warns That AI-Generated Expert Report Constitutes ‘Gross Breach of Duty’

A solicitor’s use of an AI-generated draft expert report has been described as a “gross breach of duty” by High Court judge Mr Justice Waksman at the recent Bond Solon Expert Witness Conference. This serves as a clear warning to the legal profession about the proper boundaries for artificial intelligence, particularly where expert evidence is concerned.

Bond Solon Expert Witness Survey

The most recent Bond Solon Expert Witness Survey shows that 20% of expert witnesses have used artificial intelligence in their role, while 80% have not. Although the majority of experts remain cautious, this represents a significant increase from the previous year, when only 9.31% reported using AI.

Despite this rise, AI use among expert witnesses remains well below the wider UK workforce average. Research by KPMG indicates that 65% of UK workers intentionally use AI in their work, highlighting the relative caution within the expert witness community.

This reluctance appears to reflect ongoing uncertainty about when and how AI can be used appropriately in expert evidence. Concerns about misuse have been echoed at the highest judicial level. On 6th June 2025, the President of the King’s Bench Division warned that misuse of AI carries “serious implications for the administration of justice and public confidence” in the legal system.

Use Of AI-Generated Expert Report “Gross Breach Of Duty”

Addressing the conference, Mr Justice Waksman described a solicitor’s insistence on an AI-generated draft expert report as a “gross breach of duty”. His criticism extended beyond solicitors. He expressed serious concern at the survey finding that 14% of experts said they would accept instructions where an AI-generated draft report was provided. He stated:

“I cannot see how that can be appropriate conduct on the part of the expert, even if they’re doing it to avoid a row with the solicitor and they intend to dispose of the draft report soon afterwards”.

Updated Judicial AI Guidance

The conference took place shortly after judges received updated judicial guidance on the use of artificial intelligence, issued the previous week. The Artificial Intelligence (AI) – Judicial Guidance (October 2025) reflects an understanding that AI is now part of the legal environment. However, it also draws clear boundaries around its use, particularly where accuracy, independence, and judicial responsibility are concerned.

As cited in the announcement of the updated guidance, Lord Justice Birss, Lead Judge for Artificial Intelligence, said, “The use of AI by the judiciary must be consistent with its overarching obligation to protect the integrity of the administration of justice and uphold the rule of law. I welcome the publication of the latest AI Guidance, which reinforces this principle and the personal responsibility judicial office holders have for all material produced in their name. I encourage all judicial office holders to read the guidance and apply it with care”.

Judicial Use Of AI Tools

Conference attendees were reminded that judges are not prohibited from using AI tools. They have access to a private version of ‘ChatGPT 365’, designed for judicial use. Crucially, prompts and information entered into this system are not publicly accessible. This addresses concerns about confidentiality and data leakage, which remain central issues for legal professionals using public-facing AI platforms.

Mr Justice Waksman confirmed that judges do not have a duty to disclose their use of AI, drawing a comparison with the use of judicial assistants. The responsibility for the judgment, however, must always remain with the judge alone.

During his talk, Mr Justice Waksman also issued a clear warning against the use of AI for legal research or legal analysis.

The risk of hallucinations, including false precedents, remains a serious concern. Instances of fake authorities being cited have already been reported in several jurisdictions, including England and Wales.

For expert witnesses, Mr Justice Waksman’s warning was even more direct. He advised experts to “steer clear” of using AI to answer the substantive questions they are instructed to address.

Final Words

Mr Justice Waksman has made his view clear: experts should ‘steer clear’ of using AI to answer the questions it is their job to answer. AI tools may have a role in administrative tasks, document management, or high-level summarisation. They do not, however, have a place in generating expert opinions or shaping legal analysis.

We have been helping solicitors and other legal professionals with disciplinary and regulatory advice for 30 years. If you have any questions relating to an SRA investigation or an SDT appearance, please call us on 0151 909 2380 or complete our Free Online Enquiry.