Is AI a Match for Human Legal Advice?
Artificial Intelligence (AI) is playing an increasingly important role in the legal sector, offering new opportunities to enhance legal work. However, the legal professionals you instruct in your matter need to be mindful of how they use AI to assist their work. Over-reliance on, or misuse of, AI tools can present serious risks and challenges.
What can AI do well?
AI has made significant progress in assisting law firms and fee-earners with their workloads and business needs. This past May, the Solicitors Regulation Authority authorised what they described as ‘the first law firm providing a legal service through large language model artificial intelligence’. Costs are said to start from £2 for sending a ‘police chaser’ letter, while the firm can manage claims in England and Wales up to £10,000.
A £2 legal letter would no doubt be an attractive option to the public for obvious reasons, particularly for individuals pursuing small claims. It is becoming increasingly evident that the public is using AI tools, such as ChatGPT, for various legal needs. By relying on AI to draft anything from new enquiry emails to claim forms and statements of case, they are bypassing the instruction of lawyers in the early stages of a claim.
Law firms are clearly becoming aware of how AI technology can support legal work and enhance efficiency and productivity while reducing costs. The Financial Times recently noted that the legal sector will be ‘radically reshaped by AI’, with the technology playing a growing role in ‘commoditised…data orientated…analytical and administrative work.’
There is certainly a place for AI in work of this nature. However, there is a lot more to law, and AI arguably cannot provide the advocacy, rigour, attention to detail and strategic thinking that your human lawyers can; and rightly so!
The limitations of AI
Legal professionals (and the clients who instruct them) need to be mindful of how they use these tools in their work.
For all its benefits, recent judgments have suggested that lawyers will be taken to task if they are negligent in its use. In the case of Frederick Ayinde, R (on the application of) v The London Borough of Haringey [2025] EWHC 1040, the Defendant applied for a wasted costs order against the Claimant’s solicitor and counsel, on the grounds that five fake cases were put into the Claimant’s statement of facts and grounds for judicial review. While the court could not definitively state whether the cases were generated through the use of AI, Mr Justice Ritchie was of the view that ‘it would have been negligent for this barrister, if she used AI and did not check it, to put that text into her pleading.’
The Claimant’s lawyers were both referred to the relevant regulating authorities, and the court awarded a wasted costs order in favour of the Defendant. Furthermore, the Claimant’s costs for the judicial review were substantially reduced.
Ultimately, the later findings of the Divisional Court serve as a warning to us all, clients and legal practitioners alike:
“…our overarching concern is to ensure that lawyers clearly understand the consequences (if they did not before) of using artificial intelligence for legal research without checking that research by reference to authoritative sources. This court’s decision not to initiate contempt proceedings in respect of Ms Forey is not a precedent. Lawyers who do not comply with their professional obligations in this respect risk severe sanction.”
The continuing need for human lawyers
Law, at its root, is human-powered. While we cannot, and should not, dismiss AI for the reasons set out above, it must be seen as a tool that supports and enhances the knowledge, judgement and experience of legal professionals – not a substitute for it.
At Saunders Law, we specialise in complex legal matters that demand strategic thinking, collaboration and human insight. We offer informed and tailored advice that benefits from our vast experience. Arguably, AI cannot effectively replicate these qualities.
Fundamentally, as solicitors, we are bound by codes of conduct and principles of ethics. These rules require us to always act in the best interests of our clients and to uphold the integrity of the profession. AI tools, at least for now, are not bound by any such rules.
So, while it may be tempting to resort to ChatGPT to draft your judicial review letter before action, you may wish to ask yourself: Can I trust that everything in this draft is real and true? Do I have the capacity and the knowledge to check its contents? Could I be found out by a judge if anything is false? Might it just be better to instruct a lawyer instead and potentially save myself time, money and stress?
The answers are up to you.
Our lawyers at Saunders Law are experts in civil liberties, and we are passionate about assisting members of the public to protect their rights. Please contact us on 020 7632 4300 to discuss your matter.