In a speech given to the Professional Negligence Bar Association in May 2024, Sir Geoffrey Vos, the Master of the Rolls, suggested that professionals would be “damned if they do and damned if they don’t” in their use of Artificial Intelligence (AI). In the future, professionals may face liability for losses caused by their reliance on AI or, in other circumstances, for choosing not to use AI in their practice. As the use of all forms of AI becomes ever more widespread, professionals cannot afford either to ignore AI or to embrace it uncritically – they should consider their potential liability if they use, or fail to use, AI and how to mitigate that liability as the technology evolves.
Damned if you do – use AI
Professionals might use AI simply to answer questions or help with research or to rationalise large volumes of data. Alternatively, they may embrace this technology on a more fundamental level, by asking AI to produce bespoke advice directly for their clients.
However, where a professional relies on AI that is misused, misinterpreted or malfunctioning when providing advice or a professional service to their client, and this causes the client a loss, the professional may be liable in negligence.
Professionals remain liable for the way they use tools such as generative AI (GenAI) in their practice because of the duty of care they owe to their clients, both under the contract or retainer with them and in tort law under the ‘special relationship’ between professional advisers and clients.
Any erroneous advice or output generated by GenAI is attributed to the professional, who is responsible for checking its validity and factual accuracy before relying on it or advising clients to rely on it. Where an AI model malfunctions and causes a loss, the professional may be able to bring a claim against the provider of the AI, and a client who suffers a loss may have a product liability claim against the AI developer, but this remains a largely unexplored and uncertain area of law and litigation at present.
Damned if you don’t – use AI
Professionals may also face a “new” type of professional negligence claim in the future if they choose not to use AI when providing professional advice or services. This stems from an established principle for assessing liability in professional negligence: a professional’s conduct or omission is judged against the accepted practice, or “reasonable body of opinion”, of others in the same profession.
This principle is known as the ‘Bolam test’ and was developed in the context of clinical negligence[1]. Where a clinician did something that resulted in an adverse outcome, they were not necessarily negligent if they acted in accordance with an accepted practice that was widely considered proper by a responsible body of medical professionals in that area of practice.
The Bolam test has subsequently been applied to cases involving a range of professionals, including construction professionals, valuers and financial advisers. The Supreme Court[2] has broadened the test so that the standard of practice expected of a professional is measured not only against that of their peers but also against the reasonable expectations of their client. A professional may therefore be negligent if their act or omission fell within the scope of the duty of care they owed their client and they failed to discharge that duty to the appropriate professional standard.
As AI tools become increasingly capable, specialist and available to professionals, their use may become ubiquitous across professional services. As well as being necessary to ensure that professional services are efficient, expeditious and competitively priced, AI may come to be expected of professionals as a means of giving the most accurate and up-to-date advice, reducing the scope for human error or oversight. Using AI in a profession may become the accepted standard of practice.
If confidence in AI reaches such a level, professionals who refuse or fail to use AI may face a new type of professional negligence claim for falling below a (new) acceptable standard. For example, a doctor who refuses to use an available AI tool to diagnose cancer or an accountant who fails to use an AI tool to check for fraud in a company’s books when undertaking an audit may be considered to have failed to meet the expected standard of their duty of care. Similar examples may be imagined in every professional sector.
Conclusion – professionals need to stay up to date
As professional practices continue to be transformed by AI, and particularly GenAI, the legal framework surrounding professional negligence – and indeed professionals themselves – can be expected to evolve to address this new world.
Professionals should stay up to date with AI developments in their fields, maintaining a careful balance between innovation and caution. We may soon see the courts adapting traditional negligence principles to this new technological paradigm. As this area of law develops, it will be vital for professionals, policymakers and legal experts to collaborate in shaping a framework that promotes innovation while safeguarding professional standards and client interests.
Our expert lawyers in our Commercial Litigation team have brought and defended many professional negligence matters over the years. We combine deep legal knowledge with a wealth of practical experience, ensuring a robust and strategic approach. Please get in touch to discuss your concerns.
[1] Bolam v Friern Hospital Management Committee [1957] 2 All ER 118
[2] Montgomery v Lanarkshire Health Board [2015] UKSC 11; McCulloch v Forth Valley Health Board [2023] UKSC 26
With thanks to Jaskiran Sandhu for researching and drafting this article. Jas is a final-year LLB student at the University of Warwick, undertaking a year-long legal placement with Wright Hassall in the Commercial Litigation team.
Key Contact
Nathan Talbott
Partner - Head of Commercial Litigation
Nathan is head of our tax and commercial litigation teams, dealing with a wide range of commercial and contractual disputes.
T: 01926 884661
E: nathan.talbott@wrighthassall.co.uk