Generative AI is now a commonplace feature of employee relations, with grievances, appeals, data subject access requests and even Employment Tribunal claims all showing tell-tale signs of its use. It can be very challenging to identify an employee's real concerns from a long screed of AI-generated text and incorrect references to legislation and ACAS guidance. However, the use of AI by managers and decision-makers can be equally problematic.
We've seen a number of examples of this cropping up in our practice, and it's increasingly being litigated in the Employment Tribunals. While it's tempting to lean on the considerable power of AI tools to summarise information, suggest options you haven't thought of and express your ideas cogently, there are some legal pitfalls you need to bear in mind.
Whose decision is it?
It's a fundamental requirement of a fair disciplinary or grievance process that the employee knows who is making the decision. This has caused difficulties in cases where the roles of the hearing officer and HR adviser were not clearly delineated. However, this issue becomes even more problematic if the manager tasked with making the decision has used AI to help them resolve disputed points of fact, research issues which have cropped up in the case, or summarise evidence. If the manager has relied on AI-generated output to make a decision, the employee could legitimately challenge this as unfair, particularly as the employee would have had no opportunity to influence how the prompt was framed (and different prompts could generate very different responses). Although, as far as we are aware, the Employment Tribunals have yet to consider this point formally, the basic principles of unfair dismissal and discrimination law make clear that the decision must be “owned” by the decision-maker to be defensible. The lack of transparency and accountability inherent in using AI for these purposes would make such decisions hard to defend in litigation. In addition, this is likely to fall foul of data protection requirements relating to automated processing.
Check your privilege
If you have ever received a lengthy advice note from a lawyer and spent hours trying to digest it, you will understand the temptation to ask an AI tool to provide a user-friendly summary. However, this may well mean that the legal advice loses its quality of confidentiality and is no longer privileged (which means that it may then be disclosable in legal proceedings). Much will depend on who potentially has access to the data that is plugged into the tool. Publicly available tools are the most risky, but businesses should also take care with supposedly “closed” systems: the data may be accessible by individuals within the business (such as IT administrators) who are not authorised to obtain legal advice on the company's behalf. Under the English courts' current strict approach to legal privilege, such dissemination may cause privilege to be lost.
Full disclosure
Where employees suspect that AI tools have been used when taking decisions about their employment, they may seek disclosure of the prompts entered - and Employment Tribunals may well be willing to order such disclosure if it is likely to reveal the basis of the employer's decision. However, many users delete old chats on tools such as Copilot (and some organisations auto-delete this material). With Employment Tribunal claims often taking in excess of 18 months to reach trial, this may leave many employers struggling to explain the basis for their decision, significantly damaging their defence.
With employees having an increasingly sophisticated understanding of how AI is used and how it can be leveraged in disputes, employers need to be alive to these risks and ensure that managerial enthusiasm for time-saving tools doesn't cost the business dearly in the long run.