Employment mediators are starting to use AI the way we previously adopted secure video platforms and electronic briefing: as a tool to reduce friction and move the case toward resolution. Used correctly, AI can help with preparation, organization, and drafting. Used carelessly, it can create avoidable confidentiality and privilege problems, especially in employment cases.
This post is about what I’m actually seeing, where AI helps, and the ground rules I recommend for counsel mediating employment disputes in North Carolina, including Charlotte, Raleigh, Wilmington, Winston-Salem, and Asheville.
Where AI is being used in employment mediations
Case intake and organization
Employment mediations often arrive with large sets of documents: personnel files, email chains, performance reviews, complaint histories, investigation summaries, pay records, and separation paperwork. AI is being used (by mediators and counsel) to turn that material into a clean chronology and issue list: who did what, when the key communications occurred, what the stated reasons were, and where the proof is strong or thin.
The value is time. If a mediator can get oriented quickly, the mediation spends less time on repetition and more time on risk and options.
Converting mediation statements into usable settlement discussions
Most mediation statements serve multiple purposes. They preserve a theory, stake out a position, and try to persuade. AI can help convert a longer brief into a practical outline: disputed facts, “hinge” issues, and what a jury is likely to focus on. It can also help generate targeted questions for the other side that clarify what is actually driving the case.
The limitation is obvious but important: AI is not weighing credibility. It is organizing information. The mediator’s job is still to assess how the story will be received and how the parties are experiencing the conflict.
Damages ranges and scenario testing
This is a legitimate use case. Employment cases frequently turn on ranges rather than point estimates: different assumptions about mitigation, timing to trial, front pay exposure, benefit value, bonuses/commissions, and the likelihood of fee shifting. AI-assisted tools can help model “if/then” scenarios and make sure nobody is negotiating from a spreadsheet with hidden errors.
The risk is false precision. A model is only as good as its assumptions, and the assumptions are often disputed. AI can help organize the math, not decide the facts.
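To make the "if/then" point concrete, here is a minimal sketch of the kind of scenario math I have in mind. Every number, variable name, and assumption below (back pay, a mitigation offset, front pay years, and a probability-weighted fee award) is hypothetical and purely illustrative; it is not a valuation method for any actual case.

```python
# Hypothetical scenario sketch: probability-weighted exposure under
# different sets of assumptions. All figures are illustrative only.

def scenario_value(back_pay, mitigation_offset, front_pay_years,
                   annual_front_pay, fee_exposure, p_liability, p_fee_shift):
    """Expected exposure under one set of assumptions."""
    economic = (back_pay - mitigation_offset) + front_pay_years * annual_front_pay
    expected_fees = p_fee_shift * fee_exposure
    return p_liability * (economic + expected_fees)

scenarios = {
    # label: (back_pay, mitigation, front_pay_yrs, annual_front_pay,
    #         fee_exposure, p_liability, p_fee_shift)
    "defense-favorable":   (80_000, 40_000, 0, 0,      150_000, 0.25, 0.20),
    "middle":              (80_000, 25_000, 1, 60_000, 150_000, 0.45, 0.45),
    "plaintiff-favorable": (80_000, 10_000, 2, 60_000, 200_000, 0.65, 0.65),
}

for label, args in scenarios.items():
    print(f"{label}: ${scenario_value(*args):,.0f}")
```

None of these numbers mean anything by themselves. The value is that every assumption is visible, can be challenged, and can be changed in caucus without hunting through a spreadsheet.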
Negotiation framing and bracket mechanics
Some mediators are using AI as a drafting and phrasing assistant for neutral language: explaining brackets, translating a positional number into what it might mean in movement, and proposing structured paths (for example, separate tracks for economics, non-economic terms, and timing).
The benefit is communication. In employment cases, it is easy for parties to hear offers as moral judgments. Better framing lowers the temperature and keeps the negotiation substantive.
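For readers less familiar with brackets, the underlying arithmetic is simple, and a quick sketch shows why a bracket proposal is really a signal about midpoints. The figures below are invented for illustration only.

```python
# A bracket is a conditional pair: "we will come down to X if you come up to Y."
# The implied midpoint is often the real message. Figures are illustrative.

def bracket_midpoint(plaintiff_number: float, defendant_number: float) -> float:
    return (plaintiff_number + defendant_number) / 2

current = bracket_midpoint(450_000, 50_000)    # where the raw positions sit
proposed = bracket_midpoint(300_000, 100_000)  # what the proposed bracket implies

print(f"Current midpoint:  ${current:,.0f}")
print(f"Proposed midpoint: ${proposed:,.0f}")
print(f"Implied movement:  ${current - proposed:,.0f} toward the defense")
```

Framed this way, a bracket reads as information about movement rather than as a verdict on the merits, which is exactly the temperature-lowering effect good phrasing is after.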
Term sheets and settlement agreement drafting
This is where AI can save real time. Late in the day, deals fail when the parties run out of energy or momentum before terms are reduced to writing. AI can help generate a structured term sheet and first-pass clauses for confidentiality, non-disparagement, neutral reference language, payment timing, and standard release provisions.
But this is also where mistakes are most costly. AI can produce language that sounds acceptable but is not. Every term has to be lawyer-reviewed, with particular attention to scope of release, carve-outs, payment and tax reporting, and any provisions that may affect future employment or licensing.
The main issue: confidentiality and privilege
Employment mediations routinely involve sensitive material: allegations of discrimination or harassment, medical information, mental health history, internal investigations, pay data, trade secrets, and personnel records. If that information is uploaded into an AI system without a clear understanding of storage, retention, and training use, you may have created a problem that did not need to exist.
The practical standard I use is simple: if you would not want the document in the hands of a third party outside the mediation, do not paste it into a tool unless you have a clear, written basis to believe it will remain confidential and not be stored or used in ways you can’t control.
Practical guardrails for counsel and parties
Put an AI paragraph into the mediation agreement when appropriate
This does not need to be complicated. The point is clarity. At minimum, the agreement should state whether AI tools will be used, whether documents may be uploaded, and that confidentiality obligations apply regardless of the tool used.
Use sanitized inputs whenever possible
If AI is used to summarize or draft, use redacted or de-identified materials. Names can be replaced with roles. Attachments can be summarized rather than uploaded. Most of the time, you do not need personal identifiers to move settlement discussions forward.
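As a simple illustration of what "replace names with roles" can look like in practice, here is a minimal sketch of a role-substitution pass run before any text leaves counsel's machine. The name-to-role mapping is hypothetical and would need to be built and verified by a person, not generated automatically.

```python
import re

# Hypothetical name-to-role mapping, built and checked by counsel.
ROLE_MAP = {
    "Jane Smith": "[COMPLAINANT]",
    "Robert Jones": "[SUPERVISOR]",
    "Acme Widgets, Inc.": "[EMPLOYER]",
}

def sanitize(text: str, role_map: dict[str, str]) -> str:
    """Replace known names with neutral role labels before sharing text."""
    for name, role in role_map.items():
        text = re.sub(re.escape(name), role, text, flags=re.IGNORECASE)
    return text

summary = "Jane Smith reported the comments to Robert Jones in March."
print(sanitize(summary, ROLE_MAP))
# -> "[COMPLAINANT] reported the comments to [SUPERVISOR] in March."
```

A scripted pass like this supplements human review; it does not replace it. It will miss nicknames, initials, and identifying details embedded in the facts themselves.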
Treat AI output as a draft, not authority
AI is helpful for structure and speed. It is not a substitute for legal judgment, and it does not know the case like counsel does. Any AI-generated summary or draft should be checked against the source material and the actual negotiation posture.
Be especially careful with releases and post-employment terms
Release language, confidentiality, non-disparagement, non-rehire provisions, and reference language are recurring sources of post-mediation conflict. If AI is used to draft, counsel should review the language line by line and confirm the terms match the parties’ intent.
Decide whether AI is “in the room”
Some parties are comfortable with AI being used behind the scenes for drafting. Others are not. Either position can be reasonable. What matters is transparency and agreement so that nobody leaves the mediation believing the process was handled differently than it actually was.
Bottom line
AI is already being used in employment mediations. In my view, the best use is narrow and practical: organization, scenario testing, and faster drafting, with strong confidentiality discipline and attorney review. The worst use is casual uploading of sensitive material into tools that were never intended to be part of a confidential dispute resolution process.
If you treat AI as a tool for efficiency, not decision-making, and you keep confidentiality and privilege front and center, it can help mediations move faster, close more reliably, and end in durable agreements.