Legal Technology

AI case study for law professors: How to build complementary teaching tools

Natalie Runyon  Content Strategist / Sustainability and Human Rights Crimes / Thomson Reuters Institute

· 7 minute read


Law school professors can create and deploy AI-powered tools that require minimal technical expertise, helping students improve their legal reasoning and skills by simulating courtroom scenarios

Key insights:

        • Creating prototypes of IP-protected teaching tools — Law school faculty can build working AI teaching tool prototypes in one to two hours without IP worries, because optional privacy settings create a closed system that keeps professors’ intellectual property protected.

        • Strong prompting skills create faster prototypes — The best prompts set the AI’s role, explain what the AI needs to accomplish, list which documents to reference exclusively, describe how the response should be formatted, and mention any applicable legal jurisdiction limits.

        • Feedback from students is positive — Students’ responses show AI simulators reduce anxiety and build confidence by providing unlimited low-stakes practice opportunities that make legal concepts more digestible through active dialogue rather than passive reading.


Law schools face a persistent challenge: how to provide individualized skills practice when one professor must serve many students. Traditional legal education offers limited opportunities for students to practice oral arguments, evidentiary objections, and witness examinations. Indeed, the repetition necessary to build authentic courtroom skills does not scale easily with law professors in the classroom alone.

To address this challenge, Prof. Alexandria Serra at the University of Missouri–Kansas City School of Law pioneered custom-built tools that simulate trial judges, three-panel appellate courts, witnesses, and evidentiary objection scenarios. Prof. Serra has seen firsthand how these tools give students unlimited, low-stakes practice opportunities that reduce their anxiety while building confidence in their legal reasoning and judgment.

Building your first AI learning tool, step by step

Creating custom AI teaching tools requires far less technical expertise than most professors would assume. As Prof. Serra explains, if you have a general idea of what you want the tool to accomplish, then “you can have a working prototype in less than two hours from idea to execution.”

The process begins with choosing a large language model (LLM) platform, such as ChatGPT, Claude, or Gemini, and securing a paid subscription, which most law schools will provide, she explains. During the sign-up process, optional settings enable a closed system to ensure law professors’ intellectual property is not shown to the students and is not used to train the LLMs.

Prof. Alexandria Serra

Next, gather class materials, including slides, case files, manuals, and practice problems the professor has already created. Then define one specific use case, such as an evidentiary objections practice tool, a Socratic method simulator, or a client interview assistant.

The building process itself takes about one to two hours and requires no coding skills. “You just start talking to the LLM like you are training a teaching assistant to do exactly what you want to do,” Prof. Serra adds.

Having built many tools, she highlights three critical components of an efficient, useful, and flexible prototype. These include:

1. Prompting skills

Effective prompting is key to generating a good prototype. According to Prof. Serra, the ideal prompt includes defining the AI’s role (“You are a trial judge in a federal district court”), specifying the task the AI should deliver, identifying which documents to use exclusively, describing the desired output format, and including any jurisdictional constraints.
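The five components above can be assembled into a single instruction block. The sketch below is a hypothetical illustration of that structure, not one of Prof. Serra’s actual prompts; the role, task, document names, and jurisdiction text are invented examples.

```python
# Illustrative sketch only: combining the five prompt components
# (role, task, sources, output format, jurisdiction) described above.
# All specific wording here is a hypothetical example.

def build_prompt(role, task, sources, output_format, jurisdiction):
    """Join the five components into one system prompt, one component per line."""
    return "\n".join([
        f"Role: {role}",
        f"Task: {task}",
        f"Use ONLY these documents: {', '.join(sources)}",
        f"Output format: {output_format}",
        f"Jurisdiction: {jurisdiction}",
    ])

prompt = build_prompt(
    role="You are a trial judge in a federal district court.",
    task="Rule on evidentiary objections raised by the student and explain each ruling.",
    sources=["Evidence course slides", "Objections practice problem set"],
    output_format="Short ruling (sustained/overruled) followed by a one-sentence rationale.",
    jurisdiction="Federal Rules of Evidence only; do not apply state law.",
)
print(prompt)
```

Keeping each component on its own labeled line makes the prompt easy to review and revise one element at a time, which mirrors the iterative “training a teaching assistant” workflow Prof. Serra describes.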

2. Multimodal features in AI tools

Most platforms allow for voice-activated chat mode, in addition to typing back and forth, which helps students respond out loud in real time. Custom AI tools also have shareable links, which enable easy deployment to students. Once a student engages with the tool, they can send back a transcript of the interaction. Some platforms even allow shareable audio files, so students can get feedback from their professors on skills performance, not just content.

3. Verifying reliability

Evaluating the quality of the AI output is important but naturally varies by use case. For classroom tools, Prof. Serra recommends deploying prototypes quickly and using students as testers. If the tool produces outputs with inaccuracies, she encourages students to bring these errors to class for discussion. That way, everyone learns how to critically diagnose problems with AI outputs. AI inaccuracies have a variety of causes: the model itself, poor prompting, incorrect legal reasoning, or incomplete training.

For wider deployment without the builder’s direct oversight, Prof. Serra recommends an extended period of testing and iteration. Her tool, MootMentorAI, which simulates a three-judge appellate panel for first-year law students preparing for oral argument, is one example. Because MootMentorAI was developed for use by a colleague, Prof. Serra worked with a research assistant to conduct 80 simulations over the course of a semester — 40 from the plaintiff’s perspective and 40 from the defendant’s perspective — to verify reliability and improve performance before deployment without her supervision.
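The testing protocol described above — a fixed number of simulations split evenly between the two sides of the case — can be sketched as a simple test plan. The function below is a hypothetical illustration of that balanced schedule, not Prof. Serra’s actual process; the shuffling step is an added assumption so runs alternate between perspectives rather than clustering.

```python
# Hypothetical sketch of a balanced reliability-test plan: 80 simulations
# split evenly between plaintiff and defendant perspectives, as in the
# MootMentorAI example. The seeded shuffle is an illustrative assumption.
import random

def build_test_plan(total=80, perspectives=("plaintiff", "defendant"), seed=0):
    per_side = total // len(perspectives)
    plan = [p for p in perspectives for _ in range(per_side)]
    random.Random(seed).shuffle(plan)  # interleave sides across the semester
    return plan

plan = build_test_plan()
```

A fixed seed makes the schedule reproducible, so a research assistant can rerun the same sequence of simulations when comparing tool versions.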

Overcoming adoption barriers among peers

Faculty resistance remains the most significant barrier to deploying AI-enabled teaching tools in legal education. “There’s lots of faculty pushback, distrust, and a healthy dose of skepticism with AI,” Prof. Serra acknowledges, arguing that even so, AI-powered tools are teaching assets for all law school courses. “Even in doctrinal classes that run on traditional Socratic dialogue, professors can still use AI to reinforce learning outside the classroom through tools such as podcast-style lectures, a multiple-choice practice assistant, tools to enable issue-spotting, and essay practice tied to course fact patterns.”

Common concerns among law school faculty include confidentiality, intellectual property protection, fear of revealing exam content, and perceived lack of technical expertise. However, Prof. Serra points out that these fears often stem from her colleagues’ misunderstanding of how closed systems work. Indeed, if privacy settings are correctly deployed, uploaded materials will not be used to train public models and students cannot access source documents.

The most effective strategy for overcoming resistance is personal demonstration, she says, noting that she frequently sits down with colleagues virtually to build tools based on the colleague’s own use case. She’s built everything from a startup CEO simulator for a business course, to an interview assistant for Career Services, to a simulated forensics expert for students to cross-examine. This grassroots approach, combined with speaking at conferences and identifying super fans who can champion the technology, gradually builds institutional buy-in, she adds.

Multifaceted student feedback

Student feedback has been overwhelmingly positive, with learners describing how AI simulators make legal skills training more accessible, more engaging, and less intimidating. In fact, students are often surprised by how convincingly AI tools can simulate judges, witnesses, and other real-world lawyering scenarios. They also appreciate having permission to use AI as a legitimate learning aid.

Students also report that real-time interaction makes course concepts more digestible because these tools turn learning into an active dialogue rather than passively staring at a casebook. Finally, they say the simulators reduce anxiety before oral arguments or presentations by enabling unlimited, low-stakes repetition that builds confidence and keeps practice from feeling overwhelming.

Clearly, AI tools are quickly becoming essential learning infrastructure, and legal education cannot afford to treat them as optional add-ons if it expects to stay relevant. As a growing chorus of educators and employers warns that institutions must evolve, the real question is whether schools will build responsible, faculty-guided systems fast enough to meet students where the profession is headed.

When deployed thoughtfully, these platforms can scale individualized skills training, deepen engagement beyond the casebook, and build durable confidence that law students can carry into their future legal practice.


A full copy of the Thomson Reuters Institute’s recent white paper, Responsible AI use for courts: Minimizing and managing hallucinations and ensuring veracity, is available for download.
