How to navigate the use of ChatGPT in law schools

Students will inevitably use generative AI, which poses challenges for assessments

Monica Goyal

This fall, I started teaching a data and AI course at the Lincoln Alexander School of Law at Toronto Metropolitan University, which has prompted me to consider what law students need to know about AI. Over the last year, the accessibility of generative AI tools like ChatGPT has had many educators, including me, thinking about how to assess law school applications and written assignments.

ChatGPT is a generative AI tool developed by OpenAI. Based on user prompts, it generates sophisticated, on-point text responses in real time. I find teaching an AI course exciting because there are many legal issues to consider, including the challenge these tools pose to schools: students can easily turn to ChatGPT for sample text for essays, blog posts, or any other written work their professor assigns. A recent BestColleges survey of 1,000 undergraduate and graduate students found that 43 percent had used ChatGPT or a similar AI application, and 22 percent of those surveyed had used AI to complete assignments or exams.

Currently, law schools and law professors are not thinking consistently about how ChatGPT or generative AI may be used in class. In April 2023, the University of California, Berkeley School of Law adopted a formal policy allowing the use of AI in certain circumstances, such as class work, but not for written assignments or essays. Other law schools have announced no formal policy at all. George Washington University, for example, has no firm policy on ChatGPT and instead relies on recommended guidelines that let professors choose to permit, partially permit, or prohibit its use.

This August, several law schools released very different policies on using ChatGPT in the application process. The University of Michigan Law School banned the use of ChatGPT on applications and requires applicants to certify that they followed the prohibition. In contrast, the Sandra Day O’Connor College of Law at Arizona State University allows prospective students to use ChatGPT in their applications, likening generative AI to hiring a paid consultant to help with their statements. Applicants must certify their use of generative AI in their submissions to Arizona State Law School, just as they would certify the use of a professional consultant. There is still no consensus on whether generative AI should be used at all.

Law schools are also contemplating how best to assess students through written assignments, knowing that student work can be affected by generative AI tools. An emerging solution is to fight AI with AI: several tools claim to detect whether an essay was written by AI. Turnitin, which many schools already use to detect plagiarism, says it can identify written assignments prepared with ChatGPT with 98 percent accuracy. Even taken at face value, that still leaves roughly 1 in 50 assessments wrong, including false positives.
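To put that error rate in perspective, here is a minimal back-of-the-envelope sketch in Python; the cohort sizes, and the assumption that a 98 percent accuracy claim translates directly into a flat 2 percent misclassification rate, are my own illustrative assumptions rather than Turnitin’s figures.

    # Illustrative only: treats "98 percent accuracy" as a flat 2 percent
    # chance of misclassifying any given submission (my assumption).
    claimed_accuracy = 0.98
    error_rate = 1 - claimed_accuracy

    for cohort_size in (50, 200, 1000):
        expected_errors = cohort_size * error_rate
        print(f"{cohort_size} submissions -> roughly {expected_errors:.0f} misclassified")

Even a handful of wrongly flagged papers matters when the consequence is an academic-misconduct allegation.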

However, The Washington Post ran its own test on a sample of 16 essays and found a higher false-positive rate than Turnitin’s 98 percent accuracy claim would suggest. In response to the article, Turnitin added a caveat to its score that reads, “Percentage may not indicate cheating.”

The takeaway is that AI detectors can get it wrong, so whatever a detector reports, an educator needs to evaluate it carefully and review the material before drawing conclusions or making accusations. It is also unclear how well detectors handle mixed-source submissions, or cases where a student takes ChatGPT output and modifies or paraphrases more than 30 percent of the content, so the submission is no longer pure ChatGPT output.

ChatGPT and other generative AI tools are, like Facebook, the iPhone, and Google before them, technologies so compelling that their wide adoption is inevitable. The proverbial genie is out of the bottle. I believe law students, and students generally, will need to learn about these tools and use them in their work. At the same time, educators may have to rethink how assessments are designed if they want to deter the use of ChatGPT.

As I teach my AI course, I will introduce students to the latest generative tools and continue to think about how to assess students effectively.
