A Member of the Law Professor Blogs Network

Rethinking the Writing Sample

As my colleague and I were planning out our fall semester, we had big ideas about how we were going to teach legal analysis—by spending more time focusing on the real skills needed, such as rule identification, extraction, and synthesis.  We wanted students to understand and practice rule-based and analogical reasoning, rather than memorizing and reproducing the components of an office memo (questions presented, brief answers, and the like).  And we were excited to try new forms of assessment to measure student learning and success on these vital skills.  But then came the dreaded question:  if we didn’t have them draft and submit a memo for their final assessment, what would they use for a writing sample for summer job applications?

Legal employment, from summer clerkships to tenured professorships, has long relied on the writing sample to evidence an applicant’s ability to conduct and effectively communicate complex legal research and analysis.  But with generative AI capable of producing polished, analytical prose indistinguishable from human work, what is the writing sample actually proving today (if it ever proved anything at all)?

The perceived value of a writing sample lies in the belief that it can reveal a candidate’s ability to organize facts, apply law, and communicate persuasively.  It can also theoretically serve as evidence of a person’s diligence and attention to detail.  But this value exists only if the work is the candidate’s own unaided writing.

The unaided aspect has, of course, always been an issue with writing samples because good law students and lawyers seek help and feedback on their written work before submitting it to a class, court, or client.  And, from a pedagogical standpoint, insisting on unaided memo writing gives students a false sense of what legal work actually entails. Lawyers are expected to use every legitimate tool available (including colleagues) to produce efficient, accurate, and well-reasoned work. Banning assistance reflects a world that simply doesn’t exist and may never have.

Now, general-purpose generative AI tools like ChatGPT, Claude, and Gemini can draft legal-style memos, analyses, and briefs indistinguishable from human output.[i]  And that output appears even more lawyerly when produced by law-specific generative AI platforms, such as CoCounsel, Protégé, and Harvey, among others.  Employers can, of course, require that writing samples be free from AI assistance or that applicants disclose any AI use, but such requirements are unenforceable given the unreliability of AI detection tools.  The result is that writing samples can no longer be relied upon to accurately reflect a candidate’s cognitive ability or style; instead, we must assume they reflect a candidate’s ability to effectively prompt AI platforms and to evaluate and revise their output.

Maybe these are the skills that employers should be seeking now.  But the other skills that writing samples were meant to reflect are still vital to the profession.  So how can employers account for them?

Outside of the United States, some firms now require candidates to sit for in-person written assessments as part of the interview process.[ii]  Other legal employers are using psychometric tests, such as the Watson Glaser test, to measure critical thinking skills.[iii]

But, here in the United States, the writing sample still appears to reign supreme.  And, so long as it does, we need to be honest about what it is and is not demonstrating about a candidate.  We also must consider the effect of required writing samples on student learning and engagement.  The interactive and iterative process of feedback and revision is essential to learning, but if legal writing courses must instead have students produce writing samples independently for job applications, we are neither teaching them the skills they need nor preparing them for real practice.

*****

[i] See Tejal Patwardhan, Rachel Dias, Elizabeth Proehl, et al., GDPval: Evaluating AI Model Performance on Real-World Economically Valuable Tasks, available at: https://arxiv.org/pdf/2510.04374 (last accessed Oct. 13, 2025).

[ii] See, e.g., Hong Kong-based legal recruiter Ropner Lewis Sanders, How to Prepare for a Written Assessment, available at: https://www.ropnerlewissanders.com/how-to-prepare-for-a-written-assessment/ (asserting that “It is common for law firms to require potential candidates to sit written assessments during the interview process. Even for senior associate and counsel roles, having to undergo a written assessment is now becoming more common place.”).

[iii] See A Comprehensive Breakdown of Law Firm Assessments, available at: https://www.thelawyerportal.com/mastering-law-firm-applications/mastering-law-firm-assessments-a-comprehensive-breakdown/ (analyzing approaches in the United Kingdom).