ChatGPT Again? Lessons for Education

By Naomi S. Baron

ChatGPT pushes the boundaries for AI tools generating or editing text. In the process, it challenges educators and students to puzzle out where its use is or is not appropriate. Linguist Naomi S. Baron takes on the dual questions of how the bot potentially undermines thinking and what kind of guidance we might offer for deciding when and how to invoke ChatGPT for written work.

When a technology is new, would-be users understandably struggle to make sense of its perils or powers. Take the telephone, which arrived on the public scene in 1876. Less than twenty years on, a newspaper editor warned “not to converse by phone with ill persons for fear of contracting contagious diseases.”I And in 1905, a woman in Columbus, Indiana, wanted a telephone installed at the grave of a dead relative, looking to conveniently communicate with the dearly departed.II In both cases, the actual workings of the phone were still elusive.

Today’s technological marvel is generative artificial intelligence (AI), especially OpenAI’s ChatGPT. Since its launch on November 30, 2022, we have continued grappling with how—and how not—to use it. While the challenges confront us all, some of the greatest concerns are in education, particularly when it comes to students’ writing. There’s a legitimate fear that ChatGPT can undermine the writing process.

Emergence and Adoption of AI Writing Tools

Programmes like ChatGPT are only the latest in the progression of AI software for editing or generating text. The potential for AI to weaken both writing skills and the motivation to do your own composing has been decades in the making. Spellcheck and now sophisticated grammar and style programmes like Grammarly1 and Microsoft Editor2 are probably the most widely known AI-driven editing tools. AI text-generation developments have included autocomplete for Google searches, Gmail Smart Reply, and predictive texting.

Over the past two years, I have been asking university-age students in both the US and Europe about their attitudes towards using AI tools when they write. Participants in my surveys appreciated AI assistance with spelling and word completion, but they also admitted that

“At some point if you depend on a predictive text [program] you’re going to lose your spelling abilities.”


“Spellcheck and AI software … can … be used by people who want to take an easier way out.”

Regarding the use of predictive texting, one respondent mentioned laziness:
“It’s OK when I am feeling particularly lazy.”

while another perceived that predictive texting diminished her writing voice:

“Don’t feel I wrote it.”

A high school student in Britain3 echoed this same concern about individual writing style when describing Grammarly:

“Grammarly can remove students’ artistic voice…. Rather than using their own unique style when writing, Grammarly can strip that away from students by suggesting severe changes to their work.”

In a similar vein, Evan Selinger, a philosopher, worried4 that predictive texting reduces the power of writing as a form of mental activity and personal expression:

“[B]y encouraging us not to think too deeply about our words, predictive technology may subtly change how we interact with each other…. [W]e give others more algorithm and less of ourselves…. [A]utomation … can stop us thinking.”

Writing as a Thinking Process

In literate societies, writing has long been recognised as a way of facilitating thinking.5 Many people have quoted author Flannery O’Connor’s6 comment that “I write because I don’t know what I think until I read what I say.” A host of others, from William Faulkner7 to Joan Didion,8 have also voiced this sentiment. If AI text generation does our writing for us, we diminish opportunities for thinking out problems for ourselves.

One eerie consequence of harnessing programmes like ChatGPT to generate language is that the text is grammatically perfect. A finished product. It turns out that a lack of errors is a sign9 that AI, not a human, probably wrote the words, since even accomplished writers and editors make mistakes. Human writing is a process. We question what we originally wrote, we rewrite, or sometimes start over entirely.

Ideally, school writing assignments involve an ongoing dialogue between teacher and student: Discuss what the student wants to write about. Share and comment on initial drafts. Then it’s time for the student to rethink and revise. In practice, this sequence happens far too infrequently. Teachers often don’t have time to fill a collaborative editorial—and pedagogical—role. Moreover, they might lack interest, necessary skills, or both. Conscientious students sometimes undertake aspects of the process themselves, as professional authors typically do. But the temptation to lean on editing and text generation tools like Grammarly and ChatGPT makes it all too easy for technology to substitute ready-made results for opportunities to think and learn.

Before the appearance of ChatGPT, an earlier version of the same underlying programme, GPT-3, was already licensed by commercial ventures such as Sudowrite.10 Users enter a phrase or sentence and then ask the software to fill in more words, potentially stimulating the human writer’s creative juices. Yet there’s a slippery slope between collaboration and encroachment. Writer Jennifer Lepp admits11 that as she increasingly relied on Sudowrite, the resulting text

“Didn’t feel like mine anymore. It was very uncomfortable to look back over what I wrote and not really feel connected to the words or ideas.”

Students are even less likely than seasoned writers to recognise where to draw the line between a writing “assist” and letting an AI text generator take over their content and style.

Pragmatic Strategies for Guiding Student Use of ChatGPT

Over the past year, educators, AI mavens, and students have reacted to the new bot in their midst. Teachers and educational researchers have been developing a stream of creative uses of the technology12 to bolster learning. Meanwhile, a whole industry has risen up13 for detecting when AI, not humans, has done the writing. Yet, however crafty the tools become, they remain easy to fool.14 Just ask students15 about their success in gaming the system.

We need a way past the arms race between ChatGPT detection programmes and student circumvention. To this end, I suggest five points for students (and the rest of us) to consider when invoking generative AI as a writing maven.

Effort: How much are you willing to expend when you write?

Writing takes mental effort. Of course, so do many activities we undertake—solving calculus problems, learning the trombone, even deciding which book to read. Psychological theories explain16 that we don’t always need to apply 100 percent of our mental energies to a task. Sometimes, “good enough” is good enough.

It’s understandable to want to minimise effort in the writing or editing process by conscripting help from AI. As of now, local policies vary widely.17 More confusing still, sometimes they are unarticulated. But where rules exist, writers need to adhere to them.

Trust: Do you trust what AI writes?

We’ve all heard that generative AI can hallucinate. A lawyer learned this lesson18 the hard way by relying on ChatGPT to prepare a brief, which the bot filled with made-up citations. When it comes to AI’s truthfulness, users are well-served by the old dictum, “Trust, but verify.” Plus, beware that AI-based programmes like Microsoft Editor can be wrong19 on points of grammar or word choice.

The hitch is recognising when verification is needed, especially if you don’t know the topic or you’re not confident about your writing skills. Don’t automatically assume the AI programme knows better. Double check information you’re not sure about, and ask someone who’s a good writer to confirm your wording choice.

Writing skills: Does AI improve or weaken them?

Writing is often time-consuming, even painful. Like a teacher with infinite patience, AI stands ready to offer an assist.

Ideally, AI’s writing prowess can be harnessed to improve our own skills—editing text we’ve written, helping us generate new ideas, and maybe even making us better spellers. But there’s another possibility: By letting AI do large chunks of writing and editing for us, we risk losing our own writing abilities or, for younger users, not feeling motivated to develop skills in the first place.

Reducing human skills—not just for writing but in, say, art or argumentation—is one of the biggest challenges20 of today’s AI. What’s more, by maintaining your writing skills, you’re not stuck when there’s no internet connection and an essay is due in the morning.

Writing voice: Does AI compromise your own?

A cascade of AI writing tools—from predictive texting to Grammarly—offers up text completion or “better” ways of saying what we mean. But when we mutely acquiesce, what happens to our own writing voice? In my research involving predictive texting, one university student worried that “maybe it makes me a little too repetitive.”

We all have individual writing voices. Beware of letting AI obliterate yours.

Commitment: How much do you care?

The biggest elephant in the room is commitment: how much do we care about a particular piece of writing? Maybe the stakes are low for routine emails or blog posts. Next up the ladder is writing for which we have more responsibility. Our name is on that plot synopsis we bade AI to draft. Did we bother reading the book to gauge where we agree with what ChatGPT wrote and where we don’t?

But then come even higher rungs of commitment for putting words together. At its best, human writing is a process of self-discovery. AI has no self to discover. And remember as well that writing is ultimately a craft in which we can take pride, whatever our level of expertise. When we cede control to AI, we forgo personal artistry.

Writing is often time-consuming, even painful. Like a teacher with infinite patience, AI stands ready to offer an assist. But AI can also seduce us into minimising effort or trusting it when we shouldn’t. Its smooth prose (that companies paid billions to generate) potentially leads us to believe its skills and voice always surpass our own. It tempts us to forget how empowering it can be to think through a problem with a pen or keyboard at hand, and how satisfying it can feel to find our own perfect wording.

Writing is a profoundly human activity. AI now has a role in the drama, but people need to remain the playwrights.

This essay draws upon opinion articles that previously appeared in The Conversation (“How ChatGPT Robs Students of Motivation to Write and Think for Themselves,” January 19, 2023) and Inside Higher Ed (“5 Touch Points Students Should Consider About AI,” September 6, 2023).

About the Author

Naomi S. Baron is professor emerita of linguistics at American University in Washington, DC, and author of Who Wrote This? How AI and the Lure of Efficiency Threaten Human Writing (2023, Stanford University Press). She is a former Guggenheim Fellow, Fulbright Fellow, and Visiting Scholar at the Stanford Center for Advanced Study in the Behavioral Sciences. For more than thirty years she has been studying the effects of technology on language, including the ways we speak, read, write, and think.


  I. Marvin, Carolyn (1988). When Old Technologies Were New. Oxford University Press, p. 81.
  II. “Wishes Telephone Placed” (1905). Telephony 10, p. 65.


  1. Grammarly
  2. Microsoft Editor
  3. Grammarly Both Helps, Hinders Students, The Standard, November 3, 2020, Daniel de Beer
  4. Will Autocomplete Make You Too Predictable? BBC, January 15, 2015, Evan Selinger
  5. How Can I Know What I Think Till I See What I Say? Quote Investigator
  6. Flannery O’Connor, Britannica
  7. William Faulkner, Nobel Prize
  8. Why I Write, The New York Times, December 7, 1976, Joan Didion
  9. How to Spot AI-generated Text, MIT Technology Review, December 19, 2022, Melissa Heikkilä
  10. Sudowrite
  11. The Great Fiction of AI, The Verge, July 20, 2022, Josh Dzieza
  12. Empowering Students with AI Literacy, CRAFT
  13. 16 of the Best AI and ChatGPT Content Detectors Compared, Search Engine Land, April 25, 2023, Tom Demers
  14. AI-text Detection Tools Are Really Easy to Fool, MIT Technology Review, July 7, 2023, Rhiannon Williams
  15. I’m a Student. You Have No Idea How Much We’re Using ChatGPT, The Chronicle of Higher Education, May 12, 2023, Owen Kichizo Terry
  16. Two Brains Running, The New York Times, November 25, 2011, Jim Holt
  17. Law Schools Split on ChatGPT in Admissions Essays, Inside Higher Ed, August 4, 2023, Lauren Coffey
  18. The ChatGPT Lawyer Explains Himself, The New York Times, June 8, 2023, Benjamin Weiser and Nate Schweber
  19. Who Wrote This? How AI and the Lure of Efficiency Threaten Human Writing, Naomi S. Baron, Stanford University Press (2023)
  20. AI Is an Existential Threat – Just Not the Way You Think, The Conversation, July 5, 2023, Nir Eisikovits

The views expressed in this article are those of the authors and do not necessarily reflect the views or policies of The World Financial Review.