Every academic department and program at Hillsdale is deciding what to do about artificial intelligence. So is The Collegian. But unlike the history or chemistry departments, The Collegian is run not by professors but by students.
That’s why our three rules — printed below — were written by our editors. Earlier this month, our staff gathered to decide what role AI should have in our work.
Together, we wrote these rules. Our journalism faculty made suggestions, but the substance of the policy is our own, and we think it is fair and straightforward. We will now apply it to everything published in The Collegian. And we will enforce it.
- Do not let artificial intelligence write for you. This includes composition and revision. The Collegian teaches students to write and edit their own work. It also promises its readers that our articles are written by people.
- You may use AI tools that highlight errors, such as misspellings, improper punctuation, and subject-verb disagreement. You may use AI as a dictionary and a thesaurus.
- You may use AI for research, such as the discovery or review of sources and documents. AI works well as a high-powered search engine. But do not cite an AI-generated answer as a source; AI is sometimes wrong, so check the sources it cites.
“Don’t use AI ever” is an impossible standard. You can’t do a Google search these days without seeing an AI response from Gemini.
A blanket ban also seems arbitrary in some cases. Word processors like Microsoft Word already flag misspellings and missing commas, and letting a tool like Grammarly highlight errors is fundamentally different from using it to compose something new.
For research and fact-checking, AI tools like Claude and ChatGPT can quickly lead a journalist to relevant sources. In those cases, the line between Google or Wikipedia and an AI tool seems fuzzy. And like these other resources, using AI as a search engine or to highlight errors doesn’t erase the need for human judgment.
But writing is different. Some students who have let AI write for them have told me something like this: “It was all my own thoughts. The AI just put them into words for me.”
This appeals to what Arkansas editorial writer Paul Greenberg called the “oldest fallacy about writing — the assumption that there is a clear distinction between writing and thinking.” This error, Greenberg writes, imagines that “once you’ve thought about something (wordlessly, somehow) you sit down and perform the mechanical task of putting it on paper. ’Tain’t so.”
Writing is not a mechanical task. “Writing is thinking,” per Greenberg. Every choice you make about your argument, reasoning, evidence, attribution, sentence order, diction, syntax, and even punctuation shapes the thought you express. Writing involves thousands of these little decisions, some conscious, some not. When AI writes for you, it thinks for you.
Some newspapers care less about this distinction. The 184-year-old Cleveland Plain Dealer, the city’s largest newspaper, has started using AI to draft news articles. The paper says it will boost web traffic and free its human journalists to do more reporting.
Maybe so, but that’s not what we’re here for. Just as “the goal of classroom writing is not for students to produce the best possible papers using any tool,” as Hudson Institute Senior Fellow Aaron MacLean put it in an essay, the goal of The Collegian is not to produce the best possible newspaper using any tool. That’s why we “reject the crutch,” as MacLean said. Providing our readers with an excellent newspaper is important, but it is secondary. Our primary aim is to teach students how to write, edit, and think.
We’ll take a B+ article written by a person over an A+ one written by a robot. And we suspect you, gentle reader, might appreciate that a human, not a disembodied thing, wrote this piece.
Thomas McKenna is a senior studying political economy.