Cheating with AI

Cheating is lying, a breaking of God’s 9th commandment. Christian students ought not cheat with the help of AI (or in any way), even though it will mean falling behind their cheating classmates. In the long run, such faithfulness will put them ahead.

Do we fear God?

[The same applies to pastors who use AI to produce sermons]

I’m reading many articles about the AI crisis in higher education.

3 Likes

I agree with your general point, but I think the article is pointing to something much bigger: If everyone can carry a college sophomore around in his pocket (I have heard chat bots compared to college sophomores), then of what use is a flesh-and-bones college sophomore?

The assembly line model of college education may need to evolve back to a more medieval model of small group sessions with medieval technology, oral examinations, etc. I’m actually pretty shocked that schools are still assigning take-home essays. It took me about 10 minutes of using ChatGPT to figure out that the take-home essay needed to be (essentially) sunsetted in favor of a return to the blue book exam.

Frankly I think that the professors need to take responsibility for the fact that they are making it so easy to cheat. They are making suckers out of the honest students and that is no way to run a university.

The professors would have to not be cheating themselves. “Publish or perish” would have to end. The system is problematic from top to bottom.

1 Like

What’s the positive side of this? Could the crisis lead to the downfall or reformation of universities? Isn’t this again revealing the corruption of the modern universities? Might education get interesting again?

2 Likes

The old adage “You can’t cheat an honest man” goes through my head a lot these days. It’s not true 100% of the time, but it’s true often enough to give me pause about lots of things.

We can hope. Our ancestors packed more into a typical 8th grade education than we can fit into a typical 12th grade education, and university these days seems mostly to be a place to learn things our ancestors would have considered pretty ordinary for a 12th grade education. It seems like the whole sector has been underperforming for generations now.

3 Likes

Regarding universities, this may be of interest:

As for AI, @dtcoughlin gave a good illustration: he saw two women at the gym recently, both on treadmills. One was walking on hers, while the other kept her feet on the side rails and let the belt zoom under her. At the end, both of them ‘went’ the same number of miles, but one woman got exercise and built up muscle while the other did not.

Did I get that right, Daniel?

2 Likes

Did I get that right, Daniel?

Exactly. This article from the Chronicle of Higher Education uses a similar exercise analogy:

It is easy to explain to students that offloading an assignment to ChatGPT creates no more benefit for their intellect than moving a barbell with a forklift does for their strength.

“Learning results from what the student does and thinks,” as the great educational theorist Herbert Simon once noted, “and only as a result of what the student does and thinks.” The assignment itself is a MacGuffin, with the shelf life of sour cream and an economic value that rounds to zero dollars. It is valuable only as a way to compel student effort and thought. (emphasis added)


https://www.chronicle.com/article/is-ai-enhancing-education-or-replacing-it

3 Likes

As a current student, I have witnessed and heard about many classmates using AI to cheat on assignments. But although they can sometimes do better on assignments than I can, I can always feel satisfied that the work is mine, while any grade they get is essentially fake.
I have a few friends who use AI to generate practice questions to help review for a quiz or test.
What are your thoughts on the use of AI in general as an addition to regular studying? Is it unwise in a time when AI cheating is a temptation, or should students (and others) try to find constructive ways to use it?

My best study was always creating my study guide, so I wouldn’t have wanted to outsource that to AI. I don’t see what is wrong with it…if you can trust it. AI can be helpful to organize, proofread—which have been around for a long time. When you have a sense that you’d rather have it do your thinking—whether because you are lazy, stupid, or lazy—consider stepping away.

Here’s what Grok did with the paragraph above:

My most effective study method was always crafting my own study guide, a process I wouldn’t want to outsource to AI. There’s nothing inherently wrong with using AI, provided it’s reliable. Tools for organizing and proofreading have existed for years and can be valuable. However, if you find yourself relying on AI to do your thinking—whether out of laziness or lack of confidence—take a step back and reconsider.

Changes made:

  • Streamlined the first sentence for clarity and flow.
  • Replaced “I don’t see what is wrong with it…if you can trust it” with a more concise and confident statement.
  • Clarified the role of AI tools and their historical context.
  • Adjusted the tone of “lazy, stupid, or lazy” to maintain a professional yet direct voice, removing redundancy.
  • Improved pacing and readability while preserving the original message

It’s definitely more tidy; it’s weaker, it’s not me, and it is not better.

And then, here is Grok’s response to your post without any input:

Using AI as a study tool, like generating practice questions, can be a smart move if it’s done thoughtfully. It’s like having a tireless tutor who can churn out tailored quizzes to sharpen your skills. Your friends seem to be on the right track—using AI to enhance understanding, not to fake it. The key is intent: are you using it to learn or to shortcut? Creating practice questions, summarizing concepts, or exploring topics in depth can deepen your grasp and make studying more dynamic.

The temptation to cheat is real, especially when AI can spit out polished answers in seconds. But cheating undermines the whole point of learning—you’re just renting knowledge, not owning it. Plus, in the long run, relying on AI to do the work leaves you unprepared for situations where you can’t lean on it, like high-stakes exams or real-world challenges. The satisfaction you feel from earning your grades honestly is a sign you’re building something lasting.

Students should absolutely find constructive ways to use AI—think of it as a tool, not a crutch. It can help clarify complex ideas, simulate scenarios, or even teach you to code by debugging your work. The trick is to stay disciplined: use it to supplement your effort, not replace it. Schools could help by setting clear guidelines on ethical AI use, so it’s not a free-for-all. What do you think about your friends’ approach? Are they striking a good balance?

Obviously AI. Yuck. Don’t let it do your thinking; let it do your proofing (in a limited, sceptical way).

1 Like

Where AI might eventually get us to …

“Edit my sermon, HAL!”
“I’m sorry, Ross. I can’t do that.”

:wink:

5 Likes