NEW ROCHELLE, NY (May 8, 2025) — Lea Cohen, a student at New Rochelle High School, is spotlighted in a New York Times article for her stance on whether using artificial intelligence (A.I.) for schoolwork is cheating or a helpful learning tool, as educators and students navigate the growing presence of A.I. in education.
SEE: NYTimes: What Students Are Saying About Using A.I. for Schoolwork
Cohen argues that using A.I. for class assignments constitutes cheating. “It does not come from themselves, since the answer was produced in a matter of seconds,” she said. She believes A.I. disconnects students from their work, undermining personal thought and learning. “It’s too much of a fast track to completion,” Cohen added, noting that schools haven’t adapted to integrate A.I. supportively.
Educators face a dilemma of their own. Many aim to restrict student A.I. use out of concern about cheating, even as they increasingly adopt A.I. tools for tasks like grading essays and tutoring, according to the Times article by Dana Goldstein. This raises ethical questions, such as whether it is fair to use A.I. to grade essays while prohibiting students from using it to write them.
Jennifer Carolan, a former history teacher and founder of Reach Capital, said, “A.I. is already being used by the majority of teachers and students.” Yet some educators worry that A.I. applications, like tutoring bots, disrupt the human relationships central to teaching.
In Washington, D.C., Alex Baron, an administrator at E.L. Haynes Public Charter School, considers A.I. math apps, which provide step-by-step solutions that students can copy, a form of cheating. However, Baron uses A.I. to analyze student data for targeted support.
In Providence, R.I., history teacher Jon Gold employs A.I. ethically in lesson planning, training ChatGPT with his curriculum to edit readings or create sample essays.
“Transparency is key,” Gold said, explaining his A.I. use to model ethical behavior. He allows A.I. for summarizing notes but prohibits it for drafting essays or research, stressing that synthesizing sources is vital for learning history. “I am more pro-A.I.-literacy than I am pro-A.I.-use,” he said.
Some students echo Cohen’s concerns, warning that even limited A.I. use risks dependency. Cohen noted that schools should be vigilant about A.I.-assisted work passed off as original, as it “starts small, then gradually becomes a big problem.”
This article was drafted with the aid of Grok, an AI tool by xAI, under the direction and editing of Robert Cox to ensure accuracy and adherence to journalistic standards.
Engle
April 24, 2025
Is it cheating when students use artificial intelligence to help them with their schoolwork?
In your opinion, how, if at all, should students be allowed to use A.I. in school?
How concerned should teachers be when students pass off A.I.-assisted work as their own?
In “Teachers Worry About Students Using A.I. But They Love It for Themselves,” Dana Goldstein writes about educators’ concerns over students’ growing use of A.I. She also notes that teachers themselves are using it more and more.
As artificial intelligence makes its way into schools, a paradox is emerging.
Many educators, concerned about cheating and shortcuts, are trying to limit student use of A.I.
At the same time, teachers are increasingly using A.I. tools themselves, both to save time on rote tasks and to outsource some of their most meaningful work, like grading essays and tutoring struggling students.
That tension has prompted some difficult ethical questions. For example, is it fair to use A.I. to grade student essays, if you’ve prohibited students from using A.I. to write them?
School leaders are grappling with these dilemmas as they confront a barrage of marketing claims around how A.I. could “transform,” “personalize” and “accelerate” learning.
A.I. “is already being used by the majority of teachers and students,” said Jennifer Carolan, a former history teacher and founder of Reach Capital, a venture capital firm that invests in A.I. learning tools.
But as the technology works its way into schools, some educators say they are concerned that tech companies are pouring resources into A.I. applications, like tutoring bots, that disrupt the human relationships at the core of teaching and learning — instead of creating tools to ease the bureaucratic burdens that shift adults’ attention away from children.
The article continues:
Among middle school students, word has gotten out about a solution for tricky math assignments. If you take a photograph of a problem and feed it into one of several free A.I. apps, the software will show you the correct answer and break the solution down step by step.
It’s easy to then copy those steps out, exactly as if you had solved the problem by hand.
Alex Baron, an administrator at E.L. Haynes Public Charter School in Washington, D.C., said he considered the widely used math apps a form of cheating.
But he acknowledged that he has found some compelling uses of A.I. in his own work. For instance, he can analyze students’ academic and behavioral data, and then split them into groups for targeted support.
Ms. Goldstein writes about some of the gray areas of the debate over the use of A.I. in classrooms:
In Providence, R.I., middle school history teacher Jon Gold has found generative A.I. useful in lesson planning.
He trained ChatGPT by feeding it dozens of pages of curriculum materials he wrote over many years. That helped the bot spit back useful material. It can edit a long reading assignment down to three paragraphs for a short exercise, or create dummy essays that illustrate for students the difference between an effective essay and one that lacks supporting evidence.
Transparency is key, he said. He explains to students exactly how he has used A.I., in part to model ethical use.
Asking a chatbot to summarize notes into a study guide is a good idea, for example. But he does not want students using A.I. to draft essays or conduct research. He tells them that finding various sources of information and synthesizing them in writing is key to learning history.
He also talks with students about knotty ethical issues around how chatbots rely on copyrighted material and consume an immense amount of energy.
“I am more pro-A.I.-literacy than I am pro-A.I.-use,” said Mr. Gold, who teaches at Moses Brown School, a private Quaker academy.
Students, read the entire article and then tell us:
Is it a form of cheating for students to use artificial intelligence for class assignments? Why, or why not?
How concerned should schools be when students pass off A.I.-assisted work as their own? Is it a big problem?
What rules does your school have about student use of A.I. on class assignments, and do you believe these rules are fair and effective? If you were a teacher, what rules about A.I. would you create for your own classroom?
What do you think about the use of A.I. by teachers? Ms. Goldstein writes: “Is it fair to use A.I. to grade student essays, if you’ve prohibited students from using A.I. to write them?” What’s your answer to that question?
The article details how A.I. in schools is big business. Do you think schools and educators should invest more in A.I. or be wary of it? Would you like to see your school invest in it?
How much have you used A.I. — in or out of school? What do you see as its benefits and drawbacks?
Stephanie Elizalde, the superintendent of the Dallas school district, says of A.I.: “It’s irresponsible to not teach it. We have to. We are preparing kids for their future.” Do you agree? Do you think there will be a day when A.I. plays a major role in all schools and classrooms? Should we be doing anything now to prepare for that day?