
What’s a professor to do? That’s exactly what I had to ask myself this past semester when a student handed in several assignments that, to me, were clearly written not by him but by generative AI.
My dilemma: I couldn’t definitively prove it, but I knew it.
If you’ve taught in a college classroom these last two years, you no doubt know exactly the dilemma I’m describing. The rabbit hole of makeshift detective work that resulted may feel familiar, too.
The student’s “work” had a number of tells. It was a class on the 1960s in America, and the student’s research paper on the counterculture included references not only to what one might expect—Haight-Ashbury, psychedelic music, communes, Timothy Leary and Woodstock—but also just about every other anti-establishment movement of the decade, from the Black Panthers to the 1964 Free Speech Movement at Berkeley, the Freedom Rides to feminism, the Students for a Democratic Society to the protests against the Vietnam War. For good measure, the paper also tossed in references to Gloria Steinem, Rachel Carson, the Civil Rights Act of 1964 and the Voting Rights Act of 1965. It felt like a generic encyclopedia article on the ’60s.
In class, we were very clear when defining the counterculture and distinguishing it from all the other liberation and protest movements of the ’60s. We also discussed how it may have been political in a cultural sense—a rejection of materialism and conformity—but it was separate from the activism that opposed the war and fought for civil rights. My student was either oblivious to the readings and everything we covered in class, or there was something else going on.
So I decided to plug this prompt into a couple of AI sites: “Write a 10-page research paper on the 1960s counterculture in the United States.” And what they all spewed out were papers strikingly similar in content, language and organization to what my student handed in—clearly aggregating everything ever written about or associated with the phrase “1960s counterculture.” But given that AI evolves second by second, and there are now multiple sites that produce papers and essays with a simple click, I could not find any exact examples of plagiarism.
I then checked the paper’s sources, and sure enough, a number of the references had nothing to do with the content they were supposedly supporting. In one instance, the paper notes the psychedelic art and imagery used to advertise shows at Bill Graham’s Fillmore West, the legendary ’60s rock ’n’ roll venue in San Francisco. But the cited article mentions neither the Fillmore nor anything to do with the era’s psychedelic typography or imagery. Worth noting is the title of the article cited: “The Rise of 1960s Counterculture and Derailment of Psychedelic Research.” It had all the right words—“1960s,” “counterculture,” “psychedelic”—but it had little to do with what was in the paper. And that’s a characteristic of AI: inserting sources that seem like a good match simply because of keywords in the title.
In another case, when talking about the counterculture’s legacy, the paper turned in by the student describes how corporate America commodified countercultural symbols, giving examples of Chevrolet using a Jefferson Airplane song and fashion brands mass-producing tie-dyed shirts. But again, the cited article had nothing to do with the point being made—it didn’t include any of the examples, and it was focused primarily on the Netherlands—though the title sounded like it could have been relevant, “Children of the Revolution: The Impact of 1960s and 1970s Cultural Identification on Baby Boomers’ Views on Retirement.”
There were other instances of strange sourcing, such as citing the documentary No Direction Home to explain how some of Bob Dylan’s songs were anthems for the civil rights and antiwar movements, or referencing a Harvard psychology department biography of Leary to explain his iconic “turn on, tune in, drop out”—a phrase we discussed in class that didn’t need a citation. Cherry-picking sources is another sign of AI.
Also interesting: the student emailed me a day after the research paper was due, apologizing for not having turned it in, claiming he was sick but also saying that he thought the deadline was the following week. Yet even though the student believed he had an extra week to write it, somehow he had the paper ready to include in that email. Researching and writing a paper in less than 24 hours would be an impressive feat—or, perhaps, the student simply hit “enter” on an AI site.
For the final exam, students were to choose from two essay questions. Once again, there were tells suggesting an AI at work. The essays submitted by the student didn’t weave in any of the assigned readings or class examples, as required, and instead provided broad brushstroke answers that were exactly what AI produced when I entered the questions into various sites. I actually found one site that listed word-for-word three of the six section titles used in one of the two essays.
But my student was also clever. He knew that the exam required references to the readings, so he attached a bibliography at the end that included random readings from the syllabus, and in one case showed the date he allegedly accessed the document. But there was not a single example or idea in his essays that I could trace to any of the readings he listed, and some of the sources cited had absolutely nothing to do with anything he supposedly wrote. I smiled at his resourcefulness, but not at the apparent cynicism.
Because this student’s papers were late, they got bumped to the very end of my grading queue, so I planned to raise my concerns with him when he was supposed to stop by and hand in his final exam. But he emailed me the final instead, saying he was stuck at his internship. My university gives us 72 hours to submit grades after the final exam, and I had mounds of finals and a few late papers to grade. So all I had time to do was email him expressing my suspicions. He never replied.
With no provable evidence of plagiarism, I had no grounds to fail him or even raise this with administrators. After all, he could claim that he’s a student, that students make sourcing mistakes, that he’ll be better the next time, and that much of what he “wrote” reflects the history of the ’60s, which in a general sense it did. So I simply graded him based on his problematic citations, apparently fabricated bibliography and lack of examples from the readings and class. He didn’t do as well as I suspect he hoped.
I will use this case in future classes as an object lesson and a warning. But I’ve also come to terms with the fact that some students come to college for an education, and others simply for a credential. The former will learn how to research, think critically, identify patterns in culture and history, and synthesize large amounts of material—which is exactly the type of mindset that will enrich our society and help them succeed. As for the latter, I have confidence that their truth will come out—that they will flatline in their careers because life in the real world will expose their reliance on shortcuts, cunning and cheating.
That said, I will continue to give my heart and soul as a teacher to all of them, regardless of why they’re here, because I still have faith that educators can light the spirit of inquiry in their students even if their original goal for attending college is more about résumé building than learning.