Students are headed back to school armed with AI. Here's how some colleges are adapting.
Different teachers have approached AI in different ways. Diane Gayeski, a professor of strategic communication at Ithaca College in New York, requires her students to use AI, knowing that they will have to show proficiency in AI tools in the corporate communications world. "[AI] is an expected part of the portfolio in the same way that [students] are expected to use PowerPoint or Excel, right?" Gayeski told Stacker.
A survey of nearly 2,700 U.S. instructors by education research group Ithaka S+R found that 3 in 5 teachers allow or encourage students to use AI. Some of the more accepted applications include brainstorming ideas, drafting or editing written assignments, and creating outlines.
When Leslie Layne started teaching with ChatGPT, a language model trained on vast amounts of text from the internet, she had her students ask it for articles to back up the information it provided, and the chatbot obliged. But there was a problem: Most of the articles weren't real, said Layne, an associate professor of English at the University of Lynchburg in Virginia. She told Stacker that not all of the information was accurate, either, because ChatGPT is a writing tool, not a search engine. A Purdue University study published in May 2024 found that it still offered incorrect information 52% of the time. Layne's experience serves as a cautionary tale as colleges and universities grapple with AI tools that are advancing at breakneck speed.
One of the most salient concerns about using AI is its effect on the quality of education. In a survey of provosts by Inside Higher Ed, more than 7 in 10 respondents said they were worried about the threat to academic integrity.
AI has inspired bans from several educational institutions, including one of France's top universities, Sciences Po, and Bangalore's RV University. In Australia, a few jurisdictions have blocked ChatGPT on school networks. In the U.S., schools in New York City, Los Angeles, and Seattle have done the same.
Professors interviewed by the Associated Press in August 2023 described wanting to "ChatGPT-proof" test questions and assignments, with some requiring students to show their drafts. One writing professor, Timothy Main of Conestoga College in Canada, told the AP that he had logged 57 academic integrity issues that semester, about half of them involving AI, up from eight in each of the two previous semesters.
Despite heightened concerns about AI misuse, rates of cheating have remained largely stable since ChatGPT's debut. A Stanford Graduate School of Education study found that the share of students reporting "cheating" behavior in the previous month, between 60% and 70%, stayed the same or even declined slightly after ChatGPT's introduction.
It can be easy to spell out AI's pitfalls in education. "The college essay is dead," The Atlantic declared in December 2022, casting the technology as another blow to the already struggling humanities. Supporters, however, see advantages as well, provided the tools are applied with nuance.
Layne likens ChatGPT to a calculator, a tool that can help brainstorm points to support a premise, for example, or create an outline. "It's quite bad at the actual writing," she said. ChatGPT's weakness in writing presents an opportunity for students to develop critical thinking. Layne asks her students to grade a ChatGPT-produced essay. Gayeski requires her students to document how they approach AI, their prompts, and their final results.
Students aren't the only ones who stand to benefit; educators do too. The same Ithaka S+R survey found that nearly 3 in 4 instructors had experimented with AI for instruction. The most popular use was designing course materials, though fewer than a quarter (22%) said they used AI for that purpose. AI also served as an extra resource for administrative tasks (16%) and creating visualizations (15%).
As revolutionary as AI is, it demands that education be just as adaptable, shifting the typical top-down teacher-student relationship toward something more collaborative. The World Economic Forum points out that AI can radically personalize learning, benefiting everyone with access, including neurodiverse students and students with disabilities.
At Boston University, the computing and data sciences department has adopted a student-designed policy that openly acknowledges AI use and encourages transparency from students and teachers.
Azer Bestavros, associate provost for computing and data sciences at Boston University, told NBC Boston: "We decided that [AI] is something to embrace, and to use actually to elevate the game for everybody. It's about making teachers better teachers, and making students better students."
Story editing by Carren Jao. Additional editing by Kelly Glass. Copy editing by Paris Close. Photo selection by Clarese Moller.