(Digital illustration by Sara Casias)

AI in Education: Helpful Tool or Sinister Danger?

Most college professors are wary of it, but others embrace it as a learning advantage

Dec 9, 2024 By Sasha Abramsky

This story is part of our December 2024 issue.

All around the world, academic institutions, from the renowned online Khan Academy through to leading research universities, are grappling with the extraordinarily rapid rise of artificial intelligence systems capable of generating human-like responses to complex questions. ChatGPT can, at speed, generate essays in response to prompts. AI systems can take — and pass — medical school and law school entrance exams. They can write computer programs. They can generate artwork and videography. AI programs have written symphonies and deep-faked the voices and instrument tones of rock stars. 

In this world where the boundaries between real and fake, human and non-human cultural products are collapsing at speed — and where AI has already fundamentally impacted how we engage politically and how we interpret the news — universities are on the front line in attempting to coordinate a response. Some, like UCLA, are incorporating ChatGPT into the suite of software tools the university makes available to faculty and staff. Meanwhile, Sciences Po, one of France’s top universities, has entirely banned the use of ChatGPT, threatening to expel students caught flouting this rule.

Closer to home, the Capital Region hosts several higher education institutions, including UC Davis, Sacramento State and the Los Rios Community College District. At each of these institutions, faculty and administrators are scrambling to keep pace with the AI revolution, trying to harness its benefits — for example, a sped-up research tool, a Grammarly or Google on steroids — while erecting firewalls against its harms. It is, however, playing out in a somewhat scattershot way. Most of the onus for reworking classes is falling on professors and lecturers, with administrations shying away from one-size-fits-all solutions handed down from on high.


“I really think the faculty, among their department and governance groups, are addressing the issue,” says Jennifer Laflam, Sacramento City College’s dean of the outreach centers at West Sacramento and Davis. “They are teaching students to use it as a tool: when is it appropriate as a tool, and to use it as an early writing tool, and then making the writing their own.”

In fact, across the region, college administrators have been extremely reluctant to hastily jump into the fray, concerned that if they take a heavy-handed approach to AI use they will hobble students who, once they leave college or university, will oftentimes be entering jobs in which proficiency in AI tools will be a requirement. 

Some are embracing AI reluctantly; others have few qualms. Sac State Chief AI Information Officer Sasha Sidorkin, despite concerns from more than a few faculty members, is unapologetically boosterish in his vision for AI.

Sitting in his office, his hair in a ponytail, his face heavily bearded, his fedora perched on a nearby filing cabinet, Sidorkin says that much of the fear about AI-facilitated cheating is simply a “moral panic.” He views it as being more a case of fear of the unknown than of a carefully thought-out critique. (Somewhat incongruously, though, he does recognize that there are at least some risks that out-of-control AI systems could cause civilization-wrecking consequences.) 

Last spring, Sidorkin’s office conducted a faculty survey on AI usage on campus. It reached no consensus as to what constituted cheating. In other words, what to one professor might seem illegitimate is to another simply technology-augmented research. “Everyone wants some guidelines,” he says. But, he adds with a chuckle, they all want the guidelines to reflect their particular class preferences for how to deal with the issue. 

Brian King, chancellor of the Los Rios Community College District, regards AI as “one of the greatest opportunities in higher education in many years.” (Photo by Rudy Meyers)

The report, based on the responses of 183 faculty members, found that many faculty also, through lack of knowledge about AI tools and lack of time to modify curricula to incorporate these new technologies, “do not yet routinely use advanced AI tools in their professional activities, with only a minor portion often or always using AI for tasks like course development, grading, or class preparation.” More than a third of the faculty who responded said that they explicitly banned the use of AI in research. More than a quarter banned it for brainstorming ideas around assignments. Nearly half banned it for drafting outlines of assignments. And close to 60 percent had a prohibition on using it to write the actual assignments.

Sidorkin regards faculty resistance to utilizing AI as a generally regressive phenomenon, and his office recently announced a new Sac State course titled “College and Career With AI.” He sees it as likely that several years down the line, his university will standardize its guidelines for AI usage. He hopes, however, that the powers-that-be will take their time doing so, worrying that if it’s done too soon, it could put roadblocks in the way of a powerful — and increasingly ubiquitous — educational tool, “sending students the wrong message, that this is something sinister.”

Yet for many faculty it is indeed sinister. Fifty-eight of the faculty respondents to the survey reported that they have put into their syllabus wording specifically discussing the use of AI in academic settings. Many warn that using these programs constitutes a “form of plagiarism or academic dishonesty.” More than half of the faculty who responded expressed concern that students who were overly reliant on AI wouldn’t learn important skills, from how to analyze and contextualize information through to being taught how to write a fluent narrative. Nearly half were concerned by the lack of a campus-wide policy regulating the use of these new technologies.

Sidorkin acknowledges that AI programs such as ChatGPT make it all but impossible to accurately assess students’ writing abilities based on outside-the-classroom assignments. But, he says, in many cases writing is simply a means to an end. And, if a machine can do it better than a human, he doesn’t see that as being any more inherently problematic than using a calculator to rapidly perform mathematical tasks that humans used to have to painstakingly and slowly perform themselves.

“This technology will free us from all these illusions,” he explains, sounding more like a techno-futurist than a typical university administrator. “We are a lot more mechanical than we like to think. When AI takes over these tasks, it frees us for more interesting things — more artistic, more creative.” Will it involve curriculum change? Of course, he says. But that’s not necessarily a bad thing. “For every subject, you can rebuild the curriculum from the ground up.” Traditional essay-writing will, he says, likely disappear. What will replace the essay? Only time will tell.

Other administrators and academics are somewhat more tempered in their approach. Brian King, chancellor of the Los Rios Community College District, regards AI as “one of the greatest opportunities in higher education in many years.” For department administrators, he says, it will allow them to develop schedules for professors, lecturers and students at previously unimaginable speeds. It will dramatically increase efficiencies in the registration process for classes. It will allow help desks to function 24/7. Yet, at the same time, King is all too aware of the risks that AI poses to the integrity of the academic setting, making it harder for faculty to know whether they are evaluating, and grading, the work of their students, or whether they are instead assigning scores to material produced by sophisticated AI systems.

Jennifer Laflam, dean of Sacramento City College’s outreach centers in West Sacramento and Davis, says faculty are addressing the use of AI and teaching students to use it as a tool. (Photo by Rudy Meyers)

As the year winds down, the Los Rios district academic senate has been meeting to develop policies on the new AI realities, a process that King says will continue throughout the remainder of the academic year. A similar set of summits is occurring within other community colleges, including one at Folsom Lake College. Faculty are holding conversations on how to identify AI-generated plagiarism. Some are considering moving away from homework assignments done outside the classroom setting and instead introducing more in-class assignments that, the theory goes, will be harder to farm out to ChatGPT.

“The research paper is going to be a different product and process,” King believes. Ultimately, each professor and each department will likely come up with their own ways of modifying assignments.

At Sierra College, academics and administrators are also wrestling with questions around AI. Earlier this year, a working group was set up to explore how to develop a set of best practices, though President Willy Duncan did not impose a hard deadline for the group to report back. Until it does, Duncan acknowledges, it is something of a “wild west” situation, with individual professors left to their own devices on how to enforce academic integrity policies and whether or not to punish students who employ ChatGPT and other software to essentially write their essays for them.

In general, Duncan is an AI booster, believing not only that its advance into the far reaches of academia is inevitable, but also that it is a good thing and one that industry will support over the coming years. “It’s a tool that is extremely powerful,” he explains. “It’s here, and it’s here to stay. I think it’s a powerful tool that can be utilized positively.” The Sierra College president is mindful of the fact that a growing number of businesses in a growing number of industries now want students to graduate from college competent in using AI to augment their work. And he is keen to steer the college in a direction that embraces, rather than shies away from, AI tools. “It’s a natural extension for us in our workplace preparation work,” he says.

For Andy Jones, academic associate director for academic technology services at UC Davis and a longtime lecturer in writing, the trick is to craft assignments that mandate such an individual response from students that it would be more work for a student to program ChatGPT to answer the question than to simply do the writing themselves. “Breaking larger assignments up into multiple parts; opening context specific to the classroom — to the book that’s just been read, for example — and including some autobiographical context.” 

Jones and his colleagues have been exploring assigning more writing tasks to students, but at less length. Rather than asking students to write a long essay on their own time — which might well end up being farmed out to ChatGPT — he thinks in the future there will likely be more emphasis on short in-class assignments. Meanwhile, by generating one example after another at speed, AI will be used to help students understand what things like dangling modifiers are, or to assist them in navigating the often-arcane world of syntax. Consider ChatGPT a form of “ersatz teaching assistant,” says Jones. “An always-available TA (teaching assistant) who can talk them through their concerns and activities and offer them lessons.” If the teacher in his or her traditional role is what Jones calls “the sage on the stage,” he argues that ChatGPT can be “the guide on the side.”

Sidorkin, who is hoping to make CSU Sacramento a hub for AI usage and development, largely agrees with this approach. “We are dealing with a new technology,” the information officer says. “Let people know more about it.” Done well, he argues, it will empower students. “We have to raise our expectations of students, not dumb down,” he argues. “Because now, with artificial intelligence, we have this wonderful tool.”  
