Students, Staff Engage in Dialogue About ChatGPT

Erin Koo

Students ask ChatGPT questions.

A faculty workshop series on the use of artificial intelligence in a college setting, particularly generative text models like ChatGPT, began Feb. 9. Sponsored by the Gertrude B. Lemle Teaching Center in collaboration with the Bonner Center, the series aims to explore opportunities for faculty to incorporate generative text models, such as ChatGPT, into their curriculum.

Interim Director of the Bonner Center Thom Dawkins was inspired to host a workshop series after a conversation with Writing Associates Program Fellow Ryo Adachi.

“I had been speaking with Ryo Adachi … and in our conversation, we were talking about how to teach with ChatGPT and how they’ve been using it,” Dawkins said. “I reached out to a bunch of the deans and to [Associate Professor and Chair of Africana Studies] Charles Peterson, [director of] the Lemle Center.”

According to Senior Instructional Technologist Albert Borroni, the Lemle Center connected him with Dawkins after they both expressed interest in hosting workshops on ChatGPT. During the first workshop, Borroni provided faculty with a general overview of how the technology functions.

“I thought we really [needed] to have faculty understand what this is doing and why it’s doing it,” Borroni said. “It was machine learning. It takes data that’s fed to it. It will then create something novel based on that.”

Borroni hopes that future workshops will help faculty explore how they can use ChatGPT as an educational tool. Dawkins and Borroni have already found ways to incorporate AI into their lesson plans this semester.

“One of the first days of classes this semester, we took a look at my writing prompts for the first essay and the final essay, and I said, ‘Try to find the best way possible to have ChatGPT write your paper for you,’” Dawkins said.

Borroni, in addition to exploring ChatGPT with his students, has utilized it for lesson planning.

“I asked it to create a syllabus for 12 weeks for a non-majors neuroscience course, and it went straight through and did it,” Borroni said. “I had to edit a little bit of it … [but] I looked at it [and] I thought, ‘Oh, this is great!’”

According to College fourth-year and Hearing Coordinator for the Student Honor Committee Olivia Bross, as generative text models have become more accessible, the Student Honor Committee has considered concerns about increased plagiarism.

“The honor committee has had a couple of conversations regarding the increased prevalence of generative AI,” Bross wrote in an email to the Review. “After talking with administrators, we share similar concerns since it is more difficult to find enough evidence to prove that someone has plagiarized using AI.”

However, Bross also noted that plagiarism was a concern before ChatGPT, and that the use of ChatGPT is a violation of the Honor Code as it currently reads.

“From what has been communicated with the Student Honor Committee, there have been no clear plans to change language or procedures to account for AI,” Bross wrote. “An honor code violation using AI still falls under current language that the SHC uses, such as ‘the use of unauthorized materials’ and ‘plagiarism.’”

Associate Professor and Chair of the Computer Science Department Cynthia Taylor sees the danger of AI as lying less in its capabilities than in its limitations.

“I think the danger is, if you generate a lot of code using one of these systems and you put that into practice … you don’t know what bugs you’ve introduced,” Taylor said. “You don’t know what that code depends on [and] that is very dangerous, especially in terms of running code that people depend on … these black box systems [where] we can’t see what they’re doing inside … I would worry about students who would come to depend on these systems and not [have] to think critically about what they’ve generated.”

Despite concerns, Dawkins does not feel that generative text models pose a threat in the classroom.

“Coming from a humanities perspective, I think that we shouldn’t be as threatened by it because we’re not in the content production business,” Dawkins said. “We teach these disciplines, we teach writing as critical thinking, as ethical deliberation, as [a way] to explore how we really feel about an issue to connect with an audience.”

Borroni echoed this sentiment.

“If somebody wants to plagiarize, there’s paper mills all over the place already, so I’m not worried about it,” Borroni said.