ChatGPT, an Artificial Intelligence (AI) tool that lets users instantly obtain human-like answers to questions or prompts, has sparked controversy over the past month. Ithaca College’s campus community is considering how the new AI chatbot might affect students’ academics.
Since OpenAI released the chatbot in November 2022, college students across the nation have been using it to complete their assignments and papers. Higher education institutions have been changing their academic policies to address the issue of students misusing the AI tool.
Doug Turnbull, associate professor in the Department of Computer Science at Ithaca College, said ChatGPT is based on a framework called the transformer, which first emerged in 2017 and enables a wide range of functions, like writing code or stories, across different disciplines.
OpenAI also launched ChatGPT Plus on Feb. 1, a subscription model available to users in the United States for $20 per month. ChatGPT Plus allows subscribers to access the AI chatbot even when there is high web traffic on the site, learn about new features before other users and obtain faster responses from the chatbot.
Turnbull said that while he has not studied transformers in depth, what sets ChatGPT apart from other transformer-based models is its ability to focus on specific elements or words of a given prompt and predict the answer based on the content from its database that is relevant to those elements. Turnbull said the AI chatbot uses Natural Language Processing (NLP) to communicate information to users in a human-like manner.
“[ChatGPT] did a couple of clever things,” Turnbull said. “It really uses attention from what are the keywords from earlier in a … sentence, that really matter [when] deciding what the next word is going to be.”
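The “attention” Turnbull describes can be sketched in a few lines of code. The toy example below (the word vectors and numbers are illustrative, not drawn from ChatGPT itself) shows the core idea of scaled dot-product attention from the 2017 transformer work: a query word scores every earlier word, and those scores become weights that decide how much each word matters to the prediction.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def attention(Q, K, V):
    # Scaled dot-product attention: each query scores every key,
    # and the resulting weights blend the corresponding values.
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    weights = softmax(scores)
    return weights @ V, weights

# Three toy "earlier words" as 2-D vectors (keys) with associated values.
K = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
V = np.array([[10.0, 0.0], [0.0, 10.0], [5.0, 5.0]])
# One query that resembles the first key, so the output
# leans toward the first value: that word "matters" most.
Q = np.array([[1.0, 0.0]])
out, w = attention(Q, K, V)
```

In a real transformer the queries, keys and values are learned from billions of words of text, but the mechanics are the same: the weights sum to one, and the words most similar to the query get the largest share.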
Antony Aumann, professor of philosophy at Northern Michigan University, now requires his students to handwrite and present initial versions of their essays in class and explain the changes they make to the draft as they edit.
Aumann said he also believes that colleges do not need to institute new policies for ChatGPT and can let professors individually decide how they interpret pre-existing anti-cheating and plagiarism policies of the institution in the context of ChatGPT.
“I think that there are uses of ChatGPT that are obviously problematic and uses that are acceptable,” Aumann said. “But obviously, there’s a wide range of uses in between those two extremes that are really hard to negotiate.”
John Barr, professor in the Ithaca College Department of Computer Science, said it is too early to predict if ChatGPT will be detrimental to academics, considering that it has not yet reached the stage where it can produce extremely sophisticated or advanced content. Barr said professors are able to distinguish between content generated by ChatGPT and work written by students.
“[ChatGPT] can solve very simple, small prompts that you might get in a classroom, but it’s not ready for prime time,” Barr said. “It’s not the monster people think it is. It’s not God, it’s not going to take over the world. We don’t want to get too excited about this.”
Barr said professors teaching humanities courses can ask students to write something on the first day of class, which will help faculty recognize when students misuse the tool because they will know what each student’s real writing style is like. He said he plans on working with ChatGPT in his classes and expects students to be transparent with him about it.
“So my plan is to say in my courses, ‘It’s OK to use ChatGPT to get a start in this code but you have to acknowledge it and you have to say what it gave you and then how you modified it,’” Barr said. “And it’s like any other tool, right? I can’t stop my students from using a spreadsheet. I can’t stop my students from using Google. And in fact, to be a successful professional, you have to be able to use those tools.”
First-year student Haris Li said that as a computer science major he feels that knowing how to use ChatGPT based on the kind of assignments that students might have is important.
“I feel like it’s the same as if you went on let’s say, GitHub or any type of question-asking forum dedicated to coding and then just straight up pull [or] copy the entire code and paste it into your assignment because in that case, it kind of feels like plagiarism because you didn’t do anything, you didn’t edit it,” Li said. “If it was only a subset of assignment, I will understand because it would be more like, ‘Oh, you got a little bit of help on this’ and, ‘Then you edited this part yourself,’ or, ‘You added on here and there.’”
David Weil, chief information officer in the Department of Information Technology (IT), said ChatGPT is not yet fully developed. Weil said that while the AI tool collects data from different sources to present the user with one comprehensive answer, the repository of data the AI draws on does not extend beyond 2021.
“ChatGPT is beta,” Weil said. “It’s still in development. I could say, you know, who will win the NFL playoff games this weekend? It says, ‘My knowledge cut-off is in 2021.’ So it can’t tell me about that.”
Weil said users should consider the fact that ChatGPT is getting information from sources that already exist online.
“So you really have to look at it and you have to question it and use your judgment,” Weil said. “The other problem with it is you don’t know where it’s getting this information from. So you don’t know if there are biases built into it. You don’t know what the bias is. Because you don’t know who’s providing that information.”
Turnbull said he does not feel the need to institute a collegewide policy to deal with the repercussions of the chatbot among students.
“It’ll be interesting to see how the new tool is adopted, but I think it does place more value on professors explaining why they’re having students do the work that we’re doing,” Turnbull said. “I think professors and students can have mutual respect and work with each other. I don’t necessarily feel like I need students to have some broad policy. … I’d rather have that conversation with the students in my classroom myself.”
Mead Loop, professor in the Department of Journalism and director of the Sports Media program, said ChatGPT is a new tool that the campus community will have to get used to. Loop said that as of now, professors should be able to address any concerns that might arise as a result of ChatGPT being used by students, as they would normally.
“Plagiarism is plagiarism regardless of the tools that you use,” Loop said. “[Using ChatGPT] would be another form of plagiarism, and we’ll deal with it when we do like anybody else would [under any other circumstance].”
The college also hosted a panel discussion about the AI from 12:10 to 1:00 p.m. Jan. 31 in Textor 102. A panel of six members from IT, the Provost’s Office, the Center for Faculty Excellence (CFE), the Student Governance Council (SGC) and the Department of Computer Science discussed their thoughts on the AI tool.
In the panel discussion, members of the CFE said they will host discussion sessions that will focus on what faculty can do this semester to adapt to ChatGPT in their classes.
Senior Austin Ruffino, Senate Chair of the SGC, said ChatGPT has been helping him communicate more effectively via email and put his thoughts across succinctly.
“I think [emails] are more effective and simpler, which is really nice,” Ruffino said. “So it’s really cool as a resource. … It’s exciting … it’s like Google but more interesting.”
Aumann said colleges must teach students how to use ChatGPT ethically and reasonably.
“I think that this technology is going to exist, whether we like it or not,” Aumann said. “And students are going to have access to it the second that they graduate, so you can be naive and just try as best you can to prevent students from using it but that’s just going to widen the gulf between school and the real world. I think we owe students, we have an obligation to train students to use this technology responsibly.”