With the official launch of ChatGPT in November 2022, students and faculty across the country are being presented with major questions about how artificial intelligence (AI) should be used in the classroom, what constitutes “cheating” and what this new technology means for the future of higher education.
Many colleges across the country, including Ithaca College, do not currently have official college-wide AI policies. This leaves faculty members to make their own decisions about whether, and how, students should be allowed to use AI.
According to an October 2023 Forbes survey of 500 educators across all school levels in the United States, 60% are using AI in the classroom and 55% believe AI has improved educational outcomes.
Starting in January 2023, Jenna Linskens, director of the Center for Instructional Design and Educational Technology, partnered with the Center for Faculty Excellence to hold open conversations with the campus community.
“Some of the things that we heard is that faculty really needed some guidance around a syllabus statement,” Linskens said. “We pulled together and curated a list of syllabus statements from colleges all around the world and shared them with faculty so they can put those statements in.”
Linskens said she knows of professors teaching anthropology, biology and screenwriting who ban AI use in the classroom. However, some professors, like Diane Gayeski, professor in the Department of Strategic Communication, are carefully incorporating it into their curricula.
Gayeski said she started requiring AI use in her classes during the Spring 2023 semester with platforms such as ChatGPT and scite.AI, a tool designed to make research easier and more efficient.
“My students are going to be expected to use AI in future jobs,” Gayeski said. “It’s similar to using any other tools like [Google] Spreadsheets or PowerPoint. … It’s also an emerging technology, which is an area I teach. AI is very much on the horizon.”
Sophomore Jaimie-Kae Smith has noticed an uptick in the number of professors outlining AI policies in their syllabi this semester. In a Power of Injustice class Smith took during the Fall 2023 semester, she said, her professor encouraged students to use AI to compare and contrast their own essays with work that AI produced on the same topic.
“We had to go through and nitpick the little discrepancies,” Smith said. “[For example], the AI version started mentioning characters that didn’t exist in the book. There were parts that were consistent but others didn’t line up.”
Senior Isabella Lambert is currently taking a sports analytics class that covers how AI technology can improve injury prediction and data analytics in sports. An analysis by Acceleration Economy discusses how predictive data analysis is currently used by the National Football League, reportedly lowering lower-extremity injuries by 26%.
“Having some knowledge of any type of AI is huge heading into the future of the sports industry,” Lambert said. “If you know how to properly cite [AI], you should be able to use it like any other source.”
Smith said that she sometimes uses AI to help find sources of information and create a starting point that alleviates stress.
“But that’s where I try to draw the line with it,” Smith said. “I pride myself in having things in my work that are unique to myself and my writing style. … The AI is not going to be able to put my personal spin on how I would do something.”
Some of the major concerns about AI use in the classroom stem from the fear of students generating papers without putting any effort into writing them themselves. Smith said that academic integrity comes into question with other students who may see AI as “the easy way out.”
“I think we’re all here with the intention of getting an education,” Smith said. “Using AI to bypass that … then what’s the point? Just to get a grade, but what can you say that you learned at the end of the day?”
Gayeski said students have been finding ways to cheat for the past hundred years, one example being paying somebody else to write their papers.
“Bottom line, if people want to cheat, they will cheat,” Gayeski said. “There are lots of ways around that, specifically that I require students in exams or papers to reflect very specifically on the readings or on lectures and discussions in the classroom.”
While many of the concerns focus on how students will use AI, Gayeski said students can turn that same question back on their professors.
“I think it’s going to be used on both sides of the teaching experience,” Gayeski said. “I think students wouldn’t like it if they thought professors just used AI to create all of the content and grade the papers. Professors wouldn’t expect that students use AI to perform all their work. I think it’s going to be a matter of negotiation.”
The current Academic Integrity Policy’s section on possible academic misconduct does not specifically mention AI use. Luke Keller, professor of physics and astronomy and chair of the policy subcommittee of the IC Academic Policies Committee (APC), said there are ongoing discussions about how to include AI in the existing list of examples.
“Our committee members agree that the ultimate decisions and definitions for use of AI in student academic work should be up to the instructor,” Keller said via email. “It’s important to note that the APC revision to the Standards of Academic Conduct simply includes the use of generative AI in an existing list. … We need a group of faculty, students and staff to work out the details of how to implement this policy and give guidance.”
Keller said it could take a few meetings of the Faculty Council to finalize the policy, with the possibility of sending it back to the APC for further discussion.
Moving forward, Linskens and her department are working on two new developments: an AI literacy course for students, hosted in Canvas, and a faculty resource course that will guide professors in modifying their assignments and assessments with emerging technology in mind.
“That’s really a big piece of what the AI disruption is,” Linskens said. “We are seeing that faculty are beginning to think about changing the way they’re creating assignments for students or assessing student learning in order to get a more authentic assessment.”
Following her discussions with people across campus, Linskens said it’s important for faculty and students to have open, one-on-one conversations about how each is using AI.
“Ultimately, it is a tool that helps anyone be more productive, more effective in their writing, more cohesive in their work when used properly,” Linskens said. “So I encourage the conversation to remain open.”