ChatGPT challenges university perspectives on AI technology
ChatGPT has stormed college campuses across the nation in recent months, disrupting classrooms much as earlier technologies did, such as the visual algebra solver PhotoMath and the community code-sharing website Stack Overflow.
At SEMO, professors and staff have been reporting a high usage of ChatGPT, an artificial intelligence (AI) language processing tool, mostly for essay or long-form text assignments. Currently, the consequence for using ChatGPT is at the professor’s discretion, but a workgroup made up of professors and university staff is working to set guidelines regarding this new technology.
Director of Academic Technologies Floyd Lockhart describes ChatGPT as software that generates predictive text based on information in a database. He said while it is a useful technology, it can’t do the same tasks a human can.
“It’s much better at doing well-defined tasks like writing programs. Don’t expect it to do works of Shakespeare, but the better defined task you give it, the better result you can sort of get out of it,” Lockhart said. “Remember, all of these [artificial intelligences] don’t think, they just take in large bodies of text.”
Assistant professor of computer science Reshmi Mitra works closely with ChatGPT and AI detection tools to help the university better understand the software.
“[ChatGPT] depends on a lot of good-quality training data,” Mitra said. “When you ask questions related to text, it will give you a good enough response, but the [response] depends on two things: the type of questions you’re asking and whether the answers to those questions are available in the training data.”
According to OpenAI, the company that created ChatGPT, the software uses data from the internet, including human-written works and conversations, to learn how to respond to questions. Currently, ChatGPT’s training data extends only through September 2021, and it may provide inaccurate information about events or topics after that date.
The Center for Writing Excellence has reported numerous submissions in which staff detected that students used ChatGPT either to help write or to entirely write an essay, according to Writing Center coordinator Jennifer Weiss.
Lockhart said while ChatGPT may appear to give the user a working solution to a math or text-based problem, it may actually pad common solution patterns with filler words to produce a superficial answer. He said ChatGPT picks the answer with the highest statistical probability of being right and fills in the rest with what is commonly known about the topic.
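Lockhart’s description, always taking the statistically most likely continuation, is the core idea behind how these models generate text. A minimal sketch of that behavior, using an invented toy probability table rather than real model weights, looks like this:

```python
# Toy illustration of greedy next-token generation: at each step, the word
# with the highest assigned probability is appended to the output.
# TOY_PROBS is invented for illustration only; a real model computes these
# probabilities from billions of learned parameters, not a fixed table.
TOY_PROBS = {
    "the": {"cat": 0.5, "dog": 0.3, "answer": 0.2},
    "cat": {"sat": 0.6, "ran": 0.4},
    "dog": {"barked": 0.7, "slept": 0.3},
}

def generate(start: str, max_words: int = 5) -> str:
    words = [start]
    for _ in range(max_words):
        choices = TOY_PROBS.get(words[-1], {})
        if not choices:
            break
        # Greedy decoding: always take the highest-probability next word.
        words.append(max(choices, key=choices.get))
    return " ".join(words)

print(generate("the"))  # "the cat sat"
```

Because each word is chosen only for its statistical likelihood, a fluent-sounding sentence can emerge with no check on whether it is true, which is exactly how plausible but fabricated answers arise.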
“One of the issues with ChatGPT, if you tell it to generate references [for you], it won’t actually go get you a good bibliography that you could go look up in the library; it will generate references that look like good references but may be [from] journals that don’t even exist, written by people who have never existed,” Lockhart said.
Lockhart said ChatGPT can even create false quotes that sound perfect to the casual reader.
Mitra noted many computer science students have been using technology similar to ChatGPT for years through Stack Overflow, a website that allows users to post code they have written or respond to other users’ posts asking for help or clarification with a coding problem.
Weiss said she thinks the university needs to tackle the issue of ChatGPT now rather than ignore it.
“What we’re trying to figure out as a campus community is, how do we live with this tool? We know it’s there, and it’s obvious students are using it,” Weiss said. “We are exploring it. We’re looking for legitimate educational purposes.”
Weiss said the Center for Writing Excellence is using detection tools to identify essays and papers written with ChatGPT, but noted they are never able to confirm whether a student has used the technology.
“The thing with ChatGPT is, everything that it generates is completely unique. It will never generate the same thing twice,” Weiss said. “You can never get it to regenerate what it generated for the student.”
The Center for Writing Excellence is a support center meant to help students, Weiss said. She said its goal isn’t to police students’ use of ChatGPT, but to help them get back on the right track.
Lockhart said the goal of the newly-formed ChatGPT workgroup is to rethink assessments of students’ work, and to try to understand when ChatGPT can be a useful tool and when it can be considered cheating. He said the group is also working to standardize consequences for those who misuse the technology.
Lockhart noted one of the main focuses of the group is to create a syllabus statement that can be used by professors to indicate their stance on ChatGPT in relation to their course work.
“We have to figure out where it’s useful,” Lockhart said. “We have to figure out what a good use is and what a bad use is, and make sure that is morally, ethically and legally clear to the students, so that they can be good leaders in the future.”
Lockhart said a big challenge for SEMO with the rise of AI technology is continuing to make sure critical thinking skills and creativity are not lost to the use of artificial intelligence.
“What [AI] changes more than anything is assessment, to make sure that we have thinking, creative leaders as a product of their bachelor’s education,” Lockhart said.
Both Lockhart and Weiss noted AI cannot match critical thinking, and if professors build assignments around that skill, students will not be able to rely on ChatGPT to complete them.
“[ChatGPT] or not, this problem is on us. It is going to grow bigger, but we have to find creative ways to solve it,” Mitra said.
Mitra said the workgroup is creating workshops to help faculty understand how they can use ChatGPT in the classroom. She said the workgroup is also investigating tools that can help recognize AI-generated text and providing those resources to university staff and faculty.
The Arrow asked ChatGPT to rewrite the article above. This was the result:
ChatGPT, an artificial intelligence (AI) language processing tool, has been increasingly used in college classrooms. This technology generates predictive text based on information in a database and has been mostly used for long-form text assignments, according to professors and staff at Southeast Missouri State University (SEMO). While ChatGPT can be useful for well-defined tasks such as writing programs, it has limitations, as it cannot replace human intelligence in tasks such as writing literary works.
SEMO is currently working on setting guidelines regarding the use of ChatGPT. The university has established a workgroup, which is composed of professors and university staff, to define the parameters of this new technology. The consequence for using ChatGPT is currently at the professor’s discretion. However, the workgroup is working to establish standards to ensure that the use of this technology is appropriately and ethically incorporated into the learning process.
Floyd Lockhart, Director of Academic Technologies at SEMO, noted that ChatGPT relies on good-quality training data to function effectively. The software learns to respond to questions using data from the internet, including human-written works and conversations. However, the software’s database is only updated until September 2021, which means that it may provide inaccurate information about events or topics taking place after this date.
The Center for Writing has reported that students have used ChatGPT to either assist or completely write an essay, which Writing Center Coordinator Jennifer Weiss said is a cause for concern. While ChatGPT may appear to provide a working solution to a math or text-based problem, it may use filler words it finds in common solutions to provide a very basic answer. It selects answers with the highest statistical probability of being right and fills in the rest of what is commonly known about the topic. Lockhart cautioned that ChatGPT can even create false quotes that sound perfect to the casual reader.
The workgroup at SEMO is rethinking assessments of student work and trying to understand when ChatGPT can be a useful tool and when it can be considered cheating. Lockhart noted that one of the group’s main focuses is to create a syllabus statement that professors can use to indicate their stance on ChatGPT in relation to their course work. Lockhart stated that the group aims to make it morally, ethically, and legally clear to students when it is acceptable to use ChatGPT. Additionally, the group is standardizing consequences for students who misuse the technology.
Weiss noted that the Center for Writing Excellence at SEMO is using detection tools to identify essays and papers written with ChatGPT. However, they are never able to confirm if a student has used the technology because everything ChatGPT generates is entirely unique. Their primary goal is to help students get back on track and use ChatGPT for legitimate educational purposes rather than police them.
Lockhart emphasized that the rise of AI technology poses a challenge for SEMO to ensure that critical thinking skills and creativity are not lost to the use of artificial intelligence. Lockhart noted that AI technology primarily affects assessment and that the goal is to ensure that critical thinking and creativity are essential components of a bachelor’s education. He stated that professors can avoid the use of ChatGPT on specific assignments by basing their assignments on critical thinking skills.
In conclusion, SEMO is facing the challenge of incorporating ChatGPT technology into the learning process appropriately and ethically. While ChatGPT can be a useful tool, it has limitations, and it cannot replace human intelligence in tasks that require critical thinking and creativity. The workgroup at SEMO is defining the parameters of ChatGPT, rethinking assessments of student work, and ensuring that critical thinking and creativity remain essential components of a bachelor’s education.