
How ChatGPT changes higher education

From academic integrity to developing better AI tools, U leadership explores what ChatGPT means for the university experience.

Your first experience with the artificial intelligence chatbot ChatGPT is probably tinged with disbelief. Can it really do whatever I ask it to? If I say “Summarize 'Hamlet' in the form of Ed Sheeran lyrics,” can it really digest the plot of "Hamlet" into a rhyming song?

And then you see this appear, almost instantly.

“In Denmark, things ain't right,
The king is dead, the prince is in a fight.
His uncle's on the throne, but it's all a lie,
Hamlet wants to set things right, before he dies.

Oh, he's got a plan, but it's all so strange,
He's acting mad, but it's all part of the game.
He's got to catch the guilty, before they catch him too,
It's a tale of love and loss, and what one man will do.”

It certainly looks like magic, but it’s the result of years of engineering and training to create a tool that can mimic the style and structure of human language. The power of such a transformative tool has prompted a wave of news stories and opinions, with headlines ranging from “Don’t Ban ChatGPT in Schools. Teach With It” to “The End of High-School English.”

AI tools like ChatGPT pose challenges to teaching and learning at all schools, including the University of Utah, but they also present tremendous opportunities. U administrators see the introduction of these tools as a way to re-explore the meaning of fundamental concepts of higher education, such as academic integrity, the learning process and the creation and distribution of knowledge.


Nothing new

ChatGPT is a type of artificial intelligence system called a large language model. It’s a fine-tuned version of the GPT-3 large language model, says Taylor Sparks, associate professor of materials science and engineering. The first GPT model was released in 2018. GPT-3, Sparks says, uses deep learning to extract patterns in bodies of text. “This model can take an input prompt and then generate text output that would be a probable continuation of the prompt.”
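The idea Sparks describes, learning patterns from text and then generating a probable continuation of a prompt, can be illustrated with a toy model. The sketch below is not GPT's actual architecture (real LLMs use transformer networks over subword tokens); it is a simple word-level bigram model that just counts which word tends to follow which, and all names and the tiny corpus are invented for illustration.

```python
import random
from collections import defaultdict

def train_bigrams(text):
    """Count word-to-next-word transitions in the training text."""
    words = text.split()
    follows = defaultdict(list)
    for prev, nxt in zip(words, words[1:]):
        follows[prev].append(nxt)
    return follows

def continue_prompt(follows, prompt, length=5, seed=0):
    """Extend the prompt by repeatedly sampling an observed next word."""
    rng = random.Random(seed)
    out = prompt.split()
    for _ in range(length):
        candidates = follows.get(out[-1])
        if not candidates:  # no observed continuation; stop early
            break
        out.append(rng.choice(candidates))
    return " ".join(out)

corpus = "the prince is in a fight the prince wants to set things right"
model = train_bigrams(corpus)
print(continue_prompt(model, "the prince"))
```

The output always begins with the prompt itself, which is the key point: the model does not look anything up, it only extends text in ways its training data makes probable.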

ChatGPT was then trained on a specific dataset of question-and-answer dialogue.

“This was considered supervised learning because, in the training data, both questions and answers were human-generated,” Sparks says. Further refinement helped the model learn the most appropriate responses.

The result is a model that can reproduce the style and content of its training data. “However, the output is limited to what they have learned from the training data and they cannot create completely new ideas or concepts,” Sparks says.
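The "supervised" training Sparks describes can be sketched concretely: each training example pairs a human-written question (the model's input) with a human-written answer (the target the model learns to reproduce). The dialogue template and sample pairs below are illustrative assumptions, not OpenAI's actual training format.

```python
def to_training_example(question, answer):
    """Format one human-written Q&A pair as an (input, target) example."""
    prompt = f"User: {question}\nAssistant:"
    return prompt, f" {answer}"

# Hypothetical human-generated pairs, standing in for the real dataset.
qa_pairs = [
    ("What is a large language model?",
     "A model trained to predict probable continuations of text."),
    ("Who wrote Hamlet?", "William Shakespeare."),
]

dataset = [to_training_example(q, a) for q, a in qa_pairs]
for prompt, target in dataset:
    print(repr(prompt), "->", repr(target))
```

Because both sides of every example are human-generated, the model is rewarded for imitating answers people actually wrote, which is what makes the process supervised rather than self-supervised next-word prediction alone.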

AI and faculty

As ChatGPT has gained widespread attention, faculty at the U have looked to the Martha Bradley Evans Center for Teaching Excellence for guidance. The center includes information and resources about ChatGPT and other AI tools in newsletters for teaching faculty, says Anne Cook, director of the center.

“These include tips about assignments that can prevent or discuss the use of AI as well as assignments that use AI productively,” Cook says. “We are also creating a listserv so faculty who encounter new uses or new challenges around this topic can keep the conversation going and help inform Center for Teaching Excellence approaches.” A faculty workshop on the impact of AI tools in higher education is in the works, she says.

In the meantime, the Center for Teaching Excellence will host a panel discussion led by Chong Oh, professor of information systems, titled “Unleashing the potential of ChatGPT: An Information session on AI Generative Tools in Higher Education” in the Marriott Library Faculty Center on Friday, Feb. 3 from 11:30 a.m.-1 p.m.

A tool that can rapidly and coherently produce a competent essay on any topic (e.g., feminism in "King Lear") naturally raises concerns about students passing off work generated by ChatGPT as their own. Plagiarism is hardly new to universities, but work plagiarized from ChatGPT may be harder to detect. In the past, instructors could use tools like Turnitin, accessible through Canvas, to screen for plagiarized content. It doesn't work for ChatGPT.

“I know,” Cook says, “because I tried it out.”

A workaround, Cook says, could be to assign students writing projects that require them to draw on and incorporate personal experiences. Requiring citations (since ChatGPT doesn't include citations or other ways to trace the sources of its information) or incorporating class discussion could also be effective. And detectors like Crossplag and GPTZero have emerged, identifying (although admittedly not perfectly) AI-generated text.

“Although I think ChatGPT is alarmingly good,” Cook says, “it presents an opportunity for faculty to up their game in thinking about how to create quality assessments that get students to really connect and interact with course material.”

ChatGPT itself has some “thoughts” about its role in universities.

“ChatGPT can be used in a variety of ways in a university setting, including as a tool for teaching, research and administrative tasks,” the chatbot writes. “For example, ChatGPT could be used to generate practice problems or quizzes for students, assist with research by generating summaries or analyzing data or help with administrative tasks such as answering frequently asked questions. The specific role of ChatGPT would depend on how it is implemented and used by the university.”
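One of the uses the chatbot names, generating practice problems or quizzes, amounts in practice to assembling a careful instruction for the model. A minimal sketch of that step follows; the function name and prompt wording are invented for illustration, actually sending the text to ChatGPT would require an API client and key (not shown), and any generated questions would still need instructor review.

```python
def build_quiz_prompt(topic, n_questions=5, level="introductory"):
    """Assemble an instruction asking a chatbot for quiz questions on a topic."""
    return (
        f"Write {n_questions} {level}-level multiple-choice questions "
        f"about {topic}. For each question, give four options and mark "
        f"the correct answer."
    )

print(build_quiz_prompt("the causes of the French Revolution", n_questions=3))
```

The value of a wrapper like this is consistency: every instructor request arrives in the same reviewed format, rather than as ad hoc prompts of varying quality.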

AI and students' learning experiences

“There are myriad possibilities for using and examining these tools related to educational efforts and inquiry," says T. Chase Hagood, senior associate vice president for Academic Affairs and dean of the Office of Undergraduate Studies. "As with any rapidly developing tool and adoption, we have the opportunity to consider how technology can positively impact higher education. ChatGPT, among other educational technologies, is an opportunity for U faculty and students to remain on the leading edge of teaching, learning and research.”  Hagood has convened a working group of experts from across the U to explore new ways of thinking about, producing and consuming information related to ChatGPT and academic integrity.

"The integration of advanced language models such as ChatGPT in education has the potential to revolutionize teaching and assessment methods, as well as enhance student writing skills,” says Sparks. “Similar to how programmable calculators and mathematical modeling software have expanded students' mathematical abilities, ChatGPT can serve as a powerful tool for students to improve their writing and coding. Furthermore, the ability to responsibly use ChatGPT as a writing and coding aid is fast becoming a valuable skill sought after by employers."

Deborah Keyek-Franssen, associate vice president and dean for University Connected Learning, says that students should ask themselves what it means to learn and what they are here to learn. The introduction of new technology, she says, is an opportunity to re-examine and redefine how teaching and learning work.

“One of those skills is the ability to communicate and the ability to write and argue,” she says. "The ability to persuade, the ability to analyze, the ability to synthesize. And some of those skills take trial and error and practice.” Using AI tools to circumvent the learning process, she says, undermines students' educational experience.

Academic integrity is at the forefront of the discussion for Jason Ramirez, dean of students.

“What I am hearing mostly from students is the need for clarification regarding if or when they can use it,” he says, adding that ChatGPT poses clear risks for academic integrity violations, and policies regarding its use are still being developed. “Where I think we need clear and consistent communication is when it is okay for students to utilize it, and what happens when they use it inappropriately.”

AI and knowledge

The opportunities presented by ChatGPT and other AI language models go far beyond the classroom, says Hollis Robbins, dean of the College of Humanities. They go straight to the core purpose of higher education: to generate and transmit knowledge.

“What does the large language model know?” she says. “It scrapes the web. If you look at the inputs of large language models, it is not our archives and our special collections in the Marriott Library. It is not the academic papers that our top faculty have been producing for a century and a half. It does not know what we know.”

Robbins is exploring how U scholars can partner with AI developers to improve the incorporation of knowledge into language models. For example, a historian could help develop an AI tool with an understanding of the methods used in historical research and analysis, “a kind of ChatGPT for historians,” Robbins says. “So if somebody went on this chat and said, ‘What were the four causes of the French Revolution?’ the chat would say back to them, ‘That's the wrong kind of question to ask about history.’”

She and the department chairs in the College of Humanities are already working to partner with developers and gain access to upcoming AI models. Working to improve the next generation of AI language models, Robbins says, is how the U can stay ahead of the AI conversation.

“I say we seize the opportunity.”

Click here for resources about how to address AI tools in the classroom.