As technology like ChatGPT evolves, considering the implications for Northwestern

Northwestern experts weigh in on generative artificial intelligence, what is working and what might come next

Northwestern community members gathered recently for the winter faculty webinar to discuss generative artificial intelligence — specifically ChatGPT — its growing popularity and its potential impact on University culture.

Hosted in early March by the Office of the Provost, the webinar featured several guest speakers and attracted about 200 participants.

“Generative AI has become increasingly accessible, and it will impact teaching and learning in numerous ways,” said Provost Kathleen Hagerty in her introduction.

To help Northwestern address the questions, concerns and opportunities associated with the emergence of these technologies, Hagerty announced the creation of the Generative AI Advisory Committee, a multidisciplinary group of experts who will advise on an institutional approach to AI and coordinate best practices across Northwestern schools and units.

The Office of the Provost has created a list of tools, upcoming events and answers to FAQs.

During the winter faculty webinar, several members of the committee offered views on ChatGPT and discussed its implications for academics, faculty resources and institutional policy.

Here are four takeaways:

“The notion that I can take a data set and make it understandable to everybody in very different ways is unbelievably exciting.”

– Kristian Hammond, professor of computer science

We live in a world, Hammond said, where we’re surrounded by data, numbers and symbols that are impenetrable to most people. But now there is the ability to take a data set, turn it into raw facts, hand it to a language model and tell it to generate a story tailored to the needs of a particular community. 

“AI can help speed up generating certain types of communications, but it can't replace lawyers.”

– Sarah Lawsky, professor of law

In recent months, AI tools have passed law exams at prestigious universities across the U.S. What does this mean for law students? Think of generative AI as a co-pilot for attorneys: routine tasks such as drafting questions for a deposition, producing a first draft of a contract or summarizing a case could reasonably be handed to ChatGPT. Still, there needs to be a human involved, Lawsky said. For a lawyer, “80 percent correct is still very wrong.”

“Writing is a lot more than just a product.”

– Elizabeth Lenaghan, director and associate professor of instruction in the Cook Family Writing Program; and assistant director of The Writing Place 

Lenaghan spoke of a human approach to writing, one that requires a sense of feel to fully appreciate the quality of a piece. Fundamentally, AI cannot tap into this essence because it lacks the faculties required to understand its own product. Still, she said she is confident AI can be useful in streamlining the writing process, helping not only to improve grammar and syntax but also to explain why a specific word choice is right or wrong, thus serving as a teacher. To that end, she recommended giving generative AI more than one task at a time, since doing so may encourage a user to think about how its feedback might be incorporated into the creative process.

“This presents a unique opportunity for co-discovery with our students, a chance to underscore the value of lifelong learning.”

– Jennifer Keys, senior director of the Searle Center

Addressing how faculty can engage students in conversation on this topic, Keys noted that students need critical thinking skills to be ready for the workplace. The next best thing faculty can do, she said, is help students understand the limits of generative AI so they can make informed decisions about exactly when these tools add value.