Across Dartmouth, faculty are experimenting with new and inventive ways to bring artificial intelligence into the classroom and their research.
Students in an Environmental Studies class this term are using AI to collect and summarize information about wind energy.
Other ideas in development include AI coaches or tutors for Spanish and German courses that could help students practice the vocabulary and grammar they learn in class. Each AI model has access to a library of content chosen by the faculty member, which it uses to converse with students.
The initiatives are all part of a Dartmouth tradition of bringing innovation and new technology into the classroom.
“Back in the 1960s, in the early days of computing, Dartmouth was quick to adapt, teaching basic computer literacy to both faculty and students so they gained experience and an understanding about what computers could and couldn’t do,” says James “Jed” Dobson, an associate professor of English and creative writing and director of the Writing Program.
“Generative AI tools are the 21st century analogue, and we must create opportunities for everyone to learn how the technology works, think about it critically, and understand how and when to use AI tools effectively and responsibly,” says Dobson, the special advisor to the provost for artificial intelligence for 2024-25.
To that end, Dobson is working with a research assistant, Melania Torrey ’28, to integrate an AI literacy component into First-Year Seminars in the spring term.
“One of our goals is to make sure that students know how to engage with AI, contemplate the kind of role that it’s going to be playing in their academics and even in their lives beyond graduation,” says Torrey.
Reimagining teaching with AI
The conversation about how advancements in AI and machine learning might transform teaching and learning became supercharged when the generative AI chatbot ChatGPT was released, says Scott Pauls, director of the Dartmouth Center for the Advancement of Learning, which plays a pivotal role in helping faculty adapt to these emerging technologies.
Working in partnership with experts from Learning Design and Innovation, Dartmouth Libraries, Research Computing and Data Services, and Information, Technology, and Consulting, DCAL facilitates discussions about the possibilities AI unlocks as well as its limitations and shortcomings, organizes workshops where faculty gain hands-on experience with AI tools and brainstorm with peers, and offers resources and expertise that accelerate the development of ideas into innovation.
Among several initiatives designed to support faculty with integrating AI into their courses is a new grant program launched by DCAL and LDI. Members of the grant team will partner with faculty, providing resources and support to those looking to incorporate generative AI in future courses.
The nine grants, of up to $1,500 each, that have been awarded this term span a variety of approaches—from using AI to help students with course content to examining the technology itself, says Pauls, who is also a professor of mathematics.
By keeping track of the ever-expanding suite of AI tools, staff at DCAL and LDI are able to help faculty zero in on the most appropriate applications that fit their vision, says Erin DeSilva, associate provost of digital and online learning.
In December, DCAL and LDI organized a two-day faculty workshop, Teaching with GenAI. This was an opportunity for instructors to familiarize themselves with different AI tools, learn from peers who had already gained firsthand experience using these tools, and think through and experiment with the educational potential of generative AI in the context of their individual courses.
One of the key takeaways for participants, says DeSilva, was the idea that “what we often measure with student learning is what they produce, whether in a test or assignment, when maybe what’s more important is engaging with the process of how they’re learning and the way that they’re creating those products.”
The workshop gave faculty a chance to unmask those process pieces and identify the critical tasks that are meaningful to student learning versus the ones where generative AI can help them be more productive or creative, she says.
Getting their hands dirty also helped participants realize that these tools could save a lot of time and reduce barriers to entry for students.
“Probably the most transparent example of this is around coding,” says Pauls. In many STEM and social science courses, students must learn a certain amount of coding to do things like statistical analysis or visualization. Now, given a description of the task, an AI app can generate starter code that students refine until it works as needed, rather than writing it from scratch.
What’s more, says Elizabeth Wilson, professor of environmental studies, “These AI tools allow students to rapidly access large volumes of information” and broaden the type of question they might have asked in the past.
Wilson is teaching the Environmental Problem Analysis class this winter, where students study the offshore wind sector. “Because the field is so new, there isn’t a lot of easy, publicly available information about it,” she says.

So, for their final project, students will update Wikipedia pages related to offshore wind. But given the vast amount of material students must peruse to understand the sector, they are using AI tools as an aid.
To ensure that students appreciate both the power and limitations of the tools, assignments require students to summarize material without AI and then compare it to AI-generated summaries.
“My hope is that with these tools and their comparative analysis and critical analysis of these tools, they’ll be more comfortable in using them when they leave Dartmouth as well,” says Wilson, whose research examines the evolution of energy systems.
Recently, she began studying the emergence of the offshore wind sector globally. “There are about 1.2 million pages of documentation, just in the U.S. offshore wind sector alone,” says Wilson. She turned to AI to analyze the complex textual datasets to gain insight into this evolving industry.
Amid the excitement to explore the possibilities, there is an undercurrent of caution. Faculty and researchers continue to scrutinize and engage in discussions with peers and students about the shortcomings of AI tools as well, such as bias and hallucinations—false or misleading information that AI presents as facts.
Laying the groundwork for innovation
AI also has tremendous potential as a research tool, and many Dartmouth faculty are building innovative applications powered by it. One example is the AI model developed by the Polarization Research Lab to track the rhetoric and actions of all 535 members of the U.S. House of Representatives and Senate.
Researchers and engineers at ITC and Research Computing were quick to tool up. “The technology was beginning to blow up, but nobody really knew how that would change the way we worked. We joined forces to ascertain what we can offer as faculty and researchers on campus figure out how they want to work with these tools,” says Simon Stone, a research software engineer for high performance computing and AI in Research Computing.
They created a chat interface to a collection of large language models that allows users to interact with commercial models such as OpenAI’s GPT-4, as well as open-source models deployed on Dartmouth infrastructure.
“For people who are interested in using large language models in their research pipelines, it offers them a one-stop shop where they can easily compare different large language models and find the right tool for the job,” Stone says.
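A "one-stop shop" like the one Stone describes is often exposed through an OpenAI-compatible chat API, so the same client code can query commercial and locally hosted models alike. The sketch below assumes such an interface; the gateway URL and model names are hypothetical placeholders, not Dartmouth's actual service.

```python
# Minimal sketch of querying several models behind one chat gateway.
# GATEWAY_URL and the model names are invented placeholders.
import json
import urllib.request

GATEWAY_URL = "https://llm-gateway.example.edu/v1/chat/completions"


def build_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat-completion payload for one model."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.0,  # deterministic replies ease side-by-side comparison
    }


def ask_each(models: list[str], prompt: str, api_key: str) -> dict[str, str]:
    """Send the same prompt to each model and collect the replies."""
    replies = {}
    for model in models:
        req = urllib.request.Request(
            GATEWAY_URL,
            data=json.dumps(build_request(model, prompt)).encode(),
            headers={
                "Authorization": f"Bearer {api_key}",
                "Content-Type": "application/json",
            },
        )
        with urllib.request.urlopen(req) as resp:
            body = json.load(resp)
        replies[model] = body["choices"][0]["message"]["content"]
    return replies
```

Sending one prompt to every available model and reading the answers side by side is the "compare different large language models" workflow Stone describes, without each researcher standing up their own infrastructure.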
Along with a focus on teaching with AI, Dartmouth researchers are leveraging the power of AI in health care.
Research hubs, including the Center for Precision Health and Artificial Intelligence and the Center for Technology and Behavioral Health, foster collaboration and spur innovation in digital health technology.
MoodCapture, a first-of-its-kind smartphone app that uses AI and facial image processing to detect the onset of depression in real time, was developed by researchers from the Department of Computer Science and the Geisel School of Medicine.
Last year, a team led by Thomas Thesen, associate professor of medical education at Geisel School of Medicine, created the AI Patient Actor app, which helps second-year medical school students practice interacting with patients and sharpens their diagnostic and interpersonal skills.
Dartmouth faculty and researchers in the liberal arts are also looking to tap into AI to spur creativity, such as collaborating with computers to make music.
“There is a lot of research on creativity and an interest in creative ways of innovation that have connections to artificial intelligence,” Dobson says.
The interest is especially fitting, since AI itself traces its roots to a 1956 summer workshop at Dartmouth.
“Given our history, Dartmouth is committed to being on the cutting edge of innovation,” says Barbara Will, vice provost for academic and international affairs. “Infusing AI into our research and teaching is part of our commitment to advancing education and using technology to improve the world around us.”