Meet Carolina’s first GenAI faculty fellow
Dana Riger talks to instructors about using artificial intelligence in teaching and assessment.

Dana Riger is in her seventh year at Carolina, where she is a clinical associate professor in the UNC School of Education’s human development and family science program.
Since May 2024, she’s also served as the inaugural generative artificial intelligence faculty fellow at the Center for Faculty Excellence. This work, Riger said, is about translating “the technical into the practical” by helping Carolina faculty and graduate student instructors understand both the ethics of AI use in the classroom and effective strategies for either integrating AI or intentionally resisting it.
Riger was an early adopter of AI and explored how it affected her teaching, redesigning assessments and presenting at various CFE events in the fall of 2022 and early 2023. Her scholarly work focuses on how technology affects relationships and how it’s used in family therapy.
“They eventually invited me to step into this role because I was doing a lot of this work already and being invited to speak on this topic,” said Riger.
Here are five things to know about Riger’s work as GenAI fellow.
1. She wants to provide clarity on AI — not push an agenda.
Faculty should have agency and be confident in AI decision-making, Riger said. She cares about “empowering them to make informed choices,” whether that means integrating AI, avoiding it or choosing a “blending” approach.
2. She feels a responsibility to prepare students for the realities of AI.
“As a university, we’ve established that digital literacy is a learning outcome we want students to be proficient in,” Riger said. “When I think about my responsibility to my students, I think about preparing them to feel confident and competent in whatever professional roles they take on.”
Most faculty she’s talked with agree that students need to be ready to interact with AI in some capacity professionally. “That doesn’t necessarily mean using AI. Sometimes it means knowing what AI is capable of so you can decide not to use it,” Riger said. “Avoidance is also a strategy, but it needs to be an informed one.”
3. She tailors her work to faculty needs.
In her 16 months as GenAI fellow, Riger has led 35 custom workshops and plans to lead 10 more this fall. “They’ve spanned almost every school and college, working with faculty, doctoral instructors and even TAs,” she said. She also helped lead multiday CFE institutes last fall and led two three-day AI assessment institutes this past summer.
“I bring in research on AI use within faculty’s specific fields and share discipline-specific examples of AI output,” Riger said. “Most importantly, when a faculty member requests a custom session, I ask them to send me a specific assignment or assessment where they’re struggling with issues of AI misuse or overuse.”
She’ll then redesign the assessment and use it as a case study on how AI could be integrated or resisted, depending on the desired learning outcome.
4. Faculty want to focus on what only humans can do.
Faculty members want to know how AI might help them make space for aspects of teaching, mentoring and research that are uniquely human. They ask her, “How can I streamline administrative tasks?” and “How might I conduct research more efficiently using AI?”
They are also interested in ways AI can enhance human teaching and learning. “It can help them design more engaging, creative classroom activities that inspire students,” she said.
5. She emphasizes ethics, adaptability and grace.
Riger acknowledges how quickly AI changes and how impossible it is to keep assignments “AI-resistant.” That’s why having an ethical framework is important.
“Policies and regulations will change, but values like fairness, transparency and student trust must remain constant,” she said. “This is new terrain for all of us. Giving ourselves some grace as we learn together is just as important as keeping pace with technology.”