
Surveys mark latest step in Carolina’s AI approach

Analysis of the data will inform the University’s intentional implementation of artificial intelligence campuswide.

Graphic of text boxes and metric graphs. (Graphic by Gillie Sibrian/UNC-Chapel Hill)

As generative artificial intelligence use grows, UNC-Chapel Hill’s stated goal is to “perform innovative research and prepare the workforce to shape a better future through the intentional implementation of artificial intelligence.”

The Provost’s AI Committee, with 50 members from all academic units, is one driver of campus AI implementation. That group’s metrics subcommittee gathered valuable baseline data about the perceptions and use of AI by surveying faculty and students. Staff were invited by email to complete a similar survey. The surveys and assessment of the results are part of a campuswide, intentional effort to enable faculty, staff and students to use AI ethically and intelligently.

“Metrics allow us to avoid the groupthink that can be common when emerging technologies hit the market, contributing to a rush to act without a full understanding. We hope survey results can support careful decision-making on AI,” said Kate Hash, metrics subcommittee chair and assistant vice chancellor for customer experience and engagement in Information Technology Services.

Other drivers are two committees commissioned by Chancellor Lee H. Roberts and the deans of all schools, the Office of the Vice Chancellor for Research, and an AI acceleration program.

“In unison with other campus efforts, we’re trying to understand where faculty are with AI, then use the findings to intentionally go about AI,” said Mark McNeilly, chair of the provost’s committee and professor of the practice of marketing and organizational behavior at UNC Kenan-Flagler Business School.

The faculty survey found that 32% of faculty use AI daily or weekly, matching student use, and that:

  • 94% say graduates must learn to use AI effectively, ethically and critically.
  • 75% feel responsible for teaching ethical AI use.
  • More than 80% are ready and already talking about AI with students.
  • 69% think AI benefits teaching, with clinical faculty (89%) the most optimistic.

Faculty see AI helping with productivity, and they want consistent, department-wide AI frameworks along with more training, tools and time, needs felt most strongly by clinical and nontenured faculty.

The survey highlighted concerns about:

  • Honor code violations
  • The need to change grading (80%) and redesign assessments (69%)
  • Replacement of critical thinking and hindrance of meaningful learning
  • False outputs known as hallucinations, as well as privacy concerns and environmental effects
  • Adoption barriers like skepticism and philosophical resistance
  • Risk of a “figure-it-out-yourself” approach

“The survey shows what the concerns are, so we can address them in an integrated cross-campus way,” said McNeilly.

Although Carolina is ahead of other schools in addressing AI, 25% of faculty have never used it, McNeilly noted.

To foster an AI community of students, faculty, staff and industry partners, the University launched an AI community hub, which currently has 500 members.

The committee also created AI@UNC to support and inform campus through:

  • Metrics on AI use by faculty, graduate students and undergraduate students
  • Guidance for faculty on teaching, research and operations
  • Information on AI acceleration programs and fellowships for faculty, staff and students, with funding ranging from $5,000 to $100,000
  • Access to AI tools such as Microsoft Copilot and Adobe Firefly, along with other resources
  • Newsletters, event information, new features and updates

Work in progress includes:

“This is a thoughtful and intentional approach, and Carolina is a leader in doing so,” McNeilly said. “The committee wants campus AI to have governance and to thrive through innovation and collaboration.”