Wade Maki has been using artificial intelligence for the past year. The ponytailed philosophy lecturer at the University of North Carolina Greensboro has experimented with the technology's ability to summarize talking points from difficult readings or identify missing ideas from lecture notes. As chair of the UNC System Faculty Assembly, he employed AI-generated images of cats in a boxing ring for a presentation on diversity, equity, and inclusion before the Board of Governors.

"You don't even have to go find a meme," Maki said. "You can create it. Describe it to the AI, and there it is. That's useful for everybody."
In Maki's classroom, students have permission to use AI to brainstorm ideas, conduct research, and even proofread their papers, so long as they don't use it to write their essays and exams. His principle is best summed up as: "You can work with AI, but AI shouldn't be doing the work for you."
Faculty and students across the country have had to adapt as generative artificial intelligence tools have become increasingly ubiquitous. The technology has changed how students take notes, conduct research, and summarize readings. It's also changing the college admissions process and recruitment efforts, as well as how teachers generate course content and grade assignments.
Some experts worry that heavy reliance on generative AI will strip away students' critical thinking skills and allow them to take shortcuts instead of learning new material. "I hear anecdotes about students that are completing assignments that are AI-generated," UNC System President Peter Hans said last month at the first event in The Assembly's Newsmakers Series. "And faculty are using the tool more often as well. It's one bot talking to another bot."
But university leaders also acknowledge that students have to learn how to use the technology. "There's no way we're going to get around it," North Carolina Central University Chancellor Karrie Dixon said at the same event.

Higher education institutions across North Carolina are scrambling to implement classroom guidelines to keep pace with the rapidly evolving technology. Duke University unveiled its own AI platform this summer, DukeGPT. Public universities have crafted example syllabi to help professors set guardrails for chatbots, machine learning tools, and more. The UNC System is working on its own recommendations.
Maki serves as a member of the AI Oversight Committee at UNCG and oversees a UNC System AI task force. He said applying a one-size-fits-all policy is not sustainable. He thinks schools should establish best practices and offer plenty of wiggle room for faculty to customize their AI policies.
"The Wild West is how we always start with these things, and then it gets less wild, and the roads get paved, and there's more law and order," Maki said.
Taming the Wild West
The current conversation about the line between study aid and cheating is in some ways reminiscent of past debates about tools like CliffsNotes or using the Internet for research. Maki, 51, said he had professors who "didn't like calculators because you should do all that math in your head."
None of the five North Carolina universities surveyed by The Assembly bans AI in all classrooms. Universities have generally been reluctant to establish campuswide policies regulating generative AI, leaving it to faculty members' discretion. But many institutions have established recommendations and best practices that faculty can adopt when drafting their course syllabi.

UNC Charlotte has outlined two suggested syllabus policies that faculty can adapt. The first option allows AI use in all assignments, so long as students disclose how the technology is used. The second limits AI use to designated assignments. Use of AI that is not authorized by the instructor constitutes a violation of academic integrity, the guideline states.
"We wanted to put guidance around best practices, knowing that the cat's out of the bag in a lot of ways. This is not going away," said Jules Keith-Le, academic technology support analyst with the Office of OneIT, UNC Charlotte's IT services office.
Similarly, at North Carolina State University, sample statements range from most restrictive, which bans AI entirely, to least restrictive, which incorporates the technology into the coursework. A "moderately restrictive" option allows students to use chatbots, text generators, and paraphrasers when seeking guidance on assignments, but not for other purposes.
Wake Forest University offers an AI decision tree to help faculty evaluate how knowledge and skills are gained and assessed in a course, and whether it is important that students develop them independently of AI assistance. The tree then guides faculty to syllabus options based on their answers.
Some faculty across the state said the university suggestions for classroom policies don't help them with one big concern: enforcement.
Scott Simkins, associate professor in the Department of Economics at North Carolina A&T State University, said that despite not wanting to spend time policing AI use in his classroom, he finds it inevitable. This has prompted him to adapt his courses to "focus as much on the process as the product of the learning," he said, through things like team-based assignments and establishing a "learning community compact," an agreement outlining shared goals and expectations.
He said a campuswide guideline for the emerging technology would help foster a culture of responsible AI use among students, faculty, and administrators.
But Sarah Egan Warren, assistant teaching professor at NC State's Institute for Advanced Analytics and a digital education faculty fellow, said that's impractical because what works today may not work in the future. For example, Egan Warren pointed to Google's AI Overviews, introduced in 2024 to place AI-generated summaries at the top of web search results. She said that while students and faculty can use Google to search for information, there is no guideline on whether AI-generated summaries can be used in an academic setting. An AI policy implemented in 2022 wouldn't have accounted for this feature.
She also said that she prefers the flexibility of setting her own classroom AI guidelines.
"I do want them to experiment with AI," she said. "I do want them to see the limits of it. I want them to talk about what are the biases that are built into it."

Keith-Le, who teaches visual design of instructional products at the graduate level, said that not all of her students like using AI, but she encourages them to explore the tool.
After students used Google Gemini to generate an image for a visual design project, Keith-Le polled them on whether they would use AI going forward. She was surprised when the results showed a 50-50 split. Half of the students said they didn't like using AI for class.
For Maki, what's most important is telling students the rules from the get-go and recognizing that student AI use does not always imply cheating. "Cheaters aren't going to ask," he said. Therefore, he said, faculty should encourage students to learn how to use the right tools.
"Every time we drive down the road, we want to know what the speed limit is," Maki said. "Some of us might still speed, but we at least want to know what it is."
DukeGPT
Like many of its peer institutions, Duke has no blanket rule on AI in the classroom. Yet the university has stood out for how it is pushing AI into students' lives.
"Our goal is to help faculty create learning experiences that prepare students to thrive in a world where AI is part of everyday life, and where human curiosity and critical thinking still make all the difference," said Aria Chernik, assistant vice provost for Faculty Development and Applied Research in Learning Innovation at Duke.
All Duke undergraduates, staff, faculty, and professional students in Duke's graduate schools have free, unlimited access to ChatGPT. The university also launched DukeGPT, an interface that lets users compare an array of language models. Duke community members can use suggested prompts to learn about university events and resources. Duke's pilot with OpenAI is part of the company's broader strategy to make AI a "core infrastructure of higher education."
The university says that giving all students access to ChatGPT levels the playing field for those who cannot afford subscriptions. Exposure to the technology will help students prepare for a job market where AI skills are valued and necessary, Duke says.
More recently, the university introduced MyGPT Builder, a customized chatbot that allows students to generate flashcards, practice quizzes, and study guides from a syllabus and set personalized study schedules. The AI tool will also help staff streamline administrative procedures and support faculty in generating course content and answering common student questions.

This fall, prospective Duke students have the option to write an essay about AI in their application.
Duke is not the only institution in North Carolina that has partnered with AI companies. In March, N.C. Central announced a collaboration with OpenAI to launch the Institute for Artificial Intelligence and Emerging Research, the first program of its kind among HBCUs. Students at UNCG have free access to Microsoft Copilot. UNCG said it picked Microsoft over companies like OpenAI because the university thought Copilot offered more security safeguards.
The UNC System is also part of Google's $1 billion initiative to fund AI literacy programs and research across more than 100 public universities. Under the partnership, all students will gain access to Google's Gemini 2.5 Pro and Google Career Certificates, an online AI training program.
The UNC System has formed professional development groups for faculty to discuss generative AI issues in education, and it holds an annual learning and technology symposium that covers similar topics. The system doesn't have guidelines on AI use in the classroom, but officials are studying the issue and expect to release recommendations this fall, said Heather McCullough, director of learning technology and open education.
"The technologies are changing on a daily basis, and so to be able to have a policy that is nimble enough to be adaptable to the shape of AI six months from now feels a little daunting," she said.
Lucas Lin is a junior at Duke University pursuing a major in economics and a certificate in documentary studies. He is managing editor at The Duke Chronicle, the university’s independent student newspaper. Beyond journalism, he is passionate about storytelling through documentary filmmaking and photography.

