Many North Carolina schools are still trying to effectively manage the use of generative artificial intelligence while maintaining academic integrity, even though the technology started changing how students — and the future workforce — approach schoolwork years ago.

State education officials have spent two years issuing guidance on how to incorporate the technology into teaching and holding training sessions with educators. Still, many teachers aren’t using it and many are prohibiting students from using it for assignments.


State leaders worry that without proper instruction on AI tools, there will be a widening gap between students who are ready for the workforce and those who are not. 

At the same time, students tell WRAL News that many of their peers are using generative AI as an academic shortcut and concealing their use of it from teachers and authority figures.

Numerous surveys bear out a trend that isn't going away: More and more students and teachers are using generative AI in school. The trend is only reinforced by the near-ubiquitous distribution of individual laptops to students.

Generative AI tools, such as ChatGPT, are trained on the vast troves of data on the internet and can respond nearly instantaneously, and authoritatively, to prompts about almost any topic. Students can use them to generate anything from ideas for an essay assignment, to summaries of a text they were assigned to read, to an entire essay. That has prompted concerns for years about academic honesty among students, even leading teachers to ban the tools' use or run error-prone AI-detection software on student work.

But the technology is expected to be a major component of the workforce in the decades to come as companies look to automate tasks and make workers more efficient, so knowledge of how to use it is critical. 

"AI is a tool,” said Vanessa Wrenn, chief information officer at the North Carolina Department of Public Instruction. “… It is embedded in our family's lives, in our children's lives, everywhere."

WRAL News spoke with students, educators, administrators and education policy analysts about how generative AI is being used and how schools can guide its use in a way that ensures students are still learning. Many believe there are ways to do it — by altering assignment types and formats, teaching students generative AI literacy, and teaching students about the academic possibilities of generative AI.

"You have to redesign the instruction so that when you're using AI, it can't be in a way that's cheating," said Michell McNeill, executive director of instructional technology for Johnston County Public Schools. "So if you have an assignment where they're writing an essay ... possibly even redesigning that to where the writing was done in class, but the brainstorming was done with AI."

Not everyone is convinced. Selina Sentosa Harjo, a senior at Green Hope High School in Cary, says it can hinder learning. 

"When we're doing articles in school, or if we're reading stuff, and you just ask AI to do stuff for you, it takes away from the whole point of education," she said.

Students and generative AI

Students say they use generative AI to enhance their learning — to edit essays, to bounce ideas off of, to generate new ideas and uncover new research.

If you ask them how their peers use it, they'll say they're worried.

Cheating is one of the top concerns cited by numerous students interviewed by WRAL.

Some students use it to produce the entire assignment, passing it off as their own, said Caleb Nease, a sophomore at Johnston County Career Technical Leadership Academy who wants to become a lawyer. “All their work, every single paragraph,” he said. “But some people, they have that integrity still, and so they keep it at the bare minimum."

Noah Campbell, a freshman at the University of North Carolina at Chapel Hill, says AI overuse can be a detriment to the community and the student.

“If we want to be doctors, we kind of got to learn this stuff,” said Campbell, who still considers himself an advocate for generative AI.

The North Carolina Department of Public Instruction has issued guidance for schools since January 2024, emphasizing teacher training and how teachers should use generative AI with their students or instruct their high school students to use it.

One aim of the guidance is to help teachers address concerns about academic integrity. DPI and teachers have to confront that head-on, because so many students have free, unfettered access to artificial intelligence at home.

The guidance emphasizes teaching students what AI is, how it works, how it can be ethically and effectively used, how it affects their surroundings, and how to use AI to enhance their work.

"Education institutions should ensure that today’s students have a solid foundation in AI literacy to guide them to making safe, ethical, and responsible decisions as they will live their entire lives in an AI-enabled world," DPI says in its guidance to teachers.

Educator literacy on the technology is critical, said Wrenn, the DPI information chief. 

"If we can get to this point where there is a baseline understanding of the technology, how that tool can be used ... then we can get to where there's a better understanding of how this can help students in their educational experience, help them be a stronger agent of directing their own learning," Wrenn said.

DPI advises against having elementary school students use AI, though they should be taught about it and teachers are encouraged to use it with student input for creative activities. Middle school students should be guided through using it, and high school students should be able to use it to enhance their academic work, DPI advises. Schools shouldn't allow students to interact with social chatbots, either, DPI says.

The key is assigning the right work to students, administrators say.

"It definitely takes some creativity,” McNeill said. “It takes some redesigning and rethinking.”

That's when she and educators brainstorm ways to curb AI's influence on individual assignments.

Assignments that are student-driven and focused on the process of doing something can make generative AI a partner in schoolwork rather than a replacement for it, according to DPI. Traditional ways of learning are focused on a final product and instruction delivered by a teacher or from a lesson.

But a new model of teaching, protected from compromise by generative AI, would be driven by student interests and feature projects — basically, complex real-world challenges in which students are learning along the way. AI can analyze information, generate ideas and weigh options. Humans can drive the work with curiosity and make the decisions.

Teachers need to think about changing their teaching in a way that makes AI less useful or relevant, Wrenn said. If teachers assign something generative AI can easily do, they may have doubts about whether their students really learned. "Then we've got to go back to the root of ‘what am I teaching and what am I asking my students to do?’"

In sample assignments, DPI guidance suggests flipping the learning process. Instead of the student generating answers to be analyzed by a teacher, the student analyzes the answers of generative AI, then submits those to a teacher.

Instead of asking a student to write an essay — a task generative AI can easily accomplish — teachers should switch what it is students are analyzing, said Vera Cubero, a DPI consultant on emerging technologies. Students could have generative AI draft three different essays from three different viewpoints, and then the student could argue with the essays, citing primary sources themselves for their arguments, Cubero suggested.

Some teachers are avoiding digital assignments entirely, an approach Cubero doesn't necessarily recommend. But she encourages a lot more oral defense of schoolwork in class, allowing teachers to quiz students and students to prove their mastery of a topic.

DPI’s vision of AI in schools is transformative, turning classrooms into creative and collaborative laboratories, in which critical thinking is a central activity.

That approach was evident on a recent February day when the department hosted its first-ever AI Solve-A-Thon. 

What AI can do

The Solve-A-Thon took place during a conference attended by about 1,500 North Carolina teachers this month. Ten teams from across the state presented tools they used AI to help build, largely inspired by identifying a community problem and trying to solve it.

The winner was an app designed to answer people's questions about local resources, created by a ninth grader and a 10th grader, sisters in Cabarrus County.

Second place was a chatbot-powered app designed to help people bring the right documents to the Division of Motor Vehicles: NC DMV Express Assistant, designed by students at Pine Lake Preparatory in Mooresville.

Third place was an app and website designed to help Cumberland County's poor or homeless residents find resources, such as where they can shower for free, which laundromats offer free clothes washing at what times, or where they can pick up food.

The "Jade Book" app was created by four Douglas Byrd High School students, one of whom has been homeless three times and whose experiences shape the product. Sophomore Chris Butler said the app would have helped his family.

"We put in the work, we made the app, we solved a real-world problem," said Tremaine Thomas, a senior at Douglas Byrd and a member of the "Byrd Brains."

The Byrd Brains, like many students who talked to WRAL, see generative AI as a major part of their futures. They want to design games, make websites.

"It also helps me on my day-to-day life,” Thomas said. “Like, if I just like, need help with something, I just type in my prompt and ask for the help. It's a very good tool."

But it still gives them some pause. Thomas thinks generative AI will evolve and be either "really good or really bad." Teammate Camille Singleton thinks it could solve problems across the world but it "depends on how people use it."

Teammate Demarcus Billups is worried about what bad things generative AI could do down the road. He's already seen it do things in his everyday life that make him feel uncomfortable. "You could take a picture of your assignment, and the AI will send you a picture of it, completely written and completely just uploaded," Billups said. "It's scary in a way."

What teachers can do

The Education Week Research Center found that 61% of teachers were using generative AI in their work to some degree in the fall, up from about one-third in each of the two previous years. About half said they had taken at least one professional development session on using generative AI in their work, up from less than a third in an early 2024 survey.

Surveys of students show similar trends but less education on how to responsibly use the technology.

A RAND survey from last school year found just over half of students said they had used generative AI for school.

Only about one-third of school district leaders said they had provided training to students on using generative AI, and 80% of students said teachers had not trained them on how to use it. Most districts and schools didn't have a policy or guidance on using generative AI.

DPI and some educators have various approaches toward managing generative AI use, beyond just redesigning assignments.

Schools should find out whether the generative AI tool they want to use allows teachers to monitor chatbot use, educators say. Some teachers also look at version histories when students edit documents to see if they are steadily completing an assignment or essay, or if they are pasting the essay in all at once.

DPI suggests teachers keep color-coded charts in their classrooms to show varying situations for AI use or nonuse, ranging from no AI use to AI use required.

The approach mirrors earlier efforts to curb cellphone use in classrooms, when many schools posted red, yellow and green codes for when cellphone use was prohibited, limited or allowed. Those charts worked for some students but were ignored by others.

Schools should communicate with parents about how generative AI is being used, DPI tells teachers. The department’s guidance includes resources for parents to educate themselves on AI, keep tabs on when and how their children are doing their schoolwork, and know what tools they might be using.

What not to do

As easily accessible generative AI tools have proliferated, other companies have rushed to get in on the action of detecting their output in schoolwork.

Those include GPTZero, CopyLeaks, SafeAssign and many others. They typically generate a score reflecting what percentage of an essay the tool believes was written by generative AI, or the likelihood that generative AI was used at all.

But students told WRAL News they aren't so sure the tools are accurate.

Nease, the Johnston County student, has tested one himself on papers he wrote without using generative AI, and it's told him it detected AI. Still, many teachers in his district use them.

Sentosa Harjo, the Green Hope High student, said many teachers at her school use them, too. Students will get called in to discuss the results and can get in trouble. She said a friend was wrongly accused.

"She was very, very stressed," Sentosa Harjo said. "It was a whole ... big mess."

DPI urges teachers to "use great caution" with software that purports to discern whether assignments were completed using AI, advice the department has repeated since first issuing its guidance in January 2024.

Many of the tools themselves urge schools to use the scores only as guidance for further investigation into whether an assignment was completed dishonestly.

The tools produce false positives at high rates, particularly for non-native English speakers and creative writers, DPI says. They likewise produce false negatives at high rates for students who are skilled at using AI and know how to trick the detectors.

"If there is suspicion that a student depended on AI too heavily for an assignment, this should be viewed as a teachable moment to reinforce the appropriate partnership with AI tools rather than a ‘gotcha’ moment," the DPI guidance says.

Teachers should collect students' writing samples throughout the school year to get to know students' writing, DPI says. They can talk to students about appropriate AI use and ask students what they contributed to an assignment that was their own and what was done by AI.