
Special report: AI in local classrooms

NORTHWEST ARKANSAS (KNWA/KFTA) — Northwest Arkansas students and educators say they have mixed feelings about the rising use of AI chatbots in the classroom. Examples of popular AI chatbots include ChatGPT, Google Bard, and Microsoft Bing AI. A survey conducted by the site asked 1,000 students in late September and early October whether they’ve used AI tools for assignments and exams.

Around 56% of respondents said they have. This marks a stark rise from the site’s survey just seven months earlier, in which only 22% of students said the same. At the University of Arkansas, executive director of academic initiatives Chris Bryson says the conversation surrounding AI has escalated quickly.

“At the beginning, let’s say a year ago in the spring, it was just a discussion about what these tools are and how it could impact a specific discipline. Now that’s kind of transitioning into this next phase of what does this mean?” The answer to that question may depend on who you ask. University students KNWA/FOX24 talked to were split, citing both the potential for academic dishonesty and the tools’ educational value.

“I think it’s a tool. Like at the end of the day, you can use it for good purposes or bad,” summarizes U of A senior Samuel Thomas. “It can honestly give you sources if you need help finding topics for a paper,” another UA senior, Taylor Shorb, points out.

Lilly Barcroft, a junior at the university, says she’s been avoiding using the tool altogether, though she acknowledged she may have to change that view. “We’re paying for this kind of education, and we have a lot of great resources here,” says Barcroft. “I don’t know why we wouldn’t just do the work ourselves.”

Most University of Arkansas students KNWA/FOX24 spoke with said they felt it’s okay to use AI for research, but are concerned about their peers using it for assignments. “[Professors] don’t want you using it to write papers or using it to do all of your work,” said sophomore Bryan D. Sylvester.

“But I mean, people still do it.” Thomas agrees. “The reality is, I have friends of mine that have used it and done it, and [professors] just never seem to catch anything,” says Thomas.

Bryson says the University of Arkansas did not officially use any AI detection software to uncover issues like plagiarism at the time of reporting. However, students say individual professors sometimes claim to use these programs, which are discussed later in this article. Because AI chatbots can make cheating harder to detect, Bryson encourages the university’s faculty to discuss the topic with their students.

“It’s kind of a community effort, is the best way I could describe it,” starts Bryson. “It isn’t just faculty trying to create rules and making sure everybody adheres. It’s students also letting faculty know what kind of culture they want and the expectations they need to really make it a fair playing field for themselves, too.”

Over at the Don Tyson School of Innovation in Springdale, English teacher Natalie Campbell says educators are adapting assignments to embrace AI. “We’re designing the curriculum for specific outcomes from students. So that means [cheating] is a little bit more difficult. They can’t just ask for a summary.”

Students at DTSI say the use of AI chatbots is widespread. “I’d probably say around 98% to 100% have at least touched it at one point,” guesses 11th grader Daniel Paxton.

But Paxton says he’s confident teachers at DTSI would catch students if they tried to cheat using AI. Campbell seconds this, saying it’s fairly obvious to her when there’s a noticeable difference in a student’s writing ability. She says she often supplements this suspicion with online AI detection tools.

“It’s humbling and maybe a little bit embarrassing when they realize, ‘Oh gosh, I was caught.’ But then they realize that they have the opportunity to do what’s right,” explains Campbell. She admits, however, that AI detection software isn’t perfect.

She mentions she has to use a variety of tools to see if AI created a student’s work because the detection tools can be unreliable. Emphasizing this, she forwarded links to a few of them to KNWA/FOX24, including Dupli Checker, Sapling, and GPTZero. To test their accuracy, KNWA/FOX24 submitted one of our old college essays, written well before AI was an option for students, and compared the results to a similar essay we had ChatGPT-4 write for us.

The responses varied significantly. Using Dupli Checker’s tool, both the human-written and AI-written essays scored as 0% AI-generated. Conversely, in Sapling’s tool, our human-written essay was deemed just over 75% AI-generated, while our AI essay was flagged as 100% fake.

GPTZero’s tool seemed to perform the best of the three, giving our real essay an 8% chance of being AI-generated, and our actual AI essay a 90% chance of being created by an AI. It is important to note that these results were taken at the time of writing, and these tools may have since evolved. Additionally, more samples would be necessary for a more conclusive test.

However, the discrepancy in these results highlights the difficulties a teacher may face in discerning whether a student’s work is legitimate. Whether AI-generated work can be detected accurately or not, Paxton says he’s happy AI chatbots are a tool he has access to. He explains that he often uses them to study for a future career in finance.

“When I’m reading a lot of these scholarly journals, a lot of times, the lingo they use in those journals are very difficult for just a high school student to understand,” says Paxton. He says AI chatbots can break down this lingo more effectively, letting him continue studying through terminology barriers that might otherwise have stopped him. Similarly, 10th-grade DTSI student Dara Cuenca says she uses them to summarize concepts for her debate team.

She says she notices debate teams are much more prepared at competitions, and believes chatbots are effectively helping students keep up with their peers, like a tutor would. “For a lot of students, school used to be so hard because you couldn’t grasp what you were learning. And now, I feel like students understand what they’re trying to learn and can just interact more in class as it is.”

Campbell says it’s now up to educators to adapt to the technology to ensure they and their students aren’t left behind. “That’s why we want to try to incorporate it into the classroom,” she begins, “so that we’re teaching them how to use it appropriately. And if we just say, ‘No! Don’t use it!’ and we’re afraid of it, then they’re going to use it anyways. And then, they’re going to use it in a way that’s not meaningful to them.”


From: nwahomepage
URL: https://www.nwahomepage.com/news/featured-stories/special-report-ai-in-local-classrooms/
