An HRA Upper School student uses her iPad

AI marks the latest chapter in students’ and teachers’ ever-changing relationship with technology in the classroom

In the nearly three years since the debut of ChatGPT, generative artificial intelligence (AI) has transformed how students approach their assignments and unsettled instructors’ long-held assumptions about what teaching can and should look like. Schools across the country have had to grapple with this unprecedented and evolving reality, and Hampton Roads Academy has been no exception.

The adjustment has not been without significant challenges, from concerns about academic dishonesty to a rethinking of the viability and purpose of extended writing assignments. Both our faculty and our students, however, have worked creatively to ensure that the educational experience at HRA is no less rigorous and no less effective at shaping ethical, well-rounded citizens and leaders in the age of AI.

As recent graduate Reece David ’25 confidently predicted, “If any school can be innovative enough to solve this problem and adapt, HRA will be that school.”

Navigators as Pioneers

Students have long been at the cutting edge of applying digital tools such as Khan Academy and Quizlet to assist with their learning and studying. Unsurprisingly, they have been quick to adopt AI chatbots like ChatGPT and similar applications for use on their assignments.

Sophomore class president Madison Davis ’28, for example, explained that she used AI programs last year to study for all of her courses, uploading photographs of her class notes and prompting a chatbot to create 100-question practice quizzes with a variety of question formats, such as short answer, multiple choice, and fill-in-the-blank.

“It really helped me grasp each concept better,” Davis said. “I think students should learn that AI is a great way to gain ideas, use as a study source, or give background information on a topic.”

While Davis indicated that her teachers had clearly communicated when students would and would not be permitted to make use of AI in their work, she and many of her peers have also raised concerns about classmates using this powerful tool to cheat or to avoid the critical thinking that is essential to learning.

Adapting in the Classroom

HRA students explore the educational benefits of AI without sacrificing rigor, critical thinking, or academic integrity

“A support, not a substitute, for their own skills”: AI literacy is another tool in HRA students’ toolkit, supplementing, rather than supplanting, such crucial skills as verbal communication and critical thinking

For the faculty at HRA, the challenge is to help students navigate a world where AI is fast becoming a fixture of academic, professional, social, and civic life without losing sight of the intellectual and interpersonal skills that will always be crucial, if unwritten, facets of a holistic education. Indeed, as AI’s capabilities constantly evolve, teachers and students are figuring out in tandem how best to make use of the technology, and when to set it aside.

Our teachers’ approaches to AI have been uniformly thoughtful but varied, particularly along disciplinary lines. Students widely note that STEM courses have most readily incorporated AI in the classroom, though even here, the adoption of these tools has not been without problems. Upper School science teacher Dr. Maribel Gendreau explained that AI detection tools occasionally produce false positives in analyzing lab reports due to such habits as “excessive copy-and-paste, rearranging words from a source, using too many synonyms to replace words from the source, or running … paragraphs through AI to ‘clean them up.’” Consequently, her approach to these assignments has shifted, with a greater emphasis on paraphrasing and writing in smaller stages to ensure that the final product reflects students’ own authentic thinking.

“AI does have potential for good,” Gendreau said, “such as using it to brainstorm or produce an outline. If students learn to use it as a support, not a substitute, for their own skills, it could be beneficial.”

Other instructors, particularly in the humanities, are less optimistic. Concerned about academic dishonesty, eroding verbal skills, and miseducation by way of AI “hallucinations,” Upper School government teacher and Associate Director of College Counseling Christopher Hailey and many of his colleagues have adopted what he terms a “Back to the Future” approach, returning to classroom practices of 15-20 years ago, before computer technology became ubiquitous in education.

Hailey’s guiding principle is that technology should be applied purposefully and in targeted situations, rather than by default. Reversing what has been the norm in HRA’s classrooms for a number of years, he instructs his students to keep their iPads off their desks until they are told to take them out for a specific task, such as answering questions on a Google Form to aggregate responses on the board and spark discussion. His in-class reading quizzes call on students to rely on “no other resources than their pencil, their paper, and their brain.” Some larger assessments employ an online format to prepare students for AP Exams, but Hailey uses the secure lockdown browser in the College Board’s AP Classroom to prevent AI-based cheating. For instruction, he has increasingly employed in-class deep reading and annotation of both foundational government documents and longer articles, followed by Socratic seminars in which students discuss what they have read. Such exercises, he argued, are far more effective than essays that ChatGPT could easily write for students at home.

AI is reshaping the substance of Hailey’s government classes as well as his teaching methods. As students’ awareness of the news that drives voting behavior becomes ever more algorithmically driven, social media literacy has come to occupy a significant place in his curriculum. Students must maintain “a healthy sense of skepticism” about unverifiable AI-generated content on social platforms, he explained, just as they must seek out and evaluate sources to verify claims generated by chatbots as they outline an essay.

Rethinking the “Philosophy of Pedagogy”

The Upper School leadership team has gone so far as to steer teachers away from grading any work done at home, which is very likely to be completed with AI and thus not reflect students’ knowledge and abilities. Moreover, as Assistant Head of School and Director of Upper School Ben Rous explained, students who are adamant about not using chatbots are at a disadvantage, leading to a disparity in productivity and even ultimate performance between AI skeptics and eager adopters.

Working through such challenges, according to Rous, will only strengthen HRA’s academic program. “The teachers are the ones who are going to become better because of the lessons we’re learning,” he said, giving significant credit to students like Reece David, who alerted him to practical and ethical concerns about students’ AI use during his tenure as a member of the HRA Honor Council.

The result, Rous predicted, will be a rethinking of the “philosophy of pedagogy” along the lines of interpersonal connection.

Upper School English teacher and Dean of Student Life Laurie Hager, for one, has embraced this approach by cultivating “a classroom climate of kindness and support for one another,” inviting students to practice the skills of in-the-moment critical thinking, extemporaneous debate, and the formation of arguments through respectful dialogue with their classmates and teacher. Building a “rapport with an adult who is not your parent,” Hager explained, will not only provide an intellectually engaging experience, but also prepare college-bound Navigators to forge meaningful relationships with professors.

The Future of AI at HRA

Few would question the value of robust in-class discussion, but the future of extended writing assignments that cannot be completed in a class period remains uncertain in the age of ChatGPT, which can produce full essays in seconds. Can teachers ever again ask students to write longer papers at home without relying on generative AI?

For Hailey, there is no question that such assignments must be kept alive. “You’re not just writing for an assignment. You’re trying to create meaning from your perspective,” he said. “We have lost a lot if we cannot produce meaning for ourselves.”

Rous heartily agreed. Writing, he said, “gives you the blueprint for verbalizing.” Without this skill, he fears, students cannot develop a command of “the ability to articulate” ideas in any context.

Total prohibition of AI, however, is not a feasible solution, according to David: “In the not-so-distant future, it is my opinion that we will be thinking of generative AI and [large language models] as the new calculators or spell check”—that is, as indispensable everyday tools. “I do not think schools should be AI free, just AI responsible.”

The good news, David insisted, is that his alma mater is a community uniquely equipped to thread the needle of encouraging AI literacy without sacrificing critical thinking, writing skills, or academic integrity.

“HRA can and will rise to the challenge,” he said. “There will be a period of trial and error, but that’s what HRA stands for. We think, explore, and discover in a remarkably safe environment that allows for that discovery.”