In the age of AI: Human interaction key to preserving academic voice

Image: A robot hand and a human hand writing. Adobe Stock, AI-generated image.

How can doctoral students maintain their disciplinary voice when writing with AI tools? A new study from Chalmers University of Technology suggests that human interaction, particularly peer feedback, is essential for developing critical AI literacy.

While much of the current academic debate around generative AI centers on detecting AI-generated text, this new research takes a different approach: How can we empower students to retain ownership of their academic voice while using AI?

"I am really excited about this publication! This research doesn’t just observe AI use, it shapes it, offering practical and reflective strategies to prepare writers for an AI-integrated academic future" says Baraa Khuder, senior lecturer at the Department of Communication and Learning in Science, and the researcher behind the study.

The study highlights that human-to-human interaction, especially in the form of co-author feedback practices, is a powerful way to help students engage more critically with AI tools. By drawing on earlier research on how scholars revise texts collaboratively, Baraa Khuder demonstrates that making these peer interactions visible equips students to regulate their use of AI more effectively.

Feedback-seeking: a core skill for AI-assisted writing

Unlike studies that treat AI as a stand-alone writing assistant, this research identifies feedback-seeking as a core skill in AI-integrated academic writing.

“More than just receiving critique, feedback-seeking involves knowing what to ask, whom to ask, and who ultimately decides when collaborating with AI,” says Baraa Khuder.

The study’s teaching intervention was based on a model of how academic co-authors revise texts together. These interactions, ranging from research positioning to argumentation, were made visible to students through a structured revision heuristic. This approach helped students learn to ask better questions, reflect on disciplinary norms, and critically evaluate AI-generated suggestions.

A framework for the future of writing instruction

This new research offers a practical framework for teaching students how to incorporate AI tools into their writing without compromising academic identity. Writing with AI, Khuder argues, is not about outsourcing judgment, but about learning to navigate it critically, ethically, and intelligently.

“AI tools are only as good as the questions we ask, and the best questions often come from other humans,” says Baraa Khuder.

Facts about the study

This study, Enhancing disciplinary voice through feedback-seeking in AI-assisted doctoral writing for publication, investigates how AI was integrated into a writing-for-publication course using a pedagogical framework centered on human interaction and feedback-seeking. Fifty-five doctoral students from diverse linguistic and disciplinary backgrounds participated in structured activities to explore how AI can both support and challenge their academic voice. Textual analysis of students’ reflections and AI interactions shows that while AI can enhance aspects of writing, its effective use demands critical engagement. Feedback-seeking emerged as a key skill, helping writers determine when AI input adds value and when disciplinary expertise must guide the process. The study underscores the dynamic between AI, feedback practices, and disciplinary voice, offering new insights into AI’s pedagogical role in academic writing.

Questions?

Baraa Khuder
  • Senior Lecturer, Language and Communication, Communication and Learning in Science

Author

Jenny Palm