
A comparative study of AI moderation vs. human moderation using biometrics, by Curtin University

Author: Dr. Billy Sung
February 10, 2026

Can AI-moderated interviews match human interviewers in eliciting disclosure and engagement?

As AI-moderated interviews become more common in research, a key question keeps arising: can an AI interviewer truly perform as well as a human moderator in terms of disclosure, comfort, and data quality?

A recent experimental study by Professor Billy Sung and Dr. Patrick Duong at Curtin University provides one of the most thorough answers to date. Instead of relying solely on self-reported opinions, the study combines biometric data, self-reported experiences, and a controlled interview design to directly compare AI-moderated and human-led interviews.

Here’s what the research set out to examine, what it found, and why it matters to researchers.

Image 1. Snapshot from a real recorded interview using biometrics.

The research question

The aim of the study was not to demonstrate that AI is “better” than humans.

It asked a more precise question: Can AI interviewers elicit disclosure and meaningful responses at a level comparable to human interviewers, without increasing discomfort or stress?

This distinction is important. In research, emotional warmth and rapport matter, but disclosure, trust, and data quality are what ultimately impact the value of insights.

Study design: keeping the comparison fair

The research team carried out a randomized controlled experiment in which the only manipulated variable was the type of interviewer.

  • Participants: 60 English-speaking university students and staff
  • Topic: perceptions of fast fashion and how people justify their behaviour
  • Format: one-on-one, face-to-face interviews lasting around 16 minutes

Participants were randomly assigned to one of two conditions (a simple allocation sketch follows at the end of this section):

  • AI interviewer: questions were delivered via text-to-speech. The AI generated dynamic follow-ups based on each answer.
  • Human interviewer: a trained human moderator conducted the interview but followed the exact same AI-generated script and follow-ups.

This setup ensured that:

  • the content, order, and probing logic were identical
  • only the interviewer identity differed
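
As a purely illustrative sketch, this is roughly what a balanced random allocation of 60 participants to the two conditions might look like in code. The participant IDs, the 30/30 split, and the fixed seed are assumptions for illustration; this is not the study's actual allocation procedure.

```python
# A minimal, illustrative sketch of balanced random assignment to the two
# interview conditions. Participant IDs, the 30/30 split, and the seed are
# assumptions; this is not the study's actual allocation code.
import random

participants = [f"P{i:02d}" for i in range(1, 61)]  # 60 participants

rng = random.Random(42)  # fixed seed so the allocation is reproducible
rng.shuffle(participants)

# First half to the AI condition, second half to the human condition
allocation = {pid: ("ai" if i < 30 else "human")
              for i, pid in enumerate(participants)}

print(sum(v == "ai" for v in allocation.values()), "participants in the AI condition")
```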

Watch Dr. Patrick Duong explain the study

Tune in as Dr. Patrick Duong walks through the study in his own words.

What the researchers measured

The study went beyond traditional post-interview surveys.

  1. Self-reported experience. Participants rated:
    • sense of connection
    • trustworthiness
    • comfort and awkwardness
    • willingness to disclose
    • ability to answer effectively
    • overall evaluation of the interviewer
  2. Emotional and physiological responses. During the interview, researchers captured:
    • facial expressions (joy, confusion, fear, stress, etc.)
    • heart rate as a proxy for engagement
    • skin conductance as a measure of stress and arousal
    This combination let the team see how participants reacted in real time, not just what they said (a simple summarisation sketch follows this list).
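
As a rough illustration of how such streams can be reduced to interview-level scores, here is a minimal sketch. The column names, file format, and aggregation choices are hypothetical and are not taken from the paper.

```python
# A minimal sketch of collapsing a per-participant biometric time series into
# interview-level summary scores. Column names, units, and aggregation choices
# are hypothetical; the study's actual processing pipeline may differ.
import pandas as pd

def summarise_biometrics(stream: pd.DataFrame) -> dict:
    """Reduce timestamped biometric samples to one row of summary measures."""
    return {
        # engagement proxy: average heart rate across the interview
        "mean_heart_rate": stream["heart_rate_bpm"].mean(),
        # stress/arousal proxy: average skin conductance (electrodermal activity)
        "mean_skin_conductance": stream["eda_microsiemens"].mean(),
        # share of video frames in which a joy expression was detected
        "joy_expression_rate": stream["joy_detected"].mean(),
    }

# Usage with a hypothetical per-participant CSV of timestamped samples
stream = pd.read_csv("participant_001_biometrics.csv")
print(summarise_biometrics(stream))
```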

Key findings

  1. Humans create stronger emotional connection - Participants reported a higher sense of connection with human interviewers, and biometric data confirmed this:
    • more facial expressions associated with joy
    • higher engagement as measured by heart rate
    This confirms what many researchers already know: humans are better at generating emotional warmth and rapport.
  2. AI matches humans on disclosure and trust - The more interesting results concern the metrics that directly affect data quality. The study found no significant differences between AI and human interviewers on:
    • willingness to disclose personal information
    • perceived trustworthiness
    • comfort and awkwardness
    • ability to answer questions effectively
    • overall positive experience
    In other words, participants were just as willing to open up to an AI interviewer as to a human one (a sketch of this kind of between-groups comparison follows this list).
  3. AI does not increase stress or discomfort - Biometric data showed no increase in:
    • stress
    • confusion
    • negative emotional responses
    Even though AI felt less “human,” it did not impose an emotional toll on participants. This is an important practical reassurance for researchers considering AI moderation at scale.
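
To make the "no significant differences" comparisons concrete, here is a minimal sketch of a between-groups test on one self-report measure. The file name, column names, and the choice of Welch's t-test are assumptions for illustration; the paper reports its own statistical analysis.

```python
# A minimal sketch of comparing the AI and human conditions on one self-report
# measure. File and column names are hypothetical; the paper reports its own
# statistical tests.
import pandas as pd
from scipy import stats

df = pd.read_csv("interview_ratings.csv")  # one row per participant (hypothetical)

ai = df.loc[df["condition"] == "ai", "willingness_to_disclose"]
human = df.loc[df["condition"] == "human", "willingness_to_disclose"]

# Welch's t-test: a non-significant p-value is consistent with the reported
# finding of no difference in willingness to disclose between conditions.
t_stat, p_value = stats.ttest_ind(ai, human, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```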

What actually drives disclosure

To understand why disclosure levels were similar, the researchers ran a regression analysis predicting willingness to disclose from the measured variables.

Two factors clearly predicted willingness to disclose:

  • trustworthiness
  • positive interview experience

Rapport and emotional connection, while important, were not the primary factors.

Since trust and experience ratings showed no significant difference between AI and human interviewers, disclosure outcomes stayed comparable. This is a key insight: for many research settings, emotional warmth is not necessary for honest, detailed answers.
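
For readers who want to see what this kind of analysis looks like in practice, here is a minimal sketch of regressing willingness to disclose on the candidate predictors. The dataset, column names, and model formula are illustrative assumptions, not the study's actual specification.

```python
# A minimal sketch of regressing willingness to disclose on the candidate
# predictors. Column names and the formula are illustrative assumptions,
# not the study's actual model.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("interview_ratings.csv")  # hypothetical per-participant data

model = smf.ols(
    "willingness_to_disclose ~ trustworthiness + positive_experience"
    " + connection + comfort",
    data=df,
).fit()

# Inspect which coefficients are significant; in the study, trustworthiness and
# a positive interview experience were the clear predictors of disclosure.
print(model.summary())
```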

What this means for researchers

The findings suggest a more nuanced way to think about interviewer choice.

  1. Use AI interviewers when:
    • scale and consistency matter
    • standardisation is important
    • the goal is disclosure, explanation, and reasoning
    • budgets or timelines make large numbers of interviews necessary
    In these settings, AI interviewers can deliver comparable disclosure without introducing stress or discomfort.
  2. Use human interviewers when:
    • emotional connection is central to the research goal
    • rapport itself is part of the insight
    • the topic requires deep relational sensitivity
  3. Consider hybrid designs. One practical model is:
    • AI-moderated interviews for large-scale data collection
    • human-led interviews for smaller, emotionally intensive deep dives
    This balances efficiency with interpretive richness.

Why this study matters

This Curtin University research is one of the first to combine:

  • controlled interview design
  • biometric measurement
  • psychological and experiential metrics

Its conclusion is not that AI replaces human interviewers.

It demonstrates that AI interviewers can consistently match humans in disclosure and comfort, even if they do not yet replicate human emotional presence. For research teams, this shifts AI-moderated interviews from a risky experiment to a methodologically sound option for many real-world studies.

If you are interested in the full methodology, biometric setup, and statistical analysis, read the full paper from Curtin University.