Online qualitative research is increasingly threatened by imposter participants – individuals who fabricate stories to receive interview incentives – posing serious concerns about data integrity, especially in health-focused studies. This paper describes our experience encountering suspected imposter participants during a qualitative project on maternal-fetal surgery (MFS) counseling and details the strategies our team developed to safeguard authentic data while maintaining ethical, participant-centered research practices. We initially recruited participants through a nonprofit email list, targeting those who had received counseling for MFS. Participants completed a demographics survey via Qualtrics and engaged in 90-minute Zoom interviews, receiving $100 in compensation. Despite carefully designed recruitment and screening measures, early interviews raised red flags, including discrepancies between survey and interview responses, inconsistent IP addresses, mention of hospitals not known to perform MFS procedures, vague narratives, and reluctance to turn on cameras. Nevertheless, several of these interviews were fairly convincing and came from a demographic group whose narratives we particularly sought. To avoid losing authentic data, we needed a process both for assessing the authenticity of previously collected data and for ensuring authentic participation going forward.
Methods and Results
We submitted a modification to our IRB protocol that allowed us to view a visual “artifact” from each participant prior to starting the interview recording. Examples included discharge paperwork, a surgeon’s drawing, or medical equipment used during treatment. We also received permission to re-contact previously interviewed participants and verify authenticity through a brief Zoom session in which they shared their artifact. Authentic participants responded positively, expressing gratitude for our care in ensuring accurate data. Suspected imposters failed to respond to multiple follow-up requests. We revised our recruitment approach to limit outreach to closed, diagnosis-specific Facebook groups and snowball sampling. With these changes in place, we completed data collection without encountering further suspected imposters.
Discussion
Our findings demonstrate that imposter participation can significantly threaten the validity of qualitative research, particularly in health studies. While we acknowledge concerns raised by scholars about labeling participants as “fraudulent” and the epistemological risks of dismissing narratives that diverge from researcher expectations, we argue that data authenticity is paramount in contexts where findings inform clinical practice and policy. Our artifact verification strategy enabled us to balance participant privacy with data integrity without collecting intrusive personal health information. We recommend several strategies for researchers designing qualitative studies in sensitive domains: avoid open social media recruitment, monitor IP addresses across survey touchpoints, require camera use at least during the screening portion of interviews, and design screening protocols carefully to confirm the participant’s engagement with the experience under study before beginning the interview. These steps, when communicated transparently and applied consistently, can uphold ethical standards while enhancing data quality. This work contributes to identifying best practices for navigating fraud risk in online qualitative research. Our experience illustrates how thoughtful, reflexive screening protocols, grounded in both methodological rigor and respect for participants, can support more credible, inclusive, and responsible social work research.