Applied researchers face competing demands: upholding methodological standards while accommodating the realities of working with diverse people whose needs and preferences shape how they engage with research. Promoting equitable access to research participation can enhance the usefulness and generalizability of findings.
Ohio is implementing a kinship and adoption navigation program serving all families in the state with post-adoptive, formal, or informal kinship caregiving arrangements. The program is undergoing a rigorous effectiveness evaluation with a cluster randomized controlled trial design. A major challenge was to design a recruitment protocol that was both
- culturally responsive and inclusive, reaching diverse families across the state, while offering flexibility in survey completion methods, and
- rigorous and replicable, to prevent systematic differences in response rates, especially between the intervention and control groups.
Method
To design the recruitment protocol, the evaluation team interviewed kinship and adoptive families in Ohio to learn their preferred contact and survey completion methods and their views on appropriate incentive types and amounts given the survey length.
The recruitment protocol emphasized flexibility and consistency for all families, regardless of cohort. During piloting, families chose from a variety of methods to complete the survey. The evaluation team recruited families by email and text message, with embedded survey links for easy access, and followed up by phone to offer survey completion over the phone, verify contact information, or send survey materials by postal mail.
At the conclusion of the pilot, the evaluation team calculated response rates and used t-tests and chi-square tests to compare the descriptive characteristics of respondents and non-respondents, assessing potential disproportionalities in response rates.
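A minimal sketch of this kind of respondent versus non-respondent comparison is shown below, assuming hypothetical data and variable names and Python's pandas and scipy libraries; the evaluation team's actual software and variables are not specified here.

```python
# Sketch of comparing respondents and non-respondents on descriptive characteristics
# (hypothetical data and variable names; not the evaluation team's actual code).
import pandas as pd
from scipy import stats

# Hypothetical pilot sample: one row per family contacted
df = pd.DataFrame({
    "responded": [1, 0, 1, 1, 0, 1, 0, 1],           # 1 = completed the survey
    "caregiver_age": [62, 71, 45, 58, 68, 39, 74, 51],
    "education": ["HS", "HS", "BA", "BA", "<HS", "MA", "<HS", "HS"],
})

# Continuous characteristic (e.g., caregiver age): independent-samples t-test
resp_age = df.loc[df["responded"] == 1, "caregiver_age"]
nonresp_age = df.loc[df["responded"] == 0, "caregiver_age"]
t_stat, t_p = stats.ttest_ind(resp_age, nonresp_age, equal_var=False)

# Categorical characteristic (e.g., education level): chi-square test of independence
crosstab = pd.crosstab(df["education"], df["responded"])
chi2, chi_p, dof, expected = stats.chi2_contingency(crosstab)

print(f"Age: t = {t_stat:.2f}, p = {t_p:.3f}")
print(f"Education: chi2({dof}) = {chi2:.2f}, p = {chi_p:.3f}")
```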
Based on the pilot results, the active recruitment window was extended to 15 days, paper surveys were mailed earlier in the recruitment process, and families received additional text messages.
Using data from the first two months of recruitment into the effectiveness trial, the evaluation team reassessed representativeness and baseline equivalence.
Results
The adjusted recruitment protocol for the effectiveness trial yielded a response rate of 75% (n = 304), compared with 48% (n = 292) during pilot testing. In the pilot, non-respondents were more likely than respondents to have lower education levels (χ²(5, N = 591) = 13.3, p = .021). Also in the pilot, older caregivers preferred mailed paper surveys, and these surveys were often returned outside the recruitment period (23% of all surveys received). In contrast, in the effectiveness trial there were no demographic differences between respondents and non-respondents, and fewer surveys were received outside the recruitment period (4%). There were no systematic differences between the intervention and control groups in either the pilot or the effectiveness trial.
Conclusion
Diverse families can be engaged in applied research through a rigorous recruitment protocol that achieves high response rates from a representative sample. Seeking and listening to participants' preferences, offering flexibility in survey completion methods, and analyzing pilot data before the effectiveness trial were all strategies used to maintain high standards of rigor while supporting equity in applied research.