Methods: This symposium includes four papers that provide examples of how community voices shaped measures, followed by exploratory and confirmatory factor analyses to validate the community-created or community-modified measures. Paper 1 describes incorporating birth parent and staff perspectives into measures of court/legal system practices and child welfare casework/agency practices. Paper 2 describes the process of incorporating provider perspectives into developing measures of co-parenting, parental self-efficacy and satisfaction, and relationship conflict for a community-based fatherhood intervention. Paper 3 highlights the co-construction of meaningful measures of provider attitudes, beliefs, and practice behaviors relevant to an intervention approach grounded in a domestic violence survivor-centered lens. Paper 4 discusses how early childhood providers served on an expert panel that guided the documentation, development, and testing of a tool to assess data-driven decision making in early childhood programs.
Results: While the use of previously validated scales can improve the likelihood of identifying treatment effects (if present), community partners across four distinct studies consistently reported that these tools rarely reflected their reality. The research teams across these studies faced a similar dilemma: (a) use highly reliable but potentially invalid scales to measure the unique contexts and everyday reality of the phenomena under study, or (b) create or modify scales that reflect community partners’ reality but may yield unreliable measures that reduce the ability to detect intervention effects. Each study addressed this tension by balancing partner expertise, which drove the development of the questions asked, with research expertise, applied through advanced quantitative methods such as confirmatory factor analysis to establish psychometric properties, including convergent and divergent validity of the identified constructs and measurement invariance across groups and over time.
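As a purely illustrative sketch (not drawn from any of the four papers), a confirmatory factor analysis of this kind might be specified as follows in Python using the semopy library; the construct and item names, and the data file, are hypothetical placeholders.

```python
# Illustrative CFA sketch using semopy; all names and data are hypothetical.
import pandas as pd
import semopy

# Hypothetical item-level survey responses, one column per Likert-type item.
data = pd.read_csv("community_measure_items.csv")  # hypothetical file

# lavaan-style measurement model: two latent constructs and their covariance,
# whose estimate bears on convergent/divergent validity.
desc = """
coparenting =~ cp1 + cp2 + cp3 + cp4
self_efficacy =~ se1 + se2 + se3
coparenting ~~ self_efficacy
"""

model = semopy.Model(desc)
model.fit(data)

print(model.inspect())           # factor loadings and the factor covariance
print(semopy.calc_stats(model))  # fit indices such as CFI, TLI, and RMSEA
```

Measurement invariance across groups or time points would then be examined by re-fitting the model with loadings (and subsequently intercepts) constrained equal across groups and comparing model fit; the exact workflow depends on the software used.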
Implications: These four studies exemplify how different teams worked alongside community partners to critique and drive scale construction so that measures better reflect the diverse experiences of children and their families. This decentering of measurement tools comes with unique strengths, unique challenges, and, most importantly, lessons learned about how to do better as we move forward with a commitment to decentering researchers’ power in our evaluation processes.