Abstract: An IRT Analysis of a Measure Designed to Assess the Implicit Curriculum (Society for Social Work and Research 22nd Annual Conference - Achieving Equal Opportunity, Equity, and Justice)

527P An IRT Analysis of a Measure Designed to Assess the Implicit Curriculum

Schedule:
Saturday, January 13, 2018
Marquis BR Salon 6 (ML 2) (Marriott Marquis Washington DC)
* noted as presenting author
Antoinette Farmer, PhD, Associate Dean and Associate Professor, Rutgers University, New Brunswick, NJ
Peter Treitler, MSW, Doctoral Student, Rutgers University, New Brunswick, NJ
N. Andrew Peterson, PhD, Professor, Rutgers University, New Brunswick, NJ
Background/Purpose:  The Council on Social Work Education (2008; 2015) mandates that all schools of social work assess the implicit curriculum. The implicit curriculum consists of the program’s commitment to diversity, student development activities, the composition and qualifications of the faculty, administrative and governance structure, and resources.  Prior to this mandate, several schools of social work had already begun to assess the implicit curriculum. Some of the measures used are reliable; however, knowing a measure’s reliability does not provide information on how its items function. Therefore, one does not know how well each item discriminates across levels of the dimension being measured (theta) or which items should be eliminated from or added to a measure. Using item response theory (IRT) analysis, this study examined the psychometric properties of four subscales of a measure used to assess the implicit curriculum.

Methods: Analysis of secondary data from a 2016 study designed to assess the implicit curriculum was conducted. An IRT analysis was conducted in Stata/MP (version 14.2) to assess the psychometric properties of four subscales of the implicit curriculum measure: faculty diversity, opportunity role structure, clarity of academic policy, and resources. The sample consisted of 480 MSW students, of whom 89% were female, 55% were White, non-Hispanic, and 45% were non-White. Prior to the IRT analysis, an exploratory factor analysis (EFA) using principal axis factoring was conducted to assess the unidimensionality of each subscale. Samejima’s graded response model was used for the IRT analysis because the subscales consist of ordinal items. The subscales were evaluated based on item discrimination (a) and difficulty (b) parameters, as well as item characteristic curves and information functions.
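Under Samejima’s graded response model, the probability of responding in category k or higher on item i is P(X_i ≥ k | θ) = 1 / {1 + exp[−a_i(θ − b_ik)]}, where a_i is the item’s discrimination and b_ik is the category difficulty. A minimal sketch of how such an analysis could be specified in Stata 14.2 follows; the item names fds1-fds5 are hypothetical placeholders for one subscale’s items, not the actual variables from the study.

    * hypothetical items fds1-fds5 stand in for one subscale (e.g., faculty diversity)
    factor fds1-fds5, pf            // principal-factor (principal axis) EFA to check unidimensionality
    irt grm fds1-fds5               // Samejima's graded response model for ordinal items
    estat report, byparm            // discrimination (a) and category difficulty (b) estimates
    irtgraph icc fds1-fds5          // category characteristic curves for each item
    irtgraph tif                    // test information function across levels of theta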

Results:  The EFA indicated that each subscale was unidimensional. Based on the IRT analysis, each subscale contained both highly discriminating and weakly discriminating items. For example, on the Faculty Diversity Subscale (FDS), the most discriminating item was “the school addresses the needs of persons from diverse backgrounds,” while “I can meet my social needs by participating in the available extracurricular activities at this school” was the most discriminating item on the Opportunity Role Structure Subscale (ORSS). The least discriminating items included “the school’s faculty members are diverse” on the FDS and “I am made aware of how I can be involved in student organizations” on the ORSS. The IRT analysis also showed that each subscale contains items reflecting varying levels of the underlying construct, although the items provided relatively little information at very high and very low levels of the constructs.

Conclusion/Implications:  Findings suggest that items that do not accurately assess the constructs of interest should be removed and new items should be added, especially items that assess the higher and lower ends of each construct. Adding such items will allow for the assessment of areas of the implicit curriculum that may promote or hinder equal opportunity, equity, and social justice. The findings also underscore the need to report psychometric information about a measure that goes beyond its reliability.