Date of Graduation
5-2022
Document Type
Dissertation
Degree Name
Doctor of Philosophy in Educational Statistics and Research Methods (PhD)
Degree Level
Graduate
Department
Rehabilitation, Human Resources and Communication Disorders
Advisor/Mentor
Turner, Ronna C.
Committee Member
Lo, Wen-Juo
Second Committee Member
Keiffer, Elizabeth A.
Third Committee Member
Tendeiro, Jorge
Keywords
Aberrant responding; Educational tests & measurements; Ideal-point response process; Insufficient effort responding; IRT; Person-fit statistics; Unfolding
Abstract
Researchers have recognized that respondents may not answer items in a way that accurately reflects the attitude or trait level being measured. Response data that deviate from model expectations have been shown to substantially affect the psychometric properties of a scale and subsequent analytical results. However, most studies investigating the detection of aberrant data and its effects have done so using dominance item response theory (IRT) models. It is unknown whether the impacts of aberrant data, and the methods used to identify aberrant responding under dominance IRT models, apply similarly when scales fit an unfolding IRT model. This dissertation aims to contribute to the literature on unfolding IRT models (specifically the generalized graded unfolding model [GGUM]) in three main ways: 1) by providing insight into GGUM model-data fit when various types of aberrant data are systematically introduced, 2) by investigating how nonparametric person-fit statistics (H^T, U3^P, G_N^P, and G^P) perform under the unfolding framework of the GGUM compared with the dominance framework of the generalized partial credit model (GPCM), and 3) by examining how the performance of parametric person-fit statistics (l_z(p) and l_z(p)*) is affected by model misspecification, that is, fitting a dominance model (GPCM) to unfolding (GGUM) data and, conversely, the GGUM to GPCM data. As unfolding models offer many advantages and are becoming more widely used, questions about the effects of data quality and the performance of person-fit statistics in this context are expected to arise. It is essential to gain a better understanding of how underlying response processes affect model-data fit, and of how effectively different types of aberrant data are identified under different modeling frameworks.
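For readers less familiar with unfolding models, the GGUM's category response function, as usually stated following Roberts and colleagues, illustrates the ideal-point assumption: the probability of a response peaks when the person's trait level is near the item location and declines in both directions away from it.

\[
P(Z_i = z \mid \theta_j) = \frac{\exp\!\big\{\alpha_i\big[z(\theta_j - \delta_i) - \sum_{k=0}^{z}\tau_{ik}\big]\big\} + \exp\!\big\{\alpha_i\big[(M - z)(\theta_j - \delta_i) - \sum_{k=0}^{z}\tau_{ik}\big]\big\}}{\sum_{w=0}^{C}\Big(\exp\!\big\{\alpha_i\big[w(\theta_j - \delta_i) - \sum_{k=0}^{w}\tau_{ik}\big]\big\} + \exp\!\big\{\alpha_i\big[(M - w)(\theta_j - \delta_i) - \sum_{k=0}^{w}\tau_{ik}\big]\big\}\Big)}
\]

Here z = 0, ..., C indexes the observable response categories, M = 2C + 1, \alpha_i is the item discrimination, \delta_i the item location, and \tau_{ik} the subjective response category thresholds (with \tau_{i0} = 0).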
The dissertation is organized into three studies based on a simulation design that investigates how the type of aberrant responding, the proportion of aberrant responders in the sample, the proportion of aberrant responses within a response vector, test length, and the data-generating and fitted models affect model fit and the performance of person-fit statistics. The first study investigates the impact of aberrant data on model fit for GGUM and GPCM data and finds it to be severe in some cases. However, the GGUM effectively fit both dominance and unfolding data, in many cases even with 10% aberrant data. It is suggested that researchers carefully examine data quality before drawing conclusions about model-data fit or misfit. The second study investigates the application of popular nonparametric person-fit statistics, developed for dominance data, to data that fit an unfolding model. Given their poor performance, further research is recommended to identify or develop person-fit statistics that are effective for the types of aberrant behavior exhibited in ideal-point response data. The third study compares Type I error and power rates for the parametric person-fit statistics when the GGUM and GPCM are correctly and incorrectly specified, and contrasts them with the performance of the nonparametric statistics. No person-fit statistic was robust to model misspecification when the GPCM was fit to GGUM data. In contrast, results for GPCM data were comparable regardless of whether the GPCM or the GGUM was fit to the data.
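As a point of reference for the parametric statistics examined in the third study, the following is a minimal sketch of the standardized log-likelihood person-fit statistic (lz) for a single polytomous response vector, assuming the model-implied category probabilities at the person's trait estimate are already available; the function and variable names are hypothetical and this is illustrative only, not the procedure used in the dissertation.

import numpy as np

def lz_poly(responses, probs):
    """Standardized log-likelihood (lz) person-fit statistic for one
    polytomous response vector.

    responses : (n_items,) integer array of observed categories (0-based).
    probs     : (n_items, n_categories) array of model-implied category
                probabilities evaluated at the person's trait estimate.
    """
    n_items = len(responses)
    log_p = np.log(probs)
    # Observed log-likelihood of the response vector
    l0 = log_p[np.arange(n_items), responses].sum()
    # Expected value and variance of the log-likelihood under the model
    expected = (probs * log_p).sum()
    variance = (probs * log_p**2).sum() - ((probs * log_p).sum(axis=1)**2).sum()
    return (l0 - expected) / np.sqrt(variance)

Large negative values flag response vectors that are less likely than expected under the fitted model; the lz* variant additionally corrects the statistic's reference distribution for the use of an estimated rather than known trait level.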
Citation
Reimers, J. A. (2022). Aberrant Responding with Underlying Dominance and Unfolding Response Processes: Examining Model Fit and Performance of Person-Fit Statistics. Graduate Theses and Dissertations. Retrieved from https://scholarworks.uark.edu/etd/4537
Included in
Educational Assessment, Evaluation, and Research Commons, Educational Methods Commons, Statistical Methodology Commons, Statistical Models Commons, Statistical Theory Commons