Simulated Computer Adaptive Testing Administration with Remote
Proctoring for Offsite Assessment in Mathematics
Abstract
This study examines remote proctoring as an emerging practice for
establishing the validity of offsite test administration with respect to
test security. While Computer Adaptive Testing (CAT) has the potential
for greater precision in estimating examinees' ability levels, those
gains can be jeopardized in off-site testing if test security is not
ensured. This study simulated CAT administration with a focus on item
administration, varying whether pre-test items were included and
examining how that choice affects students' ability estimation and item
exposure. Monte Carlo simulation was employed to generate data for
answering the study's research questions. The findings revealed that
CAT administration was more consistent without pre-test items once item
selection was tightly controlled at the ±2θ level, and recommendations
were made accordingly. This finding is particularly germane as more
institutions move their assessments online, a practice rapidly becoming
the new normal in the aftermath of the COVID-19 pandemic.
The data for this study were generated from computer simulations using
SimulCAT, a free software package developed by Dr. K. C. T. Han of the
Graduate Management Admission Council.
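The simulation approach described above can be sketched in miniature. The code below is an illustrative Monte Carlo CAT run, not SimulCAT's actual algorithm: it assumes a Rasch (1PL) item bank, selects each next item from those within ±2θ of the current ability estimate, simulates a response, and re-estimates θ by Newton-Raphson. All function names (`simulate_cat`, `update_theta`) and parameter choices are hypothetical.

```python
import math
import random

def rasch_prob(theta, b):
    """Probability of a correct response under the Rasch (1PL) model."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def update_theta(responses, administered, theta, iters=10):
    """Newton-Raphson MLE update of theta from responses so far."""
    for _ in range(iters):
        grad = sum(u - rasch_prob(theta, b)
                   for u, b in zip(responses, administered))
        info = sum(rasch_prob(theta, b) * (1 - rasch_prob(theta, b))
                   for b in administered)
        if info < 1e-9:
            break
        theta += grad / info
        theta = max(-4.0, min(4.0, theta))  # keep the estimate bounded
    return theta

def simulate_cat(true_theta, item_bank, test_length=20, window=2.0, rng=None):
    """Administer a fixed-length CAT, restricting item selection to
    difficulties within ±window of the current theta estimate."""
    rng = rng or random.Random(0)
    remaining = list(item_bank)
    administered, responses = [], []
    theta_hat = 0.0
    for _ in range(test_length):
        # Enforce the ±window constraint; fall back to the full pool if empty.
        eligible = [b for b in remaining if abs(b - theta_hat) <= window] or remaining
        b = min(eligible, key=lambda x: abs(x - theta_hat))
        remaining.remove(b)
        # Simulate the examinee's response from the true ability.
        u = 1 if rng.random() < rasch_prob(true_theta, b) else 0
        administered.append(b)
        responses.append(u)
        theta_hat = update_theta(responses, administered, theta_hat)
    return theta_hat, administered
```

Repeating `simulate_cat` over many simulated examinees and tallying how often each item is administered gives the kind of ability-recovery and item-exposure data the study analyzes.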