Abstract
Today’s schools turn to computers for all aspects of learning, including assessment. While computer-based testing offers clear advantages, the comparability between paper-pencil tests (PPT) and computer-based tests (CBT) must be considered. This study examined whether the testing medium affects student performance on mathematics assessments by addressing three questions. First, does a test mode effect exist, as evidenced by a mean score difference between a CBT and a PPT? Second, does question type (multiple choice, constructed response, or extended response) relate to student performance? Third, does either gender or computer experience and familiarity affect CBT and PPT scores? Eighty sixth-grade students took mathematics unit tests with half of the questions on a PPT and half on a CBT. A computer familiarity survey was completed prior to the unit tests. Significant differences were found for one of the unit tests and for some of the question types.
Keywords: paper-pencil tests, computer-based tests, test mode effect, mathematics assessment, item construct
Metadata
- Institution: Cumming
- Publisher: Online Journal of New Horizons in Education
- Date submitted: 19 July 2022
- Author biography: Tara McClelland is a public school teacher and graduate alumna of the University of North Georgia. Josh Cuevas is a professor and educational psychologist at the University of North Georgia.