Tary L. Wallace
University of Central Florida
Lynn Grinnell
University of South Florida
Lou M. Carey
University of South Florida
Robert F. Dedrick
University of South Florida
John M. Ferron
University of South Florida
Kathleen A. Dailey
University of South Florida
Dorian Vizcain
University of South Florida
James A. White
University of South Florida
Abstract: This research examined the psychometric properties (e.g., factor structure, reliability) of the Florida Board of Regents Student Assessment of Instruction instrument and the relation between various factors (adaptations for distance education, initial expectations, time, non-instructional factors, and response scale format) and students’ course evaluations. Data were collected from 631 students in an undergraduate course in educational assessment and in graduate courses in educational technology, language arts, and library science at various times during the semester. Results for the course evaluations reflected a one-factor model and internal consistency reliabilities greater than .90. No significant differences in students’ course evaluation ratings emerged across time during the semester, students’ first and last day ratings of a course, non-instructional factors (excluding hours employed), or response scale formats.
Citation: Wallace, T. L., Grinnell, L., Carey, L. M., Dedrick, R. F., Ferron, J. M., Dailey, K. A., Vizcain, D., & White, J. A. (2001). A series of studies examining the Florida Board of Regents’ course evaluation instrument. Florida Journal of Educational Research, 41(2), 14-42.
Download: Wallace.411.pdf