July 1, 2002
Informing the Design of Performance Assessments Using a Content-Process Analysis of Two NAEP Science Tasks
Authors:
Kristin M. Bass, Maria E. Magone, and Robert Glaser
Modern conceptions of knowledge have spurred efforts to assess thinking and reasoning and to take a principled approach to task design. This study uses a content-process framework to examine the cognitive demands of two science performance assessments in order to begin articulating heuristics for assessment design. Think-aloud protocol techniques were used to examine the thinking and reasoning of fourth and eighth graders engaged in two separate National Assessment of Educational Progress (NAEP) hands-on science tasks about estimating the concentration of an unknown salt solution. The quality of observed performance and the task scores demonstrate that, in accordance with the test developers’ intentions, the performance assessments and scoring systems discriminated between students on the basis of their data collection and interpretation skills. However, performance was also influenced by details of item presentation (e.g., wording). These findings underscore the difficulty of item design and the necessity of creating rules that guide the development of performance assessments.
Bass, K. M., Magone, M. E., & Glaser, R. (2002). Informing the design of performance assessments using a content-process analysis of two NAEP science tasks (CSE Report 564). Los Angeles: University of California, Los Angeles, National Center for Research on Evaluation, Standards, and Student Testing (CRESST).