July 6, 1997
On the Validity of Concept Map-Based Assessment Interpretations: An Experiment Testing the Assumption of Hierarchical Concept Maps in Science
Authors:
Maria Araceli Ruiz-Primo, Richard J. Shavelson, and Susan Elise Schultz
In recent years, concept maps have been increasingly considered as a supplement to traditional multiple-choice tests for classroom and large-scale assessment use. But little research exists to guide important decisions about the use of concept maps as assessments. In this study, CRESST/Stanford University researchers investigated (1) the impact of imposing a hierarchical structure on students’ representations of their knowledge, (2) the consistency of raters in scoring concept maps, and (3) whether concept maps and multiple-choice tests provide similar information about students’ declarative knowledge. Randomly assigning high school classes to two chemistry topics, the researchers found high levels of rater agreement (above .90) in scoring the concept map tasks. Correlations between multiple-choice test scores and concept map scores were all positive and moderate across types of scores (r = .31 on average), suggesting that the two types of assessments measure overlapping but somewhat different aspects of student knowledge. Results were inconclusive on whether different concept mapping techniques produce the same information about student knowledge.
Ruiz-Primo, M. A., Shavelson, R. J., & Schultz, S. E. (1997). On the validity of concept map-based assessment interpretations: An experiment testing the assumption of hierarchical concept maps in science (CSE Report 455). Los Angeles: University of California, Los Angeles, National Center for Research on Evaluation, Standards, and Student Testing (CRESST).