January 4, 1993

Omitted and Not-Reached Items in Mathematics in the 1990 National Assessment of Educational Progress

Authors:
Daniel Koretz, Elizabeth Lewis, Tom Skewes-Cox and Leigh Burstein
Non-response to test items on the National Assessment of Educational Progress (NAEP) has been a concern for some time, particularly in mathematics. Until recently, the primary concern has been “not-reached” items, that is, items left unanswered because the student failed to complete the test, as opposed to items that were omitted or skipped. This study examined patterns of non-response in the three age/grade groups (age 9/grade 4, age 13/grade 8, and age 17/grade 12) included in the 1990 assessment of mathematics. The results showed that overall omit rates were modest in grades 4 and 8, and that not-reached rates were greatly reduced from 1986 levels. Differences in non-response between white and minority students were less severe than they first appeared once adjusted for apparent proficiency differences, and gender differences in omit rates were infrequent. Nonetheless, the results provide grounds for concern. Omit rates were high for a subset of open-ended items, and the proportion of items with high omit rates in grade 12 was substantial. The omit-rate differentials between white and minority students, especially on open-ended items, are troubling and are likely to become more so as NAEP continues to increase its reliance on such items. Taken together, these results suggest the need for routine but focused monitoring and reporting of non-response patterns.
Koretz, D., Lewis, E., Skewes-Cox, T., & Burstein, L. (1993). Omitted and not-reached items in mathematics in the 1990 National Assessment of Educational Progress (CSE Report 357). Los Angeles: University of California, Los Angeles, National Center for Research on Evaluation, Standards, and Student Testing (CRESST).