
CRESST Speaks at NCSA

June 25, 2018

CRESST researchers will present at this year’s CCSSO National Conference on Student Assessment (NCSA) on a variety of topics, ranging from innovative career-readiness assessment items to formative assessment tasks in mathematics to braille versions of online assessments. Check out the presentations below.

 

Career Readiness and High School Assessments

Presenters
Li Cai, UCLA/CRESST
Gregory K. W. K. Chung, UCLA/CRESST
Eric Zilbert, California Department of Education
Moderator: Mark Hansen, UCLA/CRESST

This symposium focuses on a collaboration between the California Department of Education and UCLA/CRESST aimed at supporting California’s K-12 assessment initiatives by improving career-readiness inferences while producing information, tools, and methodologies to inform assessment work. The symposium consists of three papers: (1) Digital Assessment of Problem Solving; (2) Examining Career Readiness Features in Existing Assessments; and (3) California’s Enhanced Assessment Grant: Increasing Students’ Postsecondary Opportunities for Success. The first paper focuses on the development of an innovative assessment item centered on a common career skill, problem solving. The second presents findings from an examination of career-readiness features in existing English language arts and math assessments. The third discusses the importance of the Enhanced Assessment Grant in the context of policy and preparing students for career opportunities. Implications for policy and for the future of test design, analysis, and reporting will be discussed.

Thursday, June 28, 2018
11:00 AM – 12:00 PM
Hilton San Diego Bayfront, Aqua 300 (Level 3)

 

Scaffolding Mathematics Teachers and Students through the Formative Assessment Processes of Analyzing Student Work and Providing Feedback

Presenter
Deborah La Torre, UCLA/CRESST

With the adoption of college- and career-ready standards, educators and administrators have been tasked with providing mathematics curricula that go beyond procedural skills to emphasize students’ higher-order reasoning and conceptual understanding. One approach advocated for accomplishing this is the increased use of formative assessment. Yet despite this push, our research has found that mathematics teachers struggle to gather and analyze student work, provide feedback to students in relation to success criteria, and relate student solutions and errors to exemplars and success criteria. This session will focus on what we have learned during a multi-year NSF-funded study, describe the design features of the formative assessment performance tasks we developed, and present the results of our intervention study. Finally, we will discuss applications of our system for the development of new formative assessment tasks by teachers or districts.

Thursday, June 28, 2018
12:30 PM – 1:15 PM
Hilton San Diego Bayfront, Sapphire A (Level 4)

 

A Data-Informed, Judgment-Based Procedure for Linking Cut Scores on Alternative Assessment Formats

Presenters
Mark Hansen, UCLA/CRESST
Michelle McCoy, UCLA/CRESST
Phoebe Winter, Consultant
Discussant: Stephen G. Sireci, University of Massachusetts, Amherst
Moderator: Alan Lytle, Arkansas Department of Education

Changing or sometimes replacing test items is often necessary to address student accessibility needs and to ensure fair testing conditions (e.g., when an online test is delivered on paper or in braille). With such changes, however, item parameters from one assessment format can be of questionable accuracy for the alternative format. Furthermore, the number of students taking an alternative form (e.g., braille) is sometimes too small to conduct a traditional item-based linking study. It is nonetheless critical that performance on alternative assessment forms support the same inferences about student performance. This session describes a data-informed, judgment-based approach developed to link cut scores on alternative versions of an assessment. Discussion includes the application of the approach to establishing cut scores on a braille version of an online assessment, as well as how the methodology can be applied to linking other types of alternative assessment formats. Time for audience discussion will be included.

Wednesday, June 27, 2018
10:00 AM – 11:30 AM
Hilton San Diego Bayfront, Sapphire 411 (Level 4)

 

Exploring the Potential for Scoring State ELP Assessments with Diagnostic Classification Models for Providing Formative Feedback

Presenters
Eric Setoguchi, UCLA/CRESST
Terri Schuster, Nebraska Department of Education

English language proficiency assessments are a major source of information for states as they support their English language learners in making academic progress. While there is interest in using these instruments to provide formative feedback that informs classroom instruction, current scoring methods do not yield scores detailed enough to be of much use. Diagnostic classification models (DCMs) may have potential in this regard, as they are a scoring approach designed to provide finer-grained information about students’ mastery of individual abilities. Yet the measurement conditions and interpretation of DCMs are not the same as those of more common approaches, and questions remain as to whether their application to the state assessment context would be feasible or worthwhile. The presenters, a state ELL assessment representative and a measurement researcher, will discuss their work investigating the potential uses of DCMs in state assessments and the questions they face.

Thursday, June 28, 2018
12:30 PM – 1:15 PM
Hilton San Diego Bayfront, Sapphire A (Level 4)

 
