Topic Area: DEI
Moving to an online format, which can decrease student-teacher interaction, makes it difficult to formatively assess and respond to students' written explanations. Assessing authentic STEM practices such as constructing explanations is especially challenging, because these practices are best assessed in an open format rather than multiple choice. To assess large numbers of open responses, our group developed a set of assessment items and a Constructed Response Classifier (CRC) tool that rapidly analyzes student text responses. The CRC tool produces reports on student thinking across STEM disciplines with high agreement with human scores. Our items span chemistry, biology, statistics, and physiology, at levels from introductory to upper-level courses. CRC automated reports identify common ideas in students' short explanations and provide several representations of class-level performance as well as individual classifications. The reports also reveal that students often mix misconceptions with expert-like ideas, a pattern captured in association diagrams. By using the tool as part of formative assessment, instructors can examine student ideas and guide students toward building connections between concepts as they learn to use expert-like reasoning. Instructors have successfully used the tool to refine teaching practices, develop instructional materials, and improve student learning. This session will present an interactive demonstration of the CRC tool and its reports. Attendees will explore reports to examine student thinking and discuss methods they can use in the classroom to address misconceptions and improve learning.
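The abstract does not describe how the CRC actually classifies responses, so the following is purely a hypothetical sketch of what automated classification of short explanations can look like in general; the rubric categories and keyword rules here are invented for illustration and are not the authors' method.

```python
# Hypothetical sketch: a toy keyword-based classifier for short
# constructed responses. The real CRC tool is not described at this
# level of detail; these category names and rules are illustrative only.

RUBRIC = {
    "energy_transfer": {"energy", "transfer", "transferred"},
    "particle_motion": {"particles", "molecules", "collide", "vibrate"},
}

def classify(response: str) -> list[str]:
    """Return every rubric category whose keywords appear in the response."""
    words = set(response.lower().replace(".", "").split())
    return sorted(cat for cat, kws in RUBRIC.items() if words & kws)

# A single response can match several categories at once, mirroring the
# abstract's observation that students mix different ideas in one answer.
print(classify("Energy is transferred when the molecules collide"))
# → ['energy_transfer', 'particle_motion']
```

Because each response can receive multiple labels, tallying label co-occurrence across a class is one simple way such a tool could feed the kind of association diagram the abstract mentions.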