Formative assessments can provide crucial data to help instructors evaluate pedagogical effectiveness and address students' learning needs. The shift to online instruction and learning in the past year emphasized the need for innovative ways to administer assessments that support student learning and success. Faculty often use multiple-choice (MC) assessments because of their ease of use and because of time and other resource constraints. While grading these assessments can be quick, the closed-ended nature of the questions often does not align with authentic scientific practices and can limit an instructor's ability to evaluate the heterogeneity of student thinking. Students often hold mixed understanding that includes both scientific and non-scientific ideas. Open-ended, or constructed response (CR), assessment questions, which allow students to construct scientific explanations in their own words, have the potential to reveal student thinking in ways that MC questions do not. The results of such assessments can help instructors make decisions about effective pedagogical content and approaches. We present a case study of how results from administering a CR question via a free-to-use constructed response classifier (CRC) assessment tool led to changes in classroom instruction. The question was used in an introductory biology course and focuses on genetic information flow. Results from the CRC assessment tool revealed unexpected information about student thinking, including naïve ideas. For example, a significant fraction of students initially demonstrated mixed understanding of the process of DNA replication. We will highlight how these results influenced changes in pedagogy and content and, as a result, improved student understanding.