Posted on: Teaching Toolkit Ta...
Exit Card Formative Assessment Tips
What is Formative Assessment?
Formative assessment allows educators to engage with their students’ learning process and assess whether they need to modify teaching strategies to ensure student learning and content attainment.
The Notecard
At the end of some or all class sessions or content areas, hand out one notecard to each student and ask them to write, on one side, something they learned and, on the other side, one question they still have about the content.
Review each Notecard
Through your review, assess whether students accurately understood the content they needed to learn. If not, plan to reteach areas where students are struggling in a different way (a project, handout, guided notes, etc.).
Appropriate Courses to Consider this Activity
This activity may be best used with small to medium-sized classes; it may be cumbersome to review all the notecards from an extremely large class. Consider making the notecards anonymous, because the goal is formative assessment of your teaching, not summative assessment of the students’ knowledge.
Resources and Tools
You can also use Entry Cards
Entry cards are not necessary for this activity, but you can also have students complete a card at the beginning of class in response to a prompt from their readings.
Helpful Links
These educators discuss similar formative uses of exit cards:
https://www.edutopia.org/blog/formative-assessment-exit-slip-rebecca-alber
https://www.nwea.org/blog/2012/classroom-techniques-formative-assessment-idea-number-two/
Additional Ways to Use the Notecard
You are not limited to just having students write what they learned on one side of the card and questions they have on the other. You can also pose a question or a short writing prompt to the students at the end of the class and have the students write their answers on the notecard. Students turn in the card and then are allowed to leave the room. The cards are still used for formative, not summative assessment.
Feedback to Students
If you do have students put their names on the cards, you can write comments on the card, or note textbook pages to review, so students know where to correct inaccuracies or find answers to their questions.
Authored by: Michelle Malkin
Assessing Learning
Posted on: #iteachmsu
Building anonymous surveys for formative feedback
One of the key aspects of collecting formative feedback is that the respondents are confident their responses are anonymous. Specifically, when it comes to classroom mid-semester feedback, it is imperative that students understand their comments cannot be traced back to their identity (and cannot negatively impact their course grade). Three examples of platforms you can use to build your mid-semester feedback survey include:
Qualtrics
MSU users have access to Qualtrics with their MSU netID and password. You can see the basics for building a Qualtrics survey here.
Google Forms
As with Qualtrics, MSU users can log in to Google Drive with their MSU email and password. Please make sure you are logged in to Drive in this manner, not with your personal Gmail, before building a form for class. For step-by-step instructions on setting up an anonymous Google Form, visit this webpage.
D2L
Desire2Learn (D2L) is also accessible to MSU instructors as MSU's Learning Management System. Did you know you can build a survey right in D2L? For step-by-step instructions, check out this article.
Regardless of the platform you use to build your mid-semester feedback survey, it is recommended you include a statement similar to the following at the start: Mid-semester feedback is a way your instructor can collect information about your learning experience and how the course design impacts your experience. This survey is your opportunity to share insights about class so your instructor can make decisions on how to proceed with the rest of this semester. This is an anonymous survey. Your identity will not be shared with anyone and will in no way impact your grade in the course. Your feedback is valued and appreciated.
Posted by: Makena Neal
Assessing Learning
Posted on: #iteachmsu

What is formative feedback? (and why we should care)
Formative feedback is information on our thinking or our performance that gives us time to reflect and act on that feedback. Feedback is descriptive, evaluative, and suggestive. That is, good feedback shows us what we are doing, provides some sense of how we are doing relative to our goals, and provides some suggestions for how we might improve. Having said this, simple descriptive feedback can be quite powerful.
Processing feedback requires reflection. There is immense value in regular reflective practice regardless of your role or responsibilities. Taking time to critically examine how our experiences align with our expectations creates opportunities for us to identify opportunities for learning. Engaging in reflection as an iterative practice creates a norm of growth and improvement.
Summative evaluations of our teaching at the conclusion of each semester play a role in our institutional accountability. We can certainly learn from end-of-semester feedback and many educators do. However, if this is the only opportunity for students to provide course feedback, it comes at a time when they themselves are past the point of benefiting from it.
Formative, mid-semester feedback, however, creates an opportunity for educators to engage learners in the process of reflective practice. Intentional reflection through mid-semester feedback can help explore the initial assumptions made about a class, gain insights from learners, and develop a more comprehensive awareness of teaching practice. Because the knowledge gained through this process of reflection comes from students who have a stake in the course, this reflective practice strengthens teaching. It is also worth noting, as our colleagues at Vanderbilt’s Center for Teaching have observed, that “soliciting mid-semester feedback can improve our end-of-course evaluations, as it will both improve the quality of the course itself and provide students with early opportunities to raise concerns with the course.”
Finally, it is essential to note that mid-semester feedback is provided in confidentiality by students. Survey administrators will tabulate and send data to you. No one else will see or have access to the information collected on your course.
Adapted from the Enhanced Digital Learning Initiative at MSU: Scott Schopieray (CAL), Stephen Thomas (Nat. Sci.) Sarah Wellman (CAL & Broad), Jeremy Van Hof (Broad).
source: Finlay, Linda (2008). Reflecting on ‘Reflective practice’. Practice-based Professional Learning Paper 52, The Open University.
Posted by: Makena Neal
Assessing Learning
Posted on: #iteachmsu

What's my role?: Lessons from implementation of a new discussion format
Topic Area: Online Teaching & Learning
Presented by: Brandy Ellison, Maya Moss
Abstract:
Translating a large-enrollment, team-based, general education course to an online format led to many predictable challenges. Chief among them was how to encourage meaningful interactions among team members about course topics. Inspired by a post in EDUCAUSE, I implemented a discussion format that included rotating roles and staggered due dates. The roles are:
Initiator: Posts a prompt for teammates to respond to
Debater: Posts counterarguments or alternative perspectives
Highlighter: Posts a brief summary of the week's discussion, including at least two quotes
Citizen: Responds to a prompt and to teammates
In addition to these requirements, I implemented new grading criteria. Students could earn 5 points for meeting all requirements or 0 points if they missed any requirements, including due dates. No partial credit was offered.
The new system presented numerous benefits and challenges. The presentation will provide more detail on the system, how it functioned, and how it connected to University and course learning goals. In addition, a student from the course will be a co-presenter, sharing their own experiences with the discussion format and summarizing feedback gathered from peers.
Simon, E. (November 21, 2018). 10 tips for effective online discussions. EDUCAUSE.
Session Resources:
What's my role?: Lessons from implementation of a new discussion format (PowerPoint)
Authored by: Brandy Ellison, Maya Moss
Pedagogical Design
Posted on: #iteachmsu

Automated analyses of written responses reveal student thinking in STEM
Formative assessments can provide crucial data to help instructors evaluate pedagogical effectiveness and address students' learning needs. The shift to online instruction and learning in the past year emphasized the need for innovative ways to administer assessments that support student learning and success. Faculty often use multiple-choice (MC) assessments due to ease of use, time, and other resource constraints. While grading these assessments can be quick, the closed-ended nature of the questions often does not align with real scientific practices and can limit the instructor's ability to evaluate the heterogeneity of student thinking. Students often have mixed understandings that include scientific and non-scientific ideas. Open-ended or Constructed Response (CR) assessment questions, which allow students to construct scientific explanations in their own words, have the potential to reveal student thinking in a way MC questions do not. The results of such assessments can help instructors make decisions about effective pedagogical content and approaches.
We present a case study of how results from administering a CR question via a free-to-use constructed response classifier (CRC) assessment tool led to changes in classroom instruction. The question was used in an introductory biology course and focuses on genetic information flow. Results from the CRC assessment tool revealed unexpected information about student thinking, including naïve ideas. For example, a significant fraction of students initially demonstrated mixed understanding of the process of DNA replication. We will highlight how these results influenced changes in pedagogy and content and, as a result, improved student understanding.
To access a PDF of the "Automated analyses of written responses reveal student thinking in STEM" poster, click here.
Description of the Poster
Automated analyses of written responses reveal student thinking in STEM
Jenifer N. Saldanha, Juli D. Uhl, Mark Urban-Lurain, Kevin Haudek
Automated Analysis of Constructed Response (AACR) research group
CREATE for STEM Institute, Michigan State University
Email: jenifers@msu.edu
Website: beyondmultiplechoice.org
QR code (for website):
Key highlights:
Constructed Response (CR) questions allow students to explain scientific concepts in their own words and reveal student thinking better than multiple choice questions.
The Constructed Response Classifier (CRC) Tool (free to use: beyondmultiplechoice.org) can be used to assess student learning gains.
In an introductory biology classroom:
Analyses by the CRC tool revealed gaps in student understanding and non-normative ideas.
The instructor incorporated short term pedagogical changes and recorded some positive outcomes on a summative assessment.
Additional pedagogical changes incorporated the next semester led to even more positive outcomes related to student learning (this semester included the pivot to online instruction).
The results from this case study highlight the effectiveness of using data from the CRC tool to address student thinking and develop targeted instructional efforts to guide students towards a better understanding of complex biological concepts.
Constructed Response Questions as Formative Assessments
Formative assessments allow instructors to explore nuances of student thinking and evaluate student performance.
Student understanding often includes scientific and non-scientific ideas [1,2].
Constructed Response (CR) questions allow students to explain scientific concepts in their own words and reveal student thinking better than multiple choice questions [3,4].
Constructed Response Classifier (CRC) tool
A formative assessment tool that automatically predicts ratings of student explanations.
This Constructed Response Classifier (CRC) tool generates a report that includes:
categorization of student ideas from writing related to conceptual understanding.
web diagrams depicting the frequency and co-occurrence rates of the most used ideas and relevant terms.
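The kind of report described above can be illustrated with a small sketch. The actual CRC tool uses trained machine-learning classifiers; the keyword rules, category names, and sample responses below are purely hypothetical stand-ins, meant only to show what "categorizing student ideas" and "co-occurrence of ideas" mean in practice.

```python
# Illustrative sketch only: tag each written response with idea categories
# via keyword matching, then count which categories co-occur in the same
# response (the raw data behind a co-occurrence web diagram).
from collections import Counter
from itertools import combinations

# Hypothetical idea categories keyed by indicator terms (not the tool's lexicon).
IDEA_TERMS = {
    "replication": {"replication", "replicate", "copy"},
    "translation": {"translation", "protein", "stop", "codon"},
    "gene expression": {"expressed", "expression", "transcription"},
}

def categorize(response: str) -> set[str]:
    """Return every idea category whose indicator terms appear in the response."""
    words = set(response.lower().split())
    return {idea for idea, terms in IDEA_TERMS.items() if words & terms}

def cooccurrence(responses: list[str]) -> Counter:
    """Count how often pairs of idea categories appear in the same response."""
    pairs = Counter()
    for r in responses:
        for pair in combinations(sorted(categorize(r)), 2):
            pairs[pair] += 1
    return pairs

responses = [
    "It will stop the DNA replication and the gene will not be expressed",
    "Replication is unaffected; translation ends early at the stop codon",
]
print(categorize(responses[0]))
print(cooccurrence(responses))
```

A response mixing replication language with gene-expression language, as in the first sample above, would surface as a frequent category pair, which is the signal an instructor can act on.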
CRC Questions in the Introductory Biology Classroom: A Case Study
Students were taught about DNA replication and the central dogma of Biology.
The question was administered as online homework, with completion credit provided. The responses collected were analyzed by the CRC tool.
CRC question:
The following DNA sequence occurs near the middle of the coding region of a gene. DNA 5' A A T G A A T G G* G A G C C T G A A G G A 3'
There is a G to A base change at the position marked with an asterisk. Consequently, a codon normally encoding an amino acid becomes a stop codon. How will this alteration influence DNA replication?
Part 1 of the CRC question used to detect student confusion between the central dogma processes.
Related to the Vision & Change core concept 3 “Information Flow, Exchange, and Storage" [5], adapted from the Genetics Concept Assessment [6,7].
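To see why the base change matters for translation rather than for replication, the fragment can be transcribed and translated directly. This sketch is not part of the assessment tool; it assumes the reading frame begins at the first base of the fragment (consistent with the question, since that frame turns a Trp codon into a stop codon), and its codon table covers only the codons that occur here.

```python
# Transcribe and translate the question's coding-strand fragment, with and
# without the G-to-A change, to show the stop codon appearing at translation.
# Codon table limited to the codons present in this fragment.
CODONS = {
    "AAU": "Asn", "GAA": "Glu", "UGG": "Trp", "GAG": "Glu",
    "CCU": "Pro", "GGA": "Gly", "UGA": "STOP",
}

def transcribe(coding_strand: str) -> str:
    """mRNA carries the coding strand's sequence with U in place of T."""
    return coding_strand.replace("T", "U")

def translate(mrna: str) -> list[str]:
    """Read codons in frame, stopping at a stop codon (or end of fragment)."""
    peptide = []
    for i in range(0, len(mrna) - 2, 3):
        aa = CODONS[mrna[i:i + 3]]
        peptide.append(aa)
        if aa == "STOP":
            break
    return peptide

normal = "AATGAATGGGAGCCTGAAGGA"
mutant = normal[:8] + "A" + normal[9:]   # G -> A at the asterisked position

print(translate(transcribe(normal)))  # ['Asn', 'Glu', 'Trp', 'Glu', 'Pro', 'Glu', 'Gly']
print(translate(transcribe(mutant)))  # ['Asn', 'Glu', 'STOP']
```

Replication copies whichever sequence is present, mutated or not, so only the translation product changes; responses claiming the mutation "stops replication" are exactly the confusion between central dogma processes the question is designed to detect.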
Insight on Instructional Efficacy from CRC Tool
Table 1: Report score summary revealed that only a small fraction of students provided correct responses post instruction. (N = 48 students).
Student responses, Spring 2019:
Incorrect: 45%
Incomplete/Irrelevant: 32%
Correct: 23%
Sample incorrect responses:
Though both incorrect, the first response below demonstrates understanding of a type of mutation and the second one uses the context of gene expression.
“This is a nonsense mutation and will end the DNA replication process prematurely leaving a shorter DNA strand” (spellchecked)
“It will stop the DNA replication… This mutation will cause a gene to not be expressed”
CRC report provided:
Response score summaries
Web diagrams of important terms
Term usage and association maps
The instructor identified scientific and non-scientific ideas in student thinking.
This led to:
Short term pedagogical changes, same semester
During end of semester material review, incorporated:
Small group discussions about the central dogma.
Discussions about differences between DNA replication, and transcription and translation.
Worksheets with questions on transcribing and translating sequences.
Figure 1:
The figure depicts an improvement in student performance observed in the final summative assessment.
Percentage of students who scored more than 95% on a related question:
In the unit exam = 71%
Final summative exam = 79%
Pedagogical Changes Incorporated in the Subsequent Semester
CR questions:
Explain the central dogma.
List similarities and differences between the processes involved.
Facilitated small group discussions for students to explain their responses.
Worksheets and homework:
Transcribe and translate DNA sequences, including ones with deletions/additions.
Students encouraged to create their own sequences for practice.
Revisited DNA replication via clicker questions and discussions, while students were learning about transcription and translation.
Table 2: 68% of students in the new cohort provided correct responses to the CRC question post instruction. (N = 47 students).
Student responses, Spring 2020:
Incorrect: 19%
Incomplete/Irrelevant: 13%
Correct: 68%
Conclusions
The results from this case study highlight the effectiveness of using data from the CRC tool to address student thinking and develop targeted instructional efforts to guide students towards a better understanding of complex biological concepts.
Future Directions
Use the analytic rubric feature in the CRC tool to obtain further insight into normative and non-normative student thinking.
Use the clicker-based case study available at CourseSource about the processes in the central dogma [8].
Incorporate additional CRC tool questions in each course unit.
Questions currently available in a variety of disciplines:
Biology, Biochemistry, Chemistry, Physiology, and Statistics
Visit our website beyondmultiplechoice.org and sign up for a free account
References:
Ha, M., Nehm, R. H., Urban-Lurain, M., & Merrill, J. E. (2011). CBE—Life Sciences Education, 10(4), 379-393.
Sripathi, K. N., Moscarella, R. A., et al., (2019). CBE—Life Sciences Education, 18(3), ar37.
Hubbard, J. K., Potts, M. A., & Couch, B. A. (2017). CBE—Life Sciences Education, 16(2), ar26.
Birenbaum, M., & Tatsuoka, K. K. (1987). Applied Psychological Measurement, 11(4), 385-395.
"Vision and change in undergraduate biology education: a call to action." American Association for the Advancement of Science, Washington, DC (2011).
Smith, M. K., Wood, W. B., & Knight, J. K. (2008). CBE—Life Sciences Education, 7(4), 422-430.
Prevost, L. B., Smith, M. K., & Knight, J. K. (2016). CBE—Life Sciences Education, 15(4), ar65.
Pelletreau, K. N., Andrews, T., Armstrong, N., et al., (2016). CourseSource.
Acknowledgments
This material is based upon work supported by the National Science Foundation (DUE grant 1323162). Any opinions, findings and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the supporting agencies.
Authored by: Jenifer Saldanha, Juli Uhl, Mark Urban-Lurain, Kevin Haudek
Assessing Learning
Posted on: Spring Conference o...

Beyond Zoom: A Beginner’s Guide to Unlocking the Transformative Potential of Virtual Reality in Educ
Title: Beyond Zoom: A Beginner’s Guide to Unlocking the Transformative Potential of Virtual Reality in Education
Presenter: Chris Shaltry
Format: Learning Tech Demo
Date: May 11th, 2023
Time: 10:15 - 11:15 am
Click here to view
Description:
Virtual Reality (VR) is an innovative and game-changing technology that presents exciting possibilities for teaching and learning. With its unique ability to provide immersive and interactive experiences that are often impossible to replicate in traditional classrooms, it has the potential to transform the way we approach education.
This learning technology demonstration will explore the potential benefits of VR technology in education using the FrameVR platform. This cutting-edge virtual environment enables educators and learners to easily create and share immersive experiences in unique environments, providing a new level of engagement and interactivity in the learning process.
During the session, attendees will have the opportunity to experience FrameVR firsthand. They will interact with virtual objects and participate in whole-group and self-directed breakout room discussions. We will also discuss the pedagogical implications of using VR in courses, for academic poster sessions, and virtual meetups, as well as the challenges and limitations of using VR in education.
[no special equipment necessary]
Authored by: Chris Shaltry
Posted on: #iteachmsu

Principles, tools, strategies, and additional resources for high-impact assessment design
Principles, tools and strategies, and additional resources related to formative assessment, discussion forums, alternatives to traditional final exams for summative assessment, and giving feedback efficiently.
View the "Supporting Materials for Assessment Options Beyond the Exam: High-Impact Assessment Design" document.
Authored by: Becky Matz
Assessing Learning
Posted on: #iteachmsu

Assessment Workshops
The Hub for Innovation in Learning and Technology is supporting two assessment workshops in March: Assessment Options Beyond the Exam and Exam Design.
1) Assessment Options Beyond the Exam, led by Dr. Andrea Bierema: This workshop is for any MSU educator who is looking for resources and help with formative assessments and alternatives to exams such as projects, infographics, and debates. Examples include ideas for classes with 100 or more students. This workshop ran synchronously on 3/10 via Zoom.
2) Exam Design, led by Dr. Casey Henley: This workshop is for any MSU educator who is looking for resources and help with academic integrity on summative quizzes and exams. We will focus on writing multiple-choice and short-answer questions, creating a climate of integrity in the course, weighing the pros and cons of video proctoring, and creating exams specifically in D2L. This workshop ran synchronously on 3/9 via Zoom.
If you have questions related to the SOIREE workshops, please reach out to Ashley Braman (behanash@msu.edu) for additional support.
Authored by: Breana Yaklin, Andrea Bierema, Casey Henley
Assessing Learning
Posted on: #iteachmsu
Tips for Students: Giving useful feedback
Instructors can include a link to this post or download this resource to include with their mid-semester survey introduction correspondence to students.
Posted by: Makena Neal
Assessing Learning
Posted on: Teaching Toolkit Ta...

ASK ME ANYTHING with Justin Wigard - Ludic Pedagogy: Teaching with Video Games in the Online Classroom
As part of my ongoing engagement with game studies, I have worked with video games in various contexts (popular culture courses, as pedagogical tools, as a mode of research), and I regularly teach video games in different classroom formats and contexts (F2F popular culture courses, asynchronous summer courses, etc.). Throughout the day, I will be online talking through approaches to teaching games in the online classroom, including but not limited to different methodological approaches (quantitative and qualitative), how to choose the best game for the class, and even some helpful pedagogical strategies for games and access/accessibility.
Posted by: Justin Wigard
Posted on: GenAI & Education
AI Commons Bulletin 2/5/2025
Human-curated news about generative AI for Teaching and Learning in Higher Education.
📝 Try This: Teach Students How to Direct AI to Write an Entire Paper Well
Zufelt (2025) proposes an A to Z strategy for quality writing, whether done manually or with AI. Students follow stages: Gather & Summarize, Prompt & Draft, Curate, Revise & Edit, Review, and Format, with clear instructions at each step.
Learn More: http://doi.org/10.1177/23294906241309846
🤖 The Education Revolution Through AI
AI holds immense potential in education, offering opportunities for personalized learning, task automation, and adaptive teaching. However, challenges such as bias, ethical concerns, and data privacy must be carefully addressed. Its applications are vast, spanning research, teaching, and course design integration.
Learn More: http://octaedro.com/libro/the-education-revolution-through-artificial-intelligence/
💬 Engage With Your Colleagues to Establish Your Strategy for AI in Teaching and Learning
The BYU theatre education faculty proactively explored AI’s role in their curriculum, adopting a shared perspective of AI as a multiplier to enhance their work. They established and shared a set of values on AI use with students, fostering clarity and alignment.
Learn More: Jensen in ArtsPraxis vol. 11, no. 2, p. 43. http://sites.google.com/nyu.edu/artspraxis/2024/volume-11-issue-2.
🎭 Try This: Make a Discussion of AI Ethics More “Real” For Your Students With Personas
To make ethical AI discussions relatable, create characters representing diverse perspectives on AI’s impact. For each character, detail:
* What they’ve heard or read about AI
* Their direct experiences with AI
* Their opinions and statements about AI
* Actions they’ve taken regarding AI
* Their skill level as an influencer, user, or researcher
Learn More: Prietch, S. S., et al. (2024). http://doi.org/10.47756/aihc.y9i1.142
Bulletin items compiled by MJ Jackson and Sarah Freye with production assistance from Lisa Batchelder. Get the AI-Commons Bulletin on our Microsoft Teams channel, at aicommons.commons.msu.edu, or by email (send an email to aicommons@msu.edu with the word “subscribe”).
Posted by: Michele (MJ) Jackson
Posted on: Reading Group for S...
As we discussed on 12/3, I've created a new schedule/signup doc to guide our reading selections moving forward. See the link below, please consider adding yourself to the schedule for a meeting or two in the new year, and let me or Stokes know if you have any questions or concerns about the slight format adjustment!
- G
https://docs.google.com/document/d/1x9eCE12sbIO_BY22wCwrIOj-aSxTci89L6-uUOP6y4A/edit?usp=sharing
Posted by: Garth J Sabo
Pedagogical Design
Posted on: Teaching Toolkit Ta...

ASK ME ANYTHING with Justin Wigard - Graphic Possibilities: Teaching with Comics in the Online Classroom
As part of my ongoing work with the Graphic Possibilities research workshop here at MSU, I approach comics through two interrelated approaches, critical inquiry and engaged pedagogy, and have taught comics in several different classroom formats (lower-level in-person classes, various online synchronous environments, and most recently, in a fully asynchronous online classroom). Throughout the day, I will be online talking through approaches to teaching comics in the online classroom, including but not limited to different methodological approaches (quantitative and qualitative), how to choose the best comic for the class, and even some helpful comics-making pedagogical strategies (best tips, assessment, resources, etc). Ask Me Anything! :)
Posted by: Justin Wigard
Disciplinary Content
Posted on: GenAI & Education
AI Commons Bulletin 1/13/2025
Human-curated news about generative AI for Teaching and Learning in Higher Education.
😮 Word of the Day: “AI-giarism”
“The unethical practice of using artificial intelligence technologies, particularly generative language models, to generate content that is plagiarized either from original human-authored work or directly from AI-generated content, without appropriate acknowledgement of the original sources or AI’s contribution.” (Chan, 2024)
Learn More: https://doi.org/10.1007/s10639-024-13151-7
💚 H-Net Hosts 2025 AI Symposium: Fear, Faith, and Praxis: Artificial Intelligence in the Humanities and Social Sciences
This year’s theme, “Fear, Faith, and Praxis: Artificial Intelligence, Humanities, and Social Sciences,” focuses on student-centered approaches to the use of AI in pedagogical practice and reassessing previous assumptions about AI. This two-day event will be held on MSU’s campus on Feb 20-21, 2025, and available via live stream on the H-Net Commons.
Learn More: https://networks.h-net.org/2025-ai-symposium
💬 Try This: Use AI to Make Peer Feedback More Effective
Use this prompt: “I teach a university class where students work on teams for the semester. You are my assistant, who is going to help me provide formative feedback to my students. I collect peer comments periodically throughout the semester, and I would like you to summarize the comments into a performance feedback review in a way that is constructive and actionable. Additionally, the students assess themselves, and I would like you to compare their responses to the peer feedback. The output should be in the form of a letter, and please exclude anything that is inappropriate for the workplace.” [If there are fewer than two comments for a student, please provide generic feedback only.]
Learn More: https://www.ijee.ie/1atestissues/Vol40-5/02_ijee4488.pdf
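The prompt above can be wired into any chat-style LLM client. A minimal sketch in Python follows; the `build_messages` helper, the student name, and the sample comments are illustrative assumptions and not part of the original bulletin item, and the resulting message list would be handed to whatever chat-completion API you use.

```python
# Sketch: package one student's peer comments and self-assessment into
# chat-style messages using the bulletin's peer-feedback prompt.
# (build_messages and all sample data are hypothetical/illustrative.)

PROMPT = (
    "I teach a university class where students work on teams for the "
    "semester. You are my assistant, who is going to help me provide "
    "formative feedback to my students. Summarize the peer comments "
    "into a performance feedback review that is constructive and "
    "actionable, compare the student's self-assessment to the peer "
    "feedback, and format the output as a letter. Exclude anything "
    "inappropriate for the workplace."
)

def build_messages(student, peer_comments, self_assessment):
    """Return a system+user message pair for one student's review."""
    if len(peer_comments) < 2:
        # Mirror the prompt's fallback rule for sparse data.
        user_content = (
            f"Student: {student}\n"
            "There are fewer than two peer comments; "
            "please provide generic feedback only."
        )
    else:
        comments = "\n".join(f"- {c}" for c in peer_comments)
        user_content = (
            f"Student: {student}\n"
            f"Peer comments:\n{comments}\n"
            f"Self-assessment: {self_assessment}"
        )
    return [
        {"role": "system", "content": PROMPT},
        {"role": "user", "content": user_content},
    ]

msgs = build_messages(
    "Jordan",
    ["Communicates clearly in meetings.", "Sometimes misses deadlines."],
    "I contributed well but struggled with time management.",
)
# msgs is now ready to send to any chat-completion endpoint.
```

Batching this per student keeps each review grounded in only that student's comments, which also makes it easy to spot-check the model's summaries before sharing them.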
🫥 AIs That Can Read Your Students’ Emotions
Google wants its AI bots to read emotions. Critics point out the dangers of misclassifying user behaviors, and recent research suggests the science of “universal emotions” is culturally biased.
Learn More: https://techcrunch.com/2024/12/05/google-says-its-new-open-models-can-identify-emotions-and-that-has-experts-worried/
Bulletin items compiled by MJ Jackson and Sarah Freye with production assistance from Lisa Batchelder. Get the AI-Commons Bulletin on our Microsoft Teams channel, at aicommons.commons.msu.edu, or by email (send an email to aicommons@msu.edu with the word “subscribe”).
Posted by: Sarah Freye
Posted on: #iteachmsu
Here is a downloadable file of the Center for Teaching and Learning Innovation (CTLI) mid-semester feedback survey sample questions. You can also access the Google Doc here: https://docs.google.com/document/d/1bvWBucqNfRfc938QekLlealPf4XbIbBCg1Bz5UgwUjY/edit?usp=sharing
Please note that there are colleges and units across MSU's campus that are already offering support to their instructors in collecting formative feedback. This effort is to complement these services and make them accessible to the broader MSU instructor community. Feel free to use these questions verbatim, or tailor to best suit your course(s).
Posted by: Makena Neal
Assessing Learning
Posted on: #iteachmsu
Greetings! I'm Dustin De Felice, and tomorrow (10/22) I will be hosting an AMA on ideas for HyFlex classrooms. With the addition of cameras and microphones in classroom spaces across campus, the ability to link students online and in person has never been greater. While there are still challenges with this type of classroom, I have been exploring various formats, scheduling strategies, and modality choices with an eye toward future semesters. Please share your questions, add a thought, or even describe how your classroom looks now by commenting on this post, and I'll share what I know. Looking forward to having a conversation about HyFlex classrooms!
Posted by: Dustin De Felice
Pedagogical Design
Host: MSU Libraries
Zotero Workshop (Online)
An introduction to the free open source citation management program Zotero. In this workshop, participants will learn how to:
Download references from MSU's article databases and websites
Format citations and bibliographies in a Word document
Create groups and share references with other users
Registration for this event is required.
You will receive a link to join a Zoom meeting before the workshop. Please install the Zotero software and Zotero browser connector on your computer before the session begins. More information is available from https://libguides.lib.msu.edu/zotero/setup.
Questions or need more information? Contact the MSU Libraries Zotero training team at lib.dl.zotero@msu.edu.
To schedule a separate session for your class or research group, please contact the Zotero team at lib.dl.zotero@msu.edu.
Navigating Context
Host: CTLI
CTLI Plan-A-Thon
Join us for the CTLI Plan-a-thon! A day dedicated to preparing for a fall semester of teaching and learning. During the event you will have the opportunity to meet with CTLI Teaching Center and MSU IT consultants, work alone, collaborate on course planning or syllabus writing, and attend optional workshops. Stay for the whole day, a part of the day, or come and go as you're able. Connect with us in the ways that are most meaningful to you over warm beverages and conversation.
An optional hybrid-format breakout session is included:
Designing Your Syllabus (hybrid from 10-11 am)
Open Office hours will be available all day, focusing on pedagogical support and educational technology.
The in-person location for this session is the Center for Teaching and Learning Innovation. Please join us in the Main Library, Room W207 (Training Room 1). For directions to W207, please visit the Room Locations page.
Navigating Context
EXPIRED
Host: CTLI
Educators as Researchers: The SoTL Approach to Innovative Teaching
Curious about conducting research in your classroom as a means to improve student outcomes? Join us for an informative workshop that introduces the fundamentals of the Scholarship of Teaching and Learning (SoTL), which involves the systematic study of teaching and learning in higher education to improve student success. In this session, you'll discover how SoTL can transform your teaching and contribute to your professional growth. We'll guide you through the key steps of a SoTL inquiry, from formulating research questions to sharing your findings. Plus, you'll explore examples of impactful SoTL projects and learn about resources available to help you get started. Whether you're new to SoTL or looking to refine your approach, this session offers valuable insights into the research-based approach to improving student learning.
Upon completion of this learning experience, participants will be able to:
define SoTL and describe its core principles
explain the importance of SoTL in enhancing student learning and improving teaching practices
identify differences between SoTL and traditional research in higher education
describe how SoTL can contribute to professional development, tenure, and promotion in higher education
outline the key steps involved in a SoTL inquiry, from formulating a question to dissemination
explore examples of SoTL projects in various disciplines
identify institutional and external resources available for faculty interested in SoTL (funding, mentorship, workshops)
describe ethical considerations when conducting SoTL research, including the use of student data, informed consent, IRB, etc.
Navigating Context
EXPIRED