Posted on: #iteachmsu
PEDAGOGICAL DESIGN
Studying Team Adaptive Performance using the Board Game Pandemic Legacy
Given the COVID-19 pandemic, educators from many fields have looked to representations of pandemics to help students study topics the pandemic has accentuated. In the history of science, educators have explored inequalities in medicine, trust in experts, and responses to uncertainty. To help make these issues digestible, some educators have turned to the cooperative board game Pandemic Legacy, in which small groups work together to avert a global health crisis by managing disease. Teams play the game multiple times, but actions in one game have consequences for the next, and rules change and develop as the game progresses. The game's development introduces students to new concepts at a manageable pace while giving them new problems to solve. While the game effectively introduced students to topics in the history of science, this study sought to know whether it also promoted cognitive and interpersonal skills. It focused on team adaptive performance, which is linked to problem-solving and communication skills. Data were collected using three surveys. Variation in teams' responses was analyzed using the Median test, and the Friedman test was used to analyze each team's adaptive performance at each of the three timesteps. All teams were initially quite confident in their ability to creatively deal with unexpected events and reported that they adapted well to new tasks. As they encountered novel situations, some teams reported that their confidence decreased: they were newly aware that they did not have creative solutions to unexpected problems. Teams aware of their limitations performed better than those who maintained their initial confidence.
To access a PDF of the "Studying Team Adaptive Performance using the Board Game Pandemic Legacy" poster, click here.
Description of the Poster
Studying Team Adaptive Performance using the Board Game Pandemic Legacy
Research Goal
This study examined how team adaptive performance evolves over time. Adaptive performance is understood as a process that moves a team more effectively towards its objectives: the team must recognize deviations from expected action and readjust its actions to obtain the best outcome (Salas, Sims, and Burke 2005; Priest et al. 2002; Marques-Quinteiro et al. 2015).
While previous studies have examined team adaptive performance during single events, this study aimed to measure its evolution over time. Using a cooperative board game that changes as teams play, the study measured how well teams performed in response to three major deviations in game play that necessitated adaptation.
Research Hypothesis
Teams with higher perceived levels of adaptability will have better outcomes (the success measure) over time than teams with lower perceived levels of adaptability.
Research Methods
A total of 16 participants were divided into four teams. Each team played the cooperative board game Pandemic Legacy (Figure 1) nine times throughout the study. Each participant completed a team adaptive performance questionnaire three times during the study, once after each major disruption in the board game. The questionnaire, based on Marques-Quinteiro et al. 2015, was designed to assess perceptions of team performance. It consisted of control questions about participants’ demographics as well as ten Likert-scale team-performance questions broken down into categories assessing satisfaction, creativity, adjustability, adaptability, and positivity.
Questions to evaluate adaptability included:
Q7: We update technical and interpersonal competences as a way to better perform the tasks in which we are enrolled.
Q8: We search and develop new competences to deal with difficult situations.
Reliability analysis showed that Cronbach’s alpha for Q7 and Q8 was 0.938.
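Cronbach's alpha for a short scale like this can be computed directly from the item variances. The sketch below is illustrative only: the Likert responses are invented, not the study's data, and it assumes NumPy is available.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                         # number of items
    item_vars = items.var(axis=0, ddof=1)      # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Invented Likert responses (rows = respondents, columns = Q7, Q8)
scores = [[5, 5], [4, 4], [4, 5], [3, 3], [5, 4], [2, 2]]
alpha = cronbach_alpha(scores)  # high alpha: the two items move together
```

Values near 1 indicate that the two adaptability items are answered consistently, which is what the reported 0.938 reflects.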
Team outcomes were assessed by a success measure that evaluated each team’s number of wins (more wins = better outcome) and number of outbreaks (fewer outbreaks = better outcome).
Research Results: Success Measure
The success measure results of number of wins are displayed in a bar chart.
The success measure results of number of outbreaks are displayed in a bar chart.
Research Results: Adaptability Measure
Differences in the median scores of teams’ responses to each question were tested using the Median test. Team 3 responded differently from at least one of the other teams to Q8 after Survey 1. Post-hoc analysis with pairwise comparison tests was conducted with a Bonferroni correction applied, revealing a statistically significant difference between Team 3 and Team 1 (p = .030) and between Team 3 and Team 2 (p = .030).
The same method revealed no significant results after Survey 2. After Survey 3, there was a significant difference between Team 4 and Team 2 (p = .049) for Q7 and between Team 1 and Team 2 (p = .049) for Q8.
A Friedman test was performed to determine whether responses to the questions changed over time. There was a statistically significant difference in Team 3’s responses to Q8 (χ²(2) = 6.500, p = .039). Post-hoc analysis with pairwise comparison tests was conducted with a Bonferroni correction applied, revealing a significant difference between Team 3’s first and third surveys for Q8.
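Both tests used here are available in SciPy; the sketch below shows the general shape of the analysis with invented Likert scores (the team data are illustrative, not the study's).

```python
from scipy.stats import median_test, friedmanchisquare

# Invented Q8 responses on Survey 1, one list per four-person team
team1 = [4, 5, 4, 5]
team2 = [5, 4, 5, 4]
team3 = [2, 2, 3, 2]
team4 = [4, 4, 3, 5]

# Median test: do the four teams' responses share a common median?
stat, p, grand_median, table = median_test(team1, team2, team3, team4)

# Friedman test: did one team's responses change across the three surveys?
s1 = [2, 2, 3, 2]  # Team 3, Survey 1
s2 = [3, 4, 3, 3]  # Team 3, Survey 2
s3 = [4, 5, 4, 5]  # Team 3, Survey 3
f_stat, f_p = friedmanchisquare(s1, s2, s3)

# A Bonferroni correction for three pairwise post-hoc comparisons
# multiplies each raw p-value by 3 (capped at 1.0).
```

With responses that climb steadily across surveys, as in this made-up Team 3 data, the Friedman test detects a significant change over time.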
Research Findings
The initial analysis suggests that teams, such as Team 3, that develop higher perceptions of their adaptability have better outcomes once those higher perceptions are achieved. Teams, such as Team 1, that begin with high perceived levels of adaptability but do not alter their approach when the success measures indicate adaptation is needed have poorer outcomes. Teams, such as Team 2, whose consistently high perceptions of adaptability correspond with the success measure, maintain good outcomes.
Analysis of the satisfaction, creativity, adjustability, and positivity data is needed to determine if these affect the success measure or adaptability over time.
Acknowledgments
Funding provided by the MSU SUTL Fellows program, a collaboration between the Lyman Briggs College and the MSU Graduate School.
References
Marques-Quinteiro, P. et al. 2015. “Measuring adaptive performance in individuals and teams.” Team Performance Management 21, 7/8: 339-60.
Priest, H.A. et al. 2002. “Understanding team adaptability: Initial theoretical and practical considerations.” Proceedings of the Human Factors and Ergonomics Society 46: 561-65.
Salas, E., D.E. Sims, and C.S. Burke. 2005. “Is there a ‘Big Five’ in Teamwork?” Small Group Research 36, 5: 555-99.
Authored by:
Melissa Charenko

Monday, May 3, 2021
Posted on: #iteachmsu
ASSESSING LEARNING
What is formative feedback? (and why we should care)
Formative feedback is information on our thinking or our performance that gives us time to reflect and act on that feedback. Feedback is descriptive, evaluative, and suggestive. That is, good feedback shows us what we are doing, provides some sense of how we are doing relative to our goals, and provides some suggestions for how we might improve. Having said this, simple descriptive feedback can be quite powerful.
Processing feedback requires reflection. There is immense value in regular reflective practice regardless of your role or responsibilities. Taking time to critically examine how our experiences align with our expectations creates opportunities for us to identify opportunities for learning. Engaging in reflection as an iterative practice creates a norm of growth and improvement.
Summative evaluations of our teaching at the conclusion of each semester play a role in our institutional accountability. We can certainly learn from end-of-semester feedback and many educators do. However, if this is the only opportunity for students to provide course feedback, it comes at a time when they themselves are past the point of benefiting from it.
Formative, mid-semester feedback, however, creates an opportunity for educators to engage learners in the process of reflective practice. Intentional reflection through mid-semester feedback can help explore the initial assumptions made about a class, gain insights from learners, and develop a more comprehensive awareness of teaching practice. Generally, because the knowledge gained through this process of reflection happens with students who have a stake in the course, this reflective practice strengthens teaching practice. As our colleagues at Vanderbilt’s Center for Teaching have noted, “soliciting mid-semester feedback can improve our end-of-course evaluations, as it will both improve the quality of the course itself and provide students with early opportunities to raise concerns with the course.”
Finally, it is essential to note that mid-semester feedback is provided by students in confidence. Survey administrators will tabulate the data and send it to you; no one else will see or have access to the information collected on your course.
Adapted from the Enhanced Digital Learning Initiative at MSU: Scott Schopieray (CAL), Stephen Thomas (Nat. Sci.), Sarah Wellman (CAL & Broad), and Jeremy Van Hof (Broad).
Source: Finlay, Linda (2008). Reflecting on ‘Reflective practice’. Practice-based Professional Learning Paper 52, The Open University.
Posted by:
Makena Neal

Wednesday, Mar 3, 2021
Posted on: #iteachmsu
ASSESSING LEARNING
Mid-semester Feedback General Process
Are you interested in engaging in reflective practice around your instruction with mid-semester feedback?
Read through the "process" section of the mid-semester feedback playlist
Use the CTLI Mid-Semester Feedback Form to customize your feedback survey. Remember, you know your course, objectives, activities, and style best. Add or remove questions so the survey makes the most sense for your course!
note: you must be logged into Google Drive with your MSU credentials to access the form and form instructions
Distribute your anonymous survey to the learners in your course, providing ample time for them to complete it
note: double-check the Google Form Instructions to ensure you're maintaining anonymity
After the survey is closed, review your data. You can use the "interpret" section of the mid-semester feedback playlist if you need help.
Make a plan of action based on the feedback you received, share the plan with your class, and get started! (The "action plan" section of the mid-semester feedback playlist can help if you need ideas.)
Additional help can be provided by the CTLI Student Feedback team, should you need a thought partner as you navigate collecting, interpreting, and adapting! CTLI offers more instruments for formative feedback and checking in with learners. Learn more about the entire student-facing survey "library"
Authored by:
Makena Neal

Monday, Oct 2, 2023
Posted on: #iteachmsu
ASSESSING LEARNING
I have mid-semester feedback data. Now what?
From the moment you present a mid-semester feedback opportunity to the learners in your course, it is imperative that you communicate your commitments to acting on the feedback. Have you ever had a peer or employer ask for your input on a project or initiative and then seem to completely ignore it? Maybe your significant other asked for your opinion on ways to tackle a challenge and then pursued an opposite approach? If you can recall a moment like this, how did it make you feel?
When you collect mid-semester feedback, you are asking your students for their input. You want to make sure they feel valued and heard, that they have a voice in your class space, and that their input isn’t being collected just “for show.” You should clearly indicate which elements of their feedback you will and will not act on (and why). We know that students who feel empowered and who see their voice reflected in class activities feel more engaged and are more likely to show positive learning outcomes.
There is a body of literature indicating that biases are real and problematic in students’ evaluation of teaching. The goal of this mid-semester instrument is not to evaluate the instructor, but to gather feedback on the learning experience. That being said, be aware that a host of factors, including (but not limited to) gender, race, subject matter, stress, and load, can lead students to make statements that imprecisely reflect the actual quality of instruction.
We recognize it can be difficult to look past the most impassioned individual feedback and consider all the data holistically, but remember that the “loudest” voice or the longest comments may not reflect the overall feelings of learners. One helpful strategy is to have someone you trust read the comments before you do, then provide you their overall impressions and filter out any inappropriate remarks.
The following is adapted from the Enhanced Digital Learning Initiative at MSU: Scott Schopieray (CAL), Stephen Thomas (Nat. Sci.), Sarah Wellman (CAL & Broad), and Jeremy Van Hof (Broad).
Additional sources: Faculty Innovation Center at University of Texas at Austin
Toshalis, Eric & Nakkula, Michael (2012). Motivation, Engagement, and Student Voice. The Student at the Center Series, Jobs For the Future.
Esarey, Justin & Valdes, Natalie (2020). Unbiased, reliable, and valid student evaluations can still be unfair. Assessment & Evaluation in Higher Education. DOI: 10.1080/02602938.2020.1724875
Posted by:
Makena Neal

Wednesday, Mar 3, 2021
Posted on: #iteachmsu
PEDAGOGICAL DESIGN
Team-Teaching Online: Successes and Challenges of the MSU Math Department
Topic Area: Online Teaching & Learning
Presented by: Andrew Krause, Tsveta Sendova
Abstract:
We are excited to share the redesigned departmental teaching structure that we implemented during pandemic-forced online teaching. Our department has realigned our teaching efforts into cohesive course-teams, in lieu of traditional independent (coordinated) teaching roles. Individual instructors are no longer responsible for specific sections; instead, each instructor has a role on a larger team that shares the instructional load. For example, 24 instructors for MTH 132: Calculus 1 worked together in a variety of roles to deliver a cohesive course to 1400 students.
This configuration has important advantages, the three most important being: flexibility, support, and adaptability.
Flexibility: With diverse roles available, each instructor can contribute their strengths -- leading online webinars, small-group tutoring, assessment design, video creation, etc.
Support: The large team can support instructors who experience challenges that disrupt their ability to teach (health, family, etc.). It is easy to substitute one or a few teaching roles, rather than an entire "teacher".
Adaptability: Having a cohesive "backbone" of the course (D2L, materials for students, etc.) makes it possible to rapidly adjust to changing scenarios, such as changing guidance on in-person meetings. It is easy to plug in additional face-to-face meetings as alternatives or enhancements to the online structure.
Authored by:
Andrew Krause, Tsveta Sendova

Wednesday, Apr 28, 2021
Posted on: #iteachmsu
ASSESSING LEARNING
Quick tips on how to interpret mid-semester feedback data.
The general sample questions provided in the "process" section of the mid-semester feedback playlist are centered around three themes. Here you can find quick tips for interpreting the data related to those themes, as well as links to other #iteachmsu articles. Remember, the sample questions were written generally and with the audience, students, in mind. If you see (or don't see) jargon that would (or wouldn't) be typical in your field or discipline, keep in mind that we attempted to frame items in ways that would make sense to survey participants.
Thanks to our colleagues from the Enhanced Digital Learning Initiative at MSU who provided the information adapted to this article: Scott Schopieray (CAL), Stephen Thomas (Nat. Sci.) Sarah Wellman (CAL & Broad), Jeremy Van Hof (Broad)!
Theme 1: Perceptions on purpose and alignment
This theme encompasses the sample questions where students indicate whether they feel prepared for class and understand expectations. Ideally, answers would trend toward “4. always.” If that is true and students voice needs in later answers, then you can explore relationships between, say, students who generally understand what is expected of them but might be confused about what assignments are asking of them (a curious relationship worth exploring with students).
Theme 1 example questions: I am prepared for class. I understand what is expected of me in preparation for class.
If responses raise concerns, consider:
Clearly re-stating your course’s learning outcomes verbally and in writing
Clearly indicating how an activity fits into the broader course structure, prepares students for the working world, or aligns with the outcomes
Ensuring that the content assessed on tests & quizzes is content that’s been previewed in prior course activities
Before any course event (lecture, activity, test, etc.), stating clearly which course objectives are addressed
As you process the data from your students, be sure to focus on trends across feedback - both celebrations of what’s working and opportunities for change. This information highlights what is working for your own planning, in addition to providing supportive rationale for using certain teaching strategies (which you should share with your class).
Other resources include...
SOIREE
Introduction to Backward Design
Writing Measurable Outcomes for Students
Theme 2: Perceptions of structure, community, and workload
This theme relates to questions that explore students’ perceptions of the class community, structure, and workload. These are powerful descriptive questions that enable you to explore a number of issues with students (and/or with your colleagues), depending on the nature of student responses.
Theme 2 example questions: I have the opportunity to ask questions. The material is interesting and engaging. Feedback is provided in a manner that helps me learn. Instructions are clear.
If responses raise concerns, consider:
Narrowing the toolset students need to use to complete required activities
Using the full suite of native tools in D2L – including the discussion board, the calendar, and the checklist
Providing opportunities for students to interact with you and each other in a no-stress, non-academic setting (perhaps via Zoom before or after class)
Re-visiting assignment and project descriptions to very clearly indicate how students use tools, seek assistance, and can contact you and/or their peers
Building in multiple points of clarification and reminders of due dates and work processes
You can also check out this from SOIREE:
Resources to Build Inclusivity and Community
Theme 3: Perceptions of learning environment
Questions in this theme indicate students' self-perceptions of their learning and the learning environment. Three of these questions are open-ended, so you want to make sure you’re recognizing the time it takes students to provide this type of feedback. An easy way to find patterns in the open-ended responses is to paste them all into a word cloud generator. Consider using this tool: https://worditout.com/word-cloud/create
Theme 3 example questions: This course's meetings and activities motivate me to learn. The way new concepts are introduced is aligned with my learning style. Overall, my learning in this course meets my expectations. What elements of class have contributed to or proved most helpful for your learning so far? What could be added or changed to reduce barriers to learning in this class so far?
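If you would rather not paste student responses into an external site, a few lines of Python can surface the same word-frequency patterns locally. The responses and the tiny stopword list below are invented for illustration; expand the stopword list for real data.

```python
import re
from collections import Counter

# Invented open-ended responses
responses = [
    "The weekly group problem sets really help me learn",
    "More worked examples before the group problem sets would help",
    "Group work is great but instructions are sometimes unclear",
]

# A minimal stopword list, for illustration only
stopwords = {"the", "a", "is", "are", "me", "but", "more",
             "would", "before", "sometimes"}

words = re.findall(r"[a-z']+", " ".join(responses).lower())
counts = Counter(w for w in words if w not in stopwords)
print(counts.most_common(5))  # the most frequently raised themes
```

The most frequent remaining words ("group", "help", "problem sets" in this toy data) point at the same themes a word cloud would make visually prominent.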
After you consider the responses to these questions in addition to the items in the themes above, you have information to adapt your plan for the remainder of the semester. Be sure to tell your students what you’re changing and why (based on what feedback). Asking for feedback without following up can suggest to students that their opinions might not matter, and harm your relationship. Instead, address opportunities for what you and they can do to make the most of the semester, share your intended plans for utilizing the feedback, and thank students for their honesty, inviting them to continue working with you to improve the course.
You can also consider checking out these additional resources from SOIREE:
Student to Instructor interactions & engagement
Student to student interactions & engagement
Thanks to our colleagues from the Enhanced Digital Learning Initiative at MSU who provided the information adapted for this article: Scott Schopieray (CAL), Stephen Thomas (Nat. Sci.), Sarah Wellman (CAL & Broad), and Jeremy Van Hof (Broad)!
Theme 1: Perceptions on purpose and alignment
This theme encompasses the sample questions where students indicate whether they feel prepared for class and understand expectations. Ideally, answers would trend toward "4. Always." If that is true and students voice needs in later answers, you can explore the relationship between, say, students who generally understand what is expected of them but are confused about what specific assignments ask of them (a curious relationship worth exploring with students). Theme 1 example questions: I am prepared for class. I understand what is expected of me in preparation for class.
If responses raise concerns, consider:
Clearly re-stating your course’s learning outcomes verbally and in writing
Clearly indicating how an activity fits into the broader course structure, prepares students for the working world, or aligns with the outcomes
Ensuring that the content assessed on tests & quizzes is content that’s been previewed in prior course activities
Before any course event (lecture, activity, test, etc.), stating clearly which course objectives are addressed
As you process the data from your students, be sure to focus on trends across feedback - both celebrations of what's working and opportunities for change. This information highlights what is working for your own planning, in addition to providing supportive rationale for using certain teaching strategies (which you should share with your class).
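One quick way to spot trends in the scaled items is to tally a mean and median for each question. This minimal Python sketch assumes you have exported responses on the 1-4 scale; the questions and scores shown here are hypothetical placeholders:

```python
from statistics import mean, median

# Hypothetical scaled responses (1 = never ... 4 = always) for two sample questions
responses = {
    "I am prepared for class.": [4, 3, 4, 2, 4, 3],
    "I understand what is expected of me in preparation for class.": [2, 2, 3, 1, 2, 3],
}

for question, scores in responses.items():
    # A low median flags a question worth discussing with the class
    print(f"{question}  mean={mean(scores):.2f}  median={median(scores)}")
```

A question whose median sits at or below 2 is a good candidate for one of the follow-up strategies listed below.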
Other resources include...
SOIREE
Introduction to Backward Design
Writing Measurable Outcomes for Students
Theme 2: Perceptions of structure, community, and workload
This theme relates to questions that explore students’ perceptions of the class community, structure, and workload. These are powerful descriptive questions that enable you to explore a number of issues with students (and/or with your colleagues), depending on the nature of student responses. Theme 2 example questions: I have the opportunity to ask questions. The material is interesting and engaging. Feedback is provided in a manner that helps me learn. Instructions are clear.
If responses raise concerns, consider:
Narrowing the toolset students need to use to complete required activities
Using the full suite of native tools in D2L – including the discussion board, the calendar, and the checklist
Providing opportunities for students to interact with you and each other in a no-stress, non-academic setting (perhaps via Zoom before or after class)
Revisiting assignment and project descriptions to clearly indicate how students should use tools, seek assistance, and contact you and/or their peers
Building in multiple points of clarification and reminders of due dates and work processes
You can also check out this resource from SOIREE:
Resources to Build Inclusivity and Community
Theme 3: Perceptions of learning environment
Questions in this theme indicate students' self-perception of their learning and the learning environment. Three of these questions are open-ended, so be sure to recognize the time it takes students to provide this type of feedback. An easy way to find patterns in the open-ended responses is to paste them all into a word cloud generator. Consider using this tool: https://worditout.com/word-cloud/create Theme 3 example questions: This course's meetings and activities motivate me to learn. The way new concepts are introduced is aligned with my learning style. Overall, my learning in this course meets my expectations. What elements of class have contributed to or proved most helpful for your learning so far? What could be added or changed to reduce barriers to learning in this class so far?
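If you would rather tally word frequencies locally than paste responses into an online generator, a short Python sketch does the same pattern-finding. The responses and stopword list below are hypothetical placeholders:

```python
import re
from collections import Counter

# Hypothetical open-ended responses pasted from the survey export
responses = [
    "The group activities helped me learn the material.",
    "More examples in lecture would help; the activities are great.",
    "Group work and clear examples helped the most.",
]

# Drop common filler words so the tally surfaces substantive terms
stopwords = {"the", "a", "and", "in", "me", "are", "would", "more", "most"}
words = Counter(
    w for r in responses
    for w in re.findall(r"[a-z']+", r.lower())
    if w not in stopwords
)
print(words.most_common(5))
```

The most frequent terms play the same role as the largest words in a word cloud: recurring themes worth raising with the class.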
After you consider the responses to these questions in addition to the items in the themes above, you have information to adapt your plan for the remainder of the semester. Be sure to tell your students what you’re changing and why (based on what feedback). Asking for feedback without following up can suggest to students that their opinions might not matter, and harm your relationship. Instead, address opportunities for what you and they can do to make the most of the semester, share your intended plans for utilizing the feedback, and thank students for their honesty, inviting them to continue working with you to improve the course.
You can also consider checking out these additional resources from SOIREE:
Student to Instructor interactions & engagement
Student to student interactions & engagement
Posted by:
Makena Neal

ASSESSING LEARNING
Thursday, Oct 14, 2021
Posted on: #iteachmsu
ASSESSING LEARNING
Foundations of the Example Mid-Semester Feedback Questions
Foundations of the Hub's Mid-Semester Feedback Instrument:
Generally, mid-semester feedback is formative and focuses on three basic questions:
1. What would students like to see more of?
2. What would students like to see less of?
3. What would students like to see done differently?
The sample questions provided can be used to build an instrument for students at any point, although mid-semester is ideal because students will have had enough experience to share feedback and there is still time to make changes to the course, if necessary. There are colleagues across the university who already incorporate mid-semester feedback into their educator practice, or who have support from their unit to do this work. The Center for Teaching and Learning Innovation (formerly the Hub for Learning and Technology) is offering these resources to complement the great work that is already happening and to provide mid-semester feedback support broadly.
We encourage you to build an instrument that is short and includes both scaled and open-ended questions. The intention is to gain insight into the student experience as it relates to the structure of the course, not specifically to the instructor.
Mid-semester feedback instruments tend to be generic, but you have the opportunity to use these sample questions to construct an instrument that is helpful to you and tailored to your course(s). We have drawn from the work of colleagues at Princeton, Vanderbilt, Brown, Kansas, Yale, North Carolina, and MSU's Broad College of Business to build this list of sample questions. We thank them.
Posted by:
Makena Neal

ASSESSING LEARNING
Tuesday, Oct 18, 2022
Posted on: #iteachmsu
ASSESSING LEARNING
Preparing students for course mid-semester feedback
So you've built a mid-semester feedback instrument for your course. What's next?
Explain to students why you are collecting anonymous feedback in the middle of the semester.
Provide an overview of the process, including when it will take place, how you plan to use the feedback, and when you will share results with the class.
Share advice on how students can give constructive feedback, such as describe, evaluate, and suggest (the instrument itself enables all three).
You can share the survey in the body of a message to students (via e-mail, D2L, or other previously determined mode of course communication).
Here is some sample language you could include in a message (feel free to copy/paste or adapt):
In an effort to make sure our class is providing a valuable learning experience for you and your classmates, I’ll be sending out a “mid-semester feedback” survey. This is your opportunity to anonymously share your thoughts on what is working in class and what could be better. No identifying information is collected as a part of the survey, and the results are shared with me as a single dataset; I will not be able to identify individual students. Your feedback will help me to design and facilitate this course in a way that is meaningful for you. If there are things I could change to make the course more effective, I want to know. I’ll use this feedback to inform the remainder of the semester. Thank you in advance for your participation.
You could also choose to build in 10 minutes of time at the start of one of your synchronous course sessions (if applicable) for students to complete the survey. Tip: build this time in at the start of class to avoid feedback being based solely on that day’s activities.
Always be sure to thank your students for participating in the process of improving the class and remember course feedback should always be anonymous!
Posted by:
Makena Neal

ASSESSING LEARNING
Tuesday, Oct 18, 2022