Posted on: #iteachmsu

Studying Team Adaptive Performance using the Board Game Pandemic Legacy
Given the COVID-19 pandemic, educators from many fields have looked to representations of pandemics to help students study topics the pandemic has accentuated. In the history of science, educators have explored inequalities in medicine, trust in experts, and responses to uncertainty. To make these issues digestible, some educators have turned to the cooperative board game Pandemic Legacy. Small groups work together to avert a global health crisis by managing disease. Teams play the game multiple times, but actions in one game have consequences for the next, and the rules change and develop as the game progresses. The game's development introduces students to new concepts at a manageable pace while giving them new problems to solve. While the game effectively introduced students to topics in the history of science, this study sought to determine whether it also promoted cognitive and interpersonal skills. It focused on team adaptive performance, which is linked to problem-solving and communication skills. Data were collected using three surveys. Variation in teams' responses was analyzed using the Median test; the Friedman test was used to analyze each team's adaptive performance at each of the three timesteps. All teams were initially quite confident in their ability to deal creatively with unexpected events and reported that they adapted well to new tasks. As they encountered novel situations, some teams reported that their confidence decreased: they were newly aware that they did not have creative solutions to unexpected problems. Teams aware of their limitations performed better than those who maintained their initial confidence.
Description of the Poster
Studying Team Adaptive Performance using the Board Game Pandemic Legacy
Research Goal
This study examined how team adaptive performance evolves over time. Adaptive performance is understood as a process that more effectively moves a team towards its objectives: the team must recognize deviations from expected action and readjust its actions to obtain the best outcome (Salas, Sims, and Burke 2005; Priest et al. 2002; Marques-Quinteiro et al. 2015).
While previous studies have examined team adaptive performance in singular events, this study aimed to measure the evolution of team adaptive performance over time. Using a cooperative boardgame that changes as teams play, the study measured how well teams performed in response to three major deviations in game play that necessitated adaptation.
Research Hypothesis
Teams with higher perceived levels of adaptability will have better outcomes (the success measure) over time than teams with lower perceived levels of adaptability.
Research Methods
A total of 16 participants were divided into four teams. Each team played the cooperative board game Pandemic Legacy (Figure 1) nine times throughout the study. Each participant completed a team adaptive performance questionnaire three times during the study, once after each major disruption in the board game. The questionnaire was designed to assess perceptions of team performance, based on Marques-Quinteiro et al. 2015. It consisted of control questions about participants’ demographics as well as 10 Likert-scale team performance questions broken down into categories assessing satisfaction, creativity, adjustability, adaptability, and positivity.
Questions to evaluate adaptability included:
Q7: We update technical and interpersonal competences as a way to better perform the tasks in which we are enrolled.
Q8: We search and develop new competences to deal with difficult situations.
Reliability analysis showed that Cronbach's alpha for Q7 and Q8 was 0.938.
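For readers who want to reproduce this kind of reliability check, a minimal sketch of Cronbach's alpha using only the Python standard library might look like the following. The response matrix here is hypothetical, not the study's data:

```python
from statistics import variance

def cronbach_alpha(items):
    """Cronbach's alpha for a list of respondent rows (one score per item)."""
    k = len(items[0])
    item_vars = [variance(col) for col in zip(*items)]   # per-item sample variance
    total_var = variance([sum(row) for row in items])    # variance of summed scores
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Hypothetical Likert responses (rows = participants, columns = Q7, Q8)
scores = [(5, 5), (4, 4), (4, 5), (3, 3), (5, 4), (2, 2)]
print(round(cronbach_alpha(scores), 3))  # 0.921
```

A value near the study's 0.938 would similarly indicate that the two adaptability items measure a common construct.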
Team outcomes were assessed by a success measure that evaluated each team’s number of wins (more wins = better outcome) and number of outbreaks (fewer outbreaks = better outcome).
Research Results: Success Measure
The success measure results of number of wins are displayed in a bar chart.
The success measure results of number of outbreaks are displayed in a bar chart.
Research Results: Adaptability Measure
Differences in the median scores of teams’ responses to each question were calculated using the Median test. Team 3 responded differently than at least one of the other teams to Q8 after Survey 1. Post-hoc analysis with pairwise comparison tests was conducted with a Bonferroni correction applied, revealing a statistically significant difference between Team 3 and Team 1 (p = .030), and between Team 3 and Team 2 (p = .030).
The same method revealed no significant results after Survey 2. After Survey 3, there was a significant difference between Team 4 and Team 2 (p = .049) for Q7 and between Team 1 and Team 2 (p = .049) for Q8.
A Friedman test was performed to determine whether responses to the questions changed over time. There was a statistically significant difference in Team 3’s responses to Q8 (χ²(2) = 6.500, p = .039). Post-hoc analysis with pairwise comparison tests was conducted with a Bonferroni correction applied, revealing a significant difference between Team 3’s first and third surveys for Q8.
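The Friedman test above would normally be run with a statistics package. As a rough, self-contained illustration of what it computes, here is a sketch of the Friedman chi-square statistic in plain Python, using hypothetical survey scores (not the study's data) and omitting the tie correction that standard packages typically apply:

```python
def friedman_statistic(data):
    """Friedman chi-square for repeated measures (no tie correction).

    data: one row per team member, each row holding that member's
    Likert scores at the k repeated time points (here, three surveys).
    """
    n, k = len(data), len(data[0])
    rank_sums = [0.0] * k
    for row in data:
        # Rank this member's k scores, averaging ranks over ties
        order = sorted(range(k), key=lambda j: row[j])
        ranks = [0.0] * k
        i = 0
        while i < k:
            j = i
            while j + 1 < k and row[order[j + 1]] == row[order[i]]:
                j += 1
            for m in range(i, j + 1):
                ranks[order[m]] = (i + j) / 2 + 1  # 1-based average rank
            i = j + 1
        for j in range(k):
            rank_sums[j] += ranks[j]
    return 12 / (n * k * (k + 1)) * sum(r * r for r in rank_sums) - 3 * n * (k + 1)

# Hypothetical Q8 scores for a four-person team at Surveys 1-3
team_scores = [(3, 4, 5), (3, 3, 5), (2, 4, 4), (3, 4, 5)]
print(round(friedman_statistic(team_scores), 3))  # 6.125
```

The statistic is then compared against the chi-square distribution with k − 1 degrees of freedom, as in the χ²(2) result reported above.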
Research Findings
The initial analysis suggests that teams, such as Team 3, that develop higher perceptions of their adaptability will have better outcomes once those higher perceptions are achieved. Teams, such as Team 1, that begin with high perceived levels of adaptability but do not alter their approach when the success measures indicate adaptation is needed will have poorer outcomes. Teams, such as Team 2, that report high perceptions of adaptability throughout, and whose reports correspond with the success measure, will maintain good outcomes.
Analysis of the satisfaction, creativity, adjustability, and positivity data is needed to determine if these affect the success measure or adaptability over time.
Acknowledgments
Funding provided by the MSU SUTL Fellows program, a collaboration between the Lyman Briggs College and the MSU Graduate School.
References
Marques-Quinteiro, P. et al. 2015. “Measuring adaptive performance in individuals and teams.” Team Performance Management 21, 7/8: 339-60.
Priest, H.A. et al. 2002. “Understanding team adaptability: Initial theoretical and practical considerations.” Proceedings of the Human Factors and Ergonomics Society 46: 561-65.
Salas, E., D.E. Sims, and C.S. Burke. 2005. “Is there a ‘Big Five’ in Teamwork?” Small Group Research 36, 5: 555-99.
Authored by: Melissa Charenko
Pedagogical Design

What is formative feedback? (and why we should care)
Formative feedback is information on our thinking or our performance that gives us time to reflect and act on that feedback. Feedback is descriptive, evaluative, and suggestive. That is, good feedback shows us what we are doing, provides some sense of how we are doing relative to our goals, and provides some suggestions for how we might improve. Having said this, simple descriptive feedback can be quite powerful.
Processing feedback requires reflection. There is immense value in regular reflective practice regardless of your role or responsibilities. Taking time to critically examine how our experiences align with our expectations helps us identify opportunities for learning. Engaging in reflection as an iterative practice creates a norm of growth and improvement.
Summative evaluations of our teaching at the conclusion of each semester play a role in our institutional accountability. We can certainly learn from end-of-semester feedback and many educators do. However, if this is the only opportunity for students to provide course feedback, it comes at a time when they themselves are past the point of benefiting from it.
Formative, mid-semester feedback, however, creates an opportunity for educators to engage learners in the process of reflective practice. Intentional reflection through mid-semester feedback can help educators examine the initial assumptions they made about a class, gain insights from learners, and develop a more comprehensive awareness of their teaching practice. Because the knowledge gained through this process of reflection happens with students who have a stake in the course, it strengthens teaching practice. As our colleagues at Vanderbilt’s Center for Teaching have noted, “soliciting mid-semester feedback can improve our end-of-course evaluations, as it will both improve the quality of the course itself and provide students with early opportunities to raise concerns with the course.”
Finally, it is essential to note that mid-semester feedback is provided by students in confidence. Survey administrators will tabulate the data and send it to you; no one else will see or have access to the information collected on your course.
Adapted from the Enhanced Digital Learning Initiative at MSU: Scott Schopieray (CAL), Stephen Thomas (Nat. Sci.), Sarah Wellman (CAL & Broad), Jeremy Van Hof (Broad).
Source: Finlay, Linda (2008). Reflecting on ‘Reflective practice’. Practice-based Professional Learning Paper 52, The Open University.
Posted by: Makena Neal
Assessing Learning

Mid-semester Feedback General Process
Are you interested in engaging in reflective practice around your instruction with mid-semester feedback?
Read through the "process" section of the mid-semester feedback playlist
Use the CTLI Mid-Semester Feedback Form to customize your feedback survey. Remember, you know your course, objectives, activities, and style best. Add/remove questions that make the most sense for your course!
note: you must be logged into Google Drive with your MSU credentials to access the form and form instructions
Distribute your anonymous survey to the learners in your course, providing ample time for them to complete it
note: double check the Google Form Instructions to ensure you're maintaining anonymity
After the survey is closed, review your data. You can use the "interpret" section of the mid-semester feedback playlist if you need help.
Make a plan of action based on the feedback you received, share the plan with your class, and get started! (The "action plan" section of the mid-semester feedback playlist can help if you need ideas.)
Additional help can be provided by the CTLI Student Feedback team, should you need a thought partner as you navigate collecting, interpreting, and adapting! CTLI offers more instruments for formative feedback and checking in with learners. Learn more about the entire student-facing survey "library"
Authored by: Makena Neal
Assessing Learning

I have mid-semester feedback data. Now what?
From the moment you present a mid-semester feedback opportunity to the learners in your course, it is imperative that you communicate your commitments to acting on the feedback. Have you ever had a peer or employer ask for your input on a project or initiative and then seem to completely ignore it? Maybe your significant other asked for your opinion on ways to tackle a challenge and then pursued an opposite approach? If you can recall a moment like this, how did it make you feel?
When you collect mid-semester feedback, you are asking your students for feedback. You want to make sure they feel valued and heard, that they have a voice in your class space, and that their input isn’t being collected just “for show.” You should clearly indicate which elements of their feedback you will and will not act on (and why). We know that students who feel empowered and who see their voice reflected in class activities feel more engaged and are more likely to show positive learning outcomes.
There is a body of literature indicating that biases are real and problematic in students’ evaluations of teaching. The goal of this mid-semester instrument is not to evaluate the instructor; instead, it focuses on feedback about the learning experience. That being said, be aware that a host of factors, including (but not limited to) gender, race, subject matter, stress, and workload, can lead students to make statements that imprecisely reflect the actual quality of instruction.
We recognize it can be difficult to look past the most impassioned individual feedback and consider all the data holistically, but remember that the “loudest” voice or the longest comments may not reflect the overall feelings of learners. One helpful strategy is to have someone you trust read the comments before you do, then provide you their overall impressions and filter out any inappropriate remarks.
The following is adapted from the Enhanced Digital Learning Initiative at MSU: Scott Schopieray (CAL), Stephen Thomas (Nat. Sci.), Sarah Wellman (CAL & Broad), Jeremy Van Hof (Broad).
Additional sources: Faculty Innovation Center at University of Texas at Austin
Toshalis, Eric & Nakkula, Michael (2012). Motivation, Engagement, and Student Voice. The Student at the Center Series, Jobs For the Future.
Justin Esarey & Natalie Valdes (2020) Unbiased, reliable, and valid student evaluations can still be unfair, Assessment & Evaluation in Higher Education, DOI: 10.1080/02602938.2020.1724875
Posted by: Makena Neal
Assessing Learning

Team-Teaching Online: Successes and Challenges of the MSU Math Department
Topic Area: Online Teaching & Learning
Presented by: Andrew Krause, Tsveta Sendova
Abstract:
We are excited to share the redesigned departmental teaching structure that we implemented during pandemic-forced online teaching. Our department has realigned our teaching efforts into cohesive course-teams, in lieu of traditional independent (coordinated) teaching roles. No longer are individual instructors responsible for specific sections, but instead instructors have a role on a larger team that shares the instructional load. For example, 24 instructors for MTH 132: Calculus 1 worked together in a variety of roles to deliver a cohesive course to 1400 students.
This configuration has important advantages, the three most important being: flexibility, support, and adaptability.
Flexibility: With diverse roles available, each instructor can contribute their strengths -- leading online webinars, small group tutoring, assessment design, video creation, etc.
Support: The large team can support instructors who experience challenges that disrupt their ability to teach (health, family, etc.). It is easy to substitute one or a few teaching roles, rather than an entire "teacher".
Adaptability: Having a cohesive "backbone" of the course (D2L, materials for students, etc.) makes it possible to rapidly adjust to changing scenarios, such as changing guidance on in-person meetings. It is easy to plug in additional face-to-face meetings as alternatives or enhancements to the online structure.
Authored by: Andrew Krause, Tsveta Sendova
Pedagogical Design

Quick tips on how to interpret mid-semester feedback data.
The general sample questions provided in the "process" section of the mid-semester feedback playlist are centered around three themes. Here you can find quick tips for interpreting the data related to those themes, as well as links to other #iteachmsu articles. Remember that the sample questions were written generally, with students as the intended audience. If you notice jargon that is (or isn't) typical in your field or discipline, keep in mind that we framed items in ways that would make sense to survey participants.
Thanks to our colleagues from the Enhanced Digital Learning Initiative at MSU who provided the information adapted to this article: Scott Schopieray (CAL), Stephen Thomas (Nat. Sci.) Sarah Wellman (CAL & Broad), Jeremy Van Hof (Broad)!
Theme 1: Perceptions on purpose and alignment
This theme encompasses the sample questions where students indicate whether they feel prepared for class and understand expectations. Ideally, answers would trend toward "4. Always." If that is true and students voice needs in later answers, then you can explore relationships between, say, students who generally understand what is expected of them but might be confused about what assignments are asking of them (a curious relationship worth exploring with students).
Theme 1 example questions: I am prepared for class. I understand what is expected of me in preparation for class.
If responses raise concerns, consider:
Clearly re-stating your course’s learning outcomes verbally and in writing
Clearly indicating how an activity fits into the broader course structure, prepares students for the working world, or aligns with the outcomes
Ensuring that the content assessed on tests & quizzes is content that’s been previewed in prior course activities
Before any course event (lecture, activity, test, etc.), state clearly which course objectives are addressed
As you process the data from your students, be sure to focus on trends across feedback - both celebrations of what’s working and opportunities for change. This information provides you with an opportunity to highlight what is working for your own planning, in addition to providing supportive rationale for using certain teaching strategies (which you should share with your class).
Other resources include...
SOIREE
Introduction to Backward Design
Writing Measurable Outcomes for Students
Theme 2: Perceptions of structure, community, and workload
This theme relates to questions that explore students’ perceptions of the class community, structure, and workload. These are powerful descriptive questions that enable you to explore a number of issues with students (and/or with your colleagues), depending on the nature of student responses.
Theme 2 example questions: I have the opportunity to ask questions. The material is interesting and engaging. Feedback is provided in a manner that helps me learn. Instructions are clear.
If responses raise concerns, consider:
Narrowing the toolset students need to use to complete required activities
Using the full suite of native tools in D2L – including the discussion board, the calendar, and the checklist
Providing opportunities for students to interact with you and each other in a no-stress, non-academic setting (perhaps via Zoom before or after class)
Re-visiting assignment and project descriptions to very clearly indicate how students use tools, seek assistance, and can contact you and/or their peers
Building in multiple points of clarification and reminders of due dates and work processes
You can also check out this from SOIREE:
Resources to Build Inclusivity and Community
Theme 3: Perceptions of learning environment
Questions in this theme indicate students' self-perceptions of their learning and the learning environment. Three of these questions are open-ended, so you want to make sure you’re recognizing the time it takes students to provide this type of feedback. An easy way to find patterns in the open-ended responses is to paste them all into a word cloud generator, such as this tool: https://worditout.com/word-cloud/create
Theme 3 example questions: This course's meetings and activities motivate me to learn. The way new concepts are introduced is aligned with my learning style. Overall, my learning in this course meets my expectations. What elements of class have contributed to or proved most helpful for your learning so far? What could be added or changed to reduce barriers to learning in this class so far?
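If you would rather not paste student comments into an external site, a few lines of standard-library Python surface the same frequency patterns a word cloud shows. The responses and stopword list below are made up for illustration:

```python
import re
from collections import Counter

# Hypothetical open-ended responses from a mid-semester survey
responses = [
    "The weekly group problem sessions really help me learn.",
    "More examples in lecture would help before the group work.",
    "Group work is great, but instructions are sometimes unclear.",
]

# Illustrative stopword list; extend it for your own course's comments
STOPWORDS = {"the", "a", "is", "are", "in", "me", "more", "but", "would", "before"}

words = re.findall(r"[a-z']+", " ".join(responses).lower())
common = Counter(w for w in words if w not in STOPWORDS).most_common(3)
print(common)  # [('group', 3), ('help', 2), ('work', 2)]
```

Here the prominence of "group" and "work" would suggest group activities are the element students most want to talk about, which you could then read in context.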
After you consider the responses to these questions in addition to the items in the themes above, you have information to adapt your plan for the remainder of the semester. Be sure to tell your students what you’re changing and why (based on what feedback). Asking for feedback without following up can suggest to students that their opinions might not matter, and harm your relationship. Instead, address opportunities for what you and they can do to make the most of the semester, share your intended plans for utilizing the feedback, and thank students for their honesty, inviting them to continue working with you to improve the course.
You can also consider checking out these additional resources from SOIREE:
Student to Instructor interactions & engagement
Student to student interactions & engagement
Thanks to our colleagues from the Enhanced Digital Learning Initiative at MSU who provided the information adapted to this article: Scott Schopieray (CAL), Stephen Thomas (Nat. Sci.) Sarah Wellman (CAL & Broad), Jeremy Van Hof (Broad)!
Theme 1: Perceptions on purpose and alignment
This theme encompasses the sample questions where students indicate whether they feel prepared for class and understand expectations. Ideally, answers would trend toward "4. Always." If that is true and students voice needs in later answers, you can explore relationships between, say, students who generally understand what is expected of them but might be confused about what assignments are asking of them (a curious relationship worth exploring with students). Theme 1 example questions: I am prepared for class. I understand what is expected of me in preparation for class.
If responses raise concerns, consider:
Clearly re-stating your course’s learning outcomes verbally and in writing
Clearly indicating how an activity fits into the broader course structure, prepares students for the working world, or aligns with the outcomes
Ensuring that the content assessed on tests & quizzes is content that’s been previewed in prior course activities
Before any course event (lecture, activity, test, etc.), stating clearly what course objectives are addressed
As you process the data from your students, be sure to focus on trends across feedback - both celebrations of what’s working and opportunities for change. This information highlights what is working for your own planning, in addition to providing supportive rationale for using certain teaching strategies (which you should share with your class).
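One lightweight way to surface those trends in the scaled items is a quick per-question summary of medians and means. Here is a minimal sketch in plain Python; the question wording and scores below are invented for illustration, not drawn from any real survey.

```python
from statistics import median, mean

def summarize_scaled(responses):
    """Per-question median and mean for items on a 1 ("never") to 4 ("always") scale.

    `responses` maps each question's text to the list of numeric answers it received.
    """
    return {
        question: {"median": median(scores), "mean": round(mean(scores), 2)}
        for question, scores in responses.items()
    }

# Invented example data for two Theme 1-style questions.
responses = {
    "I am prepared for class.": [4, 3, 4, 2, 4],
    "I understand what is expected of me.": [2, 2, 3, 1, 2],
}
for question, stats in summarize_scaled(responses).items():
    print(f"{question} median={stats['median']} mean={stats['mean']}")
```

A median near 4 suggests an item is working for most students, while a lower median flags something worth raising with the class.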
Other resources include...
SOIREE
Introduction to Backward Design
Writing Measurable Outcomes for Students
Theme 2: Perceptions of structure, community, and workload
This theme relates to questions that explore students’ perceptions of the class community, structure, and workload. These are powerful descriptive questions that enable you to explore a number of issues with students (and/or with your colleagues), depending on the nature of student responses. Theme 2 example questions: I have the opportunity to ask questions. The material is interesting and engaging. Feedback is provided in a manner that helps me learn. Instructions are clear.
If responses raise concerns, consider:
Narrowing the toolset students need to use to complete required activities
Using the full suite of native tools in D2L – including the discussion board, the calendar, and the checklist
Providing opportunities for students to interact with you and each other in a no-stress, non-academic setting (perhaps via Zoom before or after class)
Re-visiting assignment and project descriptions to very clearly indicate how students use tools, seek assistance, and can contact you and/or their peers
Building in multiple points of clarification and reminders of due dates and work processes
You can also check out this resource from SOIREE:
Resources to Build Inclusivity and Community
Theme 3: Perceptions of learning environment
Questions in this theme indicate students' self-perception of their learning and the learning environment. Three of these questions are open-ended, so you want to make sure you’re recognizing the time it takes students to provide this type of feedback. An easy way to find patterns in the open-ended responses is to paste them all into a word cloud generator. Consider using this tool: https://worditout.com/word-cloud/create Theme 3 example questions: This course's meetings and activities motivate me to learn. The way new concepts are introduced is aligned with my learning style. Overall, my learning in this course meets my expectations. What elements of class have contributed to or proved most helpful for your learning so far? What could be added or changed to reduce barriers to learning in this class so far?
After you consider the responses to these questions in addition to the items in the themes above, you have information to adapt your plan for the remainder of the semester. Be sure to tell your students what you’re changing and why (based on what feedback). Asking for feedback without following up can suggest to students that their opinions might not matter, and harm your relationship. Instead, address opportunities for what you and they can do to make the most of the semester, share your intended plans for utilizing the feedback, and thank students for their honesty, inviting them to continue working with you to improve the course.
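As an alternative to a web-based word cloud generator, a quick word-frequency count surfaces the same patterns in open-ended responses. A minimal Python sketch follows; the stopword list and the sample responses are illustrative placeholders, so extend both to fit your course.

```python
from collections import Counter
import re

# A small stopword list; extend it with words common in your course context.
STOPWORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "is", "it",
             "i", "this", "that", "for", "so", "far", "my", "me", "are"}

def word_frequencies(responses, top_n=10):
    """Count the most common non-stopword tokens across open-ended responses."""
    words = []
    for response in responses:
        # Lowercase each response and keep only alphabetic tokens.
        words.extend(w for w in re.findall(r"[a-z']+", response.lower())
                     if w not in STOPWORDS)
    return Counter(words).most_common(top_n)

# Three invented answers to "What has helped your learning so far?"
responses = [
    "The weekly quizzes helped me review the lecture material.",
    "Group discussions and the lecture recordings helped a lot.",
    "Recordings of each lecture, so I can review at my own pace.",
]
print(word_frequencies(responses, top_n=3))
```

Here the repeated tokens ("lecture", "helped", "review") point to what students value, which is the same signal a word cloud conveys visually.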
You can also consider checking out these additional resources from SOIREE:
Student to Instructor interactions & engagement
Student to student interactions & engagement
Posted by: Makena Neal
Assessing Learning
Posted on: #iteachmsu

Foundations of the Example Mid-Semester Feedback Questions
Foundations of the Hub's Mid-Semester Feedback Instrument:
Generally, mid-semester feedback is formative and focuses on three basic questions:
1. What would students like to see more of?
2. What would students like to see less of?
3. What would students like to see done differently?
The sample questions provided can be used to build an instrument for students at any moment, although mid-semester is ideal because students will have had enough experience to share feedback and there is still time to make changes to the course, if necessary. There are colleagues across the university who already incorporate mid-semester feedback into their educator practice, or who have support from their unit to do this work. The Center for Teaching and Learning Innovation (formerly Hub for Learning and Technology) is offering resources to complement the great work that is already happening, and to provide mid-semester feedback support broadly.
We encourage you to build an instrument that is short and includes both scaled and open-ended questions. The intention is to gain insight into the student experience as it relates to the structure of the course, not specifically to the instructor.
Mid-semester feedback instruments tend to be generic, but you have the opportunity to use these sample questions to construct an instrument that is helpful to you and tailored to your course(s). We have drawn from the work of colleagues at Princeton, Vanderbilt, Brown, Kansas, Yale, North Carolina, and MSU’s Broad College of Business to build this list of sample questions. We thank them.
Posted by: Makena Neal
Assessing Learning
Posted on: #iteachmsu

Preparing students for course mid-semester feedback
So you've built a mid-semester feedback instrument for your course. What's next?
Explain to students why you are collecting anonymous feedback in the middle of the semester.
Provide an overview of the process, including when it will take place, how you plan to use the feedback, and when you will share results with the class.
Share advice on how students can give constructive feedback, such as describe, evaluate, and suggest (the instrument itself enables all three). You can share the survey in the body of a message to students (via e-mail, D2L, or other previously determined mode of course communication).
Here is some sample language you could include in a message (feel free to copy/paste or adapt):
In an effort to make sure our class is providing a valuable learning experience for you and your classmates, I’ll be sending out a “mid-semester feedback” survey. This is your opportunity to anonymously share your thoughts on what is working in class and what could be better. No identifying information is collected as a part of the survey, and the results are shared with me as a single dataset; I will not be able to identify individual students. Your feedback will help me design and facilitate this course in a way that is meaningful for you. If there are things I could change to make the course more effective, I want to know. I’ll use this feedback to inform the remainder of the semester. Thank you in advance for your participation.
You could also choose to build in 10 minutes of time at the start of one of your synchronous course sessions (if applicable) for students to complete the survey. Tip: build this time in at the start of class to avoid feedback being based solely on that day’s activities.
Always be sure to thank your students for participating in the process of improving the class and remember course feedback should always be anonymous!
Posted by: Makena Neal
Assessing Learning
Posted on: #iteachmsu
Tips for Students: Giving useful feedback
Instructors can include a link to this post or download this resource to include with their mid-semester survey introduction correspondence to students.
Posted by: Makena Neal
Assessing Learning
Posted on: GenAI & Education
AI 101: An Infographic
Check out this simple resource from aiEDU (the AI Education Project), a non-profit that creates equitable learning experiences that build foundational AI literacy. You can learn more, and find adaptable tools and activities for educators, parents, and students, at https://www.aiedu.org/
Posted by: Makena Neal
Posted on: #iteachmsu
This resource comes from the Office of Student Life Counseling and Consultation at The Ohio State University (adapted and used by permission of Dr. Joan Whitney, Director of Villanova University Counseling Center). "Dealing with the Aftermath of Tragedy in the Classroom" provides 12 actionable steps for educators to consider when coming back together with their students after a collective tragedy.
(1-page PDF)
Posted by: Makena Neal
Navigating Context
Posted on: #iteachmsu
Pre-Class Survey
It's helpful to survey your students before class begins to learn about their accessibility and/or technology needs. This contributes to students feeling welcome in your course and gives you practical information about both learners' needs and whether to follow up with specific resources. There is a template accessibility survey (titled "[COURSE#] Accessibility pre-start Survey") within the CTLI's library of surveys that you can copy and adapt to your own course; instructions on how to access and make your own version are here: https://iteach.msu.edu/iteachmsu/groups/iteachmsu/stories/2810
Posted by: Ellie Louson
Pedagogical Design
Posted on: GenAI & Education
aiEDU (the AI Education Project), a non-profit that creates equitable learning experiences that build foundational AI literacy, presents three AI Activities...
1. Will Robots Take My Job?
2. Quick, Draw!
3. Charge Your Phone... or Else
You can learn more, and find adaptable tools and activities for educators, parents, and students at https://www.aiedu.org/
Posted by: Makena Neal
Posted on: #iteachmsu
Hello Colleagues,
I’m writing to inform you that the MSU Libraries' Open Educational Resources Award Program call for applications for the academic year 2023-2024 opens today.
Now in its 5th year, the OER Award Program provides financial incentives and support to instructors interested in adopting, adapting, or creating OER as an alternative to traditional learning materials to advance our goals of affordability, access, equity, and student success.
Please visit https://libguides.lib.msu.edu/oer/award or consult the attached Call for Proposals to learn more about the application categories, eligibility, participation requirements, timelines, and criteria for evaluation. Application forms are available at https://libguides.lib.msu.edu/oer/award, and the deadline for submission is February 12, 2024.
The OER Advisory Committee will meet to review applications, and successful awardees will be notified on March 1, 2024.
Please feel free to share this information with interested colleagues.
Sincerely,
Linda
Posted by: Linda Miles
Pedagogical Design
Posted on: GenAI & Education
AI Commons Bulletin 1/22/2025
Human-curated news about generative AI for Teaching and Learning in Higher Education.
📷 AI for Photographic Course Materials
Instructors using photos in course materials can explore AI tools that extend images into panoramic or 360-degree views. Currently based on a single photo, these tools may soon evolve to include context, offering more accurate and dynamic results.
Learn More: https://people.engr.tamu.edu/nimak/Papers/PanoDreamer/index.html
👍 Policies at German Universities Generally Positive Toward AI
A content analysis of AI guidelines at 67 universities in Germany can be summed up as: use it if you wish, just be open and transparent.
Learn More: https://doi.org/10.1111/ejed.12891
💬 Word of the Day: Agentic Era
Google sees the future as agentic. To them, this means AI that can “understand more about the world around you, think multiple steps ahead, and take action on your behalf”. In other words, AI that makes decisions and adapts to its surroundings.
Learn More: https://blog.google/technology/google-deepmind/google-gemini-ai-update-december-2024/
🏫 Learning Needs in the Age of AI Are Different
The rise of Generative Artificial Intelligence (GenAI) sparks important discussions regarding learner independence and self-direction:
1. How to use AI productively for one’s learning needs
2. How to evaluate AI responses
3. How to maintain one’s own voice
Learn More: https://doi.org/10.3390/educsci14121369
Bulletin items compiled by MJ Jackson and Sarah Freye with production assistance from Lisa Batchelder. Get the AI-Commons Bulletin on our Microsoft Teams channel, at aicommons.commons.msu.edu, or by email (send an email to aicommons@msu.edu with the word “subscribe”).
Posted by: Sarah Freye
Posted on: GenAI & Education
AI Commons Bulletin 2/19/2025
🧠 AI Tools Soon to Decide How Much They Need to “Think”
Expect the answers from AI tools to generally improve over the next few months, as more of them incorporate “reasoning” into their process. These are models that can discern when a prompt is more complex and would require a multi-step reasoning process. OpenAI is starting this with ChatGPT soon.
Learn More: https://www.youtube.com/watch?v=KtwK3hBAjDY
📗 Five Generations of Intelligent Textbooks
Sosnovsky & Brusilovsky compile the literature on intelligent textbooks and organize five generations:
Engineered: AI-powered adaptive reading.
Integrated: Linked with external smart content.
Extracted: AI analyzes and structures knowledge.
Datamined: Tracks student engagement for insights.
Generated: AI creates content, questions, & chatbots
Learn More: Sosnovsky, S., Brusilovsky, P. & Lan, A. Intelligent Textbooks. Int J Artif Intell Educ (2025).
🚫 Guidance for Uses of AI Banned by EU’s AI Act
The EU regulates AI much more than the US does. When it adopted the AI Act, it banned “unacceptable risk” uses, but didn’t provide much explanation. A new report lays out examples, including manipulative, deceptive, and exploitative practices.
Learn More: https://ec.europa.eu/newsroom/dae/redirection/document/112367
⏳ Waiting 5-10 Minutes for an AI to Answer?! What?!
Deep Research is a newer function of Google’s AI, Gemini. You can ask it an extended question and it will break it down into parts, research each part (including multiple web searches), and write up a report you can download. It’s available both on the web and on Android. Additional $ required.
Learn More: https://youtu.be/IBKRyI5m_Rk
Bulletin items compiled by MJ Jackson and Sarah Freye with production assistance from Lisa Batchelder. Get the AI-Commons Bulletin on our Microsoft Teams channel, at aicommons.commons.msu.edu, or by email (send an email to aicommons@msu.edu with the word “subscribe”).
Posted by: Sarah Freye
Host: CTLI
Setting the Tone from the Start
The way a course begins is crucial for educators to establish an environment that fosters engagement, collaboration, and a sense of belonging. Join us for a one-hour hybrid workshop where Educator Developers with MSU's Center for Teaching and Learning Innovation will share actionable strategies that lay the groundwork for an engaging and inclusive course experience from day one, including syllabi, expectation setting and pedagogical transparency, checking in on learner needs throughout the term, and ways to build a sense of classroom community.
In this workshop, we'll delve into practical techniques and approaches educators can employ to create a welcoming and motivating atmosphere that resonates with learners. The content in this workshop will be primarily targeted to classroom instructors and settings, but tools and strategies are relevant for adaptation and use by any educator in any context. Whether you're a seasoned educator or just embarking on your teaching journey this academic year, "Setting the Tone from the Start" is designed to equip you with actionable insights that will make a difference in your classroom.
Upon completion of this learning experience, participants will be able to:
learn how to craft an engaging and purposeful course introduction that communicates the course's relevance, objectives, and expectations
discover techniques for fostering an inclusive and supportive learning community, understanding how to encourage peer connections and embrace diverse viewpoints
be equipped with a range of interactive strategies, including icebreakers and technology tools, to effectively engage students and cultivate an active learning environment that persists throughout the course duration.
The in-person location for this session is the Center for Teaching and Learning Innovation. Please join us in the Main Library, Room W207. For directions to W207, please visit the Room Locations page.
Navigating Context
EXPIRED