Posted on: Educator Stories
PEDAGOGICAL DESIGN
Norman Scheel's Educator Story
This week, we are featuring Norman Scheel, a Research Associate in MSU's Department of Radiology's Cognitive Imaging Research Center. Norman was recognized via iteach.msu.edu's Thank an Educator Initiative! We encourage MSU community members to nominate high-impact Spartan educators (via our Thank an Educator form) regularly!
Read more about Norman’s perspectives below. #iteachmsu's questions are bolded below, followed by their responses!
You were recognized via the Thank an Educator Initiative. In one word, what does being an educator mean to you? Share with me what this word/quality looks like in your practice? (Have your ideas on this changed over time? If so, how?)
The word would be “rewarding”. For me, teaching and learning is a two-way street: no matter which direction you are driving, it is always an investment in the future, and there is always traffic in both directions. Above all, I want to set my students up for success. As a teacher, I see myself as a conductor who helps my students achieve their personal goals and as a role model who possibly has a substantial influence on their futures. So, seeing my students excel is highly rewarding, but I am also learning so much from my students, every day, which is also immensely rewarding.
Tell me more about your educational “setting.” This can include, but is not limited to, departmental affiliations, community connections, co-instructors, and students. (AKA, where do you work?)
I am now in the final stages of my postdoc in the Radiology Department of Michigan State University and am currently applying for Assistant Professor positions. Together with Prof. David Zhu, I supervise and mentor the graduate students in our lab as well as students who rotate through it. I also mentor and advise students remotely on their bachelor’s and master’s theses at my home University of Lübeck, Germany, where I did my Ph.D. in Computer Science and Computational Neuroscience. In my research, I work across disciplines with many different universities, e.g., Vanderbilt University, the University of Texas, Johns Hopkins University, and the Max Planck Institute Tübingen, Germany, on a variety of research questions. With my collaborators at these institutions, there are always students working on joint projects, where it is natural to mutually teach skills that are important for the project’s success but also in the personal interest of the students.
What is a challenge you experience in your educator role? Any particular “solutions” or “best practices” you’ve found that help you support student success at the university despite/in the face of this?
My German Diploma in Informatics taught me the importance of multidimensional learning, or as Aristotle said, “the whole is greater than the sum of its parts”. Over the last few years, I have seen a trend of students being taught highly specific topics without relating them to a “grand scheme”. Integrating information from multiple perspectives gives cross-references to other related topics and courses. This integration facilitates the ability to abstract learned information and helps students apply it in a more holistic way, connecting it to “the bigger picture”. For clarity, the content in my lectures is presented in a way that is illustrative rather than abstract, so that students are able to grasp the content and relate it to what they have learned before. I always try to highlight cross-references as much as possible, so that students see past the boundaries of final exams.
What are practices you utilize that help you feel successful as an educator?
The most important, I think, is finding a way to communicate effectively. As my teaching is typically in a small-group or individual setting, I am able to tailor my teaching directly to the needs of my students. This helps tremendously in finding ways to communicate expectations between my students and me.
What topics or ideas about teaching and learning would you like to see discussed on the iteach.msu.edu platform? Why do you think this conversation is needed at MSU?
It would be amazing to have a central place on the platform where educators could advertise potential master’s or bachelor’s theses or rotation projects, or, vice versa, students could advertise that they are on the lookout for such projects, along with a few skills they have, to see if there might be a fit. In my time here at MSU, it has been very difficult to find mid-level academic hands, especially interdisciplinary ones. The lack of, or at least problematic, communication between different parts of the university makes local collaboration very difficult.
What are you looking forward to (or excited to be a part of) next semester?
I am excited for a few of my students to get the chance to present at scientific conferences. It is always such a rewarding experience and always such a big push for motivation and new ideas.
Don't forget to celebrate individuals you see making a difference in teaching, learning, or student success at MSU with #iteachmsu's Thank an Educator initiative. You might just see them appear in the next feature!
Posted by:
Makena Neal
Monday, Nov 7, 2022
Posted on: #iteachmsu
PEDAGOGICAL DESIGN
Studying Team Adaptive Performance using the Board Game Pandemic Legacy
Given the COVID-19 pandemic, educators from many fields have looked to representations of pandemics to help students study topics the pandemic has accentuated. In the history of science, educators have explored inequalities in medicine, trust in experts, and responses to uncertainty. To help make these issues digestible, some educators have turned to the cooperative board game Pandemic Legacy. Small groups work together to avert a global health crisis by managing disease. Teams play the game multiple times, but actions in one game have consequences for the next, and the rules change and develop as the game progresses. The game's development introduces students to new concepts at a manageable pace while giving them new problems to solve.

While the game effectively introduced students to topics in the history of science, this study asked whether it also promoted cognitive and interpersonal skills. It focused on team adaptive performance, which is linked to problem-solving and communication skills. Data were collected using three surveys. Variation in teams' responses was analyzed using the median test, and the Friedman test was used to analyze each team's adaptive performance at each of the three timesteps. All teams were initially quite confident in their ability to creatively deal with unexpected events and reported that they adapted well to new tasks. As they encountered novel situations, some teams reported that their confidence decreased: they were newly aware that they did not have creative solutions to unexpected problems. Teams aware of their limitations performed better than those that maintained their initial confidence.

To access a PDF of the "Studying Team Adaptive Performance using the Board Game Pandemic Legacy" poster, click here.

Description of the Poster
Studying Team Adaptive Performance using the Board Game Pandemic Legacy
Research Goal
This study examined how team adaptive performance evolves over time. Adaptive performance is understood as a process that more effectively moves a team towards its objectives: the team must recognize deviations from expected action and readjust its actions to obtain the best outcome (Salas, Sims, and Burke 2005; Priest et al. 2002; Marques-Quinteiro et al. 2015).
While previous studies have examined team adaptive performance in singular events, this study aimed to measure its evolution over time. Using a cooperative board game that changes as teams play, the study measured how well teams performed in response to three major deviations in game play that necessitated adaptation.
Research Hypothesis
Teams with higher perceived levels of adaptability will have better outcomes (the success measure) over time than teams with lower levels of adaptability.
Research Methods
A total of 16 participants were divided into four teams. Each team played the cooperative board game Pandemic Legacy (Figure 1) nine times throughout the study. Each participant completed a team adaptive performance questionnaire three times during the study, once after each major disruption in the board game. The questionnaire, based on Marques-Quinteiro et al. (2015), was designed to assess perceptions of team performance. It consisted of control questions about participants’ demographics as well as 10 Likert-scale team-performance questions broken down into categories assessing satisfaction, creativity, adjustability, adaptability, and positivity.
Questions to evaluate adaptability included:
Q7: We update technical and interpersonal competences as a way to better perform the tasks in which we are enrolled.
Q8: We search and develop new competences to deal with difficult situations.
Reliability analysis showed that Cronbach's alpha for Q7 and Q8 was 0.938.
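A reliability figure like the one above can be computed for any score matrix with the standard Cronbach's-alpha formula. A minimal sketch (the `scores` array is made-up Likert data for illustration, not the study's responses):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical responses from six participants to two correlated items (Q7, Q8)
scores = np.array([[5, 5], [4, 4], [3, 4], [5, 4], [2, 2], [4, 5]])
print(round(cronbach_alpha(scores), 3))  # → 0.876 for this sample
```

Alphas near the study's reported 0.938 indicate the two items measure the same underlying construct.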
Team outcomes were assessed by a success measure that evaluated each team’s number of wins (more wins = a better outcome) and number of outbreaks (fewer outbreaks = a better outcome).
Research Results: Success Measure
The success measure results of number of wins are displayed in a bar chart.
The success measure results of number of outbreaks are displayed in a bar chart.
Research Results: Adaptability Measure
Differences in the median scores of teams’ responses to each question were calculated using the median test. Team 3 responded differently than at least one of the other teams to Q8 after Survey 1. Post-hoc analysis with pairwise comparison tests was conducted with a Bonferroni correction applied, revealing a statistically significant difference between Team 3 and Team 1 (p = .030) and between Team 3 and Team 2 (p = .030).
The same method revealed no significant differences after Survey 2. After Survey 3, there was a significant difference between Team 4 and Team 2 (p = .049) for Q7 and between Team 1 and Team 2 (p = .049) for Q8.
A Friedman test was performed to determine whether responses to the questions changed over time. There was a statistically significant difference in Team 3’s responses to Q8 (χ²(2) = 6.500, p = .039). Post-hoc analysis with pairwise comparison tests was conducted with a Bonferroni correction applied, revealing a significant difference between Team 3’s first and third surveys for Q8.
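The two tests used above are both available in SciPy. A minimal sketch with made-up Likert ratings (none of these numbers come from the study; team and survey values are hypothetical):

```python
from scipy import stats

# Hypothetical ratings: one four-person team's Q8 responses on each survey.
survey1 = [5, 5, 4, 5]
survey2 = [4, 4, 4, 3]
survey3 = [3, 2, 3, 3]

# Median test (between teams): do two teams' responses to one question
# share a common median?
team_a = [5, 5, 4, 5]  # hypothetical Team A ratings
team_b = [3, 2, 3, 2]  # hypothetical Team B ratings
stat, p_median, grand_median, table = stats.median_test(team_a, team_b)

# Friedman test (within a team): did the same team's repeated ratings
# change across the three surveys?
chi2, p_friedman = stats.friedmanchisquare(survey1, survey2, survey3)
print(f"median test p = {p_median:.3f}; Friedman χ²(2) = {chi2:.3f}, p = {p_friedman:.3f}")
```

With three surveys, a significant Friedman result would then be followed up with Bonferroni-corrected pairwise comparisons, as the study describes.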
Research Findings
The initial analysis suggests that teams, such as Team 3, that develop higher perceptions of their adaptability will have better outcomes once those higher perceptions are achieved. Teams, such as Team 1, that begin with high perceived levels of adaptability but do not alter their approach when the success measure indicates adaptation is needed will have poorer outcomes. Teams, such as Team 2, whose consistently high perceptions of adaptability correspond with the success measure will maintain good outcomes.
Analysis of the satisfaction, creativity, adjustability, and positivity data is needed to determine if these affect the success measure or adaptability over time.
Acknowledgments
Funding provided by the MSU SUTL Fellows program, a collaboration between the Lyman Briggs College and the MSU Graduate School.
References
Marques-Quinteiro, P. et al. 2015. “Measuring adaptive performance in individuals and teams.” Team Performance Management 21, 7/8: 339-60.
Priest, H.A. et al. 2002. “Understanding team adaptability: Initial theoretical and practical considerations.” Proceedings of the Human Factors and Ergonomics Society 46: 561-65.
Salas, E., D.E. Sims, and C.S. Burke. 2005. “Is there a ‘Big Five’ in Teamwork?” Small Group Research 36, 5: 555-99.
Authored by:
Melissa Charenko
Monday, May 3, 2021