We found 395 results that contain "group"

Posted on: #iteachmsu
Wednesday, Sep 15, 2021
#iteachmsu Settings and Notifications
*This feature will be added in our forthcoming early fall release.* This brief tutorial will demonstrate how you can update your #iteachmsu notifications and settings. How do I update my #iteachmsu settings?

After you have logged in using your MSU NetID and password, you will see your name in the upper right-hand corner of the screen. Select 'Settings' to manage your #iteachmsu Commons notification and language settings.
Account
On the settings page you will first see the "Account" section. You can select the edit button on the right to change your preferred language and country settings.
Notifications
Below that is the "Notifications" section. At the top right of the section you can toggle all email notifications on or off. You can also manage the specific types of alerts that you receive from #iteachmsu. Under "Notifications" you can choose to turn alerts for new connections, groups, or messages on or off. Below that, you can also manage your notifications for new likes and comments on your #iteachmsu posts, articles, groups, playlists, and assessments. Click here to watch a video tutorial:
Authored by: #iteachmsu
post image
Posted on: #iteachmsu
Wednesday, Sep 2, 2020
How Video Length Affects Student Learning – The Shorter, The Better!
In-Person Lectures vs. Online Instruction
Actively engaging students in the learning process is important for both in-person lectures and for online instruction. The ways in which students engage with the instructor, their peers, and the course materials will vary based on the setting. In-person courses are often confined by the fact that instruction needs to be squeezed into a specific time period, which can leave limited time for students to perform group work or to actively think about the concepts they are learning. Alternatively, with online instruction, there is often more freedom (especially for an asynchronous course) in how you can present materials and structure the learning environment.
Currently, many instructors are faced with the challenge of adapting their in-person courses into an online format. How course materials are adapted to an online format will differ from course to course; however, a common practice shared across courses is to create lecture recordings or videos for students to watch. The format and length of these videos play an important role in the learning experience students have within a course. The ways in which students engage with one long video recording are much different from how they engage with multiple shorter videos. Below are some of the important reasons why shorter videos can enhance student learning when compared to longer videos.
 
More Opportunities for Students to Actively Engage with the Material
Decades of research on how people learn have shown that active learning (in comparison to more passive approaches, such as direct instruction or a traditional lecture) enhances student performance (Freeman et al., 2014). While "active learning" can be a nebulous phrase with different meanings, it can be broadly thought of as any activity in which a learner is metacognitively thinking about and applying knowledge to accomplish some goal or task. Providing multiple opportunities for students to engage in these types of activities can help foster a more meaningful and inclusive learning environment. This is especially important for online instruction, as students may feel isolated or have a difficult time navigating their learning within a virtual environment.
One of the biggest benefits of creating a series of shorter videos compared to creating one long video is that active learning techniques and activities can be more easily utilized and interspersed throughout a lesson. For example, if you were to record a video of a traditional lecture period, your video would be nearly an hour in length, and it would likely cover multiple important topics within that time period. Creating opportunities to actively engage students throughout an hour-long video is difficult and can result in students feeling overwhelmed.
Conversely, one of the affordances of online instruction is that lectures can be broken down into a series of smaller video lessons and activities. By having shorter videos with corresponding activities, students are going to spend more time actively thinking about and applying their understanding of concepts throughout a lesson. This in turn can promote metacognition by getting students to think about their thinking after each short video rather than at the end of a long video that covers multiple topics.
Additionally, concepts often build upon one another, and it is critical that students develop a solid foundation of prior knowledge before moving on to more complex topics. When you create multiple short videos and activities, it can be easier to get a snapshot of how students conceptualize different topics as they learn them. This information can help both you as an instructor and your students become more aware of when they are having difficulties, so that issues can be addressed before moving on to more complex topics. With longer videos, students may be confused about concepts discussed at the beginning of the video, which can then make it difficult for them to understand subsequent concepts.
Overall, chunking a longer video into multiple shorter videos is a simple technique you can use to create more meaningful learning opportunities in a virtual setting. Short videos, coupled with corresponding activities, are a powerful pedagogical approach to enhancing student learning.
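As a practical aside (not part of the original article), the sketch below shows one way to split an existing lecture recording into shorter segments, assuming ffmpeg is installed and on your PATH; the file name lecture.mp4 is a hypothetical placeholder.

```python
# Minimal sketch: split a long recording into roughly 7-minute pieces without
# re-encoding. Assumes ffmpeg is installed; "lecture.mp4" is a placeholder name.
import subprocess

subprocess.run(
    [
        "ffmpeg", "-i", "lecture.mp4",
        "-c", "copy",              # copy the audio/video streams, no re-encoding
        "-map", "0",               # keep all streams from the input file
        "-f", "segment",           # use ffmpeg's segment muxer to split the output
        "-segment_time", "420",    # target segment length of about 7 minutes
        "-reset_timestamps", "1",  # start each segment's timestamps at zero
        "lecture_part_%03d.mp4",
    ],
    check=True,
)
```

Because the streams are copied rather than re-encoded, cuts land on keyframes and segment lengths are approximate; a video editor (or re-encoding) would be needed for exact cut points or to interleave activity prompts.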
 
Reducing Cognitive Load
Another major benefit of having multiple shorter videos instead of one longer video is that it can reduce the cognitive load that students experience when engaging with the content. Learning is a process that requires the brain to adapt, develop, and ultimately form new neural connections in response to stimuli (National Academies of Sciences, 2018). If a video is long and packed with content, developing a meaningful understanding of concepts can be quite difficult. Even if the content is explained in detail (which many people think of as "good instruction"), students simply do not have enough time to process and critically think about the content they are learning. Taking in various stimuli while trying to comprehend multiple concepts can leave students feeling anxious and overwhelmed. Having time to self-reflect is one of the most important factors in promoting a deeper, more meaningful learning experience. Unfortunately, long video lectures (even when done well!) provide few opportunities for students to engage in this kind of thinking and doing.
Additionally, an unintended drawback of long videos is that the listener can be lulled into a false sense of understanding. For example, have you ever watched a live lecture or an educational video where you followed along and felt like you understood the material, but then, when you later went to apply this knowledge, realized that you had forgotten the content or did not understand it as well as you thought? Everyone has experienced this phenomenon in some form or another. As students watch long video lectures, especially lectures that have clear explanations of the content, they may get a false sense of how well they understand the material. This can result in students overestimating their ability and grasp of foundational ideas, which in turn can make future learning more difficult, as subsequent knowledge will be built upon a faulty base.
Long lecture videos are also more prone to having extraneous information or tangential discussions throughout. This additional information may cause students to shift their cognitive resources away from the core course content, resulting in a less meaningful learning experience (Mayer & Moreno, 2003). Breaking a long video into multiple shorter videos can reduce the cognitive load students may experience and it can create more opportunities for them to self-reflect on what they are learning. 
 
More Engaging for Students
Another important factor to think about is how video length affects student engagement. A study by Guo, Kim, and Rubin (2014) looked at how different forms of video production affected student engagement when watching videos. Two of their main findings were that (1) shorter videos improve student engagement, and (2) recordings of traditional lectures are less engaging than digital tablet drawing or PowerPoint slide presentations. These findings show that it is not only important to record shorter videos, but also that simply recording a traditional lecture and splicing it into smaller videos will not result in the most engaging experience for students.
When distilling a traditional lecture into a series of shorter videos, it is important to think about the pedagogical techniques you would normally use in the classroom and how these approaches might translate to an online setting. Identifying how these approaches might be adapted into a video recording can help create a more engaging experience for students in your course.
Overall, the length of lecture videos and the ways in which they are structured directly impact how students learn in a virtual setting. Recording short, interactive videos, as opposed to long lecture videos, is a powerful technique you can use to enhance student learning and engagement.
 
References
Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences, 111(23), 8410-8415.
Guo, P. J., Kim, J., & Rubin, R. (2014, March). How video production affects student engagement: An empirical study of MOOC videos. In Proceedings of the First ACM Conference on Learning @ Scale (pp. 41-50).
Mayer, R. E., & Moreno, R. (2003). Nine ways to reduce cognitive load in multimedia learning. Educational Psychologist, 38(1), 43-52.
National Academies of Sciences, Engineering, and Medicine. (2018). How people learn II: Learners, contexts, and cultures. National Academies Press.
Authored by: Christopher J. Minter
post image
Posted on: #iteachmsu
Thursday, May 6, 2021
Reimagining First-Year Writing for STEM Undergraduates as Inquiry-Based Learning in Science Studies
How can a first-year writing course help to create 21st century STEM students with foundations for interdisciplinary inquiry? Could such a curriculum engage STEM students in knowledge production in ways that help to acculturate them as collaborative, ethical, and empathetic learners? Bringing together insights from writing pedagogy, work on critical science literacy, and science studies, this roundtable is hosted by the collaborative team leading an effort to rethink the first-year writing course required of all students at Lyman Briggs College, MSU's residential college for STEM students. A major goal of the curriculum redesign is to develop science studies-inspired writing assignments that foster reflective experiential learning about the nature of science. The purpose of this approach is not only to demonstrate the value of inquiry in science studies (history, philosophy, and sociology of science) to STEM students as they pursue their careers, but to foster diverse inclusion in science by demystifying key aspects of scientific culture and its hidden curriculum for membership. Following the guidance of critical pedagogy (e.g. bell hooks), we aim to use the context of first-year writing instruction as an opportunity for critical reflection and empowerment. The roundtable describes how the instructional team designed the first-year curriculum and adapted it to teaching online during the pandemic, and shares data on lessons learned by both the instructor team and our students. We invite participants to think with us as we continue to iteratively develop and assess the curriculum. To access a PDF version of the "Reimagining First-Year Writing for STEM Undergraduates as Inquiry-Based Learning in Science Studies" poster, click here. Description of Poster:
Reimagining First-Year Writing for STEM Undergraduates as Inquiry-Based Learning in Science Studies  
Marisa Brandt, HPS Lyman Briggs College & June Oh, English 
Project Overview: Reimagining LB 133 
Lyman Briggs College aims to provide a high-quality science education to diverse students by teaching science in social, human, and global contexts. LB 133: Science & Culture fulfills the Tier 1 writing requirement for 80-85% of LBC students. Starting in F19, we implemented a new, collaboratively developed and taught cohort model of the LB 133 curriculum in order to take advantage of the opportunity to foster a community of inquiry, inclusion, and curiosity.
First year college writing and literacy courses aim to give students skills to communicate and evaluate information in their own fields and beyond. While teaching important writing skills, LB 133 focuses on developing students’ science literacy by encouraging them to enact a subject position of a socially engaged science professional in training. LB 133 was designed based on ideas of HPS. 
History, Philosophy, and Sociology (HPS) or "science studies" is an interdisciplinary field that studies science in context, often extended to include medicine, technology, and other sites of knowledge-production. LB 133 centers inquiry into relations of science and culture. One way HPS can help students succeed in STEM is by fostering inclusion. In LB 133, this occurs by demystifying scientific culture and its hidden curriculum through authentic, project-based inquiry.
Like WRAC 110, LB 133 is organized around five writing projects. Each project entails a method of inquiry into science as a social, human practice and teaches students to write first as a form of sense-making about their data. (Column 2) Then, students develop writing projects to communicate what they have learned to non-scientific audiences.
Research Questions:
1. How did their conceptions of science change?
2. Did their writing improve?
3. What did they see as the most important ideas and skills they would take from the course?
4. Did they want more HPS at LBC?
Data Collection:
1. Analysis of the beginning- and end-of-course Personal Writing assessments.
2. End-of-term survey.
3. Answers to course reflection questions.
Selected Results: See Column 3. 
Conclusions: The new model seems successful! Students reported finding 133 surprisingly enjoyable and educational, for many reasons. Many felt motivated to write about science specifically and saw communication as a valuable scientific skill. Most felt their writing improved and learned more than anticipated. Most learned and valued key HPS concepts, wanted to learn more about diversity in scientific cultures, and wanted to continue HPS education in LBC to do so.
Column 2 - Course Structure: Science & Culture

Part 1 - Cultures of Science

Assessment: Personal Writing 1: Personal Statement [STEM Ed Op-ed]. Short-form writing from a scientific subject position.
Science studies content learning goals: Reflect on evolving identity, role, and responsibilities in scientific culture.
Literacy & writing skills learning goals: Diagnostic for answering questions, supporting a claim, providing evidence, structure, and clear writing.

Assessment: Scientific Sites Portfolio. Collaborative investigation of how a local lab produces knowledge.
Science studies content learning goals: Understand scientific practice, reasoning, and communication in its diverse social, material, and cultural contexts. Demystify labs and humanize scientists.
Literacy & writing skills learning goals: Making observational field notes. Reading scientific papers. Peer review. Claim, evidence, reasoning. Writing analytical essays based on observation.

Part 2 - Science in Culture

Assessment: Unpacking a Fact Poster. Partner project assessing the validity of a public scientific claim.
Science studies content learning goals: Understand the mediation of science and how to evaluate scientific claims. Identify popular conceptions of science and contrast these with scientists' practices.
Literacy & writing skills learning goals: Following sources upstream. Comparing sources. APA citation style. Visual display of information on a poster.

Assessment: Perspectives Portfolio. Collaborative investigation of a debate concerning science in Michigan.
Science studies content learning goals: Identify and analyze how diverse stakeholders are included in and/or excluded from science. Recognize the value of diverse perspectives.
Literacy & writing skills learning goals: Find, use, and correctly cite primary and scholarly secondary sources from different stakeholder perspectives. Learn to communicate with a broader audience on an online platform.

Assessment: Personal Writing 2: Letter + PS Revision. Sharing a course takeaway with someone.
Science studies content learning goals: Reflect again on evolving identity, role, and responsibilities in scientific culture.
Literacy & writing skills learning goals: Final assessment of answering questions, supporting a claim, providing evidence, structure, and clear writing.

Weekly Formative Assessments

Assessment: Discussion Activities. Pre-meeting writing about the readings.
Science studies content learning goals: Reflect on prompted aspects of science and culture.
Literacy & writing skills learning goals: Writing as critical inquiry. Note-taking. Preparation for discussion.

Assessment: Curiosity Colloquium responses. 200 words reflecting on the weekly speaker series.
Science studies content learning goals: Exposure to college, campus, and academic guests, including diverse science professionals, who share their curiosity and career story.
Literacy & writing skills learning goals: Writing as reflection on presentations and their personal value. Some presenters share research and writing skills.
Column 3 - Results  
Results from Personal Writing 
Fall 19: The op-ed assignments largely discussed six themes. The majority of students chose to talk about the value of science in terms of its ubiquity, its problem-solving and critical thinking skills, and the way it prompts technological innovation.
Fall 21: Students largely focused on (1) the nature of science as a product of human labor and research embedded with many cultural issues, and (2) science as communication and how scientists can gain public trust (e.g., transparency, collaboration, sharing failure).
F19 & S20 Selected Survey Results 
108 students responded. The full report is available here.


92.5% reported their overall college writing skills improved somewhat or a lot.
76% reported their writing skills improved somewhat or a lot more than they expected.
89% reported planning to stay in LBC.


Selected Course Reflection Comments 
The most impactful things students reported learning at the end of the semester:
Science and Culture: Quotes: “how scientific knowledge is produced” “science is inherently social” “how different perspectives . . . impact science” “writing is integral to the scientific community as a method of sharing and documenting scientific research and discoveries” 
Writing: Quotes: “a thesis must be specific and debatable” “claim, evidence, and reasoning” “it takes a long time to perfect.” Frequently mentioned skills: Thesis, research skill (citation, finding articles and proper sources), argument (evidence), structure and organization skills, writing as a (often long and arduous) process, using a mentor text, confidence. 
What do you want to learn more about after this course? 
“How culture(s) and science coexist, and . . . how different cultures view science” 
“Gender and minority disparities in STEM” “minority groups in science and how their cultures impact how they conduct science” “different cultures in science instead of just the United States” “how to write scientific essays”  
 
Authored by: Marisa Brandt & June Oh
post image
Posted on: #iteachmsu
Monday, Apr 26, 2021
Street Teams: Team Resilience on the Street
“I want to learn. I want to help.” We regularly hear this from students. How do we design environments that empower positive failures and spark innovation? We created Street Teams, student-run collaborations. We partner with nonprofits to solve challenges in media communication. Students have real-world learning experiences while giving back to the community. To access a PDF of the "Street Teams: Team Resilience on the Street" poster, click here. Description of the Poster
STREET TEAMS: TEAM RESILIENCE ON THE STREET
solution-based learning and resilience 
Street Teams are student-run, creative collaborations. They partner with nonprofits and assist them with media projects. Teams learn while giving back to the community. 

COLLABORATION 

We are stronger together  

Teams consist of students from various majors, backgrounds and skillsets. Their collective diversity amplifies the work of the whole team.  


Strategic focus on group culture and dynamics  

First semester = team building 
Second semester = content creation 
In 2020-21, we did this all through Zoom!  


Holistic approach to solving challenges  

Projects are based on a combination of nonprofit requests and student-driven assessment. Together, they create sustainable solutions.




MULTI-LAYERED MENTORING 

Faculty mentor students 
Alumni give feedback on student work  
Nonprofit partner-related professional development opportunities 
Student leaders (Producers) mentor teammates 
Street Team Coordinator hosts weekly Producer meetings and trainings  
Teammates mentor each other 


 IMPACT 

Throughout our history: 

131 students involved*  
20 majors represented**  
37 nonprofits served  
550+ products delivered
*At least 1/3 of students return for more than one year. **Some are dual majors.




QUOTE FROM A PARTICIPANT

"(Street Teams) make me feel like I belong to a place. Thank you ... for the opportunity you give all of us to connect with the community." - Manuel Pérez Salas 
Authored by: Jeana-Dee Allen, Katie Schroeder, Jon Whiting
post image
Posted on: #iteachmsu
Monday, Apr 26, 2021
Automated analyses of written responses reveal student thinking in STEM
Formative assessments can provide crucial data to help instructors evaluate pedagogical effectiveness and address students' learning needs. The shift to online instruction and learning in the past year emphasized the need for innovative ways to administer assessments that support student learning and success. Faculty often use multiple-choice (MC) assessments due to ease of use, time, and other resource constraints. While grading these assessments can be quick, the closed-ended nature of the questions often does not align with real scientific practices and can limit the instructor's ability to evaluate the heterogeneity of student thinking. Students often have mixed understanding that includes scientific and non-scientific ideas. Open-ended or Constructed Response (CR) assessment questions, which allow students to construct scientific explanations in their own words, have the potential to reveal student thinking in a way MC questions do not. The results of such assessments can help instructors make decisions about effective pedagogical content and approaches. We present a case study of how results from administration of a CR question via a free-to-use constructed response classifier (CRC) assessment tool led to changes in classroom instruction. The question was used in an introductory biology course and focuses on genetic information flow. Results from the CRC assessment tool revealed unexpected information about student thinking, including naïve ideas. For example, a significant fraction of students initially demonstrated mixed understanding of the process of DNA replication. We will highlight how these results influenced change in pedagogy and content, and as a result improved student understanding. To access a PDF of the "Automated analyses of written responses reveal student thinking in STEM" poster, click here. Description of the Poster
Automated analyses of written responses reveal student thinking in STEM 
Jenifer N. Saldanha, Juli D. Uhl, Mark Urban-Lurain, Kevin Haudek 
Automated Analysis of Constructed Response (AACR) research group 
CREATE for STEM Institute, Michigan State University 
Email: jenifers@msu.edu 
Website: beyondmultiplechoice.org  
QR code (for website):  
 
Key highlights: 

Constructed Response (CR) questions allow students to explain scientific concepts in their own words and reveal student thinking better than multiple choice questions. 


The Constructed Response Classifier (CRC) Tool (free to use: beyondmultiplechoice.org) can be used to assess student learning gains.

In an introductory biology classroom: 

Analyses by the CRC tool revealed gaps in student understanding and non-normative ideas. 
The instructor incorporated short term pedagogical changes and recorded some positive outcomes on a summative assessment. 
Additional pedagogical changes incorporated the next semester led to even more positive outcomes related to student learning (this semester included the pivot to online instruction). 

The results from this case study highlight the effectiveness of using data from the CRC tool to address student thinking and develop targeted instructional efforts to guide students towards a better understanding of complex biological concepts.   
Constructed Response Questions as Formative Assessments 

Formative assessments allow instructors to explore nuances of student thinking and evaluate student performance.  
Student understanding often includes scientific and non-scientific ideas [1,2].  


Constructed Response (CR) questions allow students to explain scientific concepts in their own words and reveal student thinking better than multiple choice questions [3,4]. 

Constructed Response Classifier (CRC) tool 

A formative assessment tool that automatically predicts ratings of student explanations.  
This Constructed Response Classifier (CRC) tool generates a report that includes: 


categorization of student ideas from writing related to conceptual understanding. 
web diagrams depicting the frequency and co-occurrence rates of the most used ideas and relevant terms. 
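For illustration only, and not the AACR/CRC tool itself or its models, the sketch below shows the general shape of automatically categorizing short written responses with a simple text classifier; the labeled example answers are hypothetical.

```python
# Illustrative sketch: categorize short written responses with scikit-learn.
# This is NOT the CRC tool; the tiny labeled set below is made up.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical expert-labeled responses to a prompt about mutation and replication.
responses = [
    "DNA replication is not affected; the changed base is simply copied",
    "replication continues normally, only translation ends early at the stop codon",
    "the stop codon will halt DNA replication and leave a shorter DNA strand",
    "the gene will not be expressed so DNA replication stops",
]
labels = ["correct", "correct", "incorrect", "incorrect"]

# TF-IDF features feeding a logistic regression classifier.
classifier = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
classifier.fit(responses, labels)

# Predict a category for a new, unscored student response.
print(classifier.predict(["replication copies the mutation but the protein is truncated"]))
```

In practice a classifier like this would be trained on a much larger set of expert-scored responses; the point here is only the workflow of turning free-text answers into predicted categories that can feed a summary report.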

CRC Questions in the Introductory Biology Classroom: A Case Study
Students were taught about DNA replication and the central dogma of Biology. 
The question was administered as online homework, with completion credit provided. The responses collected were analyzed by the CRC tool.
CRC question: 
The following DNA sequence occurs near the middle of the coding region of a gene.  DNA   5'  A A T G A A T G G* G A G C C T G A A G G A  3'     
There is a G to A base change at the position marked with an asterisk. Consequently, a codon normally encoding an amino acid becomes a stop codon.  How will this alteration influence DNA replication? 
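For readers less familiar with the biology, here is a minimal sketch (not part of the original assessment materials) that transcribes and translates the example coding-strand sequence before and after the G-to-A change, using a deliberately partial codon table; it shows why the change creates a stop codon that cuts translation short, while DNA replication still copies the altered strand.

```python
# Minimal sketch: transcribe and translate the example sequence from the CRC
# question. The codon table is partial (only the codons that appear here).
CODONS = {"AAU": "Asn", "GAA": "Glu", "UGG": "Trp", "GAG": "Glu",
          "CCU": "Pro", "GGA": "Gly", "UGA": "STOP"}

def transcribe(coding_dna: str) -> str:
    """mRNA matches the coding strand, with U in place of T."""
    return coding_dna.replace("T", "U")

def translate(mrna: str) -> list:
    """Translate codon by codon, stopping at a stop codon."""
    peptide = []
    for i in range(0, len(mrna) - 2, 3):
        amino_acid = CODONS[mrna[i:i + 3]]
        peptide.append(amino_acid)
        if amino_acid == "STOP":
            break
    return peptide

original = "AATGAATGGGAGCCTGAAGGA"
mutant   = "AATGAATGAGAGCCTGAAGGA"  # G -> A at the marked position

print(translate(transcribe(original)))  # ['Asn', 'Glu', 'Trp', 'Glu', 'Pro', 'Glu', 'Gly']
print(translate(transcribe(mutant)))    # ['Asn', 'Glu', 'STOP'] -- translation ends early
# DNA replication, by contrast, simply copies the altered base; it is not halted,
# which is the distinction between central dogma processes the question probes.
```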

Part 1 of the CRC question is used to detect student confusion between the central dogma processes.
Related to the Vision & Change core concept 3 “Information Flow, Exchange, and Storage" [5], adapted from the Genetics Concept Assessment [6,7]. 

Insight on Instructional Efficacy from CRC Tool 
Table 1: Report score summary revealed that only a small fraction of students provided correct responses post instruction. (N = 48 students). 




Student responses (Spring 2019):
Incorrect: 45%
Incomplete/Irrelevant: 32%
Correct: 23%
Sample incorrect responses:  
Though both incorrect, the first response below demonstrates understanding of a type of mutation and the second one uses the context of gene expression. 

“This is a nonsense mutation and will end the DNA replication process prematurely leaving a shorter DNA strand” (spellchecked) 


“It will stop the DNA replication… This mutation will cause a gene to not be expressed” 

CRC report provided: 

Response score summaries 
Web diagrams of important terms 
Term usage and association maps 

The instructor identified scientific and non-scientific ideas in student thinking.
This led to: 
Short term pedagogical changes, same semester  

During the end-of-semester material review, the instructor incorporated:


Small group discussions about the central dogma.  
Discussions about differences between DNA replication, and transcription and translation. 


Worksheets with questions on transcribing and translating sequences. 

Figure 1:
The figure depicts an improvement in student performance observed in the final summative assessment.  
Percentage of students who scored more than 95% on a related question: 
In the unit exam = 71% 
Final summative exam = 79% 
Pedagogical Changes Incorporated in the Subsequent Semester 
CR questions: 

Explain the central dogma. 


List similarities and differences between the processes involved. 
Facilitated small group discussions for students to explain their responses. 

 
Worksheets and homework:  
Transcribe and translate DNA sequences, including ones with deletions/additions.  
Students encouraged to create their own sequences for practice.  
Revisited DNA replication via clicker questions and discussions, while students were learning about transcription and translation. 
Table 2: 68% of students in the new cohort provided correct responses to the CRC question post instruction. (N = 47 students). 




Student responses (Spring 2020):
Incorrect: 19%
Incomplete/Irrelevant: 13%
Correct: 68%
Conclusions 
The results from this case study highlight the effectiveness of using data from the CRC tool to address student thinking and develop targeted instructional efforts to guide students towards a better understanding of complex biological concepts.   
Future Directions 

Use the analytic rubric feature in the CRC tool to obtain further insight into normative and non-normative student thinking. 
Use the clicker-based case study available at CourseSource about the processes in the central dogma [8]. 


Incorporate additional CRC tool questions in each course unit. 

Questions currently available in a variety of disciplines: 
Biology, Biochemistry, Chemistry, Physiology, and Statistics 
Visit our website beyondmultiplechoice.org and sign up for a free account 
References: 

1. Ha, M., Nehm, R. H., Urban-Lurain, M., & Merrill, J. E. (2011). CBE—Life Sciences Education, 10(4), 379-393.
2. Sripathi, K. N., Moscarella, R. A., et al. (2019). CBE—Life Sciences Education, 18(3), ar37.
3. Hubbard, J. K., Potts, M. A., & Couch, B. A. (2017). CBE—Life Sciences Education, 16(2), ar26.
4. Birenbaum, M., & Tatsuoka, K. K. (1987). Applied Psychological Measurement, 11(4), 385-395.
5. "Vision and change in undergraduate biology education: A call to action." American Association for the Advancement of Science, Washington, DC (2011).
6. Smith, M. K., Wood, W. B., & Knight, J. K. (2008). CBE—Life Sciences Education, 7(4), 422-430.
7. Prevost, L. B., Smith, M. K., & Knight, J. K. (2016). CBE—Life Sciences Education, 15(4), ar65.
8. Pelletreau, K. N., Andrews, T., Armstrong, N., et al. (2016). CourseSource.

Acknowledgments.  
This material is based upon work supported by the National Science Foundation (DUE grant 1323162). Any opinions, findings and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the supporting agencies. 
Authored by: Jenifer Saldanha, Juli Uhl, Mark Urban-Lurain, Kevin Haudek
post image
Posted on: Educator Stories
Wednesday, May 4, 2022
Dustin De Felice's Educator Story
This week, we are featuring Dr. Dustin De Felice, Associate Professor and Director of MSU's English Language Center. Dr. De Felice was recognized via iteach.msu.edu's Thank an Educator Initiative! We encourage MSU community members to nominate high-impact Spartan educators (via our Thank an Educator form) regularly!
Read more about Dustin’s perspectives below. #iteachmsu's questions are bolded below, followed by their responses!

You were recognized via the Thank an Educator Initiative. In one word, what does being an educator mean to you? 
One word = Language. I added my CV and resume into a WordCloud generator (https://www.jasondavies.com/wordcloud/) and both generations put that word in the center. I believe that it reflects my focus in my teaching, learning, and supporting. 



What does this word/quality look like in your practice? Have your ideas on this changed over time? If so, how?
I believe I can best assess student understanding through the use of a variety of classroom tasks and assignments that build from and into each other. I rely very heavily on projects that give students the chance to engage in conversations, observations or interactions with language learning situations and language learners. I also believe in autonomous learning and the benefits students receive from working through material at their own pace. As such, I have been offering my courses in hybrid forms with some interactions in the classroom balanced with other interactions asynchronously. Within my classroom, I shy away from lecturing for more than twenty minutes and within those twenty minute blocks, I incorporate video, audio, or other multimedia files along with practical examples that I will ask the students to complete in small groups. I like to design packets of activities that encourage my students to learn the material while demonstrating their current level of understanding at the same time.
All of this direction comes from my earliest experiences with languages and language teaching. I remember being drawn to the English language from an early age. I was fascinated by dialects and accents, and I was especially taken by comedians, rappers and great orators and their abilities to make the English language entertain, inspire and provoke. However, it didn't take me too long to realize I wasn't drawn to the English language per se, but to all languages. I began taking courses in linguistics, education, humanities and sociology to help me better understand the world-at-large. Early in my career, I started teaching English as a Second Language in Chicago, and I found the experience exhilarating. There I was midway through my B.A. and I was teaching three-hour classes every morning and every night four times a week. I worked in a rundown building with no A/C in the summer and half-working space heaters in the winter. The classes were full of immigrants from all over Latin America and most of my classes had 35-55 students in them. Of course there were no textbooks, no curriculum or even a plan for that matter, but I loved the challenge. I loved every moment of trying to help these motivated adults learn something about English, about the city and about the U.S. I think that the challenge is what keeps me going. I sincerely enjoy working with students on succeeding at whatever tasks they have in front of them, and I especially enjoy doing so when it involves language of all kinds.
Tell me more about your educational "setting." This can include, but is not limited to, departmental affiliations, community connections, co-instructors, and students. (AKA, where do you work?)
My educational setting includes the English Language Center as my primary home with multiple affiliations in or with graduate programs, undergraduate courses, service-oriented centers, and student-centered activities. I have a much smaller teaching load than I used to now that I spend most of my time in administration, but I specifically asked to maintain a teaching load because of how much I draw from my teaching. In fact, I don’t know how I would get through each semester without having the opportunity to work alongside students and their learning. 
What is a challenge you experience in your educator role? Any particular “solutions” or “best practices” you’ve found that help you support student success at the university despite/in the face of this?
MSU is a big and sometimes confusing place. I see opportunities in my teaching as a way of making MSU a place where students can succeed. I strive to provide my students with a welcoming environment whereby their learning becomes one of many ways of helping them reach their long-term goals. Through my courses and daily interactions, I try to provide access to educational experiences that will help shape students' futures. I also strive to be someone the students are very comfortable approaching for questions and/or advice. This approach includes ensuring I am accessible and open for meetings as needed. Within my courses, I work to conscientiously provide my students with an interesting variety of tasks to help keep them curious, satisfied, and motivated.
What are practices you utilize that help you feel successful as an educator?
I am very interested in student success, so I often utilize a 2-week module schedule, which helps make tasks more manageable and less stressful than a 1-week module format. Many students have told me the additional week gives them enough time to understand readings and complete tasks without rushing, which leads to better quality submissions. I believe student success requires creativity and flexibility, so I design classes that give lots of new ways to integrate ideas into students’ lives. I intentionally design course activities and readings with a focus on practicality. I also strive to be very responsive and available to answer questions/concerns from students. Many students have told me that my timely comments and grading are very helpful to them. In my courses, I seek out extra resources based on student interest and need. That kind of searching often leads to flexibility in applying the course content to best serve the students. Because communication is a key component of the practices that help me feel successful, many students have commented on how they really enjoy the open communication between the students and me.  I hope to let everyone focus on their interests and pull out what will be useful for them in their personal and professional lives. Part of that hope includes taking the time to get to know my students’ interests. Lastly, I always have modules up ahead of time, which really helps students plan their time. 
What topics or ideas about teaching and learning would you like to see discussed on the iteach.msu.edu platform? Why do you think this conversation is needed at MSU? 
It has been a difficult few years with so many national and international events that I would like to hear more about keeping or reinvigorating the joy and passion in our teaching and learning. I often meet with students and faculty 1-to-1, and I have to say there are so many good ideas and perspectives to inspire and share.
What are you looking forward to (or excited to be a part of) next semester?
Now that my role is more administration than teaching, I look forward to learning more about what the faculty around me are doing in their classrooms. Of course, I get the pleasure of supporting their teaching, and I’m constantly amazed at the creativity I see in the faculty around me. I suppose the main reason I so enjoy learning about what the faculty are doing in their courses is because that level of creativity just brings out the best in our students. Watching our students learn, grow, and get closer to any and all their goals is just a rewarding endeavor.  



Don't forget to celebrate individuals you see making a difference in teaching, learning, or student success at MSU with #iteachmsu's Thank an Educator initiative. You might just see them appear in the next feature!
Posted by: Makena Neal
post image
Posted on: #iteachmsu
Friday, Oct 7, 2022
Finally! A Common Teaching and Learning Events Calendar!
How many times have you been on campus at MSU - using a restroom, walking by a bulletin board in a hallway, waiting for an elevator - and seen a flyer or poster for an upcoming event? "Ooo, that sounds super interesting!" You scan the printed sheet of paper for details. "Bummer! I missed it." I have been at MSU in a variety of capacities since 2008 and I cannot count the number of times this has happened to me. If I happened to walk through a building that was outside my usual route and see a program or event of interest, it usually had already passed. Once I began my work in educational development, alongside my doctoral studies in HALE, this became increasingly frustrating. I saw really cool topics, relevant across disciplines, being offered to limited groups - or even worse, being open to all MSU educators but not being promoted broadly. I was missing out, so I knew others were as well. So when I saw the #iteachmsu Commons Educator Events Calendar, I was super excited. There is now a common calendar that, just like all of the #iteachmsu Commons, is for educators by educators. Anyone with MSU credentials can log in to iteach.msu.edu and share an event on the calendar. From unit, college, or organization-sponsored programs like educator trainings and workshops, to individual initiatives like communities of practice, coworks, or meet-ups, any scheduled activity with an intended/open audience of folx who support the teaching and learning, student success, and/or outreach mission of the university can be shared here!
      
From a self-proclaimed lifelong learner, I'm really excited to have a "one stop shop" where I can find MSU personal growth and professional development activities, but as an educator at the Center for Teaching and Learning Innovation I am also thrilled about some of the ways the new #iteachmsu site functionality supports program facilitators. The "Going" button on an event details page can be linked directly to your event's registration. You can upload supporting materials or pre-activity details. There are easy ways to designate both face-to-face and virtual events. There's even a discussion thread for comments on each event! If you have events that support MSU educators, start sharing them on the #iteachmsu Events Calendar today! Article cover photo by Windows on Unsplash
Authored by: Makena Neal
post image
Posted on: #iteachmsu
Monday, May 3, 2021
Studying Team Adaptive Performance using the Board Game Pandemic Legacy
Given the COVID-19 pandemic, educators from many fields have looked to representations of pandemics to help students study topics the pandemic has accentuated. In the history of science, educators have explored inequalities in medicine, trust in experts, and responses to uncertainty. To help make these issues digestible, some educators have turned to the cooperative board game, Pandemic Legacy. Small groups work together to avert a global health crisis by managing disease. Teams play the game multiple times, but actions in one game have consequences for the next and rules change and develop as the game progresses. The game's development introduces students to new concepts at a manageable pace while giving them new problems to solve. While the game effectively introduced students to topics in the history of science, this study sought to know whether it promoted cognitive and interpersonal skills. It focused on team adaptive performance, which is linked to problem-solving and communication skills. Data was collected using three surveys. Variation in teams' responses was analyzed using the Median test. The Friedman test was used to analyze each team's adaptive performance at each of the three timesteps. All teams were initially quite confident in their ability to creatively deal with unexpected events and reported that they adapted well to new tasks. As they encountered novel situations, some teams reported that their confidence decreased. They were newly aware that they did not have creative solutions to unexpected problems. Teams aware of their limitations performed better than those who maintained their initial confidence. To access a PDF of the "Studying Team Adaptive Performance using the Board Game Pandemic Legacy" poster, click here. Description of the Poster
Studying Team Adaptive Performance using the Board Game Pandemic Legacy 
Research Goal 
This study examined how team adaptive performance evolves over time. Adaptive performance is understood as a process that more effectively moves a team towards its objectives. The team must recognize deviations from expected action and readjust actions to obtain the best outcome (Salas, Sims, Burke 2005; Priest et al. 2002; Marques-Quinteiro et al. 2015).
While previous studies have examined team adaptive performance in singular events, this study aimed to measure the evolution of team adaptive performance over time. Using a cooperative boardgame that changes as teams play, the study measured how well teams performed in response to three major deviations in game play that necessitated adaptation. 
Research Hypothesis 
Teams with higher perceived levels of adaptability will have better outcomes (the success measure) over time than teams with lower levels of adaptability  
Research Methods 
A total of 16 participants were divided into four teams. Each team played the cooperative board game, Pandemic Legacy (Figure 1), nine times throughout the study. Each participant completed a team adaptive performance questionnaire three times during the study, once after each major disruption in the board game. The questionnaire was designed to assess perceptions of team performance, based on Marques-Quinteiro et al. 2015. It consisted of control questions about participants' demographics as well as 10 Likert-scale team performance questions broken down into categories assessing satisfaction, creativity, adjustability, adaptability, and positivity.
Questions to evaluate adaptability included: 
Q7: We update technical and interpersonal competences as a way to better perform the tasks in which we are enrolled.
Q8: We search and develop new competences to deal with difficult situations.
Reliability analysis showed that Cronbach's alpha for Q7 and Q8 is 0.938.
Team outcomes were assessed by a success measure that evaluated each team's number of wins (where more wins = a better outcome) and number of outbreaks (where fewer outbreaks = a better outcome).
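As a rough, illustrative companion to the analyses named above, the sketch below runs a Median test, a Friedman test, and a two-item Cronbach's alpha calculation with SciPy and NumPy on hypothetical Likert responses, not the study's actual data.

```python
# Illustrative sketch with made-up Likert-scale data (1-5); not the study's data.
import numpy as np
from scipy import stats

# Hypothetical Q8 responses from four 4-person teams after Survey 1.
team1, team2, team3, team4 = [5, 5, 4, 5], [5, 4, 5, 5], [3, 2, 3, 3], [4, 4, 5, 4]

# Median test: do the teams' responses share a common median?
stat, p, grand_median, table = stats.median_test(team1, team2, team3, team4)
print("Median test p =", round(p, 3))

# Friedman test: did one team's Q8 responses change across the three surveys?
survey1, survey2, survey3 = [3, 2, 3, 3], [3, 3, 4, 3], [4, 4, 4, 5]
print(stats.friedmanchisquare(survey1, survey2, survey3))

# Cronbach's alpha for a two-item scale (columns = Q7, Q8; rows = participants).
items = np.array([[5, 5], [4, 5], [3, 3], [4, 4], [5, 4], [2, 3]])
k = items.shape[1]
alpha = k / (k - 1) * (1 - items.var(axis=0, ddof=1).sum() / items.sum(axis=1).var(ddof=1))
print("Cronbach's alpha =", round(alpha, 3))
```

Post-hoc pairwise comparisons with a Bonferroni correction, as reported in the results, would follow a significant omnibus test, with the significance level divided by (or the p-values multiplied by) the number of pairwise comparisons.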
Research Results: Success Measure 
The success measure results of number of wins are displayed in a bar chart. 
The success measure results of number of outbreaks are displayed in a bar chart. 
Research Results: Adaptability Measure 
Differences in the median scores of teams' responses to each question were calculated using the Median test. Team 3 responded differently than at least one of the other teams to Q8 after Survey 1. Post-hoc analysis with pairwise comparison tests was conducted with a Bonferroni correction applied, revealing a statistically significant difference between Team 3 and Team 1 (p = .030), and between Team 3 and Team 2 (p = .030).
Using the above method revealed no significant results after Survey 2. After Survey 3, there was a significant difference between Team 4 and Team 2 (p=.049) for Q7 and Team 1 and Team 2 (p=.049) for Q8. 
A Friedman test was performed to determine if responses to the questions changed over time. There was a statistically significant difference in Team 3's response to Q8 (χ2(2) = 6.500, p = .039). Post-hoc analysis with pairwise comparison tests was conducted with a Bonferroni correction applied, revealing a significant difference between Team 3's first and third survey responses for Q8.
Research Findings 
The initial analysis suggests that teams, such as Team 3, that develop higher perceptions of their adaptability will have better outcomes once the higher perceptions are achieved. Teams, such as Team 1, that begin with high perceived levels of adaptability but do not alter their approach when the success measures indicate adaptation is needed will have poorer outcomes. Teams, such as Team 2, that report high perceptions of adaptability throughout, and whose perceptions correspond with the success measure, will maintain good outcomes.
Analysis of the satisfaction, creativity, adjustability, and positivity data is needed to determine if these affect the success measure or adaptability over time. 
Acknowledgments 
Funding provided by the MSU SUTL Fellows program, a collaboration between the Lyman Briggs College and the MSU Graduate School. 
References 
Marques-Quinteiro, P. et al. 2015. “Measuring adaptive performance in individuals and teams.” Team Performance Management 21, 7/8: 339-60. 
Priest, H.A. et al. 2002. “Understanding team adaptability: Initial theoretical and practical considerations.” Proceedings of the Human Factors and Ergonomics Society 46: 561-65. 
Salas, E., Sims, D. E., & Burke, C. S. 2005. “Is there a ‘Big Five’ in Teamwork?” Small Group Research 36, 5: 555-99.
Authored by: Melissa Charenko
post image