
Posted on: #iteachmsu
Monday, Apr 26, 2021
Learning in the Time of COVID-19
In the wake of the COVID-19 pandemic, Michigan State University, like many universities, closed its on-campus offerings and hastily moved to remote learning in March 2020. In addition to moving all classes online, students were asked to leave on-campus housing if possible. As COVID-19 cases continued to increase through the summer, plans to reopen in the fall were halted and most institutions announced they would continue offering instruction through remote learning. At the start of the spring 2020 semester, we collected data from MSU students enrolled in introductory economics courses about their grade expectations and views of economics as a major. In order to understand how students responded to the disruption generated by the pandemic, we began collecting additional data about the direct effects of the pandemic on their learning environment, including changes to living arrangements, internet access, studying behavior, and general well-being. Survey data were collected at the beginning and end of the spring, summer, and fall terms of 2020. Supplementing this survey data with administrative data on demographic characteristics and actual grade outcomes, we investigate how the pandemic affects students and how students' final grades in their economics course relate to their responses to the pandemic and virtual learning. We find the effects vary with student background characteristics (including race, gender, GPA, and first-generation college status) and final grades are related to internet connectivity, stress, and anxiety. These unique data allow us to provide a descriptive analysis of students' reactions to an unprecedented disruption to their educational environment.

To access a PDF of the "Learning in the Time of COVID-19" poster, click here.
Description of the Poster 
Learning in the Time of COVID-19 
Andrea Chambers, Stacy Dickert-Conlin, Steven J. Haider, and Scott A. Imberman 
Introduction 
This study provides a snapshot of how students experienced the COVID-19 pandemic in the month following the abrupt shift to online instruction and how they adapted to remote learning over the longer term. It speaks to concerns that the coronavirus pandemic has affected students' mental well-being and academic performance. 
Research Questions 


What demographic and academic factors are associated with student responses to questions about internet access, ability to focus, feelings of anxiety, and their financial situation? 


How are students’ final grades in their economics course related to their responses to the pandemic and virtual learning? 


Methodology 

Surveyed students enrolled in introductory economics courses from one large, public research university during three semesters (Spring, Summer, and Fall) of 2020. 
Students completed surveys at the beginning and end of the semester. 
Supplemented these data with administrative data on demographic characteristics and actual grade outcomes. 
Conducted multiple regression analyses of student characteristics on student perceptions and final semester grades. 

Survey 
The Two Surveys: 

Initial Survey – General information and grade students expected to earn in the class 
Final Survey – Students’ reactions to the COVID-19 pandemic and remote learning  

Response Rate: 


Of the 6,665 eligible students, 3,445 students (52%) answered at least one of the COVID-related questions. 


COVID-Related Statements: 


My internet connectivity is sufficient to complete my economics coursework. 


My final grade in my economics course will be unaffected. 


My overall semester GPA will be unaffected. 


My time available for studying has increased. 


My ability to focus on my studies has declined. 


My anxiety about my studies has increased. 


My financial situation has worsened.  


Sample Descriptives 


Female: 47.3%, Male: 52.7% 




White: 71.5%, Black: 4.2%, Hispanic/Latinx: 4.7%, Asian: 6.6%, 2 or more Races: 2.7%, Other or Not Reported: 1.5%, International: 8.7% 


1st Year at MSU: 37.5%, 2nd year at MSU: 38.5%, 3rd Year at MSU: 16.5%, 4th Year or Later at MSU: 7.6% 


First-Generation College Student: 18.5% 


Results 
Image: A stacked bar chart showing, for each COVID-related statement, the percent of students who strongly agreed or agreed, stacked on top of the percent who strongly disagreed, disagreed, or neither agreed nor disagreed. 
Title: Figure 1. Responses to COVID-Related Questions for Spring, Summer, and Fall 2020 
Details of image: 

My internet connectivity is sufficient: 83.3% strongly agreed/agreed and 16.7% strongly disagreed/disagreed/neither agreed nor disagreed. 
My econ course final grade will be unaffected: 36.0% strongly agreed/agreed and 64.1% strongly disagreed/disagreed/neither agreed nor disagreed. 
My overall semester GPA will be unaffected: 31.2% strongly agreed/agreed and 68.8% strongly disagreed/disagreed/neither agreed nor disagreed. 
My time available for studying has increased: 46.9% strongly agreed/agreed and 53.1% strongly disagreed/disagreed/neither agreed nor disagreed. 
My ability to focus on my studies has declined: 69.0% strongly agreed/agreed and 31.0% strongly disagreed/disagreed/neither agreed nor disagreed. 
My anxiety about my studies has increased: 74.0% strongly agreed/agreed and 26.0% strongly disagreed/disagreed/neither agreed nor disagreed. 
My financial situation has worsened: 36.3% strongly agreed/agreed and 63.7% strongly disagreed/disagreed/neither agreed nor disagreed. 

Research Question 1: What demographic and academic factors are associated with student responses to questions about internet access, ability to focus, feelings of anxiety, and their financial situation? 
Empirical Strategy: regress an indicator for whether the student agrees or strongly agrees with the statement on student background characteristics. 
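A minimal sketch of the linear probability model this describes (notation assumed, since the poster's equation is not reproduced here):

```latex
Y_i = \beta_0 + X_i'\beta + \varepsilon_i
```

where $Y_i = 1$ if student $i$ agrees or strongly agrees with the statement and $X_i$ collects the background characteristics (gender, race/ethnicity, GPA, first-generation college status).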
Ability to Focus 


April 2020: 83% of students report their ability to focus on their studies has declined.  


December 2020: 61.5% of students report that their ability to focus has declined.  


During the initial reaction to the pandemic and remote instruction, we do not see statistically significant differences across student characteristics such as gender, race/ethnicity, or first-generation college status. However, in the continued response during the summer and fall semesters, female students are 9 percentage points more likely than their male peers to state that their ability to focus on their studies has declined. 


Anxiety about Studies 


Over 70% of students in the sample report an increase in anxiety about their studies in April 2020 and through Summer and Fall 2020. 




Female students are more likely than their male peers to report an increase in anxiety, by around 8 percentage points in Spring 2020 and 16 percentage points during Summer and Fall 2020.  


Financial Situation  


April 2020: 48.6% state that their financial situation has worsened. 


Worsened financial situations were reported more often by first-generation college students, women, and lower-performing students than by their respective peers.  


December 2020: 30% state their financial situation has worsened; during Summer and Fall 2020, first-generation college students remained more likely to report a worsened financial situation. 


Research Question 2: How are students’ final grades in their economics course related to their responses to the pandemic and virtual learning? 
Empirical Strategy: 
where the explanatory variables are the student's responses to the COVID-related questions along with student background characteristics, year in college, GPA, and expected grade at the start of the semester. 
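A minimal sketch of the grade regression this describes (notation assumed):

```latex
\mathrm{Grade}_i = \beta_0 + C_i'\gamma + X_i'\beta + \varepsilon_i
```

where $C_i$ stacks student $i$'s responses to the COVID-related questions and $X_i$ collects the background characteristics, year in college, GPA, and expected grade at the start of the semester.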


Internet Connectivity: Students who did not have sufficient internet connection earned lower final grades.  


COVID-Related Stress: In April 2020, students who strongly agree their ability to focus has decreased and students across all semesters who strongly agree their anxiety has increased earned lower final grades.  


Financial: Students who state their financial situation has worsened earned lower final grades in the summer and fall semesters. 


Discussion & Conclusions 


With many students in this study reporting that their ability to focus has declined and their anxiety has increased, the findings suggest women, first-generation college students, and lower-performing students may be particularly vulnerable to these feelings and experiences.  


Survey results suggest financial situations worsened for first-generation college students, which could lead to food or housing insecurity; these issues, in turn, could increase stress and anxiety, lower grades, and possibly prevent students from persisting in higher education.  


Requiring students to access instruction via online learning has showcased the need for quality internet access. 




The coronavirus pandemic has raised many questions about the future of online education; it is important to keep in mind the ways in which students are affected by such a move.
Authored by: Andrea Chambers
Posted on: #iteachmsu
Monday, Apr 26, 2021
Automated analyses of written responses reveal student thinking in STEM
Formative assessments can provide crucial data to help instructors evaluate pedagogical effectiveness and address students' learning needs. The shift to online instruction and learning in the past year emphasized the need for innovative ways to administer assessments that support student learning and success. Faculty often use multiple-choice (MC) assessments due to ease of use, time, and other resource constraints. While grading these assessments can be quick, the closed-ended nature of the questions often does not align with real scientific practices and can limit the instructor's ability to evaluate the heterogeneity of student thinking. Students often have mixed understandings that include scientific and non-scientific ideas. Open-ended or Constructed Response (CR) assessment questions, which allow students to construct scientific explanations in their own words, have the potential to reveal student thinking in a way MC questions do not. The results of such assessments can help instructors make decisions about effective pedagogical content and approaches. We present a case study of how results from administration of a CR question via a free-to-use constructed response classifier (CRC) assessment tool led to changes in classroom instruction. The question was used in an introductory biology course and focuses on genetic information flow. Results from the CRC assessment tool revealed unexpected information about student thinking, including naïve ideas. For example, a significant fraction of students initially demonstrated mixed understanding of the process of DNA replication. We will highlight how these results influenced changes in pedagogy and content and, as a result, improved student understanding.
To access a PDF of the "Automated analyses of written responses reveal student thinking in STEM" poster, click here.
Description of the Poster 
Automated analyses of written responses reveal student thinking in STEM 
Jenifer N. Saldanha, Juli D. Uhl, Mark Urban-Lurain, Kevin Haudek 
Automated Analysis of Constructed Response (AACR) research group 
CREATE for STEM Institute, Michigan State University 
Email: jenifers@msu.edu 
Website: beyondmultiplechoice.org  
QR code (for website):  
 
Key highlights: 

Constructed Response (CR) questions allow students to explain scientific concepts in their own words and reveal student thinking better than multiple choice questions. 


The Constructed Response Classifier (CRC) Tool (free to use: beyondmultiplechoice.org) can be used to assess student learning gains. 

In an introductory biology classroom: 

Analyses by the CRC tool revealed gaps in student understanding and non-normative ideas. 
The instructor incorporated short term pedagogical changes and recorded some positive outcomes on a summative assessment. 
Additional pedagogical changes incorporated the next semester led to even more positive outcomes related to student learning (this semester included the pivot to online instruction). 

The results from this case study highlight the effectiveness of using data from the CRC tool to address student thinking and develop targeted instructional efforts to guide students towards a better understanding of complex biological concepts.   
Constructed Response Questions as Formative Assessments 

Formative assessments allow instructors to explore nuances of student thinking and evaluate student performance.  
Student understanding often includes scientific and non-scientific ideas [1,2].  


Constructed Response (CR) questions allow students to explain scientific concepts in their own words and reveal student thinking better than multiple choice questions [3,4]. 

Constructed Response Classifier (CRC) tool 

A formative assessment tool that automatically predicts ratings of student explanations.  
This Constructed Response Classifier (CRC) tool generates a report that includes: 


categorization of student ideas from writing related to conceptual understanding. 
web diagrams depicting the frequency and co-occurrence rates of the most used ideas and relevant terms. 

CRC Questions in the Introductory Biology Classroom: A Case Study 
Students were taught about DNA replication and the central dogma of Biology. 
Question was administered as online homework, completion credit provided. Responses collected were analyzed by the CRC tool. 
CRC question: 
The following DNA sequence occurs near the middle of the coding region of a gene.
DNA: 5'  A A T G A A T G G*  G A G C C T G A A G G A  3'
There is a G to A base change at the position marked with an asterisk. Consequently, a codon normally encoding an amino acid becomes a stop codon. How will this alteration influence DNA replication? 

Part 1 of the CRC question was used to detect student confusion among the central dogma processes.  
Related to the Vision & Change core concept 3 “Information Flow, Exchange, and Storage" [5], adapted from the Genetics Concept Assessment [6,7]. 
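The mutation's effect can be checked directly: the G-to-A change converts the tryptophan codon TGG into the stop codon TGA, which halts translation but leaves DNA replication untouched. A minimal Python sketch of this (the codon table is truncated to the codons that occur in this sequence, and coding-strand DNA codons are used in place of mRNA for simplicity):

```python
# Codon table truncated to the codons in the example (DNA sense-strand codons).
CODONS = {"AAT": "Asn", "GAA": "Glu", "TGG": "Trp", "TGA": "STOP",
          "GAG": "Glu", "CCT": "Pro", "GGA": "Gly"}

def translate(dna):
    """Translate a coding-strand DNA string codon by codon, halting at a stop codon."""
    protein = []
    for i in range(0, len(dna) - len(dna) % 3, 3):
        aa = CODONS.get(dna[i:i + 3], "?")
        protein.append(aa)
        if aa == "STOP":
            break
    return protein

normal = "AATGAATGGGAGCCTGAAGGA"   # original sequence (spaces removed)
mutant = "AATGAATGAGAGCCTGAAGGA"   # G -> A at the asterisked position

print(translate(normal))  # ['Asn', 'Glu', 'Trp', 'Glu', 'Pro', 'Glu', 'Gly']
print(translate(mutant))  # ['Asn', 'Glu', 'STOP']
```

Both versions of the sequence still replicate normally; the point of the question is that a premature stop codon affects translation, not the replication machinery.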

Insight on Instructional Efficacy from CRC Tool 
Table 1: Report score summary revealed that only a small fraction of students provided correct responses post instruction. (N = 48 students). 




Student responses (Spring 2019): Incorrect 45%, Incomplete/Irrelevant 32%, Correct 23% 
Sample incorrect responses:  
Though both incorrect, the first response below demonstrates understanding of a type of mutation and the second one uses the context of gene expression. 

“This is a nonsense mutation and will end the DNA replication process prematurely leaving a shorter DNA strand” (spellchecked) 


“It will stop the DNA replication… This mutation will cause a gene to not be expressed” 

CRC report provided: 

Response score summaries 
Web diagrams of important terms 
Term usage and association maps 

The instructor identified scientific and non-scientific ideas in student thinking.
This led to short-term pedagogical changes in the same semester:  

During the end-of-semester material review, the instructor incorporated: 


Small group discussions about the central dogma.  
Discussions about differences between DNA replication, and transcription and translation. 


Worksheets with questions on transcribing and translating sequences. 

Figure 1: 
The figure depicts an improvement in student performance observed in the final summative assessment. 
Percentage of students who scored more than 95% on a related question: 
In the unit exam = 71% 
Final summative exam = 79% 
Pedagogical Changes Incorporated in the Subsequent Semester 
CR questions: 

Explain the central dogma. 


List similarities and differences between the processes involved. 
Facilitated small group discussions for students to explain their responses. 

 
Worksheets and homework:  
Transcribe and translate DNA sequences, including ones with deletions/additions.  
Students encouraged to create their own sequences for practice.  
Revisited DNA replication via clicker questions and discussions, while students were learning about transcription and translation. 
Table 2: 68% of students in the new cohort provided correct responses to the CRC question post instruction. (N = 47 students). 




Student responses (Spring 2020): Incorrect 19%, Incomplete/Irrelevant 13%, Correct 68% 
Conclusions 
The results from this case study highlight the effectiveness of using data from the CRC tool to address student thinking and develop targeted instructional efforts to guide students towards a better understanding of complex biological concepts.   
Future Directions 

Use the analytic rubric feature in the CRC tool to obtain further insight into normative and non-normative student thinking. 
Use the clicker-based case study available at CourseSource about the processes in the central dogma [8]. 


Incorporate additional CRC tool questions in each course unit. 

Questions currently available in a variety of disciplines: 
Biology, Biochemistry, Chemistry, Physiology, and Statistics 
Visit our website beyondmultiplechoice.org and sign up for a free account 
References: 

Ha, M., Nehm, R. H., Urban-Lurain, M., & Merrill, J. E. (2011).  CBE—Life Sciences Education, 10(4), 379-393. 


Sripathi, K. N., Moscarella, R. A., et al., (2019). CBE—Life Sciences Education, 18(3), ar37. 


Hubbard, J. K., Potts, M. A., & Couch, B. A. (2017). CBE—Life Sciences Education, 16(2), ar26. 


Birenbaum, M., & Tatsuoka, K. K. (1987). Applied Psychological Measurement, 11(4), 385-395. 


 "Vision and change in undergraduate biology education: a call to action." American Association for the Advancement of Science, Washington, DC (2011). 


Smith, M. K., Wood, W. B., & Knight, J. K. (2008). CBE—Life Sciences Education, 7(4), 422-430. 


Prevost, L. B., Smith, M. K., & Knight, J. K. (2016). CBE—Life Sciences Education, 15(4), ar65. 


Pelletreau, K. N., Andrews, T., Armstrong, N., et al., (2016). CourseSource. 

Acknowledgments.  
This material is based upon work supported by the National Science Foundation (DUE grant 1323162). Any opinions, findings and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the supporting agencies. 
Authored by: Jenifer Saldanha, Juli Uhl, Mark Urban-Lurain, Kevin Haudek
Posted on: #iteachmsu Ambassadors
Tuesday, Nov 5, 2019
Storytelling for Learning 1: Creating Meaning from Chaos
Storytelling for Learning 1: Creating Meaning from Chaos
In 1944, experimental psychologists Fritz Heider and Marianne Simmel used the video below in an experiment. They instructed their female undergraduate subjects to write down what happened in the movie.
 
I want you to do the same. Take out some paper, watch the video, and jot down a few sentences about what happened. 
 

 
What did you write down?  What was happening?
 
 
Now watch part of the video below, where some comedians talk about what they saw. You only need watch about a minute of the video to get the idea. (Warning: potentially offensive language...as you can imagine from comedians on YouTube.)
 


 
What is interesting is most people create a story. The characters are a shape. There is a setting of a room, or perhaps a house. Many people see a bullying event, or another form of conflict. 
Yet this is simply a video of shapes moving around a screen, isn't it?
 
Humans are wired to create meaning from input. That is why my aunt sees Jesus in her toast. That is why, when the photo below was taken by NASA in 1975 (yes, that is a real and unedited photo), the public FREAKED out. A face! A human face!  There is life there! They are communicating with us!
 
 

 
 
It is also why we love conspiracy theories. When random things happen, especially bad things, we want a logical explanation. Random bad luck is not an explanation that satisfies us. Thus, Elvis didn't die young. Nope. He faked his death to live in peace, away from the nuisance of fame. Now he is in hiding-- living out his years in a lovely coastal fishing village in Honduras.
 
Daydreaming is, for the most part, storytelling. It is us thinking about a possible scenario, planning something in the future and creating the "story" that surrounds it, or just fantasizing about something other than where we are at the moment. 
 
 



How many waking hours do you spend each day daydreaming?


2-3 hours
3-5 hours
5-7 hours
over 9 hours



 
 
 
 
 
 



 

 
So what is the correct answer to the above?  Scroll down. 

 
And down!
 
 
 
 
 
We spend 7.7 hours each day telling ourselves stories. That’s about half of our waking hours. And then we sleep. And tell stories in our dreams. 
Authored by: Anne Baker
Posted on: MSU Online & Remote Teaching
Monday, Oct 18, 2021
Remote Communication with Students Quick Guide
Click the image above to access a PDF of the Quick Guide.
Remote Communication With Your Students
This quick guide provides an introduction to communicating with your students as you move to remote teaching. It outlines key steps to Plan, Modify, and Implement when making this move to optimize student learning. As with any steps you take in moving to remote teaching, it’s important to anchor your decisions in course learning objectives and to be transparent, flexible, and generous with students.
Plan
Michigan State University has shifted to remote teaching, which means your course will be moving to a digital environment. Remote teaching is a way to continue instruction when face-to-face meetings are disrupted and you are not able to meet in person. When planning for remote teaching, it’s important to develop a communication plan for helping students transition to a remote environment.
Modify
It is important that you develop a communication plan for maintaining ongoing contact with your students about the course. Consider the following:

Clarify your modified expectations and course elements:

When your class will meet. Schedule any virtual sessions during the time your course already meets; this ensures that students are available.
How you will deliver content (e.g. Zoom, recorded lectures, etc.).
How students will engage with one another.
How students will be assessed moving forward.
Changes to assignments.


Tell students how they can contact you and how soon they can expect a reply from you.
Consider using the D2L announcements and discussion board tools to push out course-level communications.

 
Even if you have not yet finalized all the changes to your course, it is important to send a message to your students so they know how to reach you. To get started, here is a sample email you might send:
 
Dear [insert course name here] students,
 
I’m writing to let you know that the University is implementing a remote teaching strategy in response to the novel coronavirus. What this means for you is that we will not be meeting at our normal class location. Instead, we will meet online, through Zoom, at the same time our class normally meets. We will also be using our D2L course site to deliver and collect materials for the class. To access the course, go to https://d2l.msu.edu/. Once you log in with your NetID and password, you should see our course listed under “My Courses”.
 
Over the next few days, I will keep you informed about how our course experience will change. Know for now that we are planning to move forward with the course, and please be patient while we get things shifted for this new mode. I will be back in touch soon with more details.
 
Best,
[Insert your name]
Implement
As your initial form of communication with students, it is important to inform your class often about course changes and expectations. To send emails, you have several options:

D2L email classlist function
The Instructor Systems email tool from the Registrar’s website
Spartan Mail for individual and small group communications

Additional Help
For additional help and support, please check out the other Remote Teaching articles here, or contact the MSU IT Service Desk locally at (517) 432-6200 or toll-free at (844) 678-6200.
 
Attribution 4.0 International (CC BY 4.0)
Posted on: #iteachmsu
Thursday, May 6, 2021
Benefits of Teaching a Large Course Using a Flipped Zoom Classroom
In Fall 2020, we conducted CSE 260 (Discrete Mathematics) as a flipped class, where students were expected to watch videos before class so that they could use class time to work together to solve problems. This class covers foundational mathematics for computer science and computer engineering students. Students need a lot of practice to master the methods and concepts. Unfortunately, these problems do not provide an instant feedback mechanism similar to programming projects. A flipped class where students work together in a group, along with regular assistance by the instructional team, provides such a mechanism. We surveyed students to gather their impressions on the course. Most students liked the flipped class structure and generally preferred it to a traditional lecture format. Furthermore, students reported it helped them develop friendships, something difficult to achieve in the Covid-era.
To access a PDF of the "Benefits of Teaching a Large Course Using a Flipped Zoom Classroom" poster, click here.
Description of the Poster 
CSE 260 Flipped Class (Lessons Learned) 
Sandeep Kulkarni and Eric Torng 
 CSE 260: Discrete Mathematics

Topics Covered: 


Propositional and predicate logic 
Set Theory 
Elementary Number theory and its applications to cryptography 


Mathematical Induction 
Counting and probability 
Relations 


Role in Curriculum 


Foundational mathematics for computer science  

Analog to calculus (continuous mathematics) for engineering and natural sciences 
Why Flipped Class 

Students need lots of practice to master the methods and concepts 
Discrete math problems do not provide instant feedback to students if they do something wrong (unlike some programming errors such as a program failing to compile), so doing problems in class in groups helps students get quick feedback on any mistakes 
For Fall 2020, student groups not only improved learning, they also created a sense of community for students who participated regularly. 


80% of students responding to an end of semester survey reported they developed friendships through the homework groups 

 Flipped Class Design 

Class enrollment roughly 200 (10-20% were outside the US, several in Asia) 
Instructional Team 


2 faculty, 6 TAs/ULAs 


Online videos covered the core concepts 


Each video had an associated homework assignment that would be worked on in class by student groups 
Each video had an associated online quiz that every student was required to complete before working on the associated homework in class in groups 


Homework group composition 


20 groups, approximately 10 students per group 
Group creation started about a month before the first class 
Each student was asked to fill out a survey that asked two main things 


Do you request specific group partners? 


15% of students made such requests 


What is your self-perceived math background and ability to lead a group discussion? 
60% of students filled out the survey 


Groups were created based on these responses (group partner requests and balancing self-perceived ability) 
Groups did not change 


Homework group technical support 


Groups had a shared Google drive space for working on assignments 
Groups had predefined Zoom breakout rooms  


Some issues due to Zoom max of 200 participants for predefined breakout rooms 

First Week Activities 

The first week was focused on group work logistics and the daily structure 


We discussed group roles and group dynamics 
We had students practice their group collaboration on ungraded simple math exercises  


We had several technical issues the first week including having to move roughly 80 students rather than the anticipated 20 students to their predefined Zoom breakout rooms 

Daily Structure 

At the end of every class, each group submitted a survey to identify (1) difficulties encountered, (2) their current status in solving the homework problems, and (3) their assessment of the group collaboration. 


Before the next class, we prepared a few slides summarizing the responses in all three dimensions along with 2-3 quoted comments that best captured the current student sentiment. 
At the start of the next class, we spent roughly 20 minutes covering those slides. 
Afterwards, groups began their collaborative work in their assigned breakout rooms 
The instructional team moved through the groups to help as needed for both content and to enforce good group dynamics. 
The work done in class was submitted as (lightly graded) homework to ensure that it was completed 

Common Difficulties 

Internet issues 
Some students not watching the videos before class 
Freeloaders: some students did not participate on a regular basis but received the same homework grade, which led to resentment from those who did participate. 
Groups were not perfectly synchronized; leading groups might be 2-3 assignments ahead of trailing groups. 

Lessons Learned 

The number of instructional staff needs to be about one-third the number of groups 


This implies we can have at most ~20 groups with current instructional staff size 


Need better mechanisms to address freeloaders 


Perhaps more frequent individual assessments to ensure all students are participating and learning 


Each class/week must have specific deliverables to ensure group synchronization 
Stricter enforcement of requirements to watch videos before class 
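As a back-of-the-envelope check on the staffing rule above, using the approximate class sizes reported on this poster (the ceiling rounding is an assumption):

```python
import math

enrollment = 200   # approximate class enrollment
group_size = 10    # approximately 10 students per group
groups = enrollment // group_size   # -> 20 groups
staff = math.ceil(groups / 3)       # rule of thumb: ~1 staff member per 3 groups

print(groups, staff)  # 20 7
```

This is consistent with the instructional team of 8 (2 faculty plus 6 TAs/ULAs) supporting about 20 groups.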

 Survey 

Administered by Qualtrics 
Roughly 1/3 of students (65) responded 

Selected Comments 

I think the flipped model is much more effective when it has to be online and potentially I think it could work when in person classes are able to be taught again. I think some students learn a bit differently than others so I think having the option of flipped classes (maybe every other semester) could be beneficial to some and hindering to others.  


I feel like there would be more participation if the flipped class happened in person rather than zoom. People would likely hold themselves more accountable.  
I think the reason group work helped me learn was because it was over zoom. This way everyone is able to see a screen and hear each other. If it had been an in-person flipped class it would have been more difficult to communicate with such a large group, so groups would have to be smaller. The people sitting furthest away from wherever the work is being done would not participate. I think I learned the most when I was doing problems as a group.  
Flipped classroom in person is very nice.  For example CMSE 201, 202 and STT 180 all do very nice jobs of balancing the in class work and the out of class lecture.  Also, having TA's walking around to help is very nice.  

 Information from Graphs 
Most students preferred flipped class 
There was a preference towards flipped in-person class 
Most students reported that they learnt a great deal from their peers 
49% of students preferred the flipped class, 5% preferred any option, and the remaining students were OK with either. 
Authored by: Eric Torng, Sandeep Kulkarni
post image
Posted on: The MSU Graduate Leadership Institute
Monday, Oct 11, 2021
MSU SciComm Conveyance Conference
Who did you work with and what was their role in your project?
I led the executive board and committee chairs as we collaborated with our expert speakers to put together our two-day conference. In total, we offered 22 sessions including workshops, lectures, networking opportunities, and social events.
How did you manage relationships with key stakeholders in your college to achieve your project goals?
I reached out to the Deans and department chairs to schedule meetings with them if they wanted more information. The initial email was very detailed about what we were trying to accomplish. I have also met with these stakeholders in the past, which was helpful.
What is the impact of your project? Who was your target audience and what difference did your project make for them?
Our conference helped to bring together individuals who are passionate about science communication. Our attendees included a mix of students who were interested in entering science communication fields and experts looking to connect to peers and provide advice to the next generation. Our formal sessions included presentations and workshops that were targeted to specific interest areas, and we also offered space for networking and other informal conversations, all of which was well-received by our attendees.
If someone were to continue your work in the future, what advice would you have for them?
It was very important to us to highlight the diversity of science communicators as we worked to put together this conference, and we would encourage anyone else looking to plan a similar event to do the same. For anyone looking to host a virtual conference, make sure that your technology is accessible, and have members of your team ready to assist with any technical difficulties.
How did this work contribute to your personal leadership development?
Through the process of planning and executing a conference of this scale, our leadership team was able to gain valuable skills related to event planning, public relations, marketing, grant-writing, and innovative technology use. I was able to guide our team in conducting all of this, which helped me feel prepared for future event planning. I learned a lot about what it takes to run a successful large-scale event, and I look forward to the opportunity to use these skills when planning future MSU SciComm programs.
MSU SciComm Conveyance Website
Authored by: Chelsie Boodoo
Posted on: #iteachmsu
Thursday, Jun 12, 2025
D2L: Customize Your NavBar
The NavBar in D2L is the panel at the top of your course homepage that provides links to important tools and pages. When you open a new course, the NavBar includes a default set of links and drop-down menus to various D2L features. It usually looks something like the image below.

Why customize your NavBar?

You may not use all the tools included in the default NavBar; removing unused items can simplify navigation for students.
A streamlined, relevant NavBar helps students find what they need more efficiently.
You can personalize it to fit your teaching style, whether that’s clean and text-based or visual with icons.

How to customize your NavBar

On your course homepage, locate the NavBar at the top.
Click the three-dot menu icon on the right side of the NavBar.
From the dropdown, select “Customize this NavBar.”

Note: When you customize the NavBar, you're creating a new version of the MSU NavBar for your course.

Edit NavBar Links

Under the “Name” textbox, you’ll see a “Links” section listing all current NavBar buttons.
Hover over any link to delete it or drag to reorder.
Click “Add Links” to include new tools, even ones that normally appear in dropdowns, like “Class Progress,” without adding the entire “Assessments” menu.


Enable icon-based navigation (optional):
Prefer a more visual layout? Check the box labeled “Enable Icon-Based NavBar,” located just below the “Add Links” button. This will display icons instead of (or alongside) text for each link.
Preview and Save

Click “Save and Close” to apply your changes and view the updated NavBar.
You can return and edit the NavBar as often as needed until it feels just right.

Tips:

Students don’t see all the same tools that you do (e.g., “Course Admin” and “Intelligent Agents”). Use the View as Student feature to check how the NavBar appears from their perspective.
Avoid changing the NavBar after students have access, as it may confuse them.

Example
Here’s what my instructor NavBar looks like: it includes only the tools I use, arranged in the order students need them. I’ve removed dropdown menus because I don’t use all the tools they contain. Students see a clean, focused navigation bar that matches how the course is structured.
Authored by: Andrea Bierema
Posted on: MSU Online & Remote Teaching
Monday, May 4, 2020
Managing Chat Permissions in a ZOOM meeting
 
As the host, you can control whom meeting or webinar participants are allowed to chat with. You can also disable chat for all participants, or disable private chat so that participants cannot send private messages.
 
Check out this article for additional instructions on:

Controlling Chat Access
Disabling In-Meeting Chat

https://support.zoom.us/hc/en-us/articles/115004809306-Controlling-and-Disabling-In-Meeting-Chat#h_d9a04597-0138-4fb9-86cd-81cc4c68b21f
Posted by: Makena Neal