Posted on: #iteachmsu
NAVIGATING CONTEXT
Mandatory Reporting & Sample Syllabus Statement
Mandatory reporting
If you are an employee and a student or colleague discloses that she or he was a victim of sexual assault or relationship violence, your response and support can make a big difference.
MSU recognizes the complexities associated with fulfilling your mandatory reporting obligations as an employee while offering support and maintaining the relationship you have built with the student or employee. To that end, MSU has created the University Reporting Protocols in order to provide employees with information about the mandatory reporting process, including what happens when a report is made, as well as tips for responding and supporting students and employees.
Unless identified as a confidential source, all university employees are obligated to promptly report incidents of sexual harassment, sexual violence, sexual misconduct, stalking, and relationship violence that:
Are observed or learned about in their professional capacity;
Involve a member of the university community; or
Occurred at a university-sponsored event or on university property.
Employees are only required to report relationship violence and sexual misconduct of which they become aware in their capacity as a university employee, not in a personal capacity.
For more information about employee mandatory reporting roles and responsibilities, download the University Reporting Protocols.
I am a faculty member or instructor. Is there recommended language I can put in my syllabus to notify students that I am a mandatory reporter?
(EXAMPLES OF RECOMMENDED LANGUAGE)
Michigan State University is committed to fostering a culture of caring and respect that is free of relationship violence and sexual misconduct, and to ensuring that all affected individuals have access to services. For information on reporting options, confidential advocacy and support resources, university policies and procedures, or how to make a difference on campus, visit the Title IX website at civilrights.msu.edu.
Limits to confidentiality. Essays, journals, and other materials submitted for this class are generally considered confidential pursuant to the University's student record policies. However, students should be aware that University employees, including instructors, may not be able to maintain confidentiality when it conflicts with their responsibility to report certain issues to protect the health and safety of MSU community members and others. As the instructor, I must report the following information to other University offices (including the Department of Police and Public Safety) if you share it with me:
Suspected child abuse/neglect, even if this maltreatment happened when you were a child;
Allegations of sexual assault, relationship violence, stalking, or sexual harassment; and
Credible threats of harm to oneself or to others.
These reports may trigger contact from a campus official who will want to talk with you about the incident that you have shared. In almost all cases, it will be your decision whether you wish to speak with that individual. If you would like to talk about these events in a more confidential setting, you are encouraged to make an appointment with the MSU Counseling and Psychiatric Services.
Authored by:
Office for Civil Rights and Title IX Education and Compli...

Wednesday, Aug 23, 2023
Posted on: #iteachmsu
PEDAGOGICAL DESIGN
SpartanQM - Online/Blended Course Peer-Review Process
Introduction
Quality Matters (QM) is a nationally recognized, faculty-centered peer-review process designed to certify the quality of online courses and online components. MSU purchased a campus subscription to the QM Rubric to assist faculty and instructors in creating quality courses that will improve online education and student learning. The initial pilot of using the rubric to inform course design started as an MSU partnership between the Center for Integrative Studies in General Science, the College of Arts & Letters, and MSU Information Technology. MSU currently maintains its full subscription status on a yearly basis, which provides access to the fully annotated QM Rubric and the QM Course Review Management System (CRMS). Additionally, MSU IT Academic Technology consults with faculty and instructors on applying QM standards to their courses and developing new approaches in online and blended learning.
The MSU QM Course Review Process is a faculty-driven, peer review process that emphasizes continuous quality improvement. The QM reviewers experience and review a course from a student perspective and provide feedback based on the Quality Matters Standards. See IT Instructional Technology & Development for information about course development and see IT’s Academic Technology Service Catalog to learn more about QM at MSU.
Our course review process consists of three parts:
a self-review done by you to get familiar with the course review process on the MyQM system.
an internal review by a peer-reviewer to provide initial feedback on the course design.
after any necessary changes are made and the course has run, a copy of the course can undergo an official review conducted by a team of three QM Reviewers (Master Reviewer, Subject Matter Expert and one additional Reviewer) resulting in Quality Matters Certification [cost $1,000].
Whole programs whose courses have been peer-reviewed can also be QM certified. Information on QM program certification can be found on QM’s website.
Getting Started
Anyone at MSU can create an account through the Quality Matters website by using their msu.edu email address.
Quality Matters provides a fully annotated course standards rubric, different types of course reviews including a self-review, and discounted QM professional development through its website and MSU’s subscription.
Some of the Quality Matters resources involve added costs and official course reviews require MSU consultation first.
Course Rubric
The QM Rubric is a research-based set of standards widely adopted in higher education as a measure of online course quality. It offers weighted best practices in online instruction to improve course quality.
Visit the QM Higher Education Rubric, Sixth Edition to download the rubric.
The rubric is helpful as a tool to consider what elements may be missing from an online or blended course or to generate suggestions for new features.
Self-Review First
Faculty and staff can use the fully annotated self-review materials within the MyQM CRMS (Course Review Management System). Annotations explaining each standard in greater detail can be accessed within the Self Review tool after logging in to the QM site.
This unofficial self-review is a way to become more familiar with QM standards or to assess a course prior to an internal or official review. You can also do pre- and post-assessments of your courses to keep a record of improvements, and a private report can be emailed once completed.
What to expect in a peer review
The internal and official reviews are almost identical. Both generally consist of the following steps:
Pre-Review Discussion
The team chair (Lead Reviewer in an internal review) contacts the review members and the faculty member to set up a conference call or face-to-face meeting at the beginning of the review. The purpose of this call or meeting is to discuss the instructor worksheet, ensure that all members have access to the course, establish the team review timeline, and answer any questions from team members before the review begins.
Review Phase
The review begins. Each team member logs into the QM Rubric website and uses the online rubric tool to record their observations about the course. Remember that you are reviewing the course from the student’s perspective. If you have questions during the review, don’t hesitate to contact your team chair.
Post-Review Discussion
Upon completion of the review, the team chair will call for the final conference. This conference will be among the review team members to discuss any discrepancies in the review and to ensure that recommendations are helpful and effective. All individual reviews will be submitted after this meeting to compile the final report.
Post Review – Revise Course (as needed)
The team chair will submit the final review to the Campus QM Coordinator through the online QM tool. The review findings will be shared with the course instructor who then has an opportunity to respond to the review (using the course Amendment Form in the QM site). If the course does not yet meet standards, the faculty course developer/instructor works to bring the course to standards (with the assistance of an instructional designer, if desired). The review team chair then reviews the changes and determines whether or not the changes move the course to QM standards. In an internal review, revisions are made before submitting for an official review.
Steps for Internal Review
It is good practice to complete a self-review of your course before submitting for internal or official review. This is an optional step and only you see the self-review responses. For a self-review, log into the CRMS (Course Review Management System) on the QM website and use the Self Review tool to conduct a review of your own course.
When you are ready to submit a course for internal review:
Sign up for a SpartanQM Online/Blended Course Peer-Review and wait for an email response.
Make a copy of your course to be reviewed.
Log in to MyQM at http://www.qmprogram.org/MyQM (Your login name is your email address on file with QM. If you do not have your login info choose "Forgot Username" or "Forgot Password")
Log in to the Course Review Management System (CRMS) and select “Start a Review Application” on the main screen.
Select Michigan State University.
Select David Goodrich as the QM Coordinator.
Select yourself as the Course Representative.
Select Internal Review as the review type.
Scroll down and enter course information. Select Submit Application. You will receive an email that will prompt you to complete the worksheet once it is approved.
Log in to the Course Review Management System (CRMS) to complete the Course Worksheet.
Select My Course Reviews: Open Course Reviews
Choose "View" next to the applicable course number.
The Actions section allows you to view, edit and then submit the Course Worksheet. Select edit to input your course information.
When finished, click “Submit Complete Worksheet.”
Your course will automatically be assigned to a Lead Reviewer who will contact you regarding the course review.
After your review, you may make any necessary changes to your QM Review course as a result of the internal review.
This review is an unofficial course review that provides feedback on meeting the QM Standards before submitting for QM recognition.
Steps for Official Review
When the course is ready for the official review:
Sign up for a SpartanQM Online/Blended Course Peer-Review and wait for an email response.
Faculty will use the updated copy of the course that was used in the internal review.
Log in to MyQM at http://www.qmprogram.org/MyQM (Your login name is your email address on file with QM. If you do not have your login info choose "Forgot Username" or "Forgot Password")
Log in to the Course Review Management System (CRMS) and select “Start a Review Application” on the main screen.
Select Michigan State University.
Select David Goodrich as the QM Coordinator.
Select yourself as the Course Representative.
Select QM-Managed Review as the review type.
Scroll down and enter course information. Select Submit Application. You will receive an email that will prompt you to complete the Course Worksheet once it is approved.
Log in to the Course Review Management System (CRMS) to complete the Course Worksheet.
Select My Course Reviews, Open Course Reviews.
Choose "View" next to the applicable course number.
The Actions section allows you to view, edit and then submit the Course Worksheet. Select edit to input your course information.
If you completed an internal review inside the CRMS, you can copy your internal review worksheet.
MSU staff will add the QM review team to the QM Review Course. This can take up to two weeks.
The Course Representative (faculty course developer/instructor) meets virtually or by phone with the QM review team for a pre-review meeting.
A QM Review is scheduled for a 4-6 week review period, which includes approximately 3 weeks of actual review time in addition to pre- and post-review conference calls.
The QM Team Chair will submit the final report which will be sent to the Course Representative.
Once the standards are met, Quality Matters recognition is provided to the Course Representative and the course is listed in the QM Recognized Courses registry.
Recertification Review
Certified courses are reviewed and re-certified after five years.
Resource Links
QM Higher Education Rubric, Sixth Edition
QM at MSU Community: Faculty and staff at MSU can join this D2L Community site to learn more about the QM Rubric, discounted professional development, and course examples for meeting standards.
Quality Matters website: Create an account using your msu.edu email and access the self-review tools on the MyQM site.
Authored by:
Dave Goodrich

Tuesday, Feb 9, 2021
Posted on: #iteachmsu Educator Awards
PEDAGOGICAL DESIGN
#iteachmsu Educator Awards
What are the #iteachmsu Awards?
Gratitude is so important, especially for the wide educator community (including but not limited to faculty, GTAs, ULAs, instructional designers, academic advisors, librarians, coaches, etc.) who help support learning across MSU. At #iteachmsu, we believe elevating, recognizing, and celebrating those contributions is vital. The #iteachmsu Educator Awards are dedicated to honoring individuals who have been recognized through the Thank an Educator initiative, a simple but important act of saying thank you and recognizing the great work of educator colleagues across campus. To learn more about Thank an Educator, check out this #iteachmsu article and this MSU Today article!
Why do the Awards exist?
While the collaborating units and the #iteachmsu project team are excited about the aforementioned “wide educator community”, we have found through informational interviews and observations (as well as conversations with our diverse advisory group and content contributors) that individuals across roles that contribute to the teaching and learning mission of the university may not personally identify as educators. We established the Thank an Educator initiative and are recognizing those individuals with the #iteachmsu Educator Awards to:
help demonstrate the diversity of educators across roles on campus
help individuals associate their name/work with “educator” and embrace their educator identity
celebrate the amazing individuals we have shaping the learning experiences and success of students on our campus.
How are #iteachmsu Educator Award recipients recognized?
In the inaugural year of the #iteachmsu Educator Awards (2019), a brief ceremony and casual reception were held at the conclusion of the Spring Conference on Teaching, Learning, and Student Success. Awardees were designated with a flag on their name tags and picked up their #iteachmsu Educator Award certificates (along with their nomination stories) at a reception with food and drink. Dr. Jeff Grabill, Associate Provost at the time, gave a brief welcome and introduction to some of the foundations of #iteachmsu. Former Provost Youatt then concluded the formal portion of the ceremony with congratulations and thoughts on the importance of educator work.
The global pandemic and resulting remote work (2020-21) forced us to think differently about how to hold public events, and while the shift was challenging and uncomfortable at times, we have emerged with a way to uplift #iteachmsu Educator Award recipients more publicly. Instead of a small reception, recipients are now recognized publicly via articles here on iteach.msu.edu. They each receive the same Educator Award materials, which are distributed digitally.
How can you submit an educator for an #iteachmsu Educator Award?
Anyone can recognize a fellow Spartan for their contributions to MSU's teaching and learning mission or for how they made a lasting impression on your experience. All you have to do is click "Thank an Educator" in the left panel of iteach.msu.edu. From there you'll see a short form where you can enter the name, netID, and a short story of the educator you'd like to recognize.
updated 06/23/2021
Gratitude is so important especially for the wide educator community (including but not limited to faculty, GTAs, ULAs, instructional designers, academic advisors, librarians, coaches, etc.) who help support learning across MSU. At #iteachmsu, we believe in elevating, recognizing, and celebrating those contributions is vital. The #iteachmsu Educator Awards are dedicated to honoring individuals who have been recognized through the Thank an Educator initiative. This is a simple but important act of saying thank you and recognizing the great work of educator colleagues across campus. To learn more about Thank an Educator more broadly check out this #iteachmsu article and this MSU Today article!
Why do the Awards exist?
While the collaborating units and the #iteachmsu project team are excited about the aforementioned “wide educator community”, we have found through informational interviews and observations (as well as conversations with our diverse advisory group and content contributors) that individuals across roles that contribute to the teaching and learning mission of the university may not personally identify as educators. We established the Thank an Educator initiative and are recognizing those individuals with the #iteachmsu Educator Awards to:
help demonstrate the diversity of educators across roles on campus
help individuals associate their name/work with “educator” and embrace their educator identity
celebrate the amazing individuals we have shaping the learning experiences and success of students on our campus.
How are #iteachmsu Educator Award recipients recognized?
In the inaugural year of the #iteachmsu Educator Awards (2019) a brief ceremony and casual reception were held as a conclusion to the Spring Conference on Teaching, Learning, and Student Success. Awardees were designated with a flag on their name tags and picked up their #iteachmsu Educator Award certificates (along with their nomination stories) at reception with food and drink. Dr. Jeff Grabill, Associate Provost at the time, gave a brief welcome and introduction to some of the foundations of #iteachmsu. Then former Provost Youatt concluded the formal portion of the ceremony with congratulations and thoughts on the importance of educator work.
The global pandemic and resulting remote work (2020-21) forced us to think differently about how to hold public events, and while the shift was challenging and uncomfortable at times, we have emerged with a way to uplift #iteachmsu Educator Award recipients more publicly. Instead of a small reception, recipients are now recognized publicly via articles here on iteach.msu.edu. They each receive the same Educator Award materials, which are distributed digitally.
How can you submit an educator for an #iteachmsu Educator Award?
Anyone can recognize a fellow Spartan for their contributions to MSU's teaching and learning mission or for how they made a lasting impression on your experience. All you have to do is click "Thank an Educator" in the left panel of iteach.msu.edu. From there you'll see a short form where you can enter the name and netID of the educator you'd like to recognize, along with a short story.
updated 06/23/2021
Authored by:
Makena Neal

Tuesday, Jul 20, 2021
Posted on: #iteachmsu
NAVIGATING CONTEXT
Making an investment in people, taking time off work
The Center for Economic and Policy Research has gone so far as to call the U.S. the “No Vacation Nation,” stating from a study of 22 of the richest countries that "The United States continues to be the only advanced economy that does not guarantee its workers paid vacation and holidays." Former President Samuel Stanley made efforts to ensure that at MSU this wasn't true. On his last day of service, President Stanley declared that, "MSU’s biggest investment — and greatest strength — is you, the exceptionally talented support staff, faculty and academic staff who bring our educational mission to life. You do so much to teach, inspire and support our students’ success in all they do. An important part of my job, and that of my administration, is recognizing your efforts and supporting your success." The way Stanley recognized those efforts was to award a new, annual winter break, which for this academic year will run from Dec. 23 through Jan. 2.

In my experience, this is a time when many employees would use vacation days or accrued time off, but the act of intentionally gifting MSU staff this time means that the days they would have otherwise allocated to ringing in the new year can be distributed to other times in their work cycle. But will they? According to a study done by Glassdoor, the reasons U.S. workers don’t use their vacation time include:
Their workload is too great and no one else at their company can do the work in their absence without fear they will fall behind.
They worry they will miss out on participating in an important project, decision or meeting.
They feel guilty about leaving the office too long because they think their team might feel lost or overwhelmed.
Some worry their desire to take vacation time will make them appear less motivated or dedicated.
Additionally, the Glassdoor study found that of those who did use vacation time, only 54% were able to fully "check out," while 27% were expected to stay aware of work issues and jump in if need be. This data, combined with findings from a study by the World Health Organization and the International Labour Organization that working 55 hours or more a week was associated with a 35% higher risk of stroke and a 17% higher risk of dying from heart disease compared with a working week of 35 to 40 hours, has a few important takeaways for us:
Overworking ourselves has negative health implications.
Taking the time off that we're provided by our organization is important for both employees and employers.
Workplace culture, division of labor, and human capital/capacity all impact workers' ability to let go while taking time away (or taking time at all).
"While taking a vacation may make employees temporarily feel behind, they should realize that stepping away from work and fully disconnecting carries a ripple effect of benefits. It allows employees to return to work feeling more productive, creative, recharged and reenergized. In turn, employers should consider what a vacation really means – to actually vacate work – and how they can support employees to find true rest and relaxation to avoid burnout and turnover within their organizations," said Carmel Galvin, Glassdoor chief human resources officer. Additionally, in a report on the impacts of a reduction to a 32-hour, four-day workweek at 27 companies, scholars at Boston College, University College Dublin, and Cambridge University found improvement in many well-being metrics: "Stress, burnout, fatigue, work-family conflict all declined, while physical and mental health, positive affect, work-family and work-life balance, and satisfaction across multiple domains of life increased." Download a copy of the report for all the details.

I love the way this Forbes article by Caroline Castrillon puts it: "Don’t be a vacation slacker. Time off is linked to a slew of benefits, including better sleep and improved mental health. So, what are you waiting for? Put the guilt aside and plan your next holiday. Your body and mind will thank you." The article also links to multiple studies that support the assertions that vacation time:
increases mindfulness
improves heart health
reduces stress
boosts brainpower
improves sleep
The bottom line is that taking vacation time is essential to employee survival. We (the royal "we") still have a long way to go when it comes to employee health and workplace wellbeing, but taking full advantage of employer-provided breaks is one place to start. So as the year comes to an end... leave your computer at work, set your away message, turn off notifications, and respect your own PTO boundaries. Photo by Bethany Legg on Unsplash
Authored by:
Makena Neal

Friday, Dec 2, 2022
Posted on: #iteachmsu
PEDAGOGICAL DESIGN
Planning for Cooperative Learning
Picture a classroom full of voices, chairs facing not the front but one another, heads leaned close, and pens moving furiously. This image is very different from the traditional university classroom, in which a gallery of students listens and watches as a professor recites information. However, an increasing number of university instructors favor the former example for their classrooms (Smith et al. 2005). Why would undergraduate instructors turn away from tradition and toward this more cooperative learning environment?
Many studies have found there is a fundamental difference in the way students engage with material in cooperative classes. In my personal experience with cooperative learning, I have witnessed students constructing new knowledge based on previous experience, gaining a richer understanding of a concept by explaining it to a peer, and even voicing their insecurities with the material. In this post, I will discuss the benefits of cooperative learning and explore some cooperative learning approaches. I hope to persuade you that cooperative learning is an effective and feasible approach that can be incorporated into your classroom this semester and beyond.
Active Learning vs. Lecturing
Anecdotal evidence aside, the data speak for themselves. In a meta-analysis of over 200 studies, Freeman et al. (2014) found dramatic differences between lecture-based and active instructional strategies (including cooperative learning) in science, technology, engineering, and math (STEM) classrooms. Students in lecture-based classrooms are 1.5 times more likely to fail than students in active learning classrooms, and students in active learning classrooms outperform their counterparts on exams by an average of 6%. These results point to increased retention and higher GPAs of students within the discipline when active learning strategies are implemented.
What drives these increased learning gains? In the transition between lecturing and active learning, the instructor shifts the learning environment from being teacher-centered to student-centered. This shift in focus promotes greater accountability, ownership of ideas, a sense of belonging amongst students, and a more cooperative classroom.
The Cooperative Classroom
Cooperative learning is one active learning approach documented as effective in achieving student learning goals. With a cooperative learning approach, students work together in small groups to accomplish tasks that promote positive interdependence. In other words, learning activities are structured so that achievement benefits both individual students and the group as a whole. These activities can last anywhere from five minutes to an entire semester. Successful cooperative learning strategies promote student engagement with the material, individual accountability, and teamwork-building skills. Cooperative learning also promotes regular, formative assessment of student learning and higher order thinking, and builds classroom community (see Smith et al. 2005).
Cooperative Learning in Action
The key to successfully implementing cooperative learning is aligning it with learning objectives. Cooperative learning activities aren’t extras, but essential steps toward optimal learning. Some topics could include concepts that will be emphasized on the exam, big ideas for the day, and items that are difficult for students to master. The better integrated these activities are, the easier it will be to select approaches that meet your overall course objectives.
It may seem like an intimidating task to implement cooperative learning in a lecture-based course. Completely redesigning a course involves significant time and effort, and graduate student assistants often don’t have the freedom to dictate the classroom structure. The good news is that cooperative learning can be incorporated into courses in small, low-stakes ways. The following are three strategies that can be integrated into your curriculum next semester and accomplished within 5-15 minutes. I would suggest starting here:
Think-pair-share
Instructors pose a question or discussion topic (e.g., “Based on what you know about global wind and ocean currents, describe why the wave height in the Southern Ocean is an average of two meters higher than in the Equatorial Pacific”). Instructors then give students individual reflection time to process the question and to think about their answer. Following this silent period, students are then asked to pair up with another student to discuss their answer and to resolve any differences (if there is a correct answer to the question). The class can then come together as a large group once again, and the instructor can call on individual groups to share their discussions. This approach encourages students to explore and demonstrate their understanding of key concepts prior to a high-stakes exam in a way that is not possible in a lecture format.
Bonus: The pair step is a great opportunity for the instructor to walk throughout the room to monitor the discussion groups and connect with students on a more individual basis. The share step can be used to assess the distribution of ideas among students and identify sticky points that may require additional attention. This approach also allows students to speak up in class after vetting their thoughts with another student, which helps to decrease public speaking anxiety.
Minute Paper
Similar to the think-pair-share activity, instructors pose a question or discussion topic, then provide time (typically under three minutes) for students to write down their ideas. This could be specified as anything from a “brain dump” (e.g., “Discuss the factors that dictate the growth of algae in the Arctic Ocean”) to a more structured form (e.g., “How would you design an experiment to measure the effect of temperature and light on algal growth in the Arctic Ocean?”). Students can then team up into small groups to discuss their answers and come to a consensus or perspective on the major ideas from the question. Following small group time, a few groups can be asked to report out to the whole class about their discussion. The minute paper approach allows instructors and students to move beyond memorization and into higher order thinking skills such as analysis and evaluation.
Bonus: Positive interdependence can be achieved by assigning group members specific roles (e.g., recorder, checker, task manager, and spokesperson). These roles can be rotated each time the activity is used to allow students to practice each communication skill.
Jigsaw
This learning strategy works well for course concepts that can be split up into separate yet interconnected parts. Each part thus represents a piece of the puzzle, and the complete puzzle requires each individual piece to be complete. The jigsaw approach is split into two steps: the expert group meeting and the jigsaw group meeting. In the expert group meeting, instructors split students into small groups that are each assigned one part of the relevant content. Expert groups are assigned to discuss their “puzzle piece” and to achieve a consensus or mastery of their component. Expert groups are then dissolved and new jigsaw groups are formed, made up of one person from each expert group. In the jigsaw group meeting, each “expert ambassador” has a chance to report to the group about his or her piece of the puzzle. Jigsaw groups are then assigned the task of connecting each component to form a complete picture of the concept. The jigsaw approach encourages students to take ownership of their component of the concept and improve their communication skills when meeting with the jigsaw group.
Bonus: Keep in mind that this method, while rich in discussion opportunities, requires the most logistical planning and organizational support of the three strategies outlined. For further reading, see https://www.jigsaw.org.
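Because the jigsaw's two-step regrouping is where most of the logistical planning lives, it can help to sketch it out ahead of class. The following Python snippet is a hypothetical planning helper (the function name and roster format are illustrative, not part of any MSU or jigsaw.org tool): it assigns students round-robin to expert groups, one per topic, then forms jigsaw groups by taking one "ambassador" from each expert group.

```python
import random

def make_jigsaw_groups(students, topics):
    """Plan a jigsaw activity: return (expert_groups, jigsaw_groups).

    Each expert group covers one topic; each jigsaw group contains
    one member drawn from every expert group. Illustrative sketch only.
    """
    random.shuffle(students)  # mix the roster so groups aren't alphabetical
    # Expert groups: round-robin assignment of students to topics
    expert_groups = {t: students[i::len(topics)] for i, t in enumerate(topics)}
    # Jigsaw groups: the k-th member of each expert group meets together
    max_size = max(len(g) for g in expert_groups.values())
    jigsaw_groups = []
    for k in range(max_size):
        group = [g[k] for g in expert_groups.values() if k < len(g)]
        jigsaw_groups.append(group)
    return expert_groups, jigsaw_groups
```

With 12 students and 4 topics, this yields four 3-person expert groups and three 4-person jigsaw groups, each jigsaw group containing exactly one expert per topic.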
What are your favorite cooperative learning activities that you use in your own classroom? Do you have a successful strategy to encourage students to embrace cooperative learning? Please share your thoughts in the comments below or use the hashtag #ITeachMSU to further engage in the conversation on Twitter or Facebook.
Additional Reading
Angelo, T., and K.P. Cross. 1993. Classroom Assessment Techniques: A Handbook for College Teachers. Jossey-Bass. ISBN: 1555425003.
Freeman, S., S.L. Eddy, M. McDonough, M.K. Smith, N. Okoroafor, H. Jordt, and M.P. Wenderoth. 2014. Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences of the United States of America 111(23): 8410-8415. http://www.pnas.org/content/111/23/8410.full.pdf
Johnson, D.W., R.T. Johnson, and K.A. Smith. 2006. Active Learning: Cooperation in the College Classroom. Interaction Book Co. ISBN: 978-0939603145.
Smith, K.A., S.D. Sheppard, D.W. Johnson, and R.T. Johnson. 2005. Pedagogies of engagement: Classroom-based practices (cooperative learning and problem-based learning). Journal of Engineering Education 94: 87-101. http://personal.cege.umn.edu/~smith/docs/Smith-Pedagogies_of_Engagement.pdf
Originally posted at “Inside Teaching MSU” (site no longer live): Salk, K. Planning for Cooperative Learning. inside teaching.grad.msu.edu
Posted by:
Maddie Shellgren
Friday, Nov 2, 2018
Posted on: MSU Online & Remote Teaching
ASSESSING LEARNING
Exam Strategy for Remote Teaching
With our guiding principles for remote teaching as flexibility, generosity, and transparency, we know that there is no one solution for assessment that will meet all faculty and student needs. From this perspective, the primary concern should be assessing how well students have achieved the key learning objectives and determining what objectives are still unmet. It may be necessary to modify the nature of the exam to allow for the differences of the remote environment. This document, written for any instructor who typically administers an end-of-semester high-stakes final exam, addresses how best to make those modifications.
In thinking about online exams, and the current situation for remote teaching, we recommend the following approaches (in priority order) for adjusting exams: multiple lower-stakes assessments, open-note exams, and online proctored exams.
When changes to the learning environment occur, creating an inclusive and accessible learning experience for students with disabilities should remain a top priority. This includes providing accessible content and implementing student disability accommodations, as well as considering the ways assessment methods might be affected.
Faculty and students should be prepared to discuss accommodation needs that may arise. The team at MSU Resource Center for Persons with Disabilities (RCPD) will be available to answer questions about implementing accommodations. Contact information for Team RCPD is found at https://www.rcpd.msu.edu/teamrcpd. Below you will find a description of each of the recommendations, tips for their implementation, the benefits of each, and references to pertinent research on each.
There are three primary options*:
Multiple lower-stakes assessments (most preferred)
Open note exams (preferred)
Online proctored exams (if absolutely necessary)
*Performance-based assessments such as laboratory, presentation, music, or art experiences that show proficiency will be discussed in another document
Multiple lower-stakes assessments
Description: The unique circumstances of this semester make it necessary to carefully consider your priorities when assessing students. Rather than being cumulative, a multiple assessment approach makes assessment an incremental process. Students demonstrate their understanding frequently and accrue points over time, rather than all at once on one test. Dividing the assessment into smaller pieces can reduce anxiety and give students more practice in taking their exams online. For instance, you might have a quiz at the end of each week that students have to complete. Each subsequent quiz can (and should) build on the previous one, allowing students to build toward more complex and rigorous applications of the content. Using this approach minimizes your need to change the types of questions that you have been asking to date, which can affect student performance (e.g., if you normally ask multiple-choice questions, you can continue to do so).
For the remainder of the semester, use the D2L quizzes tool to build multiple smaller assessments. Spread out the totality of your typical final exam over the month of April. This can be as simple as dividing a 100-question final exam into eight 12-question “synthesis activities” that students complete bi-weekly.
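The division described above is simple arithmetic, and sketching it can help when planning the quiz schedule. The following Python snippet is a hypothetical planning aid (the function name is illustrative; D2L has no such feature). It keeps the question bank's cumulative order and spreads any remainder across the earliest quizzes, so 100 questions split eight ways become four 13-question and four 12-question quizzes rather than exactly eight of 12:

```python
def split_exam(questions, num_quizzes):
    """Divide an ordered question bank into roughly equal lower-stakes
    quizzes, preserving order so each quiz builds on earlier material.
    Illustrative sketch for planning, not a D2L API."""
    base, extra = divmod(len(questions), num_quizzes)
    quizzes, start = [], 0
    for i in range(num_quizzes):
        size = base + (1 if i < extra else 0)  # early quizzes absorb the remainder
        quizzes.append(questions[start:start + size])
        start += size
    return quizzes
```

For example, `split_exam(list(range(100)), 8)` yields quizzes of sizes 13, 13, 13, 13, 12, 12, 12, 12, covering every question exactly once.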
Benefits as noted from the literature:
No significant differences were observed in terms of keystroke information, rapid guessing, or aggregated scores between proctoring conditions;
More effective method for incentivizing participation and reading;
Encourages knowledge retention as each subsequent assessment builds on the last
Rios, J. A., & Liu, O. L. (2017). Online proctored versus unproctored low-stakes internet test administration: Is there differential test-taking behavior and performance? American Journal of Distance Education, 31(4), 226-241. https://www.tandfonline.com/doi/abs/10.1080/08923647.2017.1258628
Schrank, Z. (2016). An assessment of student perceptions and responses to frequent low-stakes testing in introductory sociology classes. Teaching Sociology, 44(2), 118-127. https://journals.sagepub.com/doi/abs/10.1177/0092055X15624745
VanPatten, B., Trego, D., & Hopkins, W. P. (2015). In‐Class vs. Online Testing in University‐Level Language Courses: A Research Report. Foreign Language Annals, 48(4), 659-668. https://onlinelibrary.wiley.com/doi/abs/10.1111/flan.12160
Open note exams
Description: Open note assessments allow students to refer to the Internet and other materials while completing their assessments. By design, this disincentives academic dishonesty. Often instructors put time parameters around open note exams. These types of exams also lend themselves to collaborative work in which multiple students work together to complete the assessment. With an open note strategy, you can keep your general exam schedule and point structure, but you may need to revise questions so they are less about factual recall and more about the application of concepts. For instance you might give students a scenario or case study that they have to apply class concepts to as opposed to asking for specific values or definitions. If you plan to make such changes, communicate your intent and rationale to you students prior to the exam. One effective open note testing technique is to use multiple-true/false questions as a means to measure understanding. These questions (called “multiple selection” questions in D2L) pose a scenario and prompt students to check all the boxes that apply. For example, students may be prompted to read a short case or lab report, then check all statements that are true about that reading. In this way a single question stem can assess multiple levels of complexity and/or comprehension.
Benefits as noted from the literature:
Open-book exams and collaborative exams promote development of critical thinking skills.
Open-book exams are more engaging and require higher-order thinking skills.
Application of open-book exams simulates the working environment.
Students prefer open-book exams and report decreased anxiety levels.
Collaborative exams stimulate brain cell growth and intricate cognitive complexes.
Johanns, B., Dinkens, A., & Moore, J. (2017). A systematic review comparing open-book and closed-book examinations: Evaluating effects on development of critical thinking skills. Nurse education in practice, 27, 89-94. https://www.sciencedirect.com/science/article/abs/pii/S1471595317305486
Couch, B. A., Hubbard, J. K., & Brassil, C. E. (2018). Multiple–true–false questions reveal the limits of the multiple–choice format for detecting students with incomplete understandings. BioScience, 68(6), 455-463. https://doi.org/10.1093/biosci/biy037
Implementation for multiple lower-stakes and open note assessment strategies:
Timed vs. untimed: On the whole, performance on timed and untimed assessments yields similar scores. Students express greater anxiety over timed assessments, while they view untimed assessments as more amenable to dishonest behavior.
NOTE: If you typically have a time limit on your face-to-face assessments, increase it by 20% to allow for the added demands the remote environment places on students. </li >
If the exam is meant to be taken synchronously, remember to stay within your class period. Adjust the length of the exam accordingly.
Reduced scope: Decreasing content covered in the exam may be necessary to create an exam of appropriate length and complexity, given the unique circumstances this semester.
Question pools: Create a pool of questions, and let D2L randomly populate each student’s quiz. This helps reduce dishonest behavior
For example, a 10 question quiz might have 18 total questions in the pool, 10 of which are randomly distributed to each student by D2L.
Randomize answer order: In questions in which it makes sense, have D2L randomize the order in which the answer options appear.
Individual question per page: This can reduce instances of students taking the assessment together. It is even more effective when question order is randomized and a question pool is used. <//li>
Honor code attestation: Give students an opportunity to affirm their intent to be honest by making question one of every assessment a 0-point question asking students to agree to an honor code. You can access the MSU Honor Code: https://www.deanofstudents.msu.edu/academic-integrity
Live Zoom availability: In D2L Quizzes, set a time window during which the assessment will be available to students.
Hold a live open office hours session in Zoom at some point during that window, so that students who want to can take the assessment while they have direct access to you - this way they can ask questions if any arise.
Ultimately, our guiding principles for remote teaching are flexibility, generosity, and transparency. Try to give students as much of an opportunity to demonstrate their knowledge as possible.
Consider allowing multiple attempts on an assessment.
When conditions allow, consider allowing multiple means of expression.
Can students choose to demonstrate their knowledge from a menu of options
M/C test
Written response
Video presentation
Oral Exam (via Zoom)
Consider giving students choices. Perhaps they can opt out of answering a question or two. Perhaps they can choose which of a series of prompts to respond to. Perhaps students can waive one test score (to help accomodate for their rapidly changing environments)
Proctored assessments
Description: Respondus Lockdown Browser and Respondus Monitor are tools for remote proctoring in D2L. More information is available at https://help.d2l.msu.edu/node/4686. Please consider whether your assessments can be designed without the need for Respondus. While Respondus may be helpful in limited circumstances (e.g., when assessments must be proctored for accreditation purposes), introducing a new technology may cause additional stress for both students and instructors, and academic integrity is still not assured. High-stakes exams (those that are a large percentage of a student’s grade) that use new technologies and approaches can decrease student performance and may not reflect students’ understanding of the material. Please do not use an online proctored approach unless your assessment needs require its use.
Benefits:
Increases the barrier to academic dishonesty. Allows for use of existing exams (assuming they are translated in D2L’s Quizzes tool).
Implementation:
Any online proctored exam must be created and administered using D2L’s Quizzes tool.
Prior to offering a graded proctored exam, we strongly recommend that you administer an ungraded (or very low-stakes) practice test using the proctoring tool.
Clear communication with students about system and hardware requirements and timing considerations is required.
MSU has gained temporary no-cost access to a pair of online proctoring tools provided by Respondus: https://help.d2l.msu.edu/node/4686
Respondus Lockdown Browser requires that students download a web browser.
When they click into your exam, the Lockdown Browser opens, and prevents users from accessing anything else on their computer.
Respondus Monitor requires use of Respondus Lockdown Browser and a webcam.
Students are monitored via the webcam while they complete the exam in Lockdown Browser.
Additional Resources:
Remote Assessment Quick Guide
Remote Assessment Video Conversation
D2L Quizzes Tool Guide
Self-training on D2L Quizzes (login to MSU’s D2L is required; self-enroll into the training course)
References: Alessio, H.M.; Malay, N.; Mauere, K.; Bailer, A.J.; & Rubin, B.(2017) Examining the effect of proctoring on online test scores, Online Learning 21 (1) Altınay, Z. (2017) Evaluating peer learning and assessment in online collaborative learning environments, Behaviour & Information Technology, 36:3, 312-320, DOI: 10.1080/0144929X.2016.1232752
Couch, B. A., Hubbard, J. K., & Brassil, C. E. (2018). Multiple–true–false questions reveal the limits of the multiple–choice format for detecting students with incomplete understandings. BioScience, 68(6), 455-463. https://doi.org/10.1093/biosci/biy037 Cramp, J.; Medlin, J. F.; Lake, P.; & Sharp, C. (2019) Lessons learned from implementing remotely invigilated online exams, Journal of University Teaching & Learning Practice, 16(1). Guerrero-Roldán, A., & Noguera, I.(2018) A Model for Aligning Assessment with Competences and Learning Activities in Online Courses, The Internet and Higher Education, vol. 38, pp. 36–46., doi:10.1016/j.iheduc.2018.04.005.
Johanns, B., Dinkens, A., & Moore, J. (2017). A systematic review comparing open-book and closed-book examinations: Evaluating effects on development of critical thinking skills. Nurse education in practice, 27, 89-94. https://www.sciencedirect.com/science/article/abs/pii/S1471595317305486 Joseph A. Rios, J.A. & Lydia Liu, O.L. (2017) Online Proctored Versus Unproctored Low-Stakes Internet Test Administration: Is There Differential Test-Taking Behavior and Performance?, American Journal of Distance Education, 31:4, 226-241, DOI: 10.1080/08923647.2017.1258628 Schrank, Z. (2016). An assessment of student perceptions and responses to frequent low-stakes testing in introductory sociology classes. Teaching Sociology, 44(2), 118-127. https://journals.sagepub.com/doi/abs/10.1177/0092055X15624745 Soffer, Tal, et al. “(2017) Assessment of Online Academic Courses via Students' Activities and Perceptions, Studies in Educational Evaluation, vol. 54, pp. 83–93., doi:10.1016/j.stueduc.2016.10.001.
Tan, C.(2020) Beyond high-stakes exam: A neo-Confucian educational programme and its contemporary implications, Educational Philosophy and Theory, 52:2, 137-148, DOI: 10.1080/00131857.2019.1605901
VanPatten, B., Trego, D., & Hopkins, W. P. (2015). In‐Class vs. Online Testing in University‐Level Language Courses: A Research Report. Foreign Language Annals, 48(4), 659-668. https://onlinelibrary.wiley.com/doi/abs/10.1111/flan.12160
Faculty and students should be prepared to discuss accommodation needs that may arise. The team at MSU Resource Center for Persons with Disabilities (RCPD) will be available to answer questions about implementing accommodations. Contact information for Team RCPD is found at https://www.rcpd.msu.edu/teamrcpd. Below you will find a description of each of the recommendations, tips for their implementation, the benefits of each, and references to pertinent research on each.
There are three primary options*:
Multiple lower-stakes assessments (most preferred)
Open note exams (preferred)
Online proctored exams (if absolutely necessary)
*Performance-based assessments such as laboratory, presentation, music, or art experiences that show proficiency will be discussed in another document
Multiple lower-stakes assessments
Description: The unique circumstances of this semester make it necessary to carefully consider your priorities when assessing students. Rather than being cumulative, a multiple-assessment approach makes assessment an incremental process: students demonstrate their understanding frequently and accrue points over time, rather than all at once on one test. Dividing the assessment into smaller pieces can reduce anxiety and give students more practice taking their exams online. For instance, you might have a quiz at the end of each week that students must complete. Each subsequent quiz can (and should) build on the previous one, allowing students to work toward more complex and rigorous applications of the content. This approach also minimizes your need to change the types of questions you have been asking to date, which can affect student performance (e.g., if you normally ask multiple-choice questions, you can continue to do so). For the remainder of the semester, use the D2L Quizzes tool to build multiple smaller assessments, spreading the totality of your typical final exam over the month of April. This can be as simple as dividing a 100-question final exam into eight 12- or 13-question “synthesis activities” that students complete twice a week.
Benefits as noted from the literature:
No significant differences were observed in terms of keystroke information, rapid guessing, or aggregated scores between proctoring conditions;
More effective method for incentivizing participation and reading;
Encourages knowledge retention as each subsequent assessment builds on the last
Rios, J. A., & Liu, O. L. (2017). Online proctored versus unproctored low-stakes internet test administration: Is there differential test-taking behavior and performance? American Journal of Distance Education, 31(4), 226-241. https://www.tandfonline.com/doi/abs/10.1080/08923647.2017.1258628
Schrank, Z. (2016). An assessment of student perceptions and responses to frequent low-stakes testing in introductory sociology classes. Teaching Sociology, 44(2), 118-127. https://journals.sagepub.com/doi/abs/10.1177/0092055X15624745
VanPatten, B., Trego, D., & Hopkins, W. P. (2015). In-class vs. online testing in university-level language courses: A research report. Foreign Language Annals, 48(4), 659-668. https://onlinelibrary.wiley.com/doi/abs/10.1111/flan.12160
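The splitting step above can be sketched in code. The following is a minimal illustration, not a D2L feature; the function name and the list-of-strings question bank are hypothetical stand-ins for an instructor's actual item bank:

```python
import random

def split_into_quizzes(question_bank, quiz_size, seed=None):
    """Divide a question bank into consecutive quizzes of `quiz_size` questions.

    Shuffling first spreads topics across the smaller assessments,
    mirroring the idea of spreading one final exam over the month.
    """
    rng = random.Random(seed)          # seeded for reproducibility
    questions = list(question_bank)
    rng.shuffle(questions)
    return [questions[i:i + quiz_size]
            for i in range(0, len(questions), quiz_size)]

# A 96-question bank yields exactly eight 12-question activities:
bank = [f"Q{n}" for n in range(1, 97)]
quizzes = split_into_quizzes(bank, quiz_size=12, seed=1)
print(len(quizzes), len(quizzes[0]))  # 8 12
```

In practice you would build each resulting chunk as a separate quiz in D2L rather than running any code, but the arithmetic is the same.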
Open note exams
Description: Open note assessments allow students to refer to the Internet and other materials while completing their assessments; by design, this disincentivizes academic dishonesty. Instructors often put time parameters around open note exams. These exams also lend themselves to collaborative work, in which multiple students work together to complete the assessment. With an open note strategy, you can keep your general exam schedule and point structure, but you may need to revise questions so they are less about factual recall and more about the application of concepts. For instance, you might give students a scenario or case study to which they must apply class concepts, as opposed to asking for specific values or definitions. If you plan to make such changes, communicate your intent and rationale to your students before the exam. One effective open note testing technique is to use multiple-true/false questions to measure understanding. These questions (called “multiple selection” questions in D2L) pose a scenario and prompt students to check all the boxes that apply. For example, students may be prompted to read a short case or lab report, then check all statements that are true about that reading. In this way, a single question stem can assess multiple levels of complexity and/or comprehension.
Benefits as noted from the literature:
Open-book exams and collaborative exams promote development of critical thinking skills.
Open-book exams are more engaging and require higher-order thinking skills.
Application of open-book exams simulates the working environment.
Students prefer open-book exams and report decreased anxiety levels.
Collaborative exams stimulate discussion and the development of more complex cognitive skills.
Johanns, B., Dinkens, A., & Moore, J. (2017). A systematic review comparing open-book and closed-book examinations: Evaluating effects on development of critical thinking skills. Nurse education in practice, 27, 89-94. https://www.sciencedirect.com/science/article/abs/pii/S1471595317305486
Couch, B. A., Hubbard, J. K., & Brassil, C. E. (2018). Multiple–true–false questions reveal the limits of the multiple–choice format for detecting students with incomplete understandings. BioScience, 68(6), 455-463. https://doi.org/10.1093/biosci/biy037
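To make the multiple-true/false format concrete, here is a small scoring sketch. It is a hypothetical illustration of the idea (not necessarily how D2L grades these items): each checkbox is judged independently, so one stem yields several true/false decisions and supports partial credit:

```python
def score_multiple_select(options, correct, response):
    """Partial-credit score for a check-all-that-apply item.

    Each option is judged independently -- checked when it should be,
    or left unchecked when it should be -- and the score is the
    fraction of options judged correctly.
    """
    judged_right = sum(1 for opt in options
                       if (opt in correct) == (opt in response))
    return judged_right / len(options)

# Student checked A, B, and C, but only A and C are actually true:
score = score_multiple_select(["A", "B", "C", "D"], {"A", "C"}, {"A", "B", "C"})
print(score)  # 0.75
```

Scoring each statement separately is what lets a single stem distinguish complete from partially complete understanding.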
Implementation for multiple lower-stakes and open note assessment strategies:
Timed vs. untimed: On the whole, performance on timed and untimed assessments yields similar scores. Students express greater anxiety over timed assessments, while they view untimed assessments as more amenable to dishonest behavior.
NOTE: If you typically have a time limit on your face-to-face assessments, increase it by 20% to allow for the added demands the remote environment places on students.
If the exam is meant to be taken synchronously, remember to stay within your class period. Adjust the length of the exam accordingly.
Reduced scope: Decreasing content covered in the exam may be necessary to create an exam of appropriate length and complexity, given the unique circumstances this semester.
Question pools: Create a pool of questions, and let D2L randomly populate each student’s quiz. This helps reduce dishonest behavior.
For example, a 10 question quiz might have 18 total questions in the pool, 10 of which are randomly distributed to each student by D2L.
Randomize answer order: In questions in which it makes sense, have D2L randomize the order in which the answer options appear.
Individual question per page: This can reduce instances of students taking the assessment together. It is even more effective when question order is randomized and a question pool is used.
Honor code attestation: Give students an opportunity to affirm their intent to be honest by making question one of every assessment a 0-point question asking students to agree to an honor code. You can access the MSU Honor Code: https://www.deanofstudents.msu.edu/academic-integrity
Live Zoom availability: In D2L Quizzes, set a time window during which the assessment will be available to students.
Hold a live open office hours session in Zoom at some point during that window so that students who want to can take the assessment while they have direct access to you; this way, they can ask questions if any arise.
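Several of the implementation tips above (question pools, randomized answer order) amount to per-student random sampling and shuffling. The following is a hedged sketch of that idea; the function and data shapes are illustrative, not D2L internals:

```python
import random

def build_student_quiz(question_pool, num_questions, seed=None):
    """Draw a random subset of a question pool for one student and
    shuffle each question's answer options, imitating D2L's question
    pools combined with randomized answer order."""
    rng = random.Random(seed)                 # e.g., seed per student ID
    drawn = rng.sample(question_pool, num_questions)
    quiz = []
    for stem, options in drawn:
        shuffled = list(options)
        rng.shuffle(shuffled)                 # randomize answer order
        quiz.append((stem, shuffled))
    return quiz

# A 10-question quiz drawn from an 18-question pool:
pool = [(f"Question {n}", ["A", "B", "C", "D"]) for n in range(1, 19)]
quiz = build_student_quiz(pool, num_questions=10, seed=42)
print(len(quiz))  # 10
```

Because each student sees a different sample in a different order, answer-sharing during the exam becomes much less useful.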
Ultimately, our guiding principles for remote teaching are flexibility, generosity, and transparency. Try to give students as much of an opportunity to demonstrate their knowledge as possible.
Consider allowing multiple attempts on an assessment.
When conditions allow, consider allowing multiple means of expression.
Can students choose how to demonstrate their knowledge from a menu of options?
M/C test
Written response
Video presentation
Oral Exam (via Zoom)
Consider giving students choices. Perhaps they can opt out of answering a question or two. Perhaps they can choose which of a series of prompts to respond to. Perhaps students can waive one test score (to help accommodate their rapidly changing circumstances).
Proctored assessments
Description: Respondus Lockdown Browser and Respondus Monitor are tools for remote proctoring in D2L. More information is available at https://help.d2l.msu.edu/node/4686. Please consider whether your assessments can be designed without the need for Respondus. While Respondus may be helpful in limited circumstances (e.g., when assessments must be proctored for accreditation purposes), introducing a new technology may cause additional stress for both students and instructors, and academic integrity is still not assured. High-stakes exams (those that are a large percentage of a student’s grade) that use new technologies and approaches can decrease student performance and may not reflect students’ understanding of the material. Please do not use an online proctored approach unless your assessment needs require its use.
Benefits:
Increases the barrier to academic dishonesty. Allows for use of existing exams (assuming they are translated into D2L’s Quizzes tool).
Implementation:
Any online proctored exam must be created and administered using D2L’s Quizzes tool.
Prior to offering a graded proctored exam, we strongly recommend that you administer an ungraded (or very low-stakes) practice test using the proctoring tool.
Clear communication with students about system and hardware requirements and timing considerations is required.
MSU has gained temporary no-cost access to a pair of online proctoring tools provided by Respondus: https://help.d2l.msu.edu/node/4686
Respondus Lockdown Browser requires that students download a dedicated web browser.
When they click into your exam, the Lockdown Browser opens, and prevents users from accessing anything else on their computer.
Respondus Monitor requires use of Respondus Lockdown Browser and a webcam.
Students are monitored via the webcam while they complete the exam in Lockdown Browser.
Additional Resources:
Remote Assessment Quick Guide
Remote Assessment Video Conversation
D2L Quizzes Tool Guide
Self-training on D2L Quizzes (login to MSU’s D2L is required; self-enroll into the training course)
References:
Alessio, H. M., Malay, N., Maurer, K., Bailer, A. J., & Rubin, B. (2017). Examining the effect of proctoring on online test scores. Online Learning, 21(1).
Altınay, Z. (2017). Evaluating peer learning and assessment in online collaborative learning environments. Behaviour & Information Technology, 36(3), 312-320. https://doi.org/10.1080/0144929X.2016.1232752
Couch, B. A., Hubbard, J. K., & Brassil, C. E. (2018). Multiple-true-false questions reveal the limits of the multiple-choice format for detecting students with incomplete understandings. BioScience, 68(6), 455-463. https://doi.org/10.1093/biosci/biy037
Cramp, J., Medlin, J. F., Lake, P., & Sharp, C. (2019). Lessons learned from implementing remotely invigilated online exams. Journal of University Teaching & Learning Practice, 16(1).
Guerrero-Roldán, A., & Noguera, I. (2018). A model for aligning assessment with competences and learning activities in online courses. The Internet and Higher Education, 38, 36-46. https://doi.org/10.1016/j.iheduc.2018.04.005
Johanns, B., Dinkens, A., & Moore, J. (2017). A systematic review comparing open-book and closed-book examinations: Evaluating effects on development of critical thinking skills. Nurse Education in Practice, 27, 89-94. https://www.sciencedirect.com/science/article/abs/pii/S1471595317305486
Rios, J. A., & Liu, O. L. (2017). Online proctored versus unproctored low-stakes internet test administration: Is there differential test-taking behavior and performance? American Journal of Distance Education, 31(4), 226-241. https://doi.org/10.1080/08923647.2017.1258628
Schrank, Z. (2016). An assessment of student perceptions and responses to frequent low-stakes testing in introductory sociology classes. Teaching Sociology, 44(2), 118-127. https://journals.sagepub.com/doi/abs/10.1177/0092055X15624745
Soffer, T., et al. (2017). Assessment of online academic courses via students' activities and perceptions. Studies in Educational Evaluation, 54, 83-93. https://doi.org/10.1016/j.stueduc.2016.10.001
Tan, C. (2020). Beyond high-stakes exam: A neo-Confucian educational programme and its contemporary implications. Educational Philosophy and Theory, 52(2), 137-148. https://doi.org/10.1080/00131857.2019.1605901
VanPatten, B., Trego, D., & Hopkins, W. P. (2015). In-class vs. online testing in university-level language courses: A research report. Foreign Language Annals, 48(4), 659-668. https://onlinelibrary.wiley.com/doi/abs/10.1111/flan.12160
Authored by:
Jessica Knott, Stephen Thomas, Becky Matz, Kate Sonka, Sa...

Posted on: MSU Online & Remote Teaching

Exam Strategy for Remote Teaching
ASSESSING LEARNING
Tuesday, Jul 7, 2020
Posted on: Spring Conference on Teaching & Learning
PEDAGOGICAL DESIGN
Keynote I: Drawing to Teach: Visualizing our Curriculum for Reflection and Community
Stephen Thomas
Title: Drawing to Teach: Visualizing our Curriculum for Reflection and Community
Location: Room 2130
College courses and programs of study comprise a complex arrangement of structures and processes that can make them difficult to conceptualize or communicate to others. When describing a course to others, we often fall back on simplistic narratives of the topic without referencing the pedagogy, assessment, learning environment, resources, student engagement, or a myriad of other impactful features. In this presentation we will look at what it might mean to use visual tools and formats to more formatively represent our curriculum, allowing us to reflect on our teaching, receive feedback from colleagues, and foster community around our teaching efforts.
Dr. Stephen Thomas is the Assistant Dean for STEM Education Teaching and Learning, the Associate Director for the Center for Integrative Studies in General Science, and the Digital Curriculum Coordinator for the College of Natural Science at MSU. For his bachelor’s degree from Denison University, Stephen majored in Biology and minored in Art. This interest in the science/art intersection continued into graduate school, as he freelanced as a biological illustrator while earning his master’s and Ph.D. at the University of Massachusetts Amherst in Organismal and Evolutionary Biology and Entomology. Since coming to MSU, Stephen’s focus has shifted from the virulence of fungal pathogens of Lymantria dispar to the visual communication of science in formal and informal settings and the use of technology in teaching. Stephen has worked on projects such as the use of comics to reduce subject anxiety in non-major science courses, the development of a Massive Open Online Course (MOOC) to teach general science, and augmented reality and kiosk games to engage visitors in science museums. In more recent projects, Stephen has worked on curriculum for Drawing to Learn Biology, where students explore the science practices of observation and visual model-based reasoning through nature journaling. In his professional development work, Stephen collaborates with Dr. Julie Libarkin on building communities of practice in STEM teaching, STEM education research, and interdisciplinary experiences in art, science, and culture. You can learn more about this work at the STEMed@State website.
Authored by:
Stephen Thomas, Associate Director, CISGS; Assistant Dean...

Monday, May 1, 2023
Posted on: Teaching Toolkit Tailgate
NAVIGATING CONTEXT
Image from insidehighered.com
How do MSU faculty view their strengths and weaknesses as educators?
What resources do they need to continue to grow?
In 2018, our Learning Community of Adams Academy graduates surveyed 215 faculty to find out.
Here are some of our results:
Strengths: We see ourselves as having more strengths than challenges, especially:
Teaching with enthusiasm
Fostering active learning
Female respondents: mentoring, teaching teachers, facilitating connections and creating community.
Challenges:
Student assessment was the most commonly cited challenge
Fostering active learning (again!)
Fostering dialogue
Familiarity with evidence-based teaching practices: much variation!
In the Broad College of Business and the College of Music, no respondents were familiar with the concept (or at least the term).
In James Madison College, the College of Law, the College of Veterinary Medicine, and the College of Osteopathic Medicine, all respondents were familiar with it.
Labor categories: only tenure-track and “other” respondents showed a plurality of “no” responses; tenured, fixed-term, and academic specialist respondents showed a plurality of “yes” answers.
Barriers to developing teaching practice:
“More time” was the number-one response.
Most frequently used resources for developing teaching practice:
Brown Bag or Learn at Lunch presentations
Departmental workshops
Academic Advancement Network
MSU Learning Communities
Following our survey, in 2019 we developed a peer-observation protocol.
If you’re interested in trying it out, either in your own department or with one of our group, please contact Mike or Cheryl.
Dr. Cheryl Caesar, caesarc@msu.edu
Dr. Michael Ristich, ristich@msu.edu
MSU Faculty Attitudes towards Teaching: Reports from the Field
Authored by:
Cheryl Caesar and Mike Ristich
Monday, Jul 27, 2020