Posted on: Ungrading (a CoP)

Beyond Buzzwords: The Practice of Ungrading
What is ungrading?
“Ungrading is a practice which eliminates or greatly minimizes the use of assigned points or letter grades in a course, focusing instead on providing frequent and detailed feedback to students on their work, in relation to the course learning goals…the primary purpose of the assessment is to help students learn and improve their knowledge and skills, rather than to create a summative score that students use to compare themselves against an external credential.” (Kenyon, 2022)
MAET Principles:
Providing support from application to beyond graduation
Valuing diversity of resources, perspectives, and communities
Promoting growth as curious learners and transformational leaders
What is ungrading in MAET?
Why Ungrading?
Act of social justice
We are biased
Grades are problematic
Better learning
MAET Ungrading Overview
All learners begin with a 4.0
Weekly unit schedule with due dates
Learners submit creations (assignments)
Instructors provide timely, balanced, qualitative feedback
Learners iterate creations
Instructors review iterations
Learners reflect on progress
Submit final grades to MSU
How is this different from what we used to do?
Program wide ungrading (all courses)
No points for assignments
No participation points
Gradebook has only 1 item (final grade)
Instructor communicates if student work does not meet expectations/falls below 4.0
Students reflect on learning/grade twice
Use single-point rubric for feedback
Lessons Learned and MAET Recommends
Unveil and define beliefs
How do you communicate expectations?
What is sufficient? Exemplary?
What is a 4.0? 3.5?
Can all students get a 4.0? Should they?
Who has the responsibility in student experience and student learning?
Responding to students and instructors
Consistency (and iteration)
Realistic timelines (1+ year)
Regular check ins/meetings
Iterate and refine
Some learners still feel anxiety over the potential email telling them their grade has dropped
For more information, access the full slide deck (and source of this article).
ABOUT THE AUTHORS:
Liz Owens Boltz - MAET Director & Instructor
Brittany Dillman - Director of Graduate Certificate Programs, GC Advisor & Instructor
Candace Robertson - Asst Director of Student Experience & Outreach, MA Advisor & Instructor
Heather Williamson - Academic Program Coordinator & Admissions
Authored by: Brittany Dillman, Liz Owens Boltz, Candace Robertson, Heather Williamson
Pedagogical Design
Posted on: #iteachmsu

Focusing on iteration and growth: Making the shift to ungrading
Topic Area: Student Success
Presented By: Candace Robertson, Brittany Dillman, Liz Boltz
Abstract:
How can we support student success by removing the barrier of grading? What impact would this have on feedback and iteration? Members from Team MAET (Master of Arts in Educational Technology) will share how they worked through these questions and others to move the majority of program courses to an ungrading philosophy as an act of social justice. In this session, you will learn from the triumphs, challenges, and solutions of their journey.
Session resources: Google Slidedeck (website)
Authored by: Candace Robertson, Brittany Dillman, Liz Boltz
Assessing Learning
Posted on: #iteachmsu

5 Innovative Grading Strategies: A Quick Guide
Introduction:
As educators seeking to enhance student engagement and learning outcomes, we can find fresh perspectives and effective solutions in innovative grading strategies. Here’s a concise overview of five innovative grading practices:
1. Transparent Grading:
What is it? Transparent grading involves clearly defining and communicating grading criteria, processes, and feedback to students.
Key Elements: Detailed rubrics, open communication, student involvement.
Benefits: Enhanced understanding, improved performance, increased trust.
2. Self-Grading:
What is it? Self-grading allows students to assess their own work, promoting reflection and autonomy.
Key Elements: Self-assessment, reflection, feedback loops.
Benefits: Empowers students, promotes deeper learning, supports self-regulation.
3. Peer Grading (Peer Review):
What is it? Peer grading involves students assessing each other’s work, enhancing collaboration and responsibility.
Key Elements: Peer evaluation, feedback exchange, critical thinking.
Benefits: Deepens understanding, builds skills, fosters collaboration.
4. Gameful or Gamified Grading:
What is it? Gameful grading integrates game design elements, such as points, badges, and leaderboards, into the grading process.
Key Elements: Gamification, student choice, immediate feedback.
Benefits: Increases engagement, enhances mastery, supports skill development.
5. Ungrading:
What is it? Ungrading minimizes or eliminates traditional grades in favor of detailed feedback and alternative assessments.
Key Elements: Detailed feedback, self-assessment, focus on growth.
Benefits: Promotes deep learning, reduces stress, supports equity.
Explore these strategies to boost student engagement and learning outcomes!
Authored by: Monica L. Mills
Assessing Learning
Posted on: MSU Online & Remote...
Exam Strategy for Remote Teaching
With our guiding principles for remote teaching as flexibility, generosity, and transparency, we know that there is no one solution for assessment that will meet all faculty and student needs. From this perspective, the primary concern should be assessing how well students have achieved the key learning objectives and determining what objectives are still unmet. It may be necessary to modify the nature of the exam to allow for the differences of the remote environment. This document, written for any instructor who typically administers an end-of-semester high-stakes final exam, addresses how best to make those modifications. In thinking about online exams, and the current situation for remote teaching, we recommend the following approaches (in priority order) for adjusting exams: multiple lower-stakes assessments, open-note exams, and online proctored exams. When changes to the learning environment occur, creating an inclusive and accessible learning experience for students with disabilities should remain a top priority. This includes providing accessible content and implementing student disability accommodations, as well as considering the ways assessment methods might be affected.
Faculty and students should be prepared to discuss accommodation needs that may arise. The team at MSU Resource Center for Persons with Disabilities (RCPD) will be available to answer questions about implementing accommodations. Contact information for Team RCPD is found at https://www.rcpd.msu.edu/teamrcpd. Below you will find a description of each of the recommendations, tips for their implementation, the benefits of each, and references to pertinent research on each.
There are three primary options*:
Multiple lower-stakes assessments (most preferred)
Open note exams (preferred)
Online proctored exams (if absolutely necessary)
*Performance-based assessments such as laboratory, presentation, music, or art experiences that show proficiency will be discussed in another document
Multiple lower-stakes assessments
Description: The unique circumstances of this semester make it necessary to carefully consider your priorities when assessing students. Rather than being cumulative, a multiple assessment approach makes assessment an incremental process. Students demonstrate their understanding frequently, and accrue points over time, rather than all at once on one test. Dividing the assessment into smaller pieces can reduce anxiety and give students more practice in taking their exams online. For instance, you might have a quiz at the end of each week that students have to complete. Each subsequent quiz can (and should) build on the previous one, allowing students to build toward more complex and rigorous applications of the content. Using this approach minimizes your need to change the types of questions you have been asking to date, since such a change can itself affect student performance (e.g., if you normally ask multiple-choice questions, you can continue to do so). For the remainder of the semester, use the D2L quizzes tool to build multiple smaller assessments. Spread out the totality of your typical final exam over the month of April. This can be as simple as dividing a 100-question final exam into eight 12- or 13-question “synthesis activities” that students complete twice a week.
Benefits as noted from the literature:
No significant differences were observed in terms of keystroke information, rapid guessing, or aggregated scores between proctoring conditions;
More effective method for incentivizing participation and reading;
Encourages knowledge retention as each subsequent assessment builds on the last
Rios, J. A., & Liu, O. L. (2017). Online proctored versus unproctored low-stakes internet test administration: Is there differential test-taking behavior and performance? American Journal of Distance Education, 31(4), 226-241. https://www.tandfonline.com/doi/abs/10.1080/08923647.2017.1258628
Schrank, Z. (2016). An assessment of student perceptions and responses to frequent low-stakes testing in introductory sociology classes. Teaching Sociology, 44(2), 118-127. https://journals.sagepub.com/doi/abs/10.1177/0092055X15624745
VanPatten, B., Trego, D., & Hopkins, W. P. (2015). In-class vs. online testing in university-level language courses: A research report. Foreign Language Annals, 48(4), 659-668. https://onlinelibrary.wiley.com/doi/abs/10.1111/flan.12160
Open note exams
Description: Open note assessments allow students to refer to the Internet and other materials while completing their assessments. By design, this disincentivizes academic dishonesty. Often instructors put time parameters around open note exams. These types of exams also lend themselves to collaborative work in which multiple students work together to complete the assessment. With an open note strategy, you can keep your general exam schedule and point structure, but you may need to revise questions so they are less about factual recall and more about the application of concepts. For instance, you might give students a scenario or case study to which they have to apply class concepts, as opposed to asking for specific values or definitions. If you plan to make such changes, communicate your intent and rationale to your students prior to the exam. One effective open note testing technique is to use multiple-true/false questions as a means to measure understanding. These questions (called “multiple selection” questions in D2L) pose a scenario and prompt students to check all the boxes that apply. For example, students may be prompted to read a short case or lab report, then check all statements that are true about that reading. In this way a single question stem can assess multiple levels of complexity and/or comprehension.
Benefits as noted from the literature:
Open-book exams and collaborative exams promote development of critical thinking skills.
Open-book exams are more engaging and require higher-order thinking skills.
Application of open-book exams simulates the working environment.
Students prefer open-book exams and report decreased anxiety levels.
Collaborative exams stimulate brain cell growth and intricate cognitive complexes.
Johanns, B., Dinkens, A., & Moore, J. (2017). A systematic review comparing open-book and closed-book examinations: Evaluating effects on development of critical thinking skills. Nurse education in practice, 27, 89-94. https://www.sciencedirect.com/science/article/abs/pii/S1471595317305486
Couch, B. A., Hubbard, J. K., & Brassil, C. E. (2018). Multiple–true–false questions reveal the limits of the multiple–choice format for detecting students with incomplete understandings. BioScience, 68(6), 455-463. https://doi.org/10.1093/biosci/biy037
Implementation for multiple lower-stakes and open note assessment strategies:
Timed vs. untimed: On the whole, performance on timed and untimed assessments yields similar scores. Students express greater anxiety over timed assessments, while they view untimed assessments as more amenable to dishonest behavior.
NOTE: If you typically have a time limit on your face-to-face assessments, increase it by 20% to allow for the added demands the remote environment places on students.
If the exam is meant to be taken synchronously, remember to stay within your class period. Adjust the length of the exam accordingly.
Reduced scope: Decreasing content covered in the exam may be necessary to create an exam of appropriate length and complexity, given the unique circumstances this semester.
Question pools: Create a pool of questions, and let D2L randomly populate each student’s quiz. This helps reduce dishonest behavior.
For example, a 10 question quiz might have 18 total questions in the pool, 10 of which are randomly distributed to each student by D2L.
Randomize answer order: In questions in which it makes sense, have D2L randomize the order in which the answer options appear.
Individual question per page: This can reduce instances of students taking the assessment together. It is even more effective when question order is randomized and a question pool is used.
Honor code attestation: Give students an opportunity to affirm their intent to be honest by making the first question of every assessment a 0-point question that asks them to agree to an honor code. You can access the MSU Honor Code at https://www.deanofstudents.msu.edu/academic-integrity
Live Zoom availability: In D2L Quizzes, set a time window during which the assessment will be available to students.
Hold a live open office hours session in Zoom at some point during that window, so that students who want to can take the assessment while they have direct access to you - this way they can ask questions if any arise.
Ultimately, our guiding principles for remote teaching are flexibility, generosity, and transparency. Try to give students as much of an opportunity to demonstrate their knowledge as possible.
Consider allowing multiple attempts on an assessment.
When conditions allow, consider allowing multiple means of expression.
Can students choose to demonstrate their knowledge from a menu of options?
Multiple-choice test
Written response
Video presentation
Oral Exam (via Zoom)
Consider giving students choices. Perhaps they can opt out of answering a question or two. Perhaps they can choose which of a series of prompts to respond to. Perhaps students can waive one test score (to help accommodate their rapidly changing environments).
Proctored assessments
Description: Respondus Lockdown Browser and Respondus Monitor are tools for remote proctoring in D2L. More information is available at https://help.d2l.msu.edu/node/4686. Please consider whether your assessments can be designed without the need for Respondus. While Respondus may be helpful in limited circumstances (e.g., when assessments must be proctored for accreditation purposes), introducing a new technology may cause additional stress for both students and instructors, and academic integrity is still not assured. High-stakes exams (those that are a large percentage of a student’s grade) that use new technologies and approaches can decrease student performance and may not reflect students’ understanding of the material. Please do not use an online proctored approach unless your assessment needs require its use.
Benefits:
Increases the barrier to academic dishonesty. Allows for use of existing exams (assuming they are translated into D2L’s Quizzes tool).
Implementation:
Any online proctored exam must be created and administered using D2L’s Quizzes tool.
Prior to offering a graded proctored exam, we strongly recommend that you administer an ungraded (or very low-stakes) practice test using the proctoring tool.
Clear communication with students about system and hardware requirements and timing considerations is required.
MSU has gained temporary no-cost access to a pair of online proctoring tools provided by Respondus: https://help.d2l.msu.edu/node/4686
Respondus Lockdown Browser requires that students download a dedicated, locked-down web browser.
When they click into your exam, the Lockdown Browser opens, and prevents users from accessing anything else on their computer.
Respondus Monitor requires use of Respondus Lockdown Browser and a webcam.
Students are monitored via the webcam while they complete the exam in Lockdown Browser.
Additional Resources:
Remote Assessment Quick Guide
Remote Assessment Video Conversation
D2L Quizzes Tool Guide
Self-training on D2L Quizzes (login to MSU’s D2L is required; self-enroll into the training course)
References:
Alessio, H. M., Malay, N., Maurer, K., Bailer, A. J., & Rubin, B. (2017). Examining the effect of proctoring on online test scores. Online Learning, 21(1).
Altınay, Z. (2017). Evaluating peer learning and assessment in online collaborative learning environments. Behaviour & Information Technology, 36(3), 312-320. DOI: 10.1080/0144929X.2016.1232752
Couch, B. A., Hubbard, J. K., & Brassil, C. E. (2018). Multiple–true–false questions reveal the limits of the multiple–choice format for detecting students with incomplete understandings. BioScience, 68(6), 455-463. https://doi.org/10.1093/biosci/biy037
Cramp, J., Medlin, J. F., Lake, P., & Sharp, C. (2019). Lessons learned from implementing remotely invigilated online exams. Journal of University Teaching & Learning Practice, 16(1).
Guerrero-Roldán, A., & Noguera, I. (2018). A model for aligning assessment with competences and learning activities in online courses. The Internet and Higher Education, 38, 36-46. doi:10.1016/j.iheduc.2018.04.005
Johanns, B., Dinkens, A., & Moore, J. (2017). A systematic review comparing open-book and closed-book examinations: Evaluating effects on development of critical thinking skills. Nurse Education in Practice, 27, 89-94. https://www.sciencedirect.com/science/article/abs/pii/S1471595317305486
Rios, J. A., & Liu, O. L. (2017). Online proctored versus unproctored low-stakes internet test administration: Is there differential test-taking behavior and performance? American Journal of Distance Education, 31(4), 226-241. DOI: 10.1080/08923647.2017.1258628
Schrank, Z. (2016). An assessment of student perceptions and responses to frequent low-stakes testing in introductory sociology classes. Teaching Sociology, 44(2), 118-127. https://journals.sagepub.com/doi/abs/10.1177/0092055X15624745
Soffer, T., et al. (2017). Assessment of online academic courses via students' activities and perceptions. Studies in Educational Evaluation, 54, 83-93. doi:10.1016/j.stueduc.2016.10.001
Tan, C. (2020). Beyond high-stakes exam: A neo-Confucian educational programme and its contemporary implications. Educational Philosophy and Theory, 52(2), 137-148. DOI: 10.1080/00131857.2019.1605901
VanPatten, B., Trego, D., & Hopkins, W. P. (2015). In-class vs. online testing in university-level language courses: A research report. Foreign Language Annals, 48(4), 659-668. https://onlinelibrary.wiley.com/doi/abs/10.1111/flan.12160
Authored by: Jessica Knott, Stephen Thomas, Becky Matz, Kate Sonka, Sarah Wellman, Daniel Trego, Casey Henley, Jeremy Van Hof, David Howe
Assessing Learning
Posted on: Educator Stories

Brittany Dillman's Educator Story
This week, we are featuring Brittany Dillman, MAET Graduate Certificate Program Coordinator, within the Department of Counseling, Educational Psychology and Special Education at MSU. Brittany was recognized via iteach.msu.edu's Thank an Educator Initiative! We encourage MSU community members to nominate high-impact Spartan educators (via our Thank an Educator initiative) regularly!
Read more about Brittany’s perspectives below. #iteachmsu's questions are bolded below, followed by her responses!
You were recognized via the Thank an Educator Initiative. In one word, what does being an educator mean to you?
Love.
Share with me what this word/quality looks like in your practice?
Being an educator is an act of love. I have always known this, but I don’t think I have always been able to (brave enough or self-secure enough) enact this in ways that I do now.
Have your ideas on this changed over time? If so, how?
In previous versions of my educator self, I put content first. Now, I put learners first. This includes checking in with them kindly, sharing myself and my humanity (flaws, quirks, and challenges), and giving them lots of chances. I have learned so much from the work of Lisa Laughman and the MSU Health4U program about emotional wellness to help me make the shift from content first to learners first.
Tell me more about your educational “setting.” This can include, but not limited to departmental affiliations, community connections, co-instructors, and students. (Aka, where do you work?)
I am the Graduate Certificate (GC) Programs Director for the Master of Arts in Educational Technology (MAET) program in the College of Education at MSU. This is my favorite job in my life (so far). In my position, I work very closely with my team to create and maintain curriculum, advise GC students, teach online and hybrid master’s-level courses, support a group of phenomenal adjunct instructors, promote our program and the amazing work of our instructors and learners via social media, recruit students, and anything else that comes up.
What is a challenge you experience in your educator role?
The biggest challenge we face in our program is the rising cost of MSU tuition and the barriers it creates for potential learners, particularly learners of color and learners with disabilities. Our program is phenomenal and students are consistently, incredibly pleased with their experience with us, but the cost is prohibitive for too many learners. In addition to our “assigned duties,” my colleagues and I are looking for grants and other ways we can support our students financially. We haven’t had the success that we want with this, but are continuing to explore big and small ways we can support our learners.
Any particular “solutions” or “best practices” you’ve found that help you support student success at the university despite/in the face of this? What are practices you utilize that help you feel successful as an educator?
I work with an amazing team (program staff and adjunct faculty) and we intentionally focus time and energy on how we work as a team, building our team’s strength, and ways we can improve as a team. This provides the foundation for all of our other work. You’ll notice that most of my answers center around how my team functions because that is what supports me as an educator.
We have a shared mission and vision that we all believe in and buy into. We have it on our website, in our presentations, and on our meeting agendas. We use that to guide us in our decisions. I feel like that sounds a bit cheesy, but it’s true and it really helps!
We believe in and use backward design for course design, and also for program design and decisions to move us forward. So, we make decisions that lead us toward our mission and vision.
Along the way, we make mistakes and so we iterate. One of our instructors emailed me yesterday and wrote, “You are masters of iteration!” We aren’t perfect, but we try to get better.
We rely on each other and our strengths. I bring organization (and spreadsheets!). Other colleagues bring creativity, writing, and networking/connections. We don’t pigeonhole ourselves into these archetypes, but we build off of the best of what we can each bring.
We push each other, question each other, and engage in critical questioning with our ideas. We do this in safe and kind ways, but it helps us all get better when one person asks a question like “have we considered this other way?”
We treat each other, our learners, and instructors, as human beings who are amazing and flawed. We respect each others’ humanity and help when we can. It’s not perfect, but we do our best.
What topics or ideas about teaching and learning would you like to see discussed on the iteach.msu.edu platform? Why do you think this conversation is needed at MSU?
Because my program is educational technology, there is often an assumption that we know everything - all the tools, apps, programs, and all the tricks and shortcuts. We don’t. We ground our program in pedagogy and thoughtful design based on the TPACK Framework (Mishra & Koehler, 2006). I wish all Spartan educators would approach curriculum, design, teaching, instruction, and assessment from a thoughtful, human-centered perspective. How do you design your course to best fit your content, your context, your students, your available technologies, and pedagogies? Then, how do we teach in alignment with this? Then, how do we assess students? Then, how do we improve the whole cycle for the next round? Will specific technologies be a part of this process? Of course! But my wish is that we can keep students at the center of all our work. Two of my favorite MSU events that do this are the Accessible Learning Conference (held in the fall) and the Spring Conference on Teaching, Learning, and Student Success (held in May annually). So, if you are seeking fellow Spartans with this perspective, I recommend starting in those places.
What are you looking forward to (or excited to be a part of) next semester?
So much!
My colleagues and I have been taking some Quality Matters courses to learn more about their research-based best practices for online education. So, I am excited to use some of my new knowledge this fall with students and experience the impact of some of the design decisions we have made based on our new and improved knowledge.
I haven’t taught, yet, in 2021 (based on my work schedule and some course buyouts) so I am excited to teach this fall. Our program shifted to a program-wide ungrading philosophy and practice in Fall 2020 and I am excited to get “back” into that now that we’ve had a chance to iterate and improve it.
I am looking forward to my children starting school (they just turned 5), to experiencing their continued growth and learning...and to being a parent of kids who are in school (a new experience for me).
Finally (and maybe most of all) I’m looking forward to fall weather. I know we need to appreciate all of Michigan’s amazing seasons, but fall is my favorite! I look forward to crisp days, colorful leaves, apple cider, donuts, and pumpkin spice flavored everything for the few short weeks it’s with us. I’m so sorry that pumpkin spice has gotten such a bad reputation in the last few years (though pumpkin spice flavored goat cheese does take the trend a smidge too far for even me). So, if there are fellow fall and pumpkin spice lovers out there who want to connect (or talk about pedagogy and teaching), please email me: dillmanb@msu.edu
Don't forget to celebrate individuals you see making a difference in teaching, learning, or student success at MSU with #iteachmsu's Thank an Educator initiative. You might just see them appear in the next feature! Follow the MSU Hub Twitter account to see other great content from the #iteachmsu Commons as well as educators featured every week during #ThankfulThursdays.
Posted by: Makena Neal
Pedagogical Design
Posted on: #iteachmsu Educator...

College of Education 2021 #iteachmsu Educator Award Recipients
The following is a list of the educators receiving the #iteachmsu Educator Award from the College of Education. For more information on these awards, check out the article entitled "#iteachmsu Educator Awards".
Kris Surla: Kristen’s impact on me has been so great that you’re just going to have to wait to read about it—in my book.
That’s right. That’s how deep it is and how deep it goes.
Kris, from infinity to infinity. There are no words in this language we share to describe the depth, the vastness, the expansive nature of my love and gratitude for you and all that you have done for me, with me, and alongside me.
You are the greatest of all time — a supernova in my galaxy! Shine bright, fight on. You’re the Yuri to my Malcolm.
Candace Robertson: Candace has been a major help as my advisor and mentor. She continues to see the great in me and wants my education at Michigan State to be the best, setting me up for success in a way my undergraduate program never did. She is always checking in to make sure I have what I need and that I'm okay! Candace is what Michigan State is all about. She has also provided me with the resources to begin ungrading in my classroom, making me the first teacher in AACPS to adopt that grading policy. I'm so happy to have Candace as a part of my Michigan State experience and I hope she continues to stay proud of me! Thank you for all you do!
Brittany Dillman: Brittany demonstrates the epitome of student support, critical analysis, and curriculum development. For me, she was *hands down* the most supportive educator at MSU during a very difficult Fall 2020 semester due to socioeconomic and health issues caused by the pandemic. Brittany was caring and persistent in her communication with me to make sure I developed a plan to successfully manage my coursework. For many students undergoing a crisis, this kind of "hands-on" advising is needed. Brittany understood that and never once made me feel like I was a "burdensome" or "irresponsible" student-- because that's not who students are when they are going through traumatic experiences. I am now working again with Brittany in Spring 2021 and have received so much practical and thoughtful feedback on my course content. All of the advice and suggestions that Brittany gives me feel so tangible because she takes the time to provide detailed feedback. I have implemented many of the ideas and content I've gained through the MAET program in my other roles at MSU and have received nothing but positive feedback. Brittany is leading the hell out of this program and I have learned so much from her. Thank you for everything you do Brittany, because I see you and am deeply grateful that you are in this role.
Spencer Morgan: Spencer served as the graduate coordinator for the MSU Community Engagement Scholars Program for the 2020-2021 academic year. He is a valuable member of the team in the Center for Community Engaged Learning. Spencer developed positive and sincere relationships with the undergraduate scholars, community partners, and colleagues in our center. His thoughtful planning and willingness to guide the program in a totally virtual format are impressive and appreciated. Spencer is creative, professional, kind, and an outstanding mentor. He facilitated and led professional development sessions for students and partners and has assisted in further developing our program assessment strategies. I am so thankful for his patience and perseverance through a challenging time in the lives of so many. He contributed greatly to our ability to offer this important program during a global pandemic. Thank you Spencer! Your contributions are appreciated and your impact on the lives of the scholars and partners will be lasting. I am confident that your future as a Student Affairs Professional is going to be amazing!
Anyone can recognize a fellow Spartan for their contributions to MSU's teaching and learning mission or for how they made a lasting impression on your experience. All you have to do is click "Thank an Educator" in the left panel of iteach.msu.edu. From there you'll see a short form where you can enter the name, netID, and a short story of the educator you'd like to recognize.
Posted by: Makena Neal
Pedagogical Design
Posted on: #iteachmsu

Welcome to My Classroom with Dr. Valerie Hedges
The "Welcome to My Classroom" series functions like a pedagogy and practice show and tell where educators from throughout MSU's ecosystem share something from their teaching and learning practice. Valerie shared the ways she has integrated practices in her courses to enhance and center equitable opportunities for learning!
Here are some key take-aways from Dr. Hedges:
When it comes to syllabus language, be transparent about your choices and don't be afraid to cite sources for your rationale. We ask students to cite their sources; we should, too. If you need help surfacing and/or naming your pedagogical practices, contact the Center for Teaching and Learning Innovation!
Fostering a sense of belonging is important to student success. Instructors can design interactions at three levels to help promote a student's sense of belonging: learner-learner interactions, learner-instructor interactions, and learner-content interactions. Check out the recording (below) for more on each!
Being flexible can make a big impact. Where and when do students in your course have a sense of choice or agency in their learning? Are you sharing content in ways that allow people multiple modes of engagement? What barriers to accessing your learning experience exist? What are your current late work policies (and why do they exist - see takeaway bullet one)?
Not all the things "we've always done" are the best way of doing things. When it comes to grading, one simple way to make your practices more equitable is to remove participation and attendance based grades. If you want to consider bigger shifts, you might think about giving students multiple attempts at quizzes. Valerie incorporates feedback and learner reflection into this practice, and has ultimately moved away from a point-based grading system to what she calls "ungrading-lite."
A more student-centered course with a focus on equitable practices has ultimately contributed to a more accommodating and empathetic environment for all!
Resources for Continued Growth:
To support your ongoing professional development please consider these resources:
Slide Deck: Access Valerie’s Welcome to My Classroom slide deck which outlines why equitable pedagogy is important, shares examples of how Valerie fosters a sense of belonging through a welcoming course structure, and highlights key considerations of equity in assessments and grading.
Syllabus Example: In the Q&A following Valerie's formal presentation she shared an example of one of her course syllabi to demonstrate the language she uses to set the tone for her learning environment, describe her approach to grading, and more.
Online Discussion: Do you have examples of equitable, inclusive educator practices that you'd be willing to share broadly? Consider adding an article describing your practice, outlining an activity, or even reflecting on an experience! You can also share how Valerie's talk sparked ideas and questions about equitable pedagogy in the comments below. Both can be done by logging in to the #iteachmsu commons (you're already here!) with your MSU netID (click "log in" in the upper right corner)!
Recording: In case you missed the session or would like to revisit it, you can view the full recording on MediaSpace (also embedded below).
The cover photo for this article was sourced from "EquityTool".
Posted by: Makena Neal
Pedagogical Design
Posted on: #iteachmsu

Exam Strategy for Online and Distance Teaching
Authors: Jeremy Van Hof, Stephen Thomas, Becky Matz, Kate Sonka, Sarah Wellman, Daniel Trego, Casey Henley, Jessica Knott, David Howe
With our guiding principles for remote teaching being flexibility, generosity, and transparency, we know that there is no one solution for assessment that will meet all faculty and student needs. From this perspective, the primary concern should be assessing how well students have achieved the key learning objectives and determining what objectives are still unmet. It may be necessary to modify the nature of the exam to allow for the differences of the online environment. This document, written for any instructor who typically administers an end-of-semester high-stakes final exam, addresses how best to make those modifications. In thinking about online exams, we recommend the following approaches (in priority order) for adjusting exams: multiple lower-stakes assessments, open-note exams, and online proctored exams. When changes to the learning environment occur, creating an inclusive and accessible learning experience for students with disabilities should remain a top priority. This includes providing accessible content and implementing student disability accommodations, as well as considering the ways assessment methods might be affected.
Faculty and students should be prepared to discuss accommodation needs that may arise. The team at MSU Resource Center for Persons with Disabilities (RCPD) will be available to answer questions about implementing accommodations. Contact information for Team RCPD is found at https://www.rcpd.msu.edu/teamrcpd. Below you will find a description of each of the recommendations, tips for their implementation, the benefits of each, and references to pertinent research on each.
There are three primary options*:
Multiple lower-stakes assessments (most preferred)
Open note exams (preferred)
Online proctored exams (if absolutely necessary)
*Performance-based assessments such as laboratory, presentation, music, or art experiences that show proficiency will be discussed in another document
Multiple lower-stakes assessments
Description: The unique circumstances of this semester make it necessary to carefully consider your priorities when assessing students. Rather than being cumulative, a multiple assessment approach makes assessment an incremental process. Students demonstrate their understanding frequently, and accrue points over time, rather than all at once on one test. Dividing the assessment into smaller pieces can reduce anxiety and give students more practice in taking their exams online. For instance, you might have a quiz at the end of each week that students have to complete. Each subsequent quiz can (and should) build on the previous one, allowing students to build toward more complex and rigorous applications of the content. Using this approach minimizes your need to change the types of questions that you have been asking to date, which can affect student performance (e.g. if you normally ask multiple-choice questions, you can continue to do so). For the remainder of the semester, use the D2L quizzes tool to build multiple smaller assessments. Spread out the totality of your typical final exam over the month of April. This can be as simple as dividing a 100 question final exam into eight 12-question “synthesis activities” that students complete bi-weekly.
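To make the arithmetic concrete, here is a minimal Python sketch of dividing a large question bank into eight roughly equal "synthesis activities." It is purely illustrative (D2L handles this through its Quizzes tool, not through code you write), and the question IDs and function name are hypothetical placeholders.

```python
def split_into_activities(question_ids, n_activities=8):
    """Divide an ordered question bank into n roughly equal smaller quizzes."""
    size, extra = divmod(len(question_ids), n_activities)
    activities, start = [], 0
    for i in range(n_activities):
        end = start + size + (1 if i < extra else 0)  # spread any remainder evenly
        activities.append(question_ids[start:end])
        start = end
    return activities

bank = [f"Q{n:03d}" for n in range(1, 101)]  # a 100-question final exam
for week, quiz in enumerate(split_into_activities(bank), start=1):
    print(f"Synthesis activity {week}: {len(quiz)} questions")
```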
Benefits as noted from the literature:
No significant differences were observed in terms of keystroke information, rapid guessing, or aggregated scores between proctoring conditions;
More effective method for incentivizing participation and reading;
Encourages knowledge retention as each subsequent assessment builds on the last
Rios, J. A., & Liu, O. L. (2017). Online proctored versus unproctored low-stakes internet test administration: Is there differential test-taking behavior and performance? American Journal of Distance Education, 31(4), 226-241. https://www.tandfonline.com/doi/abs/10.1080/08923647.2017.1258628
Schrank, Z. (2016). An assessment of student perceptions and responses to frequent low-stakes testing in introductory sociology classes. Teaching Sociology, 44(2), 118-127. https://journals.sagepub.com/doi/abs/10.1177/0092055X15624745
VanPatten, B., Trego, D., & Hopkins, W. P. (2015). In-class vs. online testing in university-level language courses: A research report. Foreign Language Annals, 48(4), 659-668. https://onlinelibrary.wiley.com/doi/abs/10.1111/flan.12160
Open note exams
Description: Open note assessments allow students to refer to the Internet and other materials while completing their assessments. By design, this disincentivizes academic dishonesty. Often instructors put time parameters around open note exams. These types of exams also lend themselves to collaborative work in which multiple students work together to complete the assessment. With an open note strategy, you can keep your general exam schedule and point structure, but you may need to revise questions so they are less about factual recall and more about the application of concepts. For instance, you might give students a scenario or case study to which they have to apply class concepts, as opposed to asking for specific values or definitions. If you plan to make such changes, communicate your intent and rationale to your students prior to the exam. One effective open note testing technique is to use multiple-true/false questions as a means to measure understanding. These questions (called “multiple selection” questions in D2L) pose a scenario and prompt students to check all the boxes that apply. For example, students may be prompted to read a short case or lab report, then check all statements that are true about that reading. In this way a single question stem can assess multiple levels of complexity and/or comprehension.
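As a rough illustration of how a single multiple-true/false stem can capture several levels of understanding, here is a minimal Python sketch of one such item and two possible scoring rules. It is conceptual only, not how D2L's multiple-selection questions are configured or scored, and the stem, statements, and scoring rules are hypothetical.

```python
# One stem, several statements; students check all that apply.
stem = "Based on the short lab report above, check all statements that are true."
answer_key = {
    "The sample size was adequate for the stated conclusion": True,
    "The control condition was missing": False,
    "The effect size was reported": True,
}

def score(responses, key, partial_credit=True):
    """Score checked/unchecked responses against the answer key."""
    correct = sum(responses.get(s, False) == truth for s, truth in key.items())
    if partial_credit:
        return correct / len(key)                  # fraction of statements judged correctly
    return 1.0 if correct == len(key) else 0.0     # all-or-nothing alternative

student_responses = {
    "The sample size was adequate for the stated conclusion": True,
    "The control condition was missing": True,     # incorrect judgment
    "The effect size was reported": True,
}
print(score(student_responses, answer_key))        # 2 of 3 correct -> ~0.67
```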
Benefits as noted from the literature:
Open-book exams and collaborative exams promote development of critical thinking skills.
Open-book exams are more engaging and require higher-order thinking skills.
Application of open-book exams simulates the working environment.
Students prefer open-book exams and report decreased anxiety levels.
Collaborative exams stimulate brain cell growth and intricate cognitive complexes.
Johanns, B., Dinkens, A., & Moore, J. (2017). A systematic review comparing open-book and closed-book examinations: Evaluating effects on development of critical thinking skills. Nurse education in practice, 27, 89-94. https://www.sciencedirect.com/science/article/abs/pii/S1471595317305486
Couch, B. A., Hubbard, J. K., & Brassil, C. E. (2018). Multiple–true–false questions reveal the limits of the multiple–choice format for detecting students with incomplete understandings. BioScience, 68(6), 455-463. https://doi.org/10.1093/biosci/biy037
Implementation for multiple lower-stakes and open note assessment strategies:
Timed vs. untimed: On the whole, performance on timed and untimed assessments yields similar scores. Students express greater anxiety over timed assessments, while they view untimed assessments as more amenable to dishonest behavior.
NOTE: If you typically have a time limit on your face-to-face assessments, increase it by 20% to allow for the added demands a remote (distinct from online) environment places on students.
If the exam is meant to be taken synchronously, remember to stay within your class period. Adjust the length of the exam accordingly.
Reduced scope: Decreasing content covered in the exam may be necessary to create an exam of appropriate length and complexity, given the unique circumstances this semester.
Question pools: Create a pool of questions, and let D2L randomly populate each student’s quiz. This helps reduce dishonest behavior (see the conceptual sketch at the end of these implementation tips).
For example, a 10 question quiz might have 18 total questions in the pool, 10 of which are randomly distributed to each student by D2L.
Randomize answer order: In questions in which it makes sense, have D2L randomize the order in which the answer options appear.
Individual question per page: This can reduce instances of students taking the assessment together. It is even more effective when question order is randomized and a question pool is used.
Honor code attestation: Give students an opportunity to affirm their intent to be honest by making question one of every assessment a 0-point question asking students to agree to an honor code. You can access the MSU Honor Code: https://www.deanofstudents.msu.edu/academic-integrity
Live Zoom availability: In D2L Quizzes, set a time window during which the assessment will be available to students.
Hold a live open office hours session in Zoom at some point during that window, so that students who want to can take the assessment while they have direct access to you - this way they can ask questions if any arise.
Ultimately, our guiding principles for online teaching are flexibility, generosity, and transparency. Try to give students as much of an opportunity to demonstrate their knowledge as possible.
Consider allowing multiple attempts on an assessment.
When conditions allow, consider allowing multiple means of expression.
Can students choose to demonstrate their knowledge from a menu of options?
M/C test
Written response
Video presentation
Oral Exam (via Zoom)
Consider giving students choices. Perhaps they can opt out of answering a question or two. Perhaps they can choose which of a series of prompts to respond to. Perhaps students can waive one test score (to help accommodate their rapidly changing environments).
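Here is a minimal conceptual sketch of the "Question pools" and "Randomize answer order" tips above: each student's quiz is a random draw from a larger pool, with the answer options shuffled per question. This is not the D2L Quizzes API (D2L does this through its quiz settings); the pool contents, student IDs, and answer labels are hypothetical placeholders.

```python
import random

pool = [f"Q{n:02d}" for n in range(1, 19)]   # 18 questions available in the pool
answer_options = ["A", "B", "C", "D"]

def build_quiz(student_id, pool, n_questions=10):
    """Randomly select questions and shuffle answer order for one student."""
    rng = random.Random(student_id)             # seed per student for reproducibility
    questions = rng.sample(pool, n_questions)   # 10 of the 18, no repeats
    return [(q, rng.sample(answer_options, len(answer_options))) for q in questions]

for student in ("student_a", "student_b"):
    quiz = build_quiz(student, pool)
    print(student, [q for q, _ in quiz])        # each student sees a different draw
```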
Proctored assessments
Description: Respondus Lockdown Browser and Respondus Monitor are tools for remote proctoring in D2L. More information is available at https://help.d2l.msu.edu/node/4686. Please consider whether your assessments can be designed without the need for Respondus. While Respondus may be helpful in limited circumstances (e.g., when assessments must be proctored for accreditation purposes), introducing a new technology may cause additional stress for both students and instructors, and academic integrity is still not assured. High-stakes exams (those that are a large percentage of a student’s grade) that use new technologies and approaches can decrease student performance and may not reflect students’ understanding of the material. Please do not use an online proctored approach unless your assessment needs require its use.
Benefits:
Increases the barrier to academic dishonesty. Allows for use of existing exams (assuming they are translated into D2L’s Quizzes tool).
Implementation:
Any online proctored exam must be created and administered using D2L’s Quizzes tool.
Prior to offering a graded proctored exam, we strongly recommend that you administer an ungraded (or very low-stakes) practice test using the proctoring tool.
Clear communication with students about system and hardware requirements and timing considerations is required.
MSU has gained temporary no-cost access to a pair of online proctoring tools provided by Respondus: https://help.d2l.msu.edu/node/4686
Respondus Lockdown Browser requires that students download a web browser.
When they click into your exam, the Lockdown Browser opens, and prevents users from accessing anything else on their computer.
Respondus Monitor requires use of Respondus Lockdown Browser and a webcam.
Students are monitored via the webcam while they complete the exam in Lockdown Browser.
Additional Resources:
Remote Assessment Quick Guide
Remote Assessment Video Conversation
D2L Quizzes Tool Guide
Self-training on D2L Quizzes (login to MSU’s D2L is required; self-enroll into the training course)
References:
Alessio, H. M., Malay, N., Mauere, K., Bailer, A. J., & Rubin, B. (2017). Examining the effect of proctoring on online test scores. Online Learning, 21(1).
Altınay, Z. (2017). Evaluating peer learning and assessment in online collaborative learning environments. Behaviour & Information Technology, 36(3), 312-320. DOI: 10.1080/0144929X.2016.1232752
Couch, B. A., Hubbard, J. K., & Brassil, C. E. (2018). Multiple-true-false questions reveal the limits of the multiple-choice format for detecting students with incomplete understandings. BioScience, 68(6), 455-463. https://doi.org/10.1093/biosci/biy037
Cramp, J., Medlin, J. F., Lake, P., & Sharp, C. (2019). Lessons learned from implementing remotely invigilated online exams. Journal of University Teaching & Learning Practice, 16(1).
Guerrero-Roldán, A., & Noguera, I. (2018). A model for aligning assessment with competences and learning activities in online courses. The Internet and Higher Education, 38, 36-46. doi:10.1016/j.iheduc.2018.04.005
Johanns, B., Dinkens, A., & Moore, J. (2017). A systematic review comparing open-book and closed-book examinations: Evaluating effects on development of critical thinking skills. Nurse Education in Practice, 27, 89-94. https://www.sciencedirect.com/science/article/abs/pii/S1471595317305486
Rios, J. A., & Liu, O. L. (2017). Online proctored versus unproctored low-stakes internet test administration: Is there differential test-taking behavior and performance? American Journal of Distance Education, 31(4), 226-241. DOI: 10.1080/08923647.2017.1258628
Schrank, Z. (2016). An assessment of student perceptions and responses to frequent low-stakes testing in introductory sociology classes. Teaching Sociology, 44(2), 118-127. https://journals.sagepub.com/doi/abs/10.1177/0092055X15624745
Soffer, T., et al. (2017). Assessment of online academic courses via students' activities and perceptions. Studies in Educational Evaluation, 54, 83-93. doi:10.1016/j.stueduc.2016.10.001
Tan, C. (2020). Beyond high-stakes exam: A neo-Confucian educational programme and its contemporary implications. Educational Philosophy and Theory, 52(2), 137-148. DOI: 10.1080/00131857.2019.1605901
VanPatten, B., Trego, D., & Hopkins, W. P. (2015). In-class vs. online testing in university-level language courses: A research report. Foreign Language Annals, 48(4), 659-668. https://onlinelibrary.wiley.com/doi/abs/10.1111/flan.12160
Authored by: Jeremy Van Hof, Stephen Thomas, Becky Matz, Kate Sonka, Sarah Wellman, Daniel Trego, Casey Henley, David Howe, Jessica Knott
Assessing Learning
Posted on: Ungrading (a CoP)
Multiple stories and sentiments were generously shared by participants in the 4/4 Beyond Buzzwords: Ungrading workshop (thank you for your vulnerability and candor) about the varied ways in which students react to, and make assumptions and inferences about, their instructors after the adoption of ungrading and ungrading-inspired practices.
This article (linked below), "Academe Has a Lot to Learn About How Inclusive Teaching Affects Instructors" by Chavella Pittman and Thomas J. Tobin in The Chronicle of Higher Education on February 7, 2022, will likely be of interest to you. It starts by recognizing and acknowledging that the power held by some identities (core, chosen, and given) is not held by others, which complicates the idea that all educators have the same "power and authority" to give up or share in order to increase learners' sense of ownership and agency in the classroom. "What if you have neither the institutional authority (a full-time or tenure-track job) nor the dominant-culture identity (by virtue of your race, gender, and/or ability) that usually go hand in hand with being treated as a respected, powerful presence in the college classroom?... In urging faculty members to adopt inclusive teaching practices, we need to start asking if they actually can — and at what cost," say Pittman and Tobin.
Take-aways shared in this piece include:
1. Understand that your classroom choices may unintentionally affect or undercut a colleague
2. Discuss in your department the issue of bias in students' rating of teaching
3. Respect the variability among your colleagues, as well as among your students
4. Find trained help
"Share your stories, experiences, and thought processes as you negotiate your instructor role in the classroom..." iteach.msu.edu is one space where we can continue to help "normalize the conversation about instructor identity and status as a necessary element in the adoption of inclusive design and teaching practices".
https://www.chronicle.com/article/academe-has-a-lot-to-learn-about-how-inclusive-teaching-affects-instructors
Posted by: Makena Neal
Pedagogical Design
Posted on: Ungrading (a CoP)
The Center for Integrative Studies in the Arts and Humanities invites you to attend a workshop on Alternate Grading on April 21st from 10 to 11:30 am via Zoom.
We are honored to welcome Prof. Nicole Coleman of Wayne State University to run the workshop. If you are interested in learning ways to prioritize learning over grading and to make assessments more meaningful for students, you may want to consider a new grading system. Coleman will lead an interactive program on her experiences with teaching courses in both the Specs Grading and Ungrading structures. She will provide some information on how each system works and the theory behind them. She will then guide educators in adjusting an assignment or a syllabus to work with these methods. Please bring a rubric and/or a syllabus to the session to be able to participate fully in this workshop.
Posted by: Makena Neal
Pedagogical Design
Posted on: Ungrading (a CoP)
Hi Ungraders,
Thanks to all who were able to attend our 4/4 session and greetings to those who weren't able to. I wanted to post the questions that were indicated as "follow up needed" as folks left the space last week so we can continue the conversation, sharing, and support. In no particular order, they are:
1. How to encourage/increase undergrad/student readiness?
2. How do we ensure ungrading doesn't reproduce grade inequities?
3. How do you set students up to manage self-direction?
4. How to negotiate internal and external ecosystems - jobs?
5. How can we keep the atmosphere competitive even with ungrading?
I hope folks can simmer on these and share reflections, ideas, and resources as they come up. Also, I hope folks can add more questions, ideas, and resources.
~Brittany
Posted by: Dillman, Brittany
Navigating Context
Posted on: Ungrading (a CoP)
Jesse Stommel shared this really concise Ungrading FAQ via twitter https://www.jessestommel.com/what-is-ungrading/
Posted by: Makena Neal
Pedagogical Design
Posted on: Ungrading (a CoP)
I first learned about ungrading from the APA's Society for Teaching of Psychology Facebook group, and they had so many conversations around this that they started a Slack channel. I haven't kept up with it, but they use specs grading, which is perhaps a better fit for larger courses. I tried a version of this during remote learning in 2020. Here's the Slack channel if you're interested in joining those conversations: https://join.slack.com/t/specsgradingi-pm17499/shared_invite/zt-1wqybjovp-RXBIkLwF0vxmERppKBMGWw
Posted by: Katie Clements
Assessing Learning