ASSESSING LEARNING

Exam Strategy for Remote Teaching

Authors: Jessica Knott, Stephen Thomas, Becky Matz, Kate Sonka, Sarah Wellman, Daniel Trego, Casey Henley, Jeremy Van Hof, David Howe

With flexibility, generosity, and transparency as our guiding principles for remote teaching, we know that no single assessment solution will meet all faculty and student needs. From this perspective, the primary concern should be assessing how well students have achieved the key learning objectives and determining which objectives remain unmet. It may be necessary to modify the nature of the exam to account for the differences of the remote environment. This document, written for any instructor who typically administers a high-stakes end-of-semester final exam, addresses how best to make those modifications.
 
In thinking about online exams, and the current situation for remote teaching, we recommend the following approaches (in priority order) for adjusting exams: multiple lower-stakes assessments, open-note exams, and online proctored exams.  
When changes to the learning environment occur, creating an inclusive and accessible learning experience for students with disabilities should remain a top priority. This includes providing accessible content and implementing student disability accommodations, as well as considering the ways assessment methods might be affected.

Faculty and students should be prepared to discuss accommodation needs that may arise. The team at the MSU Resource Center for Persons with Disabilities (RCPD) is available to answer questions about implementing accommodations. Contact information for Team RCPD is found at https://www.rcpd.msu.edu/teamrcpd.

Below you will find a description of each recommendation, tips for its implementation, the benefits of each, and references to pertinent research.

There are three primary options*: 

  • Multiple lower-stakes assessments (most preferred)  
  • Open note exams  (preferred)  
  • Online proctored exams (if absolutely necessary)
    • *Performance-based assessments such as laboratory, presentation, music, or art experiences that show proficiency will be discussed in another document

Multiple lower-stakes assessments

Description: 
The unique circumstances of this semester make it necessary to carefully consider your priorities when assessing students. Rather than one cumulative test, a multiple-assessment approach makes assessment an incremental process: students demonstrate their understanding frequently and accrue points over time, rather than all at once on a single exam. Dividing the assessment into smaller pieces can reduce anxiety and give students more practice taking their exams online. For instance, you might have a quiz at the end of each week. Each subsequent quiz can (and should) build on the previous one, allowing students to work toward more complex and rigorous applications of the content. This approach also minimizes the need to change the types of questions you have been asking to date, a change that can itself affect student performance (e.g., if you normally ask multiple-choice questions, you can continue to do so).
 
For the remainder of the semester, use the D2L Quizzes tool to build multiple smaller assessments, spreading the totality of your typical final exam over the month of April. This can be as simple as dividing a 100-question final exam into eight "synthesis activities" of 12 or 13 questions each, which students complete twice a week.
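To make the arithmetic concrete, here is a minimal Python sketch of splitting a 100-question bank into eight roughly even quizzes. The question IDs are placeholders for your own question bank; this is planning arithmetic done outside D2L, not a D2L feature.

```python
# Minimal sketch: split a 100-question final into eight roughly even
# "synthesis activities" (12 or 13 questions each). Question IDs are
# placeholders for your own question bank.
question_ids = list(range(1, 101))

num_quizzes = 8
base, extra = divmod(len(question_ids), num_quizzes)  # 12 each, 4 left over

quizzes, start = [], 0
for i in range(num_quizzes):
    size = base + (1 if i < extra else 0)  # the first 4 quizzes get 13 questions
    quizzes.append(question_ids[start:start + size])
    start += size

for i, quiz in enumerate(quizzes, 1):
    print(f"Synthesis activity {i}: {len(quiz)} questions")
```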

Benefits as noted from the literature: 

  • In low-stakes testing, no significant differences in keystroke behavior, rapid guessing, or aggregated scores were observed between proctored and unproctored conditions.
  • Frequent low-stakes testing is a more effective method for incentivizing participation and reading.
  • Knowledge retention is encouraged, as each subsequent assessment builds on the last.

Rios, J. A., & Liu, O. L. (2017). Online proctored versus unproctored low-stakes internet test administration: Is there differential test-taking behavior and performance? American Journal of Distance Education, 31(4), 226-241. https://www.tandfonline.com/doi/abs/10.1080/08923647.2017.1258628
 
Schrank, Z. (2016). An assessment of student perceptions and responses to frequent low-stakes testing in introductory sociology classes. Teaching Sociology, 44(2), 118-127. 
https://journals.sagepub.com/doi/abs/10.1177/0092055X15624745 
 
VanPatten, B., Trego, D., & Hopkins, W. P. (2015). In‐Class vs. Online Testing in University‐Level Language Courses: A Research Report. Foreign Language Annals, 48(4), 659-668. 
https://onlinelibrary.wiley.com/doi/abs/10.1111/flan.12160 

 

Open note exams 

Description: 
Open note assessments allow students to refer to the Internet and other materials while completing their assessments. By design, this disincentivizes academic dishonesty. Instructors often put time parameters around open note exams. These exams also lend themselves to collaborative work, in which multiple students work together to complete the assessment. With an open note strategy, you can keep your general exam schedule and point structure, but you may need to revise questions so they are less about factual recall and more about the application of concepts. For instance, you might give students a scenario or case study to which they must apply class concepts, as opposed to asking for specific values or definitions. If you plan to make such changes, communicate your intent and rationale to your students prior to the exam.
 
One effective open note testing technique is to use multiple-true/false questions as a means to measure understanding. These questions (called “multiple selection” questions in D2L) pose a scenario and prompt students to check all the boxes that apply. For example, students may be prompted to read a short case or lab report, then check all statements that are true about that reading. In this way a single question stem can assess multiple levels of complexity and/or comprehension. 
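As an illustration of why a single multiple-true/false stem can carry more information than one multiple-choice question, here is a minimal Python sketch of per-statement scoring. The statements and the partial-credit rule are hypothetical; how D2L actually awards credit for multiple-selection questions depends on your quiz settings.

```python
# Hypothetical per-statement scoring for one multiple-true/false item:
# a single stem with four statements, each independently true or false.
answer_key = {"A": True, "B": False, "C": True, "D": False}

def mtf_score(responses: dict, key: dict) -> float:
    """Return the fraction of statements the student judged correctly."""
    correct = sum(responses.get(stmt) == truth for stmt, truth in key.items())
    return correct / len(key)

# A student who misses only statement B still earns partial credit:
student = {"A": True, "B": True, "C": True, "D": False}
print(mtf_score(student, answer_key))  # 0.75
```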

Benefits as noted from the literature: 

    • Open-book exams and collaborative exams promote development of critical thinking skills. 
    • Open-book exams are more engaging and require higher-order thinking skills. 
    • Open-book exams simulate the working environment, where reference materials are available. 
    • Students prefer open-book exams and report decreased anxiety levels. 
    • Collaborative exams promote deeper cognitive processing and engagement. 

Johanns, B., Dinkens, A., & Moore, J. (2017). A systematic review comparing open-book and closed-book examinations: Evaluating effects on development of critical thinking skills. Nurse Education in Practice, 27, 89-94. https://www.sciencedirect.com/science/article/abs/pii/S1471595317305486

 

Couch, B. A., Hubbard, J. K., & Brassil, C. E. (2018). Multiple–true–false questions reveal the limits of the multiple–choice format for detecting students with incomplete understandings. BioScience, 68(6), 455-463. https://doi.org/10.1093/biosci/biy037 

 

Implementation for multiple lower-stakes and open note assessment strategies: 

  • Timed vs. untimed: On the whole, timed and untimed assessments yield similar scores. Students express greater anxiety over timed assessments, while they view untimed assessments as more amenable to dishonest behavior. 
    • NOTE: If you typically set a time limit on your face-to-face assessments, increase it by 20% to allow for the added demands the remote environment places on students. 
  • If the exam is meant to be taken synchronously, remember to stay within your class period. Adjust the length of the exam accordingly.
  • Reduced scope: Decreasing content covered in the exam may be necessary to create an exam of appropriate length and complexity, given the unique circumstances this semester. 
  • Question pools: Create a pool of questions and let D2L randomly populate each student's quiz. This helps reduce dishonest behavior (see the sketch after this list). 
    • For example, a 10-question quiz might draw on a pool of 18 total questions, 10 of which are randomly distributed to each student by D2L. 
  • Randomize answer order: In questions in which it makes sense, have D2L randomize the order in which the answer options appear. 
  • Individual question per page: This can reduce instances of students taking the assessment together. It is even more effective when question order is randomized and a question pool is used. 
  • Honor code attestation: Give students an opportunity to affirm their intent to be honest by making the first question of every assessment a 0-point question asking students to agree to an honor code. You can access the MSU Honor Code at https://www.deanofstudents.msu.edu/academic-integrity 
  • Live Zoom availability: In D2L Quizzes, set a time window during which the assessment will be available to students. 
    • Hold a live open office hours session in Zoom at some point during that window, so that students who want to can take the assessment while they have direct access to you and can ask questions if any arise. 
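To show what the question-pool and randomization settings accomplish together, here is a minimal Python sketch that mirrors the logic: each student receives a random 10-question draw from an 18-question pool. D2L does this for you when the quiz is configured; the code is only an illustration, and all names are placeholders.

```python
import random

# Mirror of the question-pool idea: each student gets a random draw of
# 10 questions from an 18-question pool. Pool contents are placeholders.
pool = [f"Q{n}" for n in range(1, 19)]

def build_quiz(student_id: str, draw: int = 10) -> list:
    rng = random.Random(student_id)   # seeding per student makes the draw reproducible
    return rng.sample(pool, draw)     # 10 of 18, no repeats, in random order

print(build_quiz("student-001"))
print(build_quiz("student-002"))      # a different draw and order
```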

Ultimately, our guiding principles for remote teaching are flexibility, generosity, and transparency.  Try to give students as much of an opportunity to demonstrate their knowledge as possible.  

  • Consider allowing multiple attempts on an assessment. 
  • When conditions allow, consider allowing multiple means of expression. 
  • Can students choose to demonstrate their knowledge from a menu of options? 
    • Multiple-choice test 
    • Written response 
    • Video presentation 
    • Oral exam (via Zoom) 
  • Consider giving students choices. Perhaps they can opt out of answering a question or two, or choose which of a series of prompts to respond to. Perhaps students can waive one test score to help accommodate their rapidly changing environments (a minimal sketch of drop-lowest grading follows this list). 
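If you do let students waive one score, the gradebook arithmetic is simple. Here is a minimal sketch of drop-lowest averaging; the scores are made up, and many gradebook tools can apply such a rule without any code.

```python
# Minimal sketch of "waive one test score": average a student's quizzes
# after dropping the single lowest score. Scores below are placeholders.
def average_drop_lowest(scores):
    if not scores:
        return 0.0
    if len(scores) == 1:
        return scores[0]
    kept = sorted(scores)[1:]         # discard the lowest score
    return sum(kept) / len(kept)

quiz_scores = [88, 62, 91, 75, 84]
print(average_drop_lowest(quiz_scores))  # averages the best four: 84.5
```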

Proctored assessments 

Description: 
Respondus Lockdown Browser and Respondus Monitor are tools for remote proctoring in D2L. More information is available at https://help.d2l.msu.edu/node/4686. Please consider whether your assessments can be designed without the need for Respondus. While Respondus may be helpful in limited circumstances (e.g., when assessments must be proctored for accreditation purposes), introducing a new technology may cause additional stress for both students and instructors, and academic integrity is still not assured.  
 
High-stakes exams (those that are a large percentage of a student’s grade) that use new technologies and approaches can decrease student performance and may not reflect students’ understanding of the material.  Please do not use an online proctored approach unless your assessment needs require its use.   

 

Benefits: 

  • Increases the barrier to academic dishonesty. 
  • Allows for use of existing exams (assuming they are translated into D2L's Quizzes tool). 

Implementation:

  • Any online proctored exam must be created and administered using D2L’s Quizzes tool. 
  • Prior to offering a graded proctored exam, we strongly recommend that you administer an ungraded (or very low-stakes) practice test using the proctoring tool. 
  • Clear communication with students about system and hardware requirements and timing considerations is required. 
  • MSU has gained temporary no-cost access to a pair of online proctoring tools provided by Respondus: https://help.d2l.msu.edu/node/4686 
  • Respondus Lockdown Browser requires that students download a web browser. When they click into your exam, the Lockdown Browser opens and prevents users from accessing anything else on their computer. 
  • Respondus Monitor requires use of Respondus Lockdown Browser and a webcam. Students are monitored via the webcam while they complete the exam in Lockdown Browser. 

Additional Resources: 

  • Remote Assessment Quick Guide 
  • Remote Assessment Video Conversation 
  • D2L Quizzes Tool Guide
  • Self-training on D2L Quizzes (login to MSU’s D2L is required; self-enroll into the training course) 

 
References: 
Alessio, H. M., Malay, N., Maurer, K., Bailer, A. J., & Rubin, B. (2017). Examining the effect of proctoring on online test scores. Online Learning, 21(1). 
 
Altınay, Z. (2017). Evaluating peer learning and assessment in online collaborative learning environments. Behaviour & Information Technology, 36(3), 312-320. DOI: 10.1080/0144929X.2016.1232752 

Couch, B. A., Hubbard, J. K., & Brassil, C. E. (2018). Multiple–true–false questions reveal the limits of the multiple–choice format for detecting students with incomplete understandings. BioScience, 68(6), 455-463. 
https://doi.org/10.1093/biosci/biy037 
 
Cramp, J., Medlin, J. F., Lake, P., & Sharp, C. (2019). Lessons learned from implementing remotely invigilated online exams. Journal of University Teaching & Learning Practice, 16(1). 
 
Guerrero-Roldán, A., & Noguera, I. (2018). A model for aligning assessment with competences and learning activities in online courses. The Internet and Higher Education, 38, 36-46. DOI: 10.1016/j.iheduc.2018.04.005 

Johanns, B., Dinkens, A., & Moore, J. (2017). A systematic review comparing open-book and closed-book examinations: Evaluating effects on development of critical thinking skills. Nurse Education in Practice, 27, 89-94. 
https://www.sciencedirect.com/science/article/abs/pii/S1471595317305486 
 
Rios, J. A., & Liu, O. L. (2017). Online proctored versus unproctored low-stakes internet test administration: Is there differential test-taking behavior and performance? American Journal of Distance Education, 31(4), 226-241. DOI: 10.1080/08923647.2017.1258628 

Schrank, Z. (2016). An assessment of student perceptions and responses to frequent low-stakes testing in introductory sociology classes. Teaching Sociology, 44(2), 118-127. 
https://journals.sagepub.com/doi/abs/10.1177/0092055X15624745 
 
Soffer, T., et al. (2017). Assessment of online academic courses via students' activities and perceptions. Studies in Educational Evaluation, 54, 83-93. DOI: 10.1016/j.stueduc.2016.10.001 

Tan, C. (2020). Beyond high-stakes exam: A neo-Confucian educational programme and its contemporary implications. Educational Philosophy and Theory, 52(2), 137-148. DOI: 10.1080/00131857.2019.1605901 

VanPatten, B., Trego, D., & Hopkins, W. P. (2015). In‐Class vs. Online Testing in University‐Level Language Courses: A Research Report. Foreign Language Annals, 48(4), 659-668. 
https://onlinelibrary.wiley.com/doi/abs/10.1111/flan.12160 

Attachments


  • Remote-....pdf
