Posted on: #iteachmsu
PEDAGOGICAL DESIGN
Information on Backward Design from SOIREE
What is Understanding by Design?
One approach to designing learning experiences is the Understanding by Design (UbD) framework (Wiggins & McTighe, 2005). This approach holds that we need to know the desired end result of a lesson or module before planning it. That is, we should know what mastery looks like and design learning experiences that enable students to achieve it. We can compare the process of UbD to curriculum mapping. When preparing a new course or revamping an existing one, you might begin with the standards, goals, or objectives that you want students to achieve. You then use those to design learning experiences that enable students to demonstrate the expected level of mastery. To reiterate, you begin with the goal or results in mind rather than the content itself.
What is Backward Design?
Backward design is a three-stage process that stems from the UbD framework. One key concept of backward design is alignment: what you identify in Stage 1 must be what Stages 2 and 3 address. The three stages of the backward design process are:
Identify desired results
Determine assessment evidence
Plan learning experiences and instruction
The video below provides an overview of the backward design experience for course development.
To further develop your understanding of the three stages, please explore the "Three Stages of Backward Design" section of Vanderbilt University's Understanding by Design webpage.
What does this mean for your teaching and online course development?
As you begin to think about moving your content from a face-to-face or hybrid experience to a fully online experience, we recommend looking back at the curriculum you've previously taught. And, by curriculum, we mean the large curricular goals, not the focused, lesson-by-lesson content. If you don't have an existing curriculum map for your course, do you have an outline of topics and course objectives listed in your syllabus? Now, look at it through the eyes of backward design. Can you still achieve all of the goals and objectives you intended students to perform at a mastery level? If not, how do the goals and objectives need to be reworked for this new context? That is just the start of Stage 1 of the process.
To support you as you think through the stages, please make a copy of this backward design template in Google Sheets. Take a few minutes to try to work through Stage 1 of the template through the lens of your entire course. Don't worry, we'll continue to build on your learning in the next mini-lesson!
Dig Deeper
If you would like to dig deeper into the UbD framework and backward design, numerous articles, books, and videos have been published to support your development. MSU Libraries provides electronic access to Wiggins and McTighe's (2005) Understanding by Design. If you prefer to explore via video, you can access Moving Forward with Understanding by Design through MSU Libraries as well.
SOIREE:
Design Lead: Sarah Wellman
Content Leads: Kate Sonka, Stephen Thomas, and Jeremy Van Hof
Content Authors: Jason Archer, Kevin Henley, David Howe, Summer Issawi, Leslie Johnson, Rashad Muhammad, Nick Noel, Candace Robertson, Scott Schopieray, Jessica Sender, Daniel Trego, Valeta Wensloff, and Sue Halick
Authored by:
SOIREE Team

Posted on: Catalyst Innovation Program
CIP: QR Code-Based Instruction in Engineering and Biology Laboratories
Project Title: Opening New Windows
Project Leads: Sunil Kishore Chakrapani and Jenifer Saldanha
College/Unit:
Department of Electrical and Computer Engineering, College of Engineering; Biological Science Program, College of Natural Science
Elevator Pitch: Quick response (QR) codes are machine-readable systems, similar to barcodes, that encode information within an image and are typically read by smartphone or tablet cameras. They provide an easy, fast, and concise way to connect to information via the internet. QR codes are used in stores as payment options and to display information, and they became especially commonplace during the COVID-19 pandemic. This project explores the use of QR codes to improve the delivery of instructional content in engineering and biology laboratories. The specific objective of this project is to implement QR code-based instruction strategies for laboratories, which will help make lab manuals more accessible and engaging. Standard laboratory manuals can be overwhelming due to clutter and information overload; students can find them unengaging, and accessing specific information can be challenging. This project will use sequentially programmed QR codes placed at different locations in the lab to “walk” students through experimental procedures. The QR codes will be linked to interactive web pages that display the course content in an engaging manner. When students want to find information about a specific instrument or experimental protocol, they can simply scan the associated QR code, which will lead them to the information along with a video of how it works. This project also employs the QR code concept to design interactive assessments, in which students answer questions by identifying and scanning the correct QR codes.

Team Bios: Dr. Sunil Kishore Chakrapani is an Assistant Professor in the Departments of Electrical and Computer Engineering and Mechanical Engineering in the College of Engineering. He teaches undergraduate and graduate courses in both departments on the topics of computer-aided manufacturing and mechanics. His research interests include the use of ultrasonics for nondestructive evaluation. Dr. Jenifer Saldanha is an Assistant Professor in the Biological Sciences Program in the College of Natural Science. She is the Curriculum Coordinator for introductory molecular and cellular biology labs. Her broad research interests include student success and retention in STEM disciplines, life science education research, and stress biology.

What are some of the successes? QR-coded links for videos and initial course content were embedded in HTML pages, and these work well. The use of sequential QR codes for activities was tested and runs smoothly. QR-coded assignments were developed for the lab and work as expected.

What are some of the challenges that you have experienced on this project? The QR-coded assignments work for the engineering lab but will require more time and effort to adapt for the biology lab. Web design requires a lot of effort, and using the tools from this project requires a smooth web interface; in the future, it would be great if project resources could support web design and development in the form of undergraduate hours. Supply chain issues delayed our ability to order tablets for the courses.

Image attribution: “QR Code for Object Tagging” by preetamrai is licensed under CC-BY-2.0.
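To make the sequential QR code idea above concrete, here is a minimal sketch of how step-by-step codes might be generated, assuming the open-source Python qrcode package; the base URL and station names are hypothetical placeholders rather than the project's actual site or tooling.

```python
# Minimal sketch: generate one QR code image per lab station, each encoding
# a step-specific URL, so that scanning the codes in order "walks" students
# through the procedure. Assumes the open-source "qrcode" package
# (pip install "qrcode[pil]"). The URL and station names are hypothetical.
import qrcode

BASE_URL = "https://example.edu/lab-manual"  # placeholder course site

stations = ["setup", "calibration", "measurement", "cleanup"]

for step, station in enumerate(stations, start=1):
    url = f"{BASE_URL}/step-{step}-{station}"
    img = qrcode.make(url)  # returns a PIL image
    img.save(f"qr_step_{step}_{station}.png")
```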
Authored by:
Sunil Kishore Chakrapani and Jenifer Saldanha

Posted on: Center for Teaching and Learning Innovation
Citing Generative AI (e.g., ChatGPT) in Higher Education Scholarship, Teaching, and Professional Writing
As generative AI tools like ChatGPT are increasingly used in academic settings—for teaching support, scholarly writing, and even faculty development—it's important to adopt citation practices that are centered on ethics and that ensure clarity, transparency, and academic integrity. Below are structured guidelines across major citation styles (APA, MLA, Chicago), tailored to the needs of university instructors, researchers, and students. A final section also offers examples of less formal disclosures appropriate for drafts, instructional materials, and academic development work.
Note that as large language models continue to develop, it will become increasingly important to cite the specific model or agent that was used to generate or modify content. It will also be important to regularly revisit citation guidelines, as these, too, are rapidly evolving to meet the demands of the ever-changing AI landscape.
APA (7th ed.) Style
Official Guidance: APA Style Blog: How to Cite ChatGPT
Reference Entry Template: Author. (Year). Title of AI model (Version date) [Description]. Source URL
Example Reference: OpenAI. (2023). ChatGPT (May 24 version) [Large language model]. https://chat.openai.com/
In-text citation: (OpenAI, 2023)
Higher Education Example: When asked to summarize Bandura’s concept of self-efficacy for use in an introductory education course, ChatGPT stated that “self-efficacy refers to an individual’s belief in their ability to execute behaviors necessary to produce specific performance attainments” (OpenAI, 2023).
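For instructors who need to produce many such disclosures (for example, across a set of course documents), the reference entry template above is straightforward to fill programmatically. The sketch below is illustrative only, not an official APA tool; the helper function and its field values are our own.

```python
# Illustrative helper (not an official APA tool): fills the APA-style
# reference template quoted above from structured fields.
def apa_ai_reference(author: str, year: int, model: str,
                     version: str, description: str, url: str) -> str:
    return f"{author}. ({year}). {model} ({version}) [{description}]. {url}"

print(apa_ai_reference("OpenAI", 2023, "ChatGPT", "May 24 version",
                       "Large language model", "https://chat.openai.com/"))
# -> OpenAI. (2023). ChatGPT (May 24 version) [Large language model].
#    https://chat.openai.com/
```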
MLA (9th ed.) Style
Official Guidance: MLA Style Center: Citing Generative AI
Works Cited Template: “[Prompt text]” prompt. ChatGPT, Version Date, OpenAI, Access Date, chat.openai.com.
Example Entry: “Summarize Bandura’s concept of self-efficacy” prompt. ChatGPT, 24 May version, OpenAI, 26 May 2023, chat.openai.com.
In-text citation: (“Summarize Bandura’s concept”)
Chicago Manual of Style (17th ed.)
Official Guidance: Chicago recommends citing AI-generated text via footnote only, not in the bibliography.
Footnote Example:
Text generated by ChatGPT, May 24, 2023, OpenAI, https://chat.openai.com.
Higher Education Example:
Used in a teaching statement to describe inclusive pedagogy practices. ChatGPT, response to “Give an example of inclusive teaching in STEM,” May 24, 2023, https://chat.openai.com.
Less Formal Disclosures for Transparency
In many instructional or professional academic contexts—such as teaching statements, reflective memos, informal reports, or early-stage drafts—it may be more appropriate to disclose use of generative AI tools in a narrative or parenthetical style rather than a formal citation format. Below are examples of how this can be done responsibly and transparently:
Examples of Less Formal Attribution:
“This draft was developed with the assistance of ChatGPT, which helped generate an outline based on course goals I provided. All final content was authored and reviewed by me.”
“In preparing this teaching philosophy, I used ChatGPT to help articulate distinctions between formative and summative assessment. The generated content was edited and integrated with my personal teaching experiences.”
“Some of the examples included in this workshop description were drafted with the help of ChatGPT (May 2023 version). I adapted the AI-generated responses to better align with our institutional context.”
“This syllabus language on academic integrity was initially drafted using a prompt in ChatGPT. The AI output was revised significantly to reflect course-specific values and policies.”
(Used in slide footnotes or speaking notes): “Initial ideas for this section were generated using ChatGPT and reviewed for accuracy and alignment with our campus policy.”
When to Use Informal Attribution:
Internal memos or reports
Course or assignment drafts
Teaching statements or portfolios
Slide decks or workshop materials
Informal educational publications (e.g., blog posts, teaching commons)
Best Practices for Academic Use in Higher Education
Transparency is key. Whether using a formal citation style or a narrative disclosure, always clearly communicate how AI tools were used.
Human review is essential. AI-generated content should always be edited for accuracy, nuance, inclusivity, and disciplinary alignment.
Tailor to context. Use formal citation when required (e.g., published research); use informal attribution for pedagogical artifacts or collaborative drafts.
Authored by:
Jeremy Van Hof
Posted on: The MSU Graduate Leadership Institute
NAVIGATING CONTEXT
College of Engineering Leadership Fellows
Leadership Fellows
2018-2019: David Hernandez Escobar & Olivia Chesniak
2019-2020: Hamid Karimi
2020-2022: Chelsie Boodoo
David Hernandez Escobar (2018-2019)
As one of the first College of Engineering Leadership Fellows, David worked with Assistant Dean for Graduate Student Services, Dr. Katy Colbry, to develop a needs assessment survey to identify the concerns of graduate students in the College of Engineering. The assessment collected over 100 responses, including open-answer personal reflections from graduate students and ideas on effective actions that could be taken to strengthen the graduate student community within the College of Engineering. David also focused on his own leadership development by collaborating with other Fellows as a strong, cross-disciplinary team that attended professional development sessions together and discussed program communication, building buy-in, and a variety of other topics.
Olivia Chesniak (2018-2019)
Olivia’s Fellowship focused on bringing together graduate student organizations centered on women in STEM, with the goal of sharing resources, networking, and providing peer mentorship. Olivia’s relationship-building efforts reinforced a cosponsored event among her connections in the College of Natural Science, the College of Engineering, and the College of Agriculture and Natural Resources. During the spring semester, Olivia worked with Lydia Weiss to develop, advertise, and facilitate discussion sessions for graduate students following the Academic Women's Forum, known as the gradAWF. The Academic Women's Forum has been a valuable and unique space for women in the university to connect with fellow faculty, staff, and administrators; however, the lack of a space for graduate students was reflected in unsteady attendance. Olivia worked across the university to create a space for graduate student women and ensure its promotion within her college.
Hamid Karimi (2019-2020)
Hamid worked with Assistant Dean Dr. Katy Colbry and engaged with stakeholders across the College to identify the need for professional development sessions regarding graduate students' knowledge of and preparedness for the job market following the completion of their degrees. Hamid also explored how to build DEI awareness within the lab setting and promote the benefits of diverse teams in STEM.
Chelsie Boodoo (2020-2022)
Chelsie organized the MSU SciComm Conveyance Conference, a virtual science communication conference that brought experts and students from various disciplines together to discuss scicomm practices and the role of science in today’s society. The conference offered twenty-two sessions, including workshops, lectures, networking opportunities, and social events. Sessions included Science, Equity, and Advocacy in the Nuclear Weapons Field; Science vs. Journalistic Writing; Podcast Kickstarter; Creating Effective Data Visualizations; and Building Trust in Scientists, among others. Through her program, students gathered valuable information on the science communication field, and experts had the opportunity to connect with peers and advise the next generation. Chelsie led a team through the difficulties of hosting a virtual conference and the process of learning skills related to event planning, public relations, marketing, grant writing, and innovative technology use. In her second year as a Fellow, Chelsie developed a Science Art Tool Kit to help graduate students in the sciences communicate about their research through the arts. Her goal was to equip scientists to use physical and digital art to convey data and information more effectively. Her tool kit offers a wide array of practical examples and resources.
Posted by:
Megumi Moore

Posted on: The MSU Graduate Leadership Institute

College of Engineering Leadership Fellows
Leadership Fellows
2018-2019: David Hernandez Escobar & Olivia...
2018-2019: David Hernandez Escobar & Olivia...
Posted by:
NAVIGATING CONTEXT
Thursday, Sep 29, 2022
Posted on: #iteachmsu
PEDAGOGICAL DESIGN
Spartan Studios Playkit: Appendix
Appendix
This is the ninth and final article in our iTeach.MSU playlist for the Spartan Studios Playkit. This appendix includes categories related to different elements of interdisciplinary, experiential teaching and course design, and includes what we hope are useful annotations.
Research from the Spartan Studios project
Heinrich, W. F., Louson, E., Blommel, C., & Green, A. R. (2021). Who Coaches the Coaches? The Development of a Coaching Model for Experiential Learning. Innovative Higher Education, 46, 357–375. https://doi.org/10.1007/s10755-020-09537-3
This paper is an overview of the Spartan Studios project and our results for students and faculty who ran prototype courses. It outlines the GORP model as well as the benefits and challenges of this approach to teaching and course planning.
Heinrich, W. F., Lauren, B., & Logan, S. (2020). Interdisciplinary teaching, learning and power in an experiential classroom. Submitted to Experiential Learning & Teaching in Higher Education.
This paper [under review] describes the first iteration of what became the Studios pattern at MSU and introduces the GORP framework.
Research from the James Madison University X-Labs, our colleagues in Virginia working in a similar course model
McCarthy, S., Barnes, A., Briggs, F., Giovanetti, K., Ludwig, P., Robinson, K., & Swayne, N. (Fall 2016). Undergraduate Social Entrepreneurship Education and Communication Design. SIGDOC 2015 Conference Proceedings. https://doi.org/10.1145/2987592.2987625
This report describes some communication strategies within the X-Labs’ drones course, how students documented and presented their works and how faculty plan to iterate the course.
Ludwig, P. M., Lewis, E. J., & Nagel, J. K. (2017). Student learning outcomes from a pilot medical innovations course with nursing, engineering and biology undergraduate students. International Journal of STEM Education, 4(33). https://doi.org/10.1186/s40594-017-0095-y
Describes an X-Labs multidisciplinary course on medical innovations and its assessment using qualitative content analysis about students’ attitudes and perceptions of different occupations.
McCarthy, S., Barnes, A., Holland, S. K., Lewis, E., Ludwig, P., & Swayne, N. (2018). Making It: Institutionalizing Collaborative Innovation in Public Higher Education. Proceedings of the 4th International Conference on Higher Education Advances (HEAd’18) 1,549–1,557. http://dx.doi.org/10.4995/HEAD18.2018.8560
A descriptive case study of the academic maker space in the JMU X-Labs, both describing specific courses and how X-Labs is administered. Offers this model as applicable elsewhere in higher ed.
Kishbaugh, A. (2018). An Exploratory Case Study of Cross-Disciplinary Project-Based (i.e. Maker) Curricula as a Catalyst for Entrepreneurship. International Symposium on Academic Makerspaces. https://jmuxlabs.org/app/uploads/2018/10/ISAM_2018_akish_v6.pdf
Describes cross-disciplinary courses as promoting entrepreneurship and innovation, by looking at startups coming from these courses. Offers a framework based on multidisciplinary problem-solving, Design Thinking approaches, and a lean startup methodology.
Selznick, B. S., Mayhew, M. J., & Swayne, N. (2018, November 20). Stop Blaming Innovation. (Correspondence from Chronicle readers). The Chronicle of Higher Education. https://www.chronicle.com/blogs/letters/stop-blaming-innovation/
A rebuttal to an argument that higher ed’s emphasis on innovation is misguided. Argues that innovation has positive student outcomes, is different from entrepreneurship, and that their interventions are effective.
Swayne, N., McCarthy, S., Selznick, B. S., & Fisher, K. A. (2019). Breaking up I/E: Consciously Uncoupling Innovation and Entrepreneurship to Improve Undergraduate Learning. Innovation and Entrepreneurship Theory and Practice. https://doi.org/10.24251/HICSS.2019.651
Describes the X-Labs as evidence for uncoupling entrepreneurship and innovation, and argues that conceptually they are separate; teaching innovation needs to precede teaching entrepreneurship.
Lewis, E. J., Ludwig, P. M., Nagel, J., & Ames, A. (2019). Student ethical reasoning confidence pre/post an innovative makerspace course: A survey of ethical reasoning. Nurse Education Today, 75, 75-79. https://doi.org/10.1016/j.nedt.2019.01.011
Describes gains to ethical reasoning after the Medical Innovations X-Labs course.
El-Tawab, S., Sprague, N. & Stewart, M. (2020). Teaching Innovation in Higher Education: A Multidisciplinary Class. In D. Schmidt-Crawford (Ed.), Proceedings of Society for Information Technology & Teacher Education International Conference (pp. 8-13). Association for the Advancement of Computing in Education (AACE). https://www.learntechlib.org/primary/p/215725/.
Describes a case of the X-Labs autonomous vehicles course, its support of students’ technical and soft skills, and its reproducibility.
McMurtrie, B. (2019, February 10). No Textbooks, No Lectures, and No Right Answers. Is This What Higher Education Needs? The Chronicle of Higher Education. https://www.chronicle.com/article/no-textbooks-no-lectures-and-no-right-answers-is-this-what-higher-education-needs/
Chronicle of Higher Education story about the JMU X-Labs course model.
Interdisciplinarity
Harden, R. M. (2000) The integration ladder: A tool for curriculum planning and evaluation. Medical Education, 34(7), 551–557. https://doi.org/10.1046/j.1365-2923.2000.00697.x
Offers a framework for thinking about different disciplinary connections, from disciplines being isolated/siloed from each other through transdisciplinarity.
Carmichael, T. & LaPierre, Y. (2014). Interdisciplinary Learning Works: The Results of a Comprehensive Assessment of Students and Student Learning Outcomes in an Integrative Learning Community. Issues in Interdisciplinary Studies, 32(3), 53–78. http://hdl.handle.net/10323/6647
Evidence-based assessment of student learning outcomes and academic growth metrics resulting from participation in a first-year integrative learning community. The authors outline the interdisciplinary learning goals and processes of the program and show that students who participated in the program consistently outperformed students outside the program on both short-term and long-term learning and academic growth benchmarks.
Ivanitskaya, L., Clark, D., Montgomery, G., & Primeau, R. (2002). Interdisciplinary Learning: Process and Outcomes. Innovative Higher Education, 27, 95–111. https://doi.org/10.1023/A:1021105309984
A review of expected benefits, learning outcomes, and processes (and potential roadblocks) of interdisciplinary education. Review applied to an interdisciplinary discussion based course. The authors claim that interdisciplinary learning can significantly contribute to intellectual maturity and cognitive development of students, and provide a framework of milestones that students may hit in the process of cognitive development through interdisciplinary ed.
Kezar, A. & Elrod, S. (2012). Facilitating Interdisciplinary Learning: Lessons from Project Kaleidoscope. Change: The Magazine of Higher Learning, 44(1), 16–25, https://doi.org/10.1080/00091383.2012.635999
This magazine article argues for the benefits of interdisciplinary education for both students and institutions, and provides ways to encourage interdisciplinary education on a systemic level. The authors give key strategies and tips for facilitating interdisciplinary learning and creating student experiences. The barriers to interdisciplinary learning/education are recognized (specifically institutional) and potential solutions are given as well.
Stentoft D. (2017) From saying to doing interdisciplinary learning: Is problem-based learning the answer? Active Learning in Higher Education, 18(1). 51–61. https://doi.org/10.1177/1469787417693510
The author argues that PBL is an effective strategy for facilitating interdisciplinary learning and vice versa. The author also acknowledges three barriers to effective interdisciplinary education (curriculum organization, student competencies for navigating interdisciplinary problems, and instructor competency) and proposes how to address these barriers.
Imafuku, R., Kataoka, R., Mayahara, M., Suzuki, H., & Saiki, T. (2014). Students’ Experiences in Interdisciplinary Problem-based Learning: A Discourse Analysis of Group Interaction. Interdisciplinary Journal of Problem-Based Learning, 8(2). https://doi.org/10.7771/1541-5015.1388
Kruck, S. E., & Teer, F. P. (2009). Interdisciplinary Student Teams Projects: A Case Study. Journal of Information Systems Education, 20(3), 325–330. https://aisel.aisnet.org/jise/vol20/iss3/7
Problem-Based Learning/Project-Based Learning
Ertmer, P. A., & Simons, K. D. (2006). Jumping the PBL Implementation Hurdle: Supporting the Efforts of K–12 Teachers. Interdisciplinary Journal of Problem-Based Learning, 1(1). https://doi.org/10.7771/1541-5015.1005
While focused on problem-based learning at the K-12 level, this paper covers topics relevant to higher education instruction, including implementation challenges, creating collaborative classroom culture, teachers adjusting to changing roles, scaffolding student learning, initiating student inquiry, maintaining student engagement, aiding conceptual integration, and promoting reflective thinking.
Fukuzawa, S., Boyd, C., & Cahn, J. (2017). Student motivation in response to problem-based learning. Collected Essays on Learning and Teaching, 10, 175-188. https://doi.org/10.22329/celt.v10i0.4748
Study of student perceptions of problem-based learning in an anthropology course found that students with more subject matter experience didn’t necessarily have greater intrinsic motivation about the course. Also includes strategies for transitioning students to PBL when they are used to traditional lectures.
Guo, P., Saab, N., Post, L. S., & Admiraal, W. (2020). A review of project-based learning in higher education: Student outcomes and measures. International Journal of Educational Research, 102, 101586. https://doi.org/10.1016/j.ijer.2020.101586
A review of the literature around project-based learning that includes 76 papers. Topics covered in the review include cognitive outcomes of PjBL (knowledge and cognitive strategies), affective outcomes (perceptions of the benefits and of the experience of PjBL), and behavioral outcomes (skills and engagement).
Lee, J. S., Blackwell, S., Drake, J., & Moran, K. A. (2014). Taking a leap of faith: redefining teaching and learning in higher education through project-based learning. Interdisciplinary Journal of Problem-Based Learning, 8(2). https://doi.org/10.7771/1541-5015.1426
Study of instructors who implemented PjBL, focused on their challenges and successes with community partnerships, student engagement, and assessment.
Moro, C., & McLean, M. (2017). Supporting students’ transition to university and problem-based learning. Medical Science Educator, 27(2), 353-361. https://doi.org/10.1007/s40670-017-0384-6
Fifteen strategies for scaffolding learning and supporting students in PBL programs, including using a phased approach to PBL, getting student feedback in the first few weeks of the program, and developing learners' reflective skills before self-assessment.
Pepper C. (2010). ‘There’s a lot of learning going on but NOT much teaching!’: Student perceptions of problem‐based learning in science. Higher Education Research & Development, 29(6), 693-707. https://doi.org/10.1080/07294360.2010.501073
Overview of student responses to problem-based learning at an Australian university. The author develops a continuum of how students react to problem-based learning that includes missing the point, working in groups, splitting the workload, completing the task, assessing the task, learning new information, sharing ideas, and being self-directed learners.
Perrault, E. K., & Albert, C. A. (2018). Utilizing project-based learning to increase sustainability attitudes among students. Applied Environmental Education & Communication, 17(2), 96-105. https://doi.org/10.1080/1533015X.2017.1366882
While PjBL is often concerned with knowledge gains, this study suggests that it can also shift student attitudes around the topic. For this study, students designed a communications campaign for an office of sustainability, and by the end of the course the students themselves were found to have more favorable views of sustainability.
Boston University Center for Teaching & Learning. (n.d.). Project-based learning: teaching guide. http://www.bu.edu/ctl/guides/project-based-learning/
Brief overview of what project-based learning is and four key steps to implementing it (defining the problem, generating ideas, prototyping solutions, and testing).
Strobel, J., & van Barneveld, A. (2009). When is PBL more effective? A meta-synthesis of meta-analyses comparing PBL to conventional classrooms. Interdisciplinary Journal of Problem-Based Learning, 3(1). https://doi.org/10.7771/1541-5015.1046
Combines the results of many meta-analyses around PBL over the last few decades to compare PBL to traditional classroom learning. The study finds that PBL results in more satisfaction among students and faculty, leads to better long-term retention of knowledge (traditional instruction was better for short-term retention), and produces better skill development.
Vogler, J. S., Thompson, P., Davis, D. W., Mayfield, B. E., Finley, P. M., & Yasseri, D. (2018). The hard work of soft skills: augmenting the project-based learning experience with interdisciplinary teamwork. Instructional Science, 46(3), 457-488. https://doi.org/10.1007/s11251-017-9438-9
Two-year study of an interdisciplinary problem-based learning task and student outcomes. The study used student feedback during each year to understand how students felt about the course. The instructors learned that students found expectations inconsistent and unclear and hence experienced anxiety about grades; the instructors took this to mean they needed to do a better job of articulating the learning outcomes and the end-of-course goal. They also learned that students often do not know how to collaborate across disciplines and decided to add scaffolding to the course.
Learning Objectives and Bloom’s Taxonomy
Armstrong, P. (2010). Bloom’s taxonomy. Vanderbilt University Center for Teaching. https://cft.vanderbilt.edu/guides-sub-pages/blooms-taxonomy/
Overview of the original 6 levels of Bloom’s Taxonomy and the 6 levels of the Revised Taxonomy: remember, understand, apply, analyze, evaluate, and create. Includes the four types of knowledge: factual, conceptual, procedural, and metacognitive.
Carnegie Mellon University Eberly Center. (n.d.). Design & Teach a Course. https://www.cmu.edu/teaching/designteach/design/learningobjectives.html
Strategies and tips for articulating and writing learning objectives including that learning objectives should be student-centered, break down the task and focus on specific cognitive processes, use action verbs, and be measurable.
Ferguson, C. (2002). Using the revised taxonomy to plan and deliver team-taught, integrated, thematic units. Theory Into Practice, 41(4), 238-243. https://doi.org/10.1207/s15430421tip4104_6
Example of an interdisciplinary high school course (English & social studies) where the two instructors used a taxonomy table to map their learning objectives onto the 6 levels of the Revised Taxonomy and 4 types of knowledge. Such a table may be useful for thinking about the learning objectives in your course.
Kidwell, L. A., Fisher, D. G., Braun, R. L., & Swanson, D. L. (2013). Developing learning objectives for accounting ethics using Bloom's taxonomy. Accounting Education, 22(1), 44-65. https://doi.org/10.1080/09639284.2012.698478
An example of using Bloom’s Taxonomy in accounting ethics to create learning objectives. For each larger course theme, the authors list examples how learning objectives could be created from each level of the Taxonomy.
Mayer, R. E. (2002). Rote versus meaningful learning. Theory Into Practice, 41(4), 226-232. https://doi.org/10.1207/s15430421tip4104_4
Includes 19 processes/action verbs, how they map to the 6 levels of the Revised Taxonomy, and simple examples of what a task for students might look like. Examples of included verbs are “compare,” “implement,” “organize,” “critique,” and “generate.”
Tyran, C. K. (2010). Designing the spreadsheet-based decision support systems course: an application of Bloom's taxonomy. Journal of Business Research, 63(2), 207-216. https://doi.org/10.1016/j.jbusres.2009.03.009
An example of using Bloom's taxonomy to map course activities to ensure students have the prerequisite knowledge to complete the assignments.
Reflection; Reflection as Assessment
Ash, S. L., & Clayton, P. H. (2009). Learning through critical reflection: A tutorial for service-learning students. Ash, Clayton & Moses.
Introduces characteristics of critical reflection and the DEAL model.
Eyler, J., Eyler, J., Giles, D. E., & Schmeide, A. (1996). A practitioner's guide to reflection in service-learning: Student voices & reflections. Vanderbilt University.
Argues that successful reflection is continuous, challenging, connected, and contextualized.
Earl, L. M. (2012). Assessment as learning: Using classroom assessment to maximize student learning (2nd edition). Corwin Press.
Especially chapter 10, Using Assessment for Reflection and Self-Regulation
Ash, S. L., Clayton, P. H., & Atkinson, M. P. (2005). Integrating reflection and assessment to capture and improve student learning. Michigan Journal of Community Service Learning, 11(2), 49-60. http://hdl.handle.net/2027/spo.3239521.0011.204
Sees coupled reflection and assessment as mutually informing and reinforcing for students in service learning. Describes tools to guide reflective writing processes. Focus on both individual student learning and reflection as part of program-wide approaches to reflection.
Assessment of Experiential Education & Interdisciplinary Learning
Conrad, D., & Hedin, D. (1981). National assessment of experiential education: Summary and implications. Journal of Experiential Education, 4(2), 6–20. https://doi.org/10.1177/105382598100400202
A summary of the research of the Evaluation of Experiential Learning project which sought to (1) assess the impact of experiential learning on secondary school students and (2) use that data to identify the elements of the EE programs that contributed the most to such student development.
Field, M., Lee, R., & Field, M. L. (1994). Assessing interdisciplinary learning. New Directions for Teaching and Learning, 1994(58), 69–84. https://doi.org/10.1002/tl.37219945806
In-depth discussion of assessment techniques for interdisciplinary study in higher education.
Heinrich, W. F., Habron, G. B., Johnson, H. L., & Goralnik, L. (2015). Critical thinking assessment across four sustainability-related experiential learning settings. Journal of Experiential Education, 38(4), 373–393. https://doi.org/10.1177/1053825915592890
Implications of critical thinking coupled with engaged citizenry within experiential education courses.
Mansilla, V. B., & Duraising, E. D. (2007). Target assessment of students’ interdisciplinary work: An empirically grounded framework proposed. The Journal of Higher Education, 78(2), 215-237. https://doi.org/10.1080/00221546.2007.11780874
Introduction of a framework for targeted assessment of interdisciplinary student work. Also a good review of relevant literature of assessment and interdisciplinary learning in higher education.
Yates, T., Wilson, J., & Purton, K. (2015). Surveying assessment in experiential learning: A single campus study. The Canadian Journal for the Scholarship of Teaching and Learning, 6(3). https://doi.org/10.5206/cjsotl-rcacea.2015.3.4
Exploration of experiential assessment within a Canadian University. Exploration intended for the use in identifying common methods and facilitating development of best assessment practices for higher education, specifically experiential higher education.
You, H. S., Marshall, J. A., & Delgado, C. (2019). Toward interdisciplinary learning: Development and validation of an assessment for interdisciplinary understanding of global carbon cycling. Research in Science Education. https://doi.org/10.1007/s11165-019-9836-x
Development and validation of an assessment which measured the understanding of the carbon cycle for high school and undergraduate students.
Building and Managing Student Teams & Team Dynamics
Burke, A. (2011) Group Work: How to Use Groups Effectively. Journal of Effective Teaching, 11(2), 87-95. https://uncw.edu/jet/articles/vol11_2/burke.pdf
Cano, J. L., Lidon, I., Rebollar, R., Roman, P., & Saenz, M. J. (2006). Student groups solving real-life projects. A case study of experiential learning. International Journal of Engineering Education, 22(6), 1252-1260. https://www.ijee.ie/articles/Vol22-6/16_IJEE1811.pdf
Fearon, C., McLaughlin, H., & Yoke Eng, T. (2012). Using student group work in higher education to emulate professional communities of practice. Education + Training, 54(2/3), 114–125. https://doi.org/10.1108/00400911211210233
Fellenz, M. R. (2006). Toward fairness in assessing student groupwork: A protocol for peer evaluation of individual contributions. Journal of Management Education, 30(4), 570–591. https://doi.org/10.1177/1052562906286713
Furman, R., Bender, K., & Rowan, D. (2014). An experiential approach to group work. Oxford University Press.
Smith, G. G., Sorensen, C., Gump, A., Heindel, A. J., Caris, M., & Martinez, C. D. (2011). Overcoming student resistance to group work: Online versus face-to-face. The Internet and Higher Education, 14(2), 121–128. https://doi.org/10.1016/j.iheduc.2010.09.005
Hassanien, A. (2006). Student Experience of Group Work and Group Assessment in Higher Education. Journal of Teaching in Travel & Tourism, 6(1), 17–39. https://doi.org/10.1300/j172v06n01_02
Kayes, A. B., Kayes, D. C., & Kolb, D. A. (2005). Experiential learning in teams. Simulation & Gaming, 36(3), 330–354. https://doi.org/10.1177/1046878105279012
Napier, N. P., & Johnson, R. D. (2007). Technical Projects: Understanding Teamwork Satisfaction in an Introductory IS Course. Journal of Information Systems Education, 18(1), 39-48. http://www.jise.org/volume18/n1/JISEv18n1p39.html
Winsett, C., Foster, C., Dearing, J., & Burch, G. (2016). The impact of group experiential learning on student engagement. Academy of Business Research Journal, 3, 7-17.
Online Experiential Education and Innovative Online Teaching & Course Structures
Bolan, C. M. (2003). Incorporating the experiential learning theory into the instructional design of online courses. Nurse Educator, 28(1), 10–14. https://doi.org/10.1097/00006223-200301000-00006
Provides insights on how to implement an experiential learning framework into an already developed online course.
Christian, D. D., McCarty, D. L., & Brown, C. L. (2020). Experiential education during the COVID-19 pandemic: A reflective process. Journal of Constructivist Psychology, 1–14. https://doi.org/10.1080/10720537.2020.1813666
Provides insight on how experiential learning can occur in an online format which acknowledges the new normal due to the COVID-19 pandemic. This includes case studies.
Sharoff, L. (2019). Creative and innovative online teaching strategies: Facilitation for active participation. The Journal of Educators Online, 16. https://doi.org/10.9743/jeo.2019.16.2.9
Piece on how to keep students thoughtfully engaged with online courses.
Diversity, Equity, and Inclusion
Bricklemyer, J. (2019, April 29). DEI online course supplemental checklist. https://codl.ku.edu/sites/codl.ku.edu/files/docs/DEI%20Online%20Course%20Supplemental%20Checklist%2029Apr19.pdf
A set of five principles around designing a course for inclusion, geared specifically toward online courses. Also includes links to other, more in-depth resources.
Canning, E. A., Muenks, K., Green, D. J., & Murphy, M. C. (2019). STEM faculty who believe ability is fixed have larger racial achievement gaps and inspire less student motivation in their classes. Science Advances, 5(2). https://doi.org/10.1126/sciadv.aau4734
Students in classes where the instructor believed that student potential was fixed earned lower grades than in courses where the instructor believed student potential changed over time. In addition, the difference in grades between students from underrepresented racial groups and white/Asian students was larger in the classes with instructors who thought mindset was fixed.
CAST (2018). Universal Design for Learning Guidelines version 2.2. http://udlguidelines.cast.org
A set of broad guidelines for ensuring that all learners can engage in learning, regardless of culture, language, or disability status. Each guideline includes practical examples of how it could be implemented in a course and the research supporting the guideline.
Dewsbury, B., & Brame, C. J. (2019). Inclusive teaching. CBE—Life Sciences Education, 18(2). https://doi.org/10.1187/cbe.19-01-0021
Guide that covers why instructors need to develop self-awareness and empathy for students and consider classroom climate before pedagogical choices for inclusivity. Also includes an interactive webpage about inclusive teaching with literature citations and a checklist for instructors.
MyPronouns.org Resources on Personal Pronouns. (n.d.). https://www.mypronouns.org/
A guide about personal pronouns and best practices for using them: include your pronouns when introducing yourself, avoid using “preferred” in front of pronouns, and use “go by” instead of “uses” when introducing pronouns, e.g., “My name is Sparty and I go by he/him pronouns.”
University of Michigan Center for Research on Learning and Teaching. (n.d.). Inclusive Strategies Reflection. https://docs.google.com/document/d/1UK3HFQv-3qMDNjvt0fFPbts38ApOL7ghpPE0iSYJ1Z8/edit?usp=sharing
A self-reflection tool for instructors about their teaching practices measured along five dimensions: critical engagement of difference, academic belonging, transparency, structured interactions, and flexibility. Each dimension includes ideas for instructors to add to their own courses.
Poorvu Center for Teaching and Learning. (n.d.). Inclusive Teaching Strategies. https://poorvucenter.yale.edu/InclusiveTeachingStrategies
Includes nine recommendations for creating a more inclusive classroom, including incorporating diversity into the curriculum, examining implicit biases, adding a diversity statement to the syllabus, and soliciting student feedback.
Columbia University Center for Teaching and Learning. (n.d.). Guide for Inclusive Teaching at Columbia. https://ctl.columbia.edu/resources-and-technology/resources/inclusive-teaching-guide/
Photo by LubosHouska from Pixabay
Research from the Spartan Studios project
Heinrich, W. F., Louson, E., Blommel, C., & Green, A. R. (2021). Who Coaches the Coaches? The Development of a Coaching Model for Experiential Learning. Innovative Higher Education, 46, 357–375. https://doi.org/10.1007/s10755-020-09537-3
This paper is an overview of the Spartan Studios project and our results for students and faculty who ran prototype courses. It outlines the GORP model as well as the benefits and challenges of this approach to teaching and course planning.
Heinrich, W. F., Lauren, B., & Logan, S. (2020). Interdisciplinary teaching, learning and power in an experiential classroom. Submitted to Experiential Learning & Teaching in Higher Education.
This paper [under review] describes the first iteration of what became the Studios pattern at MSU and introduces the GORP framework.
Research from the James Madison University X-Labs, our colleagues in Virginia working in a similar course model
McCarthy, S., Barnes, A., Briggs, F., Giovanetti, K., Ludwig, P., Robinson, K., & Swayne, N. (2016). Undergraduate Social Entrepreneurship Education and Communication Design. SIGDOC 2016 Conference Proceedings. https://doi.org/10.1145/2987592.2987625
This report describes some communication strategies within the X-Labs’ drones course, how students documented and presented their work, and how faculty plan to iterate the course.
Ludwig, P. M., Lewis, E. J., Nagel, J. K. (2017). Student learning outcomes from a pilot medical innovations course with nursing, engineering and biology undergraduate students. International Journal of STEM Education, 4(33) https://doi.org/10.1186/s40594-017-0095-y
Describes an X-Labs multidisciplinary course on medical innovations and its assessment using qualitative content analysis about students’ attitudes and perceptions of different occupations.
McCarthy, S., Barnes, A., Holland, S. K., Lewis, E., Ludwig, P., & Swayne, N. (2018). Making It: Institutionalizing Collaborative Innovation in Public Higher Education. Proceedings of the 4th International Conference on Higher Education Advances (HEAd’18), 1549–1557. http://dx.doi.org/10.4995/HEAD18.2018.8560
A descriptive case study of the academic maker space in the JMU X-Labs, both describing specific courses and how X-Labs is administered. Offers this model as applicable elsewhere in higher ed.
Kishbaugh, A. (2018). An Exploratory Case Study of Cross-Disciplinary Project-Based (i.e. Maker) Curricula as a Catalyst for Entrepreneurship. International Symposium on Academic Makerspaces. https://jmuxlabs.org/app/uploads/2018/10/ISAM_2018_akish_v6.pdf
Describes cross-disciplinary courses as promoting entrepreneurship and innovation, by looking at startups coming from these courses. Offers a framework based on multidisciplinary problem-solving, Design Thinking approaches, and a lean startup methodology.
Selznick, B. S., Mayhew, M. J., & Swayne, N. (2018, November 20). Stop Blaming Innovation. (Correspondence from Chronicle readers). The Chronicle of Higher Education. https://www.chronicle.com/blogs/letters/stop-blaming-innovation/
A rebuttal to an argument that higher ed’s emphasis on innovation is misguided. Argues that innovation has positive student outcomes, is different from entrepreneurship, and that their interventions are effective.
Swayne, N., McCarthy, S., Selznick, B. S., & Fisher, K. A. (2019). Breaking up I/E: Consciously Uncoupling Innovation and Entrepreneurship to Improve Undergraduate Learning. Innovation and Entrepreneurship Theory and Practice. https://doi.org/10.24251/HICSS.2019.651
Describes the X-Labs as evidence for uncoupling entrepreneurship and innovation, and argues that conceptually they are separate; teaching innovation needs to precede teaching entrepreneurship.
Lewis, E. J., Ludwig, P. M., Nagel, J., & Ames, A. (2019). Student ethical reasoning confidence pre/post an innovative makerspace course: A survey of ethical reasoning. Nurse Education Today, 75, 75-79. https://doi.org/10.1016/j.nedt.2019.01.011
Describes gains to ethical reasoning after the Medical Innovations X-Labs course.
El-Tawab, S., Sprague, N. & Stewart, M. (2020). Teaching Innovation in Higher Education: A Multidisciplinary Class. In D. Schmidt-Crawford (Ed.), Proceedings of Society for Information Technology & Teacher Education International Conference (pp. 8-13). Association for the Advancement of Computing in Education (AACE). https://www.learntechlib.org/primary/p/215725/.
Describes a case of the X-Labs autonomous vehicles course, its support of students’ technical and soft skills, and its reproducibility.
McMurtrie, B. (2019, February 10). No Textbooks, No Lectures, and No Right Answers. Is This What Higher Education Needs? The Chronicle of Higher Education. https://www.chronicle.com/article/no-textbooks-no-lectures-and-no-right-answers-is-this-what-higher-education-needs/
Chronicle of Higher Education story about the JMU X-Labs course model.
Interdisciplinarity
Harden, R. M. (2000) The integration ladder: A tool for curriculum planning and evaluation. Medical Education, 34(7), 551–557. https://doi.org/10.1046/j.1365-2923.2000.00697.x
Offers a framework for thinking about different disciplinary connections, from disciplines being isolated/siloed from each other through transdisciplinarity.
Carmichael, T., & LaPierre, Y. (2014). Interdisciplinary Learning Works: The Results of a Comprehensive Assessment of Students and Student Learning Outcomes in an Integrative Learning Community. Issues in Interdisciplinary Studies, 32(3), 53–78. http://hdl.handle.net/10323/6647
Evidence-based assessment of student learning outcomes and academic growth metrics resulting from participation in a first-year integrative learning community. The authors outline the interdisciplinary learning goals and processes of the program and show that students who participated consistently outperformed students outside the program on both short-term and long-term learning and academic growth benchmarks.
Ivanitskaya, L., Clark, D., Montgomery, G., & Primeau, R. (2002). Interdisciplinary Learning: Process and Outcomes. Innovative Higher Education, 27, 95–111. https://doi.org/10.1023/A:1021105309984
A review of expected benefits, learning outcomes, and processes (and potential roadblocks) of interdisciplinary education, applied to an interdisciplinary discussion-based course. The authors claim that interdisciplinary learning can significantly contribute to students’ intellectual maturity and cognitive development, and provide a framework of milestones that students may hit in the process of cognitive development through interdisciplinary education.
Kezar, A. & Elrod, S. (2012). Facilitating Interdisciplinary Learning: Lessons from Project Kaleidoscope. Change: The Magazine of Higher Learning, 44(1), 16–25, https://doi.org/10.1080/00091383.2012.635999
This magazine article argues for the benefits of interdisciplinary education for both students and institutions, and provides ways to encourage interdisciplinary education on a systemic level. The authors give key strategies and tips for facilitating interdisciplinary learning and creating student experiences. The barriers to interdisciplinary learning/education are recognized (specifically institutional) and potential solutions are given as well.
Stentoft D. (2017) From saying to doing interdisciplinary learning: Is problem-based learning the answer? Active Learning in Higher Education, 18(1). 51–61. https://doi.org/10.1177/1469787417693510
The author argues that PBL is an effective strategy to facilitate interdisciplinary learning and vice versa, acknowledges three barriers to effective interdisciplinary education (curriculum organization, student competencies to navigate interdisciplinary problems, and instructor competency), and proposes how to address these barriers.
Imafuku, R., Kataoka, R., Mayahara, M., Suzuki, H., & Saiki, T. (2014). Students’ Experiences in Interdisciplinary Problem-based Learning: A Discourse Analysis of Group Interaction. Interdisciplinary Journal of Problem-Based Learning, 8(2). https://doi.org/10.7771/1541-5015.1388
Kruck, S. E., & Teer, F. P. (2009). Interdisciplinary Student Teams Projects: A Case Study. Journal of Information Systems Education, 20(3), 325–330. https://aisel.aisnet.org/jise/vol20/iss3/7
Problem-Based Learning/Project-Based Learning
Ertmer, P. A., & Simons, K. D. (2006). Jumping the PBL Implementation Hurdle: Supporting the Efforts of K–12 Teachers. Interdisciplinary Journal of Problem-Based Learning, 1(1). https://doi.org/10.7771/1541-5015.1005
While focused on problem-based learning at the K-12 level, this paper covers topics relevant to higher education instruction, including implementation challenges, creating a collaborative classroom culture, teachers adjusting to changing roles, scaffolding student learning, initiating student inquiry, maintaining student engagement, aiding conceptual integration, and promoting reflective thinking.
Fukuzawa, S., Boyd, C., & Cahn, J. (2017). Student motivation in response to problem-based learning. Collected Essays on Learning and Teaching, 10, 175-188. https://doi.org/10.22329/celt.v10i0.4748
Study of student perceptions of problem-based learning in an anthropology course found that students with more subject matter experience didn’t necessarily have greater intrinsic motivation about the course. Also includes strategies for transitioning students to PBL when they are used to traditional lectures.
Guo, P., Saab, N., Post, L. S., & Admiraal, W. (2020). A review of project-based learning in higher education: Student outcomes and measures. International Journal of Educational Research, 102, 101586. https://doi.org/10.1016/j.ijer.2020.101586
A review of the literature on project-based learning covering 76 papers. Topics include cognitive outcomes of PjBL (knowledge and cognitive strategies), affective outcomes (perceptions of the benefits and experience of PjBL), and behavioral outcomes (skills and engagement).
Lee, J. S., Blackwell, S., Drake, J., & Moran, K. A. (2014). Taking a leap of faith: redefining teaching and learning in higher education through project-based learning. Interdisciplinary Journal of Problem-Based Learning, 8(2). https://doi.org/10.7771/1541-5015.1426
Study of instructors who implemented PjBL that focused around their challenges and successes with community partnerships, student engagement, and assessment
Moro, C., & McLean, M. (2017). Supporting students’ transition to university and problem-based learning. Medical Science Educator, 27(2), 353-361. https://doi.org/10.1007/s40670-017-0384-6
Fifteen strategies for scaffolding learning and supporting students in PBL programs, including using a phased approach to PBL, getting student feedback in the first few weeks of the program, and developing learners’ reflective skills before self-assessment.
Pepper C. (2010). ‘There’s a lot of learning going on but NOT much teaching!’: Student perceptions of problem‐based learning in science. Higher Education Research & Development, 29(6), 693-707. https://doi.org/10.1080/07294360.2010.501073
Overview of student responses to problem-based learning at an Australian university. Develops a continuum of how students react to problem-based learning: missing the point, working in groups, splitting the workload, completing the task, assessing the task, learning new information, sharing ideas, and becoming self-directed learners.
Perrault, E. K., & Albert, C. A. (2018). Utilizing project-based learning to increase sustainability attitudes among students. Applied Environmental Education & Communication, 17(2), 96-105. https://doi.org/10.1080/1533015X.2017.1366882
While PjBL is often focused on knowledge gains, this study suggests it can also shift student attitudes about a topic. Students designed a communications campaign for an office of sustainability and were themselves found to hold more favorable views of sustainability by the end of the course.
Boston University Center for Teaching & Learning. (n.d.). Project-based learning: teaching guide. http://www.bu.edu/ctl/guides/project-based-learning/
Brief overview of what project-based learning is and four key steps to implementing it (defining the problem, generating ideas, prototyping solutions, and testing).
Strobel, J., & van Barneveld, A. (2009). When is PBL more effective? A meta-synthesis of meta-analyses comparing PBL to conventional classrooms. Interdisciplinary Journal of Problem-Based Learning, 3(1). https://doi.org/10.7771/1541-5015.1046
Combines the results of many meta-analyses of PBL from the last few decades to compare PBL to traditional classroom learning. The study finds that PBL results in more satisfaction among students and faculty, leads to better long-term retention of knowledge (traditional instruction was better for short-term retention), and supports better skill development.
Vogler, J. S., Thompson, P., Davis, D. W., Mayfield, B. E., Finley, P. M., & Yasseri, D. (2018). The hard work of soft skills: augmenting the project-based learning experience with interdisciplinary teamwork. Instructional Science, 46(3), 457-488. https://doi.org/10.1007/s11251-017-9438-9
Two-year study of an interdisciplinary problem-based learning task and student outcomes. The study used student feedback during each year to understand how students were feeling about the course. The instructors learned that students felt expectations were inconsistent and unclear and hence experienced anxiety about grades; the instructors took this to mean they needed to do a better job of articulating the learning outcomes and the end-of-course goal. They also learned that students often do not know how to collaborate across disciplines and decided to add scaffolding to the course.
Learning Objectives and Bloom’s Taxonomy
Armstrong, P. (2010). Bloom’s taxonomy. Vanderbilt University Center for Teaching. https://cft.vanderbilt.edu/guides-sub-pages/blooms-taxonomy/
Overview of the original 6 levels of Bloom’s Taxonomy and the 6 levels of the Revised Taxonomy: remember, understand, apply, analyze, evaluate, and create. Includes the four types of knowledge: factual, conceptual, procedural, and metacognitive.
Carnegie Mellon University Eberly Center. (n.d.). Design & Teach a Course. https://www.cmu.edu/teaching/designteach/design/learningobjectives.html
Strategies and tips for articulating and writing learning objectives including that learning objectives should be student-centered, break down the task and focus on specific cognitive processes, use action verbs, and be measurable.
Ferguson, C. (2002). Using the revised taxonomy to plan and deliver team-taught, integrated, thematic units. Theory Into Practice, 41(4), 238-243. https://doi.org/10.1207/s15430421tip4104_6
Example of an interdisciplinary high school course (English & social studies) where the two instructors used a taxonomy table to map their learning objectives onto the 6 levels of the Revised Taxonomy and 4 types of knowledge. Such a table may be useful for thinking about the learning objectives in your course.
Kidwell, L. A., Fisher, D. G., Braun, R. L., & Swanson, D. L. (2013). Developing learning objectives for accounting ethics using Bloom's taxonomy. Accounting Education, 22(1), 44-65. https://doi.org/10.1080/09639284.2012.698478
An example of using Bloom’s Taxonomy in accounting ethics to create learning objectives. For each larger course theme, the authors list examples of how learning objectives could be created from each level of the Taxonomy.
Mayer, R. E. (2002). Rote versus meaningful learning. Theory Into Practice, 41(4), 226-232. https://doi.org/10.1207/s15430421tip4104_4
Includes 19 processes/action verbs, how they map to the 6 levels of the Revised Taxonomy, and simple examples of what a task for students to do might look like. Examples of included verbs are “compare,” “implement,” “organize,” “critique,” and “generate.”
Tyran, C. K. (2010). Designing the spreadsheet-based decision support systems course: an application of Bloom's taxonomy. Journal of Business Research, 63(2), 207-216. https://doi.org/10.1016/j.jbusres.2009.03.009
An example of using Bloom’s taxonomy to map course activities to ensure students have the prerequisite knowledge to complete the assignments.
Reflection; Reflection as Assessment
Ash, S. L., & Clayton, P. H. (2009). Learning through critical reflection: A tutorial for service-learning students. Ash, Clayton & Moses.
Introduces characteristics of critical reflection and the DEAL model.
Eyler, J., Giles, D. E., & Schmeide, A. (1996). A practitioner's guide to reflection in service-learning: Student voices & reflections. Vanderbilt University.
Argues that successful reflection is continuous, challenging, connected, and contextualized.
Earl, L. M. (2012). Assessment as learning: Using classroom assessment to maximize student learning (2nd edition). Corwin Press.
See especially Chapter 10, “Using Assessment for Reflection and Self-Regulation.”
Ash, S. L., Clayton, P. H., & Atkinson, M. P. (2005). Integrating reflection and assessment to capture and improve student learning. Michigan Journal of Community Service Learning, 11(2), 49-60. http://hdl.handle.net/2027/spo.3239521.0011.204
Sees coupled reflection and assessment as mutually informing and reinforcing for students in service learning. Describes tools to guide reflective writing processes. Focus on both individual student learning and reflection as part of program-wide approaches to reflection.
Assessment of Experiential Education & Interdisciplinary Learning
Conrad, D., & Hedin, D. (1981). National assessment of experiential education: Summary and implications. Journal of Experiential Education, 4(2), 6–20. https://doi.org/10.1177/105382598100400202
A summary of the research of the Evaluation of Experiential Learning project which sought to (1) assess the impact of experiential learning on secondary school students and (2) use that data to identify the elements of the EE programs that contributed the most to such student development.
Field, M., Lee, R., & Field, M. L. (1994). Assessing interdisciplinary learning. New Directions for Teaching and Learning, 1994(58), 69–84. https://doi.org/10.1002/tl.37219945806
In-depth discussion of assessment techniques for interdisciplinary study in higher education.
Heinrich, W. F., Habron, G. B., Johnson, H. L., & Goralnik, L. (2015). Critical thinking assessment across four sustainability-related experiential learning settings. Journal of Experiential Education, 38(4), 373–393. https://doi.org/10.1177/1053825915592890
Implications of critical thinking coupled with engaged citizenry within experiential education courses.
Mansilla, V. B., & Duraising, E. D. (2007). Targeted assessment of students’ interdisciplinary work: An empirically grounded framework proposed. The Journal of Higher Education, 78(2), 215-237. https://doi.org/10.1080/00221546.2007.11780874
Introduction of a framework for targeted assessment of interdisciplinary student work. Also a good review of relevant literature of assessment and interdisciplinary learning in higher education.
Yates, T., Wilson, J., & Purton, K. (2015). Surveying assessment in experiential learning: A single campus study. The Canadian Journal for the Scholarship of Teaching and Learning, 6(3). https://doi.org/10.5206/cjsotl-rcacea.2015.3.4
Exploration of experiential assessment within a Canadian university, intended to identify common methods and facilitate the development of best assessment practices for higher education, specifically experiential higher education.
You, H. S., Marshall, J. A., & Delgado, C. (2019). Toward interdisciplinary learning: Development and validation of an assessment for interdisciplinary understanding of global carbon cycling. Research in Science Education. https://doi.org/10.1007/s11165-019-9836-x
Development and validation of an assessment measuring high school and undergraduate students’ understanding of the carbon cycle.
Building and Managing Student Teams & Team Dynamics
Burke, A. (2011) Group Work: How to Use Groups Effectively. Journal of Effective Teaching, 11(2), 87-95. https://uncw.edu/jet/articles/vol11_2/burke.pdf
Cano, J. L., Lidon, I., Rebollar, R., Roman, P., & Saenz, M. J. (2006). Student groups solving real-life projects. A case study of experiential learning. International Journal of Engineering Education, 22(6), 1252-1260. https://www.ijee.ie/articles/Vol22-6/16_IJEE1811.pdf
Fearon, C., McLaughlin, H., & Yoke Eng, T. (2012). Using student group work in higher education to emulate professional communities of practice. Education + Training, 54(2/3), 114–125. https://doi.org/10.1108/00400911211210233
Fellenz, M. R. (2006). Toward fairness in assessing student groupwork: A protocol for peer evaluation of individual contributions. Journal of Management Education, 30(4), 570–591. https://doi.org/10.1177/1052562906286713
Furman, R., Bender, K., & Rowan, D. (2014). An experiential approach to group work. Oxford University Press.
Smith, G. G., Sorensen, C., Gump, A., Heindel, A. J., Caris, M., & Martinez, C. D. (2011). Overcoming student resistance to group work: Online versus face-to-face. The Internet and Higher Education, 14(2), 121–128. https://doi.org/10.1016/j.iheduc.2010.09.005
Hassanien, A. (2006). Student Experience of Group Work and Group Assessment in Higher Education. Journal of Teaching in Travel & Tourism, 6(1), 17–39. https://doi.org/10.1300/j172v06n01_02
Kayes, A. B., Kayes, D. C., & Kolb, D. A. (2005). Experiential learning in teams. Simulation & Gaming, 36(3), 330–354. https://doi.org/10.1177/1046878105279012
Napier, N. P. & Johnson, R. D. (2007). Technical Projects: Understanding Teamwork Satisfaction In an Introductory IS Course. Journal of Information Systems Education. 18(1), 39-48. http://www.jise.org/volume18/n1/JISEv18n1p39.html
Winsett, C., Foster, C., Dearing, J., & Burch, G. (2016). The impact of group experiential learning on student engagement. Academy of Business Research Journal. 3, 7-17.
Authored by:
Ellie Louson

Posted on: #iteachmsu
PEDAGOGICAL DESIGN
Course Content: What makes the cut
There are a variety of considerations when it comes to course content. Now, if you’re close to the start of the semester, it is likely that you have already chosen (and submitted to the Registrar’s Office) your textbook and/or required materials for student purchase. Please consider the following when selecting your supplemental course content (additional materials, case studies, scenarios, etc.)... and for your primary texts next term.
Diversifying voice - who is represented?
“Does your syllabus demonstrate to students that everyone has a place in your field of study? … Pedagogically, we might find it challenging to create a sense of belonging in a course when some students cannot imagine themselves as part of the community of scholarship and practice” (Marcella Addy et al., 2021, p. 52). Wow, that statement is really powerful, especially considering some recent scholarship. Schucan Bird and Pitman (2020) found, after an analysis of reading lists, that the lists did not represent the diverse local student body but came closer to representing the demographic profile of academic staff (dominated by white, male, and Eurocentric authors). Despite challenges across disciplines and settings, educators should make every effort to center students in their course design and make course materials a descriptive representation of the student body itself (Schucan Bird & Pitman, 2020). This shift can include showcasing the contributions of marginalized groups (Blackburn, 2017) with greater representation of the perspectives, histories, and approaches of scholars (Le Grange, 2016), along with adopting efforts to decolonize teaching and learning (Phillips & Archer-Lean, 2018).
Looking for ways to get started? Colleagues at Tufts University Libraries (according to this Inside Higher Ed article) have noted that diversifying your course materials to include content about and by marginalized scholars (groups whose characteristics result in the systematic denial of equal rights and opportunities within a community or society including but not limited to race, socioeconomic status, gender identity, sexual orientation) helps to “foster an environment that includes knowledge that has been systematically excluded from academia.” You might…
Consider diverse authorship of readings (ethnicity, gender, geographic location); a simple audit sketch follows this list
Invite guest speakers who bring different perspectives
Use diverse audio and visual materials, such as films, interviews, and TED talks
Incorporate readings that challenge standard approaches
Use primary research with authorship that reflects local collaborators
Offer multiple perspectives in assigned readings and let students choose what to read or discuss at times.
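If your reading list lives in a spreadsheet, a lightweight script can help you take stock of whose voices appear. Below is a minimal sketch, assuming a hypothetical readings.csv with author, gender, and region columns (the file and column names are placeholders, not part of any cited resource); demographic labels should come from authors’ own public self-descriptions where available.
```python
# Minimal sketch for auditing representation in a reading list.
# Assumes a hypothetical readings.csv with columns: title, author, gender, region.
import csv
from collections import Counter

def audit_reading_list(path="readings.csv"):
    gender_counts, region_counts = Counter(), Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            gender_counts[row["gender"].strip().lower()] += 1
            region_counts[row["region"].strip().lower()] += 1
    total = sum(gender_counts.values()) or 1
    for label, counts in (("Gender", gender_counts), ("Region", region_counts)):
        print(f"{label} representation across {total} readings:")
        for value, n in counts.most_common():
            print(f"  {value}: {n} ({n / total:.0%})")

audit_reading_list()
```
A tally like this won’t tell you whether a list is “diverse enough,” but it can surface imbalances worth reflecting on before the next revision of your syllabus.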
Faculty members “can identify resources that highlight historically underrepresented researchers and activists in our fields,” suggests Bridget Trogden (presently serving as Dean of Undergraduate Education at American University). “We can include statements and topics in syllabi to decode our courses, structures and expectations. We can work to decolonize the power dynamics of our classrooms so what students already know and experience is also seen as a valuable contribution to the learning environment.” Improving diversity and inclusion of voices in educational materials isn’t necessarily difficult; educators just need to be intentional. Fuentes et al. (2021) go beyond centering authors of marginalized backgrounds and recommend that educators transparently acknowledge their intentional material selections. The example they provide in their article Rethinking the Course Syllabus: Considerations for Promoting Equity, Diversity, and Inclusion is, "The following text/articles for the course have been chosen in efforts to highlight the important work of historically underrepresented and marginalized scholars in the field" (Fuentes et al., 2021, p. 75).
“The proof is in the data: children are more likely to have a more productive learning experience and thrive in the classroom, throughout the school and in their communities when they see themselves represented in curriculum and library materials,” said Lessa Kanani'opua Pelayo-Lozada, President of the American Library Association. If the data support that diversified readings may boost a student’s development and well-being, WHILE ALSO increasing a sense of belonging and breaking down barriers to collegiate success… what reasons do we have not to reimagine our course materials?
Accessibility of digital content
The experts at MSU IT who manage the Digital Accessibility page recommend that educators ask the following questions before adopting digital content (adapted with permission from UC-Boulder’s Digital Accessibility Program):
Ask for Publisher Information: Contact the publisher to ask them for details about the accessibility of your particular textbook and/or digital content. This should include all known accessibility issues, any workarounds that the student can use, a named point of contact, and any guidance on how to ensure any content you create within the platform is accessible.
Review your Assessments: If you use digital online quizzes, ask the publisher for a list of quiz question types that are accessible. Review your own quiz content to ensure that none of your questions rely on drag-and-drop actions, images without alt text, or other inaccessible mechanisms (a quick automated first pass is sketched after this list).
Consult with Digital Accessibility Specialists: Contact your local Accessibility Policy Liaison for support and reach out to the MSU IT Digital Experience (DigitalX) team for help evaluating your digital content at webaccess@msu.edu or call the IT Help Desk at 517-432-6200.
Notify your Students: If the digital content (including texts, assignments, tests, or online homework systems) used in your course are not fully accessible, please notify your students in your syllabus with the following statement: “This course requires the use of [name of software or service], which is currently not accessible. Michigan State University is committed to providing equal opportunity for participation in all programs, services and activities. Accommodations for persons with disabilities, with documentation from the MSU Resource Center for Persons with Disabilities, may be requested by contacting [insert Professor name or "me"] at the start of the term and/or two weeks prior to the accommodation date (test, project, etc). Requests received after this date will be honored whenever possible. For questions, contact the Resource Center for Persons with Disabilities at 517.884.7273”
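As a quick first pass at the alt-text check above, you can scan exported HTML (for example, a page or quiz exported from your LMS) for images that lack alternative text. This is a minimal sketch using only Python’s standard library and a hypothetical file name, course_page.html; it is no substitute for a full review by the DigitalX team.
```python
# Minimal sketch: flag <img> tags with missing or empty alt text in exported HTML.
# Assumes a hypothetical export named course_page.html; not a full accessibility audit.
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.flagged = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not (attrs.get("alt") or "").strip():
                self.flagged.append(attrs.get("src", "<unknown source>"))

checker = AltTextChecker()
with open("course_page.html", encoding="utf-8") as f:
    checker.feed(f.read())

for src in checker.flagged:
    print(f"Image missing alt text: {src}")
```
Note that present-but-unhelpful alt text (e.g., “image1.png”) still needs a human eye; automated checks only catch the missing cases.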
For more information on Digital Accessibility check out the “Course Accessibility: Commitments, Support, and Resources” article, visit the Accessibility Evaluation Questions for Digital Content page, or contact your college/department’s Web Accessibility Policy Liaison.
Cost as a barrier to access
Buying course materials can cost a lot, creating a barrier for students and impacting their collegiate success. Reducing the cost of required learning resources not only makes college more affordable and attainable, it also promotes equity. Embedding no-cost course materials into a syllabus reduces financial burdens on students, fosters more inclusive access to education, and enables the repurposing, blending, and creation of course content tailored to each class. According to MSU Libraries Open Educational Resources (OER) Program, OER are “teaching, learning, and research resources that are copyright-free (public domain) or have been released under an open license that permits others to reuse, revise, remix, retain, and redistribute them. Examples of OER include open textbooks, videos, images, course modules, lectures, homework assignments, quizzes, lab and classroom activities, games, simulations, and other resources contained in digital media collections from around the world.”
Diversifying Course Materials: A How-To Guide on Inside Higher Ed (previously linked) shared four additional considerations for instructors selecting course materials.
Accessibility, affordability and adaptation
Relatability and reflection
Clarity and intentionality
Alternative perspectives
Read more about each of these four considerations at the link above, and check out the resources below for a deeper look at the authors cited throughout this article.
Resources
Marcella Addy, T., Dube, D., Mitchell, K. A., & SoRelle, M. (2021). What Inclusive Instructors Do. Stylus Publishing. https://doi.org/10.4324/9781003448655
Schucan Bird, K. & Pitman, L. (2020) How diverse is your reading list? Exploring issues of representation and decolonisation in the UK. Higher Education, 79, 903–920. https://doi.org/10.1007/s10734-019-00446-9.
Le Grange, L. (2016). Decolonising the university curriculum. South African Journal of Higher Education. https://doi.org/10.20853/30-2-709.
Blackburn, H. (2017). The status of women in STEM in higher education: a review of the literature 2007–2017. Science & Technology Libraries. https://doi.org/10.1080/0194262X.2017.1371658.
Phillips, S. R., & Archer-Lean, C. (2018). Decolonising the reading of Aboriginal and Torres Strait Islander writing: reflection as transformative practice. Higher Education Research & Development, 38(1), 24–37. https://doi.org/10.1080/07294360.2018.1539956.
Fuentes, M. A., Zelaya, D. G., & Madsen, J. W. (2021). Rethinking the Course Syllabus: Considerations for Promoting Equity, Diversity, and Inclusion. Teaching of Psychology, 48(1), 69-79. https://doi.org/10.1177/0098628320959979
Photo by Paul Hanaoka on Unsplash
Authored by:
Makena Neal

Posted on: #iteachmsu
ASSESSING LEARNING
Exam Strategy for Online and Distance Teaching
Authors: Jeremy Van Hof, Stephen Thomas, Becky Matz, Kate Sonka, Sarah Wellman, Daniel Trego, Casey Henley, Jessica Knott, David Howe
With our guiding principles for remote teaching as flexibility, generosity, and transparency, we know that there is no one solution for assessment that will meet all faculty and student needs. From this perspective, the primary concern should be assessing how well students have achieved the key learning objectives and determining what objectives are still unmet. It may be necessary to modify the nature of the exam to allow for the differences of the online environment. This document, written for any instructor who typically administers an end-of-semester high-stakes final exam, addresses how best to make those modifications. In thinking about online exams, we recommend the following approaches (in priority order) for adjusting exams: multiple lower-stakes assessments, open-note exams, and online proctored exams. When changes to the learning environment occur, creating an inclusive and accessible learning experience for students with disabilities should remain a top priority. This includes providing accessible content and implementing student disability accommodations, as well as considering the ways assessment methods might be affected.
Faculty and students should be prepared to discuss accommodation needs that may arise. The team at MSU Resource Center for Persons with Disabilities (RCPD) will be available to answer questions about implementing accommodations. Contact information for Team RCPD is found at https://www.rcpd.msu.edu/teamrcpd. Below you will find a description of each of the recommendations, tips for their implementation, the benefits of each, and references to pertinent research on each.
There are three primary options*:
Multiple lower-stakes assessments (most preferred)
Open note exams (preferred)
Online proctored exams (if absolutely necessary)
*Performance-based assessments such as laboratory, presentation, music, or art experiences that show proficiency will be discussed in another document
Multiple lower-stakes assessments
Description: The unique circumstances of this semester make it necessary to carefully consider your priorities when assessing students. Rather than being cumulative, a multiple assessment approach makes assessment an incremental process. Students demonstrate their understanding frequently, and accrue points over time, rather than all at once on one test. Dividing the assessment into smaller pieces can reduce anxiety and give students more practice in taking their exams online. For instance, you might have a quiz at the end of each week that students have to complete. Each subsequent quiz can (and should) build on the previous one, allowing students to build toward more complex and rigorous applications of the content. Using this approach minimizes your need to change the types of questions that you have been asking to date, which can affect student performance (e.g. if you normally ask multiple-choice questions, you can continue to do so). For the remainder of the semester, use the D2L quizzes tool to build multiple smaller assessments. Spread out the totality of your typical final exam over the month of April. This can be as simple as dividing a 100 question final exam into eight 12-question “synthesis activities” that students complete bi-weekly.
Benefits as noted from the literature:
No significant differences were observed in terms of keystroke information, rapid guessing, or aggregated scores between proctoring conditions;
More effective method for incentivizing participation and reading;
Encourages knowledge retention, as each subsequent assessment builds on the last.
Rios, J. A., & Liu, O. L. (2017). Online proctored versus unproctored low-stakes internet test administration: Is there differential test-taking behavior and performance? American Journal of Distance Education, 31(4), 226-241. https://www.tandfonline.com/doi/abs/10.1080/08923647.2017.1258628
Schrank, Z. (2016). An assessment of student perceptions and responses to frequent low-stakes testing in introductory sociology classes. Teaching Sociology, 44(2), 118-127. https://journals.sagepub.com/doi/abs/10.1177/0092055X15624745
VanPatten, B., Trego, D., & Hopkins, W. P. (2015). In‐Class vs. Online Testing in University‐Level Language Courses: A Research Report. Foreign Language Annals, 48(4), 659-668. https://onlinelibrary.wiley.com/doi/abs/10.1111/flan.12160
Open note exams
Description: Open note assessments allow students to refer to the Internet and other materials while completing their assessments. By design, this disincentivizes academic dishonesty. Often instructors put time parameters around open note exams. These types of exams also lend themselves to collaborative work in which multiple students work together to complete the assessment. With an open note strategy, you can keep your general exam schedule and point structure, but you may need to revise questions so they are less about factual recall and more about the application of concepts. For instance, you might give students a scenario or case study to which they must apply class concepts, as opposed to asking for specific values or definitions. If you plan to make such changes, communicate your intent and rationale to your students prior to the exam. One effective open note testing technique is to use multiple-true/false questions as a means to measure understanding. These questions (called “multiple selection” questions in D2L) pose a scenario and prompt students to check all the boxes that apply. For example, students may be prompted to read a short case or lab report, then check all statements that are true about that reading. In this way a single question stem can assess multiple levels of complexity and/or comprehension.
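To make the format concrete, here is a small sketch of how partial credit could be computed for a multiple-true/false item: each statement the student classifies correctly (checked when true, left unchecked when false) earns an equal share of the item’s points. This illustrates the question format only; it is not a description of how D2L actually scores “multiple selection” questions.
```python
# Sketch: partial credit for a multiple-true/false ("check all that apply") item.
# Each correctly classified statement earns an equal share of the item's points.
def score_mtf(true_statements, checked, all_statements, points=1.0):
    correct = sum(
        (s in true_statements) == (s in checked)  # checked iff actually true
        for s in all_statements
    )
    return points * correct / len(all_statements)

statements = ["A", "B", "C", "D"]
truth = {"A", "C"}       # statements that are actually true
student = {"A", "B"}     # statements the student checked
print(score_mtf(truth, student, statements))  # 0.5 -- A and D classified correctly
```
Scoring per statement rather than all-or-nothing is what lets a single question stem distinguish complete from partial understanding.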
Benefits as noted from the literature:
Open-book exams and collaborative exams promote development of critical thinking skills.
Open-book exams are more engaging and require higher-order thinking skills.
Application of open-book exams simulates the working environment.
Students prefer open-book exams and report decreased anxiety levels.
Collaborative exams stimulate brain cell growth and intricate cognitive complexes.
Johanns, B., Dinkens, A., & Moore, J. (2017). A systematic review comparing open-book and closed-book examinations: Evaluating effects on development of critical thinking skills. Nurse education in practice, 27, 89-94. https://www.sciencedirect.com/science/article/abs/pii/S1471595317305486
Couch, B. A., Hubbard, J. K., & Brassil, C. E. (2018). Multiple–true–false questions reveal the limits of the multiple–choice format for detecting students with incomplete understandings. BioScience, 68(6), 455-463. https://doi.org/10.1093/biosci/biy037
Implementation for multiple lower-stakes and open note assessment strategies:
Timed vs. untimed: On the whole, performance on timed and untimed assessments yields similar scores. Students express greater anxiety over timed assessments, while they view untimed assessments as more amenable to dishonest behavior.
NOTE: If you typically have a time limit on your face-to-face assessments, increase it by 20% to allow for the added demands a remote (distinct from online) environment places on students.
If the exam is meant to be taken synchronously, remember to stay within your class period. Adjust the length of the exam accordingly.
Reduced scope: Decreasing content covered in the exam may be necessary to create an exam of appropriate length and complexity, given the unique circumstances this semester.
Question pools: Create a pool of questions and let D2L randomly populate each student’s quiz. This helps reduce dishonest behavior (see the sketch after this list).
For example, a 10-question quiz might draw from a pool of 18 questions, 10 of which are randomly distributed to each student by D2L.
Randomize answer order: In questions in which it makes sense, have D2L randomize the order in which the answer options appear.
Individual question per page: This can reduce instances of students taking the assessment together. It is even more effective when question order is randomized and a question pool is used.
Honor code attestation: Give students an opportunity to affirm their intent to be honest by making question one of every assessment a 0-point question asking students to agree to an honor code. You can access the MSU Honor Code: https://www.deanofstudents.msu.edu/academic-integrity
Live Zoom availability: In D2L Quizzes, set a time window during which the assessment will be available to students.
Hold a live open office hours session in Zoom at some point during that window, so that students who want to can take the assessment while they have direct access to you - this way they can ask questions if any arise.
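The pooling and randomization ideas above are easy to picture in code. The sketch below uses Python’s random module to draw 10 of 18 pooled questions per student and shuffle each question’s answer options; it illustrates the mechanics only, since D2L’s Quizzes tool handles all of this for you.
```python
# Sketch of question pooling and answer-order randomization.
# Illustrative only; D2L's Quizzes tool performs this automatically.
import random

pool = [f"Question {i}" for i in range(1, 19)]       # 18 questions in the pool
options = {q: ["A", "B", "C", "D"] for q in pool}    # answer options per question

def build_quiz(pool, n_questions=10, seed=None):
    rng = random.Random(seed)                        # per-student randomness
    drawn = rng.sample(pool, n_questions)            # each student gets 10 of 18
    return [(q, rng.sample(options[q], len(options[q]))) for q in drawn]

# Each student sees a different subset of questions and a different answer order.
for student in ("student_a", "student_b"):
    print(student, build_quiz(pool, seed=student)[:1])
```
The practical upshot: with 10 questions drawn from 18 and shuffled options, two students sitting side by side are very unlikely to see the same quiz in the same order.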
Ultimately, our guiding principles for online teaching are flexibility, generosity, and transparency. Try to give students as much of an opportunity to demonstrate their knowledge as possible.
Consider allowing multiple attempts on an assessment.
When conditions allow, consider allowing multiple means of expression.
Can students choose to demonstrate their knowledge from a menu of options?
Multiple-choice test
Written response
Video presentation
Oral exam (via Zoom)
Consider giving students choices. Perhaps they can opt out of answering a question or two. Perhaps they can choose which of a series of prompts to respond to. Perhaps students can waive one test score (to help accommodate their rapidly changing environments).
Proctored assessments
Description: Respondus Lockdown Browser and Respondus Monitor are tools for remote proctoring in D2L. More information is available at https://help.d2l.msu.edu/node/4686. Please consider whether your assessments can be designed without the need for Respondus. While Respondus may be helpful in limited circumstances (e.g., when assessments must be proctored for accreditation purposes), introducing a new technology may cause additional stress for both students and instructors, and academic integrity is still not assured. High-stakes exams (those that are a large percentage of a student’s grade) that use new technologies and approaches can decrease student performance and may not reflect students’ understanding of the material. Please do not use an online proctored approach unless your assessment needs require its use.
Benefits:
Increases the barrier to academic dishonesty.
Allows for use of existing exams (assuming they are translated into D2L’s Quizzes tool).
Implementation:
Any online proctored exam must be created and administered using D2L’s Quizzes tool.
Prior to offering a graded proctored exam, we strongly recommend that you administer an ungraded (or very low-stakes) practice test using the proctoring tool.
Clear communication with students about system and hardware requirements and timing considerations is required.
MSU has gained temporary no-cost access to a pair of online proctoring tools provided by Respondus: https://help.d2l.msu.edu/node/4686
Respondus Lockdown Browser requires that students download and install a dedicated web browser.
When they click into your exam, the Lockdown Browser opens and prevents them from accessing anything else on their computer.
Respondus Monitor requires use of Respondus Lockdown Browser and a webcam.
Students are monitored via the webcam while they complete the exam in Lockdown Browser.
Additional Resources:
Remote Assessment Quick Guide
Remote Assessment Video Conversation
D2L Quizzes Tool Guide
Self-training on D2L Quizzes (login to MSU’s D2L is required; self-enroll into the training course)
References:
Alessio, H. M., Malay, N., Maurer, K., Bailer, A. J., & Rubin, B. (2017). Examining the effect of proctoring on online test scores. Online Learning, 21(1).
Altınay, Z. (2017). Evaluating peer learning and assessment in online collaborative learning environments. Behaviour & Information Technology, 36(3), 312–320. https://doi.org/10.1080/0144929X.2016.1232752
Couch, B. A., Hubbard, J. K., & Brassil, C. E. (2018). Multiple-true-false questions reveal the limits of the multiple-choice format for detecting students with incomplete understandings. BioScience, 68(6), 455–463. https://doi.org/10.1093/biosci/biy037
Cramp, J., Medlin, J. F., Lake, P., & Sharp, C. (2019). Lessons learned from implementing remotely invigilated online exams. Journal of University Teaching & Learning Practice, 16(1).
Guerrero-Roldán, A., & Noguera, I. (2018). A model for aligning assessment with competences and learning activities in online courses. The Internet and Higher Education, 38, 36–46. https://doi.org/10.1016/j.iheduc.2018.04.005
Johanns, B., Dinkens, A., & Moore, J. (2017). A systematic review comparing open-book and closed-book examinations: Evaluating effects on development of critical thinking skills. Nurse Education in Practice, 27, 89–94. https://www.sciencedirect.com/science/article/abs/pii/S1471595317305486
Rios, J. A., & Liu, O. L. (2017). Online proctored versus unproctored low-stakes internet test administration: Is there differential test-taking behavior and performance? American Journal of Distance Education, 31(4), 226–241. https://doi.org/10.1080/08923647.2017.1258628
Schrank, Z. (2016). An assessment of student perceptions and responses to frequent low-stakes testing in introductory sociology classes. Teaching Sociology, 44(2), 118–127. https://journals.sagepub.com/doi/abs/10.1177/0092055X15624745
Soffer, T., et al. (2017). Assessment of online academic courses via students' activities and perceptions. Studies in Educational Evaluation, 54, 83–93. https://doi.org/10.1016/j.stueduc.2016.10.001
Tan, C. (2020). Beyond high-stakes exam: A neo-Confucian educational programme and its contemporary implications. Educational Philosophy and Theory, 52(2), 137–148. https://doi.org/10.1080/00131857.2019.1605901
VanPatten, B., Trego, D., & Hopkins, W. P. (2015). In-class vs. online testing in university-level language courses: A research report. Foreign Language Annals, 48(4), 659–668. https://onlinelibrary.wiley.com/doi/abs/10.1111/flan.12160
Authored by:
Jeremy Van Hof, Stephen Thomas, Becky Matz, Kate Sonka, S...

Posted on: #iteachmsu

Exam Strategy for Online and Distance Teaching
ASSESSING LEARNING
Wednesday, Sep 2, 2020
Posted on: MSU Online & Remote Teaching
ASSESSING LEARNING
Exam Strategy for Remote Teaching
Our guiding principles for remote teaching are flexibility, generosity, and transparency, and we know that no single assessment solution will meet all faculty and student needs. From this perspective, the primary concern should be assessing how well students have achieved the key learning objectives and determining which objectives remain unmet. It may be necessary to modify the nature of the exam to allow for the differences of the remote environment. This document, written for any instructor who typically administers a high-stakes end-of-semester final exam, addresses how best to make those modifications. In thinking about online exams and the current situation for remote teaching, we recommend the following approaches (in priority order) for adjusting exams: multiple lower-stakes assessments, open-note exams, and online proctored exams. When changes to the learning environment occur, creating an inclusive and accessible learning experience for students with disabilities should remain a top priority. This includes providing accessible content and implementing student disability accommodations, as well as considering the ways assessment methods might be affected.
Faculty and students should be prepared to discuss accommodation needs that may arise. The team at MSU Resource Center for Persons with Disabilities (RCPD) will be available to answer questions about implementing accommodations. Contact information for Team RCPD is found at https://www.rcpd.msu.edu/teamrcpd. Below you will find a description of each recommendation, tips for its implementation, its benefits, and references to pertinent research.
There are three primary options*:
Multiple lower-stakes assessments (most preferred)
Open note exams (preferred)
Online proctored exams (if absolutely necessary)
*Performance-based assessments such as laboratory, presentation, music, or art experiences that show proficiency will be discussed in another document
Multiple lower-stakes assessments
Description: The unique circumstances of this semester make it necessary to carefully consider your priorities when assessing students. Rather than being cumulative, a multiple-assessment approach makes assessment an incremental process. Students demonstrate their understanding frequently and accrue points over time, rather than all at once on one test. Dividing the assessment into smaller pieces can reduce anxiety and give students more practice in taking their exams online. For instance, you might have a quiz at the end of each week that students have to complete. Each subsequent quiz can (and should) build on the previous one, allowing students to build toward more complex and rigorous applications of the content. Using this approach minimizes your need to change the types of questions that you have been asking to date, which can affect student performance (e.g., if you normally ask multiple-choice questions, you can continue to do so). For the remainder of the semester, use the D2L Quizzes tool to build multiple smaller assessments. Spread the totality of your typical final exam over the month of April. This can be as simple as dividing a 100-question final exam into eight 12- or 13-question "synthesis activities" that students complete twice each week (a brief sketch of the split follows).
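As a worked example of that split (the numbers here are hypothetical; your questions would live in D2L, not in a script), the following Python sketch partitions a 100-question bank into eight nearly equal quizzes: four of 13 questions and four of 12.

def split_exam(questions, num_quizzes=8):
    # Partition a question list into num_quizzes nearly equal chunks.
    base, extra = divmod(len(questions), num_quizzes)  # 100 / 8 -> 12 each, 4 left over
    quizzes, start = [], 0
    for i in range(num_quizzes):
        size = base + (1 if i < extra else 0)  # the first four chunks get the extra question
        quizzes.append(questions[start:start + size])
        start += size
    return quizzes

final_exam = [f"Q{n}" for n in range(1, 101)]  # a 100-question final
for week, quiz in enumerate(split_exam(final_exam), start=1):
    print(f"Synthesis activity {week}: {len(quiz)} questions")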
Benefits as noted from the literature:
No significant differences were observed in terms of keystroke information, rapid guessing, or aggregated scores between proctoring conditions;
More effective method for incentivizing participation and reading;
Encourages knowledge retention as each subsequent assessment builds on the last
Rios, J. A., & Liu, O. L. (2017). Online proctored versus unproctored low-stakes internet test administration: Is there differential test-taking behavior and performance? American Journal of Distance Education, 31(4), 226–241. https://www.tandfonline.com/doi/abs/10.1080/08923647.2017.1258628
Schrank, Z. (2016). An assessment of student perceptions and responses to frequent low-stakes testing in introductory sociology classes. Teaching Sociology, 44(2), 118–127. https://journals.sagepub.com/doi/abs/10.1177/0092055X15624745
VanPatten, B., Trego, D., & Hopkins, W. P. (2015). In-class vs. online testing in university-level language courses: A research report. Foreign Language Annals, 48(4), 659–668. https://onlinelibrary.wiley.com/doi/abs/10.1111/flan.12160
Open note exams
Description: Open note assessments allow students to refer to the Internet and other materials while completing their assessments. By design, this disincentivizes academic dishonesty. Instructors often put time parameters around open note exams. These types of exams also lend themselves to collaborative work in which multiple students work together to complete the assessment. With an open note strategy, you can keep your general exam schedule and point structure, but you may need to revise questions so they are less about factual recall and more about the application of concepts. For instance, you might give students a scenario or case study to which they must apply class concepts, as opposed to asking for specific values or definitions. If you plan to make such changes, communicate your intent and rationale to your students prior to the exam. One effective open note testing technique is to use multiple-true/false questions as a means to measure understanding. These questions (called "multiple selection" questions in D2L) pose a scenario and prompt students to check all the boxes that apply. For example, students may be prompted to read a short case or lab report, then check all statements that are true about that reading. In this way a single question stem can assess multiple levels of complexity and/or comprehension (one possible scoring scheme is sketched below).
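To show how a single multiple-true/false item can register several levels of understanding, here is a small Python sketch of one possible partial-credit scheme: credit for each statement the student judges correctly (checked when true, left unchecked when false). The scheme and the sample statements are assumptions for illustration; D2L's multi-select questions come with their own built-in grading options.

def score_multi_tf(key, response):
    # key and response map each statement to True (checked) or False (unchecked);
    # the score is the fraction of statements the student judged correctly.
    correct = sum(response.get(stmt, False) == truth for stmt, truth in key.items())
    return correct / len(key)

answer_key = {
    "The control group received a placebo": True,
    "The sample size was under 30": False,
    "The results were statistically significant": True,
}
student_response = {
    "The control group received a placebo": True,
    "The sample size was under 30": True,  # incorrectly checked
    "The results were statistically significant": True,
}
print(score_multi_tf(answer_key, student_response))  # 2 of 3 correct -> about 0.67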
Benefits as noted from the literature:
Open-book exams and collaborative exams promote development of critical thinking skills.
Open-book exams are more engaging and require higher-order thinking skills.
Application of open-book exams simulates the working environment.
Students prefer open-book exams and report decreased anxiety levels.
Collaborative exams stimulate brain cell growth and intricate cognitive complexes.
Johanns, B., Dinkens, A., & Moore, J. (2017). A systematic review comparing open-book and closed-book examinations: Evaluating effects on development of critical thinking skills. Nurse education in practice, 27, 89-94. https://www.sciencedirect.com/science/article/abs/pii/S1471595317305486
Couch, B. A., Hubbard, J. K., & Brassil, C. E. (2018). Multiple–true–false questions reveal the limits of the multiple–choice format for detecting students with incomplete understandings. BioScience, 68(6), 455-463. https://doi.org/10.1093/biosci/biy037
Implementation for multiple lower-stakes and open note assessment strategies:
Timed vs. untimed: On the whole, performance on timed and untimed assessments yields similar scores. Students express greater anxiety over timed assessments, while they view untimed assessments as more amenable to dishonest behavior.
NOTE: If you typically have a time limit on your face-to-face assessments, increase it by 20% (e.g., a 50-minute exam becomes a 60-minute exam) to allow for the added demands the remote environment places on students (a short sketch after this list works through the arithmetic).
If the exam is meant to be taken synchronously, remember to stay within your class period. Adjust the length of the exam accordingly.
Reduced scope: Decreasing content covered in the exam may be necessary to create an exam of appropriate length and complexity, given the unique circumstances this semester.
Question pools: Create a pool of questions, and let D2L randomly populate each student's quiz. This helps reduce dishonest behavior.
For example, a 10-question quiz might draw from a pool of 18 questions, 10 of which are randomly distributed to each student by D2L.
Randomize answer order: In questions in which it makes sense, have D2L randomize the order in which the answer options appear.
Individual question per page: This can reduce instances of students taking the assessment together. It is even more effective when question order is randomized and a question pool is used.
Honor code attestation: Give students an opportunity to affirm their intent to be honest by making question one of every assessment a 0-point question asking students to agree to an honor code. You can access the MSU Honor Code: https://www.deanofstudents.msu.edu/academic-integrity
Live Zoom availability: In D2L Quizzes, set a time window during which the assessment will be available to students.
Hold a live open office hours session in Zoom at some point during that window, so that students who want to can take the assessment while they have direct access to you - this way they can ask questions if any arise.
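Two of the tips above involve simple arithmetic that is easy to get wrong under deadline pressure: extending a face-to-face time limit by 20% and making sure that limit still fits inside the D2L availability window. Here is a hypothetical Python sketch; the dates, times, and limits are made up for illustration, and in practice you would enter these values directly in the Quizzes tool.

from datetime import datetime, timedelta

face_to_face_minutes = 60                       # the original in-class time limit
remote_minutes = round(face_to_face_minutes * 1.2)  # +20% for the remote environment -> 72
remote_limit = timedelta(minutes=remote_minutes)

window_open = datetime(2020, 4, 20, 9, 0)       # D2L availability window: 9:00 a.m. ...
window_close = datetime(2020, 4, 20, 21, 0)     # ... to 9:00 p.m. the same day

latest_start = window_close - remote_limit      # last moment a student can begin and still finish
print("Remote time limit:", remote_limit)               # 1:12:00
print(f"Latest possible start: {latest_start:%H:%M}")   # 19:48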
Ultimately, our guiding principles for remote teaching are flexibility, generosity, and transparency. Try to give students as much of an opportunity to demonstrate their knowledge as possible.
Consider allowing multiple attempts on an assessment.
Where conditions permit, consider allowing multiple means of expression.
Can students choose to demonstrate their knowledge from a menu of options?
Multiple-choice test
Written response
Video presentation
Oral Exam (via Zoom)
Consider giving students choices. Perhaps they can opt out of answering a question or two. Perhaps they can choose which of a series of prompts to respond to. Perhaps students can waive one test score (to help accommodate their rapidly changing environments).
Proctored assessments
Description: Respondus Lockdown Browser and Respondus Monitor are tools for remote proctoring in D2L. More information is available at https://help.d2l.msu.edu/node/4686. Please consider whether your assessments can be designed without the need for Respondus. While Respondus may be helpful in limited circumstances (e.g., when assessments must be proctored for accreditation purposes), introducing a new technology may cause additional stress for both students and instructors, and academic integrity is still not assured. High-stakes exams (those that are a large percentage of a student’s grade) that use new technologies and approaches can decrease student performance and may not reflect students’ understanding of the material. Please do not use an online proctored approach unless your assessment needs require its use.
Benefits:
Increases the barrier to academic dishonesty. Allows for use of existing exams (assuming they are translated into D2L's Quizzes tool).
Implementation:
Any online proctored exam must be created and administered using D2L’s Quizzes tool.
Prior to offering a graded proctored exam, we strongly recommend that you administer an ungraded (or very low-stakes) practice test using the proctoring tool.
Clear communication with students about system and hardware requirements and timing considerations is required.
MSU has gained temporary no-cost access to a pair of online proctoring tools provided by Respondus: https://help.d2l.msu.edu/node/4686
Respondus Lockdown Browser requires that students download and install a dedicated web browser.
When they click into your exam, the Lockdown Browser opens and prevents them from accessing anything else on their computer.
Respondus Monitor requires use of Respondus Lockdown Browser and a webcam.
Students are monitored via the webcam while they complete the exam in Lockdown Browser.
Additional Resources:
Remote Assessment Quick Guide
Remote Assessment Video Conversation
D2L Quizzes Tool Guide
Self-training on D2L Quizzes (login to MSU’s D2L is required; self-enroll into the training course)
References:
Alessio, H. M., Malay, N., Maurer, K., Bailer, A. J., & Rubin, B. (2017). Examining the effect of proctoring on online test scores. Online Learning, 21(1).
Altınay, Z. (2017). Evaluating peer learning and assessment in online collaborative learning environments. Behaviour & Information Technology, 36(3), 312–320. https://doi.org/10.1080/0144929X.2016.1232752
Couch, B. A., Hubbard, J. K., & Brassil, C. E. (2018). Multiple-true-false questions reveal the limits of the multiple-choice format for detecting students with incomplete understandings. BioScience, 68(6), 455–463. https://doi.org/10.1093/biosci/biy037
Cramp, J., Medlin, J. F., Lake, P., & Sharp, C. (2019). Lessons learned from implementing remotely invigilated online exams. Journal of University Teaching & Learning Practice, 16(1).
Guerrero-Roldán, A., & Noguera, I. (2018). A model for aligning assessment with competences and learning activities in online courses. The Internet and Higher Education, 38, 36–46. https://doi.org/10.1016/j.iheduc.2018.04.005
Johanns, B., Dinkens, A., & Moore, J. (2017). A systematic review comparing open-book and closed-book examinations: Evaluating effects on development of critical thinking skills. Nurse Education in Practice, 27, 89–94. https://www.sciencedirect.com/science/article/abs/pii/S1471595317305486
Rios, J. A., & Liu, O. L. (2017). Online proctored versus unproctored low-stakes internet test administration: Is there differential test-taking behavior and performance? American Journal of Distance Education, 31(4), 226–241. https://doi.org/10.1080/08923647.2017.1258628
Schrank, Z. (2016). An assessment of student perceptions and responses to frequent low-stakes testing in introductory sociology classes. Teaching Sociology, 44(2), 118–127. https://journals.sagepub.com/doi/abs/10.1177/0092055X15624745
Soffer, T., et al. (2017). Assessment of online academic courses via students' activities and perceptions. Studies in Educational Evaluation, 54, 83–93. https://doi.org/10.1016/j.stueduc.2016.10.001
Tan, C. (2020). Beyond high-stakes exam: A neo-Confucian educational programme and its contemporary implications. Educational Philosophy and Theory, 52(2), 137–148. https://doi.org/10.1080/00131857.2019.1605901
VanPatten, B., Trego, D., & Hopkins, W. P. (2015). In-class vs. online testing in university-level language courses: A research report. Foreign Language Annals, 48(4), 659–668. https://onlinelibrary.wiley.com/doi/abs/10.1111/flan.12160
Authored by:
Jessica Knott, Stephen Thomas, Becky Matz, Kate Sonka, Sa...

Posted on: MSU Online & Remote Teaching

Exam Strategy for Remote Teaching
ASSESSING LEARNING
Tuesday, Jul 7, 2020