Posted on: MSU Online & Remote Teaching
ASSESSING LEARNING
Exam Strategy for Remote Teaching
With flexibility, generosity, and transparency as our guiding principles for remote teaching, we know that there is no one solution for assessment that will meet all faculty and student needs. From this perspective, the primary concern should be assessing how well students have achieved the key learning objectives and determining which objectives are still unmet. It may be necessary to modify the nature of the exam to allow for the differences of the remote environment. This document, written for any instructor who typically administers an end-of-semester high-stakes final exam, addresses how best to make those modifications. In thinking about online exams, and the current situation for remote teaching, we recommend the following approaches (in priority order) for adjusting exams: multiple lower-stakes assessments, open-note exams, and online proctored exams. When changes to the learning environment occur, creating an inclusive and accessible learning experience for students with disabilities should remain a top priority. This includes providing accessible content and implementing student disability accommodations, as well as considering the ways assessment methods might be affected.
Faculty and students should be prepared to discuss accommodation needs that may arise. The team at MSU Resource Center for Persons with Disabilities (RCPD) will be available to answer questions about implementing accommodations. Contact information for Team RCPD is found at https://www.rcpd.msu.edu/teamrcpd. Below you will find a description of each of the recommendations, tips for their implementation, the benefits of each, and references to pertinent research on each.
There are three primary options*:
Multiple lower-stakes assessments (most preferred)
Open note exams (preferred)
Online proctored exams (if absolutely necessary)
*Performance-based assessments such as laboratory, presentation, music, or art experiences that show proficiency will be discussed in another document
Multiple lower-stakes assessments
Description: The unique circumstances of this semester make it necessary to carefully consider your priorities when assessing students. Rather than being cumulative, a multiple assessment approach makes assessment an incremental process. Students demonstrate their understanding frequently and accrue points over time, rather than all at once on one test. Dividing the assessment into smaller pieces can reduce anxiety and give students more practice in taking their exams online. For instance, you might have a quiz at the end of each week that students have to complete. Each subsequent quiz can (and should) build on the previous one, allowing students to build toward more complex and rigorous applications of the content. Using this approach minimizes your need to change the types of questions that you have been asking to date, which can affect student performance (e.g., if you normally ask multiple-choice questions, you can continue to do so). For the remainder of the semester, use the D2L Quizzes tool to build multiple smaller assessments. Spread the totality of your typical final exam over the month of April. This can be as simple as dividing a 100-question final exam into eight 12- or 13-question “synthesis activities” that students complete twice a week.
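To see how such a split works out, here is a minimal sketch in Python; the quiz count, start date, and three-day spacing are hypothetical numbers for illustration, not a prescribed schedule.

from datetime import date, timedelta

def plan_synthesis_activities(total_questions, num_quizzes, start, days_between):
    # Split the question count as evenly as possible: divmod(100, 8)
    # yields four 13-question and four 12-question quizzes.
    base, extra = divmod(total_questions, num_quizzes)
    plan = []
    for i in range(num_quizzes):
        plan.append({
            "quiz": i + 1,
            "questions": base + (1 if i < extra else 0),
            "opens": start + timedelta(days=i * days_between),
        })
    return plan

# Hypothetical example: a 100-question final spread over April,
# roughly two quizzes per week.
for quiz in plan_synthesis_activities(100, 8, date(2020, 4, 1), 3):
    print(quiz)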
Benefits as noted from the literature:
No significant differences were observed in terms of keystroke information, rapid guessing, or aggregated scores between proctoring conditions;
More effective method for incentivizing participation and reading;
Encourages knowledge retention as each subsequent assessment builds on the last
Rios, J. A., & Liu, O. L. (2017). Online proctored versus unproctored low-stakes internet test administration: Is there differential test-taking behavior and performance? American Journal of Distance Education, 31(4), 226-241. https://www.tandfonline.com/doi/abs/10.1080/08923647.2017.1258628
Schrank, Z. (2016). An assessment of student perceptions and responses to frequent low-stakes testing in introductory sociology classes. Teaching Sociology, 44(2), 118-127. https://journals.sagepub.com/doi/abs/10.1177/0092055X15624745
VanPatten, B., Trego, D., & Hopkins, W. P. (2015). In-class vs. online testing in university-level language courses: A research report. Foreign Language Annals, 48(4), 659-668. https://onlinelibrary.wiley.com/doi/abs/10.1111/flan.12160
Open note exams
Description: Open note assessments allow students to refer to the Internet and other materials while completing their assessments. By design, this disincentivizes academic dishonesty. Often instructors put time parameters around open note exams. These types of exams also lend themselves to collaborative work in which multiple students work together to complete the assessment. With an open note strategy, you can keep your general exam schedule and point structure, but you may need to revise questions so they are less about factual recall and more about the application of concepts. For instance, you might give students a scenario or case study to which they have to apply class concepts, as opposed to asking for specific values or definitions. If you plan to make such changes, communicate your intent and rationale to your students prior to the exam. One effective open note testing technique is to use multiple-true/false questions as a means to measure understanding. These questions (called “multiple selection” questions in D2L) pose a scenario and prompt students to check all the boxes that apply. For example, students may be prompted to read a short case or lab report, then check all statements that are true about that reading. In this way a single question stem can assess multiple levels of complexity and/or comprehension.
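To make the scoring idea concrete, here is a minimal sketch in Python of one common partial-credit scheme for multiple-true/false items, in which each statement is judged independently; the statements are hypothetical, and D2L's actual scoring options may differ.

def score_multi_select(true_statements, all_statements, checked):
    # Judge each statement independently: credit for checking a true
    # statement or for leaving a false one unchecked.
    correct = sum(
        (statement in true_statements) == (statement in checked)
        for statement in all_statements
    )
    return correct / len(all_statements)

# Hypothetical item with four statements, two of which are true.
statements = {"A", "B", "C", "D"}
truth = {"A", "C"}
print(score_multi_select(truth, statements, {"A", "D"}))  # 0.5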
Benefits as noted from the literature:
Open-book exams and collaborative exams promote development of critical thinking skills.
Open-book exams are more engaging and require higher-order thinking skills.
Application of open-book exams simulates the working environment.
Students prefer open-book exams and report decreased anxiety levels.
Collaborative exams promote deeper cognitive engagement.
Johanns, B., Dinkens, A., & Moore, J. (2017). A systematic review comparing open-book and closed-book examinations: Evaluating effects on development of critical thinking skills. Nurse education in practice, 27, 89-94. https://www.sciencedirect.com/science/article/abs/pii/S1471595317305486
Couch, B. A., Hubbard, J. K., & Brassil, C. E. (2018). Multiple–true–false questions reveal the limits of the multiple–choice format for detecting students with incomplete understandings. BioScience, 68(6), 455-463. https://doi.org/10.1093/biosci/biy037
Implementation for multiple lower-stakes and open note assessment strategies:
Timed vs. untimed: On the whole, performance on timed and untimed assessments yields similar scores. Students express greater anxiety over timed assessments, while they view untimed assessments as more amenable to dishonest behavior.
NOTE: If you typically have a time limit on your face-to-face assessments, increase it by 20% to allow for the added demands the remote environment places on students.
If the exam is meant to be taken synchronously, remember to stay within your class period. Adjust the length of the exam accordingly.
Reduced scope: Decreasing content covered in the exam may be necessary to create an exam of appropriate length and complexity, given the unique circumstances this semester.
Question pools: Create a pool of questions and let D2L randomly populate each student’s quiz. This helps reduce dishonest behavior.
For example, a 10-question quiz might draw from a pool of 18 questions, with D2L randomly distributing 10 of them to each student (see the sketch after this list).
Randomize answer order: For questions where it makes sense, have D2L randomize the order in which the answer options appear.
Individual question per page: This can reduce instances of students taking the assessment together. It is even more effective when question order is randomized and a question pool is used.
Honor code attestation: Give students an opportunity to affirm their intent to be honest by making the first question of every assessment a 0-point question asking students to agree to an honor code. You can access the MSU Honor Code: https://www.deanofstudents.msu.edu/academic-integrity
Live Zoom availability: In D2L Quizzes, set a time window during which the assessment will be available to students.
Hold a live open office hours session in Zoom at some point during that window, so that students who want to can take the assessment while they have direct access to you; that way they can ask questions if any arise.
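As promised above, here is a minimal sketch in Python of what a quiz tool does with question pools and answer-order randomization; the pool size, draw size, and per-student seeding are illustrative assumptions, not D2L's actual implementation.

import random

def build_quiz(pool, num_questions, seed):
    # Seed a per-student generator (e.g., from the student ID number)
    # so the same draw can be reproduced later for review or regrading.
    rng = random.Random(seed)
    questions = rng.sample(pool, num_questions)  # e.g., 10 of 18
    quiz = []
    for q in questions:
        options = q["options"][:]
        rng.shuffle(options)  # randomize answer order per student
        quiz.append({"prompt": q["prompt"], "options": options})
    return quiz

# Hypothetical 18-question pool; each student gets 10 of them.
pool = [{"prompt": f"Question {i}", "options": ["a", "b", "c", "d"]}
        for i in range(1, 19)]
quiz_for_student = build_quiz(pool, 10, seed=20250001)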
Ultimately, our guiding principles for remote teaching are flexibility, generosity, and transparency. Try to give students as much of an opportunity to demonstrate their knowledge as possible.
Consider allowing multiple attempts on an assessment.
When conditions allow, consider allowing multiple means of expression.
Can students choose to demonstrate their knowledge from a menu of options?
Multiple-choice test
Written response
Video presentation
Oral exam (via Zoom)
Consider giving students choices. Perhaps they can opt out of answering a question or two. Perhaps they can choose which of a series of prompts to respond to. Perhaps students can drop one test score (to help accommodate their rapidly changing environments).
Proctored assessments
Description: Respondus Lockdown Browser and Respondus Monitor are tools for remote proctoring in D2L. More information is available at https://help.d2l.msu.edu/node/4686. Please consider whether your assessments can be designed without the need for Respondus. While Respondus may be helpful in limited circumstances (e.g., when assessments must be proctored for accreditation purposes), introducing a new technology may cause additional stress for both students and instructors, and academic integrity is still not assured. High-stakes exams (those that are a large percentage of a student’s grade) that use new technologies and approaches can decrease student performance and may not reflect students’ understanding of the material. Please do not use an online proctored approach unless your assessment needs require its use.
Benefits:
Increases the barrier to academic dishonesty. Allows for use of existing exams (assuming they are translated into D2L’s Quizzes tool).
Implementation:
Any online proctored exam must be created and administered using D2L’s Quizzes tool.
Prior to offering a graded proctored exam, we strongly recommend that you administer an ungraded (or very low-stakes) practice test using the proctoring tool.
Clear communication with students about system and hardware requirements and timing considerations is required.
MSU has gained temporary no-cost access to a pair of online proctoring tools provided by Respondus: https://help.d2l.msu.edu/node/4686
Respondus Lockdown Browser requires that students download a dedicated, locked-down web browser.
When they click into your exam, the Lockdown Browser opens and prevents them from accessing anything else on their computer.
Respondus Monitor requires use of Respondus Lockdown Browser and a webcam.
Students are monitored via the webcam while they complete the exam in Lockdown Browser.
Additional Resources:
Remote Assessment Quick Guide
Remote Assessment Video Conversation
D2L Quizzes Tool Guide
Self-training on D2L Quizzes (login to MSU’s D2L is required; self-enroll into the training course)
References:
Alessio, H. M., Malay, N., Maurer, K., Bailer, A. J., & Rubin, B. (2017). Examining the effect of proctoring on online test scores. Online Learning, 21(1).
Altınay, Z. (2017). Evaluating peer learning and assessment in online collaborative learning environments. Behaviour & Information Technology, 36(3), 312-320. https://doi.org/10.1080/0144929X.2016.1232752
Couch, B. A., Hubbard, J. K., & Brassil, C. E. (2018). Multiple-true-false questions reveal the limits of the multiple-choice format for detecting students with incomplete understandings. BioScience, 68(6), 455-463. https://doi.org/10.1093/biosci/biy037
Cramp, J., Medlin, J. F., Lake, P., & Sharp, C. (2019). Lessons learned from implementing remotely invigilated online exams. Journal of University Teaching & Learning Practice, 16(1).
Guerrero-Roldán, A., & Noguera, I. (2018). A model for aligning assessment with competences and learning activities in online courses. The Internet and Higher Education, 38, 36-46. https://doi.org/10.1016/j.iheduc.2018.04.005
Johanns, B., Dinkens, A., & Moore, J. (2017). A systematic review comparing open-book and closed-book examinations: Evaluating effects on development of critical thinking skills. Nurse Education in Practice, 27, 89-94. https://www.sciencedirect.com/science/article/abs/pii/S1471595317305486
Rios, J. A., & Liu, O. L. (2017). Online proctored versus unproctored low-stakes internet test administration: Is there differential test-taking behavior and performance? American Journal of Distance Education, 31(4), 226-241. https://doi.org/10.1080/08923647.2017.1258628
Schrank, Z. (2016). An assessment of student perceptions and responses to frequent low-stakes testing in introductory sociology classes. Teaching Sociology, 44(2), 118-127. https://journals.sagepub.com/doi/abs/10.1177/0092055X15624745
Soffer, T., et al. (2017). Assessment of online academic courses via students’ activities and perceptions. Studies in Educational Evaluation, 54, 83-93. https://doi.org/10.1016/j.stueduc.2016.10.001
Tan, C. (2020). Beyond high-stakes exam: A neo-Confucian educational programme and its contemporary implications. Educational Philosophy and Theory, 52(2), 137-148. https://doi.org/10.1080/00131857.2019.1605901
VanPatten, B., Trego, D., & Hopkins, W. P. (2015). In-class vs. online testing in university-level language courses: A research report. Foreign Language Annals, 48(4), 659-668. https://onlinelibrary.wiley.com/doi/abs/10.1111/flan.12160
Authored by:
Jessica Knott, Stephen Thomas, Becky Matz, Kate Sonka, Sa...

Posted on: Spring Conference on Teaching & Learning
PEDAGOGICAL DESIGN
Keynote I: Drawing to Teach: Visualizing our Curriculum for Reflection and Community
Stephen Thomas
Title: Drawing to Teach: Visualizing our Curriculum for Reflection and Community
Location: Room 2130
College courses and programs of study comprise a complex arrangement of structures and processes that can make them difficult to conceptualize or communicate to others. When describing a course to others, we often fall back on simplistic narratives of the topic without referencing the pedagogy, assessment, learning environment, resources, student engagement, or a myriad of other impactful features. In this presentation we will look at what it might mean to use visual tools and formats to more formatively represent our curriculum, allowing us to reflect on our teaching, receive feedback from colleagues, and foster community around our teaching efforts.
Dr. Stephen Thomas is the Assistant Dean for STEM Education Teaching and Learning, the Associate Director for the Center for Integrative Studies in General Science, and the Digital Curriculum Coordinator for the College of Natural Science at MSU. For his bachelor’s degree from Denison University, Stephen majored in Biology and minored in Art. This interest in the science/art intersection continued into graduate school as he freelanced as a biological illustrator while earning his master’s and Ph.D. at the University of Massachusetts at Amherst in Organismal and Evolutionary Biology and Entomology. Since coming to MSU, Stephen’s focus has shifted from the virulence of fungal pathogens of Lymantria dispar to visual communication of science in formal and informal settings and the use of technology in teaching. Stephen has worked on projects such as the use of comics to reduce subject anxiety in non-major science courses, the development of a Massive Open Online Course (MOOC) to teach general science, and augmented reality and kiosk games to engage visitors in science museums. In more recent projects, Stephen has worked on curriculum for Drawing to Learn Biology, where students explore science practices of observation and visual model-based reasoning through nature journaling. In his professional development work, Stephen collaborates with Dr. Julie Libarkin on building communities of practice in STEM teaching, STEM education research, and interdisciplinary experiences in art, science, and culture. You can learn more about this work at the STEMed@State website.
Authored by:
Stephen Thomas, Associate Director, CISGS; Assistant Dean...

Posted on: Teaching Toolkit Tailgate
NAVIGATING CONTEXT
MSU Faculty Attitudes towards Teaching: Reports from the Field
Image from insidehighered.com
How do MSU faculty view their strengths and weaknesses as educators?
What resources do they need to continue to grow?
In 2018, our Learning Community of Adams Academy graduates surveyed 215 faculty to find out.
Here are some of our results:
Strengths: We see ourselves as having more strengths than challenges, especially:
Teaching with enthusiasm
Fostering active learning
Female respondents also cited mentoring, teaching teachers, facilitating connections, and creating community.
Challenges:
Student assessment was the most commonly cited challenge
Fostering active learning (again!)
Fostering dialogue
Familiarity with evidence-based teaching practices: much variation!
In the Broad College of Business and the College of Music, no respondents were familiar with the concept (or at least the term).
In James Madison College, the College of Law, the College of Veterinary Medicine, and the College of Osteopathic Medicine, all respondents were familiar with it.
Labor categories: only tenure-track and “other” respondents gave a plurality of “no” answers; tenured, fixed-term, and academic specialist respondents gave a plurality of “yes” answers.
Barriers to developing teaching practice:
“More time” was the most common response.
Most frequently used resources for developing teaching practice:
Brown Bag or Learn at Lunch presentations
Departmental workshops
Academic Advancement Network
MSU Learning Communities
Following our survey, in 2019 we developed a peer-observation protocol.
If you’re interested in trying it out, either in your own department or with one of our group, please contact Mike or Cheryl.
Dr. Cheryl Caesar, caesarc@msu.edu
Dr. Michael Ristich, ristich@msu.edu
Authored by:
Cheryl Caesar and Mike Ristich
Posted on: #iteachmsu
NAVIGATING CONTEXT
Making Something Out of Nothing: Experiential Learning, Digital Publishing, and Budget Cuts
The Cube (publishing - process - praxis) is a publishing nexus housed in Michigan State University's Department of Writing, Rhetoric, and American Cultures (WRAC). The Cube supports, promotes, and produces open-access works created by diverse members of the mid-Michigan and Michigan State communities. Our publishing focuses on messages of social justice, accessibility, diversity, and inclusion. We provide a space for diverse voices to publish and advocate for their work and engage with audiences they would otherwise have difficulty reaching. This poster, featuring The Cube's director, its graduate assistant, and its lead undergraduate web developer, will provide an overview of the work The Cube does, from brainstorming to final product, and show how we faced adversity and thought creatively in the wake of massive budget cuts to the humanities.
To access a PDF of the "We Are The Cube" poster, click here.
Description of the Poster
This poster is made using something similar to a mind map, with bubbles named “high-impact experiential learning,” “people,” “mentorship and community,” “projects,” “process,” and “skills.” Surrounding those bubbles are smaller bubbles with descriptions (described below).
We are The Cube.
Publishing - Process - Praxis
We are a publishing nexus that supports, promotes, and produces open-access work created by diverse members of the mid-Michigan community, focusing on messages of social justice, accessibility, diversity, and inclusion through high-impact experiential learning. We provide a space for diverse ranges of persons, places, and communities to publish and advocate for their work and to engage with audiences they would otherwise be unable to reach.
High-Impact Experiential Learning Circle:
Mentorship is key. Project proposals come to The Cube via our website; from there, we review projects and hire paid undergraduate and graduate interns to complete the work. At any given time, The Cube has between twelve and twenty interns, and our entire budget is dedicated to labor.
Throughout our processes, students are mentored by faculty members, encouraged to take risks and make mistakes, praised for their good work, and given credit for that work. For a full list of our mentors and interns, see our website: https://thecubemsu.com/.
Experiential learning programs allow students to take risks, make mistakes, and learn from those mistakes in a safe and supportive environment.
There are two goals. One is to learn the specifics of a particular subject, and the other is to learn about one’s own learning process.
Experiential learning works in four stages:
concrete learning,
reflective observation,
abstract conceptualization, and
active experimentation.
All of these are key for developing both hard and soft skills, which students will need to be ethical pioneers in their fields and in their communities.
Representative People Circle:
Catherine Davis, User Experience and Design Intern
Shelby Smith, Writing and Editing Intern
Grace Houdek, Graphic Design Intern
Jaclyn Krizanic, Social Media Intern
Jeanetta Mohlke-Hill, Editorial Assistant
Emily Lin, Lead UX Designer
Mitch Carr, Graduate Assistant and Project Coordinator
Kara Headly, Former Social Media Intern
Community & Mentorship Circle:
Dr. Kate Birdsall, Director
Dr. Alexandra Hidalgo, Editor-in-Chief
Dr. Marohang Limbu, Editor-in-Chief
The Writing Center at MSU
Writing, Rhetoric, and American Cultures (WRAC) at MSU
Projects Circle:
The Current, digital and print magazine
JOGLTEP, academic journal
Constellations, academic journal
Agnes Films, feminist film collective
The Red Cedar Review, literary journal
REO Town Reading Series Anthology, digital book
Superheroes Die in the Summer, digital book
Process Circle:
Brainstorming
Collaboration
Client Relations
Consistent Voice and Branding
UX Design and Engineering
Skills Circle:
Confidence
Editing and Writing Style Guides
Professional Development
Risk Analysis
Develop Professional Portfolio
Human Centered Design
Developmental and Copy Editing
Poster by: Dr. Kate Birdsall, Mitch Carr, and Emily Lin (Writing, Rhetoric, and American Cultures (WRAC) Department)
Authored by:
Kate Birdsall, Mitch Carr, Emily Lin

Posted on: #iteachmsu
PEDAGOGICAL DESIGN
MSU's own Ryan Sweeder joins 3DL4US Podcast
3DL4US Podcast
Join us as we explore the experiences and insights of the people working to improve the outcomes of their students as they make the transition to college, using three-dimensional learning for undergraduate science (3DL4US) as a language and framework to transform individual assessment items, the broader culture of STEM higher ed, and everything in between.
Michigan State's own Ryan Sweeder joins the fray, to discuss metaphorical mountain climbing, literal mushroom foraging, and hypothetical Jeopardy! appearances. Paul sheepishly asks the hard questions that have gone unanswered so far about 3DL's role in improving student experiences and outcomes. Ryan takes it all in stride and pretends not to notice when Becky starts speaking in "blah blah blah".
According to his bio with Lyman Briggs College, "Ryan Sweeder is a chemist in Lyman Briggs College at Michigan State University specializing in chemistry education research. He studies methods for increasing the learning in undergraduate general chemistry classes using out-of-class activities. He also runs the SPRING Scholars program, a program that helps students explore science career options and develop a professional network. Within his general chemistry classes, he brings his research to bear by providing students with lots of opportunities to engage with course content, apply it to real world scenarios, and gain frequent feedback on their level of understanding. Through these processes he shares his passion about understanding how molecular level interactions can be used to explain our everyday observations of materials, their properties, and how they interact."
Authored by:
Ryan Sweeder
Posted on: #iteachmsu
Comparative Analysis of Crowdmark and Gradescope
Executive Summary
This analysis presents a review and comparison of two instructional technologies for administering and digitally grading online and in-person assessments: Crowdmark and Gradescope. We tested both instructor and student workflows for creating, submitting, and grading assessments using Crowdmark and Gradescope integrated with a test course in D2L. Our evaluation criteria included ease of use, features available, accessibility, and flexibility. We found some key similarities:
Remote and in-person assessments are supported, with multiple question types.
Grading is done by question rather than by student for more consistency.
Multiple graders can grade assignments, such as co-instructors and teaching assistants.
Grades are synced automatically with the gradebook in D2L Brightspace.
The primary differences between these two are:
Crowdmark can assign assessments by section, and drag-and-drop functionality is available for rubric comments.
Crowdmark emails students when assessments become available and can accept more file types as well as rotate files more easily.
Gradescope allows for time extensions at the course level as well as for each assessment and allows for grading the assessments before the due date.
Based on these findings, we recommend continuing with Crowdmark, the more established and familiar tool. Although Gradescope includes some extra functionalities over Crowdmark, such as programming assessments, these functions are already handled by other tools or have not been used often or at all by faculty (e.g., CSE 231 Introduction to Programming uses Mimir for programming assignments). Crowdmark also offers fast grade sync with the D2L gradebook and the scanning and matching capabilities are more robust for in person assessments.
"The second-best way to grade exams" by ilmungo is licensed under CC BY-NC-SA 2.0
Methods
We tested both instructor and student workflows for creating and submitting assessments using Crowdmark and Gradescope integrated with a test course in D2L. Sample assignments were created for the remote assessments that included all of the available question types (i.e., upload file, enter text, multiple choice, etc.). Using separate accounts, we assigned the assessments as an instructor, submitted the assessments as a student, then returned to the instructor account to grade the assessments and sync the grades to our D2L test course.
Findings
Key Similarities:
Both Crowdmark and Gradescope offer keyboard shortcuts for faster grading; allow late submissions, group submissions, and enforced time limits; and allow for grading by question instead of by student, as well as multiple graders such as teaching assistants. Assignment submissions can include PDF or image upload, free response/short answer in a text box, or multiple choice/multi-select questions (with bubble sheets) for online assessments. For both tools, students can upload one PDF and then drag and drop each page to match each question for remote assessments, while instructors can scan and upload student submissions in batches for in-person assessments. Both tools will also attempt to split a batch PDF into individual student submissions.
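As an illustration of that batch-splitting step, here is a minimal sketch using the pypdf library, under the simplifying assumption that every exam in the scan has the same page count; the real tools also use QR codes or handwritten name fields to match pages to students.

from pypdf import PdfReader, PdfWriter

def split_batch_scan(batch_path, pages_per_exam):
    # Assumes the batch was scanned in order, one fixed-length exam
    # after another; writes exam_001.pdf, exam_002.pdf, ...
    reader = PdfReader(batch_path)
    for start in range(0, len(reader.pages), pages_per_exam):
        writer = PdfWriter()
        stop = min(start + pages_per_exam, len(reader.pages))
        for i in range(start, stop):
            writer.add_page(reader.pages[i])
        with open(f"exam_{start // pages_per_exam + 1:03d}.pdf", "wb") as f:
            writer.write(f)

# Hypothetical batch: 4-page exams scanned into one PDF.
split_batch_scan("batch_scan.pdf", pages_per_exam=4)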
Key Differences:
Accessing Tools
Students have to log in to Crowdmark through the Crowdmark website. This link can be added to D2L Brightspace and opened in a new, external web page. The Crowdmark sign-in prompts students to select their institution and then uses students’ Brightspace login. Gradescope can be added to D2L Brightspace as an External Tool in a D2L content module. This allows students to access Gradescope within D2L as an embedded website within the D2L page, instead of as an external page, and does not require any additional login.
Creating Assessments
When creating assessments in Crowdmark, instructors choose between administered (in person) assessments that instructors will upload or assigned (remote) assessments that students will upload (Figure 1). Administered assessments can include bubble sheets for multiple choice questions. Assigned remote assessments can include file upload, text entry responses, or multiple-choice questions (which are automatically graded).
When creating an assignment in Gradescope, the assignment type must be chosen first. Then, for the first three assignment types, the submission type is designated as either the instructor or the students (Figure 2). Although Exam/Quiz and Homework/Problem Set are offered as two different choices, they actually have the same options and essential functions. There are no further options if the instructor will be uploading the assessments, but other options are available if students will be uploading. Submissions can be variable length, where students submit any number of pages and indicate the pages where their question responses are, or fixed length where students submit work where answers are in fixed locations (like worksheets). Instructors can also allow students to view and download the assessment template if desired. Multiple choice assignments can be created with printable bubble sheets that either instructors or students can upload. Programming assignments are available, which Crowdmark does not support, and they can be automatically or manually graded.
Figure 1: Assessment types available in Crowdmark.
Figure 2: Assessment types available in Gradescope.
Both tools have the ability for students to take online quizzes. Both have multiple choice and multi select that are auto-graded, and both have free response and file upload that are NOT auto-graded. Gradescope supports short answer questions which are auto-graded, but Crowdmark only has free response questions.
For assignments that students will upload, instructors must input text or upload a document for each individual question in Crowdmark. It is possible for an instructor to upload one document in the instructions field which contains all of the assignment questions and then simply enter numbers in the text boxes for each question, rather than the text of each question. Gradescope only requires one document to be uploaded. Each question is then identified by dragging a box around each question area on the page and a question title must be entered.
Assigning & Distributing Assessments
For courses with several sections, Crowdmark allows assessments to be assigned to specific sections rather than the entire course. To approximate this feature in Gradescope, an instructor would have to create separate Gradescope courses or duplicate assignments and direct students to the appropriate version for their section.
Both tools allow instructors to set individual accommodations for each assignment to customize due date, lateness penalty, or time to complete. However, Gradescope also allows course-wide extensions for students, where extensions can be added for all assignments to customize time limits (multiply time by x or add x minutes) and due dates. Crowdmark requires accommodations to be made in the submission area for each assignment. It does not support course-wide accommodations.
When an assessment is assigned and released to students, Crowdmark sends a notification email to students, where Gradescope only sends an in-platform notification. Gradescope does send a confirmation email when students successfully submit an assignment. Both tools give instructors the option to send a notification email when returning student work.
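The extension arithmetic described above is simple to make concrete; a minimal sketch in Python, with hypothetical numbers:

def extended_time(base_minutes, multiplier=1.0, extra_minutes=0.0):
    # Apply a course- or assessment-level extension: multiply the base
    # time limit by a factor and/or add a fixed number of minutes.
    return base_minutes * multiplier + extra_minutes

# A student with a 1.5x accommodation on a 60-minute exam:
print(extended_time(60, multiplier=1.5))   # 90.0
# Everyone gets 20% more time for the remote environment:
print(extended_time(60, multiplier=1.2))   # 72.0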
Submitting Assessments
For in-person assessments, Crowdmark can include a QR code on assignments to ensure that every page of student work is correctly matched to the appropriate student for grading. The QR code can be manually scanned and matched to each student using an app as the assignment is turned in, or instructors can use automated matching (beta) to include a form field where students write their name and ID number for automated character recognition to identify the student and match them to that assignment’s QR code. Gradescope is developing a feature to create a unique label for each copy of an assignment and add that label to each page, but this is not currently available.
Submitted file types are more flexible in Crowdmark, which can support PDF, JPEG, PNG, and iPhone photos, any of which can be rotated after submission. Gradescope accepts only PDFs or JPEGs and only PDF pages can be rotated. This means that Crowdmark offers much more flexibility in scanning software and orientation. Gradescope does have a built-in PDF scanner for iOS devices to circumvent format issues and allow seamless upload. Both tools assume that image submissions are of work associated with a single question. All work can be scanned into a single PDF for upload and each page then manually associated with each question in the assignment. In both tools, the student selects which question(s) are associated with each page(s), where multiple questions may be on a single page or multiple pages may be associated with a single question.
Crowdmark allows for group submissions when either the instructor or the students scan and upload the assessments. This ability to match multiple students to one assessment allows for two-stage exams, collaborative lab reports, or other group assignments. Gradescope only allows group submissions when students scan and upload assessments, although online assignments also allow group submissions.
Grading Assessments
Assignments can be graded immediately after students have submitted them in Gradescope. Crowdmark does not allow grading to begin until the due date has passed. In Crowdmark, all feedback comments created for each question are stored in a comment library that can be reordered easily by dragging a comment to the desired location. There is no limit on the number of comments that can be dragged and dropped onto each student's submission. Crowdmark comments can have positive or negative points attached to them, but specifying points is not required. Gradescope does not allow dragging and dropping multiple comments; however, text annotations are saved for each question, and several can be applied to each submission. The separate rubric comments must be associated with positive or negative points for each question. The rubric type can be either negative scoring, where points are subtracted from 1.0, or positive scoring, where points are added to 0. Score bounds can also be set, with a maximum of 1.0 and a minimum of 0. While it is possible to select more than one rubric comment, only one comment can be added as part of a "submission specific adjustment," which can include an additional point adjustment. Crowdmark sends grades to D2L and automatically creates the grade item in the gradebook. Gradescope requires that the grade item be created first and associated with an assignment before grades can be sent.
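For readers who want the rubric math spelled out, here is a minimal sketch of the Gradescope-style scoring just described, assuming normalized per-question scores as the text states (negative scoring subtracts from 1.0, positive scoring adds to 0, with optional bounds of 0 and 1). The function name and signature are ours, not Gradescope's.

```python
def rubric_score(applied_items, rubric_type="negative", bounds=(0.0, 1.0)):
    """Sketch of the per-question scoring described above: negative
    scoring starts at 1.0 and rubric items subtract points; positive
    scoring starts at 0 and rubric items add points. Score bounds
    clamp the result (max of 1.0, min of 0)."""
    start = 1.0 if rubric_type == "negative" else 0.0
    raw = start + sum(applied_items)
    lower, upper = bounds
    return min(max(raw, lower), upper)

# Negative scoring: two deductions applied to a full-credit start of 1.0.
print(rubric_score([-0.25, -0.5], rubric_type="negative"))  # 0.25
# Positive scoring: partial credit accumulates from 0.
print(rubric_score([0.4, 0.4], rubric_type="positive"))     # 0.8
```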
Table 1: Feature Comparison between Crowdmark and Gradescope.

Accessing Tools
Crowdmark: Must be accessed through a separate website; sign in to Crowdmark via Brightspace.
Gradescope: Can be added as an External Tool to a D2L module and accessed within D2L (embedded website in the page).

Creating Assessments
Crowdmark: Upload a PDF and designate where questions are for administered assessments that instructors upload (drag the question number to its location on the page).
Gradescope: Upload a PDF and designate where questions are by dragging boxes on the page, for fixed-length exams/homework that students upload or administered exams/homework that instructors upload.
Crowdmark: Must input or upload individual questions manually when creating remote assessments that students upload (but the instructor can upload a PDF in the directions area and just enter Q1, Q2, etc. in the text boxes).
Gradescope: Must input question titles separately for variable-length submissions that students upload, but questions are designated by dragging a box over their location on the page (no need to enter the text of each question).

Assigning & Distributing Assessments
Crowdmark: Can assign assessments to a section rather than the entire course.
Gradescope: Cannot assign assessments to a section; must create a separate course or duplicate assignments and instruct students which one to submit.
Crowdmark: Add time for accommodations for each assessment only (customize due date, lateness penalty, or time to complete).
Gradescope: Add extensions at the course level and/or for each assessment (multiply time by x or add x minutes).
Crowdmark: Students always receive an email when new assignments are ready to be completed.
Gradescope: Students are not notified when new assignments are ready, but they do receive an email when they have submitted an assignment, and the instructor has the option to send an email once the assignment is graded.

Submitting Assessments
Crowdmark: QR codes on printed work for in-person administered assessments (can also use an app to match assessments to students when scanning).
Gradescope: Create printouts (beta) for in-person assessments; give each student a copy of the assignment with a unique label on each page (this tool is NOT yet available).
Crowdmark: iPhone photos supported; accepts PDF, JPG, or PNG (and can rotate any file) for remote assignments submitted by students.
Gradescope: iPhone photos not supported; accepts PDF or JPG only (can only rotate PDFs) for remote assignments submitted by students; multiple files and any file type accepted for online assignments.
Crowdmark: Allows group submissions whether students or instructors upload assessments (i.e., match multiple students to one assessment).
Gradescope: Allows group submissions only if students upload assessments, but also available for online assignments.

Grading Assessments
Crowdmark: Must wait until the due date to begin grading remote assessments.
Gradescope: Online assignments can be graded immediately.
Crowdmark: Drag and drop any number of comments from the comment library for each question.
Gradescope: Can apply one previously used comment per submission separate from the rubric; cannot select or drag and drop multiple comments, but can add multiple previously used text annotations for each question.
Crowdmark: Comments can have positive or negative points attached, but specifying points is not required.
Gradescope: Comments must have associated points (positive, negative, or 0) for each question; the rubric type can be changed from negative scoring (points subtracted from 1.0) to positive scoring (points added to 0), and score bounds can be enabled/disabled (max of 1.0, min of 0).
Crowdmark: Grades are sent to D2L automatically with no need to create a grade item first.
Gradescope: Grades are sent to D2L automatically, but the grade item must be created first.
MSU Usage Data
We explored the usage of each tool at MSU to determine if there was a perceptible trend towards one tool over the other. The total number of courses created in each tool is fairly similar (Table 2). Interestingly, the total number of students enrolled in those courses is much higher in Crowdmark, while the number of assessments administered is higher in Gradescope.
Table 2. Tool usage in courses with at least one student and at least one assessment.
Courses: Crowdmark 322, Gradescope 292
Students: Crowdmark 25,322, Gradescope 14,398
Assessments: Crowdmark 3,308, Gradescope 4,494
Crowdmark has been used by MSU instructors since 2016; Gradescope, since 2018. More courses were created in Crowdmark until the 2020 calendar year (Figure 3). Usage of both tools spiked in 2020, presumably due to the COVID-19-induced shift to remote teaching, and was fairly equivalent that year. For the Spring 2021 semester, more courses have been created in Gradescope. It will be interesting to observe whether this trend toward Gradescope continues as 2021 progresses or whether Crowdmark usage picks back up.

Given the disparity between the number of students and the number of classes and assessments, we explored the distribution of class sizes between the two tools (Figure 4). Both tools have been used for classes of all sizes, though the median class size is 37 for Gradescope and 63 for Crowdmark. We also explored the distribution of assessment counts between the tools (Figure 5). All but one course had 1-60 assessments created, with both tools most frequently having 2-20 assessments. Gradescope showed an interesting secondary peak of courses with 35-45 assessments. For neither tool do we have detailed information on what kinds of assessments were created, or on whether all of those assessments were actually used rather than created for practice or as duplicates (e.g., made available later, more accessible versions, or different versions for different class sections in Gradescope).
Figure 3. Number of courses created in each tool that had at least one student and at least one assessment for each calendar year since 2016.
Figure 4. Number of courses having a given class size and at least one assessment.
Figure 5. Number of classes having a given number of assessments and at least one student.
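As a rough illustration of how the Table 2 filter and the Figure 4 median could be computed from per-course usage records, here is a short pandas sketch over invented data; the column names are illustrative, not an actual Crowdmark or Gradescope export format.

```python
import pandas as pd

# Hypothetical per-course usage export; the values and column names are
# invented for illustration.
usage = pd.DataFrame({
    "tool": ["Crowdmark", "Crowdmark", "Gradescope", "Gradescope"],
    "students": [120, 45, 30, 40],
    "assessments": [8, 12, 15, 40],
})

# Keep only courses with at least one student and one assessment, as in
# Table 2, then summarize per tool (totals plus the Figure 4 median).
active = usage[(usage["students"] > 0) & (usage["assessments"] > 0)]
summary = active.groupby("tool").agg(
    courses=("students", "size"),
    students=("students", "sum"),
    assessments=("assessments", "sum"),
    median_class_size=("students", "median"),
)
print(summary)
```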
Discussion:
Our analysis showed significant functional overlap between Crowdmark and Gradescope, such that either tool could be chosen with little to no impact on instructor capability. However, there are a few advantages to the way that Crowdmark handles assignment tracking, submission, and grade syncing to D2L. In particular, Crowdmark already offers a fast QR-code method for matching every page of an in-person assessment to the appropriate student enrolled in the course when scanning assessments in batches. We expect this feature to become a strong asset in the Fall 2021 semester as more classes return to campus. If we were to choose between Crowdmark and Gradescope for continued support, we would recommend Crowdmark. Gradescope is a competitive technology, but it is still developing and refining capabilities that are already available through Crowdmark or D2L. An instructor who needs to switch from Gradescope to Crowdmark should refer to the D2L self-enroll course "MSU Tools and Technologies" for detailed information and resources on using Crowdmark at MSU and closely review Table 1 to understand the key differences they may encounter. The Assessment Services team and/or Instructional Technology & Development team in the IT department are also available for one-on-one consultation on using either technology (request a consultation via the MSU Help Desk).
Authored by:
Jennifer Wagner & Natalie Vandepol

Posted on: #iteachmsu
Comparative Analysis of Crowdmark and Gradescope
Tuesday, Aug 24, 2021
Posted on: #iteachmsu
PEDAGOGICAL DESIGN
Reimagining First-Year Writing for STEM Undergraduates as Inquiry-Based Learning in Science Studies
How can a first-year writing course help to create 21st-century STEM students with foundations for interdisciplinary inquiry? Could such a curriculum engage STEM students in knowledge production in ways that help to acculturate them as collaborative, ethical, and empathetic learners? Bringing together insights from writing pedagogy, work on critical science literacy, and science studies, this roundtable is hosted by the collaborative team leading an effort to rethink the first-year writing course required of all students at Lyman Briggs College, MSU's residential college for STEM students. A major goal of the curriculum redesign is to develop science studies-inspired writing assignments that foster reflective experiential learning about the nature of science. The purpose of this approach is not only to demonstrate the value of inquiry in science studies (history, philosophy, and sociology of science) to STEM students as they pursue their careers, but to foster diverse inclusion in science by demystifying key aspects of scientific culture and its hidden curriculum for membership. Following the guidance of critical pedagogy (e.g., bell hooks), we aim to use the context of first-year writing instruction as an opportunity for critical reflection and empowerment. The roundtable describes how the instructional team designed the first-year curriculum and adapted it to teaching online during the pandemic, and shares data on lessons learned by both the instructor team and our students. We invite participants to think with us as we continue to iteratively develop and assess the curriculum. To access a PDF version of the "Reimagining First-Year Writing for STEM Undergraduates as Inquiry-Based Learning in Science Studies" poster, click here. Description of Poster:
Reimagining First-Year Writing for STEM Undergraduates as Inquiry-Based Learning in Science Studies
Marisa Brandt, HPS Lyman Briggs College & June Oh, English
Project Overview: Reimagining LB 133
Lyman Briggs College aims to provide a high-quality science education to diverse students by teaching science in social, human, and global contexts. LB 133: Science & Culture fulfills the Tier 1 writing requirement for 80-85% of LBC students. Starting in F19, we implemented a new, collaboratively developed and taught cohort model of the LB 133 curriculum in order to take advantage of the opportunity to foster a community of inquiry, inclusion, and curiosity.
First year college writing and literacy courses aim to give students skills to communicate and evaluate information in their own fields and beyond. While teaching important writing skills, LB 133 focuses on developing students’ science literacy by encouraging them to enact a subject position of a socially engaged science professional in training. LB 133 was designed based on ideas of HPS.
History, Philosophy, and Sociology (HPS) or "science studies" is an interdisciplinary field that studies science in context, often extended to include medicine, technology, and other sites of knowledge-production. LB 133 centers inquiry into relations of science and culture. One way HPS can help students succeed in STEM is by fostering inclusion. In LB 133, this occurs by demystifying scientific culture and its hidden curriculum through authentic, project-based inquiry.
Like WRAC 110, LB 133 is organized around five writing projects. Each project entails a method of inquiry into science as a social, human practice and teaches students to write first as a form of sense-making about their data (Column 2). Then, students develop writing projects to communicate what they have learned to non-scientific audiences.
Research Questions:
1. How did their conceptions of science change?
2. Did their writing improve?
3. What did they see as the most important ideas and skills they would take from the course?
4. Did they want more HPS at LBC?
Data Collection:
1. Analysis of the beginning- and end-of-course Personal Writing assessments.
2. End-of-term survey.
3. Answers to course reflection questions.
Selected Results: See Column 3.
Conclusions: The new model seems successful! Students reported finding LB 133 surprisingly enjoyable and educational, for many reasons. Many felt motivated to write about science specifically and saw communication as a valuable scientific skill. Most felt their writing improved and reported learning more than anticipated. Most learned and valued key HPS concepts, wanted to learn more about diversity in scientific cultures, and wanted to continue their HPS education in LBC to do so.
Column 2 - Course Structure: Science & Culture
Assessment
Science Studies Content Learning Goals
Literacy & Writing Skills Learning Goals
Part 1 - Cultures of Science
Personal Writing 1: Personal Statement [STEM Ed Op-ed]. Short-form writing from a scientific subject position.
Reflect on evolving identity, role, and responsibilities in scientific culture.
Diagnostic for answering questions, supporting a claim, providing evidence, structure, and clear writing.
Scientific Sites Portfolio. Collaborative investigation of how a local lab produces knowledge.
Understand scientific practice, reasoning, and communication in its diverse social, material, and cultural contexts. Demystify labs and humanize scientists.
Making observational field notes. Reading scientific papers.
Peer review. Claim, evidence, reasoning. Writing analytical essays based on observation.
Part 2 - Science in Culture
Unpacking a Fact Poster
Partner project assessing validity of a public scientific claim.
Understand the mediation of science and how to evaluate scientific claims. Identify popular conceptions of science and contrast these with scientists’ practices.
Following sources upstream. Comparing sources.
APA citation style.
Visual display of info on a poster.
Perspectives Portfolio. Collaborative investigation of a debate concerning science in Michigan.
Identify and analyze how diverse stakeholders are included in and/or excluded from science. Recognize the value of diverse perspectives.
Find, use, and correctly cite primary and scholarly secondary sources from different stakeholder perspectives.
Learn to communicate to a broader audience on an online platform.
Personal Writing 2: Letter + PS Revision. Sharing a course takeaway with someone.
Reflect again on evolving identity, role, and responsibilities in scientific culture.
Final assessment of answering questions, supporting a claim, providing evidence, structure, and clear writing.
Weekly Formative Assessments
Discussion Activities: Pre-meeting writing about the readings
Reflect on prompted aspects of science and culture
Writing as critical inquiry.
Note-taking.
Preparation for discussion.
Curiosity Colloquium responses
200 words reflecting on weekly speaker series
Exposure to college, campus, and academic guests—including diverse science professionals— who share their curiosity and career story.
Writing as reflection on presentations and their personal value.
Some presenters share research and writing skills.
Column 3 - Results
Results from Personal Writing
Fall 19: The op-ed assignments largely addressed six themes. The majority of students chose to discuss the value of science in terms of its ubiquity, its problem-solving and critical-thinking skills, and the way it prompts technological innovation.
Fall 21: Students largely focused on 1. the nature of science as a product of human labor and research embedded in many cultural issues, and 2. science as communication and how scientists can gain public trust (e.g., transparency, collaboration, sharing failure).
F19 & S20 Selected Survey Results
108 students responding. The full report is available here.
92.5% reported their overall college writing skills improved somewhat or a lot.
76% reported their writing skills improved somewhat or a lot more than they expected.
89% reported planning to stay in LBC.
Selected Course Reflection Comments
The most impactful things students report learning at end of semester.
Science and Culture: Quotes: “how scientific knowledge is produced” “science is inherently social” “how different perspectives . . . impact science” “writing is integral to the scientific community as a method of sharing and documenting scientific research and discoveries”
Writing: Quotes: “a thesis must be specific and debatable” “claim, evidence, and reasoning” “it takes a long time to perfect.” Frequently mentioned skills: Thesis, research skill (citation, finding articles and proper sources), argument (evidence), structure and organization skills, writing as a (often long and arduous) process, using a mentor text, confidence.
What do you want to learn more about after this course?
“How culture(s) and science coexist, and . . . how different cultures view science”
“Gender and minority disparities in STEM” “minority groups in science and how their cultures impact how they conduct science” “different cultures in science instead of just the United States” “how to write scientific essays”
Authored by:
Marisa Brandt & June Oh

Posted on: #iteachmsu
Reimagining First-Year Writing for STEM Undergraduates as Inquiry-Based Learning in Science Studies
PEDAGOGICAL DESIGN
Thursday, May 6, 2021
Posted on: #iteachmsu
NAVIGATING CONTEXT
If you were waiting for the time, it's here: Thank an Educator
November is here, and with this time of year we often see an increase in messaging around gratitude, appreciation, and giving thanks. Gratitude is something I've always found great value in and have touted the anecdotal benefits of. In 2015, I wrote 'Tis the season of giving thanks: Why gratitude is important in leadership for MSU Extension. Then later, in 2018, I founded MSU's Thank an Educator Initiative. I saw the invaluable work that people across roles were doing to support students and MSU's teaching and learning goals. Not only did I see important work, I saw educators making huge impacts on learners' lives and experiences. Simultaneously, I noticed the sheer size of (and, let's face it, siloing at) MSU as huge barriers to a) educators being celebrated for their work, and b) educators being able to learn with and from one another. So I started the "Thank an Educator" initiative.
Thanking an educator is super simple. Any Spartan can visit the Thank an Educator page on the #iteachmsu Commons. There, folx will see a brief form where they enter information on the educator they'd like to thank and a short story/sentiment of thanks. That's it! #iteachmsu does the rest. Every person who is recognized receives a personalized message via email thanking them for their important work (the submitted story is included here). Then, at the end of the academic year, all of the educators submitted for Thank an Educator are also recognized by the Provost with an #iteachmsu Educator Award. Since its inception, the Thank an Educator initiative has recognized educators over 550 times! We care about and are committed to celebrating and elevating the work of educators, and we know that these efforts make an impact.
In January of 2020, when my son was born, I stopped working in person as a Graduate Assistant on the #iteachmsu Commons. During this parental leave I also moved to the west side of Michigan. The plan was to be remote for the remainder of my GA contract after returning from leave in March. Little did I know, I wouldn't be alone. I returned to work (and continued as a Postdoc and now Academic Specialist) to meet all my colleagues online! Then reality hit (and kept throwing punches). I couldn't access daycare for my infant because the centers were shut down. My partner's business, the one we moved for, also shut down. My family unit's makeup and health history made us high risk for infection, so ultimately we were first-time parents, in a new place, in a vacuum. The isolation was terrible, and both my partner and I struggled with the impacts of anxiety and depression. During this same time, I watched as colleagues and fellow educators at Michigan State (while dealing with many, if not all and more, of the same challenges as me) rose to the occasion. Instructors switched to teaching online. Advisors innovated the ways they held appointments. Graduate students began co-working virtually via Zoom. Administrators made extra efforts to transparently share the goings-on of the university in personal ways that built community. New programs and trainings were created to support educators. Events were hosted completely online. In the 13+ years I've been at Michigan State, I don't think I've ever seen a more glaring example of "Who will? Spartans Will."
We're still "in it." The circumstances have continued to change, but educators are constantly being kept on their toes, challenged to dodge, dive, and duck around barriers, all the while still supporting student success, still serving the teaching and learning mission of the university, and still prioritizing health and safety. I've observed the toll this constantly changing, uncertain, and sometimes downright scary time has taken on me and my colleagues. People seem to be yearning for personal connection, time to really see one another, but packed schedules and increasing demands on capacity make it feel challenging to take that time.
Now, maybe even more than ever, you all, MSU's educators, deserve to be recognized for the phenomenal work you continue to do despite extremely challenging circumstances. I know time is in short supply. I know people are burnt out. But please, submit someone to Thank an Educator. The process takes only a handful of minutes (I timed myself, and it literally took me 5 minutes) but makes a huge impact. Every single day, I interact with individuals who are doing high-impact work to support MSU's teaching and learning, student success, and outreach mission. I'd guess you do too. Thank them.
If you need even more convincing, consider the research on practicing gratitude:
Emmons and McCullough (2003) showed that counting your blessings seems to be a much more effective way of enhancing your quality of life than counting your burdens.
Bartlett and DeSteno (2006) found that small acts of gratitude can cause ripple effects that reach farther than you would imagine.
Sheldon and Lyubomirsky (2006) found that the regular practice of gratitude and/or positive visualization can lead to a higher quality of life, as measured by affect.
Looking for even more? Check out the Greater Good Science Center (UC Berkeley) and all their tools, resources, research, and more!
Sources:
Emmons, R. A., & McCullough, M. E. (2003). Counting blessings versus burdens: An experimental investigation of gratitude and subjective well-being in daily life. Journal of Personality and Social Psychology, 84, 377-389.
Bartlett, M., & DeSteno, D. (2006). Gratitude and prosocial behavior: Helping when it costs you. Psychological Science, 17, 319-325.
Sheldon, K. M., & Lyubomirsky, S. (2006). How to increase and sustain positive emotion: The effects of expressing gratitude and visualizing best possible selves. The Journal of Positive Psychology, 1(2), 73-82.
Authored by:
Makena Neal

Posted on: #iteachmsu
If you were waiting for the time, it's here: Thank an Educator
NAVIGATING CONTEXT
Monday, Nov 7, 2022