
Posted on: #iteachmsu
Wednesday, Aug 26, 2020
An Exemplary Educator: Frances Kaneene
Note: Statements bolded below are excerpts from a nomination letter submitted on Frances' behalf for a 2019 #iteachmsu Educator Award, which she received. 
 
I have just begun my second full-time job at MSU, and after completing all my degrees here, I consider myself a SPARTAN LIFER. In the past 12 years, I have met many of MSU's finest educators. (What luck!) I feel compelled to share now because one of MSU's most wonderful educators is now deceased. 
 
I first met Frances Kaneene as an undeclared undergrad from a very small town. Neither of my parents went to a four-year institution, and I felt lost at MSU. That is, until I met Frances. While perusing MSU's extensive list of majors, overwhelmed with possibilities and my seeming lack of direction, I got excited about a [then] specialization in Connected Learning that was housed in the Bailey Scholars Program, in the College of Agriculture and Natural Resources. I made an appointment with a random CANR advisor to learn more, and that random advisor happened to be Frances. 
 
I brought my mother with me to that meeting (in the summer), and when we walked into Frances' office, I immediately felt at home. She was so empathetic to the experiences I shared about my time on campus, and she not only connected me (in real time, via phone) with the Director of BSP, but through our conversation also helped me pick a major! I joined the [now] Community Sustainability Studies program (formerly Community Agriculture Recreation and Resource Studies) to complete a B.S. in Environmental Studies and Agriscience. Frances was so kind and considerate of the experiences I was having on campus. She helped me feel like more than a number, going out of her way to share "insider" tips for navigating campus and finding a sense of place.
 
When I told Frances that I wanted to apply to graduate school and thus also wanted to finish my degree in three years, she helped me process my motivations and goals. I knew I'd be more likely to receive funding in the fall semester of a program rather than the spring, and my trajectory had me finishing in December. After many genuine and whole-hearted conversations, Frances helped me think critically about my program plan, problem solve, and design a series of learning experiences that were both interesting AND met all my program requirements. I finished my bachelor's in three years and stayed in the CARRS program for my M.S., and I loved that I had Frances [then] as a mentor who kept up with what I was doing and where I was trying to go. 
 
Frances connected me with people on campus. She supported my goals and helped me maintain my curiosity throughout my MSU experience. After I finished my master's degree, I stayed on at MSU as an Extension educator, and Frances and I became work-friends. Over lunch, she encouraged me to pursue my PhD. I remember her saying "academia needs more people like you," and oh how impactful that comment has been.
 
This summer, I successfully defended my doctoral dissertation from the College of Education's Higher, Adult and Lifelong Education program. I'm now working as a postdoc academic specialist at MSU. So much has changed since I first met Frances. Without her, I might be working towards a PhD now (maybe), but it's also very possible that I would have transferred out of MSU to a smaller institution. Frances is the reason I stayed a Spartan. As an academic advisor, mentor, and colleague - she was ALWAYS an educator to me... and one of MSU's very best at that. 
 
https://www.dignitymemorial.com/obituaries/east-lansing-mi/frances-kaneene-9301208
 
Authored by: Makena Neal
Posted on: IT - Educational Technology
Thursday, Jun 27, 2024
Fall 2024 Educational Technology Student Workshops
MSU IT presents a series of Zoom webinars helping new students utilize the educational technology tools encountered in classes at MSU. The following sessions cover D2L Brightspace, Zoom, Microsoft Office 365, using generative AI as a college student, and more! 
Navigating Your MSU Courses: D2L Brightspace Basics for Students 
August 13, 2024, 11 a.m. - 12 p.m., Virtual 
Presented by Rhonda Kessling and Sarah Freye, Instructional Technology and Development, MSU IT 
Register on Zoom 
Join us for an engaging workshop designed especially for students who are new to D2L Brightspace, our primary platform for digital learning. Master the essential skills you will need to succeed in your classes at MSU. MSU IT staff will equip you with a thorough understanding of D2L Brightspace to set you up for success. We look forward to seeing you there!  
MSU Tech Essentials: A Guide to Course Technology Tools for Students 
August 14, 2024, 1 p.m. - 2 p.m., Virtual 
Presented by Lisa Batchelder and Sarah Freye, Instructional Technology and Development, MSU IT 
Register on Zoom 
This informational webinar will introduce and demonstrate technology commonly used in classes at MSU, including D2L Brightspace, Microsoft Office 365, Google Apps, Kaltura MediaSpace, and more. We will also share resources for quick and easy access to tech support for students. 
 
AI: A Student’s Guide 
August 15, 2024, 11 a.m. - 12 p.m., Virtual 
Presented by Sarah Freye and Rhonda Kessling, Instructional Technology and Development, MSU IT 
Register on Zoom 
This webinar will empower students to use generative AI effectively and responsibly during their college careers. We will explore how AI can be a partner in various academic tasks for planning, collaboration, writing, and editing. We will also cover how using AI fits into university expectations about academic integrity, along with examples of how instructors include AI policies in their syllabi. This virtual event will end with a question-and-answer session. Join us to unlock the potential of AI to elevate your academic experience! 
Posted by: Lindsay Tigue
Posted on: #iteachmsu
Tuesday, Oct 20, 2020
Labster: The virtual science lab simulation library
What is Labster? 
Labster is a simulation library that provides over 150 virtual science labs. Instructors can choose specific simulations to download and install within their D2L course free of charge. Simulations consist of an interactive lab activity in which students learn and apply their understanding of important STEM topics within the context of a virtual lab. Quiz questions are interspersed throughout many of the simulations to help students link their understanding of theory to the lab activity they are performing. 
 
Who might consider using Labster? 
If you are a STEM instructor, Labster could be a useful tool to enhance teaching and learning within your science course. These virtual labs can be used as standalone units, or they can be used as supplemental activities to reinforce important concepts or lab techniques. The Labster simulation catalog includes lab experiences for 21 course packages such as Chemistry, Physics, Food Science and Nutrition, Engineering, Biology, Safety, Biotechnology, and others. If you would like to try out Labster, please contact the MSU IT Service Desk at ithelp@msu.edu to get access to the Labster simulations.  
 
Why use Labster? 
Well-designed lab experiences are a key part of student learning within STEM fields. While virtual labs can never fully replicate real-world lab experiences, simulation-based labs can enhance student learning by providing important opportunities for students to put their knowledge to use and engage in important scientific practices (e.g., developing models, constructing explanations, and arguing from data and evidence). For example, simulation-based learning environments have been shown to improve student learning, motivation, and self-efficacy within medical education (Makransky et al., 2016).  
 
How to use Labster? 
Simulations are uploaded onto D2L and students can access virtual labs directly through the simulation link within D2L. Instructors can impersonate the Demo Student within their D2L course and complete the simulation to get the same experience that their students would have when engaging in the virtual lab. To promote student exploration and engagement with the simulation (as opposed to students focusing their attention on the correctness of their answers to quiz questions), we recommend using the simulation quizzes as a low-stakes formative assessment. 
 
Where to find out more information on Labster? 
If you would like to find out more information about Labster, please see the links below: 

General information about Labster
List of available Labster simulations 
Labster resources page 
Labster help page 

Stories/Feedback 
We would love to hear from you about your experiences with Labster. How was the installation process? How are you using the Labster quizzes in your class? What are your students’ perceptions of Labster and their experiences with the simulations? If you would like to share some of your Labster stories (frustrations, joys, surprises) or need more information about Labster, contact the MSU IT Service Desk at ithelp@msu.edu. 
 
References 
Makransky, G., Bonde, M. T., Wulff, J. S., et al. (2016). Simulation based virtual learning environment in medical genetics counseling: An example of bridging the gap between theory and practice in medical education. BMC Medical Education, 16, 98. https://doi.org/10.1186/s12909-016-0620-6 
Authored by: Chiwimbo P. Mwika and Christopher J. Minter
Posted on: #iteachmsu
Monday, Nov 15, 2021
Q & A with Garth Sabo & Stokes Schwartz: MSU Learning Community and #iteachmsu Group co-facilitators
This week, we wanted to highlight Stokes Schwartz and Garth Sabo, both educators with the MSU Center for Integrative Studies in the Arts and Humanities. Stokes and Garth are also the co-facilitators of the “Reading Group for Student Engagement and Success” Learning Community this academic year! According to the Office for Faculty and Academic Staff Development (formerly AAN), this Learning Community is "a Zoom-based reading group that pairs theory and praxis of student engagement techniques to drive greater student success in general education and prerequisite courses at the university." Bimonthly meetings (twice a month) consist of reading and discussing 2-3 recent articles and sharing best practices for applying methods in courses across the university. These two also use a group on the #iteachmsu Commons to share information about upcoming meetings, attach reading files, and continue to engage in asynchronous dialogue outside their meeting times!
Read more about these Learning Community co-facilitators’ perspectives below. #iteachmsu's questions are bolded below, followed by their responses! 
Q & A with Garth Sabo & Stokes Schwartz

You are facilitators of a Learning Community (LC) and decided to have a group on iteach.msu.edu for that LC. What about the #iteachmsu Commons appealed to you for this group?
Sabo: A major component of the LC structure at MSU focuses on providing some element of public dissemination of the work we do together, and Stokes and I both appreciated that #iteachmsu would allow us to make our group activities visible to the wider MSU community. We both felt a strong need for some type of digital meeting space/repository for things like meeting notes, agendas, etc., and we found that iteach.msu.edu offered a suite of those tools that were fairly easy to wrap our heads around and adopt as practice. 
Schwartz: Having a central place for learning community members (and interested parties) to check in, share our thoughts, relevant documents, and planned talking points for meetings/discussions, as well as any follow-up observations in the days following a meeting. Personally, I have found iteach.msu.edu relatively easy to use.  
Sabo: Our LC meets digitally, and we also thought that it would be nice to structure things in a way that leaned into benefits of that structure rather than simply trying to imagine ourselves as an in-person community that only meets via Zoom, so we’ve also tried to use iteach.msu.edu as a platform for ongoing and supplemental conversations to the discussions that come up during our scheduled meetings.
Tell me more about your LC and what activity in your #iteachmsu Group looks like? (This can include, but not limited to goals, topics, general overview of membership, the kinds of things being shared in your group.)
Sabo: Our learning community is titled “Reading Group for Student Engagement and Success,” and the only thing I don’t like about it is the name. Stokes and I are both faculty in the Center for Integrative Studies in the Arts and Humanities, and part of the impetus for the group was a desire to dig deeper into pedagogy research that might help us crack the egg of engaging students in a required course. We wanted to find a format that allowed us to have pedagogy conversations that were data-driven and practical in focus, so that our community members could feel like our conversations were driving towards concrete actions.
Our Zoom meetings focus on talking through a few pre-designated texts that the entire group reads. We’ve been fortunate that our current roster has also agreed to take turns as interlocutors, with one person briefly presenting on some additional text(s) that add context to the material we all consumed.
Schwartz: Typically, Garth and I plan 8-10 multipart discussion questions for our meetings on fostering student engagement and success, which we share via iteach.msu.edu a few days ahead of time. Community members have also shared information and related ideas via our iTeach group. We are also in the process of compiling a playlist.  
Sabo: Our iteach activity tends to be kind of evenly split between looking back at what we’ve already done with logistical stuff (like meeting agendas, Zoom links, etc.) and what we might do (like additional discussions or resources that members post or comment on in the Feed). Our current membership is a great mix of folks across the College of Arts and Letters in a range of roles, which has enriched our conversation in ways that I couldn’t have anticipated. We’re certainly open to having more folks join us if interested!
Schwartz: We have had four meetings so far and have our fifth coming up on December 3, 2021 from 10-11:30 EST and all are welcome! Please note, if you're interested in joining the 12/3/21 meeting, please reach out to either co-facilitator via email or on iteach.msu.edu, as the readings we'll be discussing are from a hardcopy book we've procured!
What has been a highlight of this semester for your LC and what are you looking forward to next semester?
Sabo: I love talking about teaching with people who love teaching, and I really feel like that has been the tone of our conversations all semester. I’ve been very appreciative of the fact that our group has been able to talk about the challenges of teaching while still being productive and hopeful about what good teaching can do for our students and the world. One thing that has certainly helped is the spirit of collaboration that’s run through this group since its beginning. Stokes and I have had a good rapport ever since the planning stages of this community, and that has continued as we’ve gone from proposing the group to actually planning its meetings. As you might be able to tell from how long my answers to these questions are, I tend to be wordy and big-picture in my focus, and Stokes does a great job of bringing things back around to ask, “Okay, but what would that actually look like?” in a way that has helped our conversations find a great balance between macro and micro issues of engagement and student success. Our members have been great about thinking and sharing proactively as well.
Schwartz: The highlight?  Two actually.  First, working with my co-facilitator Garth.  We seem to have established an effective working relationship and bat our ideas-plans back and forth until they take solid shape.  It has been fun sharing our ideas, developing our respective parts, coming back to the figurative table for another round of mashup, and then seeing what the final results are before the day of an actual meeting.  Second, the knowledge and personalities of our learning community members, all of whom bring interesting experience and perspectives to our meetings.  Thus far, I have really enjoyed the experience.  It has been like grad school in the best way possible (without the egos and constant stress). 
Sabo: Just to peek behind the curtain a little bit,  next semester we’re pivoting slightly to frame our conversations with the goal of producing tangible results of our collaboration, whether that be conference presentations, publications, or something else entirely. I’d love to see a step on that road being a bigger focus on producing material that we might share to the wider iteach community via the Articles feature.
Schwartz: Looking ahead, I am excited to continue working with our community in the new year and possibly develop a panel or presentation on concrete things we might do to engage our students in the general education or lower division prerequisite "classroom" (F2F or online) more effectively. Beyond that, I am already mulling over ideas for proposing another similar learning community for the 2022-2023 AY. The cross-pollination possibilities brought about by learning communities like these are fantastic and a good way to break out of our various silos here at MSU.
If you are interested in learning more about this year’s Learning Communities at MSU, you can see the full list here. If reading this story piqued your interest in #iteachmsu Groups, you can view all the current groups here. Looking for a group on a particular topic or practice, but don’t see one? Start it! Any MSU user can create a group; just log in to iteach.msu.edu with your MSU NetID to get started. Easy-to-follow instructions for starting a group are here. 
Posted by: Makena Neal
Posted on: The MSU Graduate Leadership Institute
Thursday, Sep 29, 2022
Natural Science Leadership Fellows
Leadership Fellows

2017-2020: Alex Wright
2020-2021: Acacia Ackles & Lindsay Williams
2021-2022: Katie Westby & Samridhi Satija
2021-2023: Daniel Puentes

Alex Wright (2017-2020)
Alex Wright leveraged his position as the first Natural Science Leadership Fellow to build lasting relationships within the college and inform guidelines and feedback mechanisms for peer-mentoring. Working closely with Dean Phillip M. Duxbury, Alex authored, promoted, and helped implement strategies from his Peer Mentoring Program Guide. Alex ended his three-year fellowship in 2020 by working with stakeholders within the College to develop feedback mechanisms to measure and improve faculty mentoring and departmental graduate program policies. His successes helped secure funding for an additional Natural Science Leadership Fellow the next year, and we were able to welcome Acacia Ackles and Lindsay Williams as Fellows for 2020-2021.
Acacia Ackles (2020-2021)
One of two 2020-2021 Natural Science Fellows, Acacia Ackles worked with a network of instructors across multiple institutions to draft an article on creating more equitable graduate admissions in the biological sciences. Their team drafted an article for submission to the American Naturalist's call for special section papers on Nature, Data, and Power. Though the article was rejected, Acacia's team is currently seeking a home for the proposal and the article itself. The proposal pulled together existing literature on equitable admissions, as well as expertise from faculty working across public, regional, and private institutions, from R1 to primarily teaching institutions. The article aims to create a guide for Michigan State University and other biological science departments to assess and revise their admissions processes.

Lindsay Williams (2020-2021)
Lindsay Williams used her Fellowship to improve advertising and availability of career services and information on alternative careers for graduate students within the College of Natural Science. Lindsay began creating a guide to alternative career pathways for graduate students and working with Elizabeth Averkiadi, a Communication Arts and Sciences Ph.D. student, to create an undergraduate workshop titled "Ways of Researching" to introduce undergraduates to research experiences outside of academia.

Katie Westby (2021-2022)
Katie, drawing on over a decade of professional teaching experience, decided to address the unique accessibility needs within math education. Working closely with two faculty members in her department, Katie gained institutional support and funding to create a professional development workshop series for mathematics Graduate Teaching Assistants about compliance and inclusion from an anti-deficit, anti-ableist perspective. The goals of her workshops were to increase knowledge among GTAs about accommodations, understanding disability through a sociopolitical lens, and becoming comfortable creating a more inclusive learning environment. The workshop also helped participants write diversity and teaching statements.  
Samridhi Satija (2021-2022)
Samridhi decided to focus on supporting graduate students who identify as women. She collaborated with Heather Shea, the director of the Women*s Student Services office, to host the SmartStart program, a program designed to help women with salary negotiations in professional settings. Salary negotiation skills are one way to address the wage gap that exists across genders.

Daniel Puentes (2021-2023)
Daniel's project focused on advocating for guaranteed transitional funding for graduate students who are forced to move to new labs or assistantships due to issues with their supervisor. Graduate students are financially vulnerable, especially when their funding is tied to a particular advisor or lab, and this funding would provide a safety net and financial security so students can prioritize their wellbeing. He created a survey to highlight the need for this type of emergency funding and is in the process of submitting a proposal to the Graduate School based on benchmarking of similar programs at other institutions.
Posted by: Emma Dodd
Posted on: #iteachmsu
Friday, Apr 5, 2024
Feedback Fun
April is National Poetry Month (https://poets.org/national-poetry-month), and you want to have some fun with AI. Consider the written feedback that you often provide your students. At times it can seem very sterile and institutional. Even when you offer valuable insights and points, it often gets overlooked because it does not seem genuine or sincere. Enter AI: you could take your typical feedback text and see if AI can make it more 'memorable'. Here is an example based on a suggestion from Chrissi Nerantzi of the University of Leeds (as presented in: Nerantzi, C., Abegglen, S., Karatsiori, M., & Martinez-Arboleda, A. (Eds.) (2023). 101 Creative Ideas to Use AI in Education. A collection curated by #creativeHE. Graphic design by Bushra Hashim. CC BY-NC-SA 4.0.)
 
Original Feedback Text
Below is an example, found on the internet, of feedback given to a student:
I’m impressed with your ability to write concisely. Not only did you fulfill the assignment, you also wrote a long-ish intro and answered questions that went beyond the prompt. Doing so within the word limit and doing it well deserves recognition. One result is that outside of the opening paragraph the entire essay is focused exclusively on the arguments – there’s no excess fat in this essay. Given the nature of the assignment, that’s great. As noted throughout, however, some of your specific arguments need developed – your paper would’ve been better served had you eliminated one of the arguments in order to better develop the others along the lines mentioned in my comments above. Overall, it’s clear that you understand each of the philosophers you address and you present interesting ideas.
(Feedback example retrieved from https://blogs.commons.georgetown.edu/jco34/sample-assignments/examples-of-feedback-on-student-writing/)
 
Using the above text with a generative AI tool (perplexity.ai), the following prompts were given and the resultant outputs were received: 
 
AI Prompt: Compose the above feedback in the form of a poem.
Feedback Poem
Your concision impresses, a skill so refined,
Not just the task fulfilled, but more you did find.
An intro that's lengthy, yet focused throughout,
No excess, just arguments, that's what it's about.

For an assignment like this, that's exactly what's needed,
Your understanding shines, your ideas are well-seeded.
But some arguments lack the development they need,
Eliminating one could make the others succeed.

As noted before, your grasp of each thinker is clear,
The ideas you present, they're certainly sincere.
Yet honing those points, refining with care,
Could elevate this work, make it beyond compare.

So take this feedback, let it guide your next steps,
Refine and expand, reach new intellectual depths.
Your potential is evident, your skills are so strong,
Keep pushing forward, and you can't go wrong.
 
AI Prompt: Compose the above feedback in the form of a limerick
Limerick Feedback
There once was a writer so clever,
Whose arguments were quite the endeavor.
They built in objections, then hit 'em back strong,
But clarity sometimes went wrong.
With the writing center's help, they'd do even better.

The opening needed more clarity upfront,
Revealing the main point, no need to be coy.
And Sen's comments on comparing without an ideal,
Were points that the writer should not have concealed.
Overall the ideas were a true joy.

So take this feedback, don't let it annoy,
Use it to sharpen, refine, and deploy
An essay that's tighter, more focused, more clear.
The potential is there, have no fear!
Keep working, keep growing, your skills to employ.
 
AI Prompt: Produce the feedback as rap lyrics
Rap Lyrics Feedback
Yo, listen up, I got some words for you,
Your writing's on point, but there's more to do.
You built in those objections, hit 'em back with your might,
But the clarity's lacking, gotta shine that light.
 
Second language, I know it ain't easy,
But hit up that writing center, make your skills feel breezy.
That opening, it needs to be more clear,
Don't hold back the main point, let it ring loud and near.
 
And yo, you missed Sen's comments, that's a big deal,
Comparing without an ideal, that's how you seal.
But overall, the ideas, they're a true joy to behold,
Gotta tighten it up, make that story unfold.
 
So take this feedback, don't let it annoy,
Use it to sharpen, refine, and deploy
An essay that's tight, focused, and oh so clear,
The potential is there, have no fear!
Keep working, keep growing, your skills to employ.
 
Note: The above examples do not violate privacy as there are no identifiers for the recipient and no grade associated with the feedback.
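 
If you want to restyle feedback for many students at once, the same prompts can be sent through an API rather than a chat window. Below is a minimal, illustrative Python sketch assuming access to an OpenAI-compatible chat-completions endpoint; the base URL, environment variables, and model name are placeholders, not a specific recommendation (the example above used perplexity.ai through its website):

```python
import os
from openai import OpenAI  # assumes the `openai` Python package is installed

# Hypothetical configuration: point base_url at any OpenAI-compatible
# chat-completions endpoint and supply your own API key.
client = OpenAI(
    base_url=os.environ.get("AI_BASE_URL", "https://api.example.com/v1"),
    api_key=os.environ["AI_API_KEY"],
)

def restyle_feedback(feedback_text, style="a poem"):
    """Ask the model to rewrite instructor feedback in the given style."""
    response = client.chat.completions.create(
        model="your-model-name",  # placeholder; use your provider's model
        messages=[{
            "role": "user",
            "content": f"Compose the following feedback in the form of {style}:\n\n{feedback_text}",
        }],
    )
    return response.choices[0].message.content

# As the note above advises, strip identifiers and grades from feedback
# before sending it to any third-party service.
original = "I'm impressed with your ability to write concisely. ..."
for style in ("a poem", "a limerick", "rap lyrics"):
    print(f"--- {style} ---")
    print(restyle_feedback(original, style))
```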
Authored by: Jay Loftus
Posted on: #iteachmsu
Tuesday, Aug 24, 2021
Comparative Analysis of Crowdmark and Gradescope
Executive Summary 
This analysis presents a review and comparison of two instructional technologies for administering and digitally grading online and in-person assessments: Crowdmark and Gradescope. We tested both instructor and student workflows for creating, submitting, and grading assessments using Crowdmark and Gradescope integrated with a test course in D2L. Our evaluation criteria included ease of use, features available, accessibility, and flexibility. We found some key similarities:

Remote and in person assessments are supported, with multiple question types.
Grading is done by question rather than by student for more consistency.
Multiple graders can grade assignments, such as co-instructors and teaching assistants.
Grades are synced automatically with the gradebook in D2L Brightspace.

The primary differences between these two are:

Crowdmark can assign assessments according to sections and a drag and drop functionality is available for rubric comments.
Crowdmark emails students when assessments become available and can accept more file types as well as rotate files more easily.
Gradescope allows for time extensions at the course level as well as for each assessment and allows for grading the assessments before the due date.

Based on these findings, we recommend continuing with Crowdmark, the more established and familiar tool. Although Gradescope includes some extra functionality, such as programming assessments, those functions are already handled by other tools or have seen little or no use by faculty (e.g., CSE 231 Introduction to Programming uses Mimir for programming assignments). Crowdmark also offers fast grade sync with the D2L gradebook, and its scanning and matching capabilities are more robust for in-person assessments.
"The second-best way to grade exams" by ilmungo is licensed under CC BY-NC-SA 2.0

Methods
We tested both instructor and student workflows for creating and submitting assessments using Crowdmark and Gradescope integrated with a test course in D2L. Sample assignments were created for the remote assessments that included all of the available question types (i.e., upload file, enter text, multiple choice, etc.). Using separate accounts, we assigned the assessments as an instructor, submitted the assessments as a student, then returned to the instructor account to grade the assessments and sync the grades to our D2L test course. 
Findings
Key Similarities:
Both Crowdmark and Gradescope offer keyboard shortcuts for faster grading; allow late submissions, group submissions, and enforced time limits; and allow for grading by question instead of by student as well as multiple graders such as teaching assistants. Assignment submissions can include pdf or image upload, free response/short answer in a text box, or multiple choice/multi select type questions (with bubble sheets) for online assessments. For both tools, students can upload one PDF and then drag and drop each page to match each question for remote assessments, while instructors can scan and upload student submissions in batches for in person assessments. Both tools will also attempt to split a batch PDF into individual student submissions.
Key Differences:
Accessing Tools
Students have to log in to Crowdmark through the Crowdmark website. This link can be added to D2L Brightspace and opened in a new, external web page. The Crowdmark sign-in prompts students to select their institution and then uses students’ Brightspace login. Gradescope can be added to D2L Brightspace as an External Tool in a D2L content module. This allows students to access Gradescope within D2L as an embedded website within the D2L page, instead of as an external page, and does not require any additional login.
Creating Assessments
When creating assessments in Crowdmark, instructors choose between administered (in-person) assessments that instructors will upload or assigned (remote) assessments that students will upload (Figure 1). Administered assessments can include bubble sheets for multiple choice questions. Assigned remote assessments can include file upload, text entry responses, or multiple-choice questions (which are automatically graded).

When creating an assignment in Gradescope, the assignment type must be chosen first. Then, for the first three assignment types, the submission type is designated as either the instructor or the students (Figure 2). Although Exam/Quiz and Homework/Problem Set are offered as two different choices, they actually have the same options and essential functions. There are no further options if the instructor will be uploading the assessments, but other options are available if students will be uploading. Submissions can be variable length, where students submit any number of pages and indicate the pages where their question responses are, or fixed length, where students submit work with answers in fixed locations (like worksheets). Instructors can also allow students to view and download the assessment template if desired. Multiple choice assignments can be created with printable bubble sheets that either instructors or students can upload. Programming assignments, which Crowdmark does not support, are also available and can be automatically or manually graded.
Figure 1: Assessment types available in Crowdmark.

Figure 2: Assessment types available in Gradescope.
Both tools have the ability for students to take online quizzes. Both have multiple choice and multi select questions that are auto-graded, and both have free response and file upload questions that are NOT auto-graded. Gradescope supports short answer questions, which are auto-graded, but Crowdmark only has free response questions.

For assignments that students will upload, instructors must input text or upload a document for each individual question in Crowdmark. It is possible for an instructor to upload one document in the instructions field which contains all of the assignment questions and then simply enter numbers in the text boxes for each question, rather than the text of each question. Gradescope only requires one document to be uploaded. Each question is then identified by dragging a box around each question area on the page, and a question title must be entered.
Assigning & Distributing Assessments
For courses with several sections, Crowdmark allows assessments to be assigned to specific sections rather than the entire course. To approximate this feature in Gradescope, an instructor would have to create separate Gradescope courses or duplicate assignments and direct students to the appropriate version for their section.

Both tools allow instructors to set individual accommodations for each assignment to customize due date, lateness penalty, or time to complete. However, Gradescope also allows course-wide extensions for students, where extensions can be added for all assignments to customize time limits (multiply time by x or add x minutes) and due dates. Crowdmark requires accommodations to be made in the submission area for each assignment; it does not support course-wide accommodations.

When an assessment is assigned and released to students, Crowdmark sends a notification email to students, whereas Gradescope only sends an in-platform notification. Gradescope does send a confirmation email when students successfully submit an assignment. Both tools give instructors the option to send a notification email when returning student work.
Submitting Assessments
For in-person assessments, Crowdmark can include a QR code on assignments to ensure that every page of student work is correctly matched to the appropriate student for grading. The QR code can be manually scanned and matched to each student using an app as the assignment is turned in, or instructors can use automated matching (beta) to include a form field where students write their name and ID number for automated character recognition to identify the student and match them to that assignment’s QR code. Gradescope is developing a feature to create a unique label for each copy of an assignment and add that label to each page, but this is not currently available.

Submitted file types are more flexible in Crowdmark, which can support PDF, JPEG, PNG, and iPhone photos, any of which can be rotated after submission. Gradescope accepts only PDFs or JPEGs, and only PDF pages can be rotated. This means that Crowdmark offers much more flexibility in scanning software and orientation. Gradescope does have a built-in PDF scanner for iOS devices to circumvent format issues and allow seamless upload. Both tools assume that image submissions are of work associated with a single question. All work can be scanned into a single PDF for upload and each page then manually associated with each question in the assignment. In both tools, the student selects which question(s) are associated with each page(s), where multiple questions may be on a single page or multiple pages may be associated with a single question.

Crowdmark allows for group submissions when either the instructor or the students scan and upload the assessments. This ability to match multiple students to one assessment allows for two-stage exams, collaborative lab reports, or other group assignments. Gradescope only allows group submissions when students scan and upload assessments, although online assignments also allow group submissions.
Grading Assessments
Assignments can be graded immediately after students have submitted them in Gradescope. Crowdmark does not allow grading to be done until the due date has passed.

In Crowdmark, all feedback comments created for each question are stored in a comment library which can be reordered easily by dragging a comment to the desired location. There is no limit on the number of comments that can be dragged and dropped onto each student’s submission. Crowdmark comments can have positive or negative points attached to them, but specifying points is not required. Gradescope does not allow for dragging and dropping multiple comments; however, text annotations are saved for each question and several can be applied to each submission. The separate rubric comments must be associated with positive or negative points for each question. The rubric type can be either negative scoring, where the points are subtracted from 1.0, or positive scoring, where the points are added to 0. Score bounds can also be set, with a maximum of 1.0 and a minimum of 0. While it is possible to select more than one rubric comment, only one comment can be added as part of a “submission specific adjustment” which can include an additional point adjustment.

Crowdmark sends grades to D2L and automatically creates the grade item in the gradebook. Gradescope requires that the grade item be created first, then associated with an assignment, before sending grades is possible.
Table 1: Feature Comparison between Crowdmark and Gradescope.

| Topic | Crowdmark | Gradescope |
| --- | --- | --- |
| Accessing Tools | Must access through separate website; sign in to Crowdmark via Brightspace | Can add External Tool to a D2L module and access it within D2L (embedded website in page) |
| Creating Assessments | Upload PDF and designate where questions are for administered assessments that instructors upload (drag question number to location on page) | Upload PDF and designate where questions are by dragging boxes on the page, for fixed-length exams/homework that students upload or administered exams/homework that instructors upload |
| Creating Assessments | Must input or upload individual questions manually when creating remote assessments that students upload (but instructor can upload PDF in directions area and just enter Q1, Q2, etc. in text boxes) | Must input question titles separately for variable-length submissions that students upload, but questions are designated by dragging a box over their location on the page (no need to enter question text in Gradescope) |
| Assigning & Distributing Assessments | Can assign assessments to a section rather than the entire course | Cannot assign assessments to a section; must create a separate course or duplicate assignments and instruct students which one to submit |
| Assigning & Distributing Assessments | Add time for accommodations for each assessment only (customize due date, lateness penalty, or time to complete) | Add extensions at course level and/or for each assessment (multiply time by x or add x minutes) |
| Assigning & Distributing Assessments | Students always receive email when new assignments are ready to be completed | Students are not notified when new assignments are ready, but do receive email when they submit an assignment; instructor has the option to send email once the assignment is graded |
| Submitting Assessments | QR codes on printed work for in-person administered assessments (can also use app to match assessments to students when scanning) | Create printouts (beta) for in-person assessments: give each student a copy of the assignment with a unique label on each page (this tool is NOT yet available) |
| Submitting Assessments | iPhone photos supported; can accept PDF, JPG, or PNG (and can rotate any file) for remote assignments submitted by students | iPhone photos not supported; accepts PDF or JPG only (can only rotate PDFs) for remote assignments submitted by students; multiple files and any file type accepted for online assignments |
| Submitting Assessments | Allows group submissions whether students or instructors upload assessments (i.e., match multiple students to one assessment) | Allows group submissions only if students upload assessments, though also available for online assignments |
| Grading Assignments | Must wait until due date to begin grading remote assessments | Online assignments can be graded immediately |
| Grading Assignments | Drag and drop any number of comments from the comment library for each question | Can apply one previously used comment per submission separate from the rubric; cannot drag and drop multiple comments, but can add multiple previously used text annotations per question |
| Grading Assignments | Comments can have positive or negative points attached, but specifying points is not required | Comments must have associated points (positive, negative, or 0) for each question; rubric type can be negative scoring (points subtracted from 1.0) or positive scoring (points added to 0), with optional score bounds (max of 1.0 and min of 0) |
| Grading Assignments | Grades sent to D2L automatically with no need to create grade item first | Grades sent to D2L automatically, but grade item must be created first |
 
MSU Usage Data
We explored the usage of each tool at MSU to determine if there was a perceptible trend towards one tool over the other. The total number of courses created in each tool is fairly similar (Table 2). Interestingly, the total number of students enrolled in those courses is much higher in Crowdmark, while the number of assessments administered is higher in Gradescope.
Table 2. Tool usage in courses with at least one student and at least one assessment.

| | Crowdmark | Gradescope |
| --- | --- | --- |
| Courses | 322 | 292 |
| Students | 25,322 | 14,398 |
| Assessments | 3,308 | 4,494 |

Crowdmark has been used by MSU instructors since 2016; Gradescope has been used since 2018. More courses were created in Crowdmark until the 2020 calendar year (Figure 3). Usage of both tools spiked in 2020, presumably due to the COVID-19 induced shift to remote teaching, and was fairly equivalent that year. For the Spring 2021 semester, more courses have been created in Gradescope. It will be interesting to observe whether this trend towards Gradescope usage continues as 2021 progresses or if Crowdmark usage picks back up.

Given the disparity between the number of students and the number of classes and assessments, we explored the frequency of class sizes between the two tools (Figure 4). Both tools have been used for classes of all sizes, though the median class size is 37 for Gradescope and 63 for Crowdmark. We also explored the frequency of assessment numbers between the tools (Figure 5). We found that all but one course had 1-60 assessments created, with both tools most frequently having 2-20 assessments. Gradescope showed an interesting secondary peak of courses having 35-45 assessments. We do not have detailed information for either tool on what kinds of assessments were created, or whether all of those assessments were actually used rather than merely created for practice or as duplicates (e.g., versions released later, more accessible versions, or different versions for different class sections in Gradescope).
Figure 3. Number of courses created in each tool that had at least one student and at least one assessment for each calendar year since 2016.

Figure 4. Number of courses having a given class size and at least one assessment.

Figure 5. Number of classes having a given number of assessments and at least one student.
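For readers who want to run a similar exploration on their own data, a minimal pandas sketch follows. The file name and column names are hypothetical; the usage data behind the tables and figures above was not published with this post.

```python
import pandas as pd

# Hypothetical export: one row per course that had at least one student
# and at least one assessment. Assumed columns: tool, year, students, assessments.
usage = pd.read_csv("tool_usage.csv")

# Courses, enrolled students, and assessments per tool (cf. Table 2).
print(usage.groupby("tool").agg(
    courses=("tool", "size"),
    students=("students", "sum"),
    assessments=("assessments", "sum"),
))

# Median class size per tool (cf. the Figure 4 discussion).
print(usage.groupby("tool")["students"].median())

# Courses created per calendar year per tool (cf. Figure 3).
print(usage.groupby(["year", "tool"]).size().unstack(fill_value=0))
```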

Discussion
Our analysis showed significant functional overlap between Crowdmark and Gradescope; either tool could be chosen with little to no impact on instructor capability. However, there are a few advantages to the way that Crowdmark handles assignment tracking, submission, and grade syncing to D2L. In particular, Crowdmark already offers a fast QR-code method for matching every page of in-person assessments to the appropriate enrolled student when scanning assessments in batches. We expect this feature will become a strong asset in the Fall 2021 semester as more classes return to campus. If we were to choose between Crowdmark and Gradescope for continued support, we would recommend Crowdmark. Gradescope is a competitive technology, but it is still developing and refining capabilities that are already available through Crowdmark or D2L. An instructor who needs to switch from Gradescope to Crowdmark should refer to the D2L self-enroll course “MSU Tools and Technologies” for detailed information and resources on using Crowdmark at MSU and closely review Table 1 to understand the key differences they may encounter. The Assessment Services team and/or Instructional Technology & Development team in the IT department are also available for one-on-one consultation on using either technology (request a consultation via the MSU Help Desk).
Authored by: Jennifer Wagner & Natalie Vandepol
Posted on: #iteachmsu
Friday, Nov 6, 2020
Considerations for Exam Structure
Many decisions must go into the structure of an exam and how that assessment fits into the overall organization of a course. This document will review options for test configuration at multiple levels and then provide some examples of MSU faculty who have incorporated these strategies into their courses.
Course-Level Considerations
Course-level considerations require reviewing the structure of the class to see where major scheduling or grading changes can be made.

Lower the stakes / reduce the scope – Deliver more assessments that each cover less content. This provides students with more accountability for checking understanding in quicker and shorter ways throughout the course, which can enhance the learning experience. Reducing the scope of exams in this way can also provide both you as the instructor and the student with more targeted areas of feedback earlier in the learning process
Drop a lowest exam grade – Provide students an “out” if they are unprepared or have a bad testing experience (see the sketch following this list)
Use honor codes – When combined with taking time to establish a climate of integrity, honor codes can reduce academic dishonesty
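
To make the drop-lowest policy concrete, here is a minimal Python sketch; the function and scores are illustrative, not drawn from any MSU course:

```python
def exam_average(scores, drop_lowest=1):
    """Average exam scores after dropping the lowest `drop_lowest` grades."""
    kept = sorted(scores)[drop_lowest:]
    return sum(kept) / len(kept)

# A bad testing day (the 58) no longer sinks the course grade:
# only 88, 92, and 85 are averaged.
print(round(exam_average([88, 92, 58, 85]), 1))  # 88.3
```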

Exam-Level Considerations
Exam-level considerations can be made without altering other components of the course. However, these strategies often require evaluating the style of question asked.

Allow open book or notes and/or collaboration – The National Association of Colleges and Employers determined that the most important skills employers look for in college graduates are problem-solving and teamwork skills. Exams can be structured to practice and assess those skills
Write authentic questions – Teach and test skills and application of knowledge necessary for successful performance as a professional in the field
Allow corrections – Turn typical summative assessments into formative assessments by allowing students to use exams as a learning tool. Exams do not always need to be used as assessment of learning; they can also be used as assessment for learning
Offer more points on the exam than what is needed to achieve a 100% grade
Allow students to have multiple attempts at the exam
Use a two-part exam structure that has students take the exam both individually and in groups.

Question-Level Considerations
Question-level considerations are the easiest to implement; most changes can be accomplished using D2L quizzing tools. A small illustrative sketch of pool-based randomization follows this list.

Use question pools
Randomize questions
Limit the number of questions per page
Provide technology practice before the first major exam
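
The sketch below illustrates the pool-plus-randomization idea named above. The pools and questions are hypothetical; in practice, pools and random draws are configured in the D2L quiz tools rather than in code.

```python
import random

# Hypothetical question pools, e.g. one pool per course unit.
pools = {
    "unit_1": ["Q1a", "Q1b", "Q1c", "Q1d"],
    "unit_2": ["Q2a", "Q2b", "Q2c"],
    "unit_3": ["Q3a", "Q3b", "Q3c", "Q3d", "Q3e"],
}

def build_exam(pools, per_pool=2, seed=None):
    """Draw `per_pool` questions from each pool, then shuffle the order."""
    rng = random.Random(seed)
    exam = []
    for questions in pools.values():
        exam.extend(rng.sample(questions, k=min(per_pool, len(questions))))
    rng.shuffle(exam)
    return exam

# Seeding per student gives each student a stable but distinct exam.
print(build_exam(pools, per_pool=2, seed="student_123"))
```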

Timing Considerations
Deciding on a time limit for an exam is an important decision. There are pros and cons for either limiting time or giving extended time.

Using untimed exams reduces student anxiety – When you have pools of questions that reduce the chances of students cheating on exams, it can allow a unique advantage of removing time limits on exams so as to reduce the anxiety that comes from timed exams
Using timed exams – Setting a time limit can provide a layer of security against academic misconduct. By minimizing the time students have to take the exam, they are more likely to spend that time focusing on the questions and not copying questions or collaborating
Ask TAs or ULAs to take the exam prior to delivery – Provides an estimate of how long it will take the class to complete the exam. It also provides opportunities for them to spot-check the questions for errors or for opportunities to enhance the exam’s efficacy

Collaboration Considerations
When possible, collaborating with faculty colleagues, TAs, or ULAs in exam creation can help minimize the time and effort needed.

Generate question pools as a faculty team
Have TAs or ULAs create questions – Their direct involvement with students in supporting their learning throughout the course gives them a unique advantage in knowing how to write questions that can be useful for drawing out evidence of knowledge among learners

Examples from MSU Instructors
Mini-Exams
For many years, chemistry instructors in Lyman Briggs College have incorporated a low-stakes “mini-exam” as the first timed assessment in their introductory chemistry courses. In terms of points, the mini-exam is typically worth about 40% of a midterm exam. The mini-exam gives students an opportunity to experience “exam difficulty” questions in an exam setting. This early exam provides feedback to students regarding their approach to the class (have their study approaches been working?) on a lower-stakes exam. This also allows the instructors an early opportunity to intervene and support students prior to the first higher-stakes midterm exam. The mini-exam can be considered as either more formative (i.e., score dropped if midterm exam scores are higher) or more summative (testing on important expected prior knowledge), depending on the course design. With the move to online instruction, a mini-exam also gives instructors and students an opportunity to test and become familiar with the technology being used for midterm exams in a lower-stakes setting.
Strategies

Lower stakes exams
Provide technology practice before the first major exam

Extra Points
One approach that has been successfully used in multiple introductory as well as some upper-level chemistry courses is offering more possible points on an exam than are needed for a grade of 100%. For example, if there are 80 possible points on an exam, grading might be based on a total of 73 points; a student who gets 73 points would earn a 100% grade. This approach allows instructors to communicate high standards for responses to exam questions but still relieves some pressure on students. Anecdotally, instructors have sometimes found that this alleviates the need for regrades. Instructors might choose to limit the maximum grade to 100% or offer bonus credit for students who score above 100%. In addition, building in extra points can potentially reduce some stress for first-year students accustomed to high-school grading scales where often scores above 90% are required for an “A.”
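As a worked version of the arithmetic above, here is a minimal sketch; the 80-possible/73-needed split comes from the example in the text, while the function name and cap option are illustrative:

```python
def exam_percentage(points_earned, points_for_full=73, cap_at_100=True):
    """Grade against `points_for_full` even when more points are offered."""
    pct = 100 * points_earned / points_for_full
    return min(pct, 100.0) if cap_at_100 else pct

# On an 80-point exam graded out of 73 points:
print(exam_percentage(73))            # 100.0 -- full marks
print(round(exam_percentage(60), 1))  # 82.2
print(exam_percentage(78))            # capped at 100.0 (or bonus if uncapped)
```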
Strategies

Offer more points on the exam than what is needed to achieve a 100% grade

Authentic, Low Stakes Exams
In her neuroscience for non-majors course, Casey Henley writes exam questions that require students to make predictions about novel experiments based on content learned in class. These questions often require students to read and interpret graphs. Since the questions require problem solving, and the answers cannot be looked up, the exams are open book and open note. Additionally, the exams become a learning experience themselves because optional correction assignments are offered, and students can earn points back by reviewing their work and resubmitting answers. Exam corrections also provide information about the misconceptions that students held going into the test, which helps Casey create or edit content for future semesters. The class has four non-cumulative unit exams and one cumulative final. Each has the same point value, and students get to drop one exam grade.
Strategies

Write authentic questions
Lower the stakes
Drop a lowest exam grade
Allow open book or note
Allow corrections

Collaborating on Question Pool Creation
Consider working together with your colleagues on developing shared pools of questions that can be used for quizzes and exams within the same subject matter. This can greatly reduce the chances of cheating and bring a new sense of alignment across courses for those who are teaching similar courses already. It is also an important space for collaboration to take place among peers. A good example of this happening at MSU already is the way instructors in the Biological Sciences program share questions. Instructors in the Physics and Astronomy department have also shared questions across the institution with LON-CAPA for many years.
Strategies

Use question pools
Generate question pools as a faculty team
Authored by: Casey Henley and Dave Goodrich