Posted on: #iteachmsu

Instruction, Feedback, Assessments & Centering Students in Remote Environments
This playlist is a growing collection of content aimed at supporting educators as they navigate ongoing shifts in teaching environments, changes to grading procedures, and other uncertainties that result from the ongoing pandemic... all the while keeping student success at the core of their work.
ASSESSING LEARNING
Posted on: #iteachmsu

Grading & Giving Feedback
Edit a Question During its Availability
Occasionally, a test question will need to be edited while an exam is in progress.
Quizzes – Manually Grade a Quiz - Instructor
Short answer questions, although auto-graded by D2L, should be double-checked for grading accuracy.
D2L Assessment Analytics
Examining quiz question statistics can help instructors determine if a question is too easy, too challenging, or needs editing for clarification.
The following is a quick guide for D2L Quiz and Grade Item statistics to help you monitor and improve your assessment questions and results.
D2L Quiz Statistics
To see how students performed overall on each of the quizzes, in your own course go to Assessments > Quizzes > Statistics (click on Statistics from the tab view across the top).
This list displays all of your course quiz averages.
Click on a quiz to see more details including User Stats, Question Stats, and Question Details.
Question Stats
The Question Stats list the Standard Deviation, Discrimination Index, and Point Biserial value for each question.
You can click on the link, "What do the statistics on this page mean?" above the table in your course to learn more. The information is also copied below.
What do the statistics on this page mean?
All statistics are calculated based on each user’s first attempt on the quiz. If a question is changed after attempts have been made, only the attempts on the newest version of the question are included in the statistics (i.e., first attempts made before a question was changed are not included in the statistics for that question).
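As an illustration of this filtering rule (a sketch using a hypothetical data model, not D2L's actual code), keeping only first attempts made against the newest question version might look like:

```python
def attempts_for_stats(attempts, current_version):
    """Keep each user's first attempt, but only when that first attempt
    was made against the newest version of the question.
    `attempts` is a list of dicts with hypothetical keys:
    'user', 'attempt_no', 'question_version', 'correct'."""
    first = {}
    for a in sorted(attempts, key=lambda a: (a["user"], a["attempt_no"])):
        first.setdefault(a["user"], a)  # first attempt per user
    return [a for a in first.values()
            if a["question_version"] == current_version]

attempts = [
    {"user": "alice", "attempt_no": 1, "question_version": 1, "correct": 0},
    {"user": "alice", "attempt_no": 2, "question_version": 2, "correct": 1},
    {"user": "bob",   "attempt_no": 1, "question_version": 2, "correct": 1},
]
# Alice's first attempt predates the question edit, so only Bob's counts.
print(attempts_for_stats(attempts, current_version=2))
```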
STANDARD DEVIATION
The standard deviation indicates how much scores vary from the average, ranging from 0% to 100%. A high standard deviation indicates that scores are spread out from the average, whereas a low standard deviation indicates that scores are close to the average.
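The contrast between low and high spread can be sketched in a few lines (illustrative only; D2L computes this statistic for you):

```python
import statistics

def score_std_dev(scores_pct):
    """Population standard deviation of quiz scores given as percentages."""
    return statistics.pstdev(scores_pct)

# Scores clustered near the average -> low standard deviation
print(score_std_dev([78, 80, 82, 79, 81]))   # ~1.41
# Scores spread far from the average -> high standard deviation
print(score_std_dev([40, 95, 60, 100, 55]))  # ~23.45
```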
DISCRIMINATION INDEX
The discrimination index indicates how well a question differentiates between high and low performers. It can range from -100% to 100%, with high values indicating a “good” question, and low values indicating a “bad” question.
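One common formulation (upper-group minus lower-group proportion correct; D2L's exact formula may differ) can be sketched as:

```python
def discrimination_index(results, group_frac=0.27):
    """Upper-minus-lower discrimination index for one question.
    `results` is a list of (total_quiz_score, got_question_correct)
    tuples; the return value is in [-1, 1] (multiply by 100 for a
    percentage scale like D2L's)."""
    ranked = sorted(results, key=lambda r: r[0], reverse=True)
    n = max(1, round(len(ranked) * group_frac))
    p_upper = sum(c for _, c in ranked[:n]) / n   # top scorers
    p_lower = sum(c for _, c in ranked[-n:]) / n  # bottom scorers
    return p_upper - p_lower

# High scorers answered correctly and low scorers did not,
# so the question discriminates well.
results = [(95, 1), (90, 1), (85, 1), (60, 0), (55, 0), (50, 0)]
print(discrimination_index(results))  # 1.0
```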
POINT BISERIAL CORRELATION COEFFICIENT
The point biserial correlation coefficient is calculated only for multiple choice and true/false question types that have exactly one answer weighted 100%, with all others weighted 0%.
Similarly to the discrimination index, the point biserial correlation coefficient relates individuals’ quiz scores to whether or not they got a question correct. It ranges from -1.00 to 1.00, with high values indicating a “good” question, and low values indicating a “bad” question.
*Note that only first attempts are included in that question's statistics.
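The standard point-biserial formula (a sketch of the textbook calculation, not necessarily D2L's implementation) relates total scores to a 0/1 question outcome:

```python
import statistics

def point_biserial(scores, correct):
    """Point-biserial correlation between total quiz scores and a 0/1
    per-question outcome; ranges from -1.00 to 1.00."""
    n = len(scores)
    s = statistics.pstdev(scores)  # population std dev of total scores
    right = [x for x, c in zip(scores, correct) if c]
    wrong = [x for x, c in zip(scores, correct) if not c]
    p = len(right) / n             # proportion who got the question right
    q = 1 - p
    return (statistics.mean(right) - statistics.mean(wrong)) / s * (p * q) ** 0.5

scores  = [95, 90, 85, 60, 55, 50]
correct = [1, 1, 1, 0, 0, 0]
print(round(point_biserial(scores, correct), 2))  # 0.97
```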
Question Details
This tab shows a summary of student responses for each question. If you notice a low or negative value for the Point Biserial or Discrimination Index, you may want to investigate the question: it could indicate a badly worded question or an improperly keyed answer.
For more, view the video tutorial on Generating Reports in D2L Learning Environment. Currently, the statistics do not display for random "pool item" question types. Contact the MSU Service Desk to check on obtaining reports through the Data Hub.
Grade Item Statistics
To view grade item statistics in your own course, go to Assessments > Grades, use the drop-down menu by a grade item title, and select Statistics to display Class and User Statistics. If you have a grade scheme set up to display, you will also see the grade distribution chart on the page.
Working with student data
Keep the MSU Institutional Data Policy in mind when storing data and making reports public in order to protect the security and confidentiality of student data.
Read more about best practices for handling data at secureit.msu.edu/data from MSU IT Services – Academic Technology.
Addressing Issues of Academic Misconduct
What should you do if you discover cheating in your course? Follow the link to find out more.
What is an Academic Dishonesty Report?
If you give a penalty grade as a result of academic misconduct, you must submit an Academic Dishonesty Report (ADR) to the university. See the link above for an example.
Authored by: Casey Henley & Susan Halick
Assessing Learning
Posted on: #iteachmsu

Crowdmark: Deliver and Grade Assessments
What is Crowdmark?
Crowdmark is an online collaborative grading and analytics platform that helps educators assess student work. The platform allows for easy distribution and collection of student assignments, offers tools for team grading with rubrics, and streamlines the process for providing rich feedback to students.
How does Crowdmark improve the assessment experience?
Crowdmark allows instructors to deliver assignments and exams to students online with a due date and time limit, if desired. Students complete the assessment digitally or scan their handwritten work (as an image or PDF) and upload their completed work using a computer or mobile phone for evaluation on Crowdmark.
Graders can annotate pages; add comments that include hyperlinks, embedded images, and mathematical and chemical notation; and attach scores according to a grading scheme/rubric. After evaluation is complete, graded assessments can be electronically returned to students with the click of a button. Crowdmark also provides tools for visualizing student performance, and the data can be exported in a convenient format.
Crowdmark is now integrated with MSU’s instance of D2L Brightspace. This integration provides features such as roster synchronization, team synchronization, and the ability to export grades from Crowdmark into the D2L gradebook.
What limitations or alternatives should I consider?
The grading rubrics and comment library make grading more consistent and efficient; however, assessments are primarily graded manually. For auto-graded questions, you may want to consider using the MSU Scoring Office tool, WebAssess™ Assessment Solutions, in Digital Desk, or D2L Quizzes. Gradescope is another alternative similar to Crowdmark.
Where do I start if I want to use it?
See Accessing Crowdmark through D2L: navigate to the Crowdmark sign-in page and select Michigan State University.
Where can I find more information?
MSU D2L Help:
Getting Started with Crowdmark
Crowdmark Documentation:
Introduction to Crowdmark
Getting Started for Instructors
D2L and Crowdmark
Crowdmark support
Authored by: Susan Halick
Assessing Learning
Posted on: #iteachmsu

How to Grade Quiz or Exam Questions
Purpose
Multiple choice questions are auto-graded in D2L. However, other question types need to be graded by hand. Use this document to learn how to grade questions in a quiz or exam.
Log into d2l.msu.edu
Select the course containing the quiz you want to grade.
Click on Quizzes in the Assessment Tab in the navigation bar or from the Course Admin list
Click on the down arrow next to the quiz name, and choose “Grade”
If you only need to review questions for one individual, you can click on that student in the user list. If you need to review questions for all students, click on the “Questions” tab
For grading written answers, choose “Grade Individual Responses.” You can choose to have names removed from questions by checking “Blind Marking”
If you need to regrade all attempts at a question (for example, if there was an error and you need to give all students full credit), use “Update All Attempts”
The list of questions will indicate the type of question
WR: Written response
MC: Multiple choice
SA: Short answer
FIB: Fill in the blank
Choose the question you want to grade.
You can choose to grade more than one student per page using the numerical dropdown
The answer you submitted as correct when you wrote the question will display in blue
Grade the student’s response, set the points earned, and provide feedback in the feedback box
Move on to the next student
Save your feedback and scores when finished
Authored by: Casey Henley & Susan Halick
Assessing Learning
Posted on: #iteachmsu

Face to Face writing assessment: What an "acapandemic" year taught us about grading
Topic Area: Online Teaching & Learning
Presented by: Ann Burke, Jeff Austin, Gretchen Rumohr, Ellen Foley
Abstract:
This interactive workshop welcomes educators spanning K-12 and college contexts desiring to learn more about Face to Face (F2F) assessment. F2F aligns with the sanctuary space described by Oakley (2018): a space where we can experience safety and comfort on our own terms. Holding this space requires pushing back against institutional demands for efficiency, quantity, and data gathering to attend to the granular, individualized needs of each student, creating opportunities for equitable learning environments.
Workshop facilitators share how to implement F2F while navigating potential challenges such as limited time and high enrollments in online and in-person spaces. We share what we learned from teaching during a pandemic and how logistical challenges traditionally encountered with F2F--such as scheduling and classroom management--can be negated with online platforms. This workshop validates and affirms, as Fassler (1978) and Corbett (2010) do, that grading can be a “synergistic, multi-vocal, live conversation.”
Beyond the “how to,” workshop facilitators detail how F2F humanizes pedagogy and encourages ownership and agency. Facilitators explain how F2F disrupts inequities and inequalities of traditional grading, demystifies the grading process, develops the classroom community, engages student writers, minimizes instructor fatigue and frustration, and brings about meaningful inquiry about writers’ own skills and practices. Those interested in writing assessment in both K-12 and higher education spaces are encouraged to attend.
Authored by: Ann Burke, Jeff Austin, Gretchen Rumohr, Ellen Foley
Assessing Learning
Posted on: New Technologies

Free Assessment Tools: Feature Comparison
In an effort to help you understand which of the free grading and assessment tools offered by MSU IT you may wish to try, we've put together a brief feature comparison table that allows you to see, at a glance, what features exist in what tools.
If you'd like a consultation on selecting the right assessment tool, contact the MSU IT Service desk at (517)432-6200 or by e-mailing ithelp@msu.edu. They will set you up with a consultation with our Assessment Services office (formerly known as the Scoring Office).
| Feature | Crowdmark | Gradescope | Digital Desk | Notes |
| --- | --- | --- | --- | --- |
| Electronic assessment grading | x | x | | |
| Paper assessment grading | x | x | x | |
| Commenting/Annotation | x | x | | |
| Collaborative grading | x | not clear | not yet | |
| Analytics | x | x | x | |
| Brightspace integration | x | x | working on that now | |
| Scan exams | x | x | x | Crowdmark calls it "exam matching" |
| Create exams | x | x | x | |
| Upload exams | x | x | x | |
| Rubrics | x | x | | |
| Autograding | x | x | x | |
| Academic integrity | x | x | | |
| Proctoring built-in | x | x | | Digital Desk allows for 3rd-party proctoring |
| AI-assisted grading | x | x | not yet | |
Authored by: Jessica L. Knott
Assessing Learning
Posted on: #iteachmsu

Focusing on iteration and growth: Making the shift to ungrading
Topic Area: Student Success
Presented By: Candace Robertson, Brittany Dillman, Liz Boltz
Abstract:
How can we support student success by removing the barrier of grading? What impact would this have on feedback and iteration? Members from Team MAET (Master of Arts in Educational Technology) will share how they worked through these questions and others to move the majority of program courses to an ungrading philosophy as an act of social justice. In this session, you will learn from the triumphs, challenges, and solutions from their journey. Session resources: Google Slidedeck (website)
Authored by: Candace Robertson, Brittany Dillman, Liz Boltz
Assessing Learning
Posted on: #iteachmsu

Comparative Analysis of Crowdmark and Gradescope
Executive Summary
This analysis presents a review and comparison of two instructional technologies for administering and digitally grading online and in-person assessments: Crowdmark and Gradescope. We tested both instructor and student workflows for creating, submitting, and grading assessments using Crowdmark and Gradescope integrated with a test course in D2L. Our evaluation criteria included ease of use, features available, accessibility, and flexibility. We found some key similarities:
Remote and in person assessments are supported, with multiple question types.
Grading is done by question rather than by student for more consistency.
Multiple graders can grade assignments, such as co-instructors and teaching assistants.
Grades are synced automatically with the gradebook in D2L Brightspace.
The primary differences between these two are:
Crowdmark can assign assessments by section, and drag-and-drop functionality is available for rubric comments.
Crowdmark emails students when assessments become available and can accept more file types as well as rotate files more easily.
Gradescope allows for time extensions at the course level as well as for each assessment and allows for grading the assessments before the due date.
Based on these findings, we recommend continuing with Crowdmark, the more established and familiar tool. Although Gradescope includes some extra functionalities over Crowdmark, such as programming assessments, these functions are already handled by other tools or have not been used often or at all by faculty (e.g., CSE 231 Introduction to Programming uses Mimir for programming assignments). Crowdmark also offers fast grade sync with the D2L gradebook and the scanning and matching capabilities are more robust for in person assessments.
"The second-best way to grade exams" by ilmungo is licensed under CC BY-NC-SA 2.0
Methods
We tested both instructor and student workflows for creating and submitting assessments using Crowdmark and Gradescope integrated with a test course in D2L. Sample assignments were created for the remote assessments that included all of the available question types (i.e., upload file, enter text, multiple choice, etc.). Using separate accounts, we assigned the assessments as an instructor, submitted the assessments as a student, then returned to the instructor account to grade the assessments and sync the grades to our D2L test course.
Findings
Key Similarities:
Both Crowdmark and Gradescope offer keyboard shortcuts for faster grading; allow late submissions, group submissions, and enforced time limits; and allow for grading by question instead of by student, as well as multiple graders such as teaching assistants. Assignment submissions can include PDF or image upload, free response/short answer in a text box, or multiple choice/multi-select questions (with bubble sheets) for online assessments. For both tools, students can upload one PDF and then drag and drop each page to match each question for remote assessments, while instructors can scan and upload student submissions in batches for in-person assessments. Both tools will also attempt to split a batch PDF into individual student submissions.
Key Differences:
Accessing Tools
Students have to log in to Crowdmark through the Crowdmark website. This link can be added to D2L Brightspace and opened in a new, external web page. The Crowdmark sign-in prompts students to select their institution and then uses students’ Brightspace login. Gradescope can be added to D2L Brightspace as an External Tool in a D2L content module. This allows students to access Gradescope within D2L as an embedded website within the D2L page, instead of as an external page, and does not require any additional login.
Creating Assessments
When creating assessments in Crowdmark, instructors choose between administered (in person) assessments that instructors will upload or assigned (remote) assessments that students will upload (Figure 1). Administered assessments can include bubble sheets for multiple choice questions. Assigned remote assessments can include file upload, text entry responses, or multiple-choice questions (which are automatically graded).

When creating an assignment in Gradescope, the assignment type must be chosen first. Then, for the first three assignment types, the submission type is designated as either the instructor or the students (Figure 2). Although Exam/Quiz and Homework/Problem Set are offered as two different choices, they actually have the same options and essential functions. There are no further options if the instructor will be uploading the assessments, but other options are available if students will be uploading. Submissions can be variable length, where students submit any number of pages and indicate the pages where their question responses are, or fixed length, where students submit work with answers in fixed locations (like worksheets). Instructors can also allow students to view and download the assessment template if desired. Multiple choice assignments can be created with printable bubble sheets that either instructors or students can upload. Programming assignments are available, which Crowdmark does not support, and they can be automatically or manually graded.
Figure 1: Assessment types available in Crowdmark.
Figure 2: Assessment types available in Gradescope.
Both tools have the ability for students to take online quizzes. Both have multiple choice and multi-select questions that are auto-graded, and both have free response and file upload questions that are NOT auto-graded. Gradescope supports short answer questions, which are auto-graded, but Crowdmark only has free response questions.

For assignments that students will upload, instructors must input text or upload a document for each individual question in Crowdmark. It is possible for an instructor to upload one document in the instructions field which contains all of the assignment questions and then simply enter numbers in the text boxes for each question, rather than the text of each question. Gradescope only requires one document to be uploaded. Each question is then identified by dragging a box around each question area on the page, and a question title must be entered.
Assigning & Distributing Assessments
For courses with several sections, Crowdmark allows assessments to be assigned to specific sections rather than the entire course. To approximate this feature in Gradescope, an instructor would have to create separate Gradescope courses or duplicate assignments and direct students to the appropriate version for their section.

Both tools allow instructors to set individual accommodations for each assignment to customize the due date, lateness penalty, or time to complete. However, Gradescope also allows course-wide extensions for students, where extensions can be added for all assignments to customize time limits (multiply time by x or add x minutes) and due dates. Crowdmark requires accommodations to be made in the submission area for each assignment; it does not support course-wide accommodations.

When an assessment is assigned and released to students, Crowdmark sends a notification email to students, whereas Gradescope only sends an in-platform notification. Gradescope does send a confirmation email when students successfully submit an assignment. Both tools give instructors the option to send a notification email when returning student work.
Submitting Assessments
For in-person assessments, Crowdmark can include a QR code on assignments to ensure that every page of student work is correctly matched to the appropriate student for grading. The QR code can be manually scanned and matched to each student using an app as the assignment is turned in, or instructors can use automated matching (beta) to include a form field where students write their name and ID number for automated character recognition to identify the student and match them to that assignment’s QR code. Gradescope is developing a feature to create a unique label for each copy of an assignment and add that label to each page, but this is not currently available.

Submitted file types are more flexible in Crowdmark, which can support PDF, JPEG, PNG, and iPhone photos, any of which can be rotated after submission. Gradescope accepts only PDFs or JPEGs, and only PDF pages can be rotated. This means that Crowdmark offers much more flexibility in scanning software and orientation. Gradescope does have a built-in PDF scanner for iOS devices to circumvent format issues and allow seamless upload. Both tools assume that image submissions are of work associated with a single question. All work can be scanned into a single PDF for upload and each page then manually associated with each question in the assignment. In both tools, the student selects which question(s) are associated with which page(s), where multiple questions may be on a single page or multiple pages may be associated with a single question.

Crowdmark allows for group submissions when either the instructor or the students scan and upload the assessments. This ability to match multiple students to one assessment allows for two-stage exams, collaborative lab reports, or other group assignments. Gradescope only allows group submissions when students scan and upload assessments, although online assignments also allow group submissions.
Grading Assessments
Assignments can be graded immediately after students have submitted them in Gradescope. Crowdmark does not allow grading to be done until the due date has passed.In Crowdmark, all feedback comments created for each question are stored in a comment library which can be reordered easily by dragging a comment to the desired location. There is no limit on the number of comments that can be dragged and dropped onto each student’s submission. Crowdmark comments can have positive or negative points attached to them, but specifying points is not required. Gradescope does not allow for dragging and dropping multiple comments; however, text annotations are saved for each question and several can be applied to each submission. The separate rubric comments must be associated with positive or negative points for each question. The rubric type can be either negative scoring, where the points are subtracted from 1.0, or positive scoring, where the points are added to 0. Score bounds can also be set, with a maximum of 1.0 and a minimum of 0. While it is possible to select more than one rubric comment, only one comment can be added as part of a “submission specific adjustment” which can include an additional point adjustment.Crowdmark sends grades to D2L and automatically creates the grade item in the gradebook. Gradescope requires that the grade item be created first, then associated with an assignment, before sending grades is possible.
Table 1: Feature Comparison between Crowdmark and Gradescope.
Topic
Crowdmark
Advantage
Gradescope
Accessing Tools
Must access through separate website; sign in to Crowdmark via Brightspace
Can add External Tool to D2L module and it can be accessed within D2L (embedded website into page)
Creating Assessments
Upload PDF and designate where questions are for administered assessments that instructors upload (drag question number to location on page)
Upload PDF and designate where questions are by dragging boxes on the page for fixed length exam/homework that students upload or an administered exam/homework that instructors upload
Must input or upload individual questions manually when creating remote assessments that students upload (but instructor can upload PDF in directions area and just enter Q1, Q2, etc. in text boxes)
Must input question titles separately for variable length submissions that students upload, but questions are designated by dragging box over location on page (no need to enter text of question in Gradescope)
Assigning & Distributing Assessments
Can assign assessments to a section rather than entire course
Cannot assign assessments to a section; must create separate course or duplicate assignments and instruct students which one to submit
Add time for accommodations for each assessment only (customize due date, lateness penalty, or time to complete)
Add extensions at course level and/or for each assessment (multiply time by x or add x minutes)
Students always receive email when new assignments are ready to be completed
Students are not notified when new assignments are ready; but students do receive email when they have submitted an assignment, and instructor has option to send email once the assignment is graded
Submitting Assessments
QR codes on printed work for in person administered assessments (can also use app to match assessments to students when scanning)
Create printouts (beta) for in person assessments; give each student a copy of the assignment with a unique label on each page (this tool is NOT yet available)
iPhone photos supported; can accept PDF, JPG, or PNG (and can rotate any file) for remote assignments submitted by students
iPhone photos not supported; accepts PDF or JPG only (can only rotate PDFs) for remote assignments submitted by students; multiple files and any file type accepted for online assignments
Allows for group submissions whether students or instructors are uploading assessments (i.e. match multiple students to one assessment)
Allows for group submissions only if students are uploading assessments, but also available for online assignments
Grading Assignments
Must wait until due date to begin grading remote assessments
Online assignments can be graded immediately
Drag and drop any number of comments from comment library for each question
Can apply one previously used comment for each submission separate from rubric; cannot select or drag and drop multiple comments, but can add multiple previously used text annotations for each question
Comments can have positive or negative points attached to them, but specifying points is not required
Comments must have associated points (positive, negative, or 0) for each question; can change rubric type from negative scoring (points subtracted from 1.0) to positive scoring (points added to 0) as well as enable/disable score bounds (max of 1.0 and min of 0)
Grades sent to D2L automatically with no need to create grade item first
Grades sent to D2L automatically but must create grade item first
MSU Usage Data
We explored the usage of each tool at MSU to determine if there was a perceptible trend towards one tool over the other. The total number of courses created in each tool is fairly similar (Table 2). Interestingly, the total number of students enrolled in those courses is much higher in Crowdmark, while the number of assessments administered is higher in Gradescope.
Table 2. Tool usage in courses with at least one student and at least one assessment.
Crowdmark
Gradescope
Courses
322
292
Students
25,322
14,398
Assessments
3,308
4,494
Crowdmark has been used by MSU instructors since 2016. Gradescope has been used since 2018. More courses were created in Crowdmark until the 2020 calendar year (Figure 3). Usage of both tools spiked in 2020, presumably due to the COVID-19 induced shift to remote teaching, and was fairly equivalent that year. For the Spring 2021 semester, more courses have been created in Gradescope. It will be interesting to observe whether this trend towards Gradescope usage continues as 2021 progresses or if Crowdmark usage picks back up.Given the disparity between number of students vs. number of classes & assessments, we explored the frequency of class sizes between the two tools (Figure 4). Both tools have been used for classes of all sizes, though the median class size is 37 for Gradescope and 63 for Crowdmark. We also explored the frequency of assessment numbers between the tools (Figure 5). We found that all but one course had 1-60 assessments created, with both tools most frequently having 2-20 assessments. Gradescope showed an interesting secondary peak of courses having 35-45 assessments. We do not have detailed information for either tool on what kinds of assessments were created or whether all of those assessments were actually used, not just created in the course for practice, or duplicates (e.g., available later, more accessible, or different versions for different class sections in Gradescope).
Figure 3. Number of courses created in each tool that had at least one student and at least one assessment for each calendar year since 2016.
Figure 4. Number of courses having a given class size and at least one assessment.
Figure 5. Number of classes having a given number of assessments and at least one student.
Discussion:
Our analysis showed significant functional overlap between Crowdmark and Gradescope, where either tool could be chosen with little to no impact on instructor capability. However, there are a few advantages to the way that Crowdmark handles assignment tracking, submission, and grade syncing to D2L. In particular, Crowdmark already offers a fast QR-code method for matching every page of in-person assessments to the appropriate student enrolled in the course when scanning the assessments in batches. We expect this feature will become a strong asset in the Fall 2021 semester as more classes will be on campus. If we were to choose between Crowdmark and Gradescope for continued support, we would recommend Crowdmark. Gradescope is a competitive technology, but it is still developing and refining capabilities that are already available through Crowdmark or D2L. If an instructor were to need to switch from Gradescope to Crowdmark, they should refer to the D2L self-enroll course “MSU Tools and Technologies” for detailed information and resources on using Crowdmark at MSU and closely review Table 1 to understand the key differences they may encounter. The Assessment Services team and/or Instructional Technology & Development team in the IT department are also available for one-on-one consultation on using either technology (request a consultation via the MSU Help Desk).
This analysis presents a review and comparison of two instructional technologies for administering and digitally grading online and in-person assessments: Crowdmark and Gradescope. We tested both instructor and student workflows for creating, submitting, and grading assessments using Crowdmark and Gradescope integrated with a test course in D2L. Our evaluation criteria included ease of use, features available, accessibility, and flexibility. We found some key similarities:
Remote and in-person assessments are supported, with multiple question types.
Grading is done by question rather than by student for more consistency.
Multiple graders can grade assignments, such as co-instructors and teaching assistants.
Grades are synced automatically with the gradebook in D2L Brightspace.
The primary differences between these two are:
Crowdmark can assign assessments according to sections and a drag and drop functionality is available for rubric comments.
Crowdmark emails students when assessments become available and can accept more file types as well as rotate files more easily.
Gradescope allows for time extensions at the course level as well as for each assessment and allows for grading the assessments before the due date.
Based on these findings, we recommend continuing with Crowdmark, the more established and familiar tool. Although Gradescope includes some extra functionalities over Crowdmark, such as programming assessments, these functions are already handled by other tools or have not been used often or at all by faculty (e.g., CSE 231 Introduction to Programming uses Mimir for programming assignments). Crowdmark also offers fast grade sync with the D2L gradebook and the scanning and matching capabilities are more robust for in person assessments.
"The second-best way to grade exams" by ilmungo is licensed under CC BY-NC-SA 2.0
Methods
We tested both instructor and student workflows for creating and submitting assessments using Crowdmark and Gradescope integrated with a test course in D2L. Sample assignments were created for the remote assessments that included all of the available question types (i.e., upload file, enter text, multiple choice, etc.). Using separate accounts, we assigned the assessments as an instructor, submitted the assessments as a student, then returned to the instructor account to grade the assessments and sync the grades to our D2L test course.
Findings
Key Similarities:
Both Crowdmark and Gradescope offer keyboard shortcuts for faster grading; allow late submissions, group submissions, and enforced time limits; and allow for grading by question instead of by student, as well as multiple graders such as teaching assistants. Assignment submissions can include PDF or image upload, free response/short answer in a text box, or multiple-choice/multi-select questions (with bubble sheets) for online assessments. For both tools, students can upload one PDF and then drag and drop each page to match each question for remote assessments, while instructors can scan and upload student submissions in batches for in-person assessments. Both tools will also attempt to split a batch PDF into individual student submissions.
Key Differences:
Accessing Tools
Students have to log in to Crowdmark through the Crowdmark website. This link can be added to D2L Brightspace and opened in a new, external web page. The Crowdmark sign-in prompts students to select their institution and then uses students' Brightspace login. Gradescope can be added to D2L Brightspace as an External Tool in a D2L content module. This allows students to access Gradescope within D2L as an embedded website within the D2L page, instead of as an external page, and does not require any additional login.
Creating Assessments
When creating assessments in Crowdmark, instructors choose between administered (in-person) assessments that instructors will upload or assigned (remote) assessments that students will upload (Figure 1). Administered assessments can include bubble sheets for multiple-choice questions. Assigned remote assessments can include file upload, text entry responses, or multiple-choice questions (which are automatically graded).
When creating an assignment in Gradescope, the assignment type must be chosen first. Then, for the first three assignment types, the submission type is designated as either the instructor or the students (Figure 2). Although Exam/Quiz and Homework/Problem Set are offered as two different choices, they actually have the same options and essential functions. There are no further options if the instructor will be uploading the assessments, but other options are available if students will be uploading. Submissions can be variable length, where students submit any number of pages and indicate the pages where their question responses are, or fixed length, where students submit work with answers in fixed locations (like worksheets). Instructors can also allow students to view and download the assessment template if desired. Multiple-choice assignments can be created with printable bubble sheets that either instructors or students can upload. Programming assignments, which Crowdmark does not support, are also available and can be automatically or manually graded.
Figure 1: Assessment types available in Crowdmark.
Figure 2: Assessment types available in Gradescope.
Both tools allow students to take online quizzes. Both have multiple-choice and multi-select questions that are auto-graded, and both have free response and file upload questions that are NOT auto-graded. Gradescope supports short answer questions, which are auto-graded, but Crowdmark only has free response questions.
For assignments that students will upload, instructors must input text or upload a document for each individual question in Crowdmark. It is possible for an instructor to upload one document in the instructions field which contains all of the assignment questions and then simply enter numbers in the text boxes for each question, rather than the text of each question. Gradescope only requires one document to be uploaded. Each question is then identified by dragging a box around each question area on the page, and a question title must be entered.
Assigning & Distributing Assessments
For courses with several sections, Crowdmark allows assessments to be assigned to specific sections rather than the entire course. To approximate this feature in Gradescope, an instructor would have to create separate Gradescope courses or duplicate assignments and direct students to the appropriate version for their section.
Both tools allow instructors to set individual accommodations for each assignment to customize the due date, lateness penalty, or time to complete. However, Gradescope also allows course-wide extensions for students, where extensions can be added across all assignments to customize time limits (multiply time by x or add x minutes) and due dates. Crowdmark requires accommodations to be made in the submission area for each assignment; it does not support course-wide accommodations.
When an assessment is assigned and released to students, Crowdmark sends a notification email to students, whereas Gradescope only sends an in-platform notification. Gradescope does send a confirmation email when students successfully submit an assignment. Both tools give instructors the option to send a notification email when returning student work.
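The course-level extension arithmetic described above (multiply time by x or add x minutes) can be sketched in a few lines. This is only an illustration of the arithmetic; the function name and signature are our own, not Gradescope's actual API:

```python
def extended_limit(base_minutes, multiplier=1.0, extra_minutes=0):
    """Hypothetical sketch of a course-level extension: the base time
    limit is multiplied by a factor and/or extended by extra minutes."""
    return base_minutes * multiplier + extra_minutes

# A 60-minute quiz with time-and-a-half, or with 15 extra minutes:
print(extended_limit(60, multiplier=1.5))    # 90.0
print(extended_limit(60, extra_minutes=15))  # 75.0
```

Because the extension is set once at the course level, the same adjustment applies to every assignment's time limit without per-assessment edits.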
Submitting Assessments
For in-person assessments, Crowdmark can include a QR code on assignments to ensure that every page of student work is correctly matched to the appropriate student for grading. The QR code can be manually scanned and matched to each student using an app as the assignment is turned in, or instructors can use automated matching (beta) to include a form field where students write their name and ID number for automated character recognition to identify the student and match them to that assignment's QR code. Gradescope is developing a feature to create a unique label for each copy of an assignment and add that label to each page, but this is not currently available.
Submitted file types are more flexible in Crowdmark, which can support PDF, JPEG, PNG, and iPhone photos, any of which can be rotated after submission. Gradescope accepts only PDFs or JPEGs, and only PDF pages can be rotated. This means that Crowdmark offers much more flexibility in scanning software and orientation. Gradescope does have a built-in PDF scanner for iOS devices to circumvent format issues and allow seamless upload. Both tools assume that image submissions are of work associated with a single question. All work can be scanned into a single PDF for upload and each page then manually associated with each question in the assignment. In both tools, the student selects which question(s) are associated with each page(s), where multiple questions may be on a single page or multiple pages may be associated with a single question.
Crowdmark allows for group submissions when either the instructor or the students scan and upload the assessments. This ability to match multiple students to one assessment allows for two-stage exams, collaborative lab reports, or other group assignments. Gradescope only allows group submissions when students scan and upload assessments, although online assignments also allow group submissions.
Grading Assessments
Assignments can be graded immediately after students have submitted them in Gradescope. Crowdmark does not allow grading to begin until the due date has passed.
In Crowdmark, all feedback comments created for each question are stored in a comment library, which can be reordered easily by dragging a comment to the desired location. There is no limit on the number of comments that can be dragged and dropped onto each student's submission. Crowdmark comments can have positive or negative points attached to them, but specifying points is not required. Gradescope does not allow for dragging and dropping multiple comments; however, text annotations are saved for each question and several can be applied to each submission. The separate rubric comments must be associated with positive or negative points for each question. The rubric type can be either negative scoring, where the points are subtracted from 1.0, or positive scoring, where the points are added to 0. Score bounds can also be set, with a maximum of 1.0 and a minimum of 0. While it is possible to select more than one rubric comment, only one comment can be added as part of a "submission specific adjustment," which can include an additional point adjustment.
Crowdmark sends grades to D2L and automatically creates the grade item in the gradebook. Gradescope requires that the grade item be created first, then associated with an assignment, before sending grades is possible.
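The rubric arithmetic described above can be sketched in a few lines. This is an illustration of the scoring logic as we understand it, not Gradescope's implementation; the function name and defaults are our own:

```python
def rubric_score(items, scoring="negative", bounds=(0.0, 1.0)):
    """Illustrative sketch (not Gradescope's code) of per-question
    rubric scoring: negative scoring subtracts points from 1.0,
    positive scoring adds points to 0, and optional score bounds
    clamp the result."""
    total = sum(items)
    score = 1.0 - total if scoring == "negative" else 0.0 + total
    if bounds is not None:          # score bounds enabled
        lower, upper = bounds
        score = max(lower, min(upper, score))
    return score

# Two deductions of 0.25 each under negative scoring:
print(rubric_score([0.25, 0.25]))  # 0.5
# Deductions exceeding full credit are clamped at the lower bound:
print(rubric_score([0.75, 0.5]))   # 0.0
```

The clamp step is what the enable/disable score bounds option controls: without it, heavy deductions could drive a question score below 0.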
Table 1: Feature Comparison between Crowdmark and Gradescope.

Accessing Tools
Crowdmark: Must access through separate website; sign in to Crowdmark via Brightspace.
Gradescope: Can add External Tool to a D2L module so it can be accessed within D2L (embedded website in page).

Creating Assessments
Crowdmark: Upload PDF and designate where questions are for administered assessments that instructors upload (drag question number to location on page).
Gradescope: Upload PDF and designate where questions are by dragging boxes on the page for fixed-length exams/homework that students upload or administered exams/homework that instructors upload.
Crowdmark: Must input or upload individual questions manually when creating remote assessments that students upload (but instructor can upload a PDF in the directions area and just enter Q1, Q2, etc. in text boxes).
Gradescope: Must input question titles separately for variable-length submissions that students upload, but questions are designated by dragging a box over their location on the page (no need to enter the text of each question).

Assigning & Distributing Assessments
Crowdmark: Can assign assessments to a section rather than the entire course.
Gradescope: Cannot assign assessments to a section; must create a separate course or duplicate assignments and instruct students which one to submit.
Crowdmark: Add time for accommodations for each assessment only (customize due date, lateness penalty, or time to complete).
Gradescope: Add extensions at the course level and/or for each assessment (multiply time by x or add x minutes).
Crowdmark: Students always receive an email when new assignments are ready to be completed.
Gradescope: Students are not notified when new assignments are ready, but do receive an email when they have submitted an assignment, and the instructor has the option to send an email once the assignment is graded.

Submitting Assessments
Crowdmark: QR codes on printed work for in-person administered assessments (can also use an app to match assessments to students when scanning).
Gradescope: Create printouts (beta) for in-person assessments; give each student a copy of the assignment with a unique label on each page (this tool is NOT yet available).
Crowdmark: iPhone photos supported; can accept PDF, JPG, or PNG (and can rotate any file) for remote assignments submitted by students.
Gradescope: iPhone photos not supported; accepts PDF or JPG only (can only rotate PDFs) for remote assignments submitted by students; multiple files and any file type accepted for online assignments.
Crowdmark: Allows group submissions whether students or instructors are uploading assessments (i.e., match multiple students to one assessment).
Gradescope: Allows group submissions only if students are uploading assessments, but also available for online assignments.

Grading Assessments
Crowdmark: Must wait until the due date to begin grading remote assessments.
Gradescope: Online assignments can be graded immediately.
Crowdmark: Drag and drop any number of comments from the comment library for each question.
Gradescope: Can apply one previously used comment per submission separate from the rubric; cannot select or drag and drop multiple comments, but can add multiple previously used text annotations for each question.
Crowdmark: Comments can have positive or negative points attached, but specifying points is not required.
Gradescope: Comments must have associated points (positive, negative, or 0) for each question; rubric type can be changed from negative scoring (points subtracted from 1.0) to positive scoring (points added to 0), and score bounds (max of 1.0, min of 0) can be enabled or disabled.
Crowdmark: Grades sent to D2L automatically with no need to create a grade item first.
Gradescope: Grades sent to D2L automatically, but the grade item must be created first.
MSU Usage Data
We explored the usage of each tool at MSU to determine if there was a perceptible trend towards one tool over the other. The total number of courses created in each tool is fairly similar (Table 2). Interestingly, the total number of students enrolled in those courses is much higher in Crowdmark, while the number of assessments administered is higher in Gradescope.
Table 2. Tool usage in courses with at least one student and at least one assessment.
             Crowdmark    Gradescope
Courses      322          292
Students     25,322       14,398
Assessments  3,308        4,494
Crowdmark has been used by MSU instructors since 2016; Gradescope has been used since 2018. More courses were created in Crowdmark until the 2020 calendar year (Figure 3). Usage of both tools spiked in 2020, presumably due to the COVID-19-induced shift to remote teaching, and was fairly equivalent that year. For the Spring 2021 semester, more courses have been created in Gradescope. It will be interesting to observe whether this trend toward Gradescope continues as 2021 progresses or whether Crowdmark usage picks back up.
Given the disparity between the number of students and the number of classes and assessments, we explored the distribution of class sizes between the two tools (Figure 4). Both tools have been used for classes of all sizes, though the median class size is 37 for Gradescope and 63 for Crowdmark. We also explored the distribution of assessment counts between the tools (Figure 5). All but one course had 1-60 assessments created, with both tools most frequently having 2-20 assessments. Gradescope showed an interesting secondary peak of courses with 35-45 assessments. We do not have detailed information for either tool on what kinds of assessments were created, or whether all of those assessments were actually used rather than created for practice or as duplicates (e.g., available later, more accessible, or different versions for different class sections in Gradescope).
Figure 3. Number of courses created in each tool that had at least one student and at least one assessment for each calendar year since 2016.
Figure 4. Number of courses having a given class size and at least one assessment.
Figure 5. Number of classes having a given number of assessments and at least one student.
Discussion:
Our analysis showed significant functional overlap between Crowdmark and Gradescope, where either tool could be chosen with little to no impact on instructor capability. However, there are a few advantages to the way that Crowdmark handles assignment tracking, submission, and grade syncing to D2L. In particular, Crowdmark already offers a fast QR-code method for matching every page of in-person assessments to the appropriate student enrolled in the course when scanning the assessments in batches. We expect this feature will become a strong asset in the Fall 2021 semester as more classes will be on campus. If we were to choose between Crowdmark and Gradescope for continued support, we would recommend Crowdmark. Gradescope is a competitive technology, but it is still developing and refining capabilities that are already available through Crowdmark or D2L. If an instructor were to need to switch from Gradescope to Crowdmark, they should refer to the D2L self-enroll course “MSU Tools and Technologies” for detailed information and resources on using Crowdmark at MSU and closely review Table 1 to understand the key differences they may encounter. The Assessment Services team and/or Instructional Technology & Development team in the IT department are also available for one-on-one consultation on using either technology (request a consultation via the MSU Help Desk).
Authored by: Jennifer Wagner & Natalie Vandepol
Posted on: #iteachmsu

CISGS Syllabus Template (Natural Science)
Here is a syllabus template that 1) meets MSU requirements as of 2023, 2) is accessible for online documents, 3) meets or describes how to meet most Quality Matters rubric criteria, and 4) encourages an inclusive and welcoming class. This includes a thorough list of student resources, statements that reduce the hidden curriculum (such as describing the purpose of office hours), and notes that continually encourage students to seek assistance from the instructor or resources.
This template was developed for the Center for Integrative Studies in General Science (CISGS), College of Natural Science, but most of it is appropriate for other departments. The CISGS-specific aspects are highlighted in green (or search for CISGS in the file) and can therefore be easily removed. The template begins with an introduction to the instructors so that you can learn more about the development of the syllabus. It was developed in summer 2023 and is periodically updated; if you notice any issues (e.g., broken URL links), please contact Andrea Bierema (abierema@msu.edu). To see any updates since you last looked at the template, open the file, click "File," click "Version history," select any version dates since your last visit, and turn on "Show changes."
Check out the syllabus template, use whatever information you find useful, or start from the beginning by downloading and editing the template for your class!
Featured Image: curriculum by Candy Design from Noun Project (https://thenounproject.com/browse/icons/term/curriculum/) (CC BY 3.0)
Authored by: Andrea Bierema
Navigating Context
Posted on: Graduate Teaching A...

Teaching and Effective Classroom Practices for any Educator
2022-23 Graduate Teaching Assistant Preparation
The Graduate School Teaching Development Unit offers all international, new, and returning graduate teaching assistants (GTAs) an orientation and preparation program to get familiar with teaching in the U.S., as well as to learn about important policies and their implementation, supporting student success, being culturally responsive, communicating effectively, and setting healthy boundaries. In addition, accomplished educators deliver pedagogy workshops for educators.
This year, the Pedagogy Workshops and Best Practices in Teaching Sessions are offered in person at the STEM Teaching & Learning Facility (642 Cedar Rd.). Any educator can register and participate. Find the link to register for any of the workshops underneath the table with all workshop titles.
Workshops Round 1 (Select one)
Time (all ET)
Workshop Title
9:00 – 10:30 am
Room 2130
Preparing for Your First Day of Teaching & Cultivating Student Learning (Presenters: Stefanie Baier and Ellen Searle)
9:00 – 10:30 am
Room 2202
Promoting Student Engagement in Large Lecture-Based Courses
(Presenter: Kirstin Parkin)
10:30 – 11:00 am
BREAK
Workshops Round 2 (Select one)
11:00 – 12:30 pm
Room 2130
“What’s in Your Syllabus?”: Creating and Using Syllabi for Successful Teaching and Learning
(Presenter: Mary-Beth Heeder)
11:00 – 12:30 pm
Room 2202
Developing a Plan for Effective Grading: Technology, Communication, and Time-Management (Presenters: Seth Hunt and Chase Bruggeman)
12:30 – 1:00 pm
BREAK
Workshops Round 3 (Select one)
1:00 – 2:30 pm
Room 2130
Scientific Teaching and Assessing What’s Important in STEM Learning (Presenter: Diane Ebert May)
1:00 – 2:30 pm
Room 2202
Navigating Challenges: How to Be a Trauma-Informed Educator
(Presenter: Hima Rawal)
Register for your Workshops HERE
For more information about Graduate Student Teaching Professional Development Opportunities, go to https://grad.msu.edu/gtap and check the Graduate School calendar for sessions throughout the year.
Authored by: Stefanie Baier & the GTA Teaching Learning Community, Graduate School
Pedagogical Design
Posted on: #iteachmsu
Labor-Based Grading Contracts: Building Equity and Inclusion in the Compassionate Writing Classroom
By Asao B. Inoue
Copy edited by Don Donahue. Designed by Mike Palmquist.
In Labor-Based Grading Contracts, Asao B. Inoue argues for the use of labor-based grading contracts along with compassionate practices to determine course grades as a way to do social justice work with students. He frames this practice by considering how Freirean problem-posing led him to experiment with grading contracts and explore the literature on grading contracts. Inoue offers a robust Marxian theory of labor that considers Hannah Arendt's theory of labor-work-action and Barbara Adam's concept of "timescapes." The heart of the book details the theoretical and practical ways labor-based grading contracts can be used and assessed for effectiveness in classrooms and programs. Inoue concludes the book by moving outside the classroom, considering how assessing writing in the socially just ways he offers in the book may provide a way to address the violence and discord seen in the world today.
Access FULL TEXT in attachment
Inoue, Asao B. (2019). Labor-Based Grading Contracts: Building Equity and Inclusion in the Compassionate Writing Classroom. The WAC Clearinghouse; University Press of Colorado. https://doi.org/10.37514/PER-B.2019.0216.0
Accessed via https://wac.colostate.edu/books/perspectives/labor/?fbclid=IwAR1ZJWZbLYuAU4aQhQ9xlBiIzbX60bGg_VGQwwnZImFUnofX1L5Il2Ec53w
Posted by: Makena Neal
Assessing Learning
Posted on: Ungrading (a CoP)
Alternative Grading Conference
It's virtual and inexpensive. I found last year's meeting helpful and insightful.
https://thegradingconference.com/
Posted by: Laura M Markham
Assessing Learning
Posted on: Ungrading (a CoP)
Multiple stories and sentiments were generously shared by participants in the 4/4 Beyond Buzzwords: Ungrading workshop (thank you for your vulnerability and candor) about the varied ways students react to, and make assumptions and inferences about, their instructors after the adoption of ungrading and ungrading-inspired practices.
This article (linked below), "Academe Has a Lot to Learn About How Inclusive Teaching Affects Instructors" by Chavella Pittman and Thomas J. Tobin (The Chronicle of Higher Education, February 7, 2022), will likely be of interest to you. By recognizing and acknowledging the power held by some identities (core, chosen, and given) but not by others, it complicates the idea that all educators have the same "power and authority" to give up or share in order to increase learners' sense of ownership and agency in the classroom. "What if you have neither the institutional authority (a full-time or tenure-track job) nor the dominant-culture identity (by virtue of your race, gender, and/or ability) that usually go hand in hand with being treated as a respected, powerful presence in the college classroom?... In urging faculty members to adopt inclusive teaching practices, we need to start asking if they actually can — and at what cost," say Pittman and Tobin.
Take-aways shared in this piece include:
1. Understand that your classroom choices may unintentionally affect or undercut a colleague
2. Discuss in your department the issue of bias in students' rating of teaching
3. Respect the variability among your colleagues, as well as among your students
4. Find trained help
"Share your stories, experiences, and thought processes as you negotiate your instructor role in the classroom..." iteach.msu.edu is one space where we can continue to help "normalize the conversation about instructor identity and status as a necessary element in the adoption of inclusive design and teaching practices".
https://www.chronicle.com/article/academe-has-a-lot-to-learn-about-how-inclusive-teaching-affects-instructors
Posted by: Makena Neal
Pedagogical Design
Posted on: Ungrading (a CoP)
The Center for Integrative Studies in the Arts and Humanities invites you to attend a workshop on Alternate Grading on April 21st, from 10:00 to 11:30 am via Zoom.
We are honored to welcome Prof. Nicole Coleman of Wayne State University to run the workshop. If you are interested in learning ways to prioritize learning over grading and to make assessments more meaningful for students, you may want to consider a new grading system. Coleman will lead an interactive program on her experiences with teaching courses in both the Specs Grading and Ungrading structures. She will provide some information on how each system works and the theory behind them. She will then guide educators in adjusting an assignment or a syllabus to work with these methods. Please bring a rubric and/or a syllabus to the session to be able to participate fully in this workshop.
Posted by: Makena Neal
Pedagogical Design
Host: CTLI
"Welcome to My Classroom" Series: Dr. Brittany Dillman
Please join the Center for Teaching and Learning Innovation as we showcase some of MSU's educators and the great work they are doing. Step into a virtual space alive with enthusiasm and curiosity as passionate educators unveil their most effective teaching methods. From technology integration and active learning strategies to inventive assessments, each presenter offers a glimpse into their teaching journey, providing attendees with a collection of adaptable ideas. Engage in discussions, ask questions, and connect with peers to cultivate a collaborative spirit that transcends disciplines.
The fourth in the Welcome to My Classroom series will feature Dr. Brittany Dillman, educator and Director of Graduate Certificate Programs for the College of Education’s Master of Arts in Educational Technology. Brittany loves working with the MAET program, especially her roles advising all GC students, developing curriculum for GC and MAET courses, and teaching both online and hybrid courses. She is also a CTLI Affiliate, sharing her experiences and expertise in ungrading. For this Welcome to My Classroom, Brittany will showcase a single assignment, from design to grading, as an example of her educator practice.
Brittany has a doctorate from the Educational Psychology and Educational Technology program at Michigan State University. She is curious about teachers’ decision-making processes. She taught middle school mathematics for a decade prior to coming to MSU. Brittany is organized by nature and loves to alphabetize and color code. She loves being silly with her family, traveling, making books on Shutterfly, teaching MAET courses, and working with MAET students. Learn more about Brittany at her website.
Our Welcome to My Classroom series aims to be a catalyst for continuous improvement, uniting educators in their commitment to elevate the art of teaching. Join us in celebrating the dedication and creativity that drive education forward, as we learn from one another and collectively enrich the learning experience for both educators and students alike.
The "Welcome to My Classroom" series will function like a pedagogy and practice show and tell where educators from throughout MSU's ecosystem share something from their teaching and learning practice. Examples of an educator's showcase could include a walk through of a specific activity or assignment, sharing out the integration of a particular educational technology, describing their process of redesigning a learning experience, and more!
*for any educator interested in hosting a Welcome to My Classroom, please contact Makena Neal at mneal@msu.edu
Navigating Context
Host: CTLI
Community Engaged Learning In the Classroom: Rubrics, Reflections, and Resources
Date: Monday, November 13, 2023 from 9:30 - 11:00 AM
Location: Via Zoom
Registration Deadline: Friday, November 10, 2023
This session will discuss creative examples, strategies for grading, and best practices for leading students to reflect upon and make meaning of their community-engaged learning experiences. Hear from faculty peers, dialogue with colleagues, and leave with new ideas to enhance student learning and complement your teaching practices.
This workshop is hosted by the Center for Community Engaged Learning (CCEL) through MSU's office of University Outreach and Engagement. Community Engaged Learning is a teaching and learning strategy that integrates meaningful community partnerships with instruction and critical reflection to enrich the student learning experience, teach civic and social responsibility, and strengthen communities. CCEL is committed to supporting students, faculty/staff, and community partners in many ways.
Register for Community Engaged Learning In the Classroom: Rubrics, Reflections, and Resources by Friday, November 10, 2023.
Navigating Context