We found 181 results that contain "grading"
Posted on: #iteachmsu

Instruction, Feedback, Assessments & Centering Students in Remote Environments
This playlist is a growing collection of content aimed at supporting educators as they navigate ongoing shifts in teaching environments, grading procedures, and other uncertainties that result from ongoing pandemics, all while keeping student success at the core of their work.
ASSESSING LEARNING
Posted on: #iteachmsu
A "Complete" Guide to Writing Syllabi: A Constant Cycle
The syllabus in a college class serves as the first impression between a course and its students. It often wears many hats, acting as a schedule, a list of rules, a summary of course policies, a semi-grading rubric, and various other things depending on its author. Because of the heavy lifting it does for a course and its structure, a plethora of research has been conducted on its value, and universities often hold seminars each year on the process of creating and drafting syllabi for their staff. To understand how students and instructors view the role of syllabi in the classroom, authors Gauthier, Banner, and Winer introduce a framework in their piece “What is the syllabus for? Revealing tensions through a scoping review of syllabus uses”
In it, they identify nine interconnected uses, which are then categorized into three primary purposes or tools: an Administrative Tool, a Learning Tool, and a Teaching Tool. The goal of this project is to take their writing and configure the information into a writing guide that helps instructors write, develop, and improve syllabi for their own courses. While this may appear to be designed as a developmental tool (because, in part, it is), my goal is that this project truly captures the necessity of treating the creation of a syllabus as a fluid, recursive, and reflective process. As we develop as instructors, and as the student bodies we teach change over time, so must our syllabi change with them.
PEDAGOGICAL DESIGN
Posted on: #iteachmsu

Grading & Giving Feedback
Edit a Question During its Availability
Occasionally, a test question will need to be edited while an exam is in progress.
Quizzes – Manually Grade a Quiz - Instructor
Short answer questions, although auto-graded by D2L, should be double-checked for grading accuracy.
D2L Assessment Analytics
Examining quiz question statistics can help instructors determine if a question is too easy, too challenging, or needs editing for clarification.
The following is a quick guide for D2L Quiz and Grade Item statistics to help you monitor and improve your assessment questions and results.
D2L Quiz Statistics
To see how students performed overall on each of the quizzes, in your own course go to Assessments > Quizzes > Statistics (click on Statistics from the tab view across the top).
This list displays all of your course quiz averages.
Click on a quiz to see more details including User Stats, Question Stats, and Question Details.
Question Stats
The Question Stats list the Standard Deviation, Discrimination Index, and Point Biserial value for each question.
You can click on the link, "What do the statistics on this page mean?" above the table in your course to learn more. The information is also copied below.
What do the statistics on this page mean?
All statistics are calculated based on each user’s first attempt on the quiz. If a question is changed after attempts have been made, only the attempts on the newest version of the question are included in the statistics (i.e., first attempts made before a question was changed are not included in the statistics for that question).
STANDARD DEVIATION
The standard deviation indicates how much scores vary from the average, ranging from 0% to 100%. A high standard deviation indicates that scores are spread out from the average, whereas a low standard deviation indicates that scores are close to the average.
DISCRIMINATION INDEX
The discrimination index indicates how well a question differentiates between high and low performers. It can range from -100% to 100%, with high values indicating a “good” question, and low values indicating a “bad” question.
POINT BISERIAL CORRELATION COEFFICIENT
The point biserial correlation coefficient is an analysis only applied to multiple choice and true/false question types that have only one answer with weight 100%, and all others with weight 0%.
Similarly to the discrimination index, the point biserial correlation coefficient relates individuals’ quiz scores to whether or not they got a question correct. It ranges from -1.00 to 1.00, with high values indicating a “good” question, and low values indicating a “bad” question.
*Note that only first attempts are included in that question's statistics.
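If you export raw attempt data (for example, each student's 0/1 result on a question plus their total quiz score), you can reproduce comparable figures yourself. The Python sketch below is a minimal illustration of the conventional formulas, not D2L's exact algorithm; the variable names, the toy data, and the top/bottom 27% split used for the discrimination index are assumptions.

```python
from statistics import pstdev, mean

def standard_deviation(question_scores):
    """Spread of per-question scores (e.g., percentages) around their average."""
    return pstdev(question_scores)

def discrimination_index(correct_flags, total_scores, group_fraction=0.27):
    """Difference in correct rates between top and bottom scorers
    (a common upper/lower-group method; D2L may compute it differently)."""
    ranked = sorted(zip(total_scores, correct_flags), key=lambda pair: pair[0])
    k = max(1, int(len(ranked) * group_fraction))
    bottom = [flag for _, flag in ranked[:k]]
    top = [flag for _, flag in ranked[-k:]]
    return mean(top) - mean(bottom)  # -1 to 1; D2L reports this scale as a percentage

def point_biserial(correct_flags, total_scores):
    """Correlation between answering this question correctly (0/1)
    and the overall quiz score; ranges from -1.00 to 1.00."""
    n = len(correct_flags)
    p = sum(correct_flags) / n          # proportion who answered correctly
    q = 1 - p
    sd = pstdev(total_scores)
    if sd == 0 or p in (0, 1):
        return 0.0                      # undefined when there is no variation
    mean_correct = mean(s for s, flag in zip(total_scores, correct_flags) if flag)
    mean_all = mean(total_scores)
    return (mean_correct - mean_all) / sd * (p / q) ** 0.5

# Toy data: 0/1 correctness on one question and total quiz scores for six students.
correct = [1, 1, 0, 1, 0, 0]
totals = [92, 88, 75, 81, 60, 55]
print(discrimination_index(correct, totals))  # 1.0 with this toy data
print(point_biserial(correct, totals))        # roughly 0.87 with this toy data
```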
Question Details
This tab will show you the summary of student responses for each question. If you notice a low or negative value for the Point Biserial or Discrimination Index, you may want to investigate the question. It could indicate a badly worded question or an improperly keyed answer.
For more, view the video tutorial on Generating Reports in D2L Learning Environment. Currently, the statistics do not display for random "pool item" question types. Contact the MSU Service Desk to check on obtaining reports through the Data Hub.
Grade Item Statistics
To view grade item statistics in your own course, go to Assessments > Grades, then use the pull-down menu by a grade item title and select Statistics to display Class and User Statistics. If you have a grade scheme set up to display, you will also see the grade distribution chart on the page.
Working with student data
Keep the MSU Institutional Data Policy in mind when storing data and making reports public in order to protect the security and confidentiality of student data.
Read more about best practices for handling data at secureit.msu.edu/data, from MSU IT Services – Academic Technology.
Addressing Issues of Academic Misconduct
What should you do if you discover cheating in your course? Follow the link to find out more.
What is an Academic Dishonesty Report
If you give a penalty grade as a result of academic misconduct, you must submit an Academic Dishonesty Report (ADR) to the university. See the link above as an example.
Authored by: Casey Henley & Susan Halick
Assessing Learning
Posted on: #iteachmsu

Crowdmark: Deliver and Grade Assessments
What is Crowdmark?
Crowdmark is an online collaborative grading and analytics platform that helps educators assess student work. The platform allows for easy distribution and collection of student assignments, offers tools for team grading with rubrics, and streamlines the process for providing rich feedback to students.
How does Crowdmark improve the assessment experience?
Crowdmark allows instructors to deliver assignments and exams to students online with a due date and time limit, if desired. Students complete the assessment digitally or scan their handwritten work (as an image or PDF) and upload their completed work using a computer or mobile phone for evaluation on Crowdmark.
Graders can make annotations on the pages, add comments including hyperlinks, embedded images, mathematical and chemical notations, and attach scores according to a grading scheme/rubric. After evaluation is complete, the graded assessments can be electronically returned to students with the click of a button. Crowdmark also provides tools for visualizing student performance and the data can be exported in a convenient format.
Crowdmark is now integrated with MSU’s instance of D2L Brightspace. This integration provides features such as roster synchronization, team synchronization, and the ability to export grades from Crowdmark into the D2L gradebook.
What limitations or alternatives should I consider?
The grading rubrics and comment library make grading more consistent and efficient; however, assessments are primarily graded manually. For auto-graded questions, you may want to consider using the MSU Scoring Office tool, WebAssess™ Assessment Solutions, in Digital Desk or D2L Quizzes. Gradescope is another alternative similar to Crowdmark.
Where do I start if I want to use it?
See Accessing Crowdmark through D2L, or navigate to the Crowdmark sign-in page and select Michigan State University.
Where can I find more information?
MSU D2L Help:
Getting Started with Crowdmark
Crowdmark Documentation:
Introduction to Crowdmark
Getting Started for Instructors
D2L and Crowdmark
Crowdmark support
Authored by: Susan Halick
Assessing Learning
Posted on: #iteachmsu

5 Innovative Grading Strategies: A Quick Guide
Introduction:
As educators seek to enhance student engagement and learning outcomes, exploring innovative grading strategies can offer fresh perspectives and effective solutions. Here’s a concise overview of five innovative grading practices:
1. Transparent Grading:
What is it? Transparent grading involves clearly defining and communicating grading criteria, processes, and feedback to students.
Key Elements: Detailed rubrics, open communication, student involvement.
Benefits: Enhanced understanding, improved performance, increased trust.
2. Self-Grading:
What is it? Self-grading allows students to assess their own work, promoting reflection and autonomy.
Key Elements: Self-assessment, reflection, feedback loops.
Benefits: Empowers students, promotes deeper learning, supports self-regulation.
3. Peer Grading (Peer Review):
What is it? Peer grading involves students assessing each other’s work, enhancing collaboration and responsibility.
Key Elements: Peer evaluation, feedback exchange, critical thinking.
Benefits: Deepens understanding, builds skills, fosters collaboration.
4. Gameful or Gamified Grading:
What is it? Gameful grading integrates game design elements, such as points, badges, and leaderboards, into the grading process.
Key Elements: Gamification, student choice, immediate feedback.
Benefits: Increases engagement, enhances mastery, supports skill development.
5. Ungrading:
What is it? Ungrading minimizes or eliminates traditional grades in favor of detailed feedback and alternative assessments.
Key Elements: Detailed feedback, self-assessment, focus on growth.
Benefits: Promotes deep learning, reduces stress, supports equity.
Explore these strategies to boost student engagement and learning outcomes!
Authored by: Monica L. Mills
Assessing Learning
Posted on: #iteachmsu

A Quick Guide to Self-Grading
Overview:
Self-grading involves students assessing their own work, which fosters autonomy, reflection, and ownership of the learning process. It aligns with student-centered teaching by focusing on self-assessment and personal growth.
Key Aspects:
Active Student Involvement: Students evaluate their work, which enhances their engagement and investment in the learning process.
Enhanced Metacognition: Encourages students to reflect on their learning, identify strengths and weaknesses, and set goals for improvement.
Ownership and Responsibility: Increases students' ownership of their learning and motivation to improve their work.
Personalized Feedback: Allows students to provide immediate and relevant feedback to themselves.
Increased Engagement: Self-assessment can lead to greater involvement and commitment to learning.
Development of Critical Thinking: Helps students develop critical thinking and evaluation skills.
Alignment with Learning Objectives: Assists students in understanding and aligning their work with course objectives.
Formative Assessment: Provides insights into students' learning progress and areas for development.
Benefits:
Empowerment: Students feel more in control of their learning journey.
Deep Learning: Promotes deeper engagement with material and better retention.
Self-Regulation: Encourages independent learning and self-regulation.
Equity: Provides a more personalized and equitable assessment process.
Implementation Tips:
Provide clear grading criteria and rubrics to guide self-assessment.
Include reflection activities where students analyze their work and identify areas for growth.
Create feedback loops where students compare their self-assessments with peer and instructor feedback.
Allow revisions based on self-assessment and feedback to encourage continuous improvement.
Offer training and support to help students develop effective self-assessment skills.
Resources:
Five Innovative Grading Strategies (iteach article)
A student-centered approach to Grading (CTLI Workshop Slides)
Why Students Should Be Allowed to Grade Themselves (Inside Higher Ed article)
Self-Assessment (Center for Teaching Innovation Cornell University)
How to Shift to Self-Grading in English Classes (Edutopia)
Authored by: Monica L. Mills
Assessing Learning
Posted on: Instructional Design
New Quick Video Tip: Final Grades
New Featured Resource! Quick Video Tip
Create a 4.0 GPA Scheme to Submit Final Grades from D2L
Reference this brief 4-minute video to learn how to add a 4.0 GPA scheme to the gradebook and submit grades to the Registrar's Office directly from D2L Brightspace. It's a time-saver that can help you efficiently manage end-of-semester final grade submission.
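If it helps to see the underlying idea before watching the video, the short Python sketch below models a grade scheme as a set of percentage cutoffs mapped onto a 4.0 scale. The cutoffs shown are placeholder assumptions, not an official MSU scheme; substitute the thresholds from your own syllabus when you build the scheme in D2L.

```python
# Illustrative model of a 4.0 grade scheme: percentage cutoffs mapped to grades.
# The cutoffs below are placeholders only -- use your own syllabus values.
GRADE_SCHEME = [
    (90, 4.0),
    (85, 3.5),
    (80, 3.0),
    (75, 2.5),
    (70, 2.0),
    (65, 1.5),
    (60, 1.0),
    (0, 0.0),
]

def final_grade(percent: float) -> float:
    """Return the grade for the highest cutoff the percentage meets or exceeds."""
    for cutoff, grade in GRADE_SCHEME:
        if percent >= cutoff:
            return grade
    return 0.0

print(final_grade(87.2))  # 3.5 with these placeholder cutoffs
```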
Posted by: Lindsay Tigue
Posted on: #iteachmsu

A Quick Guide to Transparent Grading
Overview:
Transparent grading involves clarifying and sharing grading criteria, processes, and feedback with students. This approach ensures that students understand how their work is assessed and how they can meet the expectations set for their assignments.
Key Aspects:
Clear Criteria and Standards: Develop and share detailed rubrics and grading criteria to guide students.
Open Communication: Discuss grading policies and provide ongoing, transparent feedback.
Student Involvement: Include students in creating or refining grading criteria to enhance their understanding.
Consistent Application: Apply grading standards consistently to maintain fairness and equity.
Feedback Focus: Provide specific, actionable feedback that helps students improve their performance.
Transparency in Grade Calculation: Clearly explain how final grades are derived from assignment scores and criteria.
Accessible Information: Make grading criteria and feedback easily accessible through the LMS or course materials.
Benefits:
Enhanced Understanding: Students gain clarity on expectations and reduce confusion about their grades.
Improved Performance: Detailed guidelines and feedback help students focus on areas for improvement.
Increased Trust: Builds trust between students and instructors by making the grading process transparent and fair.
Greater Accountability: Ensures that grading practices are consistent and equitable.
Implementation Tips:
Provide detailed rubrics for major assignments and share them early in the course (check out this AI-powered Rubric Generator as a starting point).
Regularly review grading criteria with students and encourage them to ask questions.
Use examples to demonstrate different levels of performance according to the rubric.
Offer feedback on assignments promptly and discuss it with students during office hours.
Resources:
Five Innovative Grading Strategies (iteach article)
Transparent Assignment Design (iteach article)
Transparent Assignment Design (CTLI Quick Guide)
Transparent Assignment Design (CTLI workshop slides)
A Student-Centered Approach to Grading (CTLI workshop slides)
CTLI Student-Centered Grading Resources
Fair Assignments: Designing Transparent Assignments via the Grading Criteria (iteach article)
TILT Higher Ed Examples and Resources
Authored by: Monica L. Mills
Assessing Learning
Posted on: #iteachmsu

How to Grade Quiz or Exam Questions
Purpose
Multiple choice questions are auto-graded in D2L; however, other types of questions need to be graded by hand. Use this document to learn how to grade questions in a quiz or exam.
Log into d2l.msu.edu
Select the course where you want to add the questions.
Click on Quizzes in the Assessment Tab in the navigation bar or from the Course Admin list
Click on the down arrow next to the quiz name, and choose “Grade”
If you only need to review questions for one individual, you can click on that student in the user list. If you need to review questions for all students, click on the “Questions” tab
For grading written answers, choose “Grade Individual Responses.” You can choose to have names removed from questions by checking “Blind Marking”
If you need to regrade all attempts at a question, for example, if there was an error and you need to give all students full credit, use “Update All Attempts”
The list of questions will indicate the type of question
WR: Written response
MC: Multiple choice
SA: Short answer
FIB: Fill in the blank
Choose the question you want to grade.
You can choose to grade more than one student per page using the numerical dropdown
The answer you submitted as correct when you wrote the question will display in blue
Grade the student’s response, set the points earned, and provide feedback in the feedback box
Move on to the next student
Save your feedback and scores when finished
Authored by: Casey Henley & Susan Halick
Assessing Learning
Posted on: IT - Educational Te...
New Quick Video Tip! Final Grades
New Featured Resource! Quick Video Tip
Create a 4.0 GPA Scheme to Submit Final Grades from D2L
Reference this brief 4-minute video to learn how to add a 4.0 GPA scheme to the gradebook and submit grades to the Registrar's Office directly from D2L Brightspace. It's a time-saver that can help you efficiently manage end-of-semester final grade submission.
Posted by: Lindsay Tigue
Posted on: #iteachmsu
Labor-Based Grading Contracts: Building Equity and Inclusion in the Compassionate Writing Classroom
By Asao B. Inoue
Copy edited by Don Donahue. Designed by Mike Palmquist.
In Labor-Based Grading Contracts, Asao B. Inoue argues for the use of labor-based grading contracts along with compassionate practices to determine course grades as a way to do social justice work with students. He frames this practice by considering how Freirean problem-posing led him to experiment with grading contracts and explore the literature on grading contracts. Inoue offers a robust Marxian theory of labor that considers Hannah Arendt's theory of labor-work-action and Barbara Adam's concept of "timescapes." The heart of the book details the theoretical and practical ways labor-based grading contracts can be used and assessed for effectiveness in classrooms and programs. Inoue concludes the book by moving outside the classroom, considering how assessing writing in the socially just ways he offers in the book may provide a way to address the violence and discord seen in the world today.
Access FULL TEXT in attachment
Inoue, Asao B. (2019). Labor-Based Grading Contracts: Building Equity and Inclusion in the Compassionate Writing Classroom. The WAC Clearinghouse; University Press of Colorado. https://doi.org/10.37514/PER-B.2019.0216.0
Accessed via https://wac.colostate.edu/books/perspectives/labor/?fbclid=IwAR1ZJWZbLYuAU4aQhQ9xlBiIzbX60bGg_VGQwwnZImFUnofX1L5Il2Ec53w
Posted by: Makena Neal
Assessing Learning
Posted on: Ungrading (a CoP)
Alternative Grading Conference
It's virtual and inexpensive. I found last year's meeting helpful and insightful.
https://thegradingconference.com/
Posted by: Laura Markham
Assessing Learning
Posted on: Ungrading (a CoP)
The Center for Integrative Studies in the Arts and Humanities invites you to attend a workshop on Alternate Grading on April 21st, from 10 to 11:30 a.m. via Zoom.
We are honored to welcome Prof. Nicole Coleman of Wayne State University to run the workshop. If you are interested in learning ways to prioritize learning over grading and to make assessments more meaningful for students, you may want to consider a new grading system. Coleman will lead an interactive program on her experiences with teaching courses in both the Specs Grading and Ungrading structures. She will provide some information on how each system works and the theory behind them. She will then guide educators in adjusting an assignment or a syllabus to work with these methods. Please bring a rubric and/or a syllabus to the session to be able to participate fully in this workshop.
Posted by: Makena Neal
Pedagogical Design
Posted on: Ungrading (a CoP)
Multiple stories and sentiments were generously shared by the 4/4 Beyond Buzzwords: Ungrading workshop participants (thank you for your vulnerability and candor) about the varied ways in which students react to, and make assumptions and inferences about, their instructors after the adoption of ungrading and ungrading-inspired practices.
This article (linked below), "Academe Has a Lot to Learn About How Inclusive Teaching Affects Instructors," by Chavella Pittman and Thomas J. Tobin, published in The Chronicle of Higher Education on February 7, 2022, will likely be of interest to you. It starts out by recognizing and acknowledging the power held by some identities (core, chosen, and given) but not by others, which complicates the idea that all educators have the same "power and authority" to give up or share in order to increase learners' sense of ownership and agency in the classroom. "What if you have neither the institutional authority (a full-time or tenure-track job) nor the dominant-culture identity (by virtue of your race, gender, and/or ability) that usually go hand in hand with being treated as a respected, powerful presence in the college classroom?... In urging faculty members to adopt inclusive teaching practices, we need to start asking if they actually can — and at what cost," say Pittman and Tobin.
Take-aways shared in this piece include:
1. Understand that your classroom choices may unintentionally affect or undercut a colleague
2. Discuss in your department the issue of bias in students' ratings of teaching
3. Respect the variability among your colleagues, as well as among your students
4. Find trained help
"Share your stories, experiences, and thought processes as you negotiate your instructor role in the classroom..." iteach.msu.edu is one space where we can continue to help "normalize the conversation about instructor identity and status as a necessary element in the adoption of inclusive design and teaching practices".
https://www.chronicle.com/article/academe-has-a-lot-to-learn-about-how-inclusive-teaching-affects-instructors
Posted by: Makena Neal
Pedagogical Design
Posted on: Center for Teaching...
I had the good fortune to attend an enlightening workshop today: "Student-Centered Approach to Grading" presented by Jeremy Van Hof and Monica Mills. Among the resources they provided was this treasure trove: https://www.gettoby.com/p/jp9xrk523nt1
Posted by: David V. Howe
Assessing Learning
Posted on: GenAI & Education
AI Commons Bulletin 2/10/2025
🚨 CSU Launches “AI Commons” – Sound Familiar?
The California State University (CSU) system just rolled out CSU AI Commons, a system-wide hub for AI tools, training, and research. Backed by Big Tech partnerships, it focuses on faculty development, student literacy, and workforce acceleration. BUT: AI strategy isn’t just about resources—it’s about who controls the narrative. With corporate-backed AI in higher education, what happens to independent faculty innovation?
Learn More: https://genai.calstate.edu/
🔍Tracking AI Policies in Higher Ed
Embry-Riddle Aeronautical University has compiled a Padlet featuring AI policies and guidelines from institutions worldwide. This evolving resource provides insight into how different universities are shaping their AI approaches.
Learn More: https://padlet.com/cetl6/university-policies-on-generative-ai-m9n7wf05r7rdc6pe
📚 AI Submissions Outperform Students in Recent Study
A PLOS ONE study found that 94% of AI-generated assignments went undetected, with grades averaging half a grade higher than those of real students. There was also an 83.4% chance AI submissions would outperform a random selection of student work across modules.
Learn More: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0305354#:~:text=The%20%27Turing%20Test%27%20is%20now,a%20predefined%20set%20of%20rules
Blurry Lines in AI and Assessment
A study in Assessment & Evaluation in Higher Education highlights student and educator confusion over acceptable AI use in assessments. Many rely on personal judgment or Grammarly analogies. The authors propose the Dynamic Educational Boundaries Model to embed clear AI-use guidelines directly into assessments.
Learn More: https://doi.org/10.1080/02602938.2025.2456207
Bulletin items compiled by MJ Jackson and Sarah Freye with production assistance from Lisa Batchelder. Get the AI-Commons Bulletin on our Microsoft Teams channel, at aicommons.commons.msu.edu, or by email (send an email to aicommons@msu.edu with the word “subscribe”).
Posted by: Sarah Freye
Posted on: #iteachmsu
This article was shared in an academic group I'm a part of on a social networking site... its framing is within the Canadian higher education setting, but the message about student mental health is relevant for all.
Here are a couple of thoughts from the article worth sharing if you can't take the time to read the entire piece:
"To fully understand the present crisis, one has to appreciate a fundamental and often overlooked fact: higher education is not what it used to be. Not only do we have a more diverse student body with equally diverse psychiatric needs, we also have an academic culture that has changed profoundly in the past six decades, making the university experience more stressful than it once was. The classic liberal conception of postsecondary institutions as places where young people take a kind of sabbatical from life—read the great books, engage in endless debates, and learn to see themselves as citizens—has given way to a new model, more narrowly vocational in focus."
"By prioritizing high achievers, Henderson argues, universities are selecting not only for diligent candidates but also for those who view scholastic success as central to their identities. For such students, a bad grade can be destabilizing. When that grade appears on an exam worth 80 percent of a final course mark, or when it comes from a harried teaching assistant who doesn’t offer in-depth feedback, students can feel like they are losing a game whose rules were never explained. Imagine being told all your life that you are ahead of the pack and that you must stay there, both to secure a stable future and to get a return on the investments that family members or granting agencies have made on your behalf. Then, imagine falling behind, for reasons you don’t understand, at the precise moment when staying on top feels more critical than ever before. Furthermore, imagine that you are contending with profound loneliness, past trauma, and financial insecurity, all while working a part-time job with the usual mix of erratic hours."
"Such stressors can lead to sleep disruption, irregular eating, and substance abuse—all of which correlate with mental illness—or they can trigger preexisting psychiatric conditions. They can deplete reserves of neurochemicals, like dopamine and serotonin, needed to sustain a sense of well-being, or they can flood the brain and body with cortisol, the stress hormone, which, in excess, can push people into near-constant states of anxiety, making it difficult to conceptualize daily challenges in a proportionate or healthy way. They can also lead to identity confusion and an acute sense of shame."
Inside the Mental Health Crisis Facing College and University Students by Simon Lewsen : https://thewalrus.ca/inside-the-mental-health-crisis-facing-college-and-university-students/?fbclid=IwAR12PokSFpCrBo1NmtpNYoGEohKf3csYHQc9X8LwFAdNPTtBF_zIRbEqwhs
Posted by: Makena Neal
Navigating Context
Posted on: Reading Group for S...
My background in Scandinavian languages and literature keeps rearing its head in various ways after many years, specifically when it comes to folklore, magical tales, and perilous journeys toward maturation. In a way, I have become a pedagogical Ash Lad of sorts since coming to MSU in 2015. My journey, an ongoing quest if you will, has been to find that one magical key which will unlock the enchanted door to greater student interest and involvement in their general education course requirements.
Those of us who teach these courses know that, too often, many students view gen. ed. requirements as hoops to jump through. Something they must satisfy to graduate. Subjects that, they feel, have little to do with the real world, their intended majors, or envisioned careers. Scheduling and convenience more than genuine interest seem to be the determining factor for many students when they choose to enroll in such courses. Put the head down, muddle through, and get it done with as little effort as possible.
But there might be another way.
In my own ongoing quest to motivate and engage the students in my various IAH courses more effectively, I have come back to Bloom's Taxonomy again and again since first learning about it in the 2016-2017 Walter and Pauline Adams Academy cohort. More specifically, it is Bloom's Digital Taxonomy, revised by various scholars for use with 21st century students who exist in an increasingly digital world, that has been especially useful when it comes to designing assessments for my students.
For those who are interested, there are all kinds of sources online -- journal article pdfs, infographics, Youtube explainer videos, etc. -- that will be informative and helpful for anyone who might be interested in learning more. Just search for 'Bloom's Digital Taxonomy' on Google. It's that easy.
For my specific IAH courses, I organize my students into permanent student learning teams early each semester and ask them to create three collaborative projects (including a team reflection). These are due at the end of Week Five, Week 10, and Week 14. Right now, the projects include:
1) A TV Newscast/Talkshow Article Review Video in which teams are asked to locate, report on, review, and evaluate two recent journal articles pertinent to material read or viewed during the first few weeks of the course.
2) A Readers' Guide Digital Flipbook (using Flipsnack) that reviews and evaluates the usefulness of two books, two more recent journal articles, and two blogs or websites on gender and sexuality OR race and ethnicity within the context of specific course materials read or viewed during roughly the middle third of the course.
3) An Academic Poster (due at the end of Week 14) in which student teams revisit course materials and themes related to gender, sexuality, race, ethnicity, class, and identity. In addition, students are asked to examine issues of power, marginalization, disparity, equity, etc. in those same sources and look at how these same issues affect our own societies/cultures of origin in the real world. Finally, student teams (in courses as diverse as Film Noir of the 1940s and 50s, Horror Cinema, and the upcoming Contemporary Scandinavian and Nordic Authors) are asked to propose realistic, concrete solutions to the social problems facing us.
Anecdotally, student feedback has been largely very favorable so far. Based on remarks in their team reflections this semester (Fall 2021), students report that they enjoy these collaborative, creative projects and feel like they have considerable leeway to shape what their teams develop. Moreover, they also feel that they are learning quite a bit about the material presented as well as valuable 21st century employability skills in the process. Where their all-important assignment grades are concerned, student learning teams in my courses are meeting or exceeding expectations with the work they have produced for the first two of three team projects this semester, according to the grading rubrics currently in use.
Beginning in Spring 2022, I plan to give my student teams even more agency in choosing how they are assessed and will provide two possible options for each of the three collaborative projects. Right now, these will probably include:
Project #1 (Recent Journal Article Review and Evaluation) – Powtoon Animated TV Newscast OR Infographic
Project #2 (Review and Evaluation of Digital Sources on Gender and Sexuality OR Race and Ethnicity in Our Specific Course Materials) – Flipbook OR Podcast
Project #3 (Power, Marginality, Disparity, and Equity in Course Materials and Real-World 21st-Century Problem-Solving) – Electronic Poster OR Digital Scrapbook
Through collaborative projects like these, I am attempting to motivate and engage the students in my IAH courses more effectively, help them to think more actively and critically about the material presented as well as the various social issues that continue to plague our world, and provide them with ample opportunity to cultivate essential skills that will enable their full participation in the globalized world and economy of the 21st century. Bloom's (Revised) Digital Taxonomy, among other resources, continues to facilitate my evolving thought about how best to reach late Gen Y and Gen Z students within a general education context.
If anyone would like to talk more about all of this, offer constructive feedback, or anything else, just drop me a line. I am always looking for those magic beans that will increase student motivation and engagement, and am eager to learn more along the way. Bloom's Digital Taxonomy has certainly been one of my three magical helpers in the quest to do that.
Posted by: Stokes Schwartz
Pedagogical Design
Host: CTLI
No Surprises: Designing Assignments Students Understand
This workshop introduces the Transparency in Learning and Teaching (TILT) framework as a tool for designing clear, equity-minded assignments. Participants will explore how transparency supports student success and reduces confusion and grading time, and they will learn how to structure assignments using the Transparent Assignment Design (TAD) model. The session includes strategies to improve student motivation, performance, and clarity around expectations.
Upon completion of this learning experience, participants will be able to:
understand the history of TILT and its related research findings
describe how the TAD framework relates to equitable learning
define transparent assignment design and its key elements (purpose, task, criteria)
apply TAD best practices
identify resources for implementing the TAD framework.
Navigating Context