We found 339 results that contain "assessment"
Posted on: #iteachmsu

Strategies for Assessment
This playlist details several methods for assessing student learning, with a focus on strategies for time management.
ASSESSING LEARNING
Posted on: #iteachmsu

MSU Online Assessment Training
This is a collection of resources on assessments that was a part of a workshop offering in the Summer of 2020. This workshop was designed and delivered by Amy Lockwood, Andrea Bierema, Becky Matz, Casey Henley, Dave Goodrich, Julie Libarkin, Michael Lockett, Nicola Imbracsio, Stephen Thomas, and Sue Halick.
ASSESSING LEARNING
Posted on: #iteachmsu

Instruction, Feedback, Assessments & Centering Students in Remote Environments
This playlist is a growing collection of content aimed at supporting educators as they traverse ongoing shifts in teaching environments, procedures related to grading, and other uncertainties that result from the ongoing pandemic, all the while keeping student success at the core of their work.
ASSESSING LEARNING
Posted on: #iteachmsu
Teaching & Learning Conference Day 2: Thursday–Virtual day with online sessions (all day)
CTLI Spring Teaching and Learning Conference
The Return of MSU's 2023 Spring Conference on Teaching & Learning: Community, Conversation, and Classroom Experience, organized by the Center for Teaching & Learning Innovation (CTLI).
A conference where MSU educators gather to share approaches, tools, and techniques that support teaching and learning.
May 10-11, 2023
Wednesday Day 1: In-person in the STEM Teaching and Learning Facility (all day)
Thursday Day 2: Virtual day with online sessions (all day)
Keynote Speakers:
Stephen Thomas (Associate Director, CISGS; Assistant Dean for STEM Education Teaching and Learning in the Office of the APUE)
Dr. Kris Renn (Professor of Higher, Adult, and Lifelong Education and Associate Dean of Undergraduate Studies for Student Success Research)
Conference Registration
Registration for the event is open! Please submit your information using the form below. We will reach out with more information on sessions and the schedule closer to the event.
***insert registration form button***
Interested in Submitting a Presentation Proposal?
We are extending an invitation for presentation proposals on select teaching and learning topics across a wide array of presentation formats, including synchronous digital sessions on May 11th to accommodate virtual attendees. The deadline for submissions is February 17th.
Proposal Learning Topics and Formats
Please refer to the descriptions below for details regarding formats and topics. At least one presenter per session should be an educator at MSU.
Teaching and Learning Topics:
The conference committee welcomes presentations on post-secondary education that address one or more of these core topics:
Curriculum and Pedagogy
Assessment and Evaluation
Diversity, Equity, and Inclusion
Learning Technologies
Proposal formats
Presentations may be in-person or virtual.
Paper Presentation: individual papers authored by one or more people, delivered in 15 to 20 minutes. Individual papers will be grouped according to topic and delivered in a multi-paper session that includes a 15-minute question period.
Workshop: this format will include participatory exercises where attendees will learn about a select educational topic or practice from an expert practitioner. These sessions will run for 50-to-60 minutes and include a 15-minute question period.
Welcome to my Classroom: these 50-to-60-minute sessions will feature a short overview of a teaching and learning theory or practice followed by a demonstration of active pedagogy. The audience will be positioned as learners, according to the educational and disciplinary context, and observe the presenter’s demonstration of actual classroom exercises and practices. The sessions will conclude with a 15-to-20-minute discussion or question period.
Learning Technology Demonstration: these 15-to-20-minute demonstrations of learning technologies will be grouped according to topic and delivered in a multi-presentation session that concludes with a 15-minute question period.
***insert proposal form button***
Contact the Center
If you are interested in hearing more about the conference, would like to submit a proposal or have any questions, please contact the Center for Teaching and Learning Innovation.
PEDAGOGICAL DESIGN
Posted on: MSU Online & Remote...

Implementation of Remote Teaching
To implement your new plans and modifications, give consideration to six key focus areas: Communication, Assessment, Assignments and Activities, Lecture, Participation & Engagement, and Library Resources. Content on each of these areas can be found in this playlist.
PEDAGOGICAL DESIGN
Posted on: #iteachmsu

Assessment Workshops
The Hub for Innovation in Learning and Technology is supporting two assessment workshops in March: Assessment Options Beyond the Exam and Exam Design.
1) Assessment Options Beyond the Exam, led by Dr. Andrea Bierema: This workshop is for any MSU educator who is looking for resources and help with formative assessments and alternatives to exams, such as projects, infographics, and debates. Examples include ideas for classes with 100 or more students. This workshop ran synchronously on 3/10 via Zoom.
2) Exam Design, led by Dr. Casey Henley: This workshop is for any MSU educator who is looking for resources and help with academic integrity on summative quizzes and exams. We will focus on writing multiple-choice and short-answer questions, creating a climate of integrity in the course, the pros and cons of video proctoring, and creating exams specifically in D2L. This workshop ran synchronously on 3/9 via Zoom.
If you have questions related to the SOIREE workshops, please reach out to Ashley Braman (behanash@msu.edu) for additional support.
Authored by: Breana Yaklin, Andrea Bierema, Casey Henley
Assessing Learning
Posted on: #iteachmsu

MSU Testing Center & Assessment Services
Team: Testing Center
The MSU Testing Center serves as the primary resource for best practice, policy, and implementation of campus and community testing services and subscribes to the NCTA Professional Standards and Guidelines. Through its work with the campus and community, it helps promote academic integrity and academic success.
What we do: The MSU Testing Center administers and/or proctors over 20,000 exams per year. Our operations fall into several categories:
National Standardized exams such as the GRE, TOEFL, SAT, ACT, and LSAT
Make-up exams for MSU classes
Exams for MSU online courses
Exams for online courses for other universities
Proctoring services for classroom tests on MSU’s campus
MSU Scoring & Assessment Services offers: assessment consultations, High Resolution Optical Scanning (for exams, quizzes, attendance, research, SIRS), and digital production of dynamic reports, analytics, and images!
Services provided:
Crowdmark scanning
Gradescope scanning
LON-CAPA scanning
Custom form design
Custom scanning programs
Website: https://testingcenter.msu.edu/
Contact us: Please browse our website to find information that is specific to your needs. If you’re unable to find what you’re looking for, feel free to contact us!
Posted by: Makena Neal
Navigating Context
Posted on: Center for Teaching...

Transparent Assessments
Are you ready to level up your teaching game and promote equity in your classroom? Introducing Transparent Assignment Design (or TAD for short)! This powerful, yet easy to implement, framework not only makes your assignments crystal clear but also ensures inclusivity and fairness. By providing clear expectations and support, TAD helps level the playing field and gives every student a chance to shine. I have personally found that redesigning assignments using the TAD framework has led to (Mills, M.L. (formerly Rosen, M.L.) et al., 2022):
improved quality of student submissions
reduced requests for regrades
reduced late submissions
reduced student frustration
An assignment that utilizes the TAD framework includes three important sections:
Purpose - an explanation of how the knowledge and skills used in this activity are relevant to a student and their future.
Task - a detailed explanation of the steps a student needs to take to complete the assignment.
Criteria - an explanation of how a student's submission of the assignment will be evaluated.
In the next set of articles in this playlist, we will expand on each of these sections by describing what a good Purpose, Task, and Criteria look like. At the end of the playlist, we will provide you with a template to get you started.
Resources:
TILT Higher Ed Examples and Resources
Transparent Assignment Design | Center for Advancing Teaching and Learning Through Research (northeastern.edu)
Quick Guide to Transparent Assignment Design (wsu.edu)
Authored by: Monica L. Mills
Assessing Learning
Posted on: #iteachmsu

Spartan Studios: Assessment
Assessment
This is the seventh article in our iTeach.MSU playlist for the Spartan Studios Playkit.
There are many options for assessment in your experiential course. In addition to assessing performance or content retention, you have the option to assess students’ holistic contribution, as well as to focus on engagement with the experiential process.
Assessment of Student Work
▶️Consider holistic and/or assignment-level assessment. Will you be more focused on a holistic assessment, meaning students' overall contribution to and engagement with the course? There may be specific components of the course where assignment-level assessment is a better fit, e.g., evaluating deliverables, specific performance, or content retention/mastery.
▶️You can assess team deliverables based on a rubric evaluating levels of quality with explicit descriptions for work that fails to meet, meets, or exceeds expectations.
▶️Assessment should reflect both an opportunity for students to demonstrate mastery of knowledge as well as their ability to apply what they know to a novel environment/challenge. Some assessment methods capture intended learning outcomes; other methods should be designed to capture emergent learning outcomes.
▶️Whatever your choice of assessment, be upfront and transparent with your students about expectations. Keep feedback central; a focus on grades can interrupt student ownership, while a focus on feedback reinforces relationships and students’ sense of ownership (see our GORP model for details).
Assessment As Learning
In addition to assessment towards their grade, experiential courses offer opportunities for instructors and students to learn from the course processes and from each other.
How do students charge forward in a project and stumble into and respond to barriers?
How can faculty follow student work, prototypes, collaborations, and goals with their expertise to coach students to their next sprint?
Assessment of your own coaching: are you modeling the role shift you want to see in students? If you want student groups to give good feedback to each other, are you demonstrating good feedback practices with other faculty in public so students can see how to do it?
Focus on feedback-rich classroom practices and formative assessment.
Evaluation of Process and Reflection
Give students the opportunity to reflect on their progress a few times during the project. Reflection leads to metacognitive moments, where students think about their own thinking, creates opportunities for deep learning, and can be transformative. This also helps you understand team dynamics and how students assigned work within their teams. Reflections can be written or more open-format, like art projects. Reflection prompts can be specific to your course as well as more general prompts about their learning experiences.
🔧Photovoice is a well-documented visual reflection tool used in qualitative participatory community-based research projects. MSU’s University Outreach and Engagement (engage@msu.edu) offers workshops about Photovoice. Here are some Photovoice resources including a Photovoice manual, a facilitator toolkit, an implementation guide, and an organizer’s manual.
▶️You can use reflections to:
evaluate project progress and topical knowledge
evaluate team communication
evaluate use of feedback
evaluate how individuals learn their way through challenges
evaluate student identity exploration
🔧Table of written vs. open-format reflections
Written reflections
Pros: explicit framing of student feelings and growth; easier to standardize responses; good for maintaining student accountability of their own growth and experience
Cons: potentially restrictive to student expression; potential reduction in student ownership of content; can resemble a traditional homework assignment and reinstate traditional power dynamics
Open-format/interpretive reflections
Pros: encourages student creativity and exploration; potentially the most honest representation of student experience; could facilitate more meaningful discussion
Cons: difficult to standardize and interpret; difficult to glean data from; harder to communicate expectations
Sample prompt questions:
What strengths do you bring to your team? Are you using the strengths you anticipated using, or are new/different capabilities emerging as you work on your events?
What do you most appreciate about some of the other students in the course? Rather than naming names, identify characteristics, activities, behaviors, etc.
Has the course changed the way you are thinking about your work in the future? In what ways?
Are there things that you would change about the course? What suggestions would you make to the instructors for future classes?
Image by Shahid Abdullah from Pixabay
Authored by: Ellie Louson
Pedagogical Design
Posted on: #iteachmsu
The Assessment Triangle
Sometimes when we hear the word "assessment," we think of students silently completing a multiple-choice exam during class. But there are a variety of ways to assess learning, and how we assess depends on which skills and ideas we want students to be able to demonstrate.
Assessment Triangle
The assessment triangle helps us think about how we should assess because it connects what we want students to know and do with how we plan to observe what they know and can do. There are three points on the assessment triangle: cognition, observation, and interpretation (National Research Council, 2001).
Cognition
Which concepts and skills do students need to know and do?
There are likely some concepts that students need to memorize. There might, though, also be skills we want students to be able to perform. For instance, maybe students need to be able to create something, such as a research question for a study and applicable methods. Maybe they need to solve problems and interpret data. What are you looking to assess?
Observation
What types of tasks will illustrate student knowledge and skills?
What you have students do for the assessment will be determined by what you want them to know and do. There are a variety of ways to assess, such as (and these are just a few examples):
Multiple choice exam
Essay exam
Group exam
Project
Research investigation
Case study (real life or fictitious)
Poster
Research paper
Infographic
Presentation
Interpretation
How will the tasks determine student knowledge and skills?
Once students complete the assessment, how will understanding be identified? That is, how will the assessment be scored? Scoring or grading rubrics can be a helpful start in identifying your expectations of how a student might approach an assessment and how accurate each approach is (or how many points each one earns). Rubrics can either have everything graded on a single scale or be broken down into separate criteria that culminate in one grade for the task. There are many guides available online for creating rubrics, such as from UC Berkeley's Center for Teaching and Learning.
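To make the separate-criteria approach concrete, such a rubric can be thought of as a weighted roll-up of criterion-level ratings into a single grade. The short sketch below is only an illustration of that arithmetic; the criteria, weights, and performance levels are hypothetical and not part of the National Research Council framework.

```python
# A minimal sketch, with hypothetical criteria, weights, and performance levels,
# of how a separate-criteria rubric rolls criterion ratings up into one grade.

RUBRIC = {
    # criterion: (weight, {performance level: points})
    "research question": (0.4, {"exceeds": 4, "meets": 3, "approaching": 2, "beginning": 1}),
    "methods":           (0.3, {"exceeds": 4, "meets": 3, "approaching": 2, "beginning": 1}),
    "interpretation":    (0.3, {"exceeds": 4, "meets": 3, "approaching": 2, "beginning": 1}),
}

def score_submission(ratings, max_points=100.0):
    """Convert per-criterion ratings, e.g. {"methods": "meets"}, into a single grade."""
    earned = sum(weight * levels[ratings[criterion]]
                 for criterion, (weight, levels) in RUBRIC.items())
    possible = sum(weight * max(levels.values())
                   for weight, levels in RUBRIC.values())
    return round(max_points * earned / possible, 1)

# Example: strong research question, adequate methods, weaker interpretation
print(score_submission({"research question": "exceeds",
                        "methods": "meets",
                        "interpretation": "approaching"}))  # 77.5
```

The same arithmetic applies whether the roll-up happens on paper, in a spreadsheet, or in a gradebook; what matters is that each criterion carries an explicit description and point value.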
Try it for Yourself
Draw a triangle on a piece of paper. Label each corner: cognition, observation, and interpretation. Choose a few cognitive aspects that you teach together in a single lesson or unit, identify how you might observe understanding of those cognitive aspects, and how you might interpret your observations.
Reference
National Research Council. 2001. Knowing What Students Know: The Science and Design of Educational Assessment. Washington, DC: The National Academies Press. https://doi.org/10.17226/10019.
Additional Resources
For assessment examples, view the Assessment Workshop videos, one on assessment options beyond the exam (which includes a description of the assessment triangle a few minutes into the video) and another on exam design.
Although designed for GTAs, this #iteachmsu article on assessment of student learning provides a nice overview, including formative and summative assessment.
This #iteachmsu article on experiential learning describes a more holistic approach to assessment.
Authored by: Andrea Bierema
Assessing Learning
Posted on: Graduate Teaching A...

Assessment of Student Learning: Best Practices and Techniques
In this workshop GTAs learn about assessment strategies for their courses based on best practices. A strong focus is given to assessment tools and gradebook functions of D2L. We also use a template to create a simple rubric structure for any assignment.
Upon completing this session, GTAs will be able to:
Articulate the difference between summative and formative assessment.
Identify multiple assessment strategies based on best practices.
Effectively use the gradebook functions on D2L.
Use a template to develop a simple rubric structure for any assignment.
Posted by: Kenneth Gene Herrema
Pedagogical Design
Posted on: MSU Online & Remote...
Assessment in Remote Teaching
Offering assessments in a remote setting will require some planning. For remote delivery, the primary concern should be assessing how well students have achieved the key learning objectives and determining what objectives are still unmet. It may be necessary to modify the nature of the assessment to allow for the more limited affordances of the remote environment.
Posted by: Makena Neal
Assessing Learning
Posted on: Teaching Toolkit Ta...
Exit Card Formative Assessment
Tips
What is Formative Assessment?
Formative assessment allows educators to engage in their students’ learning process and assess whether they need to modify teaching strategies to ensure student learning and content attainment.
The Notecard
At the completion of some/all classes or content areas, hand out one notecard to each student and ask them to write on one side something they learned and on the backside, one question they still have about the content.
Review each Notecard
Through your review, assess whether students accurately understood the content they needed to learn. If not, plan to reteach the areas students are struggling with in a different way (a project, handout, guided notes, etc.).
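If the cards are collected digitally (for example, exported from a short online survey), a quick tally of recurring terms on the "question I still have" side can help surface which topics to revisit. The sketch below is a hypothetical illustration of that tally and is not part of the original tip; the sample responses are invented.

```python
# Hypothetical sketch: tally recurring terms from the "question I still have"
# side of digitally collected exit cards to spot topics worth reteaching.
from collections import Counter
import re

responses = [  # invented sample responses
    "Still confused about standard deviation vs. standard error",
    "How do I choose between a t-test and ANOVA?",
    "What is the standard error formula again?",
]

STOPWORDS = {"the", "a", "an", "is", "do", "i", "how", "what", "about",
             "vs", "and", "between", "still", "again"}

terms = Counter(
    word
    for response in responses
    for word in re.findall(r"[a-z\-]+", response.lower())
    if word not in STOPWORDS
)

# The most frequent terms point to what to revisit in the next class session.
print(terms.most_common(5))
```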
Appropriate Courses to Consider this Activity
This activity may be best used with small to medium class sizes; it may be cumbersome to review all the notecards from an extremely large class. Consider making the notecards anonymous, because this is about the educator assessing their teaching (formative assessment), not assessing the students’ knowledge (summative assessment).
Resources and Tools
You can also use Entry Cards
Not necessary for this activity, but you can also have students complete a card at the beginning of class with a prompt from their readings.
Helpful Links
These educators discuss similar formative uses of exit cards:
https://www.edutopia.org/blog/formative-assessment-exit-slip-rebecca-alber
https://www.nwea.org/blog/2012/classroom-techniques-formative-assessment-idea-number-two/
Additional Ways to Use the Notecard
You are not limited to just having students write what they learned on one side of the card and questions they have on the other. You can also pose a question or a short writing prompt to the students at the end of the class and have the students write their answers on the notecard. Students turn in the card and then are allowed to leave the room. The cards are still used for formative, not summative assessment.
Feedback to Students
If you do have students put their names on the cards, you can write comments on the card or page numbers from the textbook to review in order to ensure they know where to correct inaccuracies or answers to their questions.
Authored by: Michelle Malkin
Assessing Learning
Posted on: GenAI & Education
just another dave… ‘learning consumes failure & poops out success’
Posted by: Dave Goodrich
Assessing Learning
Posted on: Center for Teaching...
I had the good fortune to attend an enlightening workshop today: "Student-Centered Approach to Grading" presented by Jeremy Van Hof and Monica Mills. Among the resources they provided was this treasure trove: https://www.gettoby.com/p/jp9xrk523nt1
Posted by: David V. Howe
Assessing Learning
Posted on: #iteachmsu
As you are designing learning experiences in or out of the classroom, what are your “go-to” resources? (Please share details and a link to more information if you have it!)
Posted by: Makena Neal
Pedagogical Design
Posted on: #iteachmsu
Click on the attached file to find an accessible PDF of the MSU Remote Assessment Quick Guide.
Posted by: Makena Neal
Assessing Learning
Posted on: GenAI & Education
AI Commons Bulletin 2/10/2025
🚨 CSU Launches “AI Commons” – Sound Familiar?
The California State University (CSU) system just rolled out CSU AI Commons, a system-wide hub for AI tools, training, and research. Backed by Big Tech partnerships, it focuses on faculty development, student literacy, and workforce acceleration. BUT: AI strategy isn’t just about resources—it’s about who controls the narrative. With corporate-backed AI in higher education, what happens to independent faculty innovation?
Learn More: https://genai.calstate.edu/
🔍Tracking AI Policies in Higher Ed
Embry-Riddle Aeronautical University has compiled a Padlet featuring AI policies and guidelines from institutions worldwide. This evolving resource provides insight into how different universities are shaping their AI approaches.
Learn More: https://padlet.com/cetl6/university-policies-on-generative-ai-m9n7wf05r7rdc6pe
📚 AI Submissions Outperform Students in Recent Study
A PLOS ONE study found that 94% of AI-generated assignments went undetected, with grades averaging half a grade higher than those of real students. There was also an 83.4% chance AI submissions would outperform a random selection of student work across modules.
Learn More: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0305354#:~:text=The%20%27Turing%20Test%27%20is%20now,a%20predefined%20set%20of%20rules
Blurry Lines in AI and Assessment
A study in Assessment & Evaluation in Higher Education highlights student and educator confusion over acceptable AI use in assessments. Many rely on personal judgment or Grammarly analogies. The authors propose the Dynamic Educational Boundaries Model to embed clear AI-use guidelines directly into assessments.
Learn More: https://doi.org/10.1080/02602938.2025.2456207
Bulletin items compiled by MJ Jackson and Sarah Freye with production assistance from Lisa Batchelder. Get the AI-Commons Bulletin on our Microsoft Teams channel, at aicommons.commons.msu.edu, or by email (send an email to aicommons@msu.edu with the word “subscribe”).
Posted by: Sarah Freye
Posted on: Reading Group for S...
Well, we have reached the end of the road, at least for the 2021-2022 academic year. With that in mind, here is a list, no doubt incomplete, of possible ways we might foster and improve student motivation, engagement, and success in the classroom:
* Digital Materials and Projects
* Provide Feedback (High Impact)
* Agency and (Assessment) Choice (Universal Learning Design)
* 21st Century Skills (Relevance and Usefulness)
* Collaborative Learning (High Impact)
* Critical (Deeper) Thinking (High Impact)
* Bloom’s Digital Taxonomy (Tasks and Feedback)
* Inclusive Pedagogy (Collaboration)
* Social Justice Pedagogy (Intersectionality, High Impact)
* Animated Explainer Videos
* Universal Design for Learning (Inclusivity)
* High Impact Practices (Feedback, Collaboration, Reflection, Capstone Project)
* Project-based Learning/Problem-based Learning
* Digital Learning (Assessment, etc.)
* Reflective Learning
* Culturally Responsive Teaching (CRT) – Empower students and expand their capabilities through ownership.
* Create Accessible Content
* Cultivate DEI Practices (Foster a sense of belonging, instill respect, and promote tolerance for ALL members of the class and related ideas.)
* Envision and enact new ways of teaching (leading).
* Multiple Modes of Assessment.
* Continuous Improvement in Our Efforts and Course Design/Presentation
* Encourage students to adopt an interdisciplinary approach in their course projects.
There is considerable overlap between some of these very broad points, but if we can incorporate even a few of these ideas into our work with undergrads, we might get just a bit closer to the pot of gold at the end of the rainbow when it comes to improved student motivation, engagement, and performance in our courses. Thank you for an interesting year, everyone!
Stokes and Garth
Posted by: Stokes Schwartz
Pedagogical Design
Posted on: GenAI & Education
Massive changes have occurred recently with regard to artificial intelligence (AI) and the ability of the public to generate novel text and images using AI tools (e.g. ChatGPT). Many in education are concerned with what this means for assessing student understanding: if a student can generate a novel, accurate essay on almost any topic, how will you assess learning from short-answer and essay assignments?
On 02/01/2023, a campus collaboration of the APUE STEM+ Ed@State, Enhanced Digital Learning Initiative (EDLI), Center for Teaching and Learning Innovation (CTLI), MSU Libraries, and MSU IT EdTech, hosted the "Symposium on AI in Education and Academic Writing". During the symposium, the basics of how AI works were shared and attendees had opportunities to play with some AI tools. The event provided opportunities to hear how faculty are addressing these challenges, discuss concerns and opportunities with colleagues, and reflect on individual teaching philosophies in the time of artificial intelligence (AI).
Posted by: Makena Neal
Posted on: Reading Group for S...
A couple of resources I want to share:
Preparing Instructional Objectives: A Critical Tool in the Development of Effective Instruction 3rd Edition
by Robert F. Mager (cheap used versions available)
I've only begun digging through this, and I am hoping it will help me to clarify and target the kind of thinking I would like to promote in my teaching:
The Rationality Quotient: Toward a Test of Rational Thinking. Stanovich, West and Toplak
'Smart people do foolish things because intelligence is not the same as the capacity for rational thinking. The Rationality Quotient explains that these two traits, often (and incorrectly) thought of as one, refer to different cognitive functions. The standard IQ test...doesn't measure any of the broad components of rationality—adaptive responding, good judgment, and good decision making. The authors show that rational thinking, like intelligence, is a measurable cognitive competence....[T]hey present the first prototype for an assessment of rational thinking analogous to the IQ test: the CART (Comprehensive Assessment of Rational Thinking).'
Posted by: David V. Howe
Assessing Learning
Host: CTLI
Understanding AI in your pedagogy
This workshop is designed to equip MSU educators with the knowledge and skills necessary to navigate the evolving educational landscape shaped by generative AI. Participants will explore the multifaceted impact of AI on teaching and learning, and develop strategies to integrate AI into their courses effectively while addressing both opportunities and challenges.
Upon completion of this learning experience participants will be able to:
implement AI tools and techniques to enhance teaching practices and improve administrative efficiency in their courses
integrate discussions and content about AI within their discipline to help students understand its relevance and implications in their field of study
develop comprehensive AI policies for their courses, addressing acceptable use, academic integrity, and guidelines for AI-supported assignments and assessments.
Navigating Context
EXPIRED
Host: CTLI
Start with the End in Mind: Backward Design for Better Assessment
This workshop introduces the concept of alignment as a foundation for effective course and assessment design. Participants will learn how to write clear, measurable learning objectives, identify course materials and assessments that align with those objectives, and evaluate the overall coherence of course elements. The session emphasizes backward design as a practical approach to creating intentional, goal-driven learning experiences.
Upon completion of this learning experience, participants will be able to:
define the concept of alignment as it pertains to curriculum design and development
write appropriately stated learning objectives using best practices (e.g., learning taxonomy)
suggest course materials and assessments that are aligned with learning objectives
evaluate various parts of a course for alignment.
Navigating Context
EXPIRED
Host: CTLI
Introduction to Creating Effective Assessments
This hybrid workshop introduces educators to core strategies for designing effective assessments that support student learning and course goals. Participants will explore various types of assessments, evaluate their alignment with learning objectives, and compare approaches based on course context, including discipline, size, and level. The session will also address the emerging role of generative AI in assessment design, offering insights into both challenges and opportunities in today’s evolving educational landscape.
Upon completion of this learning experience, participants will be able to:
identify various assessment strategies and their types
evaluate whether various assessment types are aligned with a course's objectives
compare different assessment strategies based on course discipline, size, level, and goals
describe the role of generative AI in assessment design.
The in-person location for this session is the Center for Teaching and Learning Innovation. Please join us in the Main Library, Room W207. For directions to W207, please visit the Room Locations page.
Navigating Context
EXPIRED