
Posted on: #iteachmsu
Friday, Aug 23, 2024
A Quick Guide to Peer Grading / Peer Review
Overview:
Peer grading involves students assessing each other's work based on predefined criteria. This practice promotes active learning, collaboration, and responsibility, aligning with a student-centered approach.
Key Aspects:

Student Involvement: Students engage deeply with the material and grading criteria by evaluating their peers' work.
Feedback Exchange: Students give and receive feedback from their peers, fostering a collaborative learning environment.

Key Components:

Empowering Students:

Active Participation: Involves students in their learning journey, enhancing engagement.
Critical Thinking: Develops critical thinking through evaluating peers' work.

Developing Assessment Skills:

Understanding Criteria: Helps students understand what high-quality work looks like so they can improve their own.
Constructive Feedback: Teaches students to provide and articulate constructive feedback.

Promoting Equity and Inclusivity:

Diverse Perspectives: Introduces varied perspectives, leading to a more comprehensive understanding.
Empathy and Respect: Encourages appreciation of different viewpoints and approaches.

Enhancing Engagement and Motivation:

Ownership of Learning: Increases student ownership and motivation.
Collaborative Environment: Fosters a sense of community and mutual support.

Utilizing Technology and D2L:

Efficient Management: Use D2L tools for submission, anonymous grading, and feedback.
Resource Accessibility: Leverage D2L for rubrics, training materials, and discussion forums.


Challenges and Concerns:

Training and Calibration: Ensure students are well-trained in using rubrics and providing feedback.
Bias and Fairness: Minimize biases through anonymizing submissions and instructor oversight.
Balancing Roles: Complement peer grading with instructor assessments to ensure fairness.

Benefits:

Enhanced Learning Outcomes: Improves understanding and retention through teaching and evaluating others.
Skill Development: Develops critical thinking, communication, and self-reflection skills.
Improved Engagement: Increases engagement by involving students in grading.


Resources:

Five Innovative Grading Strategies (iteach article)
A Student-Centered Approach to Grading (CTLI workshops)
Teaching students to evaluate each other (Center for Teaching Innovation, Cornell University)
Peer assessment (Center for Teaching Innovation, Cornell University)
Teaching Students to Give Peer Feedback (Edutopia)
Peer review strategies (University of Nevada, Reno)
Kritik (specifically an edtech used for peer review)
Perusall (a social annotation tool with an MSU license that can also be used for Peer Review)
Authored by: Monica L. Mills
Posted on: #iteachmsu
Wednesday, Oct 21, 2020
Crowdmark: Deliver and Grade Assessments
What is Crowdmark? 
Crowdmark is an online collaborative grading and analytics platform that helps educators assess student work. The platform allows for easy distribution and collection of student assignments, offers tools for team grading with rubrics, and streamlines the process for providing rich feedback to students. 
How does Crowdmark improve the assessment experience?
Crowdmark allows instructors to deliver assignments and exams to students online with a due date and time limit, if desired. Students complete the assessment digitally or scan their handwritten work (as an image or PDF) and upload their completed work using a computer or mobile phone for evaluation on Crowdmark. 
Graders can make annotations on the pages, add comments including hyperlinks, embedded images, mathematical and chemical notations, and attach scores according to a grading scheme/rubric. After evaluation is complete, the graded assessments can be electronically returned to students with the click of a button. Crowdmark also provides tools for visualizing student performance and the data can be exported in a convenient format. 
Crowdmark is now integrated with MSU’s instance of D2L Brightspace. This integration provides features such as roster synchronization, team synchronization, and the ability to export grades from Crowdmark into the D2L gradebook. 
What limitations or alternatives should I consider?
The grading rubrics and comment library make grading more consistent and efficient; however, assessments are still primarily graded manually. For auto-graded questions, you may want to consider the MSU Scoring Office tool, WebAssess™ Assessment Solutions, in Digital Desk, or D2L Quizzes. Gradescope is another alternative similar to Crowdmark. 
Where do I start if I want to use it?
See Accessing Crowdmark through D2L, or navigate to the Crowdmark sign-in page and select Michigan State University.
Where can I find more information? 
MSU D2L Help:

Getting Started with Crowdmark

Crowdmark Documentation:

Introduction to Crowdmark 
Getting Started for Instructors 
D2L and Crowdmark 
Crowdmark support 
Authored by: Susan Halick
Posted on: d2l
Tuesday, Jun 4, 2024
D2L Online Test Security (settings, time limits, and submission views)
Online Test Security Issues
Unfortunately, it has become far too commonplace for students to “help each other out” by posting test questions and answers on websites. Try a Google search for your course code and exam title and you may find sites listing your exam (e.g., Course Hero, Koofers, Chegg, and Quizlet). Also try searching for specific questions to see what’s out there. Most of these sites are meant to help students study, and they post honor codes*, but not all students will abide by them.
As the instructor, you can ask these study sites to remove your material when you find it.
Here are some ways to minimize issues when using the D2L Quiz tool, while also promoting honesty and learning. 
Recommendations to reduce cheating

Limit the opportunity to use outside sources by enforcing a time-limit.
Add a mandatory academic honesty question at the beginning of the exam, asking students to certify that the test represents their independent work.
Create large question pools (reword questions and choices each semester so they cannot be easily searched).
Randomize the question sequence and/or answer choices.
Display one question per page, or at least fewer questions per page. This makes it harder for students to take a screenshot of the questions in bulk. 
Craft questions that require critical thinking: Avoid straightforward identification questions, where students can answer through a quick search.
Ask students to select all of the correct answers (use multiple-select type and change the setting to "correct answers" to award partial credit).
Provide limited views of results upon submission. Limiting the viewing window does not prevent copying but it can reduce the ability to go back later to copy.
Use remote proctoring for high-stakes exams. The downsides can be technical obstacles, cost, and privacy issues (e.g., use of webcams).
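D2L implements question pooling and shuffling internally, but the mechanics behind the pool-and-randomize recommendations above can be sketched in a few lines of Python. The question stems, pool sizes, and draw counts below are made up purely for illustration:

```python
import random

def build_exam(pools, seed=None):
    """Draw a per-student exam: pick the requested number of questions
    from each pool, then shuffle question order and answer choices."""
    rng = random.Random(seed)
    exam = []
    for pool in pools:
        # Each student gets a random subset, so no two exams are identical.
        exam.extend(rng.sample(pool["questions"], pool["draw"]))
    rng.shuffle(exam)  # randomize the question sequence
    for q in exam:
        rng.shuffle(q["choices"])  # randomize the answer choices
    return exam

# Two pools: draw 2 of 3 concept questions and 1 of 2 application questions.
pools = [
    {"draw": 2, "questions": [
        {"stem": "Define osmosis", "choices": ["A", "B", "C", "D"]},
        {"stem": "Define diffusion", "choices": ["A", "B", "C", "D"]},
        {"stem": "Define tonicity", "choices": ["A", "B", "C", "D"]},
    ]},
    {"draw": 1, "questions": [
        {"stem": "Predict the experimental result", "choices": ["A", "B", "C", "D"]},
        {"stem": "Interpret the graph", "choices": ["A", "B", "C", "D"]},
    ]},
]
exam = build_exam(pools, seed=42)
print(len(exam))  # 3 questions per student
```

With large pools, the odds that two students see the same questions in the same order drop quickly, which is what makes posted answer keys far less useful.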

Time limits on exams/quizzes
If you are giving an online exam, time limits help both to reduce cheating and to encourage more studying. See The Value of Time Limits on Internet Quizzes.
"Time limits on exams are associated with better learning and exam performance because they reduce the opportunity to look up answers in lieu of learning the material."
There is also a setting in D2L quizzes to disable right-click. This prevents students from doing a quick copy of the whole question in order to search for answers in another window. 
Another potential issue: Submission Views
If the submission view shows all of the questions, with right and wrong answers, as soon as students submit, those details can easily be copied and sent to another student without ever being posted on the internet. Random question selection makes this less of an issue, but it is better to set clear end dates with restricted submission views. Keep the default submission view as "score only" and don’t show the other details until after the end date.
Here are recommended Submission View settings.
DEFAULT VIEW
Date: immediately
Show Questions? No
Statistics: none
(Saying "No" to Show Questions means students will only see a score)
ADDITIONAL VIEWS
Display right/wrong answers after the due date
Date: ##/##/####
Show Questions? Yes, show all questions with user responses
Show question answers: No
Statistics: none (or show statistics, your preference)

Back to score only at end of course
Date: ##/##/####
Show Questions? No
Statistics: none
Bring the second additional view back to “score only” at the end of the semester so that views are not left open when you copy the course to a new semester.
Reviewing Quiz results
To review quiz activity, go to Assessments > Quizzes and select "Grade" on the pull-down menu by the quiz title. Go to the attempts tab, and click on individual attempts to review results by student. Click on "Attempt Logs" to see the quiz entry and completion for each student.
Use the pull-down menu next to a Quiz title in the Quiz list view, and select Statistics to view Quiz/Question stats (view each tab).
Also, watch student activity within your course content to understand patterns that may alert you to issues. Look at Table of Contents > related tools > view reports > users tab, to compare quiz results with content views to discover potential issues.
Here are the recommendations from the D2L Brightspace Community to help prevent cheating:

On the Edit Quiz page, in the Optional Advanced Properties area, select Disable right click.
Select Disable Email, Instant Messages and alerts (but only if there is an enforced time-limit). Students will not be able to use D2L email in any of their courses while the quiz is pending.
Incorporate question pools into your quizzes to distribute unique sets of questions to users.
In the Quiz Questions area, select Shuffle questions at the quiz level.
If Sections are used, select Shuffle questions in this section.

Note:  You can shuffle within sections rather than shuffling questions at the quiz level if you would like to present auto-graded items first (MC, M-S, T/F) and then present essay (WR) questions later.
*Examples of honor codes

MSU Academic Integrity (PDF)
Spartan Academic Pledge
Course Hero honor code
Chegg Honor Code
Quizlet Honor Code
Koofers Terms of Use

Go to Brightspace D2L Documentation for more on creating and managing quizzes.
Authored by: Sue Halick and Casey Henley
Posted on: MSU Online & Remote Teaching
Tuesday, May 5, 2020
Communication and Remote Teaching
Communication
As we transition to remote instruction, communicate with your students right away and often. Even if you don’t have a plan in place for your course, communicate with your students as soon as it’s clear that your course will need remote delivery. Be clear with them that changes are coming and what your expectations are for near-term engagement with the course. For course-wide communication, use the Instructor Systems tool on the Registrar’s website or the Email function of D2L.
Posted by: Makena Neal
Posted on: #iteachmsu
Sunday, Apr 20, 2025
Establishing A Contract via the Syllabus
 
The syllabus is often a document that outlines the rules of a particular class. To this end, it should clearly communicate the responsibilities of the student and the instructor, and the consequences of breaking that contract.
 
Course Policies:
We’ve already drafted the course policies in previous guides, so what matters now is establishing the potential repercussions when a student violates them. While not an exhaustive list, here are some questions you should ask yourself regarding each policy:
 

If a student violates this policy, does it affect their grade?

If so, is the effect comparable to the offense?

For example, if you have an attendance policy, it’s unreasonable to fail a student for missing a single day of class.


If not, what are the consequences of breaking this policy?

For example, students talking when you are explaining something to the class may not influence their grade but still needs to be addressed.




If a student violates this policy, does it affect their academic status?

For example, violence in the classroom certainly shouldn’t be tolerated and needs to have clearly defined consequences.
Academic dishonesty is typically a topic that has consequences outlined by the institution. These may cause students to face suspension and thus those consequences should be outlined.


What happens after repeat offenses?

Are the consequences the same or do they get more severe?
Can a student "come back" from breaking a particular policy multiple times?  


How can the consequences of breaking a policy help students improve?

Do you have meetings with the student(s) to address concerns?
Is the institution involved? In what way? 
How much control do you have over the situation?
Etc.



 
The Intangibles: 
As discussed previously, the syllabus also outlines rules for the class beyond formal course policies. Make sure to detail what happens to students who miss due dates or who aren’t spending enough time outside of class. Briefly discuss why you have these rules in place and what happens to those who are not fulfilling these expectations.
 
Additionally, consider what happens when the instructor does not hold up their end of the “contract.” Here are some examples to consider:

What happens when an exam is coming up, but a student hasn’t gotten back any graded homework?

How can they be expected to improve?
What steps can they take to grow? 


Do students understand their current grade/standing in the class ahead of the end of the semester?
How will the instructor rectify being behind?

Communicate why it happened.
Tell students when they can expect it to be fixed.
Explain how it will affect the class.



 
Using the syllabus as a contract between instructor and student communicates to students that the teacher takes the class seriously and is willing to make promises, which will hopefully make students more receptive to suggestions. The purpose is not to demand perfection, but to explain the cause/effect relationship of the course policies and rules and to help students navigate college life and their many courses.
Authored by: Erik Flinn
Posted on: #iteachmsu
Tuesday, Dec 22, 2020
Academic Dishonesty Report
If you are concerned that a student is participating in dishonest academic practices, you may consider filing an Academic Dishonesty Report (ADR). This form can be accessed through the Instructor Forms Menu. An ADR is the form that an instructor completes when they allege that a student has committed an act of academic dishonesty. More information and FAQ can be found on the Office of the University Ombudsperson’s page here. The Office of the University Ombudsperson also provides additional resources on their page here related to academic integrity. 
As the Dean of Students Office manages the formal hearing process for undergraduates under the Integrity of Scholarship and Grades policy, they have also provided flow charts outlining the steps a student might take in appealing either the allegation of academic misconduct or the sanction associated with it found below. Hearings for graduate students are managed by hearing procedures within the individual academic units and colleges, as well as the Graduate School.

Flow chart for process to determine additional sanctions for a student accused of academic misconduct (pdf).
Flow chart for process for Academic Grievance Hearings initiated by undergraduate students contesting allegations of academic misconduct (pdf). 

 
Posted by: Kelly Mazurkiewicz
Posted on: #iteachmsu
Wednesday, Oct 21, 2020
Introducing Packback: a new student discussion portal with AI-mediated coaching & quality assessment
What is Packback?
Packback is an online discussion forum tool designed to motivate students to explore and investigate the assigned topic, encouraging genuine curiosity and engagement. Students receive live AI-based coaching on writing quality, topic relevance, content originality, and inclusion of citations in their discussion forum posts. Their performance in each of these areas is combined into a single “Curiosity Score” provided to the instructor. Altogether, this increases the detail of feedback the student receives while still alleviating the hands-on time required of the instructor.
How it works & where it excels.

The Curiosity Score is meant to reflect student effort and is calculated based on the Legibility, Endedness, and Credibility of the student’s post. Students can still submit a post flagged by the AI as having a low Curiosity Score. If a flagged post is submitted, a human coach will review the post and provide private feedback to the student, who can then revise their post any number of times until the deadline set by the instructor.
Students are actively encouraged to develop (and respond to) open-ended questions that prompt discussion, relate course material to the “real world”, and may not have one “right” answer.
Packback intercepts closed-ended, off-topic, plagiarized, and poorly written posts, increasing the value of each post to the overall discussion forum. This allows instructors to spend their time assessing the content of discussion posts and answers, as opposed to the students’ writing. The assessment criteria considered by the Packback AI can be customized by the instructor, including the number of posts in each time period, the minimum Curiosity Score for each post, and the discussion topic.  
Packback’s AI suggests posts with particularly high scores for you to feature and promote in your class, giving students examples to emulate and learn from the feedback given to the featured posts.
Large courses utilize Packback to improve student performance and utility of assessment feedback. Courses can be broken into smaller groups by creating self-selected sub-sections.
Small courses use Packback as a platform for centralizing peer discussion and communication as students deep-dive into a given topic.

Packback limitations.

The AI assessment can feel like a “black box” to instructors since a single, comprehensive Curiosity Score is given to each post. Scores can be modified manually in cases where the instructor disagrees with the assigned score.
The Packback-D2L Integration is effectively a direct link to the Packback community for your course, as opposed to embedding the Packback experience directly into your D2L course.
Grades are not included in the Packback-D2L integration. Packback’s Curiosity Scores are not sent to the D2L course gradebook; scores must be downloaded from Packback and uploaded to the D2L gradebook.
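Because of this gap, the download-and-upload step is often worth scripting. Below is a minimal sketch of reshaping an exported score file into rows for a D2L grade import; the column names ("Email", "Curiosity Score") are illustrative assumptions, so check the headers in your actual Packback export and D2L import template before using anything like this:

```python
import csv
import io

def packback_to_d2l(packback_csv, max_points=10.0):
    """Reshape a Packback score export into rows for a D2L grade import.
    NOTE: input column names here are assumptions for illustration --
    verify them against your real export files."""
    out = []
    for row in csv.DictReader(io.StringIO(packback_csv)):
        out.append({
            # D2L usernames at MSU typically match the email local part.
            "Username": row["Email"].split("@")[0],
            # Scale the 0-100 Curiosity Score to the gradebook item's points.
            "Points Grade": min(float(row["Curiosity Score"]) / 100 * max_points,
                                max_points),
            "End-of-Line Indicator": "#",  # D2L grade imports end each row with '#'
        })
    return out

sample = "Email,Curiosity Score\nsparty@msu.edu,87.5\n"
print(packback_to_d2l(sample))
```

Writing the resulting rows back out with `csv.DictWriter` produces a file you can test against D2L's Import Grades tool on a single practice student before running a full upload.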

Where to learn more. 

The Packback home page is a great starting point for exploring Packback features, design, user satisfaction data, and testimonials from both students and instructors.
Packback has a curated, consolidated Educator Support page.
The Packback blog includes numerous articles, interviews, and lecture webinars on pedagogy, user testimonials, and news/updates.
This recorded Packback training webinar from the August 2020 Online Readiness Webinar Series at MSU
This webinar on Tips, Tricks, and Practical Approaches for Getting the Most Value out of Packback
Authored by: Natalie Vandepol
Posted on: #iteachmsu
Friday, Nov 6, 2020
Considerations for Exam Structure
Many decisions must go into the structure of an exam and how that assessment fits into the overall organization of a course. This document will review options for test configuration at multiple levels and then provide some examples of MSU faculty that have incorporated these strategies into their courses.
Course-Level Considerations
Course-level considerations require reviewing the structure of the class to see where major scheduling or grading changes can be made.

Lower the stakes / reduce the scope – Deliver more assessments that each cover less content. This gives students more accountability for checking their understanding in quicker, shorter ways throughout the course, which can enhance the learning experience. Reducing the scope of exams in this way also provides both you and the student with more targeted feedback earlier in the learning process.
Drop a lowest exam grade – Provide students an “out” if they are unprepared or have a bad testing experience
Use honor codes – When combined with taking time to establish a climate of integrity, honor codes can reduce academic dishonesty

Exam-Level Considerations
Exam-level considerations can be made without altering other components of the course. However, these strategies often require evaluating the style of question asked.

Allow open book or notes and/or collaboration - The National Association of Colleges and Employers determined that the most important skills employers look for in college graduates are problem-solving and teamwork skills. Exams can be structured to practice and assess those skills
Write authentic questions – Teach and test skills and application of knowledge necessary for successful performance as a professional in the field
Allow corrections – Turn typical summative assessments into formative assessments by allowing students to use exams as a learning tool. Exams do not always need to be used as assessment of learning; they can also be used as assessment for learning
Offer more points on the exam than what is needed to achieve a 100% grade
Allow students to have multiple attempts at the exam
Use a two-part exam structure that has students take the exam both individually and in groups.

Question-Level Considerations
Question-level considerations are the easiest to implement; most changes can be accomplished using D2L quizzing tools.

Use question pools
Randomize questions
Limit the number of questions per page
Provide technology practice before the first major exam

Timing Considerations
Deciding on a time limit for an exam is an important decision. There are pros and cons for either limiting time or giving extended time.

Using untimed exams reduces student anxiety – When question pools reduce the chances of students cheating on exams, you gain the option of removing time limits, easing the anxiety that comes with timed exams
Using timed exams – Setting a time limit can provide a layer of security against academic misconduct. By minimizing the time students have to take the exam, they are more likely to spend that time focusing on the questions and not copying questions or collaborating
Ask TAs or ULAs to take the exam prior to delivery – Provides a report on time estimates that it will take for the class to complete the exam. It also provides opportunities for them to spot check the questions themselves for errors or opportunities to enhance the exam’s efficacy

Collaboration Considerations
When possible, collaborating with faculty colleagues, TAs, or ULAs in exam creation can help minimize the time and effort needed.

Generate questions pools as a faculty team
Have TAs or ULAs create questions – Their direct involvement with students in supporting their learning throughout the course gives them a unique advantage in knowing how to write questions that can be useful for drawing out evidence of knowledge among learners

Examples from MSU Instructors
Mini-Exams
For many years, chemistry instructors in Lyman Briggs College have incorporated a low-stakes “mini-exam” as the first timed assessment in their introductory chemistry courses. In terms of points, the mini-exam is typically worth about 40% of a midterm exam. The mini-exam gives students an opportunity to experience “exam difficulty” questions in an exam setting. This early exam provides feedback to students regarding their approach to the class (have their study approaches been working?) on a lower-stakes exam. This also allows the instructors an early opportunity to intervene and support students prior to the first higher-stakes midterm exam. The mini-exam can be considered as either more formative (i.e., score dropped if midterm exam scores are higher) or more summative (testing on important expected prior knowledge), depending on the course design. With the move to online instruction, a mini-exam also gives instructors and students an opportunity to test and become familiar with the technology being used for midterm exams in a lower-stakes setting.
Strategies

Lower stakes exams
Provide technology practice before the first major exam

Extra Points
One approach that has been successfully used in multiple introductory as well as some upper-level chemistry courses is offering more possible points on an exam than is needed for a grade of 100%. For example, if there are 80 possible points on an exam, grading might be based on a total of 73 points; a student who gets 73 points would earn a 100% grade. This approach allows instructors to communicate high standards for responses to exam questions but still relieves some pressure on students. Anecdotally, instructors have sometimes found that this alleviates the need for regrades. Instructors might choose to limit the maximum grade to 100% or offer bonus credit for students who score above 100%. In addition, building in extra points can potentially reduce some stress for first-year students accustomed to high-school grading scales where often scores above 90% are required for an “A.”
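As a concrete check of the arithmetic in the 80-point/73-point example above, the grade calculation is just a percentage against the lower total, optionally capped at 100%:

```python
def exam_percentage(points_earned, grading_total, cap=True):
    """Grade an exam against a lower total than the points offered.
    E.g., 80 possible points, but graded out of 73."""
    pct = 100 * points_earned / grading_total
    return min(pct, 100.0) if cap else pct

# An 80-point exam graded out of 73:
print(exam_percentage(73, 73))                    # 100.0 -- full marks at 73 points
print(round(exam_percentage(65, 73)))             # 89
print(exam_percentage(78, 73))                    # 100.0 (capped)
print(round(exam_percentage(78, 73, cap=False)))  # 107 -- if bonus credit is allowed
```

The `cap` flag corresponds to the choice described above between limiting the maximum grade to 100% and awarding bonus credit for scores beyond it.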
Strategies

Offer more points on the exam than what is needed to achieve a 100% grade

Authentic, Low Stakes Exams
In her neuroscience for non-majors course, Casey Henley writes exam questions that require students to make predictions about novel experiments based on content learned in class. These questions often require students to read and interpret graphs. Since the questions require problem solving, and the answers cannot be looked up, the exams are open book and open note. Additionally, the exams become a learning experience themselves because optional correction assignments are offered, and students can earn points back by reviewing their work and resubmitting answers. Exam corrections also provide information about the misconceptions that students held going into the test, which helps Casey create or edit content for future semesters. The class has four non-cumulative unit exams and one cumulative final. Each has the same point value, and students get to drop one exam grade.
Strategies

Write authentic questions
Lower the stakes
Drop a lowest exam grade
Allow open book or note
Allow corrections

Collaborating on Question Pool Creation
Consider working together with your colleagues on developing shared pools of questions that can be used for quizzes and exams within the same subject matter. This can greatly reduce the chances of cheating and bring a new sense of alignment across courses for those who are teaching similar courses already. It is also an important space for collaboration to take place among peers. A good example of this happening at MSU already is the way instructors in the Biological Sciences program share questions. Instructors in the Physics and Astronomy department have also shared questions across the institution with LON-CAPA for many years.
Strategies

Use question pools
Generate questions pools as a faculty team
Authored by: Casey Henley and Dave Goodrich