
Posted on: #iteachmsu
Friday, May 22, 2020
Needs Assessment: How to plan the first step to making change
Graduate school is hard. It can be even harder when there are very few people in your program who look like you or come from a background similar to yours. Studies show that not only does the diversity of a program contribute to student experiences, but feelings of inclusion and belonging play an equally important role. In fact, a study of MSU graduate students showed that students who did not feel supported or included due to gender, race, sexual identity, culture, or religion were significantly less satisfied with their graduate programs. This research and our personal experiences suggested that an important project for us as Fellows of the Graduate School would be to improve the experience of underrepresented minority graduate students in our college. Together we built the framework of a multi-tier support system that would provide resources at the student, faculty, and college levels. However, in brainstorming possible resources, we discovered that we had very different ideas of what would benefit students.
 
Experts in change management suggest conducting a needs assessment before attempting to provide a solution, particularly in a big system. Having already established that satisfaction with one’s graduate program depends on underrepresented minority graduate students feeling included, we next needed to figure out what they need in order to feel included and to belong. This article is a step-by-step description of our journey planning a needs assessment to discover how we might best serve underrepresented graduate students.
 
Step 1: Do your research
As PhD students, research is second nature to us! We dug into the academic literature to find studies or theories that offered solutions for improving feelings of belonging, engagement, happiness, success, or retention of underrepresented students in graduate school. The literature also provided a bonus: tips for how to successfully implement the solutions. The list of possibilities was further narrowed when we considered what could be done within the bounds of our college with the resources and time we had.
 
Next, we did some benchmarking, looking at what other campuses had implemented to address the same issue. Many of those efforts were beyond our ability to implement (offering fellowships, for example), but we were really inspired by the student-led efforts we found. A few more solutions were added to our list.
 
Step 2: Construct tools to capture voices
A needs assessment suggests that you must ASSESS NEEDS! It is extremely important not to rely only on your own experiences, or on examples of what others have done, to “create a solution”. Instead, the crux of creating sustainable change is finding out what the recipients of the solution need and thinking through how you might address those needs.
 
We began by creating an open-ended survey that allows underrepresented minority students to tell us freely whether they are satisfied with their experiences in graduate school and to offer suggestions on how to improve those experiences. The second part of the survey asks for feedback and opinions on the list of solutions that we came up with.
 
Next, we made a plan to conduct focus groups. We thought through recruitment, locations, the questions we would ask, and how to merge this information with our survey results. The two methods would allow us to capture the voices of underrepresented students and build our multi-tier support system from their needs and suggestions.
 
Step 3: Identify and engage with administrators
Institutional support is an essential part of creating sustainable change. Getting partners within the administration to lead the effort also increases the chance that the change will last. To begin this process, we reached out to introduce ourselves and our role to the Dean and Assistant Dean of our college. After establishing a friendly connection, we identified the Assistant Dean as someone who could help champion this work and who had a personal interest in the topic.
 
We set up a meeting and prepared for it by outlining our project (including future steps), summarizing the benefits to the MSU community, and preparing a list of “asks”. The meeting went well, with the Assistant Dean agreeing to support and help with the project. She also recommended other administrators we could consult. We followed the same process to engage those administrators.
 
Step 4: Execute Your Plan!
The next step in completing a needs assessment is to do it! Unfortunately, a global pandemic prevented us from carrying out our activities, but we can share our plan.
 
Our survey was to be administered from the Dean’s office directly to underrepresented minority students. We concurrently would begin to recruit and conduct focus groups.
 
The information would be consolidated, and we would decide what types of changes, resources, and solutions to provide in order to enhance the experience of underrepresented minority graduate students. Once we created a proposal, we would begin to engage the necessary stakeholders to bring it to life. For example, one of the solutions on our list was to create a page on the college website featuring all of the available resources for underrepresented students. We would have to compile a list of these resources, request permission from administration, get assistance from the website owners, and establish an updating protocol so that the page stays current.
 
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
The best advice we can offer when conducting a needs assessment is to stay flexible and stay encouraged. You have to be flexible enough to realize that what you thought people needed may not be exactly what they think they need or what can be provided. You will need to be innovative so that your solution satisfies the needs of your audience. Change leadership projects can also be very involved and take a long time. You must bring your patience and enjoy the ride! Stay encouraged throughout the journey and persevere toward your goal.
Authored by: Courtney Bryant and Tatiana Bustos
Posted on: #iteachmsu
Monday, Oct 19, 2020
Reintroducing Eli Review
Hello Everyone, 
It’s me, Nick Noel, your friendly neighborhood instructional designer and technologist. I’d like to take a minute to talk about Eli Review. We’ve had this technology on campus for about seven years. Since then, we’ve gone through many changes, so I think it’s time to talk about it again.
What is it? 
Eli Review is a web-based peer review application. It provides an easy platform for students to review each other's work and provide feedback. Instructors set up the writing assignments, focusing on small, targeted assessments that can build toward larger, more complicated works.
How is it helpful?  
The philosophy behind Eli Review is that “feedback is the engine of writing improvement”. For students to develop as writers, they need meaningful feedback on their work. The act of providing feedback is also beneficial for students' learning. By giving instructors a way to direct peer feedback and interact with it, Eli Review helps create a community of learning within the course, so that students and instructors are all actively engaged in the learning process and are contributing to the success of the class.
Where can I find more information?  
There is a great deal of documentation on Eli Review’s website. You can sign up and gain access to the application by going here: Eli Review Website
There are also numerous resources on how to get started: 

How Eli Review Works 
Navigating Eli Review as an Instructor 
Bill Hart-Davidson talks about how data shaped his teaching during the live classroom activity 
Authored by: Nick Noel
Posted on: GenAI & Education
Thursday, Jun 13, 2024
An Intro to Artificial Intelligence
The following is a collection of publicly sourced articles related to the recent rise of artificial intelligence in everyday applications. Note: following each [hyperlinked] title/subtitle are abstracts generated by ChatGPT after receiving prompts from M. Neal. "Write an abstract for [article link]" is indicated with [URL] abstract. "Write an abstract for [article text]" is indicated with [text] abstract. These very similar prompt-responses are shared consecutively in the hope that reading the abstracts will showcase the variations in ChatGPT output (despite similar input).
Big Tech was moving cautiously on AI. Then came ChatGPT.
Google, Facebook and Microsoft helped build the scaffolding of AI. Smaller companies are taking it to the masses, forcing Big Tech to react.
[URL] Abstract: "ChatGPT, the Large Language Model developed by OpenAI, has been acquired by Google for an undisclosed sum. The acquisition has been widely seen as a move to improve Google's AI capabilities and stay ahead of competitors in the field. The ChatGPT model will be integrated into various Google services and products, including search, voice assistants, and translation services. This acquisition highlights the growing importance of AI technology and the increasing demand for advanced language models like ChatGPT."
[text] Abstract: Meta's Blenderbot, a chatbot released by Facebook's parent company three months before OpenAI's ChatGPT, was deemed a failure due to its over-cautious content moderation. In contrast, ChatGPT's conversational style, which allows for discussion on controversial topics, has made it a mainstream success. As ChatGPT continues to grow in popularity, tech giants such as Meta and Google are feeling pressure to speed up their own AI processes, potentially disregarding safety concerns. This has sparked concerns among AI ethicists, who fear that big tech's rush to market may expose billions of people to potential harms before they have been fully studied. Despite these concerns, companies like OpenAI continue to release their AI tools to the public, viewing it as the only way to assess real-world harm. The fast pace of AI progress requires companies to prioritize making the right decisions and releasing AI models that best serve the community, while considering the broader societal impacts of these innovations.
ChatGPT Is Dumber Than You Think
Treat it like a toy, not a tool.
Abstract: "The deployment of AI language models like ChatGPT, developed by OpenAI, in writing and journalism raises questions about the ethics of AI-generated content. ChatGPT is capable of producing human-like text, making it difficult to distinguish between AI-generated content and content written by a human. The ethical concerns include issues of accuracy, accountability, and bias in AI-generated writing, as well as the potential for AI systems to manipulate public opinion. The article highlights the need for transparent and responsible deployment of AI language models in journalism and writing, and the importance of considering the ethical implications of AI-generated content."
ChatGPT could be used for good, but like many other AI models, it's rife with racist and discriminatory bias
[URL] abstract: ChatGPT, a large language model developed by OpenAI, is prone to biases like many other AI models. Despite its impressive performance in generating human-like text, the model has been found to have biases based on the training data it was exposed to. These biases can be a result of various factors such as the demographic representation in the training data, cultural norms and values, and other systemic biases present in society. It is crucial for developers and users to be aware of these biases and take steps to mitigate them.
[text] abstract: ChatGPT, an advanced artificial intelligence chatbot developed by OpenAI, is at the forefront of the debate on the potential and dangers of AI. Although the chatbot can assist with writing and speaking tasks and provide entertainment, it has also been used to cheat, plagiarize, spread misinformation, and enable unethical business practices. Furthermore, like many chatbots, it has a history of being biased. OpenAI added guardrails to prevent problematic responses, but these were easily bypassed by users. As AI is increasingly used to make important decisions in various industries, such as healthcare and law enforcement, the issue of bias must be addressed to prevent amplifying existing inequalities. This problem is not unique to ChatGPT and is well-documented, with studies showing the negative effects of biased algorithms. The AI industry is divided on who should be held responsible for these biases and whether harmful models should be released. Despite these concerns, AI models are becoming more prevalent in our lives and researchers are urging caution on the potential for harm due to AI biases.
The US Copyright Office says an AI can’t copyright its art
[text] abstract: The US Copyright Office has rejected a request to copyright a work of art created by an AI algorithm named Creativity Machine. A three-person board determined that the AI-generated image lacked the necessary "human authorship" required for protection under copyright law. The board emphasized the vital role of the human mind in creative expression, citing past court decisions which have consistently found that non-human expression is ineligible for copyright protection. This decision highlights the importance of human agency in machine-produced works and raises questions about the limits of AI in the realm of art and copyright.
[Google] An important next step on our AI journey
[URL] abstract: This blog post provides an overview of BARD (Bidirectional Attention Representation for Documents), a new AI-powered search system developed by Google. BARD leverages the latest advances in natural language processing and machine learning to deliver more relevant and accurate search results, even for complex queries. The system utilizes a bidirectional attention mechanism to better understand the context of each word in a document and make connections between words, which helps to improve the relevance of search results. With BARD, Google aims to provide users with a more intuitive and seamless search experience, enabling them to find what they're looking for more quickly and easily.
[text] abstract: Google has opened up its experimental conversational AI service, Bard, to trusted testers ahead of wider availability. Bard aims to combine the world's knowledge with the power, intelligence and creativity of large language models, drawing on information from the web to provide high-quality responses. Bard will initially be released with a lightweight version of LaMDA, the company's language model for dialogue applications. The release will be used to gather feedback and improve quality and speed. AI-powered features in Google Search, which will distill complex information and multiple perspectives, will soon be rolled out. Google also plans to onboard individual developers and enterprises to build on top of its best AI models.
last updated on 02/06/2023
Authored by: Makena Neal & ChatGPT
Posted on: GenAI & Education
Friday, Aug 2, 2024
ChatGPT FAQ for MSU Educators
This Q&A is for educators who are interested in learning more about using ChatGPT in higher education. Use this list to learn the basics of how this technology can enhance teaching and learning experiences while also addressing concerns related to its potential misuse.
There are other chatbot platforms that existed before ChatGPT, such as Jasper.ai for marketing and creative content. There are also competing platforms that quickly entered the market following ChatGPT, such as Google Bard and Bing Chat. Many of the answers below also apply to these other AI platforms.
We are focusing on ChatGPT because of how often educators around the world are discussing its potential for disrupting current teaching and learning practices.

What is ChatGPT?
What can ChatGPT do and not do?
Can I trust ChatGPT?
How can I access ChatGPT to try it out?
What is ChatGPT Plus?
Are there tools that detect ChatGPT writing?
Does Turnitin detect AI generated text?
Are there other MSU supported tools that detect AI?
What other tools can I use to compare AI detection results?
Is there a university policy on the use of AI tools like ChatGPT?
Is there a recommendation for how to address the use of AI tools in my class?
How can I improve the output from ChatGPT?
Could AI chatbots potentially create issues of digital equity for students?
What are the privacy concerns associated with using AI in education?
What is MSU doing and how can I stay connected with future developments?

Q1: What is ChatGPT?
A: ChatGPT is an AI chatbot that was launched by OpenAI in November 2022. GPT stands for Generative Pre-trained Transformer. It is based on a Large Language Model (LLM), which predicts which words are most likely to come next in a sequence. It is not the same as a search engine: rather than retrieving existing pages, it generates responses on the fly from patterns learned from a vast data source. The model is trained to sound like natural language and is optimized for dialogue.
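To make the “predicts the next word” idea concrete, here is a minimal sketch in Python. It is only a toy word-counting ("bigram") model over a made-up word list, not the transformer network ChatGPT actually uses, and every name and value in it is invented for illustration; the point is just the basic move of choosing the next word according to how probable it is given what came before.

```python
# A toy illustration only: a tiny "bigram" model that, like a large language model,
# picks each next word according to how likely words are to follow the previous one.
# The mini corpus and all names here are made up for demonstration; ChatGPT itself
# uses a far larger neural network trained on vastly more text.
import random
from collections import defaultdict, Counter

corpus = "students write drafts students revise drafts students share feedback".split()

# Count which word follows which in the toy corpus.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def next_word(prev: str) -> str:
    """Sample the next word in proportion to how often it followed `prev`."""
    counts = follows[prev]
    if not counts:                      # dead end: no observed follower
        return random.choice(corpus)
    words, weights = list(counts), list(counts.values())
    return random.choices(words, weights=weights)[0]

# Generate a short word sequence starting from "students".
word, output = "students", ["students"]
for _ in range(5):
    word = next_word(word)
    output.append(word)
print(" ".join(output))
```

Because the choice is probabilistic, running the sketch twice can produce different sequences, which mirrors why ChatGPT can give different answers to the same prompt.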
Q2: What can ChatGPT do and not do?
A: What ChatGPT can do:
ChatGPT can generate text related to natural language processing, including, but not limited to, answering questions, translating text from one language to another, providing writing assistance by proofreading, editing, and offering suggestions, generating content (e.g., writing an essay, creating a story or poem, summarizing a long article, brainstorming ideas), and engaging in conversations. The tool can also be used to generate and edit code.
In the context of higher education instruction, some educators have already started experimenting with ChatGPT for developing curriculum plans, learning activities, various types of assessments, and rubrics, as well as providing feedback on students’ writing. Students might use ChatGPT to explore different learning topics, get editing suggestions on their written work, brainstorm ideas for projects, and even generate responses to quizzes, exams, and essays, some of which would raise academic integrity issues.
What ChatGPT cannot do:
ChatGPT does not access current websites for information, and according to its statement of limitations, ChatGPT has “limited knowledge of world events after 2021,” and “may occasionally generate incorrect information” and “harmful instructions or biased content.” It is not very accurate at listing citations/references and all output should be checked, as it often makes things up. However, processes are improved with GPT-4 and results may change significantly with new versions over time.
While ChatGPT can create new content based on the data it has been trained on, it still lacks the ability to generate truly original ideas or solve complex problems that require higher-order thinking and creativity. Even though ChatGPT can assist with providing feedback on student work, it is important to note that OpenAI recommends against educators relying on ChatGPT for this purpose, as giving student feedback involves decision making and complex, context-based considerations. See Educator Considerations for ChatGPT for more details of OpenAI’s discussion on the use of ChatGPT in education.
ChatGPT can be used as an assistant for designing, developing, and teaching courses, but it is not a substitute for educators’ teaching expertise. The best way to learn about its capabilities and limitations is to experiment with ChatGPT within your specific teaching context.
Q3: Can I trust ChatGPT?
A: Ensuring that AI-driven data is accurate and unbiased is very important. The model’s output can sound convincing, but it doesn’t “know” what it is saying and will at times make things up. It is not a substitute for human expertise, judgement, and responsibility. Educators and students need to critically evaluate the information generated by ChatGPT. In practice, assume there are inaccuracies and possible biases (see OpenAI’s FAQ and the given limitations statement).
Ensuring that the collected data is secure and used ethically is also a major challenge. Avoid entering sensitive information. Do not provide any student information or student grades to ChatGPT, as it may be a FERPA violation for disclosing educational records to a third party without the student’s written consent.
Q4: How can I access ChatGPT to try it out?
A: The free version, GPT-3.5, is available at chat.openai.com. Sign up with an email address or Google account. You can create separate sign-ins for work and personal use by registering with different email addresses. The sign-up process will ask for a phone number and send you a code for verification. You can use the same phone number for the verification process on each account.
After signing up, go to chat.openai.com/chat. There is a text input field at the bottom where you will enter your prompt. Select “Regenerate response” for another version and scroll through the numbers to the left of the prompt to view each version. Continue refining your results by giving subsequent prompts or start a new chat from the menu on the left. You can edit chat labels or delete them by selecting the item in the menu. If you find an answer is incorrect, you can provide feedback by using the "Thumbs Down" button.
Q5: What is ChatGPT Plus (ChatGPT-4)?
A: ChatGPT Plus (ChatGPT-4) is available with a $20/month subscription. The upgrade provides better access during high demand, faster responses, and priority access to new features. One recently added plugin provides web browsing for more current data. GPT-4 surpasses the free version in its advanced reasoning capabilities and can solve difficult problems with greater accuracy. You can access it from the menu options after signing up for the free version.
Q6: Are there tools that detect ChatGPT writing?
A: There are tools that claim to be able to detect AI generated text (e.g., Turnitin, CheckGPT, GPTZero). However, keep in mind that the only evidence is the presence of statistical markers about the likelihood of word patterns. It is possible for human writing to fall along similar patterns, leading to false positives, and it is also possible to produce AI-generated responses that go undetected, leading to false negatives. As of Aug. 1, no reliable detector has been identified, and the detectors that are widely available have been shown by multiple studies to be biased against English language learners and people with disabilities (e.g., GPT detectors are biased against non-native English writers by Liang et al. and A study of implicit bias in pretrained language models against people with disabilities by Venkit et al.). Do not assume AI detectors are giving you absolute facts (see Can AI Generated Text be Reliably Detected by Sadasivan et al.), rather let them guide you in addressing concerns with students. 
The best approach is to have a conversation with the student about whether, how, and why they used the tool. For example, an international student may have entered their own work to polish up the language structure. Some students may not know that using the technology constitutes academic dishonesty, or to what extent they are allowed to get AI assistance if it hasn’t been mentioned explicitly. Be careful how you approach students and consider refraining from mentioning the use of AI detectors as a threat. “The use of these tools to evaluate student text can increase students’ anxiety and stress (both of which have been found to inhibit learning), while also creating an atmosphere of distrust.” [source: Evaluating Student Text with AI Text Detectors]
Bottom line, dedicate some classroom time to educating students about AI and what you consider to be misuse in your class. Have open discussions about its benefits and limitations. Help students understand the downsides of relying on it and emphasize the importance of developing their own writing abilities.
Q7: Does Turnitin detect AI generated text?
A: Turnitin has its own AI writing detection tool; however, it was removed from within the tool on our D2L instance due to concerns over bias and unreliability. Turnitin acknowledges that false positives and false negatives are possible. Again, as of Aug. 1, no reliable detector has been identified, and the detectors that are widely available have been shown by multiple studies to be biased against English language learners and people with disabilities (e.g., GPT detectors are biased against non-native English writers by Liang et al. and A study of implicit bias in pretrained language models against people with disabilities by Venkit et al.).
Turnitin Disclaimer:
“Our AI writing assessment is designed to help educators identify text that might be prepared by a generative AI tool. Our AI writing assessment may not always be accurate (it may misidentify both human and AI-generated text) so it should not be used as the sole basis for adverse actions against a student. It takes further scrutiny and human judgment in conjunction with an organization's application of its specific academic policies to determine whether any academic misconduct has occurred.”
See the following for more Turnitin resources.

Turnitin’s AI Writing Detection FAQ
Academic integrity in the age of AI
AI conversations: Handling false positives for educators

Return to the Turnitin AI writing resource center for educators periodically to find current articles on this rapidly evolving topic.
Q8: Are there other MSU supported tools that detect AI?
A: Packback is an MSU supported AI tool that monitors student work. When using the Packback discussion tool, students are notified when AI generated text is detected, and guidance is provided on how to use AI tools with integrity. See the article, "Post may have been generated by AI" Reason for Moderation.
Packback also has a free AI detection tool called CheckGPT that can be used even if you don’t use the discussion feature, Packback Questions. CheckGPT will analyze a piece of text, and Packback suggests using what you learn to create teaching moments with your students about ethical use and the importance of academic honesty. The tool’s authors intentionally tuned CheckGPT toward a low false-positive rate. Be aware that as of Aug. 1, no reliable detector has been identified, and the detectors that are widely available have been shown by multiple studies to be biased against English language learners and people with disabilities (e.g., GPT detectors are biased against non-native English writers by Liang et al. and A study of implicit bias in pretrained language models against people with disabilities by Venkit et al.).
Q9: What other tools can I use to compare AI detection results?
A: Some other tools being used include GPTZero, Hive Moderation, and AI Text Classifier. Try submitting your own original work, as well as AI generated results, to get a better understanding of the differences between tools.
GPTZero is a free AI detection tool for educators that is fine-tuned for student writing and academic prose. You can analyze pasted text or upload files. There is also a separate product with a similar name called ZeroGPT.
Hive Moderation has an AI detection tool that will score the likelihood of generated text by segment. On top of an overall score, results include which engine created an image and which segment of text has the most probable artificial content.
If you search the internet, you will likely find many others (e.g., Top 7 Best Plagiarism Checkers For AI-Generated Content). Experiment with the different tools to get a feel for their usefulness within the context of your teaching. Use the tools as a point of discussion rather than as proof of misconduct, which they cannot provide. Given this unreliability, and the disclaimers every vendor attaches, it is unclear whether detection offers an advantage in the long term.
Q10: Is there a university policy on the use of AI tools like ChatGPT?
A: On August 1, 2023, the Office of the Provost at MSU posted the Generative Artificial Intelligence Guidance, and Technology at MSU also shared an Interim Guidance on Data Uses and Risks of Generative AI. Check for related policies within your college or department. Consider explaining the new university guidance, existing academic integrity policies, and your approach to the use of AI tools in your course syllabus. 
MSU policies and resources related to academic integrity:

Spartan Code of Honor
Integrity of Scholarship and Grades
Student Rights and Responsibilities Article 7
MSU Campus Resources for Academic Integrity

Q11: Is there a recommendation for how to address the use of AI tools in my class?
A: Generally speaking, educators have been responding to the rise of AI tools by either resisting or adapting (see ChatGPT and AI Text Generators: Should Academia Adapt or Resist). Resistors may see more issues than benefits in using AI for teaching and assessing students. They prohibit, bypass, or discourage the use of AI tools by returning to in-person pen-and-paper assessments or by using detection tools to flag AI generated content. Educators who take a more adaptive approach will likely see more benefits of using AI in teaching and learning and the need to better prepare students for the challenges they will face in an AI-driven world. Of course, how you address the use of AI tools in your course depends on your specific teaching context and course goals. An instructor teaching writing classes and an instructor teaching AI-related courses are likely to take different approaches.
In your course syllabus, make your expectations clear on whether students can use AI tools,  what students can use them for, what students should not use them for, and whether they need to explain how they have used them. The more detailed your expectations are, the less likely students will misuse these tools.
Carefully (re)design your course activities and assessments, whether you are embracing AI tools or discouraging the use of them in your course. Consider designing activities or assessments that encourage higher-order thinking, critical thinking, and creative thinking (e.g., reflection activities or essays, activities that develop critical digital literacy, work that focuses on process over end-product, and real world, ill-structured problem-solving), which are essential for successful learning and resistant to students’ misuse of AI tools. Below are several resources for designing activities and assessments in the age of AI:

Five ways to prepare writing assignments in the age of AI by Turnitin
New modes of learning enabled by AI chatbots: Three methods and assignments by Ethan and Lilach Mollick, University of Pennsylvania - Wharton School

Q12: How can I improve the output from ChatGPT?
A: You can improve the output with targeted prompts and subsequent tasks, such as defining the structure type (essay, email, letter, blog, lesson, assignment, quiz, rubric, list, table, outline, etc.) and tone (professional, heartfelt, humorous, in the voice of a celebrity, for a fifth grader, etc.). You can give the chatbot a role and a task (prompt: “You are a college professor teaching __; write a ___ about ___.”). You can continue improving the output by providing more context details.
You can also give the model your own material to work from by including it in the prompt. It will not read text from a website just because you provide a URL (try it sometime to see the model make things up); you will have to paste in the text with your request. For example, prompt ChatGPT to give you a summary of … “paste in article text”, or ask for a list of discussion questions to give to students based on the copied article text. If you want something more concise, give a subsequent prompt to “make it shorter”. If the text appears to have cut off at the end, you can extend the length by telling ChatGPT to continue from “paste in the last sentence.” For more, see Open AI’s guide on prompt design.
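If you prefer to script these prompt-design moves rather than typing them into the chat window, a rough sketch using OpenAI’s Python library is below. The FAQ above only describes the web interface, so treat the library call, the model name, and the placeholder article text as assumptions for illustration; the point is simply that the same role-plus-task-plus-pasted-context structure carries over.

```python
# A rough sketch, not part of the guidance above: the same prompt-design ideas
# (a role, a task, an explicit structure and audience, and pasted-in source text)
# expressed through OpenAI's Python library instead of the chat website.
# The model name and placeholder article text are assumptions; an OpenAI API key is required.
from openai import OpenAI

client = OpenAI()  # reads the OPENAI_API_KEY environment variable

article_text = "...paste the article text here..."  # the model will not fetch a URL for you

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # assumed model name; substitute one you have access to
    messages=[
        # Give the chatbot a role...
        {"role": "system",
         "content": "You are a college professor teaching an introductory science course."},
        # ...then a task with a defined structure, tone, and audience, plus the pasted text.
        {"role": "user",
         "content": ("Write five discussion questions for first-year students, "
                     "in a friendly tone, as a numbered list, based on the following article:\n\n"
                     + article_text)},
    ],
)
print(response.choices[0].message.content)
```

Sending a follow-up user message such as “Make it shorter” plays the same role as the subsequent refining prompts described above.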
Q13: Could AI chatbots potentially create issues of digital equity for students?
A: Using AI technology in the classroom may lead to unequal access for students. Those with slower internet speeds or no internet access at home may face disadvantages. Additionally, those who pay for subscriptions may have better access and results. Students at schools that prohibit the technology may also be at a disadvantage. However, AI technology can help alleviate inequity in some cases, such as assisting international students with grammar and improving communication with instructors. It can also provide academic support for students without access to private tutors.
Disclosure: The above paragraph was rewritten with the help of ChatGPT. The original is listed below.
The prompt given was simply “Rewrite: If you allow and even encourage the proper use of AI technology in your classroom, students will not all have the same level of access. Students who experience slower internet speeds or have no access to the internet at home may face a disadvantage. Additionally, during peak usage times, the free version may not be readily available. Those who pay for a subscription will have an advantage with better access and improved results. Students from schools that prohibit the use of the technology may find themselves disadvantaged relative to those who were trained to use it as a tool. The technology may help to alleviate inequity in other cases, such as helping an international student polish their grammar or improve communication with an instructor. Students without access to private tutors can potentially get assistance with their studies.”
Both versions are provided to give you an example of using ChatGPT to improve or shorten a piece of writing.
Q14: What are the privacy concerns associated with using AI in education?
A: The company collects information from you and/or your students when signing up for an account. Entering personally identifiable information about your students in a prompt would be a FERPA violation because prompts may be reviewed by AI trainers to improve their systems. See the privacy policy for more details. Instructors who are embracing the technology as a learning tool may be creating assignments that specifically encourage its use. However, students who are concerned about privacy issues may be reluctant to use the technology. Consider creating an alternative assignment for those cases.
Q15: What is MSU doing and how can I stay connected with future developments?
A: MSU offered a university-wide AI Symposium in February 2023, followed by ongoing conversations within departments and academic programs around the topic. For example, there was an AI/IAH Workshop on April 14, 2023 (resources are posted on iTeach), and other AI coffee talks were posted on the MSU Library calendar. The MSU Spring Conference in May had sessions on the topic, as did the June 2023 EdTech Summit (MSU Educational Technology). Check with your department and the following areas to keep up on the latest developments.
iTeach & CTLI (Center for Teaching and Learning Innovation)

Artificial Intelligence in Higher Education (and Learning)
AI & Education Group – log in to find the group and join the growing list of members.
MSU’s Spring Conference on Teaching and Learning - May 2023
Using AI in Teaching & Learning iTeach playlist

EDLI (Enhanced Digital Learning Initiative) and Broad College

Short Overview of ChatGPT for University Faculty By Jeremy Van Hof, Eli Broad College of Business, MSU
Quarterly Newsletter Apr 2023

OSSA (Office of Student Support & Accountability) and MSU Policy

When It Comes to Academic Integrity, Even ChatGPT Has the Answer By Jake Kasper, Office of Student Support & Accountability
Office of Student Support & Accountability

Spartan Code of Honor
Integrity of Scholarship and Grades
Student Rights and Responsibilities Article 4
MSU Campus Resources for Academic Integrity



Spartan Newsroom and MSU Today

Are teachers ready for the rise of Artificial Intelligence? - January 20, 2023
What if AI helped write a commencement speech? ComArtSci 2023 commencement

Additional Resources:

Educator Considerations for ChatGPT By OpenAI
ChatGPT Updates and FAQ By OpenAI
ChatGPT General FAQ By OpenAI
AI Chatbot FAQ By Western Carolina University
ChatGPT & Education By Torrey Trust, University of Massachusetts Amherst
Academic integrity in the age of AI By Turnitin
White Paper: How Academia is Adapting to Generative AI
How to Productively Address AI-Generated Text in Your Classroom By Indiana University Bloomington

Currently, there is an explosion of products integrating chat-based AI tools. For example, Quizlet has a new tool called Q-Chat, Khan Academy recently developed Khanmigo, and Grammarly introduced a ChatGPT-style AI tool. On a humorous note, there is CatGPT (not a typo). More guidance and updates are likely to follow this getting-started FAQ.
Originally posted: May 2023
Updated: Nov 2023
Authored by: Sue Halick and Cui Cheng
Posted on: #iteachmsu
Monday, Oct 12, 2020
A Discussion Board Guide for Undergraduate Learning Assistants (Assignment Example)
The ISB202 course is asynchronous, and thus one of the main ways that the teaching team (i.e., the ULAs and instructor [Dr. Bierema]) interacts with students is on the discussion board. This guide was created for ULAs, but much of it might also be helpful for instructors who are new to participating in a discussion board.
 
You can read more about the instructions provided to students as well!
How is Participation Different for a ULA vs. a Student?

In some ways, participation looks the same for a student or a ULA, but the purpose of posting is different. As a student, the goal of participating (whether the student realizes it or not) is to engage with the material of the course in a meaningful way. As a ULA, it is to further discussion so that students engage with the material on a deeper level.
Some ULA posts may be representative of a student post. For instance, if students are posting about a specific topic and you know of a great resource for them to check out, share it and cite it properly. This allows the students to see what a good post looks like.
Other ULA posts are different from a student post. For instance, sometimes we need to ask students to think more deeply or research an idea further. Although these posts would not count as a student participation post, they are essential for furthering that student’s engagement with the material.
Some types of posts may or may not be representative of a student post. For instance, if a discussion is moving off-track, then a student or a ULA may bring it back to the course objectives by asking a meaningful, detailed, possibly leading question. A student may do this, but they are not in charge of this. It is our responsibility as a teaching team to bring the discussion back to the course objectives.

Why Participate in the Discussion Board?

Interact with and form relationships with students.
Illustrate how to set up a meaningful and detailed post.
Move off-topic discussions back to course objectives.
Have students think more deeply about their posts.
Offer new information and new resources relevant to student posts.
Relate posts to what was covered previously in class or what will be covered in the future.
Encourage students to look up information rather than guess.

 
When do I Participate?

Start at the beginning of the discussion week. It can be overwhelming to start later in the week when there are so many posts.
Aim to participate more at the beginning of the semester than later (not vice versa). If we do a good job in our posts and feedback early on, then students should need less of our help later in the semester.
If the class is divided into multiple discussion board groups and there is more than one ULA, then self-assign yourselves to different groups and change it each week. Make a calendar to keep track.

 
How do I Format the Post?

Address the student and, if posting something for the entire class, also refer to the class.
Change the text color of the post. Everyone in the teaching team can use dark blue to make it clear which posts come from the teaching team.
Italicize components of a post that address the entire class. See examples throughout this document.
There may be a few students who posted something similar that you would like to address. If so, acknowledge all of them in your post.

Example response:

Interesting point, [student], about using greenways to connect areas for nature. Although we tend to only think of paths being used by humans, there are also non-human animals that might use these paths. Class, do a little research and let us know which animals might benefit from having these paths and which ones might not benefit. Explain your reasoning and don’t forget to cite your sources. 
 
What do I Post?
The following is a list of different types of possible posts.

Request that the student think more deeply about their post and provide additional information.

Example response to one student:

Student initial post: My family does our best to limit our carbon footprint. We recycle more than we throw away, reuse as much as we can but aim to reduce our plastic consumption, and limit our meat consumption (I consider myself a flexitarian). There are still a lot of adjustments to be made. 
I really like the term "flexitarian," [student]! What are some ways that you and your family are working to reduce your plastic consumption?


Example response to the class:

Context: Students were posting generic concepts regarding ecological footprints.
[Student], you brought up an important point about how overwhelming it can be to take these steps. Some of the points brought up by others, such as remembering to turn off the lights or trying a vegan meal once a month, as [student] mentioned earlier, are things that are not too drastic. Class, look back at your reflections, or if you haven't written one yet, consider this question: are the ideas that you brought up actually likely to happen? Which specific things could you start doing now that wouldn't be too costly or too life changing?


Provide a more specific question than what a student originally posted.

Example response:

Student’s question: “I wonder if in the United States they implemented more options in more places, if people would be less likely to litter.”
Great question, [student]! Class, do some research in your own area. What are the recycling capabilities in your area? Does it match what you see? Explain.


Re-route the conversation to class objectives with leading questions.

Example response:

Context: A course focus was on environmental impacts and a discussion on Covid-19 drifted off-topic.
Nice job, [student] and others, for relating the topics to a truly "current event." Class, how much has delivery increased during these times? How is that impacting the environment overall, compared to the positives of far fewer people driving and less manufacturing? Do some research and get back to us!


Provide more information. Make the post representative of the ideal student post: cite properly and hyperlink as needed.

Example response without a citation:

Great point, [student], about avoiding plastic bags at grocery stores. This point, though, reminded me of another interesting point- Class, how have your behaviors that affect the environment been impacted by COVID 19? For instance, I have used reusable bags for years- this is why it got me thinking- but now many stores will not allow them during this pandemic. On the flip side of things, I live in Kalamazoo and used to commute to East Lansing three to five days a week, but since campus closed, I barely drive my car at all.


Example response with a citation:

Great point, [student], regarding how we also need to consider our impact on water when determining our ecological footprint. There is actually a calculation for determining our water footprint! Class, check out the Water Footprint Calculator created by the Grace Communications Foundation (2020). What did you find out? Compare it to the ecological footprint quiz: why do you think those similarities and differences exist?

Citation: Grace Communications Foundation. (2020). Water footprint calculator [web interactive]. Retrieved from https://www.watercalculator.org/

Ask the class to think about topics in different ways.

Example response:

Context: Each student had to choose a specific stakeholder to represent, and after the first day of discussion everyone had chosen the same stakeholder.
Hi class, we've heard from a couple of "environmental scientists" now and they have opposing views. After reading through them and critiquing their explanations, what do you think? Would one area be better than another? Those who have written their posts from the perspective of an environmental scientist, what do you think? (Please note that I am not leading to one right answer; the goal is to evaluate the evidence.)


Admire and acknowledge a great post

Example response:

Interesting question, [student]- it takes the idea of our case study this week and really scales it up. I look forward to hearing students' responses!

How do I Build Relationships?

Address the specific student that you are replying to.

If more than one student posted a similar idea, then state all of their names.
Or, if it is more than a few students, use the name of the person to whom you are replying and add “others” or “and other classmates.”
If posting a question to the entire class, then address the class, not just the student who you are responding to.
See the various examples in this document.


Acknowledge the good work that students do, such as participating early in the week.

Example response to a student that posted early and thoroughly:

Thanks for starting the conversation, [student]! [Student] concluded that this greenway "would help all of the public in many ways." Class, which stakeholders are part of the "public" and which ones benefit from this greenway being between 2 and 5 (besides City Bicycling Club, as [student] thoroughly described)?


Example response to a student that posted a detailed question:

Great questions, [student]! Class, I look forward to your thoughts!


Example response to a student that posted a useful resource:

Thanks for sharing the infographic, [student]! I'm going to post it in the announcements so that students in all of the DB sections are aware of it!


Encourage students to bring in information specific to their discipline.

Example response:

Very interesting question, Sean! Any engineers in the class (or engineer enthusiasts)? How could Houston approach this problem? Feel free to bring in ideas learned from other classes!


If a student seems to be struggling with developing thorough posts, then send the student an email about the post in question (an example is included below). Do not wait until grading the following week.

Example Email:

Thank you, [name], for participating in the discussion board. Although I noticed that you have already done a few posts, they are not quite yet demonstrating critical thinking. For instance, rather than posting that someone should look into a topic, do the investigation on your own. What did you find and where did you find it? Post a reply with your findings and citations and explain how it relates to what your peer posted. Please see the rubric on D2L for details on what we look for in posts and contact us if you have any questions. We are here to help!
Authored by: Andrea Bierema
Posted on: #iteachmsu
Monday, Oct 3, 2022
Peer-Educator Dialogue Guide
Peer-Educator Dialogue Guide[*]
This is a "Checklist" guide, not a scaled rating or evaluation form. This guide is intended to be used as a tool to enable educators… “who teach, supervise and/or support students’ learning to gain feedback from one or more colleagues as part of the process of reflecting on their own practices” (University of Exeter). It asks peer-educators to indicate the presence of teaching activities/behaviors already established as indicative of high-quality teaching.  Individual educators, units, departments, etc. can determine which of the items in the categorized lists below reflect their priorities; a targeted set of items per list will make the guide easier for educators to use.
Date:
Time:
Instructor-educator name:
Course #:
Course Title:
Modality:
No. Students:
Peer-Educator name:
Peer-educator instructions:  Indicate with a check (√) the presence of the following actions and behaviors that indicate high quality teaching.  Leave blank items you do not observe.  Use N/A if an item is not relevant for this experience or the instructor’s teaching style. 
Variety and Pacing of Instruction
The instructor-educator:

uses more than one form of instruction
pauses after asking questions
accepts students’ responses
draws non-participating students into activities/discussions
prevents specific students from dominating activities/discussions
helps students extend their responses
guides the direction of discussion
mediates conflict or differences of opinion
demonstrates active listening
provides explicit directions for active learning tasks (e.g. rationale, duration, product)
allows sufficient time to complete tasks such as group work
specifies how learning tasks will be evaluated (if at all)
provides opportunities and time for students to practice

Examples of instructor-educator actions or behaviors that support the above indications (√):
 
 
Organization
The instructor-educator:

arrives on time
relates this and previous class(es), or provides students with an opportunity to do so
provides class goals or objectives for the class session
provides an outline or organization for the class session
knows how to use the educational technology needed for the class
locates class materials as needed
makes transitional statements between class segments
follows the stated structure
conveys the purpose of each class activity or assignment
completes the scheduled topics
summarizes periodically and at the end of class (or prompts students to do so)

Examples of instructor-educator actions or behaviors that support the above indications (√):
 
 
Presentation Skills
The instructor-educator:

is audible to all students
articulates words so that they are understandable to students, and/or visually represents words that might be difficult for students to hear
varies the tone and pitch of voice for emphasis and interest
speaks at a pace that permits students to understand and take notes
establishes and maintains eye contact
avoids over-reliance on reading content from notes, slides, or texts
avoids distracting mannerisms
uses visual aids effectively (e.g. when appropriate to reinforce a concept, legible handwriting, readable slides)
effectively uses the classroom space

Examples of instructor-educator actions or behaviors that support the above indications (√):

Clarity
The instructor-educator:

notes new terms or concepts
elaborates or repeats complex information
uses examples to explain content
makes explicit statements drawing student attention to certain ideas
pauses during explanations to ask and answer questions

Examples of instructor-educator actions or behaviors that support the above indications (√):
 
 
Content Knowledge
The instructor-educator:

makes statements that are accurate according to the standards of the field
incorporates current research in the field
identifies sources, perspectives, and authorities in the field
identifies diverse sources, perspectives, and authorities in the field
communicates the reasoning process behind operations and/or concepts

Examples of instructor-educator actions or behaviors that support the above indications (√):
 
 
Instructor-Student Rapport
The instructor-educator:

attends respectfully to student comprehension or puzzlement
invites students’ participation and comments
treats students as individuals (e.g. uses students’ names)
provides periodic feedback
incorporates student ideas into class
uses positive reinforcement (i.e. doesn’t punish or deliberately embarrass students in class)

Examples of instructor-educator actions or behaviors that support the above indications (√):

General Peer-Educator Reflection:
What did you observe that went well?
What suggestions for enhancement do you have?
Additional Comments:
 
[*]  Adapted 1/2006 from Chism, N.V.N. (1999) Chapter 6: Classroom Observation, Peer Review of Teaching: A Sourcebook.  Bolton, MA: Anker Publishing, by Angela R. Linse, Executive Director, Schreyer Institute for Teaching Excellence, Penn State.  If you further adapt this form, please include this source citation.
Posted by: Makena Neal
Posted on: #iteachmsu
Tuesday, Sep 15, 2020
High Impact Assessment Peer Review: Assignment Example
In ISB202, Spring Semester 2020, the final course assignment was an infographic (read more about the assignment instructions in the document). Part of the design of this high-impact assessment was peer review. Included here are the instructions and grading rubric for the peer review assignment, in which students provide feedback to each other online on their final project.
Authored by: Andrea Bierema
Posted on: #iteachmsu
Wednesday, Mar 3, 2021
Action planning with data: Checking in throughout the semester
If you’re wondering, “do I have to wait until end-of-semester evaluations to see if my changes improved students’ learning experience?”, the answer is no! There are simple ways that you can check in with students in your course throughout the semester. Dr. Ashley Moore is an Assistant Professor in MSU’s Department of Teacher Education. Dr. Moore shared some of her approaches for “checking in” in the current remote learning environment.
Every two to three weeks, Ashley distributes a Google Form to her class. She discloses her commitment to this practice at the start of the semester and includes questions related to her instructional approaches, materials, and uses of technology. The survey is anonymous, so learners can share feedback without fear of retaliation (just like with the mid-semester feedback survey). As part of this practice, Ashley compiles the results into key ideas and themes to share out with her students the next week. Her reasons for this are twofold: 1) it helps students feel heard, and 2) it reinforces that students are often not alone in their feelings/feedback. Transparency in regards to decision making is key in Dr. Moore’s courses, and the data from these brief check-ins help her provide evidence for why decisions are being made and call out changes in real time. Two samples from Dr. Moore’s TE 101 check-in surveys are included below:

In real time, Dr. Moore uses a few approaches to check in with her students. Ashley utilizes...
Zoom polling: (learn more about utilizing this Zoom feature here: Polling for meetings – Zoom Help Center)

to check in on how people are feeling after a reading or activity

Knowing where students are emotionally influences how they engage and how Ashley, as the instructor, considers structuring the remaining dialogue

Basics on course prep

Knowing whether or not students have purchased the course readings, and whether or not those readings have arrived, is good for level-setting expectations. 


“Exit ticket” in Zoom chat: 

Directive: Drop one word or phrase about how you’re feeling as we wrap up class

Ashley uses the chat from the synchronous class to send a synthesis email to students containing an overview of the meeting, highlights from the discussions, and reminders of expectations and upcoming deadlines


Class (verbal) dialogue:

Ashley uses Poll Everywhere to allow students to send questions and comments to the call anonymously and in real time. She then uses these comments/questions to continue driving and prompting class conversation. 

If there is anything harmful, Ashley can identify it in the background and reframe it before introducing it to the class

Intro check-in:

Ashley logs on 15 minutes early to her synchronous course meetings. She plays music and usually displays a meme or prompt, asking for responses as learners log in. 

Example: What's one thing you're proud of yourself for doing in the past week?

She starts class officially with an overview of the day, then addresses the comments from the intro check-ins generally.


Using tactics like those described by Dr. Moore can help you determine whether the changes you’ve enacted based on mid-semester feedback are meeting students’ needs or whether further adaptation is necessary, rather than waiting until end-of-year evaluations!
 
To read more about Ashley’s background and hear about her graduate experiences check out: https://grad.msu.edu/spotlights/ashley-moore 
Authored by: Makena Neal & Ashley Moore