Posted on: #iteachmsu

Dr. Moroski-Rigney Introduces Writing Centers and Access: A Disability Justice Speaker Series
Dr. Karen Moroski-Rigney
Bio:
I’m Karen-Elizabeth Moroski-Rigney, one of the Associate Directors of The Writing Center. I just joined Michigan State University in Fall 2019! Previously, I had been English & WGSS faculty at Penn State (University Park) where I also coordinated the Undergraduate Writing Center at Penn State Learning. Here at MSU, I’m also one of the inaugural Faculty Accessibility Fellows and a 2019-2020 CAL Leadership Fellow. I’m deeply interested in questions of accessibility and intersectionality — I want to know how the field of writing and writing center studies can continue to engage the digital and the accessible in a pedagogically and personally complex world.
I completed a master’s degree (2013) and doctorate (2017) in Critical Theory (specializing in affective neuroscience, trauma, and the writing process) at Binghamton University. After my PhD, I returned to my first love: Writing Center work. I’m very active in service work for the field of writing center studies! I am an Associate Editor for WAC Clearinghouse; I serve on the board of the International Writing Center Association (IWCA); I spent two years as the elected Secretary and an executive board member for the Mid-Atlantic Writing Center Association (MAWCA) and am the current secretary of the Michigan Writing Center Association (MiWCA); I chaired the scholarships and travel grants committee for IWCA in 2018. Further, I gave a keynote address entitled “Toward a Translingual Writing Center” at the Southeastern Writing Centre Symposium in Toronto in December 2018.
In 2019, I had several peer-reviewed articles published: a colloquium in the Spring 2019 edition of WPA Journal; an article in the Pre/Text special edition on queer rhetorics (Vol 24); and a chapter in a recent edited collection entitled The Pedagogical Potential of Story (Peter Lang, 2019). Still in the pipeline to publication is a chapter in Holly Jackson’s Sharing Spaces and Students: Employing Students in Collaborative Partnerships. My work has also been featured in WLN Blog and Slate.com.
"Writing Centers and Access: A Disability Justice Speaker Series" brings together the fields of composition, writing center studies, and disability studies to ask (and begin the process of answering) the question: How do writing centers engage the process of disability justice? Featuring world-renowned disability studies scholars, the series seeks to generate conversation and provide community to teachers of writing, to writers, to writing center professionals, and to communities both on our campuses and beyond about the ways in which disability affects writers, writing, and higher education. The series is hosted by The Writing Center at MSU through the hard work of their Accessibility Committee, under the supervision of Dr. Karen Moroski-Rigney.
Authored by: information provided by The Writing Center at MSU
Pedagogical Design
Posted on: GenAI & Education

ChatGPT FAQ for MSU Educators
This Q&A is for educators who are interested in learning more about using ChatGPT in higher education. Use this list to learn the basics on how this technology can enhance teaching and learning experiences while also addressing concerns related to its potential misuse.
There are other chatbot platforms that existed before ChatGPT, such as Jasper.ai for marketing and creative content. There are also competing platforms that quickly entered the market following ChatGPT, such as Google Bard and Bing Chat. Many of the answers below also apply to these other AI platforms.
We are focusing on ChatGPT because of how often educators around the world are discussing its potential for disrupting current teaching and learning practices.
What is ChatGPT?
What can ChatGPT do and not do?
Can I trust ChatGPT?
How can I access ChatGPT to try it out?
What is ChatGPT Plus?
Are there tools that detect ChatGPT writing?
Does Turnitin detect AI generated text?
Are there other MSU supported tools that detect AI?
What other tools can I use to compare AI detection results?
Is there a university policy on the use of AI tools like ChatGPT?
Is there a recommendation for how to address the use of AI tools in my class?
How can I improve the output from ChatGPT?
Could AI chatbots potentially create issues of digital equity for students?
What are the privacy concerns associated with using AI in education?
What is MSU doing and how can I stay connected with future developments?
Q1: What is ChatGPT?
A: ChatGPT is an AI chatbot launched by OpenAI in November 2022. GPT stands for Generative Pre-trained Transformer. It is based on a Large Language Model (LLM) that predicts which words are most likely to come next in a sequence. It is not the same as a search engine: rather than retrieving documents, it generates responses on the fly from patterns learned in a vast body of training data. The model is trained to sound like natural language and is optimized for dialogue.
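The "next word" idea can be made concrete with a toy sketch. The bigram model below is a hypothetical illustration only, not how ChatGPT is actually built; real LLMs use neural networks over subword tokens trained on enormous corpora, but the core idea of scoring likely continuations is the same.

```python
from collections import Counter, defaultdict

# Count which word follows which in a tiny corpus, then turn the
# counts into probabilities over possible next words.
corpus = "the cat sat on the mat the cat ate the fish".split()

following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def next_word_probabilities(word):
    counts = following[word]
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

# In this corpus, "cat" is the most likely word after "the"
print(next_word_probabilities("the"))
```

Generating text is then just repeatedly sampling a likely next word and appending it, which is also why fluent-sounding output can be confidently wrong: the model optimizes for plausible continuations, not verified facts.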
Q2: What can ChatGPT do and not do?
A: What ChatGPT can do:
ChatGPT can perform a range of natural language tasks, including, but not limited to: answering questions; translating text from one language to another; providing writing assistance by proofreading, editing, and offering suggestions; generating content (e.g., writing an essay, creating a story or poem, summarizing a long article, brainstorming ideas); and engaging in conversations. The tool can also be used to generate and edit code.
In the context of higher education instruction, some educators have already started experimenting with ChatGPT for developing curriculum plans, learning activities, various types of assessments, and rubrics, as well as providing feedback on students’ writing. Students might use ChatGPT to explore different learning topics, get editing suggestions on their written work, brainstorm ideas for projects, and even generate responses to quizzes, exams, and essays, some of which would raise academic integrity issues.
What ChatGPT cannot do:
ChatGPT does not access current websites for information, and according to its statement of limitations, ChatGPT has “limited knowledge of world events after 2021,” and “may occasionally generate incorrect information” and “harmful instructions or biased content.” It is not accurate at listing citations/references, and all output should be checked, as it often makes things up. However, accuracy improves with GPT-4, and results may change significantly with new versions over time.
While ChatGPT can create new content based on the data it has been trained on, it still lacks the ability to generate truly original ideas or solve complex problems that require higher-order thinking and creativity. Even though ChatGPT can assist with providing feedback on student work, it is important to note that OpenAI recommends against educators relying on ChatGPT, as giving student feedback involves possible decision making and complex, context-based considerations. See Educator Considerations for ChatGPT for more details of OpenAI’s discussion on the use of ChatGPT in education.
ChatGPT can be used as an assistant for designing, developing, and teaching courses, but it is not a substitute for educators’ teaching expertise. The best way to learn about its capabilities and limitations is to experiment with ChatGPT within your specific teaching context.
Q3: Can I trust ChatGPT?
A: Ensuring that AI-driven data is accurate and unbiased is very important. The model’s output can sound convincing, but it doesn’t “know” what it is saying and will at times make things up. It is not a substitute for human expertise, judgement, and responsibility. Educators and students need to critically evaluate the information generated by ChatGPT. In practice, assume there are inaccuracies and possible biases (see OpenAI’s FAQ and the given limitations statement).
Ensuring that the collected data is secure and used ethically is also a major challenge. Avoid entering sensitive information. Do not provide any student information or student grades to ChatGPT, as it may be a FERPA violation for disclosing educational records to a third party without the student’s written consent.
Q4: How can I access ChatGPT to try it out?
A: The free version, GPT-3.5, is available at chat.openai.com. Sign up with an email address or Google account. You can create separate sign-ins for work and personal use by registering with different email addresses. The sign-up process will ask for a phone number and send you a code for verification; the same phone number can be used to verify multiple accounts.
After signing up, go to chat.openai.com/chat. There is a text input field at the bottom where you will enter your prompt. Select “Regenerate response” for another version and scroll through the numbers to the left of the prompt to view each version. Continue refining your results by giving subsequent prompts or start a new chat from the menu on the left. You can edit chat labels or delete them by selecting the item in the menu. If you find an answer is incorrect, you can provide feedback by using the "Thumbs Down" button.
Q5: What is ChatGPT Plus (ChatGPT-4)?
A: ChatGPT Plus (ChatGPT-4) is available with a $20/month subscription. The upgrade provides better access during high demand, faster responses, and priority access to new features; a recently added plugin provides web browsing for more current data. GPT-4 surpasses the free version in its advanced reasoning capabilities and can solve difficult problems with greater accuracy. You can access it from the menu options after signing up for the free version.
Q6: Are there tools that detect ChatGPT writing?
A: There are tools that claim to be able to detect AI generated text (e.g., Turnitin, CheckGPT, GPTZero). However, keep in mind that the only evidence is the presence of statistical markers about the likelihood of word patterns. It is possible for human writing to fall along similar patterns, leading to false positives, and it is also possible to produce AI-generated responses that go undetected, leading to false negatives. As of Aug. 1, no reliable detector has been identified, and the detectors that are widely available have been shown by multiple studies to be biased against English language learners and people with disabilities (e.g., GPT detectors are biased against non-native English writers by Liang et al. and A study of implicit bias in pretrained language models against people with disabilities by Venkit et al.). Do not assume AI detectors are giving you absolute facts (see Can AI Generated Text be Reliably Detected by Sadasivan et al.); rather, let them guide you in addressing concerns with students.
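To see why such statistical markers cannot serve as proof, consider a deliberately crude "predictability" score. This is a made-up illustration, not any real detector's method; the word list and sentences are invented for the example.

```python
# Score a text by how much of it consists of highly common words.
# Plain human prose can look "predictable" (false positive territory),
# while quirky machine output can look "surprising" (false negative).
COMMON_WORDS = set("the of and to a in is it you that".split())

def predictability(text):
    words = text.lower().split()
    return sum(w in COMMON_WORDS for w in words) / len(words)

human_sentence = "It is the best of the best and you know it"
odd_sentence = "Quantum chrysanthemums serenade perpendicular marmalade"

print(predictability(human_sentence))  # scores high despite human authorship
print(predictability(odd_sentence))    # scores low regardless of origin
```

Real detectors use far more sophisticated statistics, but they face the same underlying problem: a score measures how typical the word patterns are, not who produced them.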
The best approach is to have a conversation with the student about whether, how, and why they used the tool. For example, an international student may have entered their own work to polish up language structure. Some students may not know that using the technology constitutes academic dishonesty, or to what extent they are allowed to get AI assistance if it hasn’t been mentioned explicitly. Be careful how you approach students, and consider refraining from mentioning the use of AI detectors as a threat. “The use of these tools to evaluate student text can increase students’ anxiety and stress (both of which have been found to inhibit learning), while also creating an atmosphere of distrust.” [source: Evaluating Student Text with AI Text Detectors]
Bottom line, dedicate some classroom time to educating students about AI and what you consider to be misuse in your class. Have open discussions about its benefits and limitations. Help students understand the downsides of relying on it and emphasize the importance of developing their own writing abilities.
Q7: Does Turnitin detect AI generated text?
A: Turnitin has its own AI writing detection tool; however, it was removed from our D2L instance due to concerns over bias and unreliability. Turnitin acknowledges that false positives and false negatives are possible. Again, as of Aug. 1, no reliable detector has been identified, and the detectors that are widely available have been shown by multiple studies to be biased against English language learners and people with disabilities (e.g., GPT detectors are biased against non-native English writers by Liang et al. and A study of implicit bias in pretrained language models against people with disabilities by Venkit et al.).
Turnitin Disclaimer:
“Our AI writing assessment is designed to help educators identify text that might be prepared by a generative AI tool. Our AI writing assessment may not always be accurate (it may misidentify both human and AI-generated text) so it should not be used as the sole basis for adverse actions against a student. It takes further scrutiny and human judgment in conjunction with an organization's application of its specific academic policies to determine whether any academic misconduct has occurred.”
See the following for more Turnitin resources.
Turnitin’s AI Writing Detection FAQ
Academic integrity in the age of AI
AI conversations: Handling false positives for educators
Return to the Turnitin AI writing resource center for educators periodically to find current articles on this rapidly evolving topic.
Q8: Are there other MSU supported tools that detect AI?
A: Packback is an MSU supported AI tool that monitors student work. When using the Packback discussion tool, students are notified when AI generated text is detected, and guidance is provided on how to use AI tools with integrity. See the article, "Post may have been generated by AI" Reason for Moderation.
Packback also has a free AI detection tool called CheckGPT that can be used even if you don’t use the discussion feature, Packback Questions. CheckGPT will analyze a piece of text; use what you learn to create teaching moments with your students about ethical use and the importance of academic honesty. The tool’s author intentionally tuned CheckGPT toward a low false-positive rate. Be aware that, as of Aug. 1, no reliable detector has been identified, and the detectors that are widely available have been shown by multiple studies to be biased against English language learners and people with disabilities (e.g., GPT detectors are biased against non-native English writers by Liang et al. and A study of implicit bias in pretrained language models against people with disabilities by Venkit et al.).
Q9: What other tools can I use to compare AI detection results?
A: Some other tools being used include GPTZero, Hive Moderation, and AI Text Classifier. Try submitting your own original work, as well as AI generated results, to get a better understanding of the differences between tools.
GPTZero is a free AI detection tool for educators that is fine-tuned for student writing and academic prose. You can analyze pasted text or upload files. There is also a separate product with a similar name called ZeroGPT.
Hive Moderation has an AI detection tool that will score the likelihood of generated text by segment. On top of an overall score, results include which engine created an image and which segment of text has the most probable artificial content.
If you search the internet, you will likely find many others (e.g., Top 7 Best Plagiarism Checkers For AI-Generated Content). Experiment with the different tools to get a feel for their usefulness within the context of your teaching. Use the tools as a point of discussion rather than as proof of misconduct, which they cannot establish. Given this unreliability, and since every tool carries a disclaimer, it is unclear whether detection offers any long-term advantage.
Q10: Is there a university policy on the use of AI tools like ChatGPT?
A: On August 1, 2023, the Office of the Provost at MSU posted the Generative Artificial Intelligence Guidance, and Technology at MSU also shared an Interim Guidance on Data Uses and Risks of Generative AI. Check for related policies within your college or department. Consider explaining the new university guidance, existing academic integrity policies, and your approach to the use of AI tools in your course syllabus.
MSU policies and resources related to academic integrity:
Spartan Code of Honor
Integrity of Scholarship and Grades
Student Rights and Responsibilities Article 7
MSU Campus Resources for Academic Integrity
Q11: Is there a recommendation for how to address the use of AI tools in my class?
A: Generally speaking, educators have been responding to the rise of AI tools by either resisting or adapting (see ChatGPT and AI Text Generators: Should Academia Adapt or Resist). Resisters may see more issues than benefits in using AI for teaching and assessing students; they prohibit, bypass, or discourage the use of AI tools by returning to in-person pen-and-paper assessments or by using detection tools to flag AI-generated content. Educators who take a more adaptive approach will likely see more benefits of using AI in teaching and learning, and the need to better prepare students for the challenges they will face in a post-AI world. Of course, how you address the use of AI tools in your course depends on your specific teaching context and course goals. An instructor teaching writing classes and an instructor teaching AI-related courses are likely to take different approaches.
In your course syllabus, make your expectations clear on whether students can use AI tools, what students can use them for, what students should not use them for, and whether they need to explain how they have used them. The more detailed your expectations are, the less likely students will misuse these tools.
Carefully (re)design your course activities and assessments, whether you are embracing AI tools or discouraging the use of them in your course. Consider designing activities or assessments that encourage higher-order thinking, critical thinking, and creative thinking (e.g., reflection activities or essays, activities that develop critical digital literacy, work that focuses on process over end-product, and real world, ill-structured problem-solving), which are essential for successful learning and resistant to students’ misuse of AI tools. Below are several resources for designing activities and assessments in the age of AI:
Five ways to prepare writing assignments in the age of AI by Turnitin
New modes of learning enabled by AI chatbots: Three methods and assignments by Ethan and Lilach Mollick, University of Pennsylvania - Wharton School
Q12: How can I improve the output from ChatGPT?
A: You can improve the output with targeted prompts and subsequent tasks, such as defining the structure type (essay, email, letter, blog, lesson, assignment, quiz, rubric, list, table, outline, etc.) and tone (professional, heartfelt, humorous, in the voice of a celebrity, for a fifth grader, etc.). You can give the chatbot a role and a task (prompt: “you are a college professor teaching __, write a ___ about ___). You can continue improving the output by providing more context details.
You can also supply the model with your own source material. It will not read text from a website when given a URL (try it sometime and watch the model make things up); you will have to paste the text in with your request. For example, prompt ChatGPT to give you a summary of … “paste in article text” or ask for a list of discussion questions to give to students from the copied article text. If you want something more concise, give a subsequent prompt to “make it shorter”. If the text appears to have cut off at the end, you can extend the length by telling ChatGPT to continue from “paste in the last sentence.” For more, see Open AI’s guide on prompt design.
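The role-plus-task pattern described above can be captured in a small helper that assembles a prompt from its parts. The template wording, the biology example, and the function name are hypothetical; this is a sketch of the pattern, not an official prompt format.

```python
# Assemble a structured prompt: a role, a task, an output format,
# and (optionally) pasted-in source text the model should stick to.
def build_prompt(role, task, output_format, source_text=None):
    parts = [
        f"You are {role}.",
        f"Task: {task}.",
        f"Format the output as: {output_format}.",
    ]
    if source_text:
        parts.append(f'Use only the following text:\n"""\n{source_text}\n"""')
    return "\n".join(parts)

prompt = build_prompt(
    role="a college professor teaching introductory biology",
    task="write five discussion questions for students",
    output_format="a numbered list",
    source_text="(paste the article text here)",
)
print(prompt)
```

Keeping the role, task, and format explicit and separate makes it easy to vary one element at a time and compare the outputs, which is the quickest way to learn what kind of prompting works for your teaching context.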
Q13: Could AI chatbots potentially create issues of digital equity for students?
A: Using AI technology in the classroom may lead to unequal access for students. Those with slower internet speeds or no internet access at home may face disadvantages. Additionally, those who pay for subscriptions may have better access and results. Students at schools that prohibit the technology may also be at a disadvantage. However, AI technology can help alleviate inequity in some cases, such as assisting international students with grammar and improving communication with instructors. It can also provide academic support for students without access to private tutors.
Disclosure: The above paragraph was rewritten with the help of ChatGPT. The original is listed below.
The prompt given was simply “Rewrite: If you allow and even encourage the proper use of AI technology in your classroom, students will not all have the same level of access. Students who experience slower internet speeds or have no access to the internet at home may face a disadvantage. Additionally, during peak usage times, the free version may not be readily available. Those who pay for a subscription will have an advantage with better access and improved results. Students from schools that prohibit the use of the technology may find themselves disadvantaged relative to those who were trained to use it as a tool. The technology may help to alleviate inequity in other cases, such as helping an international student polish their grammar or improve communication with an instructor. Students without access to private tutors can potentially get assistance with their studies.”
Both versions are provided to give you an example of using ChatGPT to improve or shorten a piece of writing.
Q14: What are the privacy concerns associated with using AI in education?
A: The company collects information from you and/or your students when you sign up for an account. Entering personally identifiable information about your students in a prompt would be a FERPA violation, because prompts may be reviewed by AI trainers to improve the systems. See the privacy policy for more details. Instructors who embrace the technology as a learning tool may create assignments that specifically encourage its use; however, students who are concerned about privacy may be reluctant to participate. Consider creating an alternative assignment for those cases.
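As a precaution, obvious identifiers can be stripped from text before it ever reaches a chatbot. The sketch below is a naive illustration only; the sample name, the 8-9 digit ID pattern, and the redaction labels are assumptions for the example, and pattern matching alone does not make sharing student records FERPA-compliant.

```python
import re

# Replace known student names and ID-like number runs with placeholders
# before pasting a note into a third-party tool.
def redact(text, student_names):
    for name in student_names:
        text = re.sub(re.escape(name), "[STUDENT]", text, flags=re.IGNORECASE)
    text = re.sub(r"\b\d{8,9}\b", "[ID]", text)  # assumed ID format
    return text

note = "Jordan Smith (ID 123456789) asked for feedback on the draft."
print(redact(note, ["Jordan Smith"]))
# → "[STUDENT] (ID [ID]) asked for feedback on the draft."
```

The safer habit is simply never to include student information in a prompt at all; redaction is a backstop, not a guarantee.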
Q15: What is MSU doing and how can I stay connected with future developments?
A: MSU offered a university-wide AI Symposium in February 2023, followed by ongoing conversations within departments and academic programs. For example, there was an AI/IAH Workshop on April 14, 2023 (resources are posted on iTeach), and other AI coffee talks were posted on the MSU Library calendar. The MSU Spring Conference in May had sessions on the topic, as did the EdTech Summit in June 2023 (MSU Educational Technology). Check with your department and the following areas to keep up on the latest developments.
iTeach & CTLI (Center for Teaching and Learning Innovation)
Artificial Intelligence in Higher Education (and Learning)
AI & Education Group – login to find the group and join the growing list of members.
MSU’s Spring Conference on Teaching and Learning - May 2023
Using AI in Teaching & Learning iTeach playlist
EDLI (Enhanced Digital Learning Initiative) and Broad College
Short Overview of ChatGPT for University Faculty By Jeremy Van Hof, Eli Broad College of Business, MSU
Quarterly Newsletter Apr 2023
OSSA (Office of Student Support & Accountability) and MSU Policy
When It Comes to Academic Integrity, Even ChatGPT Has the Answer By Jake Kasper, Office of Student Support & Accountability
Office of Student Support & Accountability
Spartan Code of Honor
Integrity of Scholarship and Grades
Student Rights and Responsibilities Article 4
MSU Campus Resources for Academic Integrity
Spartan Newsroom and MSU Today
Are teachers ready for the rise of Artificial Intelligence? - January 20, 2023
What if AI helped write a commencement speech? ComArtSci 2023 commencement
Additional Resources:
Educator Considerations for ChatGPT By OpenAI
ChatGPT Updates and FAQ By OpenAI
ChatGPT General FAQ By OpenAI
AI Chatbot FAQ By Western Carolina University
ChatGPT & Education By Torrey Trust, University of Massachusetts Amherst
Academic integrity in the age of AI By Turnitin
White Paper: How Academia is Adapting to Generative AI
How to Productively Address AI-Generated Text in Your Classroom By Indiana University Bloomington
Currently, there is an explosion of products integrating chat AI tools. For example, Quizlet has a new tool called Q-Chat, Khan Academy recently developed Khanmigo, and Grammarly introduced a ChatGPT-style AI tool. On a humorous note, there is CatGPT (not a typo). More guidance and updates are likely to follow this getting-started FAQ.
Originally posted: May 2023
Updated: Nov 2023
This Q&A is for educators who are interested in learning more about using ChatGPT in higher education. Use this list to learn the basics on how this technology can enhance teaching and learning experiences while also addressing concerns related to its potential misuse.
There are other chatbot platforms that existed before ChatGPT, such as Jasper.ai for marketing and creative content. There are also competing platforms that quickly entered the market following ChatGPT, such as Google Bard and Bing Chat. Many of the answers below also apply to these other AI platforms.
We are focusing on ChatGPT because of how often educators around the world are discussing its potential for disrupting current teaching and learning practices.
What is ChatGPT?
What can ChatGPT do and not do?
Can I trust ChatGPT?
How can I access ChatGPT to try it out?
What is ChatGPT Plus?
Are there tools that detect ChatGPT writing?
Does Turnitin detect AI generated text?
Are there other MSU supported tools that use AI?
What other tools can I use to compare AI detection results?
Is there a university policy on the use of AI tools like ChatGPT?
Is there a recommendation for how to address the use of AI tools in my class?
How can I improve the output from ChatGPT?
Could AI chatbots potentially create issues of digital equity for students?
What are the privacy concerns associated with using AI in education?
What is MSU doing and how can I stay connected with future developments?
Q1: What is ChatGPT?
A: ChatGPT is an AI chatbot that was launched by OpenAI in November 2022. GPT stands for Generative Pre-trained Transformer. It is based on a Large Language Model (LLM) that checks for the probability of what words are likely to come next in a sequence. It is not the same as a search engine because it generates responses on the fly from a vast data source. The model is trained to sound like natural language and is optimized for dialogue.
Q2: What can ChatGPT do and not do?
A: What ChatGPT can do:
ChatGPT can generate text related to natural language processing, including, but not limited to, answering questions, translating text from one language to another, providing writing assistance by proofreading, editing, and offering suggestions, generating content (e.g., writing an essay, creating a story or poem, summarizing a long article, brainstorming ideas), and engaging in conversations. The tool can also be used to generate and edit code.
In the context of higher education instruction, some educators have already started experimenting with ChatGPT for developing curriculum plans, learning activities, various types of assessments, and rubrics, as well as providing feedback on students’ writing. Students might use ChatGPT to explore different learning topics, get editing suggestions on their written work, brainstorm ideas for projects, and even generate responses to quizzes, exams, and essays, some of which would raise academic integrity issues.
What ChatGPT cannot do:
ChatGPT does not access current websites for information, and according to its statement of limitations, ChatGPT has “limited knowledge of world events after 2021,” and “may occasionally generate incorrect information” and “harmful instructions or biased content.” It is not very accurate at listing citations/references and all output should be checked, as it often makes things up. However, processes are improved with GPT-4 and results may change significantly with new versions over time.
While ChatGPT can create new content based on the data it has been trained on, it still lacks the ability to generate truly original ideas or solve complex problems that require higher-order thinking and creativity. Even though ChatGPT can assist with providing feedback on student work, it is important to note that OpenAI recommends against educators relying on ChatGPT, as giving student feedback involves possible decision making and complex, context-based considerations. See Educator Considerations for ChatGPT for more details of OpenAI’s discussion on the use of ChatGPT in education.
ChatGPT can be used as an assistant for designing, developing, and teaching courses, but it is not a substitute for educators’ teaching expertise. The best way to learn about its capabilities and limitations is to experiment with ChatGPT within your specific teaching context.
Q3: Can I trust ChatGPT?
A: Ensuring that AI-driven data is accurate and unbiased is very important. The model’s output can sound convincing, but it doesn’t “know” what it is saying and will at times make things up. It is not a substitute for human expertise, judgment, and responsibility. Educators and students need to critically evaluate the information ChatGPT generates. In practice, assume there are inaccuracies and possible biases (see OpenAI’s FAQ and the limitations statement cited above).
Ensuring that collected data is secure and used ethically is also a major challenge. Avoid entering sensitive information. Do not provide any student information or student grades to ChatGPT; disclosing educational records to a third party without the student’s written consent may violate FERPA.
Q4: How can I access ChatGPT to try it out?
A: The free version, GPT-3.5, is available at chat.openai.com. Sign up with an email address or Google account; you can create separate sign-ins for work and personal use by registering different email addresses. The sign-up process asks for a phone number and sends you a verification code. The same phone number can be used to verify multiple accounts.
After signing up, go to chat.openai.com/chat. There is a text input field at the bottom where you will enter your prompt. Select “Regenerate response” for another version and scroll through the numbers to the left of the prompt to view each version. Continue refining your results by giving subsequent prompts or start a new chat from the menu on the left. You can edit chat labels or delete them by selecting the item in the menu. If you find an answer is incorrect, you can provide feedback by using the "Thumbs Down" button.
Q5: What is ChatGPT Plus (ChatGPT-4)?
A: ChatGPT Plus is a $20/month subscription. The upgrade provides better access during high demand, faster responses, and priority access to new features, including a recently added web-browsing plugin that provides more current data. GPT-4 surpasses GPT-3.5 in advanced reasoning and can solve difficult problems with greater accuracy. You can access it from the menu options after signing up for the free version.
Q6: Are there tools that detect ChatGPT writing?
A: There are tools that claim to detect AI-generated text (e.g., Turnitin, CheckGPT, GPTZero). Keep in mind, however, that the only evidence these tools produce is statistical: an estimate of how likely the word patterns are to be machine-generated. Human writing can fall along similar patterns, leading to false positives, and AI-generated responses can go undetected, leading to false negatives. As of Aug. 1, no reliable detector has been identified, and the widely available detectors have been shown by multiple studies to be biased against English language learners and people with disabilities (e.g., GPT detectors are biased against non-native English writers by Liang et al. and A study of implicit bias in pretrained language models against people with disabilities by Venkit et al.). Do not assume AI detectors are giving you absolute facts (see Can AI Generated Text be Reliably Detected by Sadasivan et al.); instead, let them guide you in addressing concerns with students.
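The “statistical markers” these detectors rely on can be illustrated with a deliberately simplified sketch. The toy word-frequency model below is invented purely for illustration and bears no relation to any vendor’s actual algorithm; it only shows the underlying idea of scoring how statistically predictable a passage is.

```python
import math

# Toy unigram "language model": made-up relative frequencies of common words.
# Real detectors use large neural language models; this only illustrates the
# idea of scoring how statistically predictable each word choice is.
TOY_MODEL = {
    "the": 0.06, "of": 0.03, "and": 0.03, "to": 0.026, "a": 0.023,
    "in": 0.02, "is": 0.01, "that": 0.01, "it": 0.009, "for": 0.008,
}
UNSEEN_PROB = 0.0001  # probability assigned to words the toy model has never seen

def predictability_score(text: str) -> float:
    """Average log-probability per word; higher (closer to 0) = more predictable."""
    words = text.lower().split()
    logps = [math.log(TOY_MODEL.get(w, UNSEEN_PROB)) for w in words]
    return sum(logps) / len(logps)

# Very common wording scores as more "predictable" than unusual wording --
# which is all a statistical detector can ever measure, and why careful human
# writing can look "machine-like" and trip a false positive.
bland = predictability_score("it is the case that the idea is for the most part a good one")
unusual = predictability_score("quixotic zephyrs vex the bewildered lexicographer")
assert bland > unusual
```

The point of the sketch is that a score like this is a probability estimate about word patterns, not evidence about who actually typed the text.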
The best approach is to have a conversation with the student about whether, how, and why they used the tool. For example, an international student may have entered their own work to polish its language structure. Some students may not know that using the technology constitutes academic dishonesty, or to what extent they are allowed to get AI assistance if it hasn’t been mentioned explicitly. Be careful how you approach students, and consider refraining from mentioning AI detectors as a threat. “The use of these tools to evaluate student text can increase students’ anxiety and stress (both of which have been found to inhibit learning), while also creating an atmosphere of distrust.” [source: Evaluating Student Text with AI Text Detectors]
Bottom line: dedicate some classroom time to educating students about AI and what you consider misuse in your class. Have open discussions about its benefits and limitations. Help students understand the downsides of relying on it, and emphasize the importance of developing their own writing abilities.
Q7: Does Turnitin detect AI generated text?
A: Turnitin has its own AI writing detection tool; however, it was removed from our D2L instance due to concerns over bias and unreliability. Turnitin acknowledges that false positives and false negatives are possible. Again, as of Aug. 1, no reliable detector has been identified, and the widely available detectors have been shown by multiple studies to be biased against English language learners and people with disabilities (e.g., GPT detectors are biased against non-native English writers by Liang et al. and A study of implicit bias in pretrained language models against people with disabilities by Venkit et al.).
Turnitin Disclaimer:
“Our AI writing assessment is designed to help educators identify text that might be prepared by a generative AI tool. Our AI writing assessment may not always be accurate (it may misidentify both human and AI-generated text) so it should not be used as the sole basis for adverse actions against a student. It takes further scrutiny and human judgment in conjunction with an organization's application of its specific academic policies to determine whether any academic misconduct has occurred.”
See the following for more Turnitin resources.
Turnitin’s AI Writing Detection FAQ
Academic integrity in the age of AI
AI conversations: Handling false positives for educators
Return to the Turnitin AI writing resource center for educators periodically to find current articles on this rapidly evolving topic.
Q8: Are there other MSU supported tools that detect AI?
A: Packback is an MSU supported AI tool that monitors student work. When using the Packback discussion tool, students are notified when AI generated text is detected, and guidance is provided on how to use AI tools with integrity. See the article, "Post may have been generated by AI" Reason for Moderation.
Packback also has a free AI detection tool called CheckGPT that can be used even if you don’t use the discussion feature, Packback Questions. CheckGPT analyzes a piece of text and suggests using what you learn to create teaching moments with your students about ethical use and the importance of academic honesty. The tool’s authors intentionally tuned CheckGPT toward a low false positive rate. Be aware that, as of Aug. 1, no reliable detector has been identified, and the widely available detectors have been shown by multiple studies to be biased against English language learners and people with disabilities (e.g., GPT detectors are biased against non-native English writers by Liang et al. and A study of implicit bias in pretrained language models against people with disabilities by Venkit et al.).
Q9: What other tools can I use to compare AI detection results?
A: Some other tools being used include GPTZero, Hive Moderation, and AI Text Classifier. Try submitting your own original work, as well as AI generated results, to get a better understanding of the differences between tools.
GPTZero is a free AI detection tool for educators that is fine-tuned for student writing and academic prose. You can analyze pasted text or upload files. Note that there is a separate product with a similar name, ZeroGPT.
Hive Moderation has an AI detection tool that scores the likelihood that text was generated, segment by segment. On top of an overall score, results indicate which engine likely created an image and which segments of text are most probably artificial.
If you search the internet, you will likely find many others (e.g., Top 7 Best Plagiarism Checkers For AI-Generated Content). Experiment with the different tools to get a feel for their usefulness within the context of your teaching. Use the tools as a point of discussion rather than as proof of misconduct, which they cannot establish. Given their unreliability, and the disclaimers every vendor attaches, it is unclear whether detection offers any long-term advantage.
Q10: Is there a university policy on the use of AI tools like ChatGPT?
A: On August 1, 2023, the Office of the Provost at MSU posted the Generative Artificial Intelligence Guidance, and Technology at MSU also shared an Interim Guidance on Data Uses and Risks of Generative AI. Check for related policies within your college or department. Consider explaining the new university guidance, existing academic integrity policies, and your approach to the use of AI tools in your course syllabus.
MSU policies and resources related to academic integrity:
Spartan Code of Honor
Integrity of Scholarship and Grades
Student Rights and Responsibilities Article 7
MSU Campus Resources for Academic Integrity
Q11: Is there a recommendation for how to address the use of AI tools in my class?
A: Generally speaking, educators have been responding to the rise of AI tools by either resisting or adapting (see ChatGPT and AI Text Generators: Should Academia Adapt or Resist). Resisters may see more issues than benefits in using AI for teaching and assessment; they prohibit, bypass, or discourage the use of AI tools by returning to in-person pen-and-paper assessments or by using detection tools to flag AI-generated content. Educators who take a more adaptive approach are likely to see more benefits of using AI in teaching and learning, and the need to better prepare students for the challenges they will face in a post-AI world. Of course, how you address the use of AI tools in your course depends on your specific teaching context and course goals. An instructor teaching writing classes and an instructor teaching AI-related courses are likely to take different approaches.
In your course syllabus, make your expectations clear on whether students can use AI tools, what students can use them for, what students should not use them for, and whether they need to explain how they have used them. The more detailed your expectations are, the less likely students will misuse these tools.
Carefully (re)design your course activities and assessments, whether you are embracing AI tools or discouraging their use. Consider designing activities or assessments that require higher-order, critical, and creative thinking (e.g., reflection activities or essays, activities that develop critical digital literacy, work that focuses on process over end-product, and real-world, ill-structured problem-solving), which are essential for successful learning and resistant to misuse of AI tools. Below are several resources for designing activities and assessments in the age of AI:
Five ways to prepare writing assignments in the age of AI by Turnitin
New modes of learning enabled by AI chatbots: Three methods and assignments by Ethan and Lilach Mollick, University of Pennsylvania - Wharton School
Q12: How can I improve the output from ChatGPT?
A: You can improve the output with targeted prompts and subsequent tasks, such as defining the structure type (essay, email, letter, blog, lesson, assignment, quiz, rubric, list, table, outline, etc.) and tone (professional, heartfelt, humorous, in the voice of a celebrity, for a fifth grader, etc.). You can give the chatbot a role and a task (prompt: “You are a college professor teaching ___; write a ___ about ___.”). You can continue improving the output by providing more context details.
You can also supply your own source material in the prompt. The model will not read text from a website if you provide a URL (try it sometime to see it make things up); you have to paste the text in with your request. For example, prompt ChatGPT to give you a summary of “[paste in article text],” or ask for a list of discussion questions to give to students from the copied article text. If you want something more concise, give a subsequent prompt to “make it shorter.” If the text appears to have cut off at the end, you can extend it by telling ChatGPT to continue from “[paste in the last sentence].” For more, see OpenAI’s guide on prompt design.
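For readers who prepare many prompts, the role/task/format/tone elements above can be assembled programmatically. This is a minimal sketch, not an official OpenAI technique; the function name and parameters are invented for illustration, and the resulting string would be pasted into (or sent to) the chatbot.

```python
# A minimal sketch of assembling a structured prompt from the elements
# discussed above (role, task, format, tone, pasted source text).
# All names here are illustrative, not part of any API.

def build_prompt(role: str, task: str, output_format: str, tone: str,
                 source_text: str = "") -> str:
    """Combine prompt elements into a single message for a chatbot."""
    parts = [
        f"You are {role}.",
        f"Task: {task}",
        f"Format: {output_format}",
        f"Tone: {tone}",
    ]
    if source_text:
        # Models will not fetch URLs; paste the text itself into the prompt.
        parts.append(f"Use only the following source text:\n{source_text}")
    return "\n".join(parts)

prompt = build_prompt(
    role="a college professor teaching introductory biology",
    task="write five discussion questions about the article below",
    output_format="numbered list",
    tone="accessible to first-year students",
    source_text="(paste the article text here)",
)
print(prompt)
```

Writing the elements out separately like this makes it easy to vary one at a time (for example, swapping the tone) and compare the resulting outputs.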
Q13: Could AI chatbots potentially create issues of digital equity for students?
A: Using AI technology in the classroom may lead to unequal access for students. Those with slower internet speeds or no internet access at home may face disadvantages. Additionally, those who pay for subscriptions may have better access and results. Students at schools that prohibit the technology may also be at a disadvantage. However, AI technology can help alleviate inequity in some cases, such as assisting international students with grammar and improving communication with instructors. It can also provide academic support for students without access to private tutors.
Disclosure: The above paragraph was rewritten with the help of ChatGPT. The original is listed below.
The prompt given was simply “Rewrite: If you allow and even encourage the proper use of AI technology in your classroom, students will not all have the same level of access. Students who experience slower internet speeds or have no access to the internet at home may face a disadvantage. Additionally, during peak usage times, the free version may not be readily available. Those who pay for a subscription will have an advantage with better access and improved results. Students from schools that prohibit the use of the technology may find themselves disadvantaged relative to those who were trained to use it as a tool. The technology may help to alleviate inequity in other cases, such as helping an international student polish their grammar or improve communication with an instructor. Students without access to private tutors can potentially get assistance with their studies.”
Both versions are provided to give you an example of using ChatGPT to improve or shorten a piece of writing.
Q14: What are the privacy concerns associated with using AI in education?
A: The company collects information from you and/or your students when you sign up for an account. Entering personally identifiable student information in a prompt would be a FERPA violation, because prompts may be reviewed by AI trainers to improve the systems. See the privacy policy for more details. Instructors who embrace the technology as a learning tool may create assignments that specifically encourage its use; however, students who are concerned about privacy may be reluctant to use it. Consider creating an alternative assignment for those cases.
Q15: What is MSU doing and how can I stay connected with future developments?
A: MSU offered a university-wide AI Symposium in February 2023, followed by ongoing conversations within departments and academic programs. For example, there was an AI/IAH Workshop on April 14, 2023 (resources are posted on iTeach), and AI coffee talks were posted on the MSU Library calendar. The MSU Spring Conference in May had sessions on the topic, as did the MSU Educational Technology (EdTech) Summit in June 2023. Check with your department and the following areas to keep up on the latest developments.
iTeach & CTLI (Center for Teaching and Learning Innovation)
Artificial Intelligence in Higher Education (and Learning)
AI & Education Group – login to find the group and join the growing list of members.
MSU’s Spring Conference on Teaching and Learning - May 2023
Using AI in Teaching & Learning iTeach playlist
EDLI (Enhanced Digital Learning Initiative) and Broad College
Short Overview of ChatGPT for University Faculty By Jeremy Van Hof, Eli Broad College of Business, MSU
Quarterly Newsletter Apr 2023
OSSA (Office of Student Support & Accountability) and MSU Policy
When It Comes to Academic Integrity, Even ChatGPT Has the Answer By Jake Kasper, Office of Student Support & Accountability
Office of Student Support & Accountability
Spartan Code of Honor
Integrity of Scholarship and Grades
Student Rights and Responsibilities Article 4
MSU Campus Resources for Academic Integrity
Spartan Newsroom and MSU Today
Are teachers ready for the rise of Artificial Intelligence? - January 20, 2023
What if AI helped write a commencement speech? ComArtSci 2023 commencement
Additional Resources:
Educator Considerations for ChatGPT By OpenAI
ChatGPT Updates and FAQ By OpenAI
ChatGPT General FAQ By OpenAI
AI Chatbot FAQ By Western Carolina University
ChatGPT & Education By Torrey Trust, University of Massachusetts Amherst
Academic integrity in the age of AI By Turnitin
White Paper: How Academia is Adapting to Generative AI
How to Productively Address AI-Generated Text in Your Classroom By Indiana University Bloomington
Currently, there is an explosion of tools integrating chat AI. For example, Quizlet has a new tool called Q-Chat, Khan Academy recently developed Khanmigo, and Grammarly introduced a ChatGPT-style AI tool. On a humorous note, there is CatGPT (not a typo). More guidance and updates are likely to follow this getting-started FAQ.
Originally posted: May 2023
Updated: Nov 2023
Authored by: Sue Halick and Cui Cheng
Posted on: #iteachmsu

Spartan Studios Playkit: Appendix
Appendix
This is the ninth and final article in our iTeach.MSU playlist for the Spartan Studios Playkit. This appendix gathers resources related to different elements of interdisciplinary, experiential teaching and course design, with what we hope are useful annotations.
Research from the Spartan Studios project
Heinrich, W. F., Louson, E., Blommel, C., & Green, A. R. (2021). Who Coaches the Coaches? The Development of a Coaching Model for Experiential Learning. Innov High Educ 46, 357–375. https://doi.org/10.1007/s10755-020-09537-3
This paper is an overview of the Spartan Studios project and our results for students and faculty who ran prototype courses. It outlines the GORP model as well as the benefits and challenges of this approach to teaching and course planning.
Heinrich, W. F., Lauren, B., & Logan, S. (2020). Interdisciplinary teaching, learning and power in an experiential classroom. Submitted to Experiential Learning & Teaching in Higher Education.
This paper [under review] describes the first iteration of what became the Studios pattern at MSU and introduces the GORP framework.
Research from the James Madison University X-Labs, our colleagues in Virginia working in a similar course model
McCarthy, S., Barnes, A., Briggs, F., Giovanetti, K., Ludwig, P., Robinson, K., & Swayne, N. (Fall 2016). Undergraduate Social Entrepreneurship Education and Communication Design. SIGDOC 2015 Conference Proceedings. https://doi.org/10.1145/2987592.2987625
This report describes some communication strategies within the X-Labs’ drones course, how students documented and presented their works and how faculty plan to iterate the course.
Ludwig, P. M., Lewis, E. J., Nagel, J. K. (2017). Student learning outcomes from a pilot medical innovations course with nursing, engineering and biology undergraduate students. International Journal of STEM Education, 4(33) https://doi.org/10.1186/s40594-017-0095-y
Describes an X-Labs multidisciplinary course on medical innovations and its assessment using qualitative content analysis about students’ attitudes and perceptions of different occupations.
McCarthy, S., Barnes, A., Holland, S. K., Lewis, E., Ludwig, P., & Swayne, N. (2018). Making It: Institutionalizing Collaborative Innovation in Public Higher Education. Proceedings of the 4th International Conference on Higher Education Advances (HEAd’18), 1549–1557. http://dx.doi.org/10.4995/HEAD18.2018.8560
A descriptive case study of the academic maker space in the JMU X-Labs, both describing specific courses and how X-Labs is administered. Offers this model as applicable elsewhere in higher ed.
Kishbaugh, A. (2018). An Exploratory Case Study of Cross-Disciplinary Project-Based (i.e. Maker) Curricula as a Catalyst for Entrepreneurship. International Symposium on Academic Makerspaces. https://jmuxlabs.org/app/uploads/2018/10/ISAM_2018_akish_v6.pdf
Describes cross-disciplinary courses as promoting entrepreneurship and innovation, by looking at startups coming from these courses. Offers a framework based on multidisciplinary problem-solving, Design Thinking approaches, and a lean startup methodology.
Selznick, B. S., Mayhew, M. J., & Swayne, N. (2018, November 20). Stop Blaming Innovation. (Correspondence from Chronicle readers). The Chronicle of Higher Education. https://www.chronicle.com/blogs/letters/stop-blaming-innovation/
A rebuttal to an argument that higher ed’s emphasis on innovation is misguided. Argues that innovation has positive student outcomes, is different from entrepreneurship, and that their interventions are effective.
Swayne, N., McCarthy, S., Selznick, B. S., & Fisher, K. A. (2019). Breaking up I/E: Consciously Uncoupling Innovation and Entrepreneurship to Improve Undergraduate Learning. Innovation and Entrepreneurship Theory and Practice. https://doi.org/10.24251/HICSS.2019.651
Describes the X-Labs as evidence for uncoupling entrepreneurship and innovation, and argues that the two are conceptually separate; teaching innovation needs to precede teaching entrepreneurship.
Lewis, E. J., Ludwig, P. M., Nagel, J., & Ames, A. (2019). Student ethical reasoning confidence pre/post an innovative makerspace course: A survey of ethical reasoning. Nurse Education Today, 75, 75-79. https://doi.org/10.1016/j.nedt.2019.01.011
Describes gains to ethical reasoning after the Medical Innovations X-Labs course.
El-Tawab, S., Sprague, N. & Stewart, M. (2020). Teaching Innovation in Higher Education: A Multidisciplinary Class. In D. Schmidt-Crawford (Ed.), Proceedings of Society for Information Technology & Teacher Education International Conference (pp. 8-13). Association for the Advancement of Computing in Education (AACE). https://www.learntechlib.org/primary/p/215725/.
Describes a case of the X-Labs autonomous vehicles course, its support of students’ technical and soft skills, and its reproducibility.
McMurtrie, B. (2019) No Textbooks, No Lectures, and No Right Answers. Is This What Higher Education Needs? Chronicle of Higher Education 10 Feb. https://www.chronicle.com/article/no-textbooks-no-lectures-and-no-right-answers-is-this-what-higher-education-needs/
Chronicle of Higher Education story about the JMU X-Labs course model.
Interdisciplinarity
Harden, R. M. (2000) The integration ladder: A tool for curriculum planning and evaluation. Medical Education, 34(7), 551–557. https://doi.org/10.1046/j.1365-2923.2000.00697.x
Offers a framework for thinking about different disciplinary connections, from disciplines being isolated/siloed from each other through transdisciplinarity.
Carmichael, T. & LaPierre, Y. (2014). Interdisciplinary Learning Works: The Results of a Comprehensive Assessment of Students and Student Learning Outcomes in an Integrative Learning Community. Issues in Interdisciplinary Studies, 32(3), 53–78. http://hdl.handle.net/10323/6647
Evidence-based assessment of student learning outcomes and academic growth resulting from participation in a first-year integrative learning community. The authors outline the interdisciplinary learning goals and processes of the program and show that students who participated consistently outperformed students outside the program on both short-term and long-term learning and academic growth benchmarks.
Ivanitskaya, L., Clark, D., Montgomery, G., & Primeau, R. (2002). Interdisciplinary Learning: Process and Outcomes. Innovative Higher Education, 27, 95–111. https://doi.org/10.1023/A:1021105309984
A review of expected benefits, learning outcomes, and processes (and potential roadblocks) of interdisciplinary education. Review applied to an interdisciplinary discussion based course. The authors claim that interdisciplinary learning can significantly contribute to intellectual maturity and cognitive development of students, and provide a framework of milestones that students may hit in the process of cognitive development through interdisciplinary ed.
Kezar, A. & Elrod, S. (2012). Facilitating Interdisciplinary Learning: Lessons from Project Kaleidoscope. Change: The Magazine of Higher Learning, 44(1), 16–25, https://doi.org/10.1080/00091383.2012.635999
This magazine article argues for the benefits of interdisciplinary education for both students and institutions, and provides ways to encourage interdisciplinary education on a systemic level. The authors give key strategies and tips for facilitating interdisciplinary learning and creating student experiences. The barriers to interdisciplinary learning/education are recognized (specifically institutional) and potential solutions are given as well.
Stentoft D. (2017) From saying to doing interdisciplinary learning: Is problem-based learning the answer? Active Learning in Higher Education, 18(1). 51–61. https://doi.org/10.1177/1469787417693510
The author argues that PBL is an effective strategy to facilitate interdisciplinary learning and vice versa, acknowledges three barriers to effective interdisciplinary education (curriculum organization, student competencies to navigate interdisciplinary problems, and instructor competency), and proposes ways to address them.
Imafuku, R., Kataoka, R., Mayahara, M., Suzuki, H., & Saiki, T. (2014). Students’ Experiences in Interdisciplinary Problem-based Learning: A Discourse Analysis of Group Interaction. Interdisciplinary Journal of Problem-Based Learning, 8(2). https://doi.org/10.7771/1541-5015.1388
Kruck, S. E. and Teer, Faye P. (2009). Interdisciplinary Student Teams Projects: A Case Study. Journal of Information Systems Education, 20(3), 325–330. https://aisel.aisnet.org/jise/vol20/iss3/7
Problem-Based Learning/Project-Based Learning
Ertmer, P. A., & Simons, K. D. (2006). Jumping the PBL Implementation Hurdle: Supporting the Efforts of K–12 Teachers. Interdisciplinary Journal of Problem-Based Learning, 1(1). https://doi.org/10.7771/1541-5015.1005
While focused on problem-based learning at the K-12 level, this paper covers topics relevant to higher education instruction, including implementation challenges, creating collaborative classroom culture, teachers adjusting to changing roles, scaffolding student learning, initiating student inquiry, maintaining student engagement, aiding conceptual integration, and promoting reflective thinking.
Fukuzawa, S., Boyd, C., & Cahn, J. (2017). Student motivation in response to problem-based learning. Collected Essays on Learning and Teaching, 10, 175-188. https://doi.org/10.22329/celt.v10i0.4748
Study of student perceptions of problem-based learning in an anthropology course found that students with more subject matter experience didn’t necessarily have greater intrinsic motivation about the course. Also includes strategies for transitioning students to PBL when they are used to traditional lectures.
Guo, P., Saab, N., Post, L. S., & Admiraal, W. (2020). A review of project-based learning in higher education: Student outcomes and measures. International Journal of Educational Research, 102, 101586. https://doi.org/10.1016/j.ijer.2020.101586
A review of the literature on project-based learning covering 76 papers. Topics include cognitive outcomes of PjBL (knowledge and cognitive strategies), affective outcomes (perceptions of the benefits and of the experience of PjBL), and behavioral outcomes (skills and engagement).
Lee, J. S., Blackwell, S., Drake, J., & Moran, K. A. (2014). Taking a leap of faith: redefining teaching and learning in higher education through project-based learning. Interdisciplinary Journal of Problem-Based Learning, 8(2). https://doi.org/10.7771/1541-5015.1426
Study of instructors who implemented PjBL, focused on their challenges and successes with community partnerships, student engagement, and assessment.
Moro, C., & McLean, M. (2017). Supporting students’ transition to university and problem-based learning. Medical Science Educator, 27(2), 353-361. https://doi.org/10.1007/s40670-017-0384-6
Offers 15 strategies for scaffolding learning and supporting students in PBL programs, including using a phased approach to PBL, getting student feedback in the first few weeks of the program, and developing learners’ reflective skills before self-assessment.
Pepper C. (2010). ‘There’s a lot of learning going on but NOT much teaching!’: Student perceptions of problem‐based learning in science. Higher Education Research & Development, 29(6), 693-707. https://doi.org/10.1080/07294360.2010.501073
Overview of student responses to problem-based learning at an Australian university. Develops a continuum of how students react to PBL, ranging from missing the point, working in groups, splitting the workload, completing the task, and assessing the task, to learning new information, sharing ideas, and being self-directed learners.
Perrault, E. K., & Albert, C. A. (2018). Utilizing project-based learning to increase sustainability attitudes among students. Applied Environmental Education & Communication, 17(2), 96-105. https://doi.org/10.1080/1533015X.2017.1366882
While PjBL is often concerned with knowledge gain, this study suggests it can also shift student attitudes toward the topic. Here, students designed a communications campaign for an office of sustainability and were found to hold more favorable views of sustainability by the end of the course.
Boston University Center for Teaching & Learning. (n.d.). Project-based learning: teaching guide. http://www.bu.edu/ctl/guides/project-based-learning/
Brief overview of what project-based learning is and four key steps to implementing it (defining the problem, generating ideas, prototyping solutions, and testing).
Strobel, J., & van Barneveld, A. (2009). When is PBL more effective? A meta-synthesis of meta-analyses comparing PBL to conventional classrooms. Interdisciplinary Journal of Problem-Based Learning, 3(1). https://doi.org/10.7771/1541-5015.1046
Combines the results of many meta-analyses of PBL over the last few decades to compare PBL with traditional classroom learning. Finds that PBL results in more satisfaction among students and faculty, better long-term retention of knowledge (traditional instruction was better for short-term retention), and better skill development.
Vogler, J. S., Thompson, P., Davis, D. W., Mayfield, B. E., Finley, P. M., & Yasseri, D. (2018). The hard work of soft skills: augmenting the project-based learning experience with interdisciplinary teamwork. Instructional Science, 46(3), 457-488. https://doi.org/10.1007/s11251-017-9438-9
Two-year study of an interdisciplinary problem-based learning task and student outcomes, using student feedback during each year to understand how students were feeling about the course. The instructors learned that students found expectations inconsistent and unclear and hence experienced anxiety about grades; they took this to mean they needed to better articulate the learning outcomes and the end-of-course goal. They also learned that students often do not know how to collaborate across disciplines, and decided to add scaffolding to the course.
Learning Objectives and Bloom’s Taxonomy
Armstrong, P. (2010). Bloom’s taxonomy. Vanderbilt University Center for Teaching. https://cft.vanderbilt.edu/guides-sub-pages/blooms-taxonomy/
Overview of the original 6 levels of Bloom’s Taxonomy and the 6 levels of the Revised Taxonomy: remember, understand, apply, analyze, evaluate, and create. Includes the four types of knowledge: factual, conceptual, procedural, and metacognitive.
Carnegie Mellon University Eberly Center. (n.d.). Design & Teach a Course. https://www.cmu.edu/teaching/designteach/design/learningobjectives.html
Strategies and tips for articulating and writing learning objectives, including that learning objectives should be student-centered, break down the task to focus on specific cognitive processes, use action verbs, and be measurable.
Ferguson, C. (2002). Using the revised taxonomy to plan and deliver team-taught, integrated, thematic units. Theory Into Practice, 41(4), 238-243. https://doi.org/10.1207/s15430421tip4104_6
Example of an interdisciplinary high school course (English & social studies) where the two instructors used a taxonomy table to map their learning objectives onto the 6 levels of the Revised Taxonomy and 4 types of knowledge. Such a table may be useful for thinking about the learning objectives in your course.
Kidwell, L. A., Fisher, D. G., Braun, R. L., & Swanson, D. L. (2013). Developing learning objectives for accounting ethics using Bloom's taxonomy. Accounting Education, 22(1), 44-65. https://doi.org/10.1080/09639284.2012.698478
An example of using Bloom’s Taxonomy in accounting ethics to create learning objectives. For each larger course theme, the authors list examples of how learning objectives could be created at each level of the Taxonomy.
Mayer, R. E. (2002). Rote versus meaningful learning. Theory Into Practice, 41(4), 226-232. https://doi.org/10.1207/s15430421tip4104_4
Includes 19 processes/action verbs, how they map to the 6 levels of the Revised Taxonomy, and simple examples of what a task for students might look like. Examples of included verbs are “compare,” “implement,” “organize,” “critique,” and “generate.”
Tyran, C. K. (2010). Designing the spreadsheet-based decision support systems course: an application of Bloom's taxonomy. Journal of Business Research, 63(2), 207-216. https://doi.org/10.1016/j.jbusres.2009.03.009
An example of using Bloom’s Taxonomy to map course activities to ensure students have the prerequisite knowledge to complete the assignments.
Reflection; Reflection as Assessment
Ash, S. L., & Clayton, P. H. (2009). Learning through critical reflection: A tutorial for service-learning students. Ash, Clayton & Moses.
Introduces characteristics of critical reflection and the DEAL model.
Eyler, J., Giles, D. E., & Schmeide, A. (1996). A practitioner's guide to reflection in service-learning: Student voices & reflections. Vanderbilt University.
Argues that successful reflection is continuous, challenging, connected, and contextualized.
Earl, L. M. (2012). Assessment as learning: Using classroom assessment to maximize student learning (2nd edition). Corwin Press.
See especially chapter 10, “Using Assessment for Reflection and Self-Regulation.”
Ash, S. L., Clayton, P. H., & Atkinson, M. P. (2005). Integrating reflection and assessment to capture and improve student learning. Michigan Journal of Community Service Learning, 11(2), 49-60. http://hdl.handle.net/2027/spo.3239521.0011.204
Sees coupled reflection and assessment as mutually informing and reinforcing for students in service learning. Describes tools to guide reflective writing processes. Focuses both on individual student learning and on reflection as part of program-wide approaches.
Assessment of Experiential Education & Interdisciplinary Learning
Conrad, D., & Hedin, D. (1981). National assessment of experiential education: Summary and implications. Journal of Experiential Education, 4(2), 6–20. https://doi.org/10.1177/105382598100400202
A summary of the research from the Evaluation of Experiential Learning project, which sought to (1) assess the impact of experiential learning on secondary school students and (2) use that data to identify the elements of the EE programs that contributed most to such student development.
Field, M., Lee, R., & Field, M. L. (1994). Assessing interdisciplinary learning. New Directions for Teaching and Learning, 1994(58), 69–84. https://doi.org/10.1002/tl.37219945806
In-depth discussion of assessment techniques for interdisciplinary study in higher education.
Heinrich, W. F., Habron, G. B., Johnson, H. L., & Goralnik, L. (2015). Critical thinking assessment across four sustainability-related experiential learning settings. Journal of Experiential Education, 38(4), 373–393. https://doi.org/10.1177/1053825915592890
Discusses the implications of assessing critical thinking, coupled with engaged citizenship, within experiential education courses.
Mansilla, V. B., & Duraising, E. D. (2007). Targeted assessment of students’ interdisciplinary work: An empirically grounded framework proposed. The Journal of Higher Education, 78(2), 215-237. https://doi.org/10.1080/00221546.2007.11780874
Introduction of a framework for targeted assessment of interdisciplinary student work. Also a good review of relevant literature of assessment and interdisciplinary learning in higher education.
Yates, T., Wilson, J., & Purton, K. (2015). Surveying assessment in experiential learning: A single campus study. The Canadian Journal for the Scholarship of Teaching and Learning, 6(3). https://doi.org/10.5206/cjsotl-rcacea.2015.3.4
Exploration of experiential assessment at a Canadian university, intended to identify common methods and facilitate the development of best assessment practices for higher education, specifically experiential higher education.
You, H. S., Marshall, J. A., & Delgado, C. (2019). Toward interdisciplinary learning: Development and validation of an assessment for interdisciplinary understanding of global carbon cycling. Research in Science Education. https://doi.org/10.1007/s11165-019-9836-x
Development and validation of an assessment which measured the understanding of the carbon cycle for high school and undergraduate students.
Building and Managing Student Teams & Team Dynamics
Burke, A. (2011). Group Work: How to Use Groups Effectively. Journal of Effective Teaching, 11(2), 87-95. https://uncw.edu/jet/articles/vol11_2/burke.pdf
Cano, J. L., Lidon, I., Rebollar, R., Roman, P., & Saenz, M. J. (2006). Student groups solving real-life projects. A case study of experiential learning. International Journal of Engineering Education, 22(6), 1252-1260. https://www.ijee.ie/articles/Vol22-6/16_IJEE1811.pdf
Fearon, C., McLaughlin, H., & Yoke Eng, T. (2012). Using student group work in higher education to emulate professional communities of practice. Education + Training, 54(2/3), 114–125. https://doi.org/10.1108/00400911211210233
Fellenz, M. R. (2006). Toward fairness in assessing student groupwork: A protocol for peer evaluation of individual contributions. Journal of Management Education, 30(4), 570–591. https://doi.org/10.1177/1052562906286713
Furman, R., Bender, K., & Rowan, D. (2014). An experiential approach to group work. Oxford University Press.
Smith, G. G., Sorensen, C., Gump, A., Heindel, A. J., Caris, M., & Martinez, C. D. (2011). Overcoming student resistance to group work: Online versus face-to-face. The Internet and Higher Education, 14(2), 121–128. https://doi.org/10.1016/j.iheduc.2010.09.005
Hassanien, A. (2006). Student Experience of Group Work and Group Assessment in Higher Education. Journal of Teaching in Travel & Tourism, 6(1), 17–39. https://doi.org/10.1300/j172v06n01_02
Kayes, A. B., Kayes, D. C., & Kolb, D. A. (2005). Experiential learning in teams. Simulation & Gaming, 36(3), 330–354. https://doi.org/10.1177/1046878105279012
Napier, N. P., & Johnson, R. D. (2007). Technical Projects: Understanding Teamwork Satisfaction In an Introductory IS Course. Journal of Information Systems Education, 18(1), 39-48. http://www.jise.org/volume18/n1/JISEv18n1p39.html
Winsett, C., Foster, C., Dearing, J., & Burch, G. (2016). The impact of group experiential learning on student engagement. Academy of Business Research Journal, 3, 7-17.
Online Experiential Education and Innovative Online Teaching & Course Structures
Bolan, C. M. (2003). Incorporating the experiential learning theory into the instructional design of online courses. Nurse Educator, 28(1), 10–14. https://doi.org/10.1097/00006223-200301000-00006
Provides insights on how to implement an experiential learning framework into an already developed online course.
Christian, D. D., McCarty, D. L., & Brown, C. L. (2020). Experiential education during the COVID-19 pandemic: A reflective process. Journal of Constructivist Psychology, 1–14. https://doi.org/10.1080/10720537.2020.1813666
Provides insight into how experiential learning can occur in an online format, acknowledging the new normal created by the COVID-19 pandemic. Includes case studies.
Sharoff, L. (2019). Creative and innovative online teaching strategies: Facilitation for active participation. The Journal of Educators Online, 16. https://doi.org/10.9743/jeo.2019.16.2.9
Piece on how to keep students thoughtfully engaged with online courses.
Diversity, Equity, and Inclusion
Bricklemyer, J. (2019, April 29). DEI online course supplemental checklist. https://codl.ku.edu/sites/codl.ku.edu/files/docs/DEI%20Online%20Course%20Supplemental%20Checklist%2029Apr19.pdf
A set of five principles for designing an inclusive course, geared specifically toward online courses. Also includes links to more in-depth resources.
Canning, E. A., Muenks, K., Green, D. J., & Murphy, M. C. (2019). STEM faculty who believe ability is fixed have larger racial achievement gaps and inspire less student motivation in their classes. Science Advances, 5(2). https://doi.org/10.1126/sciadv.aau4734
Students in classes where the instructor believed that student potential was fixed earned lower grades than in courses where the instructor believed student potential changed over time. In addition, the difference in grades between students from underrepresented racial groups and white/Asian students was larger in the classes with instructors who thought mindset was fixed.
CAST (2018). Universal Design for Learning Guidelines version 2.2. http://udlguidelines.cast.org
A set of broad guidelines for ensuring that all learners can engage in learning, regardless of culture, language, or disability status. Each guideline includes practical examples of how it could be implemented in a course and the research supporting the guideline.
Dewsbury, B., & Brame, C. J. (2019). Inclusive teaching. CBE—Life Sciences Education, 18(2). https://doi.org/10.1187/cbe.19-01-0021
Guide that covers why instructors need to develop self-awareness and empathy for students and consider classroom climate before pedagogical choices for inclusivity. Also includes an interactive webpage about inclusive teaching with literature citations and a checklist for instructors.
MyPronouns.org Resources on Personal Pronouns. (n.d.). https://www.mypronouns.org/
A guide about personal pronouns and best practices for using them: include your pronouns when introducing yourself, avoid using “preferred” in front of pronouns, and use “go by” instead of “uses” when introducing pronouns. E.g., “My name is Sparty and I go by him/his pronouns.”
University of Michigan Center for Research on Learning and Teaching. (n.d.). Inclusive Strategies Reflection. https://docs.google.com/document/d/1UK3HFQv-3qMDNjvt0fFPbts38ApOL7ghpPE0iSYJ1Z8/edit?usp=sharing
A self-reflection tool for instructors about their teaching practices, measured along five dimensions: critical engagement of difference, academic belonging, transparency, structured interactions, and flexibility. Each dimension includes ideas for instructors to add to their own courses.
Poorvu Center for Teaching and Learning. (n.d.). Inclusive Teaching Strategies. https://poorvucenter.yale.edu/InclusiveTeachingStrategies
Includes 9 recommendations for creating a more inclusive classroom, including incorporating diversity into the curriculum, examining implicit biases, adding a diversity statement to the syllabus, and soliciting student feedback.
Guide for Inclusive Teaching at Columbia. https://ctl.columbia.edu/resources-and-technology/resources/inclusive-teaching-guide/
Research from the Spartan Studios project
Heinrich, W. F., Louson, E., Blommel, C., & Green, A. R. (2021). Who Coaches the Coaches? The Development of a Coaching Model for Experiential Learning. Innovative Higher Education, 46, 357–375. https://doi.org/10.1007/s10755-020-09537-3
This paper is an overview of the Spartan Studios project and our results for students and faculty who ran prototype courses. It outlines the GORP model as well as the benefits and challenges of this approach to teaching and course planning.
Heinrich, W. F., Lauren, B., & Logan, S. (2020). Interdisciplinary teaching, learning and power in an experiential classroom. Submitted to Experiential Learning & Teaching in Higher Education.
This paper [under review] describes the first iteration of what became the Studios pattern at MSU and introduces the GORP framework.
Research from the James Madison University X-Labs, our colleagues in Virginia working in a similar course model
McCarthy, S., Barnes, A., Briggs, F., Giovanetti, K., Ludwig, P., Robinson, K., & Swayne, N. (2016). Undergraduate Social Entrepreneurship Education and Communication Design. SIGDOC 2016 Conference Proceedings. https://doi.org/10.1145/2987592.2987625
This report describes some communication strategies within the X-Labs’ drones course, how students documented and presented their work, and how faculty plan to iterate the course.
Ludwig, P. M., Lewis, E. J., & Nagel, J. K. (2017). Student learning outcomes from a pilot medical innovations course with nursing, engineering and biology undergraduate students. International Journal of STEM Education, 4(33). https://doi.org/10.1186/s40594-017-0095-y
Describes an X-Labs multidisciplinary course on medical innovations and its assessment using qualitative content analysis about students’ attitudes and perceptions of different occupations.
McCarthy, S., Barnes, A., Holland, S. K., Lewis, E., Ludwig, P., & Swayne, N. (2018). Making It: Institutionalizing Collaborative Innovation in Public Higher Education. Proceedings of the 4th International Conference on Higher Education Advances (HEAd’18), 1549–1557. http://dx.doi.org/10.4995/HEAD18.2018.8560
A descriptive case study of the academic maker space in the JMU X-Labs, both describing specific courses and how X-Labs is administered. Offers this model as applicable elsewhere in higher ed.
Kishbaugh, A. (2018). An Exploratory Case Study of Cross-Disciplinary Project-Based (i.e. Maker) Curricula as a Catalyst for Entrepreneurship. International Symposium on Academic Makerspaces. https://jmuxlabs.org/app/uploads/2018/10/ISAM_2018_akish_v6.pdf
Describes cross-disciplinary courses as promoting entrepreneurship and innovation, by looking at startups coming from these courses. Offers a framework based on multidisciplinary problem-solving, Design Thinking approaches, and a lean startup methodology.
Selznick, B. S., Mayhew, M. J., & Swayne, N. (2018, November 20). Stop Blaming Innovation. (Correspondence from Chronicle readers). The Chronicle of Higher Education. https://www.chronicle.com/blogs/letters/stop-blaming-innovation/
A rebuttal to an argument that higher ed’s emphasis on innovation is misguided. Argues that innovation has positive student outcomes, is different from entrepreneurship, and that their interventions are effective.
Swayne, N., McCarthy, S., Selznick, B. S., & Fisher, K. A. (2019). Breaking up I/E: Consciously Uncoupling Innovation and Entrepreneurship to Improve Undergraduate Learning. Innovation and Entrepreneurship Theory and Practice. https://doi.org/10.24251/HICSS.2019.651
Describes the X-Labs as evidence for uncoupling entrepreneurship and innovation, and argues that they are conceptually separate; teaching innovation needs to precede teaching entrepreneurship.
Lewis, E. J., Ludwig, P. M., Nagel, J., & Ames, A. (2019). Student ethical reasoning confidence pre/post an innovative makerspace course: A survey of ethical reasoning. Nurse Education Today, 75, 75-79. https://doi.org/10.1016/j.nedt.2019.01.011
Describes gains to ethical reasoning after the Medical Innovations X-Labs course.
El-Tawab, S., Sprague, N. & Stewart, M. (2020). Teaching Innovation in Higher Education: A Multidisciplinary Class. In D. Schmidt-Crawford (Ed.), Proceedings of Society for Information Technology & Teacher Education International Conference (pp. 8-13). Association for the Advancement of Computing in Education (AACE). https://www.learntechlib.org/primary/p/215725/.
Describes a case of the X-Labs autonomous vehicles course, its support of students’ technical and soft skills, and its reproducibility.
McMurtrie, B. (2019, February 10). No Textbooks, No Lectures, and No Right Answers. Is This What Higher Education Needs? The Chronicle of Higher Education. https://www.chronicle.com/article/no-textbooks-no-lectures-and-no-right-answers-is-this-what-higher-education-needs/
Chronicle of Higher Education story about the JMU X-Labs course model.
Interdisciplinarity
Harden, R. M. (2000) The integration ladder: A tool for curriculum planning and evaluation. Medical Education, 34(7), 551–557. https://doi.org/10.1046/j.1365-2923.2000.00697.x
Offers a framework for thinking about different disciplinary connections, from disciplines being isolated/siloed from each other through transdisciplinarity.
Carmichael, T., & LaPierre, Y. (2014). Interdisciplinary Learning Works: The Results of a Comprehensive Assessment of Students and Student Learning Outcomes in an Integrative Learning Community. Issues in Interdisciplinary Studies, 32(3), 53–78. http://hdl.handle.net/10323/6647
Evidence-based assessment of student learning outcomes and academic growth metrics resulting from participation in a first-year integrative learning community. The authors outline the interdisciplinary learning goals and processes of the program and show that students who participated in the program consistently outperformed students outside the program on both short-term and long-term learning and academic growth benchmarks.
Ivanitskaya, L., Clark, D., Montgomery, G., & Primeau, R. (2002). Interdisciplinary Learning: Process and Outcomes. Innovative Higher Education, 27, 95–111. https://doi.org/10.1023/A:1021105309984
A review of the expected benefits, learning outcomes, and processes (and potential roadblocks) of interdisciplinary education, applied to an interdisciplinary discussion-based course. The authors claim that interdisciplinary learning can significantly contribute to students' intellectual maturity and cognitive development, and provide a framework of milestones that students may hit in that development process.
Kezar, A. & Elrod, S. (2012). Facilitating Interdisciplinary Learning: Lessons from Project Kaleidoscope. Change: The Magazine of Higher Learning, 44(1), 16–25, https://doi.org/10.1080/00091383.2012.635999
This magazine article argues for the benefits of interdisciplinary education for both students and institutions, and provides ways to encourage interdisciplinary education on a systemic level. The authors give key strategies and tips for facilitating interdisciplinary learning and creating student experiences. The barriers to interdisciplinary learning/education are recognized (specifically institutional) and potential solutions are given as well.
Stentoft, D. (2017). From saying to doing interdisciplinary learning: Is problem-based learning the answer? Active Learning in Higher Education, 18(1), 51–61. https://doi.org/10.1177/1469787417693510
Argues that PBL is an effective strategy to facilitate interdisciplinary learning and vice versa. The author also acknowledges three barriers to effective interdisciplinary education (curriculum organization, student competencies for navigating interdisciplinary problems, and instructor competency) and proposes ways to address them.
Imafuku, R., Kataoka, R., Mayahara, M., Suzuki, H., & Saiki, T. (2014). Students’ Experiences in Interdisciplinary Problem-based Learning: A Discourse Analysis of Group Interaction. Interdisciplinary Journal of Problem-Based Learning, 8(2). https://doi.org/10.7771/1541-5015.1388
Kruck, S. E., & Teer, F. P. (2009). Interdisciplinary Student Teams Projects: A Case Study. Journal of Information Systems Education, 20(3), 325–330. https://aisel.aisnet.org/jise/vol20/iss3/7
Problem-Based Learning/Project-Based Learning
Ertmer, P. A., & Simons, K. D. (2006). Jumping the PBL Implementation Hurdle: Supporting the Efforts of K–12 Teachers. Interdisciplinary Journal of Problem-Based Learning, 1(1). https://doi.org/10.7771/1541-5015.1005
While focused on problem-based learning at the K-12 level, this paper covers topics relevant to higher education instruction, including implementation challenges, creating a collaborative classroom culture, teachers adjusting to changing roles, scaffolding student learning, initiating student inquiry, maintaining student engagement, aiding conceptual integration, and promoting reflective thinking.
Fukuzawa, S., Boyd, C., & Cahn, J. (2017). Student motivation in response to problem-based learning. Collected Essays on Learning and Teaching, 10, 175-188. https://doi.org/10.22329/celt.v10i0.4748
Study of student perceptions of problem-based learning in an anthropology course found that students with more subject matter experience didn’t necessarily have greater intrinsic motivation about the course. Also includes strategies for transitioning students to PBL when they are used to traditional lectures.
Guo, P., Saab, N., Post, L. S., & Admiraal, W. (2020). A review of project-based learning in higher education: Student outcomes and measures. International Journal of Educational Research, 102, 101586. https://doi.org/10.1016/j.ijer.2020.101586
A review of the literature on project-based learning covering 76 papers. Topics include cognitive outcomes of PjBL (knowledge and cognitive strategies), affective outcomes (perceptions of the benefits and of the experience of PjBL), and behavioral outcomes (skills and engagement).
Lee, J. S., Blackwell, S., Drake, J., & Moran, K. A. (2014). Taking a leap of faith: redefining teaching and learning in higher education through project-based learning. Interdisciplinary Journal of Problem-Based Learning, 8(2). https://doi.org/10.7771/1541-5015.1426
Study of instructors who implemented PjBL, focused on their challenges and successes with community partnerships, student engagement, and assessment.
Moro, C., & McLean, M. (2017). Supporting students’ transition to university and problem-based learning. Medical Science Educator, 27(2), 353-361. https://doi.org/10.1007/s40670-017-0384-6
15 strategies for scaffolding learning and supporting students in PBL programs, including using a phased approach to PBL, getting student feedback in the first few weeks of the program, and developing learners’ reflective skills before self-assessment.
Pepper C. (2010). ‘There’s a lot of learning going on but NOT much teaching!’: Student perceptions of problem‐based learning in science. Higher Education Research & Development, 29(6), 693-707. https://doi.org/10.1080/07294360.2010.501073
Overview of student responses to problem-based learning at an Australian university. Develops a continuum of how students react to problem-based learning: missing the point, working in groups, splitting the workload, completing the task, assessing the task, learning new information, sharing ideas, and becoming self-directed learners.
Perrault, E. K., & Albert, C. A. (2018). Utilizing project-based learning to increase sustainability attitudes among students. Applied Environmental Education & Communication, 17(2), 96-105. https://doi.org/10.1080/1533015X.2017.1366882
While PjBL is often concerned with knowledge gains, this study suggests it can also shift student attitudes about a topic. In this study, students designed a communications campaign for an office of sustainability and were found to have more favorable views of sustainability by the end of the course.
Boston University Center for Teaching & Learning. (n.d.). Project-based learning: teaching guide. http://www.bu.edu/ctl/guides/project-based-learning/
Brief overview of what project based learning is and four key steps to implementing it (defining the problem, generating ideas, prototyping solutions, and testing)
Strobel, J., & van Barneveld, A. (2009). When is PBL more effective? A meta-synthesis of meta-analyses comparing PBL to conventional classrooms. Interdisciplinary Journal of Problem-Based Learning, 3(1). https://doi.org/10.7771/1541-5015.1046
Combines the results of many meta-analyses around PBL over the last few decades to compare PBL to traditional classroom learning. The study finds that PBL results in more satisfaction among students and faculty, leads to better long term retention of knowledge (traditional was better for short-term), and better skill development
Vogler, J. S., Thompson, P., Davis, D. W., Mayfield, B. E., Finley, P. M., & Yasseri, D. (2018). The hard work of soft skills: augmenting the project-based learning experience with interdisciplinary teamwork. Instructional Science, 46(3), 457-488. https://doi.org/10.1007/s11251-017-9438-9
Two-year study of an interdisciplinary problem based learning task and student outcomes. Study used student feedback during each year to understand how students were feeling about the course. The instructors learned that students felt the instructors had inconsistent and unclear expectations and hence, experienced anxiety about grades. The instructors took this to mean that they needed to do a better job of articulating the learning outcomes and end of course goal. The instructors also learned that students often do not know how to collaborate interdisciplinary and decided to add scaffolding to the course
Learning Objectives and Bloom’s Taxonomy
Armstrong, P. (2010). Bloom’s taxonomy. Vanderbilt University Center for Teaching. https://cft.vanderbilt.edu/guides-sub-pages/blooms-taxonomy/
Overview of the original 6 levels of Bloom’s Taxonomy and the 6 levels of the Revised Taxonomy: remember, understand, apply, analyze, evaluate, and create. Includes the four types of knowledge: factual, conceptual, procedural, and metacognitive.
Carnegie Mellon University Eberly Center. (n.d.). Design & Teach a Course. https://www.cmu.edu/teaching/designteach/design/learningobjectives.html
Strategies and tips for articulating and writing learning objectives including that learning objectives should be student-centered, break down the task and focus on specific cognitive processes, use action verbs, and be measurable.
Ferguson, C. (2002). Using the revised taxonomy to plan and deliver team-taught, integrated, thematic units. Theory Into Practice, 41(4), 238-243. https://doi.org/10.1207/s15430421tip4104_6
Example of an interdisciplinary high school course (English & social studies) where the two instructors used a taxonomy table to map their learning objectives onto the 6 levels of the Revised Taxonomy and 4 types of knowledge. Such a table may be useful for thinking about the learning objectives in your course
Kidwell, L. A., Fisher, D. G., Braun, R. L., & Swanson, D. L. (2013). Developing learning objectives for accounting ethics using Bloom's taxonomy. Accounting Education, 22(1), 44-65. https://doi.org/10.1080/09639284.2012.698478
An example of using Bloom’s Taxonomy in accounting ethics to create learning objectives. For each larger course theme, the authors list examples how learning objectives could be created from each level of the Taxonomy.
Mayer, R. E. (2002). Rote versus meaningful learning. Theory Into Practice, 41(4), 226-232. https://doi.org/10.1207/s15430421tip4104_4
Includes 19 processes/action verbs, how they map to the 6 levels of the Revised Taxonomy, and simple examples of what a task for students to do might look like. Examples of included verbs are “compare,” “implement,” “organize,” “critique,” and “generate”
Tyran, C. K. (2010). Designing the spreadsheet-based decision support systems course: an application of Bloom's taxonomy. Journal of Business Research, 63(2), 207-216. https://doi.org/10.1016/j.jbusres.2009.03.009
An example of using Bloom’s taxonomy to map course activities to ensure students have the prerequisite knowledge to complete the assignments
Reflection; Reflection as Assessment
Ash, S. L., & Clayton, P. H. (2009). Learning through critical reflection: A tutorial for service-learning students. Ash, Clayton & Moses.
Introduces characteristics of critical reflection and the DEAL model.
Eyler, J., Eyler, J., Giles, D. E., & Schmeide, A. (1996). A practitioner's guide to reflection in service-learning: Student voices & reflections. Vanderbilt University.
Argues that successful reflection is continuous, challenging, connected, and contextualized.
Earl, L. M. (2012). Assessment as learning: Using classroom assessment to maximize student learning (2nd edition). Corwin Press.
Especially chapter 10, Using Assessment for Reflection and Self-Regulation
Ash, S. L., Clayton, P. H., & Atkinson, M. P. (2005). Integrating reflection and assessment to capture and improve student learning. Michigan Journal of Community Service Learning, 11(2), 49-60. http://hdl.handle.net/2027/spo.3239521.0011.204
Sees coupled reflection and assessment as mutually informing and reinforcing for students in service learning. Describes tools to guide reflective writing processes. Focus on both individual student learning and reflection as part of program-wide approaches to reflection.
Assessment of Experiential Education & Interdisciplinary Learning
Conrad, D., & Hedin, D. (1981). National assessment of experiential education: Summary and implications. Journal of Experiential Education, 4(2), 6–20. https://doi.org/10.1177/105382598100400202
A summary of the research of the Evaluation of Experiential Learning project which sought to (1) assess the impact of experiential learning on secondary school students and (2) use that data to identify the elements of the EE programs that contributed the most to such student development.
Field, M., Lee, R., & Field, M. L. (1994). Assessing interdisciplinary learning. New Directions for Teaching and Learning, 1994(58), 69–84. https://doi.org/10.1002/tl.37219945806
In-depth discussion of assessment techniques for interdisciplinary study in higher education
Heinrich, W. F., Habron, G. B., Johnson, H. L., & Goralnik, L. (2015). Critical thinking assessment across four sustainability-related experiential learning settings. Journal of Experiential Education, 38(4), 373–393. https://doi.org/10.1177/1053825915592890
Implications of critical thinking coupled with engaged citizenry within experiential education courses.
Mansilla, V. B., & Duraising, E. D. (2007). Target assessment of students’ interdisciplinary work: An empirically grounded framework proposed. The Journal of Higher Education, 78(2), 215-237. https://doi.org/10.1080/00221546.2007.11780874
Introduction of a framework for targeted assessment of interdisciplinary student work. Also a good review of relevant literature of assessment and interdisciplinary learning in higher education.
Yates, T., Wilson, J., & Purton, K. (2015). Surveying assessment in experiential learning: A single campus study. The Canadian Journal for the Scholarship of Teaching and Learning, 6(3). https://doi.org/10.5206/cjsotl-rcacea.2015.3.4
Exploration of experiential assessment within a Canadian University. Exploration intended for the use in identifying common methods and facilitating development of best assessment practices for higher education, specifically experiential higher education.
You, H. S., Marshall, J. A., & Delgado, C. (2019). Toward interdisciplinary learning: Development and validation of an assessment for interdisciplinary understanding of global carbon cycling. Research in Science Education. https://doi.org/10.1007/s11165-019-9836-x
Development and validation of an assessment which measured the understanding of the carbon cycle for high school and undergraduate students.
Building and Managing Student Teams & Team Dynamics
Burke, A. (2011). Group work: How to use groups effectively. Journal of Effective Teaching, 11(2), 87–95. https://uncw.edu/jet/articles/vol11_2/burke.pdf
Cano, J. L., Lidon, I., Rebollar, R., Roman, P., & Saenz, M. J. (2006). Student groups solving real-life projects. A case study of experiential learning. International Journal of Engineering Education, 22(6), 1252-1260. https://www.ijee.ie/articles/Vol22-6/16_IJEE1811.pdf
Fearon, C., McLaughlin, H., & Yoke Eng, T. (2012). Using student group work in higher education to emulate professional communities of practice. Education + Training, 54(2/3), 114–125. https://doi.org/10.1108/00400911211210233
Fellenz, M. R. (2006). Toward fairness in assessing student groupwork: A protocol for peer evaluation of individual contributions. Journal of Management Education, 30(4), 570–591. https://doi.org/10.1177/1052562906286713
Furman, R., Bender, K., & Rowan, D. (2014). An experiential approach to group work. Oxford University Press.
Hassanien, A. (2006). Student experience of group work and group assessment in higher education. Journal of Teaching in Travel & Tourism, 6(1), 17–39. https://doi.org/10.1300/j172v06n01_02
Smith, G. G., Sorensen, C., Gump, A., Heindel, A. J., Caris, M., & Martinez, C. D. (2011). Overcoming student resistance to group work: Online versus face-to-face. The Internet and Higher Education, 14(2), 121–128. https://doi.org/10.1016/j.iheduc.2010.09.005
Kayes, A. B., Kayes, D. C., & Kolb, D. A. (2005). Experiential learning in teams. Simulation & Gaming, 36(3), 330–354. https://doi.org/10.1177/1046878105279012
Napier, N. P., & Johnson, R. D. (2007). Technical projects: Understanding teamwork satisfaction in an introductory IS course. Journal of Information Systems Education, 18(1), 39–48. http://www.jise.org/volume18/n1/JISEv18n1p39.html
Winsett, C., Foster, C., Dearing, J., & Burch, G. (2016). The impact of group experiential learning on student engagement. Academy of Business Research Journal, 3, 7–17.
Online Experiential Education and Innovative Online Teaching & Course Structures
Bolan, C. M. (2003). Incorporating the experiential learning theory into the instructional design of online courses. Nurse Educator, 28(1), 10–14. https://doi.org/10.1097/00006223-200301000-00006
Provides insights on how to implement an experiential learning framework into an already developed online course.
Christian, D. D., McCarty, D. L., & Brown, C. L. (2020). Experiential education during the COVID-19 pandemic: A reflective process. Journal of Constructivist Psychology, 1–14. https://doi.org/10.1080/10720537.2020.1813666
Provides insight on how experiential learning can occur in an online format, acknowledging the new normal created by the COVID-19 pandemic. Includes case studies.
Sharoff, L. (2019). Creative and innovative online teaching strategies: Facilitation for active participation. The Journal of Educators Online, 16. https://doi.org/10.9743/jeo.2019.16.2.9
Piece on how to keep students thoughtfully engaged with online courses.
Diversity, Equity, and Inclusion
Bricklemyer, J. (2019, April 29). DEI online course supplemental checklist. https://codl.ku.edu/sites/codl.ku.edu/files/docs/DEI%20Online%20Course%20Supplemental%20Checklist%2029Apr19.pdf
A set of five principles for designing an inclusive course, geared specifically toward online courses. Also includes links to more in-depth resources.
Canning, E. A., Muenks, K., Green, D. J., & Murphy, M. C. (2019). STEM faculty who believe ability is fixed have larger racial achievement gaps and inspire less student motivation in their classes. Science Advances, 5(2). https://doi.org/10.1126/sciadv.aau4734
Students in classes where the instructor believed that student potential was fixed earned lower grades than in courses where the instructor believed student potential changed over time. In addition, the difference in grades between students from underrepresented racial groups and white/Asian students was larger in the classes with instructors who thought mindset was fixed.
CAST (2018). Universal Design for Learning Guidelines version 2.2. http://udlguidelines.cast.org
A set of broad guidelines for ensuring that all learners can engage in learning, regardless of culture, language, or disability status. Each guideline includes practical examples of how it could be implemented in a course and the research supporting the guideline.
Dewsbury, B., & Brame, C. J. (2019). Inclusive teaching. CBE—Life Sciences Education, 18(2). https://doi.org/10.1187/cbe.19-01-0021
Guide that covers why instructors need to develop self-awareness and empathy for students and consider classroom climate before pedagogical choices for inclusivity. Also includes an interactive webpage about inclusive teaching with literature citations and a checklist for instructors.
MyPronouns.org Resources on Personal Pronouns. (n.d.). https://www.mypronouns.org/
A guide about personal pronouns and best practices for using them: include your pronouns when introducing yourself, avoid putting “preferred” in front of pronouns, and say that someone “goes by” rather than “uses” pronouns, e.g., “My name is Sparty and I go by him/his pronouns.”
University of Michigan Center for Research on Learning and Teaching. Inclusive Strategies Reflection. https://docs.google.com/document/d/1UK3HFQv-3qMDNjvt0fFPbts38ApOL7ghpPE0iSYJ1Z8/edit?usp=sharing
A self-reflection tool for instructors about their teaching practices, measured along five dimensions: critical engagement of difference, academic belonging, transparency, structured interactions, and flexibility. Each dimension includes ideas for instructors to add to their own courses.
Poorvu Center for Teaching and Learning. (n.d.). Inclusive teaching strategies. https://poorvucenter.yale.edu/InclusiveTeachingStrategies
Includes nine recommendations for creating a more inclusive classroom, including incorporating diversity into the curriculum, examining implicit biases, adding a diversity statement to the syllabus, and soliciting student feedback.
Guide for Inclusive Teaching at Columbia. https://ctl.columbia.edu/resources-and-technology/resources/inclusive-teaching-guide/
Photo by LubosHouska from Pixabay
Authored by: Ellie Louson
Pedagogical Design
Posted on: Center for Teaching...

2025 National Day of Racial Healing (January 21)
2025's National Day of Racial Healing will occur on Tuesday, January 21. The American Association of Colleges and Universities has called on colleges and universities to "engage in activities, events, or strategies that promote healing, foster engagement around issues of racism, bias, inequity, and injustice, and build an equitable and just society where all individuals can thrive" as part of the National Day of Racial Healing. The National Day of Racial Healing is "a time to contemplate our shared values and create the blueprint together for #HowWeHeal from the effects of racism. Launched in 2017, it is an opportunity to bring [all] people together and inspire collective action to build common ground for a more just and equitable world." As part of this commitment, educators may consider engaging in events during the week, participating in the #HowWeHeal hashtag, and bringing awareness to the day with students. Some activities and events within the MSU and Lansing community include:
Read the #HowWeHeal Conversation Guide
Watch the "Changing the Narrative" series
Attend MSU's 2025 MLK Student Symposium
Attend Lansing's Beloved Community Week
Attend MSU Libraries' Social Justice Film and Discussion on: Fannie Lou Hamer's America
Posted by: Bethany Meadows
Justice and Belonging
Posted on: GenAI & Education
Massive changes have occurred recently with regard to artificial intelligence (AI) and the ability of the public to generate novel text and images using AI tools (e.g. ChatGPT). Many in education are concerned with what this means for assessing student understanding: if a student can generate a novel, accurate essay on almost any topic, how will you assess learning from short-answer and essay assignments?
On 02/01/2023, a campus collaboration of APUE STEM+ Ed@State, the Enhanced Digital Learning Initiative (EDLI), the Center for Teaching and Learning Innovation (CTLI), MSU Libraries, and MSU IT EdTech hosted the "Symposium on AI in Education and Academic Writing". During the symposium, the basics of how AI works were shared and attendees had opportunities to play with some AI tools. The event provided opportunities to hear how faculty are addressing these challenges, discuss concerns and opportunities with colleagues, and reflect on individual teaching philosophies in the time of artificial intelligence.
Posted by: Makena Neal
Posted on: GenAI & Education
AI Commons Bulletin 1/13/2025
Human-curated news about generative AI for Teaching and Learning in Higher Education.
😮 Word of the Day: “AI-giarism”
“The unethical practice of using artificial intelligence technologies, particularly generative language models, to generate content that is plagiarized either from original human-authored work or directly from AI-generated content, without appropriate acknowledgement of the original sources or AI’s contribution.” (Chan, 2024)
Learn More: https://doi.org/10.1007/s10639-024-13151-7
💚 H-Net Hosts 2025 AI Symposium: Fear, Faith, and Praxis: Artificial Intelligence in the Humanities and Social Sciences
This year’s theme, “Fear, Faith, and Praxis: Artificial Intelligence, Humanities, and Social Sciences,” focuses on student-centered approaches to the use of AI in pedagogical practice and on reassessing previous assumptions about AI. This two-day event will be held on MSU’s campus on Feb 20-21, 2025, and will be available via live stream on the H-Net Commons.
Learn More: https://networks.h-net.org/2025-ai-symposium
💬 Try This: Use AI to Make Peer Feedback More Effective
Use this prompt: ‘‘I teach a university class where students work on teams for the semester. You are my assistant, who is going to help me provide formative feedback to my students. I collect peer comments periodically throughout the semester, and I would like you to summarize the comments into a performance feedback review in a way that is constructive and actionable. Additionally, the students assess themselves and I would like you to compare their responses to the peer feedback. The output should be in the form of a letter, and please exclude anything that is inappropriate for the workplace.’’ [If there are fewer than 2 comments for a student, please provide generic feedback only.]
Learn More: https://www.ijee.ie/1atestissues/Vol40-5/02_ijee4488.pdf
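If you collect peer comments in a spreadsheet, a few lines of code can assemble the prompt and each student's comments before you paste them into a chatbot. The sketch below is a hypothetical helper (the function name and data layout are assumptions, and the prompt text is abridged; use the full prompt from the item above):

```python
# Sketch: assemble the peer-feedback prompt for one student at a time
# before pasting it into a generative AI chat tool.

# Abridged; substitute the full prompt text from the bulletin item above.
PROMPT = (
    "I teach a university class where students work on teams for the "
    "semester. You are my assistant, who is going to help me provide "
    "formative feedback to my students. Summarize the peer comments into "
    "a constructive, actionable performance feedback review, compare them "
    "to the student's self-assessment, and write the output as a letter."
)

def build_request(student: str, peer_comments: list[str], self_assessment: str) -> str:
    """Combine the reusable prompt with one student's data.

    Falls back to a generic-feedback instruction when fewer than two
    peer comments are available, per the bulletin's bracketed note.
    """
    if len(peer_comments) < 2:
        return f"{PROMPT}\n\nStudent: {student}\nPlease provide generic feedback only."
    comments = "\n".join(f"- {c}" for c in peer_comments)
    return (f"{PROMPT}\n\nStudent: {student}\n"
            f"Peer comments:\n{comments}\n"
            f"Self-assessment: {self_assessment}")

request = build_request(
    "Sparty",
    ["Great communicator.", "Sometimes late to meetings."],
    "I led our demos and kept the repo organized.",
)
```

Keeping the reusable prompt separate from each student's data makes it easy to review what you are sharing with the tool before sending it.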
🫥 AIs That Can Read Your Students’ Emotions
Google wants its AI bots to read emotions. Critics point out the dangers of misclassifying user behaviors, and recent research suggests the science of “universal emotions” is culturally biased.
Learn More: https://techcrunch.com/2024/12/05/google-says-its-new-open-models-can-identify-emotions-and-that-has-experts-worried/
Bulletin items compiled by MJ Jackson and Sarah Freye with production assistance from Lisa Batchelder. Get the AI-Commons Bulletin on our Microsoft Teams channel, at aicommons.commons.msu.edu, or by email (send an email to aicommons@msu.edu with the word “subscribe”).
Posted by: Sarah Freye
Host: MSU Libraries
Latine Voces Uncensored: Symposium on Latine Narratives
The digital age has brought us virtually (pun intended) limitless access to information. Despite this access, Latine communities continue to confront the challenges of rampant misinformation, censorship, and misrepresentation, shaping how Latines are depicted in everything from news coverage to popular culture to domestic and international policies that directly impact (and harm) our communities. Accurate Latine narratives and exposure to our histories are as important as ever. In this timely and important symposium, we address representation and free speech through the lens of Puerto Rican storytelling.
Hosted by Latino Spartans & MSU Chicanx/Latinx Association (ChiLA).
Note: The first half of this event will take place at the Multicultural Center. The Library portion will begin at 1:30 in the Green Room.
Please register by clicking here
Navigating Context