
Posted on: GenAI & Education
Posted by 9 months ago
AI Commons Bulletin - Human-curated news about generative AI for Teaching and Learning in Higher Education. 12/11/2024

📔 Automatic AI Summaries Now in ProQuest
MSU’s ProQuest library databases now include an AI “Research Assistant” in the article sidebar. The tool offers article summaries, related sources, key concepts, and research topics.
Learn More: Library Learning Space - https://librarylearningspace.com/proquest-launches-ai-powered-research-assistant-to-promote-responsible-ai-use-in-academia/

🔎 Introduction to Prompts
Organizes many practical tips for writing AI prompts into one framework. The article is specific to education and includes links to authoritative resources.
Learn More: Park, J., & Choo, S. (2024). Generative AI Prompt Engineering for Educators: Practical Strategies. Journal of Special Education Technology, 0(0). https://journals.sagepub.com/doi/10.1177/01626434241298954

🧬 Think of AI Uses as Along a Continuum
Monash University describes four examples of AI use in its courses:
1. Explore AI with students to build AI Literacy and discuss academic integrity.
2. Design assessments that focus on process rather than product to build critical thinking.
3. Incorporate new AI-enabled activities, like simulated personas.
4. Use AI for basic assessment, freeing educators to focus on personalized feedback.

Learn More: Hook, J., Junor, A., Sell, C., & Sapsed, C. (2024). Navigating integrity and innovation: Case studies of generative AI integration from an Arts Faculty. ASCILITE Publications, 165–172. https://publications.ascilite.org/index.php/APUB/article/view/1234/1478

Get the AI-Commons Bulletin on our Microsoft Teams channel, at aicommons.commons.msu.edu, or by email (send an email to aicommons@msu.edu with the word “subscribe”).

Posted on: #iteachmsu
Posted by about 1 year ago
I might have to fire Microsoft Copilot if it doesn't catch on soon. . . Let me explain. The second week of each semester, once enrollments have stabilized, I form my classes of 50 students into 10 student learning teams that will collaborate each week through Week 14. In the past, I have used a free, completely random online team-builder app to do this. It's a little time consuming, but basically pretty easy.

This summer, as I was developing 10 podcast episodes that address how we might better integrate GenAI into our classrooms (see The Collaborative Cafe@WSTKS-FM Worldwide on YouTube), it occurred to me that I might be able to engineer more cohesive student learning teams by collecting information from students on Day #1 about their academic strengths and preferences. My idea was to use Copilot to group students in such a way that each person would bring unique talents, skills and abilities to the collaborative table, making for stronger teams that would work more effectively together.

Sounds easy enough, right? Done in just a few minutes, right? Au contraire!

Actually, I ended up spending at least as much time, if not more, double-checking Copilot's problematic output. Here's what I kept running into. Despite a fairly straightforward prompt, Copilot neglected to include ALL students in the class list and doubled or tripled up on other names, randomly ignoring some names and their assets/preferences while assigning others to two or three learning teams at the same time. This happened more than once despite repeated attempts to clarify my initial prompt(s), and Copilot never managed to correct its errors.

In the end, quite a bit of additional time was necessary to comb through what Copilot spit out and fix its mistakes to ensure all 50 students in each section were, in fact, assigned to five-person learning teams. What should have taken five minutes at most took more than two hours when all was said and done. Time I had not anticipated and don't really have to waste.

Sigh. A rather frustrating way to start the semester. Live and learn, right?
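(An aside for readers facing the same task: the one guarantee Copilot kept violating, every student assigned exactly once, doesn't actually require an LLM. A few lines of ordinary code can shuffle and partition a roster deterministically. A minimal sketch, with hypothetical student names; balancing by self-reported strengths could be layered on top by sorting on a strength score and dealing students out round-robin:)

```python
import random

def build_teams(students, team_size=5, seed=None):
    """Shuffle a roster and partition it into teams of team_size.

    Every student is assigned exactly once -- no duplicates, no omissions.
    """
    if len(students) % team_size != 0:
        raise ValueError("Roster size must be a multiple of the team size.")
    rng = random.Random(seed)  # seed makes the grouping reproducible
    roster = list(students)
    rng.shuffle(roster)
    # Slice the shuffled roster into consecutive chunks of team_size
    return [roster[i:i + team_size] for i in range(0, len(roster), team_size)]

# Example: 50 hypothetical students -> 10 teams of 5
students = [f"Student {n:02d}" for n in range(1, 51)]
teams = build_teams(students, team_size=5, seed=42)
assert len(teams) == 10
assert sorted(name for team in teams for name in team) == sorted(students)
```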

Posted on: GenAI & Education
Posted by about 2 years ago
MSU-specific training on AI for educators is now available on D2L. It includes links to policy documents on AI, and has modules on AI basics, integrating AI into teaching, modifying teaching to discourage AI use, academic integrity and privacy concerns, and developing a personal strategy for approaching AI in the classroom: https://apps.d2l.msu.edu/selfenroll/course/2003221
(If directly clicking this link doesn't work, try copying and pasting the text into your browser - iTeach is doing something strange with the html and I can't directly edit it)

Posted on: Reading Group for Student Engagement and Success
Posted by almost 4 years ago
Hello again everyone! Our reading group on Student Engagement and Success is slated to meet for 90 minutes this Friday morning (October 22nd) at 10am. Hope to see you then. For your convenience, here are the questions we'll discuss (or use as jumping off points) related to Chapter One in our book Student Engagement in Higher Education, Third Edition:

Questions on Pendakur, Quaye, and Harper (Ch. 1)

1) What is your view of Pendakur, Quaye, and Harper’s assertion that U.S. higher education, in general, is obligated to do more to foster student engagement within and beyond the classroom? What might be some practical challenges to doing that?

2) In the Preface, Pendakur, Quaye, and Harper suggest that there is something temporally specific about the crisis of engagement they and their contributors describe. How would you describe engagement as a timely matter? In other words - what shape(s) does the issue of engagement take in 2021?

3) At the micro level (within our own teaching, advising, or other close work with students), how might we address the issue? What are some concrete steps we might take?

4) Describe your reaction(s) to the approach advocated at the bottom of p. 6, “Faculty and student affairs educators must foster the conditions to enable diverse populations of students to be engaged, persist, and thrive.” Where do you see difficulties with that aim? How might you nevertheless integrate that goal into your own practices? What might you change or adapt?

5) What makes PQH’s intersectional and anti-deficit lens appealing for this type of research? In particular, how do you respond to the book’s organizational reliance upon identity-based systems of oppression (which, we should note, we’ve proposed to use as an organizing principle for our discussions as well)?

6) What are some concrete ways we might be more intentional in our teaching/advising practices or other close work with students when it comes to cultivating their engagement? How do we help them to help themselves?

7) Pendakur, Quaye, and Harper discuss Tinto’s assertion that academic (and social) communities are key to student engagement, performance, and retention (4-5). What is your own view? How might the use of academic communities (student learning teams) nevertheless present challenges of one kind or another? What might be some concrete steps we could take to ease or avoid potential issues?

8) Near the end of Chapter One, Pendakur, Quaye, and Harper acknowledge that “Linking theory and practice is not simple” (12). Realistically, how might we achieve at least some of what they call for? How could we maximize results -- “the amount of time and effort students put into their [Gen. Ed. or Prereq.] studies” -- without completely redesigning our courses and component classes/modules?

9) In the “Distinguishing Educationally Purposeful Engagement” section, PQH mention the National Survey of Student Engagement (NSSE), which has collected data on ten engagement indicators for approx. 4,000,000 college students since 2000. What, if any, familiarity do you have with the NSSE, and how do you respond to their engagement indicators (subcategorized under Academic Challenge, Learning with Peers, Experiences with Faculty, Campus Environment) and High-Impact Practices (service learning, study abroad, research with faculty, internships)?

10) PQH deride the so-called “magical thinking” philosophy that undergirds much traditional scholarship of engagement and insist, instead, that “educators must facilitate structured opportunities for these dialogues to transpire” (8). What experience have you had with this type of facilitation? How did it seem to benefit the students involved?

11) For your own courses, what would you prioritize when it comes to fostering greater student engagement? How might you create or improve conditions that could facilitate that?


Posted on: GenAI & Education
Posted by 7 months ago
AI Commons Bulletin 2/17/2025

📰 Chronicle of Higher Ed Launches AI Chatbot
The Chronicle of Higher Education has rolled out an AI-powered chatbot to help users navigate its vast archives and answer common higher ed questions. While details on its training data and accuracy are limited, this marks another step in AI’s growing role in academic media.

Learn More: https://www.chronicle.com/chron-faq

📕 New Book on AI and HE explores The Good, the Bad, and the Ugly
The good: AI is here to stay, so let’s make it work for students.
The bad: Convenience comes at the cost of deeper intellectual labor.
The ugly: AI risks shaping a culture of compliance—where decisions are guided by systems without consciousness or accountability.

Learn More: Pulk, K., & Koris, R. (Eds.). (2025). Generative AI in Higher Education. Cheltenham, UK: Edward Elgar Publishing.

❓ If You Teach AI Literacy, Don’t Forget to Assess the RAG as Well as the LLM
When LLMs use retrieval augmented generation (RAG), they can give more trustworthy responses. What does that mean? Ni and colleagues (2025) evaluate RAGs using NIST’s list of essentials:
Reliability
Privacy
Explainability
Fairness
Accountability
Safety
Learn More: https://doi.org/10.4337/9781035326020
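(In brief, RAG means the model first retrieves relevant passages and then generates an answer grounded in them. A toy sketch of the retrieval step using plain bag-of-words cosine similarity; the documents and queries are invented, and a real pipeline would use dense embeddings and pass the retrieved passages to an actual LLM as context:)

```python
from collections import Counter
import math

def cosine(a, b):
    # Cosine similarity between two bag-of-words Counters
    dot = sum(a[w] * b[w] for w in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query, documents, k=2):
    """Rank documents by word overlap with the query; return the top k.

    In a full RAG pipeline these passages would be handed to the LLM
    so its answer is grounded in retrieved text rather than memory alone.
    """
    q = Counter(query.lower().split())
    scored = sorted(documents,
                    key=lambda d: cosine(q, Counter(d.lower().split())),
                    reverse=True)
    return scored[:k]

docs = [
    "Retrieval augmented generation grounds answers in retrieved text.",
    "The library opens at nine on weekdays.",
    "Grounded generation cites the retrieved passages verbatim.",
]
top = retrieve("how does retrieval ground generation", docs, k=2)
```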

🤖 On the Horizon: More and More Automated Instruction, Less Faculty?
We should think critically before it’s too late. A study found students using an AI course tutor performed as well and were as satisfied as those in instructor-led courses. As publishers integrate AI tutors, instructors may rely on them more, reducing direct teaching.

Learn More: Chun et al. (2025). A Comparative Analysis of On-Device AI-Driven, Self-Regulated Learning and Traditional Pedagogy in University Health Sciences Education. Applied Sciences, 15(4), Article 4.

Bulletin items compiled by MJ Jackson and Sarah Freye with production assistance from Lisa Batchelder. Get the AI-Commons Bulletin on our Microsoft Teams channel, at aicommons.commons.msu.edu, or by email (send an email to aicommons@msu.edu with the word “subscribe”).

Posted on: Teaching Toolkit Tailgate
Posted by about 5 years ago
Good morning! I'm Shannon Lynn Burton, your #AMA host for September 2nd, 2020! I am the University's Ombudsperson and can help answer questions related to the Code of Teaching Responsibility, Syllabus design, academic integrity, as well as navigating difficult conversations with students in the classroom (and more!). For more information on our office, please visit www.ombud.msu.edu. Let me know what questions you have around student rights & responsibilities!

Posted on: GenAI & Education
Posted by 6 months ago
AI Commons Bulletin 3/12/2025

🔊 MSU IT Announces ChatGPT and Gemini “Coming Soon”
The new AI page on tech.msu.edu teases that Gemini and ChatGPT are “coming soon.” But it is unclear if the applications will be available for purchase or if the campus community will have free access to the latest foundational models.

Learn More: https://tech.msu.edu/technology/ai/

✍️ Departments at Johns Hopkins Integrated AI into their Curriculum Development Process
It’s like experiential learning for faculty – integrate AI into a standard task that you need to do anyway. The article also includes a list of very concrete, bite-sized learning objectives for learning to use AI, like: name 3 chatbots, start a chat, list 3 ways to make a better prompt.

Learn More: Khamis, N., et al. (2025). More intelligent faculty development: Integrating GenAI in curriculum development programs. Medical Teacher, 1–3.

⚙️ AI Tools Are Being Used for All Stages of the Scientific Research Process
This working paper gives a quite in-depth description of several AI tools being used for each step of the research cycle: (1) lit review, (2) generating research ideas, (3) conducting experiments, (4) generating multimodal content, and (5) conducting peer review. Recommended for getting a good lay of the land.

Learn More: Eger, S., et al. (2025). Transforming Science with Large Language Models: A Survey on AI-assisted Scientific Discovery, Experimentation, Content Generation, and Evaluation.

📈 Grammarly Acquires Coda: From Writing Assistant to AI Productivity
Grammarly, popular with students and educators as a writing assistant software, just purchased the AI productivity company Coda. While Grammarly has previously positioned itself as a teaching tool for writing, this acquisition signals a move towards an AI productivity platform.

Learn More: https://www.grammarly.com/blog/company/grammarly-acquires-coda/

Bulletin items compiled by MJ Jackson and Sarah Freye with production assistance from Lisa Batchelder. Get the AI-Commons Bulletin on our Microsoft Teams channel, at aicommons.commons.msu.edu, or by email (send an email to aicommons@msu.edu with the word “subscribe”).

Posted on: CISAH
Posted by over 2 years ago
As part of what will hopefully be an ongoing conversation about the role of AI in teaching the arts and humanities, please consider sharing ways you have integrated (or are interested in integrating) generative AI like ChatGPT into your IAH teaching.