
Posted on: #iteachmsu
Friday, Aug 20, 2021
MSU Learning Communities are Spaces to Explore Ideas in Education, Teaching, and Learning
"Being a part of the Learning Communities at MSU has been a wonderful experience. Within our community we have had the opportunity to share ideas, brainstorm solutions to challenges commonly faced, and expand our thinking with individuals from a wide variety of departments. I have deeply appreciated being a part of this new campus-wide community and having a space to connect with faculty and academic staff in similar positions to my own. Seeing what the other Learning Communities are doing has helped with inspiration for our own progress," said Mary-Anne Reid co-facilitator of the Sharing Process Improvement Tools in Undergraduate Internships and Experiential Education Learning Community. 
Learning Communities are self-organized, safe, and supportive spaces for faculty and academic staff to address complicated questions of curriculum and pedagogy. Michigan State University has supported these initiatives since 2004 and continues to do so through a funding program administered by the Academic Advancement Network in collaboration with the Hub for Innovation in Learning and Technology.
See what Learning Communities are available
 
Different Aims, Different Practices
Dr. Michael Lockett, the program director, is quick to point out that the word “safe” is crucial to that statement of purpose, as it conveys the agency that members and facilitators of Learning Communities enjoy.
“Once a community is funded, our interventions in their work only take place at the most basic administrative level,” says Lockett. “It’s a space we designed to maximize autonomy and academic freedom.”
Learning Communities at MSU are free to propose their own topics and determine the structures that best support their interests. Accordingly, communities tend to vary greatly in their practices and topics. All communities, however, share three things in common: they meet at least eight times across the academic year, explore important educational themes, and welcome all members of MSU’s instructional staff, regardless of rank or discipline.
“We have approximately thirty communities running. That means approximately three hundred faculty members are contributing to and benefitting from the program.  Given that scale, there’s tremendous diversity in terms of topics and methods,” says Lockett. “Broadly defined, the conversations all connect back to ideas of education, teaching, and learning, but not necessarily in a formalized curricular context. We don’t limit their purview to credit-bearing courses at MSU and some communities are invested in educational topics that transcend this campus, or this country, or even this era.”
 
Dialogues Characterized by Freedom and Safety
Although many Learning Communities do not discuss fraught topics, some do.  “Because some groups explore topics related to critical pedagogy, they may require particular community structures,” says Lockett. “Which is to say the community is not closed but carefully defined.  All communities are inclusive. But the facilitators (those members responsible for the administration and protocol within the Community) determine the structure and it’s fair for them to ask their membership to commit to certain protocols.”
Some Communities only meet the required eight times during the academic year and encourage members to drop in or out at their discretion. Other Communities are working on highly complex questions of critical pedagogy, and require regular attendance, as the associated dialogues must be sustained and reflected upon. Ultimately, the facilitators decide the protocols for each Community. 
The conversations held in the Learning Communities might also involve very personal pedagogical experiences; those kinds of conversations require time, trust, and a sense of open inquiry to make the dialogue supportive and generative. The AAN strives to provide that atmosphere by respecting the autonomy of the facilitators and working diligently behind the scenes to design flexible administrative structures that can support diverse methods. Lockett says, “although it’s not necessarily their primary role, Learning Communities can be therapeutic spaces. There’s an emotional dimension to teaching, particularly in high-pressure contexts. These communities can become a place where people find support, where they can share and hopefully resolve some of the challenges they’re encountering, teacher-to-teacher.”
 
Why Learning Communities?
Variations on the Learning Communities program exist on many campuses. “Questions of curriculum and pedagogy are always complicated and often best addressed face-to-face,” says Lockett. “You can do a lot of important work through dialogue.  When colleagues get together to discuss curriculum and pedagogy, their conversations become nuanced and empathetic and situated in a way they can’t through other discursive forms.  They can also be highly creative and generative places where good ideas disseminate swiftly.”
 
Getting Involved
The Learning Communities at MSU grew over 150% last year, from 12 to 30 groups. Lockett credits the passion of the facilitators and the leadership of Drs. Grabill and Austin (Associate Provost for Teaching, Learning, and Technology, and Interim Associate Provost for Academic Staff Development, respectively).  He also applauds the work of his predecessor, Dr. Patricia Stewart, who advocated for the program’s continued existence and provided a vision of success. “We wouldn’t be seeing this level of engagement and success without Patti’s leadership and dedication to the program,” he says.
A full list of Learning Communities and the contact information of their facilitators is available below and on the Academic Advancement Network website, in addition to information on proposing new communities.
"As a co-facilitator of the ANS TLC the past few years, I have been impressed with our cohort’s desire to continue to become better educators. Our learning community focuses on presenting and supplying tools to our members that address their reported concerns of education, including but limited to instruction, assessment, and student engagement. Since the pandemic has rendered our instruction to be “survival mode”, the ANS TLC has reached out to provide tips and tricks to its members for better classroom experiences, in whatever platform is being used. We look forward to hosting monthly “Chitter-chatter What’s the Matter” discussions alongside our continual scaffolding of the ANS curriculum for the Fall 2020 semester." said Tasia Taxis, co-facilitator of the Department of Animal Science Teaching and Learning Community (ANS TLC) Learning Community.
 
Authored by: Gregory Teachout
Posted on: #iteachmsu
Tuesday, Dec 3, 2024
Instructional Guidance Is Key to Promoting Active Learning in Online and Blended Courses
Written by: Jay Loftus, Ed.D. (MSU / CTLI) & Michele Jacobsen, Ph.D. (Werklund School of Education, University of Calgary)
Abstract - Active learning strategies tend to originate from one of two dominant philosophical perspectives. The first treats active learning as an instructional philosophy, whereby inquiry-based and discovery learning are the primary modalities for acquiring new information. The second considers active learning a strategy that supplements more structured forms of instruction, such as direct instruction. From the latter perspective, active learning is employed to reinforce conceptual learning following the presentation of factual or foundational knowledge. This review focuses on the second perspective and the use of active learning as a strategy. We highlight the often overlooked requirement of including instructional guidance so that active learning is effective and efficient for learning and learners.
Keywords - Active learning, instructional guidance, design strategy, cognitive load, efficiency, online and blended courses
 
Introduction
Learner engagement in online courses has been a central theme in educational research for several years (Martin, Sun and Westing, 2020). As we consider the academic experiences during the COVID-19 pandemic, which began in 2020 and started to subside in 2022, it is essential to reflect on the importance of course quality (Cavanaugh, Jacquemin and Junker, 2023) and learner experience in online courses (Gherghel, Yasuda and Kita, 2023). Rebounding from our collective experience, learner engagement continues to be an important element of course design and delivery. This was highlighted in 2021, when the United States Department of Education (DOE) set forth new standards for institutions offering online courses. To be eligible for Title IV funding, the new standards require non-correspondence courses to ensure regular and substantive interactions (RSI) between instructors and students (Downs, 2021). This requirement necessitates finding ways to engage students that allow instructors to maximize their interactions. One possible solution is to use active learning techniques, which have been shown to increase student engagement and learning outcomes (Ashiabi & O'Neal, 2008; Cavanaugh et al., 2023).
Active learning is an important instructional strategy and pedagogical philosophy used to design quality learning experiences and foster engaging and interactive learning environments. However, this is not a novel perspective. Many years ago, in their seminal work, Chickering and Gamson (1987) discussed the issue of interaction between instructors and students, suggesting that it was an essential practice for quality undergraduate education. The focus on active learning strategies has become more pronounced following an examination of instructional practices from 2020 to 2022. For example, Tan, Chng, Chonardo, Ng and Fung (2020) examined how chemistry instructors incorporated active learning into their instruction to achieve learning experiences equivalent to those in pre-pandemic classrooms. Similarly, Misra and Mazelfi (2021) described the need to incorporate group work or active learning activities into remote courses to: ‘increase students’ learning motivation, enforce mutual respect for friends’ opinions, foster excitement’ (p. 228). Rincon-Flores & Santos-Guevara (2021) found that gamification, as a form of active learning, ‘helped to motivate students to participate actively and improved their academic performance, in a setting where the mode of instruction was remote, synchronous, and online’ (p. 43). Further, the implementation of active learning, particularly gamification, was found to be helpful for promoting a more humanizing learning experience (Rincon-Flores & Santos-Guevara, 2021).
This review examines the use of active learning and presents instructional guidance as an often-overlooked element that must be included to make active learning useful and effective. The omission of explicit and direct instructional guidance when using active learning can be inefficient, resulting in an extraneous cognitive burden on learners (Lange, Gorbunova, Shcheglova and Costley, 2022). We hope to outline our justification through a review of active learning and offer strategies to ensure that the implementation of active learning is effective.
Active Learning as an Instructional Philosophy
Active learning is inherently a ‘student-centered’ instructional paradigm derived from a constructivist epistemological perspective (Krahenbuhl, 2016; Schunk, 2012). Constructivism theorizes that individuals construct their understanding through interactions and engagements, with the refinement of skills and knowledge occurring over time (Cobb & Bowers, 1999). Through inquiry, students produce experiences and make connections that lead to logical and conceptual growth (Bada & Olusegun, 2015). Engaging learners in activities, tasks, and planned experiences is an overarching premise of active learning as an instructional philosophy. Under this overarching philosophy, the role of instructional guidance can be minimized. As Hammer (1997) pointed out many years ago, the role of the instructor in these environments is to provide content and materials, and students are left to make ‘discoveries’ through inquiry.
Inquiry-based learning (IBL) is an instructional practice that falls under the general category of ‘active learning’. The tenets of IBL adhere to a constructivist learning philosophy (de Jong et al., 2023) and can be characterized by the following six elements (Duncan & Chinn, 2021). Students will:

Generate knowledge through investigation of a novel issue or problem.
Work ‘actively’ to discover new findings.
Use evidence to derive conclusions.
Take responsibility for their own learning through ‘epistemological agency’ (Chinn & Iordanou, 2023) and share their learning with a community of learners.
Use problem-solving and reasoning for complex tasks.
Collaborate, share ideas, and derive solutions with peers.

Historically, inquiry-based learning as a form of active learning was adopted as an overall instructional paradigm in disciplines such as medicine and was closely aligned with problem-based learning (PBL) (Barrows, 1996). Proponents of PBL advocate its use because of its emphasis on the development of skills such as communication, collaboration, and critical thinking (Dring, 2019). Critics of these constructivist approaches to instruction highlight the absence of structure and of any form of instructional guidance (Zhang & Cobern, 2021). Instead, they advocate a more explicit form of instruction, such as direct instruction (Zhang, Kirschner, Cobern and Sweller, 2022).
The view that a hybrid of IBL coupled with direct instruction is the optimal approach to implementing active learning has been highlighted in the recent academic literature (de Jong et al., 2023). The authors suggest that the selection of direct instruction or active learning strategies, such as IBL, should be guided by the desired outcomes of instruction. If the goal of instruction is the acquisition of more foundational or factual information, direct instruction is the preferred strategy. Conversely, IBL strategies are more appropriate ‘for the promotion of deep understanding and transferrable conceptual understanding of topics that are open-ended or susceptible to misconceptions’ (de Jong et al., 2023, p. 7).
The recommendation to use both direct instruction and approaches like IBL has reframed active learning as an instructional strategy rather than an overarching pedagogical philosophy. Active learning should be viewed as a technique or strategy coupled with direct instructional approaches (de Jong et al., 2023).
Active Learning as an Instructional Strategy
Approaching active learning as an instructional strategy rather than an overarching instructional philosophy helps clarify and address the varying perspectives found in the literature. Zhang et al. (2022) suggested that there is a push to emphasize exploration-based pedagogy. This includes instructional approaches predicated on inquiry, discovery, or problem-based approaches. This emphasis has resulted in changes to curricular policies that mandate the incorporation of these instructional philosophies. Zhang et al. (2022) discussed how active learning approaches have been incorporated into science education policy to emphasize ‘inquiry’ approaches, despite the lack of adequate evidence of effectiveness. Zhang et al. (2022) stated that the ‘disjoint between policy documents and research evidence is exacerbated by the tendency to ignore categories of research that do not provide the favored research outcomes that support teaching science through inquiry and investigations’ (p. 1162). Instead, Zhang et al. (2022) advocate for direct instruction as the primary mode of instruction in science education, with active learning or ‘inquiry’ learning incorporated as a strategy, arguing that conceptual or foundational understanding ‘should not be ‘traded off’ by prioritizing other learning outcomes’ (p. 1172).
In response to Zhang et al.'s (2022) critique, de Jong et al. (2023) argued that research evidence supports the use of inquiry-based instruction for the acquisition of conceptual understanding in science education. They asserted that both inquiry-based (or active learning) approaches and direct instruction serve specific learning needs. Direct instruction may be superior for foundational or factual learning, while inquiry-based or active learning may be better for conceptual understanding and reinforcement. The conclusion of de Jong et al.'s (2023) argument suggests using a hybrid of direct instruction and active learning techniques, such as inquiry-based designs, depending on the stated learning objectives of the course or the desired outcomes.
This hybrid approach to instructional practice can help ensure that intended learning outcomes are matched with effective instructional strategies. Furthermore, a hybrid approach can help maintain efficiency in learning rather than leaving the acquisition of stated learning outcomes to discovery or happenstance (Slocum & Rolf, 2021).  This notion was supported by Nerantzi's (2020) suggestion that ‘students learn best when they are active and immersed in the learning process, when their curiosity is stimulated, when they can ask questions and debate in and outside the classroom, when they are supported in this process and feel part of a learning community’ (p. 187). Emphasis on learner engagement may support the belief that active learning strategies combined with direct instruction may provide an optimal environment for learning. Active learning strategies can be used to reinforce the direct or explicit presentation of concepts and principles (Lapitan Jr, Tiangco, Sumalinog, Sabarillo and  Diaz, 2021).
Recently, Zhang (2022) examined the importance of integrating direct instruction with hands-on investigation as an instructional model in high school physics classes. Zhang (2022) determined that ‘students benefit more when they develop a thorough theoretical foundation about science ideas before hands-on investigations’ (p. 111). This supports the earlier research in post-secondary STEM disciplines as reported by Freeman, Eddy, McDonough and Wenderoth (2014), where the authors suggested that active learning strategies help to improve student performance. The authors further predicted that active learning interventions would show more significant learning gains when combined with ‘required exercises that are completed outside of formal class sessions’ (p. 8413).
Active Learning Strategies
Active learning is characterized by activities, tasks, and learner interactions. Several characteristics of active learning have been identified, including interaction, peer learning, and instructor presence (Nerantzi, 2020). Technology affords students learning opportunities to connect pre-, during-, and post-formal learning sessions (Zou & Xie, 2019; Nerantzi, 2020). The techniques that instructors use help determine the types of interactions and outcomes that will result. Instructors may be ‘present’ or active in the process but may not provide adequate instructional guidance for techniques to be efficient or effective (Cooper, Schinske and Tanner, 2021; Kalyuga, Chandler and Sweller, 2001). To highlight this gap, we first consider the widely used technique of think-pair-share, an active learning strategy first introduced by Lyman (1981). This strategy was introduced to provide all students equitable opportunities to think and discuss ideas with their peers. The steps involved in this technique were recently summarized (Cooper et al., 2021): (i) provide a prompt or question to students, (ii) give students a chance to think about the question or prompt independently, (iii) have students share their initial answers/responses with a neighbor in a pair or a small group, and (iv) invite a few groups to share their responses with the whole class.
Instructional guidance outlines the structure and actions associated with a task. This includes identifying the goals and subgoals and suggesting strategies or algorithms to complete the task (Kalyuga et al., 2001). Employing think-pair-share requires more instructional guidance than instructors may expect. The title of the strategy foreshadows what students will ‘do’ to complete the activity. However, instructional guidance is essential to help students focus on the outcome, rather than merely enacting the process of the activity. Furthermore, the instructional guidance or instructions given to students when employing think-pair-share can help make the activity more equitable. Cooper et al. (2021) point out that equity is an important consideration when employing think-pair-share. Often, think-pair-share activities are not equitable during the pair or share portion of the exercise and can be dominated by more vocal or boisterous students. Instructional guidance can help ensure that the activity is more equitable by providing more explicit instructions on expectations for sharing. For example, the instructions for a think-pair-share activity may require each student to compose and then share ideas on a digital whiteboard or on a slide within a larger shared slide deck, as in the sketch below. The opportunity for equitable learning must be built into the instructions given to students. Otherwise, the learning experience could be meaningless or lack the contribution of students who are timid or find comfort in a passive role during group learning.
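To make the idea concrete, the sketch below is our own minimal illustration (not drawn from any cited tool) of how explicit know-how for a think-pair-share activity might be assembled so that students can refer back to it; the prompt, timings, and sharing expectations are hypothetical placeholders.

```python
from dataclasses import dataclass

@dataclass
class ThinkPairShareGuidance:
    """Explicit 'know-how' for a think-pair-share activity (hypothetical example)."""
    prompt: str         # the question students will consider
    think_minutes: int  # independent thinking time
    pair_minutes: int   # paired discussion time
    share_method: str   # how every student records a contribution

    def handout(self) -> str:
        """Render step-by-step instructions students can refer back to."""
        steps = [
            f"1. THINK ({self.think_minutes} min): On your own, write a short response to: {self.prompt}",
            f"2. PAIR ({self.pair_minutes} min): Compare responses with a partner; note one agreement and one difference.",
            f"3. SHARE: {self.share_method}",
            "4. WHY: This sequence gives everyone time to form an idea before discussion, so sharing is not dominated by the fastest or loudest voices.",
        ]
        return "\n".join(steps)

# Example usage with placeholder content
guidance = ThinkPairShareGuidance(
    prompt="Which assumption of cognitive load theory matters most for online courses?",
    think_minutes=3,
    pair_minutes=5,
    share_method="Each student posts their pair's summary on the shared slide assigned to their group.",
)
print(guidance.handout())
```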
Further considerations for instructional guidance are necessary now that various forms of Information and Communications Technology (ICT) are used to promote active learning strategies. Web conferencing tools, such as Zoom, Microsoft Teams, and Google Meet, were used frequently during the height of required remote or hybrid teaching (Ahshan, 2021). Activities that separated students into smaller work groups via breakout rooms or unique discussion threads often included instructions on what students were to accomplish in these smaller collaborative groups. However, the expectations or guidance meant to direct students in these groups were often not explicit or were not accessible once the students had been arranged into their isolated workspaces. These active learning exercises would have benefited from clear guidance and instructions on how to ‘call for help’ once separated from the larger group meeting. For example, Li, Xu, He, He, Pribesh, Watson and Major (2021) described an activity for pair programming that uses Zoom breakout rooms. In their description, the authors outlined the steps learners were expected to follow to successfully complete the active learning activity, as well as the mechanisms students used to ask for assistance once isolated from the larger Zoom session that contained the entire class. The description by Li et al. (2021) provided an effective approach to instructional guidance for active learning using Zoom. Often, instructions are verbalized or difficult to refer to once individuals are removed from the general or common room. The lack of explicit instructional guidance in these activities can result in inefficiency (Kalyuga et al., 2001) and often inequity (Cooper et al., 2021).
The final active learning approach considered here is case study analysis in asynchronous discussion forums. To extend engagement with course content, students are assigned a case study to discuss in a group discussion forum. The group is invited to apply course concepts and respond to questions as they analyze the case and prepare recommendations and a solution (Hartwell et al., 2021). Findings indicate that case study analysis in discussion forums as an active learning strategy ‘encouraged collaborative learning and contributed to improvement in cognitive learning’ (Seethamraju, 2014, p. 9). While this active learning strategy can engage students with course materials and have them apply concepts in new situations, it can also produce a high-volume, low-yield set of responses and posts without sufficient instructional guidance and clear expectations for engagement and deliverables. Hartwell, Anderson, Hanlon, and Brown (2021) offer guidance on the effective use of online discussion forums for case study analysis, such as clear expectations for student work in teams (e.g., a team contract), ongoing teamwork support through regular check-ins and assessment criteria, clear timelines and tasks for individual analysis, combined group discussion and cross-case comparison, review of posted solutions, and requirements for clear connections between case analysis and course concepts.
Active Learning & Cognitive Load Theory
In a recent review of current policy and educational standards within STEM disciplines, Zhang et al. (2022) argued that structured instructional approaches such as direct instruction align more closely with cognitive-based learning theories. These theories are better at predicting learning gains and identifying how learning occurs. Cognitive load theory is one such theory and is based on three main assumptions. First, humans have the capacity to obtain novel information through problem-solving or from other people, and obtaining information from other individuals is more efficient than generating it through problem-solving. Second, acquired information is constrained by an individual’s limited capacity to first store information in working memory and then transfer it to unlimited long-term memory for later use. Problem-solving imposes a heavy burden on limited working memory, so learners often rely on information obtained from others. Finally, information stored in long-term memory can be transferred back to working memory to deal with familiar situations (Sweller, 2020). The recall of information from long-term memory to working memory is not bound by the limits that constrain the initial acquisition of information in working memory (Zhang et al., 2022).
Zhang et al. (2022) state that ‘there never is a justification for engaging in inquiry-based learning or any other pedagogically identical approaches when students need to acquire complex, novel information’ (p. 1170). This is clearly a one-sided argument that focuses on the acquisition of information rather than the application of acquired information. This also presents an obvious issue related to the efficiency of acquiring novel information. However, Zhang et al. (2022) did not argue against the use of active learning or inquiry learning strategies to help reinforce concepts, or the use of the same to support direct instruction.
The combination of active learning strategies with direct instruction can be informed by the assumptions of cognitive load theory, which highlight the need to include instructional guidance with active learning strategies. The inclusion of clear and precise instructions or instructional guidance is critical for effective active learning strategies (Murphy, 2023). As de Jong et al. (2023) suggest, ‘guidance is (initially) needed to make inquiry learning successful' (p. 9). We cannot assume that instructional guidance is implied through the name of the activity or can be determined from the previous learning experiences of students. Such assumptions lead to ambiguous learning environments that lack instructional guidance, force learners to infer expectations, and rely on prior and/or potentially limited active learning experiences. In the following section, we offer suggestions for improving the use of active learning strategies in online and blended learning environments by adding instructional guidance.
Suggestions for Improving the Use of Active Learning in Online and Blended Courses
The successful implementation of active learning depends on several factors. One of the most critical barriers to the adoption of active learning is student participation. As Finelli et al. (2018) highlighted, students may be reluctant to participate, demonstrating behaviors such as ‘not participating when asked to engage in an in-class activity, distracting other students, performing the required task with minimal effort, complaining, or giving lower course evaluations’ (p. 81). These behaviors are reminiscent of petulant adolescents and often discourage instructors from implementing active learning in the future. To overcome this, the authors suggested that providing a clear explanation of the purpose of the active learning exercise would help curb resistance to participation. More recently, de Jong et al. (2023) stated a similar perspective: ‘a key issue in interpreting the impact of inquiry-based instruction is the role of guidance’ (p. 5). The inclusion of clear and explicit steps for completing an active learning exercise is a necessary design strategy. This aspect of instructional guidance is relatively easy to achieve with the arrival of generative artificial intelligence (AI) tools that support instructors. As Crompton and Burke (2024) pointed out in their recent review, ‘ChatGPT can assist teachers in the creation of content, lesson plans, and learning activities’ (p. 384). More specifically, Crompton and Burke (2024) suggested that generative AI could be used to provide step-by-step instructions for students. To illustrate this point, we entered the following prompt into the generative AI tool goblin.tools (https://goblin.tools/): ‘Provide instructions given to students for a carousel activity in a college class.’ The output is shown in Figure 1. This tool breaks tasks down into steps and, if needed, can further break each step into a more discrete sequence of steps.

Figure 1. Goblin.tools instructions for carousel active learning exercises.
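As a rough analogue to the goblin.tools example, the sketch below shows how an instructor might request step-by-step student instructions from a general-purpose generative AI API. It is not the workflow used in this review: it assumes the OpenAI Python client with an API key in the environment, and the model name and prompt wording are illustrative only.

```python
# A minimal sketch (not the goblin.tools workflow): generating step-by-step
# activity instructions with a general-purpose LLM API. Assumes the `openai`
# Python package is installed and OPENAI_API_KEY is set in the environment.
from openai import OpenAI

client = OpenAI()

prompt = (
    "Provide instructions given to students for a carousel activity "
    "in a college class. Number each step and keep each step to one sentence."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model name; substitute whatever is available
    messages=[
        {"role": "system", "content": "You write clear, step-by-step instructions for students."},
        {"role": "user", "content": prompt},
    ],
)

# The generated steps can be pasted into the LMS page or breakout-room chat
# so students can refer back to them after being separated into groups.
print(response.choices[0].message.content)
```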
The omission of explicit steps or direct instructional guidance in an active learning exercise can potentially increase extraneous cognitive load (Klepsch & Seufert, 2020; Sweller, 2020). This pernicious impact on cognitive load results from diverting one’s limited working-memory capacity toward reconciling problems (Zhang, 2022). Furthermore, the complexity of active learning within an online or blended course is exacerbated by the inclusion of technologies used for instructional purposes. Instructional guidance should therefore include the requisite guidance for the tools used in active learning. Again, generative AI tools, such as goblin.tools, may help mitigate the potential burden on cognitive load. For example, the use of web conferencing tools, such as Zoom or Microsoft Teams, has been pervasive in higher education. Anyone who uses these tools can relate to situations in which larger groups are segmented into smaller groups in isolated breakout rooms. Once participants have been relocated, there is often confusion regarding the intended purpose or goals of the breakout room. Newer features, such as collaborative whiteboards, can exacerbate confusion and the potential for excessive extraneous load. Generative AI instructions (see Figure 2) could be created and offered to mitigate confusion and the burden on cognitive load.

Figure 2. Zoom collaborative whiteboard instructions produced by goblin.tools
 
Generative AI has the potential to help outline the steps in active learning exercises. This can minimize confusion and serve as a reference for students. However, instructions alone are often insufficient to make active learning effective. As Finelli et al. (2018) suggest, the inclusion of a rationale for implementing active learning is an effective mechanism to encourage student participation. To this end, we suggest the adoption of what Bereiter (2014) called Principled Practical Knowledge (PPK), which consists of the combination of ‘know-how’ with ‘know-why’. This perspective develops out of learners’ efforts to solve practical problems. It is a combination of knowledge that extends beyond simply addressing the task at hand; there is an investment of effort to provide a rationale or justification that addresses the ‘know-why’ portion of PPK (Bereiter, 2014). Creating conditions for learners to develop ‘know-how’ is critical when incorporating active learning strategies in online and blended courses. Instructional guidance can reduce ambiguity and extraneous load, and it can also increase efficiency and potentially equity.
What is typically not included in the instructional guidance offered to students is knowledge of the requirements of the technology that is often employed in active learning strategies. Ahshan (2021) suggests that technology skill competency is essential for instructors and learners to implement the activities smoothly. Therefore, instructional guidance should also cover the tools employed in active learning. Instructors cannot assume that learners share a universal baseline of technological competency and thus need to be aware of this diversity when providing instructional guidance.
An often-overlooked element of instructional guidance connected to PPK is the ‘know-why’ component. Learners are often prescribed learning tasks without a rationale or justification for their utility. The underlying assumption when implementing active learning strategies is that the benefits of collaboration, communication, and collective problem-solving are clear to learners (Dring, 2019; Hartikainen et al., 2019). However, these perceived benefits or rationales are often not provided explicitly to learners; instead, they are implied through use.
When implementing active learning techniques or strategies in a blended or online course, one needs to consider not only the ‘know-how’ but also the ‘know-why.’ Table 1 helps to identify the scope of instructional guidance that should be provided to students.
 
Table 1. Recommended Type of Instructional Guidance for Active Learning

|  | Know How | Know Why |
| --- | --- | --- |
| Activity | Steps | Purpose / Rationale |
| Technology | Steps | Purpose / Rationale |
| Outcomes / Products | Completion | Goals |
 
The purpose of providing clear and explicit instructional guidance to learners is to ensure efficiency, equity, and value when incorporating active learning strategies into online and blended learning environments. Along with our argument for ‘know-why’ (Bereiter, 2014), we draw upon Murphy (2023), who highlights the importance of ‘know-how’ by stating, ‘if students do not understand how a particular learning design helps them arrive at a particular outcome, they tend to be less invested in a course’ (n.p.).
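The scope outlined in Table 1 can also be treated as a simple planning structure. The sketch below is our own illustration (with hypothetical example content, not drawn from any cited source) of pairing know-how steps with a know-why rationale for the activity, the technology, and the outcomes of an exercise.

```python
# Illustrative only: Table 1 expressed as a data structure an instructor or
# designer might fill in when planning an active learning exercise.
from dataclasses import dataclass

@dataclass
class GuidanceRow:
    know_how: list[str]  # explicit steps
    know_why: str        # purpose / rationale shared with students

@dataclass
class ActiveLearningGuidance:
    activity: GuidanceRow
    technology: GuidanceRow
    outcomes: GuidanceRow

# Hypothetical example for a breakout-room case discussion
example = ActiveLearningGuidance(
    activity=GuidanceRow(
        know_how=["Read the case individually (5 min)",
                  "Agree on one recommendation in your breakout room (10 min)"],
        know_why="Discussing with peers reinforces the concepts introduced in the lecture.",
    ),
    technology=GuidanceRow(
        know_how=["Join your assigned breakout room",
                  "Post your group's recommendation on the shared whiteboard",
                  "Use 'Ask for Help' to call the instructor into your room"],
        know_why="Everyone records a contribution, and help remains reachable after leaving the main room.",
    ),
    outcomes=GuidanceRow(
        know_how=["Submit the whiteboard link before the end of class"],
        know_why="The posted recommendation is the evidence of completion and the starting point for next week's discussion.",
    ),
)
```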
Clear instructional guidance does not diminish the authenticity of various active learning strategies such as problem-based or inquiry-based techniques. In contrast, guidance serves to scaffold the activity and clearly outline learner expectations. Design standards organizations, such as Quality Matters, suggest the inclusion of statements that indicate a plan for how instructors will engage with learners, as well as the requirements for learner engagement in active learning. These statements regarding instructor engagement could be extended to include more transparency in the selection of instructional strategies. Murphy (2023) suggested that instructors should ‘pull back the curtain’ and take a few minutes to share the rationale and research that informs their decision to use strategies such as active learning. Opening a dialogue about the design process with students helps to manage expectations and anxieties that students might have in relation to the ‘What?’, ‘Why?’ and ‘How?’ for the active learning exercises.
Implications for Future Research
We contend that a blend of direct instruction and active learning strategies is optimized by instructional guidance, which provides explicit know-how and know-why for students to engage in learning tasks and activities. The present discussion does not intend to evaluate the utility of active learning as an instructional strategy. The efficacy of active learning is a recurring theme in the academic literature, and the justification for efficacy is largely anecdotal or based on self-reported data from students (Hartikainen, Rintala, Pylväs and Nokelainen, 2019). Regardless, the process of incorporating active learning strategies with direct instruction appears to be beneficial for learning (Ahshan, 2021; Christie & De Graaff, 2017; Mintzes, 2020), even though the learning experience itself can be harder to quantify. Our argument relates to the necessary inclusion of instructions and guidance that make the goals of active learning more efficient and effective (de Jong et al., 2023). Scardamalia and Bereiter (2006) observed that ‘knowledge about’ dominates traditional educational practice: it is the stuff of textbooks, curriculum guidelines, subject-matter tests, and typical school “projects” and “research” papers, and it would be the product of active learning as well. In contrast, ‘knowledge of’ ‘suffers massive neglect’ (p. 101). Knowledge of enables learners to do something and allows them to actively participate in an activity; it comprises both procedural and declarative knowledge and is activated when the need for it is encountered in action. Instructional guidance can help facilitate knowledge of, making the use of active learning techniques more efficient and effective.
Research is needed on the impact of instructional guidance on active learning strategies, especially when considering the incorporation of more sophisticated technologies and authentic problems (Rapanta, Botturi, Goodyear, Guardia and Koole 2021; Varvara, Bernardi, Bianchi, Sinjari and Piattelli, 2021). Recently, Lee (2020) examined the impact of instructor engagement on learning outcomes in an online course and determined that increased instructor engagement correlated with enhanced discussion board posts and student performance. A similar examination of the relationship between the instructional guidance provided and student learning outcomes would be a valuable next step. It could offer more explicit guidance and recommendations for the design and use of active learning strategies in online or blended courses.
Conclusion
Education was disrupted out of necessity for at least two years. This experience forced us to examine our practices in online and blended learning, as our sample size for evaluation grew dramatically. The outcome of our analysis is that effective design and the inclusion of student engagement and interaction with instructors are critical for quality learning experiences (Rapanta et al., 2021; Sutarto, Sari and Fathurrochman, 2020; Varvara et al., 2021). Active learning appeals to many students (Christie & De Graaff, 2017) and instructors, as it can help achieve many of the desired and required outcomes of our courses and programs. Our review and discussion highlighted the need to provide clear and explicit guidance to help minimize cognitive load and guide students through an invaluable learning experience. Further, instructors and designers who include explicit guidance engage in a metacognitive process as they outline the purpose and sequence of steps required to complete active learning exercises. Creating instructions and providing a rationale for the use of active learning in a course gives instructors and designers an opportunity to reflect on the process and ensure that it aligns with the intended purpose or stated goals of the course. This reflective act makes the use of active learning more intentional, rather than a way of merely ensuring that students are present within the learning space.
 
References
Ahshan, R. (2021). A Framework of Implementing Strategies for Active Student Engagement in Remote/Online Teaching and Learning during the COVID-19 Pandemic. Education Sciences, 11(9). https://doi.org/10.3390/educsci11090483
Ashiabi, G. S., & O'Neal, K. K. (2008). A Framework for Understanding the Association Between Food Insecurity and Children’s Developmental Outcomes. Child Development Perspectives, 2(2), 71–77.
Bada, S. O., & Olusegun, S. (2015). Constructivism learning theory: A paradigm for teaching and learning. Journal of Research & Method in Education, 5(6), 66–70.
Barrows, H. S. (1996). Problem‐based learning in medicine and beyond: A brief overview. New Directions for Teaching and Learning, 1996(68), 3–12.
Bereiter, C. (2014). Principled practical knowledge: Not a bridge but a ladder. Journal of the Learning Sciences, 23(1), 4–17.
Cavanaugh, J., Jacquemin, S. J., & Junker, C. R. (2023). Variation in student perceptions of higher education course quality and difficulty as a result of widespread implementation of online education during the COVID-19 pandemic. Technology, Knowledge and Learning, 28(4), 1787–1802.
Chinn, C. A., & Iordanou, K. (2023). Theories of Learning. Handbook of Research on Science Education: Volume III.
Christie, M., & De Graaff, E. (2017). The philosophical and pedagogical underpinnings of Active Learning in Engineering Education. European Journal of Engineering Education, 42(1), 5–16.
Cobb, P., & Bowers, J. (1999). Cognitive and situated learning perspectives in theory and practice. Educational Researcher, 28(2), 4–15.
Cooper, K. M., Schinske, J. N., & Tanner, K. D. (2021). Reconsidering the share of a think–pair–share: Emerging limitations, alternatives, and opportunities for research. CBE—Life Sciences Education, 20(1), fe1.
Crompton, H., & Burke, D. (2024). The Educational Affordances and Challenges of ChatGPT: State of the Field. TechTrends, 1–13.
de Jong, T., Lazonder, A. W., Chinn, C. A., Fischer, F., Gobert, J., Hmelo-Silver, C. E., Koedinger, K. R., Krajcik, J. S., Kyza, E. A., & Linn, M. C. (2023). Let’s talk evidence–The case for combining inquiry-based and direct instruction. Educational Research Review, 100536.
Dring, J. C. (2019). Problem-Based Learning – Experiencing and understanding the prominence during Medical School: Perspective. Annals of Medicine and Surgery, 47, 27–28. https://doi.org/10.1016/j.amsu.2019.09.004
Duncan, R. G., & Chinn, C. A. (2021). International handbook of inquiry and learning. Routledge.
Finelli, C. J., Nguyen, K., DeMonbrun, M., Borrego, M., Prince, M., Husman, J., Henderson, C., Shekhar, P., & Waters, C. K. (2018). Reducing student resistance to active learning: Strategies for instructors. Journal of College Science Teaching, 47(5).
Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences, 111(23), 8410–8415.
Hammer, D. (1997). Discovery learning and discovery teaching. Cognition and Instruction, 15(4), 485–529.
Hartikainen, S., Rintala, H., Pylväs, L., & Nokelainen, P. (2019). The Concept of Active Learning and the Measurement of Learning Outcomes: A Review of Research in Engineering Higher Education. Education Sciences, 9(4). https://doi.org/10.3390/educsci9040276
Hartwell, A., Anderson, M., Hanlon, P., & Brown, B. (2021). Asynchronous discussion forums: Five learning designs.
Kalyuga, S., Chandler, P., & Sweller, J. (2001). Learner experience and efficiency of instructional guidance. Educational Psychology, 21(1), 5–23.
Klepsch, M., & Seufert, T. (2020). Understanding instructional design effects by differentiated measurement of intrinsic, extraneous, and germane cognitive load. Instructional Science, 48(1), Article 1.
Krahenbuhl, K. S. (2016). Student-centered Education and Constructivism: Challenges, Concerns, and Clarity for Teachers. The Clearing House: A Journal of Educational Strategies, Issues and Ideas, 89(3), 97–105. https://doi.org/10.1080/00098655.2016.1191311
Lange, C., Gorbunova, A., Shcheglova, I., & Costley, J. (2022). Direct instruction, worked examples and problem solving: The impact of instructional strategies on cognitive load. Innovations in Education and Teaching International, 1–13.
Lapitan Jr, L. D., Tiangco, C. E., Sumalinog, D. A. G., Sabarillo, N. S., & Diaz, J. M. (2021). An effective blended online teaching and learning strategy during the COVID-19 pandemic. Education for Chemical Engineers, 35, 116–131.
Lee, J. W. (2020). The roles of online instructional facilitators and student performance of online class activity. Journal of Asian Finance, Economics and Business, 7(8), 723–733.
Li, L., Xu, L. D., He, Y., He, W., Pribesh, S., Watson, S. M., & Major, D. A. (2021). Facilitating online learning via zoom breakout room technology: A case of pair programming involving students with learning disabilities. Communications of the Association for Information Systems, 48(1), 12.
Lyman, F. (1981). Strategies for reading comprehension: Think pair share. Unpublished paper, University of Maryland.
Mintzes, J. J. (2020). From constructivism to active learning in college science. Active Learning in College Science: The Case for Evidence-Based Practice, 3–12.
Misra, F., & Mazelfi, I. (2021). Long-distance online learning during pandemic: The role of communication, working in group, and self-directed learning in developing student’s confidence. 225–234.
Murphy, J. T. (2023). Advice | 5 Ways to Ease Students Off the Lecture and Into Active Learning. The Chronicle of Higher Education. https://www.chronicle.com/article/5-ways-to-ease-students-off-the-lecture-and-onto-active-learning
Nerantzi, C. (2020). The use of peer instruction and flipped learning to support flexible blended learning during and after the COVID-19 Pandemic. International Journal of Management and Applied Research, 7(2), 184–195.
Rapanta, C., Botturi, L., Goodyear, P., Guàrdia, L., & Koole, M. (2021). Balancing technology, pedagogy and the new normal: Post-pandemic challenges for higher education. Postdigital Science and Education, 3(3), 715–742.
Rincon-Flores, E. G., & Santos-Guevara, B. N. (2021). Gamification during Covid-19: Promoting active learning and motivation in higher education. Australasian Journal of Educational Technology, 37(5), 43–60. https://doi.org/10.14742/ajet.7157
Scardamalia, M., & Bereiter, C. (2006). Knowledge building. In R. K. Sawyer (Ed.), The Cambridge handbook of the learning sciences (pp. 97–115). Cambridge University Press.
Schunk, D. H. (2012). Learning theories an educational perspective. Pearson Education, Inc.
Seethamraju, R. (2014). Effectiveness of using online discussion forum for case study analysis. Education Research International, 2014.
Slocum, T. A., & Rolf, K. R. (2021). Features of direct instruction: Content analysis. Behavior Analysis in Practice, 14(3), 775–784.
Sutarto, S., Sari, D. P., & Fathurrochman, I. (2020). Teacher strategies in online learning to increase students’ interest in learning during COVID-19 pandemic. Jurnal Konseling Dan Pendidikan, 8(3), 129–137.
Sweller, J. (2020). Cognitive load theory and educational technology. Educational Technology Research and Development, 68(1), 1–16.
Tan, H. R., Chng, W. H., Chonardo, C., Ng, M. T. T., & Fung, F. M. (2020). How chemists achieve active learning online during the COVID-19 pandemic: Using the Community of Inquiry (CoI) framework to support remote teaching. Journal of Chemical Education, 97(9), 2512–2518.
Varvara, G., Bernardi, S., Bianchi, S., Sinjari, B., & Piattelli, M. (2021). Dental Education Challenges during the COVID-19 Pandemic Period in Italy: Undergraduate Student Feedback, Future Perspectives, and the Needs of Teaching Strategies for Professional Development. Healthcare, 9(4). https://doi.org/10.3390/healthcare9040454
Zhang, L. (2022). Guidance differs between teaching modes: Practical challenges in integrating hands-on investigations with direct instruction. Learning: Research and Practice, 8(2), 96–115.
Zhang, L., & Cobern, W. W. (2021). Confusions on “guidance” in inquiry-based science teaching: A response to Aditomo and Klieme (2020). Canadian Journal of Science, Mathematics and Technology Education, 21, 207–212.
Zhang, L., Kirschner, P. A., Cobern, W. W., & Sweller, J. (2022). There is an evidence crisis in science educational policy. Educational Psychology Review, 34(2), 1157–1176.
Zou, D., & Xie, H. (2019). Flipping an English writing class with technology-enhanced just-in-time teaching and peer instruction. Interactive Learning Environments, 27(8), 1127–1142.
 
Authored by: Jay Loftus
Posted on: #iteachmsu
Tuesday, Aug 24, 2021
Comparative Analysis of Crowdmark and Gradescope
Executive Summary 
This analysis presents a review and comparison of two instructional technologies for administering and digitally grading online and in-person assessments: Crowdmark and Gradescope. We tested both instructor and student workflows for creating, submitting, and grading assessments using Crowdmark and Gradescope integrated with a test course in D2L. Our evaluation criteria included ease of use, features available, accessibility, and flexibility. We found some key similarities:

Remote and in person assessments are supported, with multiple question types.
Grading is done by question rather than by student for more consistency.
Multiple graders can grade assignments, such as co-instructors and teaching assistants.
Grades are synced automatically with the gradebook in D2L Brightspace.

The primary differences between these two are:

Crowdmark can assign assessments by section, and drag-and-drop functionality is available for rubric comments.
Crowdmark emails students when assessments become available and can accept more file types as well as rotate files more easily.
Gradescope allows for time extensions at the course level as well as for each assessment and allows for grading the assessments before the due date.

Based on these findings, we recommend continuing with Crowdmark, the more established and familiar tool. Although Gradescope includes some extra functionalities over Crowdmark, such as programming assessments, these functions are already handled by other tools or have not been used often or at all by faculty (e.g., CSE 231 Introduction to Programming uses Mimir for programming assignments). Crowdmark also offers fast grade sync with the D2L gradebook, and its scanning and matching capabilities are more robust for in-person assessments.
"The second-best way to grade exams" by ilmungo is licensed under CC BY-NC-SA 2.0

Methods
We tested both instructor and student workflows for creating and submitting assessments using Crowdmark and Gradescope integrated with a test course in D2L. Sample assignments were created for the remote assessments that included all of the available question types (i.e., upload file, enter text, multiple choice, etc.). Using separate accounts, we assigned the assessments as an instructor, submitted the assessments as a student, then returned to the instructor account to grade the assessments and sync the grades to our D2L test course. 
Findings
Key Similarities:
Both Crowdmark and Gradescope offer keyboard shortcuts for faster grading; allow late submissions, group submissions, and enforced time limits; and allow grading by question instead of by student, as well as multiple graders such as teaching assistants. Assignment submissions can include PDF or image upload, free response/short answer in a text box, or multiple choice/multi-select questions (with bubble sheets) for online assessments. For both tools, students can upload one PDF and then drag and drop each page to match each question for remote assessments, while instructors can scan and upload student submissions in batches for in-person assessments. Both tools will also attempt to split a batch PDF into individual student submissions, as illustrated in the sketch below.
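Both tools automate this splitting; to illustrate the general idea only, the sketch below is a generic example using the pypdf library and a fixed page count per exam, not the algorithm either product actually uses.

```python
# Generic illustration (not Crowdmark's or Gradescope's implementation):
# split a batch-scanned PDF into per-student files, assuming every exam
# has the same fixed number of pages. Requires the `pypdf` package.
from pypdf import PdfReader, PdfWriter

def split_batch_scan(batch_path: str, pages_per_exam: int, out_prefix: str = "submission") -> list[str]:
    reader = PdfReader(batch_path)
    out_files = []
    for start in range(0, len(reader.pages), pages_per_exam):
        writer = PdfWriter()
        for i in range(start, min(start + pages_per_exam, len(reader.pages))):
            writer.add_page(reader.pages[i])
        out_path = f"{out_prefix}_{start // pages_per_exam + 1:03d}.pdf"
        with open(out_path, "wb") as fh:
            writer.write(fh)
        out_files.append(out_path)
    return out_files

# Example: a 300-page scan of 6-page exams yields submission_001.pdf ... submission_050.pdf
# split_batch_scan("batch_scan.pdf", pages_per_exam=6)
```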
Key Differences:
Accessing Tools
Students have to log in to Crowdmark through the Crowdmark website. This link can be added to D2L Brightspace and opened in a new, external web page. The Crowdmark sign-in prompts students to select their institution and then uses the students’ Brightspace login. Gradescope can be added to D2L Brightspace as an External Tool in a D2L content module. This allows students to access Gradescope within D2L as an embedded website within the D2L page, rather than as an external page, and does not require any additional login.
Creating Assessments
When creating assessments in Crowdmark, instructors choose between administered (in-person) assessments that instructors will upload or assigned (remote) assessments that students will upload (Figure 1). Administered assessments can include bubble sheets for multiple choice questions. Assigned remote assessments can include file upload, text entry responses, or multiple-choice questions (which are automatically graded).
When creating an assignment in Gradescope, the assignment type must be chosen first. Then, for the first three assignment types, the submission type is designated as either the instructor or the students (Figure 2). Although Exam/Quiz and Homework/Problem Set are offered as two different choices, they actually have the same options and essential functions. There are no further options if the instructor will be uploading the assessments, but other options are available if students will be uploading. Submissions can be variable length, where students submit any number of pages and indicate the pages where their question responses are, or fixed length, where students submit work with answers in fixed locations (like worksheets). Instructors can also allow students to view and download the assessment template if desired. Multiple choice assignments can be created with printable bubble sheets that either instructors or students can upload. Programming assignments are available, which Crowdmark does not support, and they can be automatically or manually graded.
Figure 1: Assessment types available in Crowdmark.

Figure 2: Assessment types available in Gradescope.
Both tools allow students to take online quizzes. Both have multiple choice and multi-select questions that are auto-graded, and both have free response and file upload questions that are NOT auto-graded. Gradescope supports short answer questions, which are auto-graded, but Crowdmark only has free response questions.
For assignments that students will upload, instructors must input text or upload a document for each individual question in Crowdmark. It is possible for an instructor to upload one document in the instructions field that contains all of the assignment questions and then simply enter numbers in the text boxes for each question, rather than the text of each question. Gradescope only requires one document to be uploaded. Each question is then identified by dragging a box around the question area on the page, and a question title must be entered.
Assigning & Distributing Assessments
For courses with several sections, Crowdmark allows assessments to be assigned to specific sections rather than the entire course. To approximate this feature in Gradescope, an instructor would have to create separate Gradescope courses or duplicate assignments and direct students to the appropriate version for their section.
Both tools allow instructors to set individual accommodations for each assignment to customize the due date, lateness penalty, or time to complete. However, Gradescope also allows course-wide extensions for students, where extensions can be added for all assignments to customize time limits (multiply time by x or add x minutes) and due dates; a sketch of this arithmetic follows below. Crowdmark requires accommodations to be made in the submission area for each assignment and does not support course-wide accommodations.
When an assessment is assigned and released to students, Crowdmark sends a notification email, whereas Gradescope only sends an in-platform notification. Gradescope does send a confirmation email when students successfully submit an assignment. Both tools give instructors the option to send a notification email when returning student work.
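The extension arithmetic described above is simple; the function below is our own sketch of how a multiplier or added minutes would adjust a base time limit, not code from either vendor.

```python
# Illustration of the extension arithmetic described above (our own sketch,
# not vendor code): an accommodation either multiplies the base time limit
# or adds a fixed number of minutes.
def extended_time_limit(base_minutes: float, multiplier: float = 1.0, extra_minutes: float = 0.0) -> float:
    """Return the adjusted time limit for a student with accommodations."""
    return base_minutes * multiplier + extra_minutes

# A 60-minute quiz with time-and-a-half: 90 minutes
assert extended_time_limit(60, multiplier=1.5) == 90
# The same quiz with a flat 20-minute extension: 80 minutes
assert extended_time_limit(60, extra_minutes=20) == 80
```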
Submitting Assessments
For in-person assessments, Crowdmark can include a QR code on assignments to ensure that every page of student work is correctly matched to the appropriate student for grading. The QR code can be manually scanned and matched to each student using an app as the assignment is turned in, or instructors can use automated matching (beta) to include a form field where students write their name and ID number for automated character recognition to identify the student and match them to that assignment’s QR code (see the sketch after this section for the general idea). Gradescope is developing a feature to create a unique label for each copy of an assignment and add that label to each page, but this is not currently available.
Submitted file types are more flexible in Crowdmark, which can support PDF, JPEG, PNG, and iPhone photos, any of which can be rotated after submission. Gradescope accepts only PDFs or JPEGs, and only PDF pages can be rotated. This means that Crowdmark offers much more flexibility in scanning software and orientation. Gradescope does have a built-in PDF scanner for iOS devices to circumvent format issues and allow seamless upload. Both tools assume that image submissions are of work associated with a single question. All work can be scanned into a single PDF for upload and each page then manually associated with each question in the assignment. In both tools, the student selects which question(s) are associated with each page(s), where multiple questions may be on a single page or multiple pages may be associated with a single question.
Crowdmark allows for group submissions when either the instructor or the students scan and upload the assessments. This ability to match multiple students to one assessment allows for two-stage exams, collaborative lab reports, or other group assignments. Gradescope only allows group submissions when students scan and upload assessments, although online assignments also allow group submissions.
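As a generic illustration of the QR-matching idea mentioned above (not Crowdmark's implementation), the snippet below uses the widely available qrcode library to generate one code per printed exam copy; the encoded payload format and file names are hypothetical.

```python
# Generic illustration of the QR-matching idea (not Crowdmark's implementation):
# generate one QR code per exam copy so scanned pages can later be matched
# back to a booklet. Requires the `qrcode` package (with Pillow installed).
import qrcode

def make_booklet_codes(assessment_id: str, num_copies: int) -> list[str]:
    paths = []
    for copy_number in range(1, num_copies + 1):
        payload = f"{assessment_id}:copy-{copy_number:04d}"  # hypothetical payload format
        img = qrcode.make(payload)
        path = f"qr_{assessment_id}_{copy_number:04d}.png"
        img.save(path)
        paths.append(path)
    return paths

# Example: 120 printed copies of a midterm labeled "MTH101-MT1"
# make_booklet_codes("MTH101-MT1", 120)
```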
Grading Assessments
In Gradescope, assignments can be graded immediately after students have submitted them. Crowdmark does not allow grading to begin until the due date has passed.
In Crowdmark, all feedback comments created for each question are stored in a comment library, which can be reordered easily by dragging a comment to the desired location. There is no limit on the number of comments that can be dragged and dropped onto each student’s submission. Crowdmark comments can have positive or negative points attached to them, but specifying points is not required. Gradescope does not allow dragging and dropping multiple comments; however, text annotations are saved for each question and several can be applied to each submission. Gradescope’s separate rubric comments must be associated with positive or negative points for each question. The rubric type can be either negative scoring, where the points are subtracted from 1.0, or positive scoring, where the points are added to 0. Score bounds can also be set, with a maximum of 1.0 and a minimum of 0. While it is possible to select more than one rubric comment, only one comment can be added as part of a “submission specific adjustment,” which can include an additional point adjustment.
Crowdmark sends grades to D2L and automatically creates the grade item in the gradebook. Gradescope requires that the grade item be created first and then associated with an assignment before grades can be sent.
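To clarify the negative- versus positive-scoring behavior described above, the following sketch computes a question score from the points on selected rubric items, with optional score bounds. The 1.0 maximum and 0 minimum follow the description above; the function itself is our own illustration, not Gradescope's actual implementation.

# Illustrative sketch only, not Gradescope's code. Negative scoring subtracts
# selected item points from full credit (1.0 as described above); positive
# scoring adds them to 0; optional score bounds clamp the result to [0, 1.0].

def question_score(selected_item_points, scoring="negative",
                   full_credit=1.0, apply_bounds=True, adjustment=0.0):
    """Compute a question score from the points on applied rubric items.

    selected_item_points -- point values attached to each applied rubric item
    scoring              -- "negative" (subtract from full credit) or "positive" (add to 0)
    adjustment           -- an extra submission-specific point adjustment
    """
    total = sum(selected_item_points)
    score = full_credit - total if scoring == "negative" else total
    score += adjustment
    if apply_bounds:
        score = max(0.0, min(full_credit, score))
    return score

# Example: negative scoring with deductions of 0.2 and 0.3
print(question_score([0.2, 0.3], scoring="negative"))   # 0.5
# Example: positive scoring awarding 0.4 and 0.4
print(question_score([0.4, 0.4], scoring="positive"))   # 0.8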
Table 1: Feature Comparison between Crowdmark and Gradescope.

Accessing Tools
- Crowdmark: Must be accessed through a separate website; sign in to Crowdmark via Brightspace.
- Gradescope: Can be added as an External Tool in a D2L module and accessed within D2L (embedded website in a page).

Creating Assessments
- Crowdmark: Upload a PDF and designate where questions are for administered assessments that instructors upload (drag the question number to its location on the page).
- Gradescope: Upload a PDF and designate where questions are by dragging boxes on the page, for fixed-length exams/homework that students upload or administered exams/homework that instructors upload.
- Crowdmark: Must input or upload individual questions manually when creating remote assessments that students upload (but the instructor can upload a PDF in the directions area and just enter Q1, Q2, etc. in the text boxes).
- Gradescope: Must input question titles separately for variable-length submissions that students upload, but questions are designated by dragging a box over their location on the page (no need to enter the text of each question).

Assigning & Distributing Assessments
- Crowdmark: Can assign assessments to a section rather than the entire course.
- Gradescope: Cannot assign assessments to a section; must create a separate course or duplicate assignments and instruct students which one to submit.
- Crowdmark: Add time for accommodations for each assessment only (customize due date, lateness penalty, or time to complete).
- Gradescope: Add extensions at the course level and/or for each assessment (multiply time by x or add x minutes).
- Crowdmark: Students always receive an email when new assignments are ready to be completed.
- Gradescope: Students are not notified when new assignments are ready, but they do receive an email when they have submitted an assignment, and the instructor has the option to send an email once the assignment is graded.

Submitting Assessments
- Crowdmark: QR codes on printed work for in-person administered assessments (can also use an app to match assessments to students when scanning).
- Gradescope: Create printouts (beta) for in-person assessments; give each student a copy of the assignment with a unique label on each page (this tool is not yet available).
- Crowdmark: iPhone photos supported; can accept PDF, JPG, or PNG (and can rotate any file) for remote assignments submitted by students.
- Gradescope: iPhone photos not supported; accepts PDF or JPG only (can only rotate PDFs) for remote assignments submitted by students; multiple files and any file type accepted for online assignments.
- Crowdmark: Allows group submissions whether students or instructors are uploading assessments (i.e., match multiple students to one assessment).
- Gradescope: Allows group submissions only if students are uploading assessments, but also available for online assignments.

Grading Assignments
- Crowdmark: Must wait until the due date to begin grading remote assessments.
- Gradescope: Online assignments can be graded immediately.
- Crowdmark: Drag and drop any number of comments from the comment library for each question.
- Gradescope: Can apply one previously used comment per submission separate from the rubric; cannot select or drag and drop multiple comments, but can add multiple previously used text annotations for each question.
- Crowdmark: Comments can have positive or negative points attached to them, but specifying points is not required.
- Gradescope: Comments must have associated points (positive, negative, or 0) for each question; can change the rubric type from negative scoring (points subtracted from 1.0) to positive scoring (points added to 0) and enable/disable score bounds (maximum of 1.0, minimum of 0).
- Crowdmark: Grades sent to D2L automatically with no need to create the grade item first.
- Gradescope: Grades sent to D2L automatically, but the grade item must be created first.
 
MSU Usage Data
We explored the usage of each tool at MSU to determine if there was a perceptible trend towards one tool over the other. The total number of courses created in each tool is fairly similar (Table 2). Interestingly, the total number of students enrolled in those courses is much higher in Crowdmark, while the number of assessments administered is higher in Gradescope.
Table 2. Tool usage in courses with at least one student and at least one assessment.

              Crowdmark    Gradescope
Courses             322           292
Students         25,322        14,398
Assessments       3,308         4,494
Crowdmark has been used by MSU instructors since 2016; Gradescope has been used since 2018. More courses were created in Crowdmark until the 2020 calendar year (Figure 3). Usage of both tools spiked in 2020, presumably due to the COVID-19-induced shift to remote teaching, and was fairly equivalent that year. For the Spring 2021 semester, more courses have been created in Gradescope. It will be interesting to observe whether this trend towards Gradescope continues as 2021 progresses or whether Crowdmark usage picks back up.
Given the disparity between the number of students and the number of classes and assessments, we explored the distribution of class sizes for the two tools (Figure 4). Both tools have been used for classes of all sizes, though the median class size is 37 for Gradescope and 63 for Crowdmark. We also explored the distribution of the number of assessments per course (Figure 5). All but one course had 1-60 assessments, with both tools most frequently having 2-20 assessments per course. Gradescope showed an interesting secondary peak of courses with 35-45 assessments. For either tool, we do not have detailed information on what kinds of assessments were created, or whether all of those assessments were actually used rather than created for practice or as duplicates (e.g., made available later, made more accessible, or created as different versions for different class sections in Gradescope). A minimal sketch of how such tallies might be computed from a usage export follows the figure captions below.
Figure 3. Number of courses created in each tool that had at least one student and at least one assessment for each calendar year since 2016.

Figure 4. Number of courses having a given class size and at least one assessment.

Figure 5. Number of classes having a given number of assessments and at least one student.
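
For readers who want to reproduce this kind of tally from their own data, here is a minimal sketch. It assumes a hypothetical CSV export with one row per course and columns named tool, enrollment, and num_assessments; the file name and column names are our own assumptions, not an actual Crowdmark or Gradescope export format.

# Minimal sketch, assuming a hypothetical per-course CSV with columns
# "tool", "enrollment", and "num_assessments" (not a real vendor export).
import csv
from statistics import median
from collections import Counter

def summarize_usage(path):
    """Print per-tool course counts, median class size, and a rough
    histogram of assessments per course from a usage export."""
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))
    for tool in ("Crowdmark", "Gradescope"):
        courses = [r for r in rows if r["tool"] == tool]
        if not courses:
            continue
        sizes = [int(r["enrollment"]) for r in courses]
        counts = [int(r["num_assessments"]) for r in courses]
        bins = Counter((n // 5) * 5 for n in counts)  # 5-wide buckets: 0-4, 5-9, ...
        print(f"{tool}: {len(courses)} courses, median class size {median(sizes)}, "
              f"{sum(counts)} total assessments")
        print("  assessments-per-course buckets:", dict(sorted(bins.items())))

summarize_usage("course_usage.csv")  # hypothetical file name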

Discussion:
Our analysis showed significant functional overlap between Crowdmark and Gradescope; either tool could be chosen with little to no impact on instructor capability. However, there are a few advantages to the way Crowdmark handles assignment tracking, submission, and grade syncing to D2L. In particular, Crowdmark already offers a fast QR-code method for matching every page of an in-person assessment to the appropriate enrolled student when scanning assessments in batches. We expect this feature to become a strong asset in the Fall 2021 semester as more classes return to campus. If we were to choose between Crowdmark and Gradescope for continued support, we would recommend Crowdmark. Gradescope is a competitive technology, but it is still developing and refining capabilities that are already available through Crowdmark or D2L. An instructor who needs to switch from Gradescope to Crowdmark should refer to the D2L self-enroll course “MSU Tools and Technologies” for detailed information and resources on using Crowdmark at MSU, and closely review Table 1 to understand the key differences they may encounter. The Assessment Services team and/or Instructional Technology & Development team in the IT department are also available for one-on-one consultation on either technology (request a consultation via the MSU Help Desk).
Authored by: Jennifer Wagner & Natalie Vandepol
Posted on: Center for Teaching and Learning Innovation
Wednesday, Jan 15, 2025
2025 National Day of Racial Healing (January 21)
2025's National Day of Racial Healing will occur on Tuesday, January 21. The American Association of Colleges and Universities has called on colleges and universities to "engage in activities, events, or strategies that promote healing, foster engagement around issues of racism, bias, inequity, and injustice, and build an equitable and just society where all individuals can thrive" as part of the National Day of Racial Healing. The National Day of Racial Healing is "a time to contemplate our shared values and create the blueprint together for #HowWeHeal from the effects of racism. Launched in 2017, it is an opportunity to bring [all] people together and inspire collective action to build common ground for a more just and equitable world." As part of this commitment, educators may consider engaging in events during the week, participating in the #HowWeHeal hashtag, and bringing awareness to the day with students. Some activities and events within the MSU and Lansing community include: 

Read the #HowWeHeal Conversation Guide
Watch the "Changing the Narrative" series
Attend MSU's 2025 MLK Student Symposium
Attend Lansing's Beloved Community Week
Attend MSU Libraries' Social Justice Film and Discussion on: Fannie Lou Hamer's America
Posted by: Bethany Meadows
Posted on: Educator Stories
Monday, Feb 15, 2021
Featured Educator: Kate Sonka
This week, we are featuring Kate Sonka, Assistant Director of Inclusion & Academic Technology in the College of Arts and Letters. Kate was recognized via iteach.msu.edu's Thank an Educator initiative! We encourage MSU community members to nominate high-impact Spartan educators (via our Thank an Educator form) regularly!
Read more about Kate's perspectives below. #iteachmsu's questions are bolded below, followed by Kate's responses!

In one word, what does being an educator mean to you? 
Connection
Share with me what this word/quality looks like in your practice? 
This looks like connections between educators and students, connections between learners and course content, connections among students, connections among faculty, and so forth. In that way, I continually keep these possibilities for connections and collaborations in my mind as I work to support teaching and learning in the College of Arts & Letters (CAL) and the broader MSU community. Some of this appears through faculty professional development opportunities I help create and facilitate and it also appears through the grad and undergrad courses I teach. And in a broader sense, it very much features in the work I do as Executive Director of Teach Access.
Have your ideas on this changed over time? If so, how? 
If anything, each day I find more and more expansive ways to connect people to ideas and to each other. I’m definitely a lifelong learner myself, so as I take in and learn new information or new pedagogies, I want to share those out.
Tell me more about your educational “setting.” This can include, but is not limited to, departmental affiliations, community connections, co-instructors, and students.
I’m situated in the CAL Dean’s office and I report to both the Assistant Dean for Academic and Research Technology AND the Associate Dean of Academic Personnel and Administration. Beyond that, I spend a lot of my time working with colleagues in a variety of colleges and units across MSU.
What is a challenge you experience in your educator role? 
A challenge I experience is one we all face – how do we meet students where they are and ensure we’re creating inclusive learning spaces for everyone in our class?
Any particular “solutions” or “best practices” you’ve found that help you support student success at the university despite/in the face of this? 
Making sure we take the time to actually listen to students. I always include surveys to collect anonymous feedback before the semester, midway through, and at the end, asking how inclusive (or not) I’ve been as an educator and for recommendations on how to help them meet their learning goals. And wherever I can, I try to incorporate that feedback while I still have students in the class, and/or use that feedback to improve the course the next time I teach it.
What are practices you utilize that help you feel successful as an educator? 
Certainly the student surveys I mentioned above help me understand if I’m being successful, but also any sort of additional feedback I can get from students or colleagues also helps.
What topics or ideas about teaching and learning would you like to see discussed on the iteach.msu.edu platform? Why do you think this conversation is needed at msu? 
I would love to see more conversations about how people are creating accessible learning environments and how considering students with disabilities improves their overall teaching practice. We’ve made some progress in this area since I’ve been at MSU, but the more we could share with each other, the more I think other educators would be energized to try in their own classes.
What are you looking forward to (or excited to be a part of) next semester? 
I’ve been doing more work with the CAL Inclusive Pedagogy Initiative, and we were just considering a two-part workshop series on topics of inclusion. Excited to see how this work expands in our college and beyond!
 
Don't forget to celebrate individuals you see making a difference in teaching, learning, or student success at MSU with #iteachmsu's Thank an Educator initiative. You might just see them appear in the next feature! Follow the MSU Hub Twitter account to see other great content from the #iteachmsu Commons as well as educators featured every week during #ThankfulThursdays.
 

 
 
Authored by: Kristen Surla