Active Learning Blog

Learning Fearlessly Part 2

Mar 10, 2015 8:00:00 AM by Colin Montpetit

Undeniably, research on the benefits of using student response systems (SRSs) has shown that their strategic use supports active learning approaches in a wide variety of learning environments. LectureTools and its various features facilitate anonymous student answers and enhanced communication with the professor. The types of questions an educator can use to engage students in classroom activities are not only consistent with this support; they also create many unique opportunities for student learning. Beyond the factors that contribute to the high participation rates in my classroom (roughly 90-99% on a per-class basis), low-stakes participation and the low cost of gaining access to LectureTools through institutional support are also highly important for engagement.

In my efforts to implement student-focused activities that foster engagement and, ultimately, learning, I have progressively taken advantage of the possibilities created by SRSs. Indeed, from my first use of hand-held clickers to today's cloud-based systems, I have found that solutions such as LectureTools have expanded those possibilities. LectureTools enables educators to ask questions in a variety of formats, including multiple-choice questions, short-answer questions, image quizzes, and ordered lists. This makes it easier for me to design class activities dedicated to addressing misconceptions and difficult concepts while gauging understanding and assessing prior learning. We all know the adage that practice makes perfect; engaging students in relevant ways thus becomes a meaningful opportunity to prepare them for summative evaluations.

In this second part of my blog series, I attempt to answer the question: does engagement translate into success? To answer it, I share below my observations of trends in class performance through the lens of final exam average scores, learning gains, and item analysis scores from a validated concept assessment test, in relation to my use of SRSs to enable my pedagogical approaches. Overall, these assessments are designed to measure eight prescribed course-level learning outcomes.

Class Performance on Final Exams

Every year since I implemented SRS-linked peer instruction approaches in my class:

  • The final exam average has increased steadily (from ~68% to ~75%), compared with ~65% in the years before I introduced SRSs in my classroom.
  • Concomitantly, the proportion of students in the A+, A, A-, B+, and B ranges of our letter grade system increased.
  • The proportion of students in the C+, C, D+, and D ranges decreased. Moreover, failure rates dropped from 5% of the class (approximately 25 students) to 1% (approximately 5 students; E and F ranges).
  • The most pronounced impact of these changes occurred last year, my first year using LectureTools in the classroom.
  • These same trends were also seen in the course final grade distribution.

Concept Inventories

Because exam questions and difficulty may differ from year to year, along with group abilities, and despite the best intentions in formulating thoughtful and useful questions to assess student learning, final exam scores may not necessarily serve as good indicators of class success. An alternative way to assess classroom performance is through concept inventories: tools designed to help educators evaluate students' understanding of a specific set of concepts and identify misconceptions. Unlike typical MCQ tests, both the questions and the response choices are the subject of extensive research designed to determine what a range of people think a particular question is asking and what the most common answers are. In their final form, the concept questions present correct answers alongside distractors, which are incorrect answers based on commonly held misconceptions.

If valid and reliable, concept inventory data can be used to measure student learning over the duration of a course and provide educators with data to evaluate the effectiveness of classroom interventions and, thus, learning. As a matter of habit in assessing teaching and learning, I administer a genetics concept inventory (Smith et al., 2008), which comprises a set of 25 multiple-choice questions designed to measure the aforementioned course learning outcomes, at the beginning of the course (pre-assessment) to get a baseline level of student understanding and again at the end (post-assessment).

Analyses of student performance on the concept inventory administered to my classes prior to and after my use of SRSs revealed the following:

  • Item difficulty and discrimination indices indicate that, as a class, students performed better on almost all questions in the post-assessment, and to a greater extent than in the years when SRSs were not used;
  • Normalized learning gains, calculated as (post score - pre score) / (100 - pre score), progressively increased (year 1 = 48%, year 2 = 53%, year 3 = 60%), compared with gains of 30-36% before I adopted SRS-linked peer instruction methods (see the sketch after this list).
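
For readers who want to see the arithmetic behind these measures, here is a minimal sketch in Python of the three statistics named in the list above. It assumes each item's responses are coded as 1 (correct) or 0 (incorrect); the function names and the example numbers are my own illustrations, not output from LectureTools or data from my course.

    # Item difficulty: the proportion of students who answered an item
    # correctly (higher values mean an easier item).
    def item_difficulty(responses):
        return sum(responses) / len(responses)

    # Upper-lower discrimination index: difficulty among the top ~27% of
    # students (ranked by total score) minus difficulty among the bottom ~27%.
    # A positive value means stronger students got the item right more often.
    def discrimination_index(responses, totals, fraction=0.27):
        ranked = [r for _, r in sorted(zip(totals, responses))]
        k = max(1, int(len(ranked) * fraction))
        return item_difficulty(ranked[-k:]) - item_difficulty(ranked[:k])

    # Normalized learning gain: the fraction of the available improvement
    # (from the pre-test score up to the maximum) that was actually realized.
    def normalized_gain(pre, post, max_score=100):
        return (post - pre) / (max_score - pre)

    # Example: a class averaging 40% on the pre-test and 64% on the post-test
    # realizes a gain of (64 - 40) / (100 - 40) = 0.40, i.e. 40%.
    print(normalized_gain(40, 64))  # prints 0.4

Read this way, a gain of 60%, as in year 3, means the class closed 60% of the distance between its pre-test average and a perfect score.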

In this blog, I make no claim that the data provide convincing evidence of a causal relationship between student engagement and success in the classroom. That said, the results presented above, obtained while using evidence-based, student-focused activities in my classroom, are consistent with investigations demonstrating that educational conditions and practices that foster student engagement contribute to student success. So, does student engagement using LectureTools translate into classroom success? I dare you to try it out and judge for yourself!

Miss Part 1 of this blog series? Read it here.

To learn more about Dr. Montpetit's success, view the Webinar archive:

VIEW THE WEBINAR ARCHIVE

Smith, M.K., Wood, W.B., and Knight, J.K. 2008. The Genetics Concept Assessment: A New Concept Inventory for Gauging Student Understanding of Genetics. CBE-Life Sciences Education 7: 422-430.