For many years now, highlights from individual research studies comparing the effects of various active-learning strategies with lecture approaches have appeared in The Teaching Professor. Consistently, the results have favored active learning. But beyond a couple of small integrative analyses, what we've had so far has arrived pretty much one study at a time. Now we have something significantly larger and more definitive.
A huge meta-analysis of the active learning versus lecture research done in the science, technology, engineering, and math (STEM) fields has been completed. Its authors describe their approach: “We compared the results of experiments that documented student performance in courses with at least some active learning versus traditional lecturing by meta-analyzing 225 studies in the published and unpublished literature.” (p. 8410) They started with 642 studies; after five predetermined criteria were applied (they’re explained in the article), 225 merited inclusion in the analysis. The active-learning interventions used in the individual studies varied widely, including group problem-solving, worksheets, and tutorials completed during class; clickers; and peer instruction, among others. The amount of class time devoted to the activities also ranged widely.
The analysis focused on two related questions: 1) Does active learning boost exam scores? and 2) Does active learning lower failure rates? The answer to both questions is an emphatic yes. Deborah Allen writes in another published summary of this research, “Major findings were that student performance on exams and other assessments (such as concept inventories) was nearly half a SD (standard deviation) higher in active-learning versus lecture courses, with an effect size (standardized mean weighted difference) of 0.47.” (p. 584) As for the failure rate, findings document that students in traditional lecture courses are 1.5 times more likely to fail than are students in the active-learning courses. “Average failure rates were 21.8% under active learning but 33.8% under traditional lectures—a difference that represents a 55% increase.” (p. 8410)
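For readers who like to check the arithmetic, the 1.5 figure and the 55% increase follow directly from the two reported failure rates:

\[
\frac{33.8\%}{21.8\%} \approx 1.55
\]

That is, the failure rate under traditional lecturing is about one and a half times the rate under active learning, a roughly 55% relative increase.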
As if those numbers aren’t convincing enough, the study’s authors provide additional context that makes them especially significant. They report that there were 29,300 students in the 67 lecture courses analyzed in this meta-analysis. Given the failure-rate data above, that means 3,516 of these students would not have failed had they been in an active-learning course.
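The 3,516 figure is a simple back-of-the-envelope calculation from the numbers already given: apply the 12-percentage-point gap in failure rates to the 29,300 students in the lecture courses.

\[
29{,}300 \times (0.338 - 0.218) = 29{,}300 \times 0.120 = 3{,}516
\]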
The findings were amazingly consistent across these individual studies. There were no statistically significant differences with respect to disciplines. It didn’t matter if the courses were for majors or nonmajors, lower or upper division. Although active learning had the greatest positive effect in smaller classes, the effect was positive regardless of class size.
The research team concludes, “Although traditional lecturing has dominated undergraduate instruction for most of a millennium and continues to have strong advocates, current evidence suggests that a constructivist ‘ask don’t tell’ approach may lead to strong increases in student performance.” (p. 8413)
Carl Wieman, a Nobel Prize winner in physics and now an educational research scholar, writes in a commentary on this research, “The implications of these meta-analysis results for instruction are profound, assuming they are indicative of what could be obtained if active-learning methods replaced the lecture instruction that dominates U.S. postsecondary STEM instruction.” (pp. 8319-8320) He continues, “This meta-analysis makes a powerful case that any college or university that is teaching its STEM courses by traditional lectures is providing an inferior education to its students.” (p. 8320)
The research team echoes these strong words by noting that if the experiments analyzed here had been randomized controlled trials of a medical intervention, they very well might have been stopped early so that patients in the control condition could receive the clearly more beneficial treatment. They also argue that there is no longer any need for research that compares active learning and lectures; the lecture has been shown to be substantially less effective. Wieman elaborates, “If a new antibiotic is being tested for effectiveness, its effectiveness at curing patients is compared with the best current antibiotics and not with treatment by blood-letting.” (p. 8320)
Do note that this meta-analysis looked at research comparing active learning and lecture in the STEM fields. That’s where the bulk of the empirical work is presently occurring. It does, however, leave open the question of whether active learning has the same effects in fields on the other side of the academic house. Individual studies indicate that it does, but a quantitative analysis like this one has not yet been undertaken in those fields.
A couple of notes on the references: The meta-analysis itself is not long, but the description of the statistical methods is detailed and difficult for those without a statistical background. The findings, however, are outlined and discussed clearly. In addition, Allen has written a short, understandable summary of the research and its findings, and it appears in an open-access journal. It’s a quick and easy way to learn more about this very important work. Wieman’s commentary is also worth reading. It too offers a clear summary, with more accessible graphics, plus insightful and pointed comments. If you ever need to make the case for active learning or have colleagues who are still unconvinced, here’s the compelling evidence. Active learning wins!
References: Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., and Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences (PNAS), 111 (23), 8410-8415.
Allen, D. (2014). Recent research in science teaching and learning. CBE—Life Sciences Education, 13 (Winter), 584-585.
Wieman, C. E. (2014). Large-scale comparison of science teaching methods sends clear message. Proceedings of the National Academy of Sciences (PNAS), 111 (23), 8319-8320.