Collaborative Active Learning, pp. 33–52

Active Learning: An Integrative Review

  • Gillian Kidman &
  • Minh Nguyet Nguyen
  • First Online: 10 December 2022

This chapter provides an integrative review of active learning in higher education over the past two decades. The research during this time was scrutinised, revealing four methodological approaches: students’ behaviour and how they engage in their studies; the activities, tasks, and strategies developed and used to generate, nurture, and promote active learning; the theoretical approach to active learning; and the impact of the physical learning environment. Creating an Active Learning Framework of student engagement facilitated a deeper analysis of the 30 articles against four engagement indicators (reflective and integrative learning, learning strategies, quantitative reasoning, and collaborative learning). An adjacency analysis of the methodological approaches and engagement indicators revealed that active learning is a maturing research area in which the areas already researched show high-influence relationships. However, the research focuses on the student undertaking active learning and the materials used. Devoid of research attention are the lecturer/tutor, their identity as a facilitator of active learning, and their role as a learner in this area.

References

* Denotes that the paper is included in the integrative review

* Beckerson, W. C., Anderson, J. O., Perpich, J. D., & Yoder-Himes, D. (2020). An introvert’s perspective: Analyzing the impact of active learning on multiple levels of class social personalities in an upper level biology course. Journal of College Science Teaching, 49 (3), 47–57.

* Brewe, E., Dou, R., & Shand, R. (2018). Costs of success: Financial implications of implementation of active learning in introductory physics courses for students and administrators. Physical Review. Physics Education Research, 14 (1), 010109. https://doi.org/10.1103/PhysRevPhysEducRes.14.010109

* Bucklin, B. A., Asdigian, N. L., Hawkins, J. L., & Klein, U. (2021). Making it stick: Use of active learning strategies in continuing medical education. BMC Medical Education, 21 (1), 44. https://doi.org/10.1186/s12909-020-02447-0

* Chan, K., Cheung, G., Wan, K., Brown, I., & Luk, G. (2015). Synthesizing technology adoption and learners’ approaches towards active learning in higher education. Electronic Journal of e-Learning, 13 (6), 431–440.

* Cooper, K. M., Downing, V. R., & Brownell, S. E. (2018). The influence of active learning practices on student anxiety in large-enrollment college science classrooms. International Journal of STEM Education, 5 (1), 1–18. https://doi.org/10.1186/s40594-018-0123-6

* Damaskou, E., & Petratos, P. (2018). Management strategies for active learning in AACSB accredited STEM discipline of CIS: Evidence from traditional and novel didactic methods in higher education. International Journal for Business Education, 158, 41–56.

* Daouk, Z., Bahous, R., & Bacha, N. N. (2016). Perceptions on the effectiveness of active learning strategies. Journal of Applied Research in Higher Education, 8 (3), 360–375. https://doi.org/10.1108/JARHE-05-2015-0037

* Das Neves, R. M., Lima, R. M., & Mesquita, D. (2021). Teacher competences for active learning in engineering education. Sustainability (Basel, Switzerland), 13 (16), 9231. https://doi.org/10.3390/su13169231

* Fields, L., Trostian, B., Moroney, T., & Dean, B. A. (2021). Active learning pedagogy transformation: A whole-of-school approach to person-centred teaching and nursing graduates. Nurse Education in Practice, 53, 103051. https://doi.org/10.1016/j.nepr.2021.103051

* Gahl, M. K., Gale, A., Kaestner, A., Yoshina, A., Paglione, E., & Bergman, G. (2021). Perspectives on facilitating dynamic ecology courses online using active learning. Ecology and Evolution, 11 (8), 3473–3480. https://doi.org/10.1002/ece3.6953

* Ghilay, Y., & Ghilay, R. (2015). TBAL: Technology-based active learning in higher education. Journal of Education and Learning, 4 (4). https://doi.org/10.5539/jel.v4n4p10

* Grossman, G. D., & Simon, T. N. (2020). Student perceptions of open educational resources video-based active learning in university-level biology classes: A multi-class evaluation. Journal of College Science Teaching, 49 (6), 36–44.

* Hartikainen, S., Rintala, H., Pylväs, L., & Nokelainen, P. (2019). The concept of active learning and the measurement of learning outcomes: A review of research in engineering higher education. Education Sciences, 9 (4), 276. https://doi.org/10.3390/educsci9040276

* Holec, V., & Marynowski, R. (2020). Does it matter where you teach? Insights from a quasi-experimental study of student engagement in an active learning classroom. Teaching and Learning Inquiry, 8 (2), 140–163. https://doi.org/10.20343/teachlearninqu.8.2.10

* Hyun, J., Ediger, R., & Lee, D. (2017). Students’ satisfaction on their learning process in active learning and traditional classrooms. International Journal of Teaching and Learning in Higher Education, 29 (1), 108–118.

* Ito, H., & Kawazoe, N. (2015). Active learning for creating innovators: Employability skills beyond industrial needs. International Journal of Higher Education, 4 (2). https://doi.org/10.5430/ijhe.v4n2p81

* Kressler, B., & Kressler, J. (2020). Diverse student perceptions of active learning in a large enrollment STEM course. Journal of the Scholarship of Teaching and Learning, 20 (1). https://doi.org/10.14434/josotl.v20i1.24688

* Lim, J., Ko, H., Yang, J. W., Kim, S., Lee, S., Chun, M.-S., Ihm, J., & Park, J. (2019). Active learning through discussion: ICAP framework for education in health professions. BMC Medical Education, 19 (1), 477. https://doi.org/10.1186/s12909-019-1901-7

* Linsey, J., Talley, A., White, C., Jensen, D., & Wood, K. (2009). From tootsie rolls to broken bones: An innovative approach for active learning in mechanics of materials. Advances in Engineering Education, 1 (3), 1–22.

* Ludwig, J. (2021). An experiment in active learning: The effects of teams. International Journal of Educational Methodology, 7 (2), 353–360. https://doi.org/10.12973/ijem.7.2.353

* MacVaugh, J., & Norton, M. (2012). Introducing sustainability into business education contexts using active learning. International Journal of Sustainability in Higher Education, 13 (1), 72–87. https://doi.org/10.1108/14676371211190326

* Mangram, J. A., Haddix, M., Ochanji, M. K., & Masingila, J. (2015). Active learning strategies for complementing the lecture teaching methods in large classes in higher education. Journal of Instructional Research, 4 , 57–68.

* Pundak, D., Herscovitz, O., & Shacham, M. (2010). Attitudes of face-to-face and e-learning instructors toward “Active Learning”. The European Journal of Open, Distance and E-Learning, 13.

* Rose, S., Hamill, R., Caruso, A., & Appelbaum, N. P. (2021). Applying the plan-do-study-act cycle in medical education to refine an antibiotics therapy active learning session. BMC Medical Education, 21 (1), 459. https://doi.org/10.1186/s12909-021-02886-3

* Stewart, D. W., Brown, S. D., Clavier, C. W., & Wyatt, J. (2011). Active-learning processes used in US pharmacy education. American Journal of Pharmaceutical Education, 75 (4), 68. https://doi.org/10.5688/ajpe75468

* Tirado-Olivares, S., Cózar-Gutiérrez, R., García-Olivares, R., & González-Calero, J. A. (2021). Active learning in history teaching in higher education: The effect of inquiry-based learning and a student response system-based formative assessment in teacher training. Australasian Journal of Educational Technology, 37 (5), 61–76. https://doi.org/10.14742/ajet.7087

* Torres, V., Sampaio, C. A., & Caldeira, A. P. (2019). Incoming medical students and their perception on the transition towards an active learning. Interface (Botucatu, Brazil), 23. https://doi.org/10.1590/Interface.170471

* Van Amburgh, J. A., Devlin, J. W., Kirwin, J. L., & Qualters, D. M. (2007). A tool for measuring active learning in the classroom. American Journal of Pharmaceutical Education, 71 (5), 85. https://doi.org/10.5688/aj710585

* Walters, K. (2014). Sharing classroom research and the scholarship of teaching: Student resistance to active learning may not be as pervasive as is commonly believed. Nursing Education Perspectives, 35 (5), 342–343. https://doi.org/10.5480/11-691.1

Braun, V., & Clarke, V. (2013). Successful qualitative research: A practical guide for beginners . SAGE.

Bonwell, C., & Eison, J. (1991). Active learning: Creating excitement in the classroom (ASHE-ERIC Higher Education Report No. 1). Jossey-Bass.

Cooper, H. M. (1984). The integrative research review: A systematic approach . SAGE Publications.

Coughlan, M., & Cronin, P. (2017). Doing a literature review in nursing, health and social care (2nd ed.). SAGE Publications.

Denney, A. S., & Tewksbury, R. (2013). How to write a literature review. Journal of Criminal Justice Education, 24 (2), 218–234. https://doi.org/10.1080/10511253.2012.730617

Freeman, S., Eddy, S., McDonough, M., Smith, K., Okoroafor, N., Jordt, H., & Wenderoth, M. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences of the United States of America, 111 (23), 8410–8415.

Hake, R. R. (1998). Interactive-engagement vs. traditional methods: A six-thousand-student survey of mechanics test data for introductory physics courses. American Journal of Physics, 66 (1), 64–74. https://doi.org/10.1119/1.18809

Keathley-Herring, H., Van Aken, E., Gonzalez-Aleu, F., et al. (2016). Assessing the maturity of a research area: Bibliometric review and proposed framework. Scientometrics, 109 , 927–951. https://doi.org/10.1007/s11192-016-2096-x

Landscape design validation. (2009, July 26). Before drawing look at adjacency [Blog post]. https://ldvalidate.wordpress.com/2009/07/26/before-drawing-look-at-adjacency/

National Survey of Student Engagement (NSSE). (2020). Engagement indicators & high-impact practices . https://nsse.indiana.edu/nsse/survey-instruments/engagement-indicators.html

Russell, C. L. (2005). An overview of the integrative research review. Progress in Transplantation, 15 (1), 8–13. https://doi.org/10.7182/prtr.15.1.0n13660r26g725kj

Toronto, C. E. (2020). Overview of the integrative review. In C. E. Toronto & R. Remington (Eds.), A step-by-step guide to conducting an integrative review (1st ed.). Springer.

Author information

Authors and Affiliations

School of Curriculum, Teaching and Inclusive Education, Monash University, Melbourne, Australia

Gillian Kidman & Minh Nguyet Nguyen

Corresponding author

Correspondence to Gillian Kidman.

Editor information

Editors and Affiliations

Monash University Malaysia, Subang Jaya, Malaysia

Chan Chang-Tik

Monash University, Clayton, VIC, Australia

Gillian Kidman

Universiti Malaya, Kuala Lumpur, Malaysia

Meng Yew Tee

Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this chapter

Cite this chapter

Kidman, G., Nguyen, M.N. (2022). Active Learning: An Integrative Review. In: Chang-Tik, C., Kidman, G., Tee, M.Y. (eds) Collaborative Active Learning. Palgrave Macmillan, Singapore. https://doi.org/10.1007/978-981-19-4383-6_2

DOI: https://doi.org/10.1007/978-981-19-4383-6_2

Published: 10 December 2022

Publisher Name: Palgrave Macmillan, Singapore

Print ISBN: 978-981-19-4382-9

Online ISBN: 978-981-19-4383-6

eBook Packages: Education (R0)

Open access | Published: 15 March 2021

Instructor strategies to aid implementation of active learning: a systematic literature review

  • Kevin A. Nguyen,
  • Maura Borrego,
  • Cynthia J. Finelli (ORCID: orcid.org/0000-0001-9148-1492),
  • Matt DeMonbrun,
  • Caroline Crockett,
  • Sneha Tharayil,
  • Prateek Shekhar,
  • Cynthia Waters &
  • Robyn Rosenberg

International Journal of STEM Education, volume 8, Article number: 9 (2021)

Despite the evidence supporting the effectiveness of active learning in undergraduate STEM courses, the adoption of active learning has been slow. One barrier to adoption is instructors’ concerns about students’ affective and behavioral responses to active learning, especially student resistance. Numerous education researchers have documented their use of active learning in STEM classrooms. However, no research yet systematically analyzes these studies for strategies to aid implementation of active learning and address students’ affective and behavioral responses. In this paper, we conduct a systematic literature review and identify 29 journal articles and conference papers that researched active learning and affective and behavioral student responses, and recommended at least one strategy for implementing active learning. We ask: (1) What are the characteristics of studies that examine affective and behavioral outcomes of active learning and provide instructor strategies? (2) What instructor strategies to aid implementation of active learning do the authors of these studies provide?

In our review, we noted that most active learning activities involved in-class problem solving within a traditional lecture-based course (N = 21). We found mostly positive affective and behavioral outcomes for students’ self-reports of learning, participation in the activities, and course satisfaction (N = 23). From our analysis of the 29 studies, we identified eight strategies to aid implementation of active learning, grouped into three categories. Explanation strategies included providing students with clarifications and reasons for using active learning. Facilitation strategies entailed working with students and ensuring that the activity functions as intended. Planning strategies involved working outside of class to improve the active learning experience.

To increase the adoption of active learning and address students’ responses to active learning, this study provides strategies to support instructors. The eight strategies are listed with evidence from numerous studies within our review on affective and behavioral responses to active learning. Future work should examine instructor strategies and their connection with other affective outcomes, such as identity, interests, and emotions.

Introduction

Prior reviews have established the effectiveness of active learning in undergraduate science, technology, engineering, and math (STEM) courses (e.g., Freeman et al., 2014; Lund & Stains, 2015; Theobald et al., 2020). In this review, we define active learning as classroom-based activities designed to engage students in their learning through answering questions, solving problems, discussing content, or teaching others, individually or in groups (Prince & Felder, 2007; Smith, Sheppard, Johnson, & Johnson, 2005), and this definition is inclusive of research-based instructional strategies (RBIS, e.g., Dancy, Henderson, & Turpen, 2016) and evidence-based instructional practices (EBIPs, e.g., Stains & Vickrey, 2017). Past studies show that students perceive active learning as benefitting their learning (Machemer & Crawford, 2007; Patrick, Howell, & Wischusen, 2016) and increasing their self-efficacy (Stump, Husman, & Corby, 2014). Furthermore, the use of active learning in STEM fields has been linked to improvements in student retention and learning, particularly among students from some underrepresented groups (Chi & Wylie, 2014; Freeman et al., 2014; Prince, 2004).

Despite the overwhelming evidence in support of active learning (e.g., Freeman et al., 2014), prior research has found that traditional teaching methods such as lecturing are still the dominant mode of instruction in undergraduate STEM courses, and low adoption rates of active learning in undergraduate STEM courses remain a problem (Hora & Ferrare, 2013; Stains et al., 2018). There are several reasons for these low adoption rates. Some instructors feel unconvinced that the effort required to implement active learning is worthwhile, and as many as 75% of instructors who have attempted specific types of active learning abandon the practice altogether (Froyd, Borrego, Cutler, Henderson, & Prince, 2013).

When asked directly about the barriers to adopting active learning, instructors cite a common set of concerns including the lack of preparation or class time (Finelli, Daly, & Richardson, 2014; Froyd et al., 2013; Henderson & Dancy, 2007). Among these concerns, student resistance to active learning is a potential explanation for the low rates of instructor persistence with active learning, and this negative response to active learning has gained increased attention from the academic community (e.g., Owens et al., 2020). Of course, students can exhibit both positive and negative responses to active learning (Carlson & Winquist, 2011; Henderson, Khan, & Dancy, 2018; Oakley, Hanna, Kuzmyn, & Felder, 2007), but due to the barrier student resistance can present to instructors, we focus here on negative student responses. Student resistance to active learning may manifest, for example, as lack of student participation and engagement with in-class activities, declining attendance, or poor course evaluations and enrollments (Tolman, Kremling, & Tagg, 2016; Winkler & Rybnikova, 2019).

We define student resistance to active learning (SRAL) as a negative affective or behavioral student response to active learning (DeMonbrun et al., 2017; Weimer, 2002; Winkler & Rybnikova, 2019). The affective domain, as it relates to active learning, encompasses not only student satisfaction and perceptions of learning but also motivation-related constructs such as value, self-efficacy, and belonging. The behavioral domain relates to participation, putting forth a good effort, and attending class. The affective and behavioral domains differ from much of the prior research on active learning, which centers on measuring cognitive gains in student learning; systematic reviews are readily available on that topic (e.g., Freeman et al., 2014; Theobald et al., 2020). Schmidt, Rosenberg, and Beymer (2018) explain the relationship between affective, cognitive, and behavioral domains, asserting all three types of engagement are necessary for science learning, and conclude that “students are unlikely to exert a high degree of behavioral engagement during science learning tasks if they do not also engage deeply with the content affectively and cognitively” (p. 35). Thus, SRAL and negative affective and behavioral student response is a critical but underexplored component of STEM learning.

Recent research on student affective and behavioral responses to active learning has uncovered mechanisms of student resistance. Deslauriers, McCarty, Miller, Callaghan, and Kestin’s (2019) interviews of physics students revealed that the additional effort required by the novel format of an interactive lecture was the primary source of student resistance. Owens et al. (2020) identified a similar source of student resistance to their carefully designed biology active learning intervention: students were concerned about the additional effort required and the unfamiliar student-centered format. Deslauriers et al. (2019) and Owens et al. (2020) go a step further in citing self-efficacy (Bandura, 1982), mindset (Dweck & Leggett, 1988), and student engagement (Kuh, 2005) literature to explain student resistance. Similarly, Shekhar et al.’s (2020) review framed negative student responses to active learning in terms of expectancy-value theory (Wigfield & Eccles, 2000); students reacted negatively when they did not find active learning useful or worth the time and effort, or when they did not feel competent enough to complete the activities. Shekhar et al. (2020) also applied expectancy violation theory from physics education research (Gaffney, Gaffney, & Beichner, 2010) to explain how students’ initial expectations of a traditional course produced discomfort during active learning activities. To address both theories of student resistance, Shekhar et al. (2020) suggested that instructors provide scaffolding (Vygotsky, 1978) and support for self-directed learning activities. So, while framing the research as SRAL is relatively new, ideas about working with students to actively engage them in their learning are not. Prior literature on active learning in STEM undergraduate settings includes clues and evidence about strategies instructors can employ to reduce SRAL, even if they are not necessarily framed by the authors as such.

Recent interest in student affective and behavioral responses to active learning, including SRAL, is a relatively new development. But, given the discipline-based education research (DBER) knowledge base around RBIS and EBIP adoption, we need not reinvent the wheel. In this paper, we conduct a systematic review. Systematic reviews are designed to methodically gather and synthesize results from multiple studies to provide a clear overview of a topic, presenting what is known and what is not known (Borrego, Foster, & Froyd, 2014). Such clarity informs decisions when designing or funding future research, interventions, and programs. Relevant studies for this paper are scattered across STEM disciplines and across DBER and general education venues, which include journals and conference proceedings. Quantitative, qualitative, and mixed methods approaches have been used to understand student affective and behavioral responses to active learning. Thus, a systematic review is appropriate for this topic given the long history of research on the development of RBIS, EBIPs, and active learning in STEM education; the distribution of primary studies across fields and formats; and the different methods taken to evaluate students’ affective and behavioral responses.

Specifically, we conducted a systematic review to address two interrelated research questions: (1) What are the characteristics of studies that examine affective and behavioral outcomes of active learning and provide instructor strategies? (2) What instructor strategies to aid implementation of active learning do the authors of these studies provide? These two questions are linked by our goal of sharing instructor strategies that can either reduce SRAL or encourage positive student affective and behavioral responses. Therefore, the instructor strategies in this review are drawn only from studies that present empirical data on affective and behavioral student responses to active learning. The strategies we identify in this review will not be surprising to highly experienced teaching and learning practitioners or researchers. However, this review does provide an important link between these strategies and student resistance, which remains one of the most feared barriers to instructor adoption of RBIS, EBIPs, and other forms of active learning.

Conceptual framework: instructor strategies to reduce resistance

Recent research has identified specific instructor strategies that correlate with reduced SRAL and positive student response in undergraduate STEM education (Finelli et al., 2018; Nguyen et al., 2017; Tharayil et al., 2018). For example, Deslauriers et al. (2019) suggested that physics students perceive the additional effort required by active learning to be evidence of less effective learning. To address this, the authors included a 20-min lecture about active learning in a subsequent course offering. By the end of that course, 65% of students reported increased enthusiasm for active learning, and 75% said the lecture intervention positively impacted their attitudes toward active learning. Explaining how active learning activities contribute to student learning is just one of many strategies instructors can employ to reduce SRAL (Tharayil et al., 2018).

DeMonbrun et al. (2017) provided a conceptual framework for differentiating instructor strategies, which includes not only an explanation type of instructor strategy (e.g., Deslauriers et al., 2019; Tharayil et al., 2018) but also a facilitation type. Explanation strategies involve describing the purpose (such as how the activity relates to students’ learning) and expectations of the activity to students. Typically, instructors use explanation strategies before the in-class activity has begun. Facilitation strategies include promoting engagement and keeping the activity running smoothly once the activity has already begun; some specific strategies include walking around the classroom or directly encouraging students. We use the existing categories of explanation and facilitation as a conceptual framework to guide our analysis and systematic review.

As a conceptual framework, explanation and facilitation strategies describe ways to aid the implementation of RBIS, EBIPs, and other types of active learning. In fact, the work on these types of instructor strategies is related to higher education faculty development, implementation, and institutional change research perspectives (e.g., Borrego, Cutler, Prince, Henderson, & Froyd, 2013; Henderson, Beach, & Finkelstein, 2011; Kezar, Gehrke, & Elrod, 2015). As such, the specific types of strategies reviewed here are geared to assist instructors in moving toward more student-centered teaching methods by addressing their concerns about student resistance.

SRAL is a particular negative form of affective or behavioral student response (DeMonbrun et al., 2017; Weimer, 2002; Winkler & Rybnikova, 2019). Affective and behavioral student responses are conceptualized at the reactionary level (Kirkpatrick, 1976) of outcomes, which consists of how students feel (affective) and how they conduct themselves within the course (behavioral). Although affective and behavioral student responses to active learning are less frequently reported than cognitive outcomes, prior research suggests a few conceptual constructs within these outcomes.

Affective outcomes consist of students’ feelings, preferences, and satisfaction with the course. Affective outcomes also include students’ self-reports of whether they thought they learned more (or less) during active learning instruction. Some relevant affective outcomes include students’ perceived value or utility of active learning (Shekhar et al., 2020; Wigfield & Eccles, 2000), their positivity toward or enjoyment of the activities (DeMonbrun et al., 2017; Finelli et al., 2018), and their self-efficacy or confidence with doing the in-class activity (Bandura, 1982).

In contrast, students’ behavioral responses to active learning consist of their actions and practices during active learning. This includes students’ attendance in the class; their participation, engagement, and effort with the activity; and students’ distraction or off-task behavior (e.g., checking their phones, leaving to use the restroom) during the activity (DeMonbrun et al., 2017; Finelli et al., 2018; Winkler & Rybnikova, 2019).

We conceptualize negative or low scores in either affective or behavioral student outcomes as an indicator of SRAL (DeMonbrun et al., 2017; Nguyen et al., 2017). For example, a low score in reported course satisfaction would be an example of SRAL. This paper aims to synthesize instructor strategies to aid implementation of active learning from studies that either address SRAL and its negative or low scores or relate instructor strategies to positive or high scores. Therefore, we also conceptualize positive student affective and behavioral outcomes as the absence of SRAL. For easy categorization in this review, then, we summarize each study’s affective and behavioral outcomes on active learning as positive, mostly positive, mixed/neutral, mostly negative, or negative.

We conducted a systematic literature review (Borrego et al., 2014; Gough, Oliver, & Thomas, 2017; Petticrew & Roberts, 2006) to identify primary research studies that describe active learning interventions in undergraduate STEM courses, recommend one or more strategies to aid implementation of active learning, and report student response outcomes to active learning.

A systematic review was warranted due to the popularity of active learning and the publication of numerous papers on the topic. Multiple STEM disciplines and research audiences have published journal articles and conference papers on the topic of active learning in the undergraduate STEM classroom. However, it was not immediately clear which studies addressed active learning, affective and behavioral student responses, and strategies to aid implementation of active learning. We used the systematic review process to efficiently gather results of multiple types of studies and create a clear overview of our topic.

Definitions

For clarity, we define several terms in this review. Researchers refer to us, the authors of this manuscript. Authors and instructors wrote the primary studies we reviewed, and we refer to these primary studies as “studies” consistently throughout. We use the term activity or activities to refer to the specific in-class active learning tasks assigned to students. Strategies refer to the instructor strategies used to aid implementation of active learning and address student resistance to active learning (SRAL). Student response includes affective and behavioral responses and outcomes related to active learning. SRAL is an acronym for student resistance to active learning, defined here as a negative affective or behavioral student response. Categories or category refer to a grouping of strategies to aid implementation of active learning, such as explanation or facilitation. Excerpts are quotes from studies, and these excerpts are used as codes and examples of specific strategies.

Study timeline, data collection, and sample selection

From 2015 to 2016, we worked with a research librarian to locate relevant studies and conduct a keyword search within six databases: two multidisciplinary databases (Web of Science and Academic Search Complete), two major engineering and technology indexes (Compendex and Inspec), and two popular education databases (Education Source and Education Resource Information Center). We created inclusion criteria that listed both search strings and study requirements (a sketch of how the two search strings combine appears after the list):

1. Studies must include an in-class active learning intervention. This does not include laboratory classes. The corresponding search string was:

   "active learning" or "peer-to-peer" or "small group work" or "problem based learning" or "problem-based learning" or "problem-oriented learning" or "project-based learning" or "project based learning" or "peer instruction" or "inquiry learning" or "cooperative learning" or "collaborative learning" or "student response system" or "personal response system" or "just-in-time teaching" or "just in time teaching" or clickers

2. Studies must include empirical evidence addressing student response to the active learning intervention. The corresponding search string was:

   "affective outcome" or "affective response" or "class evaluation" or "course evaluation" or "student attitudes" or "student behaviors" or "student evaluation" or "student feedback" or "student perception" or "student resistance" or "student response"

3. Studies must describe a STEM course, as defined by the topic of the course, rather than by the department of the course or the major of the students enrolled (e.g., a business class for mathematics majors would not be included, but a mathematics class for business majors would).

4. Studies must be conducted in undergraduate courses and must not include K-12, vocational, or graduate education.

5. Studies must be in English and published between 1990 and 2015 as journal articles or conference papers.
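As promised above, here is a minimal Python sketch of how the two search strings might be assembled into a single boolean query. The `or_group` helper and the ANDed form are our illustration, not the review team's actual tooling; each database applies its own query syntax.

```python
# Illustrative only: composing the boolean search strings for the database
# queries. A study must match at least one term from each list (criterion 1
# AND criterion 2).

ACTIVE_LEARNING_TERMS = [
    '"active learning"', '"peer-to-peer"', '"small group work"',
    '"problem based learning"', '"problem-based learning"',
    '"problem-oriented learning"', '"project-based learning"',
    '"project based learning"', '"peer instruction"', '"inquiry learning"',
    '"cooperative learning"', '"collaborative learning"',
    '"student response system"', '"personal response system"',
    '"just-in-time teaching"', '"just in time teaching"', 'clickers',
]

STUDENT_RESPONSE_TERMS = [
    '"affective outcome"', '"affective response"', '"class evaluation"',
    '"course evaluation"', '"student attitudes"', '"student behaviors"',
    '"student evaluation"', '"student feedback"', '"student perception"',
    '"student resistance"', '"student response"',
]

def or_group(terms):
    """Join search terms into a single parenthesized OR group."""
    return "(" + " OR ".join(terms) + ")"

query = f"{or_group(ACTIVE_LEARNING_TERMS)} AND {or_group(STUDENT_RESPONSE_TERMS)}"
print(query)
```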

In addition to searching the six databases, we emailed solicitations to U.S. National Science Foundation Improving Undergraduate STEM Education (NSF IUSE) grantees. Between the database searches and email solicitation, we identified 2364 studies after removing duplicates. Most studies came from the database search; we received just 92 studies from the email solicitation (Fig. 1).

[Fig. 1: PRISMA screening overview styled after Liberati et al. (2009) and Passow and Passow (2017)]

Next, we followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines for screening studies with our inclusion criteria (Borrego et al., 2014; Petticrew & Roberts, 2006). From 2016 to 2018, a team of seven researchers conducted two rounds of review in RefWorks: the first round with only titles and abstracts, and the second round with the entire full text. In both rounds, two researchers independently decided whether each study should be retained based on our inclusion criteria listed above. At the abstract review stage, if there was a disagreement between independent coders, we passed the study on to the full-text screening round. We screened a total of 2364 abstracts, and only 746 studies passed the first round of title and abstract verification (see the PRISMA flow chart in Fig. 1). If there was still a disagreement between independent coders at the full-text screening round, then the seven researchers met and discussed the study, clarified the inclusion criteria as needed to resolve potential future disagreements, and, when necessary, took a majority vote (4 out of the 7 researchers) on the inclusion of the study. Due to the high number of coders, it was unusual to reach full consensus with all 7 coders, so a majority vote was used to finalize the inclusion of certain studies. We resolved these disagreements on a rolling basis, and depending on the round (abstract or full text), we disagreed about 10–15% of the time on the inclusion of a study. In both the first and second rounds of screening, studies were often excluded because they did not gather novel empirical data or evidence (inclusion criterion #2) or were not in an undergraduate STEM course (inclusion criteria #3 and #4). Only 412 studies met all our final inclusion criteria.
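The two decision rules just described can be summarized in a short sketch. This is a hypothetical illustration (the `Screening` record and function names are ours; the team's actual workflow ran in RefWorks), but it captures the pass-forward rule at the abstract stage and the 4-of-7 majority vote at the full-text stage:

```python
# Illustrative sketch of the screening decision rules described above;
# the Screening record and function names are hypothetical.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Screening:
    coder_a: bool                             # first independent coder: include?
    coder_b: bool                             # second independent coder: include?
    team_votes: Optional[List[bool]] = None   # 7 votes, cast only on disagreement

def round1_include(s: Screening) -> bool:
    """Title/abstract round: any disagreement passes the study forward."""
    return s.coder_a or s.coder_b

def round2_include(s: Screening) -> bool:
    """Full-text round: disagreements go to a majority vote (4 of 7)."""
    if s.coder_a == s.coder_b:
        return s.coder_a
    assert s.team_votes is not None and len(s.team_votes) == 7
    return sum(s.team_votes) >= 4

# Example: coders disagree at full text; 5 of 7 researchers vote to include.
print(round2_include(Screening(True, False, [True] * 5 + [False] * 2)))  # True
```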

Coding procedure

From 2017 to 2018, a team of five researchers then coded these 412 studies for detailed information. To quickly gather information about all 412 studies and to answer our first research question (What are the characteristics of studies that examine affective and behavioral outcomes of active learning and provide instructor strategies?), we developed an online coding form using Google Forms and Google Sheets. The five researchers piloted and refined the coding form over three rounds of pair coding, and 19 studies were used to test and revise early versions of the coding form. The final coding form (Borrego et al., 2018) used a mix of multiple choice and free response items regarding study characteristics (bibliographic information, type of publication, location of study), course characteristics (discipline, course level, number of students sampled, and type of active learning), methodology (main type of evidence collected, sample size, and analysis methods), study findings (types of student responses and outcomes), and strategies reported (whether the study explicitly mentioned using strategies to aid implementation of active learning).
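As an illustration of the information captured per study, the coding form's fields could be mirrored in a record type like the following. The field names are our paraphrase of the categories above, not the form's actual wording (the form itself was a Google Form, not code):

```python
# Hypothetical record mirroring the coding form's fields; names are
# illustrative paraphrases, not the form's actual items.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class StudyRecord:
    # Study characteristics
    citation: str
    publication_type: str                  # "journal article" or "conference paper"
    study_location: str
    # Course characteristics
    discipline: str
    course_year: Optional[int]             # 1-4, or None if not reported
    students_sampled: Optional[int]
    active_learning_types: List[str] = field(default_factory=list)
    # Methodology
    evidence_types: List[str] = field(default_factory=list)   # surveys, interviews, ...
    sample_size: Optional[int] = None
    analysis_methods: List[str] = field(default_factory=list)
    # Findings and strategies
    outcome_summary: str = "mixed/neutral"  # positive ... negative scale
    strategies_reported: bool = False       # gate for the final 29-study dataset
```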

In the end, only 29 studies explicitly described strategies to aid implementation of active learning (Fig. 1), and we used these 29 studies as the dataset for this study. The main difference between these 29 studies and the other 383 studies was that these 29 studies explicitly described the ways authors implemented active learning in their courses to address SRAL or promote positive student outcomes. Although some readers who are experienced active learning instructors or educational researchers may view pedagogies and strategies as integrated, we found that most papers described active learning methods in terms of student tasks, while advice on strategies, if included, tended to appear separately. We chose not to overinterpret passing mentions of how active learning was implemented as strategies recommended by the authors.

Analysis procedure for coding strategies

To answer our second research question (What instructor strategies to aid implementation of active learning do the authors of these studies provide?), we closely reviewed the 29 studies to analyze the strategies in more detail. We used Boyatzis’s (1998) thematic analysis technique to compile all mentions of instructor strategies to aid implementation of active learning and to categorize these excerpts into specific strategies. This technique uses both deductive and inductive coding processes (Creswell & Creswell, 2017; Jesiek, Mazzurco, Buswell, & Thompson, 2018).

In 2018, three researchers independently reread the 29 studies, marking excerpts related to strategies. We found a total of 126 excerpts. The number of excerpts within each study ranged from 1 to 14 (M = 4, SD = 3). We then pasted each excerpt into its own row in a Google Sheet. We examined the entire spreadsheet as a team and grouped similar excerpts together using a deductive coding process. We used the explanation and facilitation conceptual framework (DeMonbrun et al., 2017) and placed each excerpt into one of the two categories. We also assigned a specific strategy from the framework (e.g., describing the purpose of the activity, or encouraging students) to each excerpt.

However, there were multiple excerpts that did not easily match either category; we set these aside for the inductive coding process. We then reviewed all excerpts without a category and created a new third category, called planning. We based this new category on the observation that the existing explanation and facilitation conceptual framework did not capture strategies that occur outside of the classroom. We discuss the specific strategies within the planning category in the Results. With a new category in hand, we created a preliminary codebook consisting of the explanation, facilitation, and planning categories and their respective specific strategies.

We then passed the spreadsheet and preliminary codebook to another researcher who had not previously seen the excerpts. The second researcher looked through all the excerpts and assigned categories and strategies without being able to see the suggestions of the initial three researchers. The second researcher also created their own new strategies and codes, especially when a specific strategy was not presented in the preliminary codebook; all of their new strategies and codes fell within the planning category. The second researcher agreed with the assigned categories and implementation strategies for 71% of the total excerpts. A researcher from the initial strategies coding met with the second researcher and discussed all disagreements. The high rate of disagreement, 29%, arose from the specific strategies within the new third category, planning: because the second researcher created new planning strategies, these assigned codes were disagreements by default. The two researchers resolved the disagreements by finalizing a codebook with the full, combined list of planning strategies and the previous explanation and facilitation strategies. Finally, they conducted a last round of coding with the final codebook, this time working together in the same coding sessions. Any disagreements were immediately resolved through discussion and updating of final strategy codes. In the end, all 126 excerpts were coded and kept.
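The 71% figure is simple percent agreement over the paired codes. A minimal sketch of that check follows, with shortened stand-in strategy labels rather than the codebook's actual codes:

```python
# Minimal sketch of the inter-coder agreement check; the strategy labels
# below are shortened stand-ins for the codebook's actual codes.
from typing import List, Tuple

def percent_agreement(pairs: List[Tuple[str, str]]) -> float:
    """Share of excerpts where both coders assigned the same strategy code."""
    agreed = sum(1 for a, b in pairs if a == b)
    return agreed / len(pairs)

# Example: 5 of 7 paired codes match, roughly the 71% reported above.
paired_codes = [
    ("explain the purpose", "explain the purpose"),
    ("establish expectations", "establish expectations"),
    ("approach students", "encourage students"),      # disagreement
    ("encourage students", "encourage students"),
    ("align the course", "align the course"),
    ("design appropriate activities", "design appropriate activities"),
    ("create group policies", "planning: new code"),  # disagreement
]
print(f"{percent_agreement(paired_codes):.0%}")  # -> 71%
```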

Characteristics of the primary studies

To answer our first research question (What are the characteristics of studies that examine affective and behavioral outcomes of active learning and provide instructor strategies?), we report the results from our coding and systematic review process. We discuss characteristics of the studies within our dataset below and in Table 1.

Type of publication and research audience

Of the 29 studies, 11 studies were published in conference proceedings, while the remaining 18 studies were journal articles. Examples of journals included the European Journal of Engineering Education , Journal of College Science Teaching , and PRIMUS (Problems, Resources, and Issues in Mathematics Undergraduate Studies).

In terms of research audiences and perspectives, both US and international views were represented. Eighteen studies were from North America, two were from Australia, three were from Asia, and six were from Europe. For more details about the type of research publications, full bibliographic information for all 29 studies is included in the Appendix.

Types of courses sampled

Studies sampled different types of undergraduate STEM courses. In terms of course year, most studies sampled first-year courses (13 studies). All four course years were represented (4 second-year, 3 third-year, 2 fourth-year, 7 not reported). With regard to course discipline or major, all major STEM education disciplines were represented. Fourteen studies were conducted in engineering courses, and most major engineering subdisciplines were represented, such as electrical and computer engineering (4 studies), mechanical engineering (3 studies), general engineering courses (3 studies), chemical engineering (2 studies), and civil engineering (1 study). Thirteen studies were conducted in science courses (3 physics/astronomy, 7 biology, 3 chemistry), and 2 studies were conducted in mathematics or statistics courses.

For teaching methods, most studies sampled traditional courses that were primarily lecture-based but included some in-class activities. The most common activity was giving class time for students to do problem solving (PS) (21 studies). Students were instructed to either do problem solving in groups (16 studies) or individually (5 studies) and sometimes both in the same course. Project or problem-based learning (PBL) was the second most frequently reported activity with 8 studies, and the implementation of this teaching method ranged from end of term final projects to an entire project or problem-based course. The third most common activity was using clickers (4 studies) or having class discussions (4 studies).

Research design, methods, and outcomes

The 29 studies used quantitative (10 studies), qualitative (6 studies), or mixed methods (13 studies) research designs. Most studies contained self-made instructor surveys (IS) as their main source of evidence (20 studies). In contrast, only 2 studies used survey instruments with evidence of validity (IEV). Other forms of data collection included using institutions’ end of course evaluations (EOC) (10 studies), observations (5 studies), and interviews (4 studies).

Studies reported a variety of different measures for researching students’ affective and behavioral responses to active learning. The most common measure was students’ self-reports of learning (an affective outcome); twenty-one studies measured whether students thought they learned more or less due to the active learning intervention. Other common measures included whether students participated in the activities (16 studies, participation), whether they enjoyed the activities (15 studies, enjoyment), and if students were satisfied with the overall course experience (13 studies, course satisfaction). Most studies included more than one measure. Some studies also measured course attendance (4 studies) and students’ self-efficacy with the activities and relevant STEM disciplines (4 studies).

We found that 23 of the 29 studies reported positive or mostly positive outcomes for their students’ affective and behavioral responses to active learning. Only 5 studies reported mixed/neutral outcomes, and only one study reported a negative student response to active learning. We discuss the implications of this scarcity of negative study outcomes and reports of SRAL in our dataset in the “Discussion” section.

To answer our second research question (What instructor strategies to aid implementation of active learning do the authors of these studies provide?), we provide descriptions, categories, and excerpts of specific strategies found within our systematic literature review.

Explanation strategies

Explanation strategies provide students with clarifications and reasons for using active learning (DeMonbrun et al., 2017). Within the explanation category, we identified two specific strategies: establish expectations and explain the purpose.

Establish expectations

Establishing expectations means setting the tone and routine for active learning at both the course and in-class activity level. Instructors can discuss expectations at the beginning of the semester, at the start of a class session, or right before the activity.

For establishing expectations at the beginning of the semester, studies provide specific ways to ensure students became familiar with active learning as early as possible. This included “introduc[ing] collaborative learning at the beginning of the academic term” (Herkert, 1997, p. 450) and making sure that “project instructions and the data were posted fairly early in the semester, and the students were made aware that the project was an important part of their assessment” (Krishnan & Nalim, 2009, p. 5).

McClanahan and McClanahan (2002) described the importance of explaining how the course will use active learning and of purposely using the syllabus to do this:

Set the stage. Create the expectation that students will actively participate in this class. One way to accomplish that is to include a statement in your syllabus about your teaching strategies. For example: I will be using a variety of teaching strategies in this class. Some of these activities may require that you interact with me or other students in class. I hope you will find these methods interesting and engaging and that they enable you to be more successful in this course. In the syllabus, describe the specific learning activities you plan to conduct. These descriptions let the students know what to expect from you as well as what you expect from them (emphasis added, p. 93).

Early on, students see that the course is interactive, and they also see the activities required to be successful in the course.

These studies and excerpts demonstrate the importance of explaining to students how in-class activities relate to course expectations. Instructors using active learning should start the semester with clear expectations for how students should engage with activities.

Explain the purpose

Explaining the purpose includes offering students reasons why certain activities are being used and convincing them of the importance of participating.

One way that studies explained the purpose of the activities was by leveraging and showing assessment data on active learning. For example, Lenz (2015) dedicated class time to show current students comments from previous students:

I spend the first few weeks reminding them of the research and of the payoff that they will garner and being a very enthusiastic supporter of the [active learning teaching] method. I show them comments I have received from previous classes and I spend a lot of time selling the method (p. 294).

Providing current students with comments from previous semesters may help students see the value of active learning. Lake (2001) also used data from prior course offerings to show students “the positive academic performance results seen in the previous use of active learning” on the first day of class (p. 899).

However, sharing the effectiveness of the activities does not have to be constrained to the beginning of the course. Autin et al. (2013) used mid-semester test data and comparisons to sell the continued use of active learning to their students. They said to students:

Based on your reflections, I can see that many of you are not comfortable with the format of this class. Many of you said that you would learn better from a traditional lecture. However, this class, as a whole, performed better on the test than my other [lecture] section did. Something seems to be working here (p. 946).

Showing students comparisons between active learning and traditional lecture classes is a powerful way to explain how active learning benefits students.

Explaining the purpose of the activities by sharing course data with students appears to be a useful strategy, as it tells students why active learning is being used and convinces students that active learning is making a difference.

Facilitation strategies

Facilitation strategies ensure continued engagement in the class activities once they have begun, and many of the specific strategies within this category involve working directly with students. We identified two strategies within the facilitation category: approach students and encourage students.

Approach students

Approaching students means engaging with students during the activity. This includes physical proximity and monitoring students, walking around the classroom, and providing students with additional feedback, clarifications, or questions about the activity.

Several studies described how instructors circulated around the classroom to check on the progress of students during an activity. Lenz (2015) stated this plainly in her study: “While the students work on these problems I walk around the room, listening to their discussions” (p. 284). Armbruster et al. (2009) described this strategy and noted positive student engagement: “During each group-work exercise the instructor would move throughout the classroom to monitor group progress, and it was rare to find a group that was not seriously engaged in the exercise” (p. 209). Haseeb (2011) combined moving around the room with approaching students with questions: “The instructor moves around from one discussion group to another and listens to their discussions, ask[ing] provoking questions” (p. 276). Certain group-based activities worked better with this strategy, as McClanahan and McClanahan (2002) explained:

Breaking the class into smaller working groups frees the professor to walk around and interact with students more personally. He or she can respond to student questions, ask additional questions, or chat informally with students about the class (p. 94).

Approaching students not only helps facilitate the activity, but it provides a chance for the instructor to work with students more closely and receive feedback. Instructors walking around the classroom ensure that both the students and instructor continue to engage and participate with the activity.

Encourage students

Encouraging students includes creating a supportive classroom environment, motivating students to do the activity, building respect and rapport with students, demonstrating care, and having a positive demeanor toward students’ success.

Ramsier et al. (2003) provided a detailed explanation of the importance of building a supportive classroom environment:

Most of this success lies in the process of negotiation and the building of mutual respect within the class, and requires motivation, energy and enthusiasm on behalf of the instructor… Negotiation is the key to making all of this work, and building a sense of community and shared ownership. Learning students’ names is a challenge but a necessary part of our approach. Listening to student needs and wants with regard to test and homework due dates…projects and activities, etc. goes a long way to build the type of relationships within the class that we need in order to maintain and encourage performance (pp. 16–18).

Here, the authors described a few specific strategies for supporting a positive demeanor, such as learning students’ names and listening to student needs and wants, which helped maintain student performance in an active learning classroom.

Other ways to build a supportive classroom environment involved instructors appearing more approachable. For example, Bullard and Felder (2007) worked to “give the students a sense of their instructors as somewhat normal and approachable human beings and to help them start to develop a sense of community” (p. 5). As instructors and students become more comfortable working with each other, instructors can work toward easing “frustration and strong emotion among students and step by step develop the students’ acceptance [of active learning]” (Harun, Yusof, Jamaludin, & Hassan, 2012, p. 234). In all, encouraging students and creating a supportive environment appear to be useful strategies to aid implementation of active learning.

Planning strategies

The planning category encompasses strategies that occur outside of class time, distinguishing it from the explanation and facilitation categories. Four strategies fall into this category: design appropriate activities, create group policies, align the course, and review student feedback.

Design appropriate activities

Many studies took into consideration the design of appropriate or suitable activities for their courses. This meant making sure the activity was suitable in terms of time, difficulty, and constraints of the course. Activities were designed to strike a balance between being too difficult and too simple, to be engaging, and to provide opportunities for students to participate.

Li et al. (2009) explained the importance of outside-of-class planning and considering appropriate projects: “The selection of the projects takes place in pre-course planning. The subjects for projects should be significant and manageable” (p. 491). Haseeb (2011) further emphasized balance in design when discussing problems within problem-based learning: “the problem is deliberately designed to be open-ended and vague in terms of technical details” (p. 275). Armbruster et al. (2009) expanded on the idea of balanced activities by connecting it to group work and positive outcomes: “The group exercises that elicited the most animated student participation were those that were sufficiently challenging that very few students could solve the problem individually, but at least 50% or more of the groups could solve the problem by working as a team” (p. 209).

Instructors should consider the design of activities outside of class time. Activities should be appropriately challenging but achievable for students, so that students remain engaged and participate with the activity during class time.

Create group policies

Creating group policies means considering rules when using group activities. This strategy is unique in that it directly addresses a specific subset of activities, group work. These policies included setting team sizes and assigning specific roles to group members.

Studies outlined a few specific approaches for assigning groups. For example, Ramsier et al. (2003) recommended frequently changing and randomizing groups: “When students enter the room on these days they sit in randomized groups of 3 to 4 students. Randomization helps to build a learning community atmosphere and eliminates cliques” (p. 4). Another strategy, in combination with frequent changing of groups, was to not allow students to select their own groups. Lehtovuori et al. (2013) used this to avoid problems of freeriding and group dysfunction:

For example, group division is an issue to be aware of...An easy and safe solution is to draw lots to assign the groups and to change them often. This way nobody needs to suffer from a dysfunctional group for too long. Popular practice that students self-organize into groups is not the best solution from the point of view of learning and teaching. Sometimes friendly relationships can complicate fair division of responsibility and work load in the group (p. 9).

Here, Lehtovuori et al. ( 2013 ) considered different types of group policies and concluded that frequently changing groups worked best for students. Kovac ( 1999 ) also described changing groups but assigned specific roles to individuals:

Students were divided into groups of four and assigned specific roles: manager, spokesperson, recorder, and strategy analyst. The roles were rotated from week to week. To alleviate complaints from students that they were "stuck in a bad group for the entire semester," the groups were changed after each of the two in-class exams (p. 121).

The use of four specific group roles is a potential group policy, and Kovac ( 1999 ) continued the trend of changing group members often.

Overall, these studies describe the importance of thinking about ways to implement group-based activities before enacting them during class, and they suggest that groups should be reconstituted frequently. Instructors using group activities should consider whether to use specific group member policies before implementing the activity in the classroom.
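
As a concrete illustration of these policies, the sketch below randomizes a class roster into groups and pairs members with the four rotating roles Kovac (1999) described. This is a minimal sketch, not a procedure from any reviewed study; the roster, group size, and function names are illustrative.

```python
import random

# The four roles described by Kovac (1999); rotate them week to week.
ROLES = ["manager", "spokesperson", "recorder", "strategy analyst"]

def assign_groups(roster, group_size=4, seed=None):
    """Randomly partition a roster into groups and pair members with roles.

    Drawing lots (random assignment) follows Lehtovuori et al.'s (2013)
    suggestion; re-running with a new seed reconstitutes the groups.
    """
    rng = random.Random(seed)
    shuffled = roster[:]
    rng.shuffle(shuffled)
    groups = [shuffled[i:i + group_size] for i in range(0, len(shuffled), group_size)]
    # zip truncates, so a trailing group of three simply drops the last role.
    return [list(zip(group, ROLES)) for group in groups]

# Example: reassign groups after each in-class exam by changing the seed.
roster = ["Ana", "Ben", "Chen", "Dee", "Eli", "Fay", "Gus", "Hoa"]
for group in assign_groups(roster, seed=1):
    print(group)
```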

Align the course

Aligning the course emphasizes the importance of purposely connecting multiple parts of the course together. This strategy involves planning to ensure students are graded on their participation with the activities as well as considering the timing of the activities with respect to other aspects of the course.

Li et al. ( 2009 ) described aligning classroom tasks by discussing the importance of timing, and they wrote, “The coordination between the class lectures and the project phases is very important. If the project is assigned near the directly related lectures, students can instantiate class concepts almost immediately in the project and can apply the project experience in class” (p. 491).

Krishnan and Nalim ( 2009 ) aligned class activities with grades to motivate students and encourage participation: “The project was a component of the course counting for typically 10-15% of the total points for the course grade. Since the students were told about the project and that it carried a significant portion of their grade, they took the project seriously” (p. 4). McClanahan and McClanahan ( 2002 ) expanded on the idea of using grades to emphasize the importance of active learning to students:

Develop a grading policy that supports active learning. Active learning experiences that are important enough to do are important enough to be included as part of a student's grade…The class syllabus should describe your grading policy for active learning experiences and how those grades factor into the student's final grade. Clarify with the students that these points are not extra credit. These activities, just like exams, will be counted when grades are determined (p. 93).

Here, they suggest a clear grading policy that includes how activities will be assessed as part of students’ final grades.

de Justo and Delgado ( 2014 ) connected grading and assessment to learning and further suggested that reliance on exams may negatively impact student engagement:

Particular attention should be given to alignment between the course learning outcomes and assessment tasks. The tendency among faculty members to rely primarily on written examinations for assessment purposes should be overcome, because it may negatively affect students’ engagement in the course activities (p. 8).

Instructors should consider their overall assessment strategies, as overreliance on written exams could mean that students engage less with the activities.

When planning to use active learning, instructors should consider how activities are aligned with course content and students’ grades. Instructors should decide before active learning implementation whether class participation and engagement will be reflected in student grades and in the course syllabus.
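
To make the grading alignment concrete, the sketch below computes a weighted final grade in which an active learning project counts for 15%, in line with the 10-15% range Krishnan and Nalim (2009) reported. The component names and weights are hypothetical, chosen only for illustration.

```python
# Hypothetical grade weighting; only the 10-15% project share comes from
# Krishnan and Nalim (2009). All component names are illustrative.
WEIGHTS = {"exams": 0.60, "homework": 0.25, "active_learning_project": 0.15}

def final_grade(scores):
    """Combine component scores (each 0-100) into a weighted final grade."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(WEIGHTS[name] * scores[name] for name in WEIGHTS)

# A student strong in the project sees it reflected in the final grade.
print(final_grade({"exams": 78, "homework": 90, "active_learning_project": 95}))  # 83.55
```

Publishing such a weighting in the syllabus, as McClanahan and McClanahan (2002) advise, makes explicit to students that activity participation counts toward the final grade rather than as extra credit.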

Review student feedback

Reviewing student feedback includes both soliciting feedback about the activity and using that feedback to improve the course. This strategy can be an iterative process that occurs over several course offerings.

Many studies utilized student feedback to continuously revise and improve the course. For example, Metzger (2015) commented that “gathering and reviewing feedback from students can inform revisions of course design, implementation, and assessment strategies” (p. 8). Rockland et al. (2013) further described changing and improving the course in response to student feedback: “As a result of these discussions, the author made three changes to the course. This is the process of continuous improvement within a course” (p. 6).

Herkert ( 1997 ) also demonstrated the use of student feedback for improving the course over time: “Indeed, the [collaborative] learning techniques described herein have only gradually evolved over the past decade through a process of trial and error, supported by discussion with colleagues in various academic fields and helpful feedback from my students” (p. 459).

In addition to incorporating student feedback, McClanahan and McClanahan (2002) commented on how student feedback builds a stronger partnership with students: “Using student feedback to make improvements in the learning experience reinforces the notion that your class is a partnership and that you value your students’ ideas as a means to strengthen that partnership and create more successful learning” (p. 94). Making students aware that the instructor is soliciting and using feedback can encourage students and build rapport.

Instructors should review student feedback for continual and iterative course improvement. Much of the student feedback review occurs outside of class time, and it appears useful for instructors to solicit student feedback to guide changes to the course and build student rapport.

Summary of strategies

We list the appearance of strategies within studies in Table 1 in shorthand form. No study included all eight strategies. The studies that included the most strategies were Bullard and Felder (2007) (seven strategies), Armbruster et al. (2009) (five strategies), and Lenz (2015) (five strategies). These three studies were exceptions, however, as most studies included only one or two strategies.

Table 2 presents a summary list of specific strategies, their categories, and descriptions. We also note the number of unique studies ( N ) and excerpts ( n ) that included the specific strategies. In total, there were eight specific strategies within three categories. Most strategies fell under the planning category ( N = 26), with align the course being the most reported strategy ( N = 14). Approaching students ( N = 13) and reviewing student feedback ( N = 11) were the second and third most common strategies, respectively. Overall, we present eight strategies to aid implementation of active learning.

Characteristics of the active learning studies

To address our first research question (What are the characteristics of studies that examine affective and behavioral outcomes of active learning and provide instructor strategies?), we discuss the different ways studies reported research on active learning.

Limitations and gaps within the final sample

First, we must discuss the gaps within our final sample of 29 studies. We excluded numerous active learning studies ( N = 383) that did not discuss or reflect upon the efficacy of their strategies to aid implementation of active learning. We also began this systematic literature review in 2015 and did not finish our coding and analysis of 2364 abstracts and 746 full-texts until 2018. We acknowledge that there have been multiple studies published on active learning since 2015. Acknowledging these limitations, we discuss our results and analysis in the context of the 29 studies in our dataset, which were published from 1990 to 2015.

Our final sample included only 2 studies that sampled mathematics and statistics courses. There was also a lack of studies outside of first-year courses. Much of the active learning research literature introduces interventions in first-year (cornerstone) or fourth-year (capstone) courses, but we found within our dataset a tendency to oversample first-year courses. Nevertheless, all four course-years were represented, as well as all major STEM disciplines, with the most common being engineering (14 studies) and biology (7 studies).

Thirteen studies implemented course-based active learning interventions, such as project-based learning (8 studies), inquiry-based learning (3 studies), or a flipped classroom (2 studies). Only one study, Lenz ( 2015 ), used a previously published active learning intervention, which was Process-Oriented Guided Inquiry Learning (POGIL). Other examples of published active learning programs include the Student-Centered Active Learning Environment for Upside-down Pedagogies (SCALE-UP, Gaffney et al., 2010 ) and Chemistry, Life, the Universe, and Everything (CLUE, Cooper & Klymkowsky, 2013 ), but these were not included in our sample of 29 studies.

In contrast, most of the active learning interventions involved adding in-class problem solving (either with individual students or groups of students) to a traditional lecture course (21 studies). For some instructors attempting to adopt active learning, using this smaller active learning intervention (in-class problem solving) may be a good starting point.

Despite the variety of quantitative, qualitative, and mixed-methods research designs, most studies used either self-made instructor surveys (20 studies) or their institution’s course evaluations (10 studies). The variation across so many different versions of instructor surveys and course evaluations made it difficult to compare data or attempt a quantitative meta-analysis. Further, only 2 studies used instruments with evidence of validity. That trend may change, however, as more such instruments become available, such as the Student Response to Instructional Practices (StRIP, DeMonbrun et al., 2017), the Biology Interest Questionnaire (BIQ, Knekta, Rowland, Corwin, & Eddy, 2020), and the Pedagogical Expectancy Violation Assessment (PEVA, Gaffney et al., 2010).

We were also concerned about the use of institutional course evaluations (10 studies) as evidence of students’ satisfaction and affective responses to active learning. Course evaluations capture more than just students’ responses to active learning: the scores are biased by the instructor’s gender (Mitchell & Martin, 2018) and race (Daniel, 2019), and they are strongly correlated with students’ expected grade in the class (Nguyen et al., 2017). Despite these limitations, we kept course evaluations in our keyword search and inclusion criteria because they relate to instructors’ concerns about student resistance to active learning, and these scores continue to be used for important instructor reappointment, tenure, and promotion decisions (DeMonbrun et al., 2017).

In addition to students’ satisfaction, there were other measures related to students’ affective and behavioral responses to active learning. The most common measure was students’ self-reports of whether they thought they learned more or less (21 studies). Other important affective outcomes included enjoyment (13 studies) and self-efficacy (4 studies). The most common behavioral measure was students’ participation (16 studies). However, missing from this sample were other affective outcomes, such as students’ identities, beliefs, emotions, values, and buy-in.

Positive outcomes for using active learning

Twenty-three of the 29 studies reported positive or mostly positive outcomes for their active learning intervention. At the start of this paper, we acknowledged that much of the existing research suggested widespread benefits of using active learning in undergraduate STEM courses. However, most of those reported benefits centered on students’ cognitive learning outcomes (e.g., Theobald et al., 2020) rather than students’ affective and behavioral responses to active learning. Here, we show positive affective and behavioral outcomes in terms of students’ self-reports of learning, enjoyment, self-efficacy, attendance, participation, and course satisfaction.

Given the lack of mixed/neutral or negative affective outcomes, it is important to acknowledge potential publication bias within our dataset. Authors may be hesitant to report negative outcomes of active learning interventions. It could also be the case that negative or non-significant outcomes are not easily published in undergraduate STEM education venues. These factors could help explain the lack of mixed/neutral or negative study outcomes in our dataset.

Strategies to aid implementation of active learning

We aimed to answer the question: What instructor strategies to aid implementation of active learning do the authors of these studies provide? We addressed this question by providing instructors and readers a summary of actionable strategies they can take back to their own classrooms. Here, we discuss the range of strategies found within our systematic literature review.

Supporting instructors with actionable strategies

We identified eight specific strategies across three major categories: explanation, facilitation, and planning. Each strategy appeared in at least seven studies (Table 2 ), and each strategy was written to be actionable and practical.

Strategies in the explanation category emphasized the importance of establishing expectations and explaining the purpose of active learning to students. The facilitation category focused on approaching and encouraging students once activities were underway. Strategies in the planning category highlighted the importance of working outside of class time to thoughtfully design appropriate activities, create policies for group work, align various components of the course, and review student feedback to iteratively improve the course.

However, as we note in the “Introduction” section, these strategies are not entirely new, and they will not surprise experienced researchers and educators. Even so, there has yet to be a systematic review that compiles these instructor strategies in relation to students’ affective and behavioral responses to active learning. For example, the “explain the purpose” strategy is similar to productive framing (e.g., Hutchison & Hammer, 2010) of the activity for students. “Design appropriate activities” and “align various components of the course” relate to Vygotsky’s (1978) theories of scaffolding for students (Shekhar et al., 2020). “Review student feedback” and “approach students” relate to ideas on formative assessment (e.g., Pellegrino, DiBello, & Brophy, 2014) or revising course materials in response to students’ ongoing needs.

We also acknowledge that ours is not an exhaustive list of specific strategies to aid implementation of active learning. More work needs to be done measuring and observing these strategies in action and testing their use against specific outcomes. Some of this work of measuring instructor strategies has already begun (e.g., DeMonbrun et al., 2017; Finelli et al., 2018; Tharayil et al., 2018), but further testing and analysis would benefit the active learning community. We hope that our framework of explanation, facilitation, and planning strategies provides a guide for instructors adopting active learning. Because these strategies are compiled from the undergraduate STEM education literature and from research on affective and behavioral responses to active learning, instructors have a compelling reason to use them.

One way to apply these strategies is to consider the various aspects of instruction and their sequence. That is, planning strategies are most applicable during the phase of work that occurs prior to classroom instruction, explanation strategies are most useful when introducing students to active learning activities, and facilitation strategies are best enacted while students are already working on the assigned activities. Of course, these strategies may also be used in conjunction with each other and are not strictly limited to these phases. For example, one plausible approach could be using the planning strategies of design and alignment as areas of emphasis during explanation. Overall, we hope that this framework of strategies supports instructors’ adoption and sustained use of active learning.

Creation of the planning category

At the start of this paper, we presented a conceptual framework for strategies consisting of only explanation and facilitation categories (DeMonbrun et al., 2017). One of the major contributions of this paper is the addition of a third category, planning, to the existing conceptual framework. The planning strategies were common throughout the systematic literature review, and many studies emphasized the time and effort needed when adding active learning to a course. Although students may not see this preparation, and our initial framework did not include this type of strategy, explicitly adding the planning category acknowledges the work instructors do outside of the classroom.

The planning strategies also highlight the need for instructors to not only think about implementing active learning before they enter the class, but to revise their implementation after the class is over. Instructors should refine their use of active learning through feedback, reflection, and practice over multiple course offerings. We hope this persistence can lead to long-term adoption of active learning.

Despite our review ending in 2015, most STEM instruction remains didactic (Laursen, 2019; Stains et al., 2018), and active learning has not seen long-term, sustained adoption. In a push to increase the adoption of active learning within undergraduate STEM courses, we hope this study provides support and actionable strategies for instructors who are considering active learning but are concerned about student resistance.

We identified eight specific strategies to aid implementation of active learning based on three categories. The three categories of strategies were explanation, facilitation, and planning. In this review, we created the third category, planning, and we suggested that this category should be considered first when implementing active learning in the course. Instructors should then focus on explaining and facilitating their activity in the classroom. The eight specific strategies provided here can be incorporated into faculty professional development programs and readily adopted by instructors wanting to implement active learning in their STEM courses.

There remains important future work in active learning research, and we noted these gaps within our review. It would be useful to specifically review and measure instructor strategies in action and to compare their use against other affective outcomes, such as identity, interest, and emotions.

There has yet to be a study that compiles and synthesizes strategies reported from multiple active learning studies, and we hope that this paper filled this important gap. The strategies identified in this review can help instructors persist beyond awkward initial implementations, avoid some problems altogether, and most importantly address student resistance to active learning. Further, the planning strategies emphasize that the use of active learning can be improved over time, which may help instructors have more realistic expectations for the first or second time they implement a new activity. There are many benefits to introducing active learning in the classroom, and we hope that these benefits are shared among more STEM instructors and students.

Availability of data and materials

Journal articles and conference proceedings which make up this review can be found through reverse citation lookup. See the Appendix for the references of all primary studies within this systematic review. We used the following databases to find studies within the review: Web of Science, Academic Search Complete, Compendex, Inspec, Education Source, and Education Resource Information Center. More details and keyword search strings are provided in the “Methods” section.

Abbreviations

  • STEM: Science, technology, engineering, and mathematics
  • Student resistance to active learning
  • Instrument with evidence of validity
  • Instructor surveys
  • PRISMA: Preferred Reporting Items for Systematic Reviews and Meta-Analyses
  • Problem solving
  • Problem or project-based learning
  • End of course evaluations

Armbruster, P., Patel, M., Johnson, E., & Weiss, M. (2009). Active learning and student-centered pedagogy improve student attitudes and performance in introductory biology. CBE Life Sciences Education , 8 (3), 203–213. https://doi.org/10.1187/cbe.09-03-0025 .


Autin, M., Bateiha, S., & Marchionda, H. (2013). Power through struggle in introductory statistics. PRIMUS , 23 (10), 935–948. https://doi.org/10.1080/10511970.2013.820810 .

Bandura, A. (1982). Self-efficacy mechanism in human agency. American Psychologist , 37 (2), 122. https://psycnet.apa.org/doi/10.1037/0003-066X.37.2.122 .

Berkling, K., & Zundel, A. (2015). Change Management: Overcoming the Challenges of Introducing Self-Driven Learning. International Journal of Engineering Pedagogy (iJEP), 5 (4), 38–46. https://www.learntechlib.org/p/207352/ .

Bilston, L. (1999). Lessons from a problem-based learning class in first year engineering statics . Paper presented at the 2nd Asia-Pacific Forum on Engineering and Technology Education, Clayton, Victoria.

Borrego, M., Cutler, S., Prince, M., Henderson, C., & Froyd, J. E. (2013). Fidelity of implementation of research-based instructional strategies (RBIS) in engineering science courses. Journal of Engineering Education , 102 (3), 394–425. https://doi.org/10.1002/jee.20020 .

Borrego, M., Foster, M. J., & Froyd, J. E. (2014). Systematic literature reviews in engineering education and other developing interdisciplinary fields. Journal of Engineering Education , 103 (1), 45–76. https://doi.org/10.1002/jee.20038 .

Borrego, M., Nguyen, K., Crockett, C., DeMonbrun, M., Shekhar, P., Tharayil, S., … Waters, C. (2018). Systematic literature review of students’ affective responses to active learning: Overview of results . San Jose: Paper presented at the 2018 IEEE Frontiers in Education Conference (FIE). https://doi.org/10.1109/FIE.2018.8659306 .


Boyatzis, R. E. (1998). Transforming qualitative information: Thematic analysis and code development . Sage Publications Inc.

Breckler, J., & Yu, J. R. (2011). Student responses to a hands-on kinesthetic lecture activity for learning about the oxygen carrying capacity of blood. Advances in Physiology Education, 35 (1), 39–47. https://doi.org/10.1152/advan.00090.2010 .

Bullard, L., & Felder, R. (2007). A student-centered approach to the stoichiometry course . Honolulu: Paper presented at the 2007 ASEE Annual Conference and Exposition https://peer.asee.org/1543 .

Carlson, K. A., & Winquist, J. R. (2011). Evaluating an active learning approach to teaching introductory statistics: A classroom workbook approach. Journal of Statistics Education , 19 (1). https://doi.org/10.1080/10691898.2011.11889596 .

Chi, M. T., & Wylie, R. (2014). The ICAP framework: Linking cognitive engagement to active learning outcomes. Educational Psychologist , 49 (4), 219–243. https://doi.org/10.1080/00461520.2014.965823 .

Christensen, T. (2005). Changing the learning environment in large general education astronomy classes. Journal of College Science Teaching, 35 (3), 34.

Cooper, M., & Klymkowsky, M. (2013). Chemistry, life, the universe, and everything: A new approach to general chemistry, and a model for curriculum reform. Journal of Chemical Education , 90 (9), 1116–1122. https://doi.org/10.1021/ed300456y .

Creswell, J. W., & Creswell, J. D. (2017). Research design: Qualitative, quantitative, and mixed methods approaches . Sage Publishing Inc.

Dancy, M., Henderson, C., & Turpen, C. (2016). How faculty learn about and implement research-based instructional strategies: The case of peer instruction. Physical Review Physics Education Research , 12 (1). https://doi.org/10.1103/PhysRevPhysEducRes.12.010110 .

Daniel, B. J. (2019). Teaching while black: Racial dynamics, evaluations, and the role of white females in the Canadian academy in carrying the racism torch. Race Ethnicity and Education , 22 (1), 21–37. https://doi.org/10.1080/13613324.2018.1468745 .

de Justo, E., & Delgado, A. (2014). Change to competence-based education in structural engineering. Journal of Professional Issues in Engineering Education and Practice , 141 (3). https://doi.org/10.1061/(ASCE)EI.1943-5541.0000215 .

DeMonbrun, R. M., Finelli, C., Prince, M., Borrego, M., Shekhar, P., Henderson, C., & Waters, C. (2017). Creating an instrument to measure student response to instructional practices. Journal of Engineering Education , 106 (2), 273–298. https://doi.org/10.1002/jee.20162 .

Deslauriers, L., McCarty, L. S., Miller, K., Callaghan, K., & Kestin, G. (2019). Measuring actual learning versus feeling of learning in response to being actively engaged in the classroom. Proceedings of the National Academy of Sciences , 116 (39), 19251–19257. https://doi.org/10.1073/pnas.1821936116 .

Dweck, C. S., & Leggett, E. L. (1988). A social-cognitive approach to motivation and personality. Psychological Review , 95 (2), 256–273. https://doi.org/10.1037/0033-295X.95.2.256 .

Finelli, C., Nguyen, K., Henderson, C., Borrego, M., Shekhar, P., Prince, M., … Waters, C. (2018). Reducing student resistance to active learning: Strategies for instructors. Journal of College Science Teaching , 47 (5), 80–91 https://www.nsta.org/journal-college-science-teaching/journal-college-science-teaching-mayjune-2018/research-and-1 .


Finelli, C. J., Daly, S. R., & Richardson, K. M. (2014). Bridging the research-to-practice gap: Designing an institutional change plan using local evidence. Journal of Engineering Education , 103 (2), 331–361. https://doi.org/10.1002/jee.20042 .

Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences , 111 (23), 8410–8415. https://doi.org/10.1073/pnas.1319030111 .

Froyd, J. E., Borrego, M., Cutler, S., Henderson, C., & Prince, M. J. (2013). Estimates of use of research-based instructional strategies in core electrical or computer engineering courses. IEEE Transactions on Education , 56 (4), 393–399. https://doi.org/10.1109/TE.2013.2244602 .

Gaffney, J. D., Gaffney, A. L. H., & Beichner, R. J. (2010). Do they see it coming? Using expectancy violation to gauge the success of pedagogical reforms. Physical Review Special Topics - Physics Education Research , 6 (1), 010102. https://doi.org/10.1103/PhysRevSTPER.6.010102 .

Gough, D., Oliver, S., & Thomas, J. (2017). An introduction to systematic reviews . Sage Publishing Inc.

Harun, N. F., Yusof, K. M., Jamaludin, M. Z., & Hassan, S. A. H. S. (2012). Motivation in problem-based learning implementation. Procedia-Social and Behavioral Sciences , 56 , 233–242. https://doi.org/10.1016/j.sbspro.2012.09.650 .

Haseeb, A. (2011). Implementation of micro-level problem based learning in a course on electronic materials. Journal of Materials Education , 33 (5-6), 273–282 http://eprints.um.edu.my/id/eprint/5501 .

Henderson, C., Beach, A., & Finkelstein, N. (2011). Facilitating change in undergraduate STEM instructional practices: An analytic review of the literature. Journal of Research in Science Teaching , 48 (8), 952–984. https://doi.org/10.1002/tea.20439 .

Henderson, C., & Dancy, M. (2007). Barriers to the use of research-based instructional strategies: The influence of both individual and situational characteristics. Physical Review Special Topics - Physics Education Research , 3 (2). https://doi.org/10.1103/PhysRevSTPER.3.020102 .

Henderson, C., Khan, R., & Dancy, M. (2018). Will my student evaluations decrease if I adopt an active learning instructional strategy? American Journal of Physics , 86 (12), 934–942. https://doi.org/10.1119/1.5065907 .

Herkert, J. R. (1997). Collaborative learning in engineering ethics. Science and Engineering Ethics , 3 (4), 447–462. https://doi.org/10.1007/s11948-997-0047-x .

Hodgson, Y., Benson, R., & Brack, C. (2013). Using action research to improve student engagement in a peer-assisted learning programme. Educational Action Research, 21 (3), 359-375. https://doi.org/10.1080/09650792.2013.813399 .

Hora, M. T., & Ferrare, J. J. (2013). Instructional systems of practice: A multi-dimensional analysis of math and science undergraduate course planning and classroom teaching. Journal of the Learning Sciences , 22 (2), 212–257. https://doi.org/10.1080/10508406.2012.729767 .

Hutchison, P., & Hammer, D. (2010). Attending to student epistemological framing in a science classroom. Science Education , 94 (3), 506–524. https://doi.org/10.1002/sce.20373 .

Jaeger, B., & Bilen, S. (2006). The one-minute engineer: Getting design class out of the starting blocks . Paper presented at the 2006 ASEE Annual Conference and Exposition, Chicago, IL. https://peer.asee.org/524 .

Jesiek, B. K., Mazzurco, A., Buswell, N. T., & Thompson, J. D. (2018). Boundary spanning and engineering: A qualitative systematic review. Journal of Engineering Education , 107 (3), 318–413. https://doi.org/10.1002/jee.20219 .

Kezar, A., Gehrke, S., & Elrod, S. (2015). Implicit theories of change as a barrier to change on college campuses: An examination of STEM reform. The Review of Higher Education , 38 (4), 479–506. https://doi.org/10.1353/rhe.2015.0026 .

Kirkpatrick, D. L. (1976). Evaluation of training. In R. L. Craig (Ed.), Training and development handbook: A guide to human resource development . McGraw Hill.

Knekta, E., Rowland, A. A., Corwin, L. A., & Eddy, S. (2020). Measuring university students’ interest in biology: Evaluation of an instrument targeting Hidi and Renninger’s individual interest. International Journal of STEM Education , 7 , 1–16. https://doi.org/10.1186/s40594-020-00217-4 .

Kovac, J. (1999). Student active learning methods in general chemistry. Journal of Chemical Education , 76 (1), 120. https://doi.org/10.1021/ed076p120 .

Krishnan, S., & Nalim, M. R. (2009). Project based learning in introductory thermodynamics . Austin: Paper presented at the 2009 ASEE Annual Conference and Exposition https://peer.asee.org/5615 .

Kuh, G. D. (2005). Student engagement in the first year of college. In M. L. Upcraft, J. N. Gardner, & B. O. Barefoot (Eds.), Challenging and supporting the first-year student: A handbook for improving the first year of college , (pp. 86–107). Jossey-Bass.

Laatsch, L., Britton, L., Keating, S., Kirchner, P., Lehman, D., Madsen-Myers, K., Milson, L., Otto, C., & Spence, L. (2005). Cooperative learning effects on teamwork attitudes in clinical laboratory science students. American Society for Clinical Laboratory Science, 18(3). https://doi.org/10.29074/ascls.18.3.150 .

Lake, D. A. (2001). Student performance and perceptions of a lecture-based course compared with the same course utilizing group discussion. Physical Therapy , 81 (3), 896–902. https://doi.org/10.1093/ptj/81.3.896 .

Laursen, S. (2019). Levers for change: An assessment of progress on changing STEM instruction . American Association for the Advancement of Science. https://www.aaas.org/resources/levers-change-assessment-progress-changing-stem-instruction .

Lehtovuori, A., Honkala, M., Kettunen, H., & Leppävirta, J. (2013). Interactive engagement methods in teaching electrical engineering basic courses. Paper presented at the IEEE Global Engineering Education Conference (EDUCON), Berlin, Germany. https://doi.org/10.1109/EduCon.2013.6530089 .


Lenz, L. (2015). Active learning in a math for liberal arts classroom. PRIMUS , 25 (3), 279–296. https://doi.org/10.1080/10511970.2014.971474 .

Li, J., Zhao, Y., & Shi, L. (2009). Interactive teaching methods in information security course . Paper presented at the International Conference on Scalable Computing and Communications; The Eighth International Conference on Embedded Computing. https://doi.org/10.1109/EmbeddedCom-ScalCom.2009.94 .

Liberati, A., Altman, D. G., Tetzlaff, J., Mulrow, C., Gotzsche, P. C., Ioannidis, J. P., … Moher, D. (2009). The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate healthcare interventions: Explanation and elaboration. Journal of Clinical Epidemology , 62 (10), e1–e34. https://doi.org/10.1016/j.jclinepi.2009.06.006 .

Lund, T. J., & Stains, M. (2015). The importance of context: An exploration of factors influencing the adoption of student-centered teaching among chemistry, biology, and physics faculty. International Journal of STEM Education , 2 (1). https://doi.org/10.1186/s40594-015-0026-8 .

Machemer, P. L., & Crawford, P. (2007). Student perceptions of active learning in a large cross-disciplinary classroom. Active Learning in Higher Education , 8 (1), 9–30. https://doi.org/10.1177/1469787407074008 .

Maib, J., Hall, R., Collier, H., & Thomas, M. (2006). A multi-method evaluation of the implementation of a student response system . Paper presented at the 12th Americas’ Conference on Information Systems (AMCIS), Acapulco, Mexico. https://aisel.aisnet.org/amcis2006/27 .

McClanahan, E. B., & McClanahan, L. L. (2002). Active learning in a non-majors biology class: Lessons learned. College Teaching , 50 (3), 92–96. https://doi.org/10.1080/87567550209595884 .

McLoone, S., & Brennan, C. (2015). On the use and evaluation of a smart device student response system in an undergraduate mathematics classroom. AISHE-J: The All Ireland Journal of Teaching and Learning in Higher Education, 7(3). http://ojs.aishe.org/index.php/aishe-j/article/view/243 .

Metzger, K. J. (2015). Collaborative teaching practices in undergraduate active learning classrooms: A report of faculty team teaching models and student reflections from two biology courses. Bioscene: Journal of College Biology Teaching , 41 (1), 3–9 http://www.acube.org/wp-content/uploads/2017/11/2015_1.pdf .

Mitchell, K. M., & Martin, J. (2018). Gender bias in student evaluations. PS: Political Science & Politics , 51 (3), 648–652. https://doi.org/10.1017/S104909651800001X .

Nguyen, K., Husman, J., Borrego, M., Shekhar, P., Prince, M., DeMonbrun, R. M., … Waters, C. (2017). Students’ expectations, types of instruction, and instructor strategies predicting student response to active learning. International Journal of Engineering Education , 33 (1(A)), 2–18 http://www.ijee.ie/latestissues/Vol33-1A/02_ijee3363ns.pdf .

Oakley, B. A., Hanna, D. M., Kuzmyn, Z., & Felder, R. M. (2007). Best practices involving teamwork in the classroom: Results from a survey of 6435 engineering student respondents. IEEE Transactions on Education , 50 (3), 266–272. https://doi.org/10.1109/TE.2007.901982 .

Oliveira, P. C., & Oliveira, C. G. (2014). Integrator element as a promoter of active learning in engineering teaching. European Journal of Engineering Education, 39 (2), 201–211. https://doi.org/10.1080/03043797.2013.854318 .

Owens, D. C., Sadler, T. D., Barlow, A. T., & Smith-Walters, C. (2020). Student motivation from and resistance to active learning rooted in essential science practices. Research in Science Education , 50 (1), 253–277. https://doi.org/10.1007/s11165-017-9688-1 .

Parker Siburt, C. J., Bissell, A. N., & Macphail, R. A. (2011). Developing Metacognitive and Problem-Solving Skills through Problem Manipulation. Journal of Chemical Education, 88 (11), 1489–1495. https://doi.org/10.1021/ed100891s .

Passow, H. J., & Passow, C. H. (2017). What competencies should undergraduate engineering programs emphasize? A systematic review. Journal of Engineering Education , 106 (3), 475–526. https://doi.org/10.1002/jee.20171 .

Patrick, L. E., Howell, L. A., & Wischusen, W. (2016). Perceptions of active learning between faculty and undergraduates: Differing views among departments. Journal of STEM Education: Innovations and Research , 17 (3), 55 https://www.jstem.org/jstem/index.php/JSTEM/article/view/2121/1776 .

Pellegrino, J., DiBello, L., & Brophy, S. (2014). The science and design of assessment in engineering education. In A. Johri, & B. Olds (Eds.), Cambridge handbook of engineering education research , (pp. 571–598). Cambridge University Press. https://doi.org/10.1017/CBO9781139013451.036 .

Petticrew, M., & Roberts, H. (2006). Systematic reviews in the social sciences: A practical guide . Blackwell Publishing. https://doi.org/10.1002/9780470754887 .

Prince, M. (2004). Does active learning work? A review of the research. Journal of Engineering Education , 93 , 223–232. https://doi.org/10.1002/j.2168-9830.2004.tb00809.x .

Prince, M., & Felder, R. (2007). The many faces of inductive teaching and learning. Journal of College Science Teaching , 36 (5), 14–20.

Ramsier, R. D., Broadway, F. S., Cheung, H. M., Evans, E. A., & Qammar, H. K. (2003). University physics: A hybrid approach . Nashville: Paper presented at the 2003 ASEE Annual Conference and Exposition https://peer.asee.org/11934 .

Regev, G., Gause, D. C., & Wegmann, A. (2008). Requirements engineering education in the 21st century, an experiential learning approach . 2008 16th IEEE International Requirements Engineering Conference, Catalunya. https://doi.org/10.1109/RE.2008.28 .

Rockland, R., Hirsch, L., Burr-Alexander, L., Carpinelli, J. D., & Kimmel, H. S. (2013). Learning outside the classroom—Flipping an undergraduate circuits analysis course . Atlanta: Paper presented at the 2013 ASEE Annual Conference and Exposition https://peer.asee.org/19868 .

Schmidt, J. A., Rosenberg, J. M., & Beymer, P. N. (2018). A person-in-context approach to student engagement in science: Examining learning activities and choice. Journal of Research in Science Teaching , 55 (1), 19–43. https://doi.org/10.1002/tea.21409 .

Shekhar, P., Borrego, M., DeMonbrun, M., Finelli, C., Crockett, C., & Nguyen, K. (2020). Negative student response to active learning in STEM classrooms: A systematic review of underlying reasons. Journal of College Science Teaching , 49 (6) https://www.nsta.org/journal-college-science-teaching/journal-college-science-teaching-julyaugust-2020/negative-student .

Smith, K. A., Sheppard, S. D., Johnson, D. W., & Johnson, R. T. (2005). Pedagogies of engagement: Classroom-based practices. Journal of Engineering Education , 94 (1), 87–101. https://doi.org/10.1002/j.2168-9830.2005.tb00831.x .

Stains, M., Harshman, J., Barker, M., Chasteen, S., Cole, R., DeChenne-Peters, S., … Young, A. M. (2018). Anatomy of STEM teaching in North American universities. Science , 359 (6383), 1468–1470. https://doi.org/10.1126/science.aap8892 .

Stains, M., & Vickrey, T. (2017). Fidelity of implementation: An overlooked yet critical construct to establish effectiveness of evidence-based instructional practices. CBE Life Sciences Education , 16 (1). https://doi.org/10.1187/cbe.16-03-0113 .

Stump, G. S., Husman, J., & Corby, M. (2014). Engineering students' intelligence beliefs and learning. Journal of Engineering Education , 103 (3), 369–387. https://doi.org/10.1002/jee.20051 .

Tharayil, S., Borrego, M., Prince, M., Nguyen, K. A., Shekhar, P., Finelli, C. J., & Waters, C. (2018). Strategies to mitigate student resistance to active learning. International Journal of STEM Education , 5 (1), 7. https://doi.org/10.1186/s40594-018-0102-y .

Theobald, E. J., Hill, M. J., Tran, E., Agrawal, S., Arroyo, E. N., Behling, S., … Freeman, S. (2020). Active learning narrows achievement gaps for underrepresented students in undergraduate science, technology, engineering, and math. Proceedings of the National Academy of Sciences , 117 (12), 6476–6483. https://doi.org/10.1073/pnas.1916903117 .

Tolman, A., Kremling, J., & Tagg, J. (2016). Why students resist learning: A practical model for understanding and helping students . Stylus Publishing, LLC.

Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes . Harvard University Press.

Weimer, M. (2002). Learner-centered teaching: Five key changes to practice . Wiley.

Wigfield, A., & Eccles, J. S. (2000). Expectancy–value theory of achievement motivation. Contemporary Educational Psychology , 25 (1), 68–81. https://doi.org/10.1006/ceps.1999.1015 .

Winkler, I., & Rybnikova, I. (2019). Student resistance in the classroom—Functional-instrumentalist, critical-emancipatory and critical-functional conceptualisations. Higher Education Quarterly , 73 (4), 521–538. https://doi.org/10.1111/hequ.12219 .


Acknowledgements

We thank our collaborators, Charles Henderson and Michael Prince, for their early contributions to this project, including screening hundreds of abstracts and full papers. Thank you to Adam Papendieck and Katherine Doerr for their feedback on early versions of this manuscript. Finally, thank you to the anonymous reviewers at the International Journal of STEM Education for your constructive feedback.

This work was supported by the National Science Foundation through grant #1744407. Any opinions, findings, conclusions, or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.

Author information

Authors and affiliations

Hutchins School of Liberal Studies, Sonoma State University, Rohnert Park, CA, USA

Kevin A. Nguyen

Departments of Curriculum & Instruction and Mechanical Engineering, University of Texas, Austin, TX, USA

Maura Borrego & Sneha Tharayil

Departments of Electrical Engineering & Computer Science and Education, University of Michigan, 4413 EECS Building, 1301 Beal Avenue, Ann Arbor, MI, 48109, USA

Cynthia J. Finelli & Caroline Crockett

Enrollment Management Research Group, Southern Methodist University, Dallas, TX, USA

Matt DeMonbrun

School of Applied Engineering and Technology, New Jersey Institute of Technology, Newark, NJ, USA

Prateek Shekhar

Advanced Manufacturing and Materials, Naval Surface Warfare Center Carderock Division, Potomac, MD, USA

Cynthia Waters

Cabot Science Library, Harvard University, Cambridge, MA, USA

Robyn Rosenberg


Contributions

All authors contributed to the design and execution of this paper. KN, MB, and CW created the original vision for the paper. RR solicited, downloaded, and catalogued all studies for review. All authors contributed in reviewing and screening hundreds of studies. KN then led the initial analysis and creation of strategy codes. CF reviewed and finalized the analysis. All authors drafted, reviewed, and finalized sections of the paper. KN, MB, MD, and CC led the final review of the paper. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Cynthia J. Finelli.

Ethics declarations

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Cite this article

Nguyen, K.A., Borrego, M., Finelli, C.J. et al. Instructor strategies to aid implementation of active learning: a systematic literature review. IJ STEM Ed 8 , 9 (2021). https://doi.org/10.1186/s40594-021-00270-7


Received : 19 June 2020

Accepted : 18 January 2021

Published : 15 March 2021

DOI : https://doi.org/10.1186/s40594-021-00270-7


Keywords

  • Active learning
  • Systematic review
  • Instructor strategies
  • Student response



Title: A Survey of Deep Active Learning

Abstract: Active learning (AL) attempts to maximize a model's performance gain while labeling as few samples as possible. Deep learning (DL), by contrast, is data-hungry: it requires large amounts of data to optimize its massive number of parameters so that the model learns to extract high-quality features. The rapid development of internet technology has placed us in an era of information abundance with massive amounts of data, and DL has accordingly attracted strong interest from researchers and developed rapidly. Compared with DL, researchers have shown relatively little interest in AL, mainly because, before the rise of DL, traditional machine learning required relatively few labeled samples, so early AL struggled to demonstrate its full value. Although DL has made breakthroughs in various fields, most of this success owes to the availability of large annotated datasets. However, acquiring many high-quality annotated datasets consumes enormous manual effort, which is infeasible in fields that require high expertise, such as speech recognition, information extraction, and medical imaging. AL has therefore gradually received the attention it deserves. A natural question is whether AL can reduce the cost of sample annotation while retaining the powerful learning capabilities of DL; this combination has given rise to deep active learning (DAL). Although the related research is quite abundant, it lacks a comprehensive survey of DAL. This article fills that gap: we provide a formal classification of the existing work and a comprehensive, systematic overview. We also analyze and summarize the development of DAL from the perspective of applications. Finally, we discuss open problems in DAL and suggest possible directions for future development.
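
The loop the survey describes can be made concrete with a short sketch. Below is a minimal pool-based active learning routine using least-confidence sampling, with a logistic regression standing in for the deep model and scikit-learn assumed available; all function and variable names are illustrative, not from the survey.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def uncertainty_sampling_loop(X_pool, y_oracle, n_init=10, n_rounds=20, batch=5, seed=0):
    """Minimal pool-based active learning with least-confidence sampling.

    y_oracle stands in for the human annotator that AL tries to query
    sparingly; a deep AL setup would swap the linear model for a network.
    """
    rng = np.random.default_rng(seed)
    labeled = list(rng.choice(len(X_pool), size=n_init, replace=False))
    unlabeled = [i for i in range(len(X_pool)) if i not in labeled]
    model = LogisticRegression(max_iter=1000)
    for _ in range(n_rounds):
        model.fit(X_pool[labeled], y_oracle[labeled])
        proba = model.predict_proba(X_pool[unlabeled])
        # Least confidence: query the points whose top class probability is lowest.
        uncertainty = 1.0 - proba.max(axis=1)
        picked = [unlabeled[i] for i in np.argsort(uncertainty)[-batch:]]
        labeled.extend(picked)
        unlabeled = [i for i in unlabeled if i not in picked]
    return model, labeled
```

Each round spends the labeling budget only on the pool points the current model is least sure about, which is the annotation-cost saving the survey attributes to AL.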



Literature and Evidence to Support Active Learning


Evidence of Positive Impact

Bishop, C., Caston, M., & King, C. (2014). Learner-centered environments: Creating effective strategies based on student attitudes and faculty reflection. Journal of the Scholarship of Teaching and Learning, 14(3), 46-63. doi:  http://dx.doi.org/10.14434/josotl.v14i3.5065

“Abstract: Learner-centered environments effectively implement multiple teaching techniques to enhance students’ higher education experience and provide them with greater control over their academic learning. This qualitative study involves an exploration of the eight reasons for learner-centered teaching found in Terry Doyle’s 2008 book, Helping Students Learn in a Learner Centered Environment. Doyle’s principles were investigated through the use of surveys, student focus group interviews, and faculty discussions to discover a deeper understanding of the effects a “learner-centered” teaching environment has on long term learning in comparison to a “teacher-centered” learning environment. These data revealed five primary themes pertaining to student resistance to learner-centered environments. The results assisted in the development of strategies educators can adopt for creating a successful learner-centered classroom.”

Doyle, T., & Zakrajsek, T. (2013). The new science of learning: how to learn in harmony with your brain. First edition. Sterling, Virginia: Stylus.

Todd Zakrajsek is a widely published author and sought-after speaker in higher education and the Director of the Lilly Conferences for Evidence Based Teaching and Learning in higher education. He is also past Executive Director of the Center for Faculty Excellence at the University of North Carolina at Chapel Hill. The New Science of Learning summarizes the neuroscience that supports active learning.

Prince, M. (2004). Does active learning work? A review of the research. Journal of Engineering Education, 93(3), 223-231. Retrieved from  http://proxy.lib.umich.edu/login?url=http://search.proquest.com/docview/217960253?accountid=14667

This study examines the evidence for the effectiveness of active learning. It defines the common forms of active learning most relevant for engineering faculty and critically examines the core element of each method. It is found that there is broad but uneven support for the core elements of active, collaborative, cooperative, and problem-based learning.

Learning Theory

Anderson, L. W. (2001). A Taxonomy for learning, teaching, and assessing: a revision of Bloom's Taxonomy of educational objectives. Complete ed. New York: Longman.

A Taxonomy describes and classifies learning based on Bloom’s Taxonomy (1956) and provides insights for teaching and assessing. The text is written to guide teachers as they plan and prepare lessons and assessments to meet learning objectives.

Bransford, J. (2000). How people learn. Expanded ed. Washington, D.C.: National Academy Press.

How People Learn offers an overview of the research in cognitive science that explains how people learn. Chapter 1 provides the foundational knowledge that supports the practice of active learning.

Design Process

Barkley, E. F. (2010). Student engagement techniques: a handbook for college faculty. San Francisco: Jossey-Bass.

Student Engagement Techniques is a heavily relied-upon resource in many schools, and it is referenced often in literature and at conferences. This book provides practical activities intended to involve students in active learning. The book also offers a conceptual framework for understanding student engagement.

Weimer, M. (2013). Learner-Centered Teaching : Five key changes to practice. 2nd ed. San Francisco: Jossey-Bass.

MaryEllen Weimer provides articles and workshops for Magna Publications and Faculty Focus. MaryEllen is a leader in the field of effective teaching and learning in higher education. Her books provide concrete strategies and practices supported by learning theory. She focuses on the function of content (lower order skills outside of class; active learning in class) and the responsibility for learning (on learner). Her work also has an emphasis on assessment and classroom management.

Wiggins, G. P., & McTighe, J. (2005). Understanding by design. Expanded 2nd ed. Alexandria, VA: Association for Supervision and Curriculum Development.

Understanding by Design is a design framework developed by Grant Wiggins and Jay McTighe. The process is reflected in the Academic Technology Services planning guide. The “backward” design process is widely accepted and taught in schools of education.

Active Learning in Higher Education, Sage Publishing

Active Learning in Higher Education is an international, refereed publication for all those who teach and support learning in Higher Education and those who undertake or use research into effective learning, teaching and assessment in universities and colleges. The journal has an objective of improving the status of teaching and learning support as professional activity and embraces academic practice across all curriculum areas in higher education.


Active Blended Learning: Definition, Literature Review, and a Framework for Implementation

Introduction

With the growing use of technologies in educational interventions, approaches to learning and teaching have evolved to take place in different environments with a variety of strategies and techniques. Blended learning programmes have thus become pervasive within academic institutions (Adams Becker et al., 2017; Sharples et al., 2014). As these courses cater to a wide range of needs and lifestyles, they represent an attractive option for both traditional and non-traditional learners (Waha & Davis, 2014). Although researchers have largely reported non-significant differences, particularly in terms of student outcomes and satisfaction, blended learning courses have been found to be as effective or better overall than similar ones in other modes of study (Liu et al., 2016; Stockwell et al., 2015). Comparative studies often attempt to replicate teaching practices in face-to-face, blended and online settings. However, the combination of curriculum materials, pedagogy and learning time seems to create the real advantages (Means et al., 2010). The most effective blended courses enable students to learn in ways not feasible in other formats (Adams Becker et al., 2017).

Active learning is particularly useful for achieving a successful and rewarding educational experience. It can result in fewer failing students, higher performance in examinations (Freeman et al., 2014), enhanced problem-solving skills (Hake, 1998), critical thinking (Shin et al., 2014), increased attendance and learner satisfaction (Lumpkin et al., 2015; Stockwell et al., 2015). It can also reduce the attainment gap between disadvantaged and non-disadvantaged students (Haak et al., 2011). The move towards active learning makes classrooms resemble real-world work and social settings that foster cross-disciplinary interactions (Adams Becker et al., 2017). Students perceive that active classrooms promote creativity and innovation (Chiu & Cheng, 2016). When learners participate in active learning environments, they tend to outperform their peers in more traditional classroom settings (Cotner et al., 2013).

This chapter focuses on the joint implementation of blended and active learning to maximise the benefits of both approaches in higher education settings. We addressed three main areas:

Institutional definitions. We analysed the information available on public-facing university websites to establish a starting point for the study of these approaches.

Academic literature. We systematically reviewed the literature on active blended learning (ABL) published in indexed, peer-reviewed journals up to June 2020 to identify trends and patterns.

Framework for active blended learning. We present and describe an evidence-based framework to guide and scale up the implementation of ABL in higher education.

Institutional Definitions

Despite its widespread usage, it is surprisingly difficult to find a universal definition of blended learning. In their review of 97 articles relating to blended learning in higher education, Smith and Hill (2019) reported a lack of consistency and clarity in the literature. Perhaps the only consensus relates to the combination of online and face-to-face elements (e.g., Garrison & Vaughan, 2008). The nature of these components remains ambiguous, and could relate to content availability, teaching strategies, learning opportunities or social interactions. Thus, descriptions of blended learning can refer to the ratio between web-based and traditional provision (Allen et al., 2007; Sener, 2015), the delivery methods (Clayton Christensen Institute, 2017; Kim, 2017) or the pedagogy (Freeman Herreid & Schiller, 2013; Mapstone et al., 2014). This variance complicates the development of research and practice, emphasising the need for shared understandings (Smith & Hill, 2019).

Active learning in undergraduate classroom dental education: a scoping review

Arnaldo Perez

1 School of Dentistry, Faculty of Medicine & Dentistry, University of Alberta, Edmonton, Alberta, Canada

Jacqueline Green

Mohammad Moharrami

2 Faculty of Dentistry, University of Toronto, Toronto, Ontario, Canada

Silvia Gianoni-Capenakas

Maryam Kebbe

3 Faculty of Kinesiology, University of New Brunswick, Fredericton, New Brunswick, Canada

Seema Ganatra

4 Department of Pediatrics, Faculty of Medicine & Dentistry, University of Alberta, Edmonton, Alberta, Canada

Nazlee Sharmin

Associated Data

All relevant data are within the manuscript and its Supporting Information files.

Abstract

Previous reviews on active learning in dental education have not comprehensively summarized the research activity on this topic, as they have largely focused on specific active learning strategies. This scoping review aimed to map the breadth and depth of the research activity on active learning strategies in undergraduate classroom dental education.

The review was guided by Arksey & O’Malley’s multi-step framework and followed the PRISMA Extension for Scoping Reviews guidelines. MEDLINE, ERIC, EMBASE, and Scopus databases were searched from January 2005 to October 2022. Peer-reviewed, primary research articles published in English were selected. Reference lists of relevant studies were verified to improve the search. Two trained researchers independently screened titles, abstracts, and full-text articles for eligibility and extracted the relevant data.

In total, 93 studies were included in the review. All studies performed outcome evaluations, including reaction evaluation alone (n = 32; 34.4%), learning evaluation alone (n = 19; 20.4%), and reaction and learning evaluations combined (n = 42; 45.1%). Most studies used quantitative approaches (n = 85; 91.3%), performed post-intervention evaluations (n = 70; 75.3%), and measured student satisfaction (n = 73; 78.5%) and knowledge acquisition (n = 61; 65.6%) using direct and indirect (self-report) measures. Only 4 studies (4.3%) reported faculty data in addition to student data. Flipped learning, group discussion, problem-based learning, and team-based learning were the active learning strategies most frequently evaluated (≥6 studies). Overall, most studies found that active learning improved satisfaction and knowledge acquisition and was superior to traditional lectures based on direct and indirect outcome measures.

Active learning has the potential to enhance student learning in undergraduate classroom dental education; however, robust process and outcome evaluation designs are needed to demonstrate its effectiveness in this educational context. Further research is warranted to evaluate the impact of active learning strategies on skill development and behavioral change in order to support the competency-based approach in dental education.

Introduction

Active learning (AL) has been broadly defined as a type of learning that demands active gathering, processing, and application of information rather than passive assimilation of knowledge [1]. This form of learning is well aligned with principles of adult learning, including self-direction, purposefulness, grounding in experience, ownership, problem orientation, mentorship, and intrinsic motivation [2]. Because students regularly enroll in dental programs as young adults after completing an undergraduate degree, active learning has been encouraged in dental education to help students gain knowledge and develop basic and advanced dental, cognitive, and social skills [3]. Active learning, along with curricular integration, early exposure to clinical care, and evidence-based teaching and assessment, are important reforms introduced in dental education to ensure that students develop the competencies they need to become entry-level general dentists in the 21st century [4].

Numerous teaching strategies have been developed to promote active learning across health professions education, including problem-based learning, case-based learning, flipped learning, team-based learning, and group discussion. Research suggests that students and instructors positively value active learning [5, 6]; however, inconclusive evidence exists on the actual impact of active learning on knowledge acquisition, skill development, and attitudinal change in health sciences education [7, 8].

Many studies have been conducted on active learning in dental education, especially in the last two decades. Some primary and review studies have found that active learning is well received by students and instructors and may be more effective than traditional lecture-based teaching in dental education [9, 10]. However, review studies, in particular, have fallen short of providing a comprehensive overview of the existing literature on active learning in dental education [9, 11, 12]. For example, they have largely focused on the outcomes of a few active learning strategies (e.g., problem-based learning, flipped learning), providing limited data on their implementation and evaluation designs. These review studies have also failed to differentiate the scope, range, and nature of the research activity on active learning in different learning environments, including classroom dental education. This learning environment has unique characteristics and is of particular importance because it provides the foundational knowledge that students are expected to apply in laboratory and clinical settings.

Our scoping review aimed to map the breadth and depth of the research activity on active learning strategies in undergraduate classroom dental education from January 2005 to October 2022. Mapping this extensive body of literature is important to inform future research directions on active learning in dental education.

Methods

The scoping review framework developed by Arksey & O’Malley (2005) guided the study design, which includes the following stages: (1) formulating research questions, (2) identifying potentially relevant studies, (3) selecting relevant studies, (4) charting the data, and (5) collating, summarizing, and reporting results [13]. Unlike systematic reviews, which typically synthesize the existing evidence on relationships between exposure and outcome variables, scoping reviews are well suited to map the breadth and depth of the research activity on complex topics and identify gaps in the relevant literature [13]. Our review report followed the guidelines of the PRISMA Extension for Scoping Reviews [14].

Stage 1: Formulating research questions

Our scoping review sought to answer the following questions:

  • What are the characteristics of the studies conducted on active learning in classroom dental education in the study period?
  • How were active learning strategies evaluated?
  • What were the main results of the studies conducted?

Stage 2: Identifying potentially relevant studies

Four databases (MEDLINE, ERIC, EMBASE, and Scopus) were searched from January 2005 to October 2022. A preliminary search suggested that most studies on the topic were published in the last two decades and that the quality of their reports had substantially improved over the same period. The search strategy for MEDLINE was developed by two authors (JG and AP) in consultation with a librarian at the University of Alberta. This strategy was then adapted for each database included in the review. Search terms used in each database are shown in Table 1. Reference lists of included studies and of articles selected in previous reviews on specific active learning strategies were verified to enhance the search and test its sensitivity.

Stage 3: Selecting relevant studies

Inclusion and exclusion criteria were based on the research questions and refined during the screening process. Primary studies published in English were included if they met the following criteria: (i) focused on undergraduate dental education in classroom settings, (ii) used at least one active learning strategy, (iii) involved dental students, and (iv) reported dental student data when students from other programs (e.g., medical students) were involved in the study. Studies were excluded if they were published in a language other than English, reported active learning in clinical or laboratory settings or at the program level, or were not available as full-text articles. Review studies and perspective articles were also excluded. No restrictions were set on research methods. All references were exported to Zotero and duplicates were removed by JG. The remaining papers were then exported to Rayyan. A training session was held to ensure understanding of inclusion and exclusion criteria and consistency in their application. Two researchers (JG and SGC) independently screened titles and abstracts, and three researchers independently reviewed the full texts of articles selected in the first phase of screening (JG, SGC, MM). Consensus was obtained by discussion or by consulting a fourth reviewer (AP).
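As a rough illustration of the deduplication step (performed in Zotero in the review itself), the Python sketch below removes duplicate records by DOI, falling back to a normalised title; the record fields and normalisation rule are our assumptions, not the review's actual tooling.

```python
# Illustrative sketch only: the review used Zotero for deduplication.
# Record fields ("doi", "title") and the normalisation rule are assumed.

def normalise(title: str) -> str:
    """Lowercase a title and strip non-alphanumeric characters."""
    return "".join(ch for ch in title.lower() if ch.isalnum())

def deduplicate(records: list[dict]) -> list[dict]:
    """Keep the first record per DOI, or per normalised title if no DOI."""
    seen: set[str] = set()
    unique = []
    for rec in records:
        key = rec.get("doi") or normalise(rec.get("title", ""))
        if key and key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

records = [
    {"doi": "10.1000/x1", "title": "Flipped learning in dentistry"},
    {"doi": "10.1000/x1", "title": "Flipped Learning in Dentistry"},  # duplicate
    {"title": "PBL and dental test scores"},
]
print(len(deduplicate(records)))  # -> 2
```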

Stage 4: Charting the data

A piloted, literature-informed data collection form was used to extract data on publication (year of publication, country of publication), study characteristics (research inquiry, research methodology, means of data collection), participant characteristics (type of student, sample size), intervention (content area, active learning strategy, comparator, and length of the exposure), evaluation (type of evaluation, level of evaluation, evaluation design, and outcome of interest) and main findings. Data extraction was completed independently by two trained researchers (JG and MM) and the completed data extraction forms were compared. Consensus was obtained by discussion or consulting a third reviewer (AP). Authors of studies that did not report key aspects included in the data extraction form were contacted to provide that information. Missing information was then categorized as “not reported.”
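As a sketch of how the extraction form's categories might be modelled in code, the hypothetical record type below mirrors the fields listed above; the field names and defaults are our reconstruction, not the authors' actual instrument.

```python
# Hypothetical model of the data-extraction form described above.
from dataclasses import dataclass, field

@dataclass
class ExtractionRecord:
    # Publication
    year: int
    country: str
    # Study characteristics
    research_inquiry: str               # e.g. "quantitative"
    research_methodology: str           # "not reported" when missing
    data_collection: list[str] = field(default_factory=list)
    # Participant characteristics
    student_type: str = "dental"
    sample_size: int = 0
    # Intervention
    content_area: str = "not reported"
    strategy: str = "not reported"      # e.g. "flipped learning"
    comparator: str = "not reported"    # e.g. "traditional lecture"
    exposure_length: str = "not reported"
    # Evaluation
    evaluation_type: str = "outcome"
    evaluation_level: str = "reaction"  # Kirkpatrick level
    evaluation_design: str = "post-intervention"
    outcomes: list[str] = field(default_factory=list)
    main_findings: str = ""

rec = ExtractionRecord(year=2020, country="Canada",
                       research_inquiry="quantitative",
                       research_methodology="not reported",
                       strategy="flipped learning",
                       outcomes=["satisfaction", "knowledge acquisition"])
```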

Stage 5: Collating, summarizing, and reporting results

Descriptive statistics were used to summarize quantifiable data using previously developed or data-driven classifications. Evaluation data such as level, outcomes (directly and indirectly measured), and results were summarized according to Kirkpatrick’s Model (1998) [15]. This model suggests four levels of outcome evaluation: reaction (satisfaction and perceived outcomes), learning (direct measures of outcomes such as knowledge, skills, and attitudes), behavior (behavioral changes resulting from the intervention), and results (organizational changes resulting from the intervention). This model is widely used to describe evaluations of educational interventions in a variety of contexts. Papers reporting more than one outcome level and active learning strategy were classified separately to calculate the number of evaluations per level and per active learning strategy, respectively.
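A minimal sketch of the Kirkpatrick classification as applied here, assuming the outcome-to-level rule implied by the text (satisfaction and self-reported outcomes at the reaction level, directly measured outcomes at the learning level); the helper is a hypothetical simplification.

```python
# Kirkpatrick's (1998) four levels, per the description above.
KIRKPATRICK_LEVELS = {
    "reaction": "satisfaction and perceived (self-reported) outcomes",
    "learning": "directly measured knowledge, skills, and attitudes",
    "behavior": "behavioral change resulting from the intervention",
    "results": "organizational change resulting from the intervention",
}

def classify_outcome(outcome: str, self_reported: bool) -> str:
    """Assign an outcome measure to a Kirkpatrick level (levels 1-2 only,
    since the review found no behavior- or results-level evaluations)."""
    if self_reported or outcome == "satisfaction":
        return "reaction"
    return "learning"

print(classify_outcome("knowledge acquisition", self_reported=False))  # learning
```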

Results

Searches in EMBASE (n = 1200), MEDLINE (n = 422), Scopus (n = 464), and ERIC (n = 132) databases generated 2,218 records. Duplicates (n = 808) and articles not published in English (n = 47) were removed. The screening of titles and abstracts yielded 273 potentially eligible articles, and the screening of full texts identified 93 eligible articles, which were included in this review (Fig 1). No additional articles were identified through checking the reference lists of eligible studies and of studies included in previous reviews. A total of 10,473 students and 199 faculty were involved in the selected studies. Students involved were from dentistry (n = 10,297; 98.3%), medicine (n = 126; 1.2%), and dental hygiene (n = 50; 0.5%).
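The selection flow can be checked with a few lines of arithmetic; numbers are taken verbatim from the paragraph above, and the intermediate count of records screened (1,363) is derived rather than reported.

```python
# Quick arithmetic check of the selection flow (variable names are ours).
retrieved = 1200 + 422 + 464 + 132     # EMBASE + MEDLINE + Scopus + ERIC
assert retrieved == 2218
screened = retrieved - 808 - 47        # minus duplicates and non-English
after_title_abstract = 273             # potentially eligible articles
included = 93                          # after full-text screening
print(screened, after_title_abstract, included)  # 1363 273 93
```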

[Fig 1: study selection flow diagram (pone.0293206.g001.jpg)]

Characteristics of reviewed studies

As shown in Table 2, selected studies originated from different geographical areas, including Asia (n = 46; 49.4%), North America (n = 29; 31.1%), Europe (n = 10; 10.7%), South America (n = 6; 6.4%), Australia (n = 1), and Africa (n = 1). Twenty-eight of the studies produced in North America were conducted in the United States and 1 in Canada. Thirty-one studies were published between 2005 and 2014 and 62 between 2015 and 2022. Nine studies (9.6%) did not indicate the content area. Most studies reported active learning in clinical (n = 54; 58%) and basic (n = 25; 26.8%) sciences, and only 5 (5.3%) in behavioral and social sciences.

(Table 2 abbreviations: TBL = team-based learning; RCT = randomized controlled trial; PBL = problem-based learning.)

Methodologically, most studies (n = 85; 91.3%) were quantitative in nature. Only a few used qualitative (n = 2; 2.1%) and mixed-method (n = 6; 6.4%) approaches. Most studies (n = 67; 72%) did not explicitly report the methodology used, and some (n = 8; 8.6%) reported only features of the methodology employed (e.g., prospective, comparative). Reported quantitative methods (n = 26; 27.9%) included pre- and post-tests (n = 6), randomized controlled trials (n = 4), cross-sectional studies (n = 3), cohort studies (n = 2), qualitative description (n = 1), case-control studies (n = 1), and experiments without randomization (n = 1). Two reported randomized controlled trials did not describe sequence generation, none reported allocation concealment details, and only 1 reported blinding of outcome assessors. The most common means of data collection were surveys (n = 74; 79.5%) and test scores (n = 59; 63.4%), alone or combined.
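One note on the figures in this results section: percentages are computed against the 93 included studies and appear to be truncated, not rounded, to one decimal place (e.g., 85/93 = 91.39…% is reported as 91.3%). A small helper reproducing that apparent convention, as a sketch:

```python
TOTAL = 93  # included studies

def n_pct(n: int, total: int = TOTAL) -> str:
    """Format a count in the paper's 'n = x; y.y%' style.
    Reported figures appear truncated (not rounded) to one decimal."""
    pct = int(1000 * n / total) / 10  # truncate to one decimal place
    return f"n = {n}; {pct}%"

print(n_pct(85))  # n = 85; 91.3%  (85/93 = 91.39...)
print(n_pct(70))  # n = 70; 75.2%  (70/93 = 75.26...)
```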

Evaluation types and designs

All studies performed outcome evaluations. No process evaluations were reported alone or combined with outcome evaluations. Outcomes evaluated included satisfaction (n = 73), knowledge acquisition (n = 61), skill development (e.g., clinical, problem-solving, communication skills) (n = 3), and engagement (n = 2). Studies performed post-intervention (n = 70; 75.2%), pre-post-intervention (n = 18; 19.3%), and during-post-intervention (n = 5; 5.3%) evaluations.

Of all the evaluations performed (n = 93), post-intervention evaluations (n = 70) included a single group exposed to one condition (n = 23; 24.7%) or two compared conditions (n = 9; 9.6%); two compared groups exposed to two conditions, including (n = 10; 10.7%) and not including (n = 21; 22.5%) randomization; and two or more non-compared groups exposed to one condition, with one-time (n = 6; 6.4%) or two-time (n = 1; 1.07%) evaluation points. With a one-time evaluation point, the outcome variables of interest were evaluated after the intervention, whereas with two-time evaluation points, the outcome variables were evaluated after the intervention by asking participants to assess those variables both before and after the intervention. In both cases, the evaluation data of the study groups were aggregated. Pre- and post-intervention evaluations (n = 18) included a single group exposed to one condition (n = 4; 4.3%) or two compared conditions (n = 1; 1.07%); two compared groups exposed to two conditions, including (n = 8; 8.6%) and not including (n = 4; 4.3%) randomization; and two or more non-compared groups exposed to one condition with a one-time evaluation point (n = 1; 1.07%). During-post-intervention evaluations (n = 5) included a single group exposed to one condition (n = 1; 1.07%) or two compared conditions (n = 1; 1.07%) and two compared groups exposed to two conditions, including (n = 1; 1.07%) and not including (n = 2; 2.1%) randomization.
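Because the design breakdown above is dense, the sketch below tallies the counts (labels are our shorthand; counts are verbatim) and checks that they sum to the reported totals of 70, 18, 5, and 93.

```python
# Tally check for the evaluation-design breakdown (counts from the text).
post = {"single group, one condition": 23, "single group, two conditions": 9,
        "two groups, randomized": 10, "two groups, non-randomized": 21,
        "non-compared groups, one-time": 6, "non-compared groups, two-time": 1}
pre_post = {"single group, one condition": 4, "single group, two conditions": 1,
            "two groups, randomized": 8, "two groups, non-randomized": 4,
            "non-compared groups, one-time": 1}
during_post = {"single group, one condition": 1, "single group, two conditions": 1,
               "two groups, randomized": 1, "two groups, non-randomized": 2}
assert sum(post.values()) == 70
assert sum(pre_post.values()) == 18
assert sum(during_post.values()) == 5
assert 70 + 18 + 5 == 93
```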

Evaluated active learning strategies

Studies evaluated several active learning strategies. Strategies frequently (more than 10 studies) and fairly (between 6 and 10 studies) evaluated included flipped learning, group discussion, problem-based learning (PBL), and team-based learning (TBL). Blended learning, peer teaching, debate, and role play were occasionally evaluated (between 3 and 5 studies). Strategies seldom evaluated (1 or 2 studies) included games, think-pair-share, and others such as fishbowl and Jigsaw. All outcome evaluations were performed at the reaction and learning levels, as the present review focused on classroom dental education. Thirty-two studies (34.4%) performed reaction evaluations alone, 19 (20.4%) learning evaluations alone, and 42 (45.1%) reaction and learning evaluations combined. Only 4 studies (4.3%) reported faculty data in addition to student data. The lengths of the exposures to active learning ranged from one hour to three years.
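The frequency bands used here and in the following subsections reduce to a small lookup, sketched below with the thresholds exactly as stated (the function name is ours).

```python
def frequency_label(n_studies: int) -> str:
    """Map a count of evaluating studies to the review's frequency bands."""
    if n_studies > 10:
        return "frequently evaluated"
    if 6 <= n_studies <= 10:
        return "fairly evaluated"
    if 3 <= n_studies <= 5:
        return "occasionally evaluated"
    if 1 <= n_studies <= 2:
        return "seldom evaluated"
    return "not evaluated"

print(frequency_label(16))  # frequently evaluated (e.g., flipped learning)
```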

Reaction-level evaluations, including self-reported learning

Seventy-six student reaction evaluations, alone or combined, were conducted. In these evaluations, active learning was perceived to improve satisfaction in 66 studies (86.8%) and knowledge acquisition in 4 studies (5.3%). Sixty-five of these evaluations compared active learning and lectures, 3 compared two active learning strategies, and 3 compared different forms of the same active learning strategy. Fifty-nine studies perceived active learning as superior to lectures, 5 found no differences between active learning and lectures, and only 1 reported lectures as superior to active learning. Only 4 evaluations reported instructors’ reaction data; in all of them, instructors positively valued active learning.

Frequently, fairly, and occasionally evaluated strategies (three or more studies) using reaction-level data included flipped learning, PBL, group discussion, TBL, and blended learning. Peer teaching, role play, debate, games, and think-pair-share were seldom evaluated (1 or 2 studies) using reaction data. Flipped learning was perceived to improve satisfaction in 16 studies and was regarded as superior to lectures in 16 studies. PBL was viewed as effective in improving knowledge acquisition in 2 studies and satisfaction in 13 studies and was perceived as superior to lectures in 13 studies. Group discussion was deemed effective for knowledge acquisition in 1 study and satisfaction in 12 studies and was reported to be superior to lectures in 12 studies. TBL was viewed as beneficial for improving knowledge in 1 study and satisfaction in 7 studies and was considered more effective than lectures in 7 studies. Blended learning was deemed to improve satisfaction in 4 studies and was regarded as superior to lectures in 4 studies.

Learning-level evaluations

All studies in which learning was directly measured (n = 57) found that active learning was effective in improving knowledge acquisition, largely based on test scores. Forty-eight of these studies (84.2%) compared active learning and lectures, and 4 studies (7.0%) compared two active learning strategies. Based on the learning data, 39 studies found that active learning was superior to lectures in knowledge acquisition and 9 reported no difference between active learning and traditional lectures.

Frequently and fairly evaluated strategies using direct measures of learning included flipped learning, PBL, group discussion, and TBL. Blended learning, peer teaching, debate, game, and think-pair-share were rarely evaluated using such measures. Based on direct learning data, flipped learning was found to improve knowledge acquisition in 12 studies and to be more effective than lectures in knowledge acquisition in all 12 studies. Similarly, PBL was found to enhance knowledge acquisition in 9 studies and to be superior to traditional teaching in knowledge acquisition in all 9 studies. Direct learning data also supported the effectiveness of group discussion and TBL. Specifically, group discussion and TBL were found to improve knowledge acquisition in 5 and 7 studies, respectively. Regarding this outcome, group discussion was reported to be more effective than lectures in 5 studies and TBL in 7 studies.

Discussion

Most studies on active learning in classroom dental education were quantitative in nature and published in the last decade; they typically did not report the study methodology, performed post-intervention outcome evaluations, relied on student data, mainly measured satisfaction and knowledge acquisition, and focused on clinical and basic sciences. Our review also revealed that flipped learning, group discussion, problem-based learning, and team-based learning were the active learning strategies most frequently evaluated in classroom dental education. Based on both reaction and factual (direct measure) data, these strategies improved satisfaction and knowledge acquisition and were superior to traditional lectures in improving these outcomes. To our knowledge, this is the first attempt to map the literature on active learning strategies in classroom dental education. Our findings provide a much-needed overview of this body of literature, which previous strategy-specific reviews were not in a position to provide [10, 16, 17]. Such an overview is of critical importance to describe the available evidence and inform future research directions on the study topic.

Consistent with the data from previous reviews, the number of studies on active learning in dental education has increased over time, especially within the last decade [9, 16, 18]. This shows a positive response to repeated calls for transforming the learning environments in dental education. This surge of publications is encouraging as a proxy for innovation in dental education and as a vehicle for knowledge dissemination among dental researchers and educators. In research, though, a greater number of publications does not necessarily mean better research. Although scoping reviews are not intended to assess the quality of the studies conducted or the credibility of the evidence generated [13], they can shed light on these issues based on the research methods and designs employed and the nature of the evidence produced. The quality of research on educational innovations can also be inferred by examining the types of evaluations conducted.

Most studies included in our review did not explicitly indicate the methodology used, which previous review research in medical education has also reported [19]. This is of concern, as methodologies are supposed to be deliberately chosen to inform study designs [20]. We did not assess whether the reported methodologies were correctly classified; however, misclassifications of study methodologies have been documented [21, 22]. Such misclassifications may be due to a lack of methodological understanding or to attempts to pursue methodological credibility by claiming the use of “more robust” designs than those actually employed [22]. Several recommendations have been made to help researchers frame their projects methodologically and conceptually, including the engagement of methodologists throughout the research process [19].

Many studies included in our review employed a post-intervention evaluation design with a single cohort. This design has several limitations, such as the inability to assess the magnitude of the improvement, if any, and to account for extraneous variables that may influence the learning outcomes apart from the intervention. Additionally, none of the studies included in our review reported a process evaluation. This type of evaluation examines the extent to which an intervention was implemented as expected, met the parameters of effectiveness for the intervention (the conditions under which it works), and was aligned with the underlying principles of the type of learning (e.g., collaborative learning) it aimed to promote [23]. Process evaluations are particularly helpful to determine whether an intervention did not work because it was ineffective, because it was poorly implemented, or both. Failures to report process evaluations and to properly design and implement active learning strategies have been previously documented [6]. Such shortcomings can be misleading in two fundamental ways: suggesting that a strategy was not effective when it could potentially be, and suggesting it was delivered as expected when it was not.

Our findings highlight the importance of reporting not only the research inquiries (e.g., quantitative, qualitative) and methodologies (e.g., cross-sectional, RCT), but also the specific evaluation designs employed in the studies. Since methodologies may not be reported or properly classified, the specific evaluation design used becomes the best proxy for the quality of the outcome evaluation performed. This aspect should be determined by the researchers conducting the review because it may not be clearly defined in published papers. Our classification of evaluation designs can be used for this purpose, although further research may be needed to ascertain its instrumental value.

Few studies in our review used qualitative or mixed-method designs, which best practices in curriculum evaluation at the course and program levels recommend [24]. Such practices include using multiple evaluators, collecting and combining qualitative and quantitative data to provide a comprehensive evaluation, and using an evaluation framework (e.g., a logic model) to guide the evaluation process. Qualitative research is particularly suited to shed light on the circumstances under which interventions work (why and how) and on the contextual factors shaping the outcomes of interventions and participants’ experiences [25].

Reviews on active learning in dental and medical education have revealed that active learning strategies are commonly evaluated using student feedback [6, 9]. Our study confirms the use of student feedback as the main source of evaluation, which is useful for judging some aspects of teaching effectiveness, such as engagement and organization, but not others, such as the appropriateness of the pedagogical strategy used to achieve the learning objectives [26]. Faculty feedback is important to comprehensively evaluate active learning across health professions education and to ascertain instructors’ uptake and continued use of active learning strategies in classroom and clinical learning environments.

Similar to previous review research on active learning across health professions education [5], many studies included in our review used reaction-level and factual data to evaluate the impact of several active learning strategies on the outcomes of interest, especially knowledge acquisition. This is an important strength of the literature on active learning in classroom dental education. Reaction- and learning-level outcome evaluations serve slightly different purposes, but both are needed to establish whether students and faculty are satisfied with the active learning strategies used and what the actual impact of those strategies is on knowledge acquisition, skill development, and attitudinal change. Further research is needed to critically appraise the validity of the means used to collect direct measures of learning, especially when knowledge tests were not originally developed and validated for research purposes.

Satisfaction and knowledge acquisition were the main outcomes evaluated in the included studies, while skill development (e.g., critical thinking, problem-solving skills) was infrequently considered. The latter is an important learning outcome in the context of competency-based education, which has been highly recommended in dental education [27]. The failure to measure whether active learning promotes important skills in this type of education may be due to the length and nature of the exposures (interventions) needed to achieve these outcomes and to the “inherent” difficulties in measuring high-level outcomes [28].

Research on active learning in classroom dental education reflects the emphasis that traditional dental programs place on basic and clinical sciences. We identified only a few papers on active learning in behavioral and social sciences, which are a key component of dental education. These sciences have expanded the understanding of diseases beyond their biological determinants and that of treatment and management beyond clinical procedures [29]. Additionally, behavioral sciences provide dental students with competencies for personalized care, inter-professional care, disease prevention and management, and the personal well-being of patients and care providers [30]. While integrating the behavioral science curriculum remains an important task [31], our findings suggest that research is warranted to demonstrate the effectiveness of active learning in delivering behavioral science content in dental education.

The active learning strategies most frequently evaluated in classroom dental education (flipped learning, group discussion, PBL, TBL) are similar to those commonly evaluated in dental and medical education [5, 9, 32]. Properly evaluated strategies provide dental educators with a menu of teaching options from which to choose the most suitable strategy(ies) to achieve their learning objectives. However, other active learning strategies (e.g., peer teaching, role play, think-pair-share) need to be further evaluated in dental education, as they have proven effective in achieving certain learning objectives alone or in conjunction with other strategies [33, 34].

Despite the diversity of research designs, populations, settings, and evaluated strategies, active learning in classroom dental education was positively valued by students and faculty, was perceived as beneficial and ‘proven’ effective in promoting satisfaction and knowledge acquisition, and was found to be superior to traditional lectures in promoting these outcomes. These findings are consistent with those of previous reviews in dental education and health professions education in general [6, 9, 16]. Given the limitations of traditional lectures in promoting deep and meaningful learning, dental researchers are encouraged to compare active learning strategies against similar generic and specific learning objectives in order to demonstrate their relative effectiveness in achieving those objectives.

Our review also uncovered several reporting issues. These issues included not reporting or underreporting the research methodology, key aspects (e.g., allocation concealment) of the research design, characteristics of the instruments used for data collection, validity evidence of those instruments, active learning strategies employed, and length of the exposure to those strategies. Sufficient details of studies’ designs and conduct are important to judge the quality of the studies and that of the evidence produced. For example, without knowing the actual length of the exposure, it is not possible to appraise whether the expected learning outcomes were not achieved because the strategy used was not effective or because the exposure to the strategy was insufficient.

The limitations of our study encompass general limitations of scoping reviews and study-specific limitations. General limitations include the potential for publication bias (published literature often prioritizes studies with significant findings over those with non-significant findings) and the absence of quality assessments for the included studies. While such an assessment is not required in scoping reviews, it is important to note that the research designs of most included studies do not offer sufficient evidence to demonstrate the effectiveness of active learning in dental education classrooms. Several study-specific limitations need to be acknowledged. We relied on authors’ classifications of research methodologies and active learning strategies, which may not reflect the actual methods and strategies used. Misclassification of active learning strategies has been previously reported [17]. We excluded papers in languages other than English due to limited resources for translation, which may affect the generalizability of our findings. However, based on the number of papers included, we are confident that the inclusion of this literature would not have changed the patterns observed in the extracted data. Our summary of the main results of previous studies by level of outcome evaluation (reaction and learning) may not account for noticeable differences in study design, sample, settings, and measures across studies.

Conclusions

Although active learning strategies were positively valued and found effective using reaction and factual data, robust evaluation designs are needed to further demonstrate their effectiveness in classroom dental education. Aside from effectiveness questions, other issues remain to be elucidated, including for whom, how, when, and in what respect active learning may work in dental education. Future research should evaluate the impact of active learning strategies not only on satisfaction and knowledge acquisition but also on skill development, to support competency-based teaching and assessment in dental education. Similarly, active learning should be used and evaluated across all the main components of dental education, including behavioral and social sciences. Dental education journals should encourage researchers to comply with evaluation and reporting standards for educational innovations to ensure that these innovations are designed, conducted, and reported as expected.

Supporting information

S1 Checklist.

Acknowledgments

The authors would like to thank Drs. Tania Doblanko and Tanushi Ambekar for their involvement in the study design and preliminary search for relevant articles.

Funding Statement

The authors received funding (SDERF-02) for this work from the School of Dentistry at the University of Alberta. SG was the PI. The funder had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

PLoS One. 2023; 18(10): e0293206.
