Classroom Q&A

With Larry Ferlazzo

In this EdWeek blog, an experiment in knowledge-gathering, Ferlazzo will address readers’ questions on classroom management, ELL instruction, lesson planning, and other issues facing teachers. Send your questions to [email protected]. Read more from this blog.

Eight Instructional Strategies for Promoting Critical Thinking

(This is the first post in a three-part series.)

The new question-of-the-week is:

What is critical thinking and how can we integrate it into the classroom?

This three-part series will explore what critical thinking is, whether it can be specifically taught and, if so, how teachers can do so in their classrooms.

Today’s guests are Dara Laws Savage, Patrick Brown, Meg Riordan, Ph.D., and Dr. PJ Caposey. Dara, Patrick, and Meg were also guests on my 10-minute BAM! Radio Show. You can also find a list of, and links to, previous shows here.

You might also be interested in The Best Resources On Teaching & Learning Critical Thinking In The Classroom.

Current Events

Dara Laws Savage is an English teacher at the Early College High School at Delaware State University, where she serves as a teacher, instructional coach, and lead mentor. Dara has been teaching for 25 years (career preparation, English, photography, yearbook, newspaper, and graphic design) and has presented nationally on project-based learning and technology integration:

There is so much going on right now and there is an overload of information for us to process. Did you ever stop to think how our students are processing current events? They see news feeds, hear news reports, and scan photos and posts, but are they truly thinking about what they are hearing and seeing?

I tell my students that my job is not to give them answers but to teach them how to think about what they read and hear. So what is critical thinking and how can we integrate it into the classroom? There are just as many definitions of critical thinking as there are people trying to define it. However, the Critical Thinking Consortium focuses on the tools to create a thinking-based classroom rather than on a definition: “Shape the climate to support thinking, create opportunities for thinking, build capacity to think, provide guidance to inform thinking.” Using these four criteria and pairing them with current events, teachers can easily create learning spaces that thrive on thinking and keep students engaged.

One successful technique I use is the FIRE Write. Students are given a quote, a paragraph, an excerpt, or a photo from the headlines. Students are asked to Focus and respond to the selection for three minutes. Next, students are asked to Identify a phrase or section of the photo and write for two minutes. Third, students are asked to Reframe their response around a specific word, phrase, or section within their previous selection. Finally, students Exchange their thoughts with a classmate. Within the exchange, students also talk about how the selection connects to what we are covering in class.

There was a controversial Pepsi ad in 2017 involving Kendall Jenner and a protest with a police presence. The imagery in the ad was strikingly similar to a photo that went viral of a young woman standing opposite a police line. Using that image from a current event engaged my students and gave them the opportunity to think critically about events of the time.

Here are the two photos and a student response:

F - Focus on both photos and respond for three minutes

In the first picture, you see a strong and courageous black female, bravely standing in front of two officers in protest. She is risking her life to do so. Iesha Evans is simply proving to the world she does NOT mean less because she is black … and yet officers are there to stop her. She did not step down. In the picture below, you see Kendall Jenner handing a police officer a Pepsi. Maybe this wouldn’t be a big deal, except this was Pepsi’s weak, pathetic, and outrageous excuse of a commercial that belittles the whole movement of people fighting for their lives.

I - Identify a word or phrase, underline it, then write about it for two minutes

A white, privileged female in place of a fighting black woman was asking for trouble. A struggle we are continuously fighting every day, and they make a mockery of it. “I know what will work! Here Mr. Police Officer! Drink some Pepsi!” As if. Pepsi made a fool of themselves, and now their already dwindling fan base continues to ever shrink smaller.

R - Reframe your thoughts by choosing a different word, then write about that for one minute

You don’t know privilege until it’s gone. You don’t know privilege while it’s there—but you can and will be made accountable and aware. Don’t use it for evil. You are not stupid. Use it to do something. Kendall could’ve NOT done the commercial. Kendall could’ve released another commercial standing behind a black woman. Anything!

Exchange - Remember to discuss how this connects to our school song project and our previous discussions.

This connects two ways - 1) We want to convey a strong message. Be powerful. Show who we are. And Pepsi definitely tried. … Which leads to the second connection. 2) Not mess up and offend anyone, as had the one alma mater had been linked to black minstrels. We want to be amazing, but we have to be smart and careful and make sure we include everyone who goes to our school and everyone who may go to our school.

As a final step, students read and annotate the full article and compare it to their initial response.

Using current events and critical-thinking strategies like FIRE writing helps create a learning space where thinking is the goal rather than a score on a multiple-choice assessment. Critical-thinking skills can cross over to any of students’ other courses and into life outside the classroom. After all, we as teachers want to help the whole student be successful, and critical thinking is an important part of navigating life after they leave our classrooms.


‘Before-Explore-Explain’

Patrick Brown is the executive director of STEM and CTE for the Fort Zumwalt school district in Missouri and an experienced educator and author:

Planning for critical thinking focuses on teaching the most crucial science concepts, practices, and logical-thinking skills as well as the best use of instructional time. One way to ensure that lessons maintain a focus on critical thinking is to focus on the instructional sequence used to teach.

Explore-before-explain teaching is all about promoting critical thinking for learners to better prepare students for the reality of their world. What having an explore-before-explain mindset means is that in our planning, we prioritize giving students firsthand experiences with data, allow students to construct evidence-based claims that focus on conceptual understanding, and challenge students to discuss and think about the why behind phenomena.

Just think of the critical thinking that has to occur for students to construct a scientific claim. 1) They need the opportunity to collect data, analyze it, and determine how to make sense of what the data may mean. 2) With data in hand, students can begin thinking about the validity and reliability of their experience and the information collected. 3) They can consider what differences, if any, they might find if they completed the investigation again. 4) They can scrutinize outlying data points, which may be an artifact of a true difference that merits further exploration or of a misstep in the procedure, measuring device, or measurement. All of these intellectual activities help them form a more robust understanding and are evidence of their critical thinking.

In explore-before-explain teaching, all of these hard critical-thinking tasks come before teacher explanations of content. Whether we use discovery experiences, problem-based learning, or inquiry-based activities, strategies that are geared toward helping students construct understanding promote critical thinking because students learn content by doing the practices valued in the field to generate knowledge.


An Issue of Equity

Meg Riordan, Ph.D., is the chief learning officer at The Possible Project, an out-of-school program that collaborates with youth to build entrepreneurial skills and mindsets and provides pathways to careers and long-term economic prosperity. She has been in the field of education for over 25 years as a middle and high school teacher, school coach, college professor, regional director of N.Y.C. Outward Bound Schools, and director of external research with EL Education:

Although critical thinking often defies straightforward definition, most in the education field agree it consists of several components: reasoning, problem-solving, and decisionmaking, plus analysis and evaluation of information, such that multiple sides of an issue can be explored. It also includes dispositions and “the willingness to apply critical-thinking principles, rather than fall back on existing unexamined beliefs, or simply believe what you’re told by authority figures.”

Despite variation in definitions, critical thinking is nonetheless promoted as an essential outcome of students’ learning—we want to see students and adults demonstrate it across all fields, professions, and in their personal lives. Yet there is simultaneously a rationing of opportunities in schools for students of color, students from under-resourced communities, and other historically marginalized groups to deeply learn and practice critical thinking.

For example, many of our most underserved students often spend class time filling out worksheets, which promotes high compliance but little engagement, inquiry, critical thinking, or creation of new ideas. At a time in our world when college and careers are critical for participation in society and the global, knowledge-based economy, far too many students struggle within classrooms and schools that reinforce low expectations and inequity.

If educators aim to prepare all students for an ever-evolving marketplace and develop skills that will be valued no matter what tomorrow’s jobs are, then we must move critical thinking to the forefront of classroom experiences. And educators must design learning to cultivate it.

So, what does that really look like?

Unpack and define critical thinking

To understand critical thinking, educators need to first unpack and define its components. What exactly are we looking for when we speak about reasoning or exploring multiple perspectives on an issue? How does problem-solving show up in English, math, science, art, or other disciplines—and how is it assessed? At Two Rivers, an EL Education school, the faculty identified five constructs of critical thinking, defined each, and created rubrics to generate a shared picture of quality for teachers and students. The rubrics were then adapted across grade levels to indicate students’ learning progressions.

At Avenues World School, critical thinking is one of the Avenues World Elements and is an enduring outcome embedded in students’ early experiences through 12th grade. For instance, a kindergarten student may be expected to “identify cause and effect in familiar contexts,” while an 8th grader should demonstrate the ability to “seek out sufficient evidence before accepting a claim as true,” “identify bias in claims and evidence,” and “reconsider strongly held points of view in light of new evidence.”

When faculty and students embrace a common vision of what critical thinking looks and sounds like and how it is assessed, educators can then explicitly design learning experiences that call for students to employ critical-thinking skills. This kind of work must occur across all schools and programs, especially those serving large numbers of students of color. As Linda Darling-Hammond asserts, “Schools that serve large numbers of students of color are least likely to offer the kind of curriculum needed to ... help students attain the [critical-thinking] skills needed in a knowledge work economy.”

So, what can it look like to create those kinds of learning experiences?

Designing experiences for critical thinking

After defining a shared understanding of “what” critical thinking is and “how” it shows up across multiple disciplines and grade levels, it is essential to create learning experiences that impel students to cultivate, practice, and apply these skills. There are several levers that offer pathways for teachers to promote critical thinking in lessons:

1. Choose Compelling Topics: Keep it relevant

A key Common Core State Standard asks students to “write arguments to support claims in an analysis of substantive topics or texts using valid reasoning and relevant and sufficient evidence.” That might not sound exciting or culturally relevant. But a learning experience designed for a 12th grade humanities class engaged learners in a compelling topic—policing in America—to analyze and evaluate multiple texts (including primary sources) and share the reasoning for their perspectives through discussion and writing. Students grappled with ideas and their beliefs and employed deep critical-thinking skills to develop arguments for their claims. Embedding critical-thinking skills in curriculum that students care about and connect with can ignite powerful learning experiences.

2. Make Local Connections: Keep it real

At The Possible Project, an out-of-school-time program designed to promote entrepreneurial skills and mindsets, students in a recent summer online program (modified from in-person due to COVID-19) explored the impact of COVID-19 on their communities and local BIPOC-owned businesses. They learned interviewing skills through a partnership with Everyday Boston, conducted virtual interviews with entrepreneurs, evaluated information from their interviews and local data, and examined their previously held beliefs. They created blog posts and videos to reflect on their learning and consider how their mindsets had changed as a result of the experience. In this way, we can design powerful community-based learning and invite students into productive struggle with multiple perspectives.

3. Create Authentic Projects: Keep it rigorous

At Big Picture Learning schools, students engage in internship-based learning experiences as a central part of their schooling. Their school-based adviser and internship-based mentor support them in developing real-world projects that promote deeper learning and critical-thinking skills. Such authentic experiences teach “young people to be thinkers, to be curious, to get from curiosity to creation … and it helps students design a learning experience that answers their questions, [providing an] opportunity to communicate it to a larger audience—a major indicator of postsecondary success.” Even in a remote environment, we can design projects that ask more of students than rote memorization and that spark critical thinking.

Our call to action is this: As educators, we need to make opportunities for critical thinking available not only to the affluent or those fortunate enough to be placed in advanced courses. The tools are available; let’s use them. Let’s interrogate our current curriculum and design learning experiences that engage all students in real, relevant, and rigorous experiences that require critical thinking and prepare them for promising postsecondary pathways.


Critical Thinking & Student Engagement

Dr. PJ Caposey is an award-winning educator, keynote speaker, consultant, and author of seven books who currently serves as the superintendent of schools for the award-winning Meridian CUSD 223 in northwest Illinois. You can find PJ on most social-media platforms as MCUSDSupe:

When I start my keynote on student engagement, I invite two people up on stage and give them each five paper balls to shoot at a garbage can also conveniently placed on stage. Contestant One shoots their shot, and the audience gives approval. Four out of five is a heckuva score. Then, just before Contestant Two shoots, I blindfold them and start moving the garbage can back and forth. I usually try to ensure that they can at least make one of their shots. Nobody is successful in this unfair environment.

I thank them and send them back to their seats and then explain that this little activity was akin to student engagement. While we all know we want student engagement, we are shooting at different targets. More importantly, for teachers, it is near impossible for them to hit a target that is moving and that they cannot see.

Within the world of education and particularly as educational leaders, we have failed to simplify what student engagement looks like, and it is impossible to define or articulate what student engagement looks like if we cannot clearly articulate what critical thinking is and looks like in a classroom. Because, simply, without critical thought, there is no engagement.

The good news here is that critical thought has been defined and placed into taxonomies for decades already. This is not something new and not something that needs to be redefined. I am a Bloom’s person, but there is nothing wrong with DOK or some of the other taxonomies, either. To be precise, I am a huge fan of Daggett’s Rigor and Relevance Framework. I have used that as a core element of my practice for years, and it has shaped who I am as an instructional leader.

So, in order to explain critical thought, a teacher or a leader must familiarize themselves with these tried-and-true taxonomies. Easy, right? Yes, sort of. The issue is not understanding what critical thought is; it is the ability to integrate it into the classroom. In order to do so, there are four key steps every educator must take.

PLANNING

  • Integrating critical thought/rigor into a lesson does not happen by chance; it happens by design. Planning for critical thought and engagement is much different from planning for a traditional lesson. In order to plan for kids to think critically, you have to provide a base of knowledge and excellent prompts to allow them to explore their own thinking in order to analyze, evaluate, or synthesize information.
  • SIDE NOTE – Bloom’s verbs are a great way to start when writing objectives, but true planning will take you deeper than this.

QUESTIONING

  • If the questions and prompts given in a classroom have correct answers or if the teacher ends up answering their own questions, the lesson will lack critical thought and rigor.
  • Script five questions forcing higher-order thought prior to every lesson. Experienced teachers may not feel they need this, but it helps to create an effective habit.
ASSESSMENT

  • If lessons are rigorous and assessments are not, students will do well on their assessments, and that may not be an accurate representation of the knowledge and skills they have mastered. If lessons are easy and assessments are rigorous, the exact opposite will happen. When deciding to increase critical thought, it must happen in all three phases of the game: planning, instruction, and assessment.

TALK TIME / CONTROL

  • To increase rigor, the teacher must DO LESS. This feels counterintuitive but is accurate. Rigorous lessons involving tons of critical thought must allow for students to work on their own, collaborate with peers, and connect their ideas. This cannot happen in a room that is silent except for the teacher talking. In order to increase rigor, decrease talk time and become comfortable with less control. Asking questions and giving prompts that lead to no true correct answer also means less control. This is a tough ask for some teachers. Explained differently, if you assign one assignment and get 30 very similar products, you have most likely assigned a low-rigor recipe. If you assign one assignment and get multiple varied products, then the students have had a chance to think deeply, and you have successfully integrated critical thought into your classroom.


Thanks to Dara, Patrick, Meg, and PJ for their contributions!

Please feel free to leave a comment with your reactions to the topic or directly to anything that has been said in this post.

Consider contributing a question to be answered in a future post. You can send one to me at [email protected]. When you send it in, let me know if I can use your real name if it’s selected or if you’d prefer remaining anonymous and have a pseudonym in mind.

You can also contact me on Twitter at @Larryferlazzo.

Education Week has published a collection of posts from this blog, along with new material, in an e-book form. It’s titled Classroom Management Q&As: Expert Strategies for Teaching.

Just a reminder: you can subscribe and receive updates from this blog via email. (The RSS feed for this blog, and for all Ed Week articles, has been changed by the new redesign—new ones won’t be available until February.) And if you missed any of the highlights from the first nine years of this blog, you can see a categorized list below.

  • This Year’s Most Popular Q&A Posts
  • Race & Racism in Schools
  • School Closures & the Coronavirus Crisis
  • Classroom-Management Advice
  • Best Ways to Begin the School Year
  • Best Ways to End the School Year
  • Student Motivation & Social-Emotional Learning
  • Implementing the Common Core
  • Facing Gender Challenges in Education
  • Teaching Social Studies
  • Cooperative & Collaborative Learning
  • Using Tech in the Classroom
  • Student Voices
  • Parent Engagement in Schools
  • Teaching English-Language Learners
  • Reading Instruction
  • Writing Instruction
  • Education Policy Issues
  • Differentiating Instruction
  • Math Instruction
  • Science Instruction
  • Advice for New Teachers
  • Author Interviews
  • Entering the Teaching Profession
  • The Inclusive Classroom
  • Learning & the Brain
  • Administrator Leadership
  • Teacher Leadership
  • Relationships in Schools
  • Professional Development
  • Instructional Strategies
  • Best of Classroom Q&A
  • Professional Collaboration
  • Classroom Organization
  • Mistakes in Education
  • Project-Based Learning

I am also creating a Twitter list including all contributors to this column.

The opinions expressed in Classroom Q&A With Larry Ferlazzo are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.


Repeated Retrieval Practice to Foster Students’ Critical Thinking Skills


Lara M. van Peppen , Peter P. J. L. Verkoeijen , Anita Heijltjes , Eva Janssen , Tamara van Gog; Repeated Retrieval Practice to Foster Students’ Critical Thinking Skills. Collabra: Psychology 4 January 2021; 7 (1): 28881. doi: https://doi.org/10.1525/collabra.28881


There is a need for effective methods to teach critical thinking. Many studies on other skills have demonstrated beneficial effects of practice that repeatedly induces retrieval processes (repeated retrieval practice). The present experiment investigated whether repeated retrieval practice is effective for fostering critical thinking skills, focusing on avoiding biased reasoning. Seventy-five students first took a pre-test. Subsequently, they were instructed on critical thinking and avoiding belief bias in syllogistic reasoning and engaged in retrieval practice with syllogisms. Afterwards, depending on the assigned condition, they (1) did not engage in extra retrieval practice; (2) engaged in retrieval practice a second time (a week later); or (3) engaged in retrieval practice a second (a week later) and a third time (two weeks later). Two or three days after the last practice session, all participants took a post-test consisting of practiced tasks (to measure learning relative to the pre-test) and non-practiced (transfer) tasks. Results revealed no significant difference between pre-test and post-test learning performance as judged by mean total performance (MC-answers + justification), although participants were, on average, faster on the post-test than on the pre-test. Exploring performance on MC-answers-only suggested that participants did benefit from instruction/practice but may have been unable to justify their answers. Unfortunately, we were unable to test effects on transfer due to a floor effect, which highlights the difficulty of establishing transfer of critical thinking skills. To the best of our knowledge, this is the first study that addresses repeated retrieval practice effects in the critical thinking domain. Further research should focus on determining the preconditions of repeated retrieval practice effects for this type of task.

One of the most valued and sought-after skills that higher education students are expected to learn is critical thinking (CT). CT is key to effective thinking about difficult issues, weighing evidence, determining credibility, and acting rationally, which is essential for succeeding in future careers and being efficacious citizens (Billings & Roberts, 2014; Davies, 2013; Halpern, 2014; Van Gelder, 2005). The concept of CT can be expressed in a variety of definitions, but at its core, CT is “good thinking that is well reasoned and well supported with evidence” (H. A. Butler & Halpern, 2020, p. 152). One key aspect of CT is the ability to avoid biases in reasoning and decision-making (e.g., West et al., 2008), referred to as unbiased reasoning. Bias is said to occur when people rely on heuristics (i.e., mental shortcuts) during reasoning prior to choosing actions and estimating probabilities, resulting in systematic deviations from ideal normative standards (i.e., derived from logic and probability theory; Stanovich et al., 2016; Tversky & Kahneman, 1974). As biased reasoning can have serious consequences in both daily life and complex professional environments, it is essential to teach CT in higher education (e.g., Koehler et al., 2002).

Not surprisingly, therefore, there is a growing body of literature on how to teach CT, including unbiased reasoning (e.g., Abrami et al., 2014; Heijltjes et al., 2015; Heijltjes, Van Gog, & Paas, 2014; Heijltjes, Van Gog, Leppink, et al., 2014; Janssen, Mainhard, et al., 2019; Janssen, Meulendijks, et al., 2019; Kuhn, 2005; Sternberg, 2001; Van Brussel et al., 2020; Van Peppen et al., 2018; Van Peppen, Verkoeijen, Heijltjes, et al., 2021; Van Peppen, Verkoeijen, Kolenbrander, et al., 2021). It is well established, for instance, that explicit teaching of CT combined with practice on domain-relevant problems improves learning of general CT-skills (Abrami et al., 2008, 2014) and of the CT-skills required for unbiased reasoning specifically (Heijltjes et al., 2015; Heijltjes, Van Gog, & Paas, 2014; Heijltjes, Van Gog, Leppink, et al., 2014). This is especially true when students are exposed to authentic and sense-making problems (i.e., authentic instruction) or discuss specific problems cooperatively (i.e., dialogue), and when these instructional approaches are combined with one-on-one coaching/mentoring on students’ CT (Abrami et al., 2014).

Nonetheless, while some effective instructional approaches for learning CT have been identified, it is still unclear which methods are most effective in supporting the ability to transfer what has been learned (Halpern & Butler, 2019; Heijltjes et al., 2015; Heijltjes, Van Gog, & Paas, 2014; Heijltjes, Van Gog, Leppink, et al., 2014; Ritchhart & Perkins, 2005; Tiruneh et al., 2014, 2016; Van Peppen et al., 2018; Van Peppen, Verkoeijen, Heijltjes, et al., 2021; Van Peppen, Verkoeijen, Kolenbrander, et al., 2021). Transfer is the process of applying one’s prior knowledge or skills to related materials or some new context (e.g., Barnett & Ceci, 2002; Cormier & Hagman, 2014; Haskell, 2001; Perkins & Salomon, 1992; Salomon & Perkins, 1989). There are some insights into fostering transfer of CT-skills to isomorphic tasks (in this study referred to as learning; e.g., Heijltjes, Van Gog, Leppink, et al., 2014), but not into transfer to novel tasks that share underlying principles but have not been previously encountered (e.g., Heijltjes et al., 2015; Heijltjes, Van Gog, Leppink, et al., 2014; Van Peppen et al., 2018; Van Peppen, Verkoeijen, Heijltjes, et al., 2021; Van Peppen, Verkoeijen, Kolenbrander, et al., 2021). As it is crucial that students can successfully apply acquired CT-skills at a later time and in novel contexts/problems, and as it would be unfeasible to train students on every type of reasoning bias they will ever encounter, more knowledge is needed of the conditions that yield not only learning of CT-skills but also transfer.

Previous research has demonstrated that to establish learning and transfer, learners have to actively construct meaningful knowledge from to-be-learned information by mentally organizing it into coherent knowledge structures and integrating these with their prior knowledge (Bassok & Holyoak, 1989; Fiorella & Mayer, 2016; Gick & Holyoak, 1983; Holland et al., 1989; Wittrock, 2010). This, in turn, can aid future problem solving (Kalyuga, 2011; Renkl, 2014; Van Gog et al., 2019): if a situation presents similar requirements and the learner recognizes them, they may select and apply the same or a somewhat adapted learned procedure to solve the problem. One of the strongest learning techniques known to promote the construction of meaningful knowledge structures is having students retrieve to-be-learned material from memory, known as practice testing or retrieval practice (e.g., Dunlosky et al., 2013; Fiorella & Mayer, 2015, 2016; Roediger & Butler, 2011). The effect of retrieval practice seems to be extremely robust (for reviews, see Carpenter, 2012; Delaney et al., 2010; Moreira et al., 2019; Pan & Rickard, 2018; Rickard & Pan, 2017; Roediger & Butler, 2011; Roediger & Karpicke, 2006b; Rowland, 2014), emerging on measures of both learning and transfer and with different kinds of materials and test formats (e.g., A. C. Butler, 2010; Carpenter & Kelly, 2012; McDaniel et al., 2012, 2013; Rohrer et al., 2010).

1.1 Repeated Retrieval Practice

The effect of retrieval practice seems to be positively related to the number of successful retrieval attempts during practice (e.g., Rawson & Dunlosky, 2011; Roediger & Karpicke, 2006a), albeit with diminishing returns. For example, in Experiment 2 of the study by Roediger and Karpicke (2006a), participants either studied a prose passage multiple times (SSSS condition), studied a prose passage multiple times and took one free recall retrieval practice test (SSST condition), or studied a prose passage once and took a free recall retrieval practice test three times (STTT condition). Subsequently, a delayed final free recall test on the prose passage was administered in all conditions. The results on this final test showed that taking a single retrieval practice test increased free recall performance relative to the control condition from a mean score of 40% correct to a mean score of 56% correct. Furthermore, repeated retrieval practice (i.e., the STTT condition) increased free recall performance to a mean of 61% correct, hence showing diminishing returns for extra retrieval practice. That is, where a single retrieval practice test in the SSST condition lifted final test performance by 16 percentage points, the two additional retrieval practice tests increased final test performance by only 5 percentage points. These diminishing returns of repeated retrieval practice might be due to the fact that the practice testing effect depends not only on the number of successful retrieval attempts but also on the effort that is required to successfully retrieve information from memory. According to the retrieval effort hypothesis (e.g., Pyc & Rawson, 2009), the effect of retrieval practice becomes larger when successful retrieval attempts require more effort. When information is repeatedly retrieved from memory, the effort associated with successful retrieval is likely to decrease, which will lead to diminishing returns of repeated retrieval practice.

Despite the potential of repeated retrieval for learning, its impact has not been investigated in research on CT. Therefore, the present study sought to determine whether repeated retrieval practice is also beneficial for learning CT-skills, and whether it can additionally facilitate transfer. For educational practice, it is relevant to identify the most efficient practice schedule from among those that achieve a desired level of durability. Whereas the majority of studies were conducted in laboratory settings, the current study was conducted as part of an existing CT-course, using educationally relevant practice sessions (multiple practice tasks within a session) and retention intervals (days/weeks). To the best of our knowledge, this is the first study to investigate the effects of repeated retrieval practice in the CT-domain.

1.2 The Present Study

Participants first completed a pretest including syllogistic reasoning tasks (for an overview of the study design, see figure 1 ), which examined their tendency to be influenced by the believability of a conclusion when evaluating the logical validity of arguments. Thereafter, they received instructions on CT in general and on syllogisms in particular. Subsequently, they engaged in retrieval practice with these tasks on domain-specific problems. Depending on condition, participants (1) did not engage in extra retrieval practice with these tasks (practice once); (2) engaged in retrieval practice a second time (one week later; practice twice); or (3) engaged in retrieval practice a second (one week later) and third time (two weeks after second time; practice thrice). Subsequently, all participants completed a posttest including practiced tasks (i.e., syllogistic reasoning tasks; measure of learning ) and non-practiced tasks (i.e., Wason selection tasks; measure of transfer ) two or three days after their last practice session. Participants had to indicate after each test and practice item how much effort they invested on that item and time-on-task was logged during all phases. Furthermore, they were asked after each practice session to assess how well they thought they understood the practice problems (i.e., global judgment of learning; JOL) to gain insight into the added value of extra practice according to the students themselves. Previous research has demonstrated that students’ JOLs are related to their learning strategies and study time (i.e., monitoring learning processes; e.g., Koriat, 1997; Nelson et al., 1994; Zimmerman, 2000 ) and, thus, may indirectly contribute to performance enhancement.

Note. All participants completed the posttest two or three days after their last practice session.

We hypothesized that explicit CT-instructions combined with retrieval practice would be effective for learning: thus, we expected an overall mean pretest to posttest performance gain on learning items in all conditions (Hypothesis 1). Furthermore, and more importantly, we expected that practicing retrieval twice would lead to a higher pretest to posttest performance gain on learning items (Hypothesis 2a) and a higher posttest performance on transfer items 1 (Hypothesis 3a) than practicing retrieval once. We expected that practicing retrieval thrice would lead to a higher pretest to posttest performance gain on learning items (Hypothesis 2b) and a higher posttest performance on transfer items (Hypothesis 3b) than practicing retrieval twice. However, as outlined above, prior research suggests that additional retrieval practice has diminishing returns on the final test, so we expected these differences to be smaller than those between practicing retrieval once and twice.

To gain more insight into the effectiveness (i.e., higher performance) and efficiency (i.e., performance relative to invested mental effort or time; Van Gog & Paas, 2008 ) of repeated retrieval practice on learning and transfer, we explored the invested mental effort, time-on-test, and JOLs. That is, we compared the practice conditions on invested mental effort on test items, time-on-test, and JOLs in an exploratory fashion.

The hypotheses and complete method section were preregistered on the Open Science Framework (OSF). All data, script files, and materials (in Dutch) are available on the project page that we created for this study ( https://osf.io/pfmyg/ ).

2.1 Participants and Design

Participants were all first-year ‘Safety and Security Management’ students attending a Dutch University of Applied Sciences (N = 103). Eleven students did not complete the posttest and two students completed the posttest a week late; they were therefore excluded from the analyses (as this may have influenced the results). Seventeen participants were excluded because of non-compliance, i.e., they did not seriously read more than half of the practice tasks during one of the essential practice sessions. 2 Due to a technical problem, one class of students (i.e., 24 students) did not receive the demographic questionnaire and the pretest. Together, this resulted in a final sample of 75 students for the posttest-only analyses (i.e., completed all essential sessions, excluding the demographic questions and pretest) and a subsample of 51 students (68%) for the pretest to posttest analyses (i.e., completed all essential sessions; M age = 19.47, SD = 1.64; 25 female).

We calculated the power of our analyses using the G*Power software (Faul et al., 2009) . The power of our one-way ANOVAs – under a fixed alpha level of .05 and with a sample size of 75 – is estimated at .11, .47, and .87 for detecting a small (ηp 2 = .01), medium (ηp 2 = .06), and large (ηp 2 = .14) effect, respectively. Regarding the crucial interaction between number of practice sessions and test moment – again calculated under a fixed alpha level of .05, but with a sample size of 51 and a correlation between measures of .64 – the power is estimated at .27, .95, and >.99 for detecting a small, medium, and large interaction effect, respectively. Thus, under the above assumptions, our sample size should be sufficient to detect medium-to-large effects, and previous studies on repeated (retrieval) practice mainly demonstrated medium-to-large effects (e.g., Roediger & Karpicke, 2006b ).
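G*Power specifies ANOVA effect sizes as Cohen's f rather than (partial) eta-squared; the standard conversion, applied to the benchmarks used above, can be sketched as follows (the function name is ours, for illustration only):

```python
import math

def eta_sq_to_cohens_f(eta_sq: float) -> float:
    """Convert (partial) eta-squared to Cohen's f,
    the ANOVA effect-size metric that G*Power works with."""
    return math.sqrt(eta_sq / (1.0 - eta_sq))

# Cohen's (1988) benchmarks used in the text; these map onto the
# conventional f values of roughly .10, .25, and .40
for label, eta in [("small", 0.01), ("medium", 0.06), ("large", 0.14)]:
    print(f"{label}: f = {eta_sq_to_cohens_f(eta):.3f}")
```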

The educational committee of the university approved the conduct of this study within the curriculum. In week 1, all participants first completed the CT-skills pretest, followed by the CT-instructions and practice session one (see figure 1 for an overview). Participants were randomly assigned to one of three conditions. They either (1) did not practice extra with the tasks (practice once condition, posttest only: n = 26; both tests: n = 16), (2) practiced a second time in week 2 (practice twice condition, n = 25; n = 16), or (3) practiced a second time in week 2 and a third time in week 4 (practice thrice condition, n = 24; n = 19). Participants completed the CT-skills posttest two or three days after their last practice session.

2.2 Materials

2.2.1 CT-skills tests. The surface features of all items were adapted to participants’ study domain. The pretest consisted of 16 syllogistic reasoning items across two categories (i.e., conditional and categorical syllogisms; see Appendix S1 for an example with explanation of each category), which were used to measure learning, as these were instructed and practiced during the training phase. All items included a belief bias (i.e., a conclusion that aligns with prior beliefs or real-world knowledge but is invalid, or vice versa; Evans et al., 1983; Markovits & Nantel, 1989; Newstead et al., 1992 ) and examined the tendency to be influenced by the believability of a conclusion when evaluating the logical validity of arguments (Evans, 1977, 2003) . These types of tasks are frequently used to measure people’s ability to avoid biases (e.g., Stanovich et al., 2016 ).

Our tests consisted of 3 × affirming the consequent of a conditional statement (if p then q , q therefore p ; invalid); 3 × denying the consequent of a conditional statement (if p then q , not q therefore not p ; valid); 2 × affirming the antecedent of a conditional statement (if p then q , p therefore q ; valid); 2 × denying the antecedent of a conditional statement (if p then q , not p therefore not q ; invalid); 3 × categorical syllogism ‘no A is B, some C are B, therefore some C are not A’ (valid); and 3 × categorical syllogism ‘no A is B, some C are B, therefore some A are not C’ (invalid). Participants had to indicate for each item whether the conclusion was valid or invalid and to explain their multiple-choice (MC) answer, as a check on their understanding (on the MC-answers alone, they might be guessing). They could earn 1 point for the correct MC-answer, and 1 point for a correct or 0.5 point for a partially correct explanation (see subsection 2.4). The MC and explanation scores were summed; thus, the maximum total score on the learning items was 32 points.
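The six item types and the scoring rules above can be summarized in a small sketch (the names and the function are ours, for illustration only):

```python
# Validity of the six syllogism types used in the tests (from the text above)
FORM_VALIDITY = {
    "affirming the consequent": False,  # if p then q, q therefore p
    "denying the consequent": True,     # if p then q, not q therefore not p
    "affirming the antecedent": True,   # if p then q, p therefore q
    "denying the antecedent": False,    # if p then q, not p therefore not q
    "no A is B, some C are B, so some C are not A": True,
    "no A is B, some C are B, so some A are not C": False,
}

def score_item(mc_correct: bool, explanation_score: float) -> float:
    """1 point for a correct MC-answer; 0, 0.5, or 1 point
    for the explanation (0.5 = partially correct)."""
    assert explanation_score in (0.0, 0.5, 1.0)
    return (1.0 if mc_correct else 0.0) + explanation_score

# 16 learning items, each worth at most 2 points -> maximum score of 32
max_score = 16 * score_item(True, 1.0)
print(max_score)  # 32.0
```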

The posttest was identical to the pretest but additionally included six Wason selection items that measured the tendency to confirm rather than falsify a hypothesis (see the Appendix for two examples with explanations; e.g., Dawson et al., 2002; Evans, 2002; Stanovich, 2011 ). These items measured transfer, as they were not instructed/practiced but shared similar features with the four types of conditional syllogisms. Our test consisted of three abstract versions and three versions with study-related context. An MC-format with four answer options was used, in which only one specific combination of two selected answers counted as correct. One point was assigned for each correct answer (see subsection 2.4), resulting in a maximum total score of six points on the transfer items.

2.2.2 CT-instructions. The video-based CT-instructions (15 min.) consisted of a general CT-instruction (i.e., features of CT and attitudes/skills needed to think critically) and explicit instructions on belief-bias in syllogisms that consisted of a worked example of each of the six types in the pretest. The worked examples showed the correct line of reasoning and included possible problem-solving strategies, which allowed participants to mentally correct initially erroneous responses. At the end, participants received a hint stating that the principles used in these examples can be applied with several other reasoning tasks.

2.2.3 CT-practice. Participants practiced retrieval on the six types of syllogisms, using topics that they might encounter in their working life. Participants were instructed to read the problems thoroughly and to choose the correct MC-answer option, provided directly below the problems. They had to deliberately recall the relevant information from memory to solve the problems. After each practice task, they received correct-answer feedback and a worked example in which the line of reasoning was explained in steps and clarified with a visual representation. The second and third practice sessions were parallel versions of the first (i.e., structurally equivalent problems with different surface features).

2.2.4 Mental effort. After each test item and after each CT-practice problem, participants were asked to indicate how much effort they invested on completing that task, on a 9-point scale ranging from (1) very, very low effort to (9) very, very high effort (Paas, 1992) .

2.2.5 Global judgments of learning (JOL). At the end of each practice session, participants made a JOL on how well they thought they understood the CT-practice problems on a 7-point scale ranging from (1) very poorly to (7) very well (Koriat et al., 2002; Thiede et al., 2003) .

2.3 Procedure

The study was run during the first four weeks of a CT-course in the Integral Safety and Security Management study program of an institute of higher professional education. The CT-skills pretest and first practice session were conducted during the first lesson in a computer classroom at the participants’ university with an entire class of students and their teacher present. The extra practice sessions and the posttest were completed entirely online (cf.  Heijltjes, Van Gog, & Paas, 2014 ). Participants came from four different classes and within each class, students were randomly assigned to one of the conditions. All materials were delivered in a computer-based environment (Qualtrics platform). Participants could work at their own pace, were allowed to use scrap paper while solving the tasks, and time-on-task was logged during all phases.

Before the first lesson, the students were informed by their teacher about the experiment (i.e., procedure and time window). When entering the classroom in week 1 , participants were instructed to sit down at one of the desks and to read an A4-paper containing general instructions and a link to the Qualtrics environment, where they first had to sign an informed consent form. Thereafter, they filled in a demographic questionnaire and completed the pretest. After each test item, they had to indicate how much mental effort they invested. Subsequently, participants entered the practice phase, in which they first viewed the video-based CT-instructions (15 min), followed by the practice tasks. At the end of the practice phase, participants indicated their JOL. Participants had to wait (in silence) until the last participant had finished before they were allowed to leave the classroom.

One day before each online session (i.e., practice sessions 2 and 3 and the posttest), participants received an e-mail with a reminder and the request to reserve time for this mandatory part of their CT-course. One hour before participants could start, they received the link to the Qualtrics environment. They were given a specific time window (8 am to 10 pm that day) to complete these sessions. Two or three days after session 1, participants in the practice once condition had to complete the posttest. At the beginning of week 2 , all participants had to complete the second practice session. Since the content of our materials was part of the final exam of this course, and the ethical guidelines of the institute of higher professional education state that all students should be offered the same exam materials, participants in the practice once condition worked through the extra practice materials but were no longer included in the experiment. Two or three days after session 2, participants in the practice twice condition had to complete the posttest. For practical reasons (i.e., a one-week school holiday), the procedure of week 2 was repeated in week 4 : all participants had to complete the third practice session, but students in the practice once and twice conditions no longer took part in the experiment, and those in the practice thrice condition had to complete the posttest after three days. Participants who did not complete the posttest or one of the extra practice sessions received an e-mail the day after the specific time window stating that they could complete it that day as a last opportunity.

2.4 Data Analysis

Items were scored for accuracy; 1 point for each correct MC-alternative and a maximum of 1 point (increasing in steps of 0.5) for the correct explanation on the learning items (coding scheme can be found on our OSF-page). Unfortunately, one transfer item had to be removed from the test due to incorrectly offered MC-answer options. As a result, participants could attain a maximum total score of 32 points on the learning items and five points on the transfer items. For comparability, learning and transfer outcomes were computed as percentage correct scores instead of total scores. Two raters independently scored 25% of the explanations on the learning items of the posttest. Intraclass correlation coefficient (two-way mixed, consistency, single-measures; McGraw & Wong, 1996 ) was 0.996, indicating excellent interrater reliability (Koo & Li, 2016) . The remainder of the tests was scored by one rater. Cronbach’s alpha was .74 on the learning items on the pretest, .71 on the learning items on the posttest and .79 on the transfer items.

Boxplots were created to identify outliers (i.e., values that fall more than 1.5 times the interquartile range above the third quartile or below the first quartile). If outliers were present, we first conducted the analyses on the data of all participants and then reran the analyses without the outliers. If the outliers influenced the results, we reported both analyses; if not, we reported only the results on the full data set. In case of severe violations of the assumption of normality, we conducted appropriate non-parametric tests.
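The 1.5 × IQR boxplot rule described above can be sketched as follows (hypothetical data; note that quartile conventions differ slightly across software packages, so borderline values may be classified differently):

```python
import statistics

def iqr_outliers(data):
    """Return values more than 1.5 * IQR above Q3 or below Q1
    (the boxplot rule described in the text)."""
    q1, _, q3 = statistics.quantiles(data, n=4)  # default 'exclusive' method
    iqr = q3 - q1
    lower, upper = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    return [x for x in data if x < lower or x > upper]

# Hypothetical times-on-test (minutes); 180 is an obvious outlier
times = [52, 55, 58, 60, 61, 63, 64, 66, 70, 180]
print(iqr_outliers(times))  # [180]
```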

For all analyses in this paper, a p -value of .05 was used as a threshold for statistical significance. Partial eta-squared (ηp 2 ) is reported as an effect size for all ANOVAs with ηp 2 = .01, ηp 2 = .06, and ηp 2 = .14 denoting small, medium, and large effects, respectively (Cohen, 1988) . Cramer’s V is reported as an effect size for chi-square tests with (having 2 degrees of freedom) V = .07, V = .21, and V = .35 denoting small, medium, and large effects, respectively.

3.1 Check on Condition Equivalence

Before running the main analyses, we checked the conditions for equivalence. Preliminary analyses confirmed that there were no a priori differences between the conditions in age, F (2, 50) = 0.46, p = .634, ηp 2 = .02; educational background, χ²(8) = 12.69, p = .12, V = .35; performance on the pretest, F (2, 47) = 0.24, p = .790, ηp 2 = .01; time spent on the pretest, F (2, 47) = 0.74, p = .481, ηp 2 = .03; mental effort invested on the pretest, F (2, 47) = 0.82, p = .445, ηp 2 = .03; performance on practice problems in session one, F (2, 74) = 0.12, p = .889, ηp 2 < .01; time spent on practice problems in session one, F (2, 74) = 0.89, p = .417, ηp 2 = .02; effort invested on practice problems in session one, F (2, 74) = 0.47, p = .629, ηp 2 = .01; and global JOL, F (2, 74) = 0.36, p = .701, ηp 2 = .01. We found a gender difference between the conditions, χ²(2) = 6.23, p = .043, V = .35. However, gender did not correlate significantly with any of our performance measures (minimum p = .669) and was therefore not considered a confounding variable.

3.2 Planned Analyses

We conducted pretest to posttest analyses on the data of participants who completed all essential experimental sessions ( n = 51) and posttest-only analyses on the data of participants who missed the demographic questions and pretest ( n = 75). Because of a floor effect on transfer performance, analysis of the transfer data would unfortunately not be very meaningful, and we therefore report only descriptive statistics on those data. Together with the descriptive statistics of the other dependent variables, these can be found in Table 1 .

a Means ( SD ) of the data excluding outliers.

3.2.1 Performance on learning items. In contrast to Hypotheses 1 and 2a, a 2×3 mixed ANOVA with Test Moment (pretest, posttest) as within-subjects factor and Condition (practice once, practice twice, practice thrice) as between-subjects factor on performance on learning items revealed no main effect of Test Moment, F (1, 48) = 3.05, p = .087, ηp 2 = .06, or of Condition, F (2, 48) = 0.24, p = .788, ηp 2 = .01. Furthermore, there was no interaction between Test Moment and Condition, F (2, 48) = 0.01, p = .991, ηp 2 < .01. A one-way ANOVA with the full sample on the posttest data only did not reveal an effect of Condition either, F (2, 72) = 0.06, p = .945, ηp 2 < .01.

3.2.2 Mental effort. A 2×3 mixed ANOVA on invested mental effort on the learning items, with Test Moment (pretest, posttest) as within-subjects factor and Condition (practice once, practice twice, practice thrice) as between-subjects factor, showed a main effect of Test Moment, F (1, 48) = 8.41, p = .006, ηp 2 = .15; less effort was invested on learning items on the pretest ( M = 3.93, SD = 1.24) than on the posttest ( M = 4.32, SD = 1.30). There was no main effect of Condition, F (2, 48) = 0.67, p = .515, ηp 2 = .03, nor an interaction between Test Moment and Condition, F (2, 48) = 0.85, p = .435, ηp 2 = .03. A one-way ANOVA with the full sample on the posttest data only did not reveal an effect of Condition either, F (2, 72) = 0.28, p = .754, ηp 2 = .01.

3.2.3 Time-on-test. Because the data were not normally distributed, we conducted a Kruskal-Wallis H test with Condition (practice once, practice twice, practice thrice) as between-subjects factor on pretest-posttest differences in time spent on learning items. There was no significant difference between conditions, χ²(2) = 1.54, p = .464, ηp 2 = .01. A Kruskal-Wallis H test on the posttest-only data with Condition as between-subjects factor likewise showed no significant difference between conditions in time spent on posttest learning items, χ²(2) = 4.54, p = .103, ηp 2 = .04. In addition to the analysis on the full data, a 2×3 mixed ANOVA on the data without five outliers, with Test Moment (pretest, posttest) as within-subjects factor and Condition as between-subjects factor, did reveal a significant effect of Test Moment, F (1, 42) = 39.34, p < .001, ηp 2 = .48; more time was spent on the pretest ( M = 73.84, SD = 17.55) than on the posttest ( M = 49.26, SD = 21.14).

3.2.4 Global judgments of learning. Finally, we examined differences in global JOLs using a one-way ANOVA. The results revealed no main effect of Condition, F (2, 74) = 1.82, p = .170, ηp 2 = .05.

3.3 Exploratory Analyses

To gain more insight into the effects of repeated retrieval practice, we explored participants’ level of performance during practice sessions one, two, and three. 3 Descriptive statistics showed that, on average, performance increased with additional practice opportunities: the mean percentage correct was 58.67% during practice session one ( SD = 21.29; n = 75), 65.31% during session two ( SD = 19.20; n = 49), and 69.44% during session three ( SD = 16.79; n = 24). 4 Since the transfer items of the tests shared similar features with the four types of conditional syllogisms, we additionally explored participants’ level of performance during learning on these types only. Again, descriptive statistics showed that performance increased: the mean percentage correct was 55.33% during practice session one ( SD = 24.42; n = 75), 63.78% during session two ( SD = 25.55; n = 49), and 69.79% during session three ( SD = 19.48; n = 24).

Additionally, we explored whether performance on the MC-questions only of the syllogism (learning) items improved after instruction and practice, using a 2×3 mixed ANOVA with Test Moment (pretest, posttest) as within-subjects factor and Condition (practice once, practice twice, practice thrice) as between-subjects factor. The results indeed revealed a main effect of Test Moment, F (1, 47) = 20.26, p < .001, ηp 2 = .30; performance was better on the posttest ( M = 68.66, SE = 2.30) than on the pretest ( M = 57.42, SE = 2.60). There was, however, no significant main effect of Condition, F (2, 47) = 0.50, p = .613, ηp 2 = .02, nor an interaction between Test Moment and Condition, F (2, 47) = 0.01, p = .990, ηp 2 < .01. Finally, we explored how much time participants spent on the worked-example feedback after correct and incorrect retrievals. Both the test statistics and the descriptive statistics (see Table 2 ) showed that, for almost all practice tasks, participants spent more time on the worked-example feedback after incorrect retrievals than after correct retrievals. Although participants generally spent less time on the worked-example feedback as they practiced more often (i.e., during a later practice session), this pattern was found during each of the three practice sessions.

3.4 Addressing Potential Power Issues

Due to a technical problem, our final sample was considerably smaller than planned and might have been insufficient to detect a small-to-medium interaction effect. Since adding participants to an already completed experiment would inflate the Type I error rate (alpha), and conducting a second identical experiment (i.e., in the context of an actual course) would be resource-demanding, we used a sequential stopping rule (SSR; see, for example, Arghami & Billard, 1982, 1991; Botella et al., 2006; Doll, 1982; Fitts, 2010; Pocock, 1992; Ximénez & Revuelta, 2007 ) to explore whether collecting additional data would be worthwhile. SSRs make it possible to stop data collection early when statistical significance is unlikely to be achieved with the planned number of participants.

One SSR that is simple, efficient, and appropriate for this experiment is the COAST (composite open adaptive stopping rule; Frick, 1998 ). Under the COAST, one stops testing participants and rejects the null hypothesis if the p -value is less than a lower criterion of .01; stops testing participants and retains the null hypothesis if the p -value is greater than an upper criterion of .36; and tests more participants if the p -value falls between these two values. In the present study, the p -values of our main analyses (i.e., on performance measures) were well above the upper criterion of .36. Hence, there was no hint of an effect of repeated retrieval practice in the present study, and we therefore decided not to add participants.
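The COAST decision rule reduces to a simple three-way comparison; a minimal sketch with the criteria named above (the function name is ours, for illustration only):

```python
def coast_decision(p_value: float, lower: float = 0.01, upper: float = 0.36) -> str:
    """COAST (Frick, 1998): stop and reject H0 below the lower criterion,
    stop and retain H0 above the upper criterion, otherwise keep testing."""
    if p_value < lower:
        return "stop: reject H0"
    if p_value > upper:
        return "stop: retain H0"
    return "continue: test more participants"

# The p-values of the main performance analyses in this study were all
# well above .36, so the rule says to stop and retain H0
for p in (0.788, 0.945, 0.991):
    print(coast_decision(p))  # stop: retain H0
```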

The current study investigated whether repeated retrieval practice is beneficial for learning CT-skills and whether it can additionally facilitate transfer. Contrary to our expectations, we did not find pretest to posttest performance gains on learning items. Thus, we did not replicate the finding that participants’ performance improves after explicit instructions combined with retrieval practice on domain-specific problems (Hypothesis 1: e.g., Heijltjes et al., 2015; Van Peppen et al., 2018; Van Peppen, Verkoeijen, Heijltjes, et al., 2021; Van Peppen, Verkoeijen, Kolenbrander, et al., 2021 ). It should be noted, however, that this comparable level of posttest performance was attained in less time than pretest performance (i.e., prior to instruction/practice). Moreover, our exploratory findings on performance on the MC-questions only suggest that students did benefit from instructions and retrieval practice. This difference in outcomes between MC-answers and total scores (i.e., MC + justification) could mean that participants learned what the right answer was but were unable to justify their answers sufficiently. In that case, however, our intervention resulted only in simple memorization (i.e., rote learning; Mayer, 2002 ) rather than a deeper understanding of the subject matter. This might also explain the floor effect on performance on transfer items, as transfer of knowledge or skills depends on how well developed the knowledge structures formed during initial learning are (e.g., Perkins & Salomon, 1992 ).

a None of the participants completed this task incorrectly. b Only one of the participants completed this task incorrectly. *p < .05, **p < .001.

In line with previous repeated retrieval findings (e.g., Roediger & Butler, 2011 ), average performance scores during practice seemed to increase with more repetitions. However, repeated retrieval practice did not have a significant effect on final test performance (i.e., on learning items) compared to practicing once (Hypotheses 2a/2b). Unfortunately, we were unable to test whether repeated retrieval practice would enhance transfer (Hypotheses 3a/3b) due to a floor effect. Because the power of our study was only sufficient to detect medium-to-large effects of repeated retrieval, it could be that additional retrieval practice had a small effect that we were unable to detect. In the current study, each practice session consisted of multiple practice tasks (instead of one, as in most studies); it could therefore be argued that practicing once in this study already constitutes repeated practice, which possibly explains the absence of substantial effects of repeated retrieval.

Another potential explanation for the lack of an effect of additional retrieval practice might lie in the feedback that was provided after each retrieval attempt. While many studies only show a retrieval practice effect when feedback is provided (for an overview, see Van Gog & Sweller, 2015 ) and others show that elaborative feedback can enhance the effects of retrieval practice (e.g., Pan et al., 2016; Pan & Rickard, 2018 ), findings from recent research suggest that feedback after each retrieval attempt may have eliminated the repeated retrieval effect (Kliegl et al., 2019; Pastötter & Bäuml, 2016; Storm et al., 2014) . According to the bifurcation model (Halamish & Bjork, 2011; Kornell et al., 2011) , feedback only strengthens knowledge that is not successfully retrieved, whereas knowledge that is successfully retrieved is hardly affected by subsequent feedback. As such, it may be that participants in the condition that practiced only once (i.e., with the lowest performance during practice) processed the feedback better and therefore performed equally well on the final test as participants in the other conditions. Moreover, participants’ motivation to learn the correct answer may have been higher when they were unable to provide the correct answer during retrieval practice than when they were able to do so (e.g., Kang et al., 2009; Potts & Shanks, 2019 ). Our findings on time spent on worked-example feedback after correct/incorrect retrievals support this idea (i.e., more time spent after incorrect than correct retrievals). The possible elimination of a lag effect on learning problem-solving skills by providing feedback after each retrieval attempt is an interesting issue for future research.

Although participants achieved a considerably high level of performance during retrieval practice (approx. 60–70 percent correct), comparable to previous studies that did demonstrate beneficial effects of repeated retrieval practice (e.g., A. C. Butler, 2010; Roediger & Karpicke, 2006b ), a floor effect arose on performance on transfer items. Since the practice tasks consisted of MC-questions only, this finding again supports the idea that students benefited from instructions and retrieval practice but may have been unable to justify their answers on the tests sufficiently. Another likely cause of this floor effect may be that participants lacked in-depth understanding of the structural overlap between syllogisms and Wason selection tasks (i.e., the measure of transfer). During practice, participants could earn one point for each correctly solved syllogism. Each transfer item, however, required recall and application of all four conditional syllogism principles to solve it correctly and, thus, to earn one point. Future studies on to-be-transferred problem-solving procedures such as those in the current study should ensure sufficient understanding of the structural features of tasks and complete recall of the procedure during retrieval practice. It may be helpful to provide longer or more extensive practice, including more guidance in identifying how tasks are related. Practicing retrieval until all retrievals are successful and complete might be a solution for achieving complete recall of procedures (i.e., successive relearning; e.g., Bahrick, 1979; Rawson et al., 2013 ).
Given that transfer of CT-skills from trained to untrained tasks remains elusive (as our current results also underline), there is an urgent need to determine the exact obstacles to the transfer of CT-skills, which could lie in a failure to recognize that the acquired knowledge is relevant to the new task, inadequate recall of the acquired knowledge, and/or difficulties in actually applying that knowledge to the new task (i.e., the three-step process of transfer; Barnett & Ceci, 2002 ).

To the best of our knowledge, this is the first study that investigated the effects of repeated retrieval practice in the CT-domain. Moreover, while the majority of research on repeated retrieval practice has been conducted in laboratory settings, the current study was conducted as part of an existing CT-course, using educationally relevant practice sessions and retention intervals. As such, it adds to the small body of literature on which instructional designs are (or are not) efficient and effective for CT-courses aiming at learning and transfer of CT-skills, which is relevant for both educational science and educational practice.

Contributed to conception and design: LVP, PV, AH, TVG

Contributed to acquisition of data: LVP

Contributed to analysis and interpretation of data: LVP, PV, TVG

Drafted and/or revised the article: LVP, PV, AH, EJ, TVG

Approved the submitted version for publication: LVP, PV, AH, EJ, TVG

The authors would like to thank Stefan V. Kolenbrander for his help with running this study and Esther Stoop and Marjolein Looijen for their assistance with coding the data.

This work was supported by The Netherlands Organisation for Scientific Research (project number 409-15-203). TVG, PV, and AH were involved in the funding acquisition. The funding source was not involved in this study/manuscript.

The authors have no other competing interests to declare. PV is an editor at Collabra: Psychology. He was not involved in the review process of this article.

All data, script files, and materials (in Dutch) are available on the project page that we created for this study (anonymized view-only link: https://osf.io/pfmyg/ ).

Appendix S1: Example Items CT-Skills Tests (docx)

Peer Review History (docx)

Because transfer items were not included in the pretest, we are not able to detect transfer gains.

The maximum reading speed of fast readers (i.e., 0.17 seconds per word; e.g., Trauzettel-Klosinski & Dietz, 2012) was taken as the limit.

This concerns all participants who engaged in the relevant practice sessions (i.e., all conditions in practice session one, practice twice and thrice in session two, and practice thrice in session three).

We additionally tested within the practice thrice condition ( n = 24) whether there was a significant difference in performance across practice sessions one, two, and three. Performance increased on average with increasing practice opportunities ( M 1 = 60.42%, M 2 = 65.97%, M 3 = 69.44%), but these differences (possibly due to the small sample size) were not significant, F (2, 46) = 1.94, p = .155, ηp² = .08.
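As a quick consistency check on the footnote above, partial eta squared can be recovered directly from a reported F statistic and its degrees of freedom, since ηp² = SS_effect / (SS_effect + SS_error) = F·df₁ / (F·df₁ + df₂). A minimal sketch in Python (this verifies only the algebraic relationship between the reported values, not the underlying data):

```python
def partial_eta_squared(f_stat, df_effect, df_error):
    """Recover partial eta squared from a reported F statistic and its
    degrees of freedom: eta_p^2 = F*df1 / (F*df1 + df2)."""
    return (f_stat * df_effect) / (f_stat * df_effect + df_error)

# F(2, 46) = 1.94, as reported in the footnote
print(round(partial_eta_squared(1.94, 2, 46), 2))  # → 0.08
```

The reported ηp² = .08 is thus consistent with F(2, 46) = 1.94.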

Fostering Children's Thinking Skills


Narrator: Welcome to this short presentation on Fostering Children’s Thinking Skills. Teachers plan activities and interact with children every day to extend young children’s thinking skills. This presentation describes some ways that teachers can do this important work.

Narrator: The National Center on Quality Teaching and Learning uses the house framework to organize the teaching practices that are important for school readiness for all children. This module on Fostering Children’s Thinking Skills fits into the foundation of the house. A strong foundation of engaging interactions and environments is crucial to children’s school readiness. Teachers have many, many interactions with children every day. By taking advantage of these interactions, teachers can maximize children’s learning.

Narrator: What does it mean to foster children’s thinking skills? It means that teachers engage in interactions with children that support and guide children’s understanding and deepen children’s knowledge about their surroundings. In this module, we focus on three methods that teachers can use to foster children’s thinking and to help them learn more about their world.

Narrator: One way that teachers advance children’s thinking is to lead activities and experiences that provide opportunities to use the scientific method. Teachers can help children use the basic elements of the scientific method. They help children observe carefully. They help children predict, or make a good guess about what will happen in the future. And they help children experiment to see whether or not their prediction was accurate.

Narrator: Another way that teachers foster children’s thinking is to help children use problem solving. Teachers provide activities and experiences that set the stage for children to brainstorm or think about lots of ideas to plan and then to carry out a plan.

Teacher: Oh, maybe take some journals out from the bottom. That’s an idea. Ok, shall we try now?

Children: Yeah.

Teacher: Is it ready? Oh, Max that was the trick. That worked!

Narrator: A third method that teachers can use to foster children’s thinking is to help children apply their knowledge. Teachers take note of children’s curiosity and interest and then take advantage of their interest to help children connect everyday experiences to their prior knowledge.

Teacher: Brown.

Child: Brown. Green. Brown. Green. Brown.

Teacher: Ah, what comes next?

Child: Green.

Teacher: Oh, Green.

Narrator: There are many opportunities for teachers to foster children’s thinking throughout the school day in a variety of activities and routines. Teachers need to be purposeful in their planning, so that these opportunities are productive for children.

Narrator: The three methods that have been described can be used to increase the quality of teacher-child interactions to support learning in the domains of the Head Start framework. This module highlighted the ways that teachers can foster children’s thinking. Teachers can plan learning activities and experiences to help children learn to use the scientific method, problem-solve, and apply their knowledge. See our longer module on fostering children’s thinking for additional tip sheets, guides, and resources.

Narrator: Thank you for listening and have fun helping your children learn.

Learn ways teachers can support children in using thinking skills to gain deeper understandings and acquire new knowledge.

Materials for Trainers

Presentation

Presenter Notes

Learning Activity: Video Review of Book Reading

Learning Activity: Discussion of Classroom Scenarios

Learning Activity: Video Review of Nature Walk

Learning Activity: Planning in Your Classroom

Tips for Trainers

Supplemental Videos

Supplemental Videos: Dual Language Learners

Supplemental Learning Activity: Carousel Brainstorm

Supporting Materials

Tips for Teachers

Tips for Teachers: Dual Language Learners

Tips for Families

Activities with Families

Tools for Supervisors

Helpful Resources

AIAN Materials

Learning Activity: Video Review of Fox Tail

Learning Activity: Video Review of Blubber Experiment

This zip file contains presentation materials including training videos and handouts. To view or use these materials without internet access, download Fostering Children's Thinking Skills 15-minute In-service Suite in advance. Please ensure your browser is updated to the newest version available. If you have difficulty downloading this file, try using a different browser.

For more information, please contact us at ecdtl at ecetta dot info or call (toll-free) 844-261-3752.


Resource Type: Video

National Centers: Early Childhood Development, Teaching and Learning

Age Group: Preschoolers

Last Updated: September 26, 2023



Open Access

Peer-reviewed

Research Article

Fostering Critical Thinking, Reasoning, and Argumentation Skills through Bioethics Education

* E-mail: [email protected]

Affiliation Northwest Association for Biomedical Research, Seattle, Washington, United States of America

Affiliation Center for Research and Learning, Snohomish, Washington, United States of America

  • Jeanne Ting Chowning, 
  • Joan Carlton Griswold, 
  • Dina N. Kovarik, 
  • Laura J. Collins


  • Published: May 11, 2012
  • https://doi.org/10.1371/journal.pone.0036791


Developing a position on a socio-scientific issue and defending it using a well-reasoned justification involves complex cognitive skills that are challenging to both teach and assess. Our work centers on instructional strategies for fostering critical thinking skills in high school students using bioethical case studies, decision-making frameworks, and structured analysis tools to scaffold student argumentation. In this study, we examined the effects of our teacher professional development and curricular materials on the ability of high school students to analyze a bioethical case study and develop a strong position. We focused on student ability to identify an ethical question, consider stakeholders and their values, incorporate relevant scientific facts and content, address ethical principles, and consider the strengths and weaknesses of alternate solutions. 431 students and 12 teachers participated in a research study using teacher cohorts for comparison purposes. The first cohort received professional development and used the curriculum with their students; the second did not receive professional development until after their participation in the study and did not use the curriculum. In order to assess the acquisition of higher-order justification skills, students were asked to analyze a case study and develop a well-reasoned written position. We evaluated statements using a scoring rubric and found highly significant differences (p<0.001) between students exposed to the curriculum strategies and those who were not. Students also showed highly significant gains (p<0.001) in self-reported interest in science content, ability to analyze socio-scientific issues, awareness of ethical issues, ability to listen to and discuss viewpoints different from their own, and understanding of the relationship between science and society. 
Our results demonstrate that incorporating ethical dilemmas into the classroom is one strategy for increasing student motivation and engagement with science content, while promoting reasoning and justification skills that help prepare an informed citizenry.

Citation: Chowning JT, Griswold JC, Kovarik DN, Collins LJ (2012) Fostering Critical Thinking, Reasoning, and Argumentation Skills through Bioethics Education. PLoS ONE 7(5): e36791. https://doi.org/10.1371/journal.pone.0036791

Editor: Julio Francisco Turrens, University of South Alabama, United States of America

Received: February 7, 2012; Accepted: April 13, 2012; Published: May 11, 2012

Copyright: © 2012 Chowning et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Funding: The “Collaborations to Understand Research and Ethics” (CURE) program was supported by a Science Education Partnership Award grant ( http://ncrrsepa.org ) from the National Center for Research Resources and the Division of Program Coordination, Planning, and Strategic Initiatives of the National Institutes of Health through Grant Number R25OD011138. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Competing interests: The authors have declared that no competing interests exist.

Introduction

While the practice of argumentation is a cornerstone of the scientific process, students at the secondary level have few opportunities to engage in it [1] . Recent research suggests that collaborative discourse and critical dialogue focused on student claims and justifications can increase student reasoning abilities and conceptual understanding, and that strategies are needed to promote such practices in secondary science classrooms [2] . In particular, students need structured opportunities to develop arguments and discuss them with their peers. In scientific argument, the data, claims and warrants (that relate claims to data) are strictly concerned with scientific data; in a socio-scientific argument, students must consider stakeholder perspectives and ethical principles and ideas, in addition to relevant scientific background. Regardless of whether the arguments that students employ point towards scientific or socio-scientific issues, the overall processes students use in order to develop justifications rely on a model that conceptualizes arguments as claims to knowledge [3] .

Prior research in informal student reasoning and socio-scientific issues also indicates that most learners are not able to formulate high-quality arguments (as defined by the ability to articulate justifications for claims and to rebut contrary positions), and highlights the challenges related to promoting argumentation skills. Research suggests that students need experience and practice justifying their claims, recognizing and addressing counter-arguments, and learning about elements that contribute to a strong justification [4] , [5] .

Proponents of Socio-scientific Issues (SSI) education stress that the intellectual development of students in ethical reasoning is necessary to promote understanding of the relationship between science and society [4] , [6] . The SSI approach emphasizes three important principles: (a) because science literacy should be a goal for all students, science education should be broad-based and geared beyond imparting relevant content knowledge to future scientists; (b) science learning should involve students in thinking about the kinds of real-world experiences that they might encounter in their lives; and (c) when teaching about real-world issues, science teachers should aim to include contextual elements that are beyond traditional science content. Sadler and Zeidler, who advocate a SSI perspective, note that “people do not live their lives according to disciplinary boundaries, and students approach socio-scientific issues with diverse perspectives that integrate science and other considerations” [7] .

Standards for science literacy emphasize not only the importance of scientific content and processes, but also the need for students to learn about science that is contextualized in real-world situations that involve personal and community decision-making [7] – [10] . The National Board for Professional Teaching Standards stresses that students need “regular exposure to the human contexts of science [and] examples of ethical dilemmas, both current and past, that surround particular scientific activities, discoveries, and technologies” [11] . Teachers are mandated by national science standards and professional teaching standards to address the social dimensions of science, and are encouraged to provide students with the tools necessary to engage in analyzing bioethical issues; yet they rarely receive training in methods to foster such discussions with students.

The Northwest Association for Biomedical Research (NWABR), a non-profit organization that advances the understanding and support of biomedical research, has been engaging students and teachers in bringing the discussion of ethical issues in science into the classroom since 2000 [12] . The mission of NWABR is to promote an understanding of biomedical research and its ethical conduct through dialogue and education. The sixty research institutions that constitute our members include academia, industry, non-profit research organizations, research hospitals, professional societies, and volunteer health organizations. NWABR connects the scientific and education communities across the Northwestern United States and helps the public understand the vital role of research in promoting better health outcomes. We have focused on providing teachers with both resources to foster student reasoning skills (such as activities in which students practice evaluating arguments using criteria for strong justifications), as well as pedagogical strategies for fostering collaborative discussion [13] – [15] . Our work draws upon socio-scientific elements of functional scientific literacy identified by Zeidler et al. [6] . We include support for teachers in discourse issues, nature of science issues, case-based issues, and cultural issues – which all contribute to cognitive and moral development and promote functional scientific literacy. Our Collaborations to Understand Research and Ethics (CURE) program, funded by a Science Education Partnership Award from the National Institutes of Health (NIH), promotes understanding of translational biomedical research as well as the ethical considerations such research raises.

Many teachers find a principles-based approach most manageable for introducing ethical considerations. The principles include respect for persons (respecting the inherent worth of an individual and his or her autonomy), beneficence/nonmaleficence (maximizing benefits/minimizing harms), and justice (distributing benefits/burdens equitably across a group of individuals). These principles, which are articulated in the Belmont Report [16] in relation to research with human participants (and which are clarified and defended by Beauchamp and Childress [17] ), represent familiar concepts and are widely used. In our professional development workshops and in our support resources, we also introduce teachers to care, feminist, virtue, deontological and consequentialist ethics. Once teachers become familiar with principles, they often augment their teaching by incorporating these additional ethical approaches.

The Bioethics 101 materials that were the focus of our study were developed in conjunction with teachers, ethicists, and scientists. The curriculum contains a series of five classroom lessons and a culminating assessment [18] and is described in more detail in the Program Description below. For many years, teachers have shared with us the dramatic impacts that the teaching of bioethics can have on their students; this research study was designed to investigate the relationship between explicit instruction in bioethical reasoning and resulting student outcomes. In this study, teacher cohorts and student pre/post tests were used to investigate whether CURE professional development and the Bioethics 101 curriculum materials made a significant difference in high school students’ abilities to analyze a case study and justify their positions. Our research strongly indicates that such reasoning approaches can be taught to high school students and can significantly improve their ability to develop well-reasoned justifications to bioethical dilemmas. In addition, student self-reports provide additional evidence of the extent to which bioethics instruction impacted their attitudes and perceptions and increased student motivation and engagement with science content.

Program Description

Our professional development program, Ethics in the Science Classroom, spanned two weeks. The first week, a residential program at the University of Washington (UW) Pack Forest Conference Center, focused on our Bioethics 101 curriculum, which is summarized in Table S1 and is freely available at http://www.nwabr.org . The curriculum, a series of five classroom lessons and a culminating assessment, was implemented by all teachers who were part of our CURE treatment group. The lessons explore the following topics: (a) characteristics of an ethical question; (b) bioethical principles; (c) the relationship between science and ethics and the roles of objectivity/subjectivity and evidence in each; (d) analysis of a case study (including identifying an ethical question, determining relevant facts, identifying stakeholders and their concerns and values, and evaluating options); and (e) development of a well-reasoned justification for a position.

Additionally, the first week focused on effective teaching methods for incorporating ethical issues into science classrooms. We shared specific pedagogical strategies for helping teachers manage classroom discussion, such as asking students to consider the concerns and values of individuals involved in the case while in small single and mixed stakeholder groups. We also provided participants with background knowledge in biomedical research and ethics. Presentations from colleagues affiliated with the NIH Clinical and Translational Science Award program, from the Department of Bioethics and Humanities at the UW, and from NWABR member institutions helped participants develop a broad appreciation for the process of biomedical research and the ethical issues that arise as a consequence of that research. Topics included clinical trials, animal models of disease, regulation of research, and ethical foundations of research. Participants also developed materials directly relevant and applicable to their own classrooms, and shared them with other educators. Teachers wrote case studies and then used ethical frameworks to analyze the main arguments surrounding the case, thereby gaining experience in bioethical analysis. Teachers also developed Action Plans to outline their plans for implementation.

The second week provided teachers with first-hand experiences in NWABR research institutions. Teachers visited research centers such as the Tumor Vaccine Group and Clinical Research Center at the UW. They also had the opportunity to visit several of the following institutions: Amgen, Benaroya Research Institute, Fred Hutchinson Cancer Research Center, Infectious Disease Research Institute, Institute for Stem Cells and Regenerative Medicine at the UW, Pacific Northwest Diabetes Research Institute, Puget Sound Blood Center, HIV Vaccine Trials Network, and Washington National Primate Research Center. Teachers found these experiences in research facilities extremely valuable in helping make concrete the concepts and processes detailed in the first week of the program.

We held two follow-up sessions during the school year to deepen our relationship with the teachers, promote a vibrant ethics in science education community, provide additional resources and support, and reflect on challenges in implementation of our materials. We also provided the opportunity for teachers to share their experiences with one another and to report on the most meaningful longer-term impacts from the program. Another feature of our CURE program was the school-year Institutional Review Board (IRB) and Institutional Animal Care and Use Committee (IACUC) follow-up sessions. Teachers chose to attend one of NWABR’s IRB or IACUC conferences, attend a meeting of a review board, or complete NIH online ethics training. Some teachers also visited the UW Embryonic Stem Cell Research Oversight Committee. CURE funding provided substitutes in order for teachers to be released during the workday. These opportunities further engaged teachers in understanding and appreciating the actual process of oversight for federally funded research.

Participants

Most of the educators who have been through our intensive summer workshops teach secondary level science, but we have welcomed teachers at the college, community college, and even elementary levels. Our participants are primarily biology teachers; however, chemistry and physical science educators, health and career specialists, and social studies teachers have also used our strategies and materials with success.

The research design used teacher cohorts for comparison purposes and recruited teachers who expressed interest in participating in a CURE workshop in either the summer of 2009 or the summer of 2010. We assumed that all teachers who applied to the CURE workshop for either year would be similarly interested in ethics topics. Thus, Cohort 1 included teachers participating in CURE during the summer of 2009 (the treatment group). Their students received CURE instruction during the following 2009–2010 academic year. Cohort 2 (the comparison group) included teachers who were selected to participate in CURE during the summer of 2010. Their students received a semester of traditional classroom instruction in science during the 2009–2010 academic year. In order to track participation of different demographic groups, questions pertaining to race, ethnicity, and gender were also included in the post-tests.

Using an online sample size calculator ( http://www.surveysystem.com/sscalc.htm ), a 95% confidence level, and a confidence interval of 5, it was calculated that a sample size of 278 students would be needed for the research study. For that reason, six Cohort 1 teachers were impartially chosen to be in the study. For the comparison group, the study design also required six teachers from Cohort 2. The external evaluator contacted all Cohort 2 teachers to explain the research study and obtain their consent, and successfully recruited six to participate.
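The sample-size figure above follows from Cochran's formula with a finite-population correction, which is what most online calculators apply. A minimal sketch in Python, assuming maximum variability (p = 0.5) and a student population of roughly 1,000; the paper does not state the population size, so that figure is an assumption chosen here to illustrate how the reported n = 278 can arise:

```python
import math

def sample_size(z=1.96, margin=0.05, p=0.5, population=None):
    """Cochran's sample-size formula, with an optional finite-population
    correction as applied by most online sample-size calculators."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2  # ≈ 384.16 for 95% CL, ±5%
    if population is None:
        return math.ceil(n0)
    # finite-population correction: n = n0 / (1 + (n0 - 1) / N)
    return math.ceil(n0 / (1 + (n0 - 1) / population))

# an assumed population of ~1,000 students reproduces the reported n = 278
print(sample_size(population=1000))  # → 278
```

With no population bound, the same inputs give the familiar ~384-respondent figure for a 95% confidence level and a ±5% margin.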

Ethics Statement

This study was conducted according to the principles expressed in the Declaration of Helsinki. Prior to the study, research processes and materials were reviewed and approved by the Western Institutional Review Board (WIRB Study #1103180). CURE staff and evaluators received written permission from parents to have their minor children participate in the Bioethics 101 curriculum, for the collection and subsequent analysis of students’ written responses to the assessment, and for permission to collect and analyze student interview responses. Teachers also provided written informed consent prior to study participation. All study participants and/or their legal guardians provided written informed consent for the collection and subsequent analysis of verbal and written responses.

Research Study

Analyzing a Case Study: CURE and Comparison Students

Teacher cohorts and pre/post tests were used to investigate whether CURE professional development and curriculum materials made a significant difference in high school students’ abilities to analyze a case study and justify their positions. Cohort 1 teachers (N = 6) received CURE professional development and used the Bioethics 101 curriculum with their students (N = 323); Cohort 2 teachers (N = 6) did not receive professional development until after their participation in the study and did not use the curriculum with their students (N = 108). Cohort 2 students were given the test case study and questions, but with only traditional science instruction during the semester. Each Cohort was further divided into two groups (A and B). Students in Group A were asked to complete a pre-test prior to the case study, while students in Group B did not. All four student groups completed a post-test after analysis of the case study. This four-group model ( Table 1 ) allowed us to assess: 1) the effect of CURE treatment relative to conventional education practices, 2) the effect of the pre-test relative to no pre-test, and 3) the interaction between the pre-test and CURE treatment condition. Random assignment of students to treatment and comparison groups was not possible; consequently we used existing intact classes. In all, 431 students and 12 teachers participated in the research study ( Table 2 ).

Table 1: https://doi.org/10.1371/journal.pone.0036791.t001

Table 2: https://doi.org/10.1371/journal.pone.0036791.t002

In order to assess the acquisition of higher-order justification skills, students used the summative assessment provided in our curriculum as the pre- and post-test. We designed the curriculum to scaffold students’ ability to write a persuasive bioethical position; by the time they participated in the assessment, Cohort 1 students had opportunities to discuss the elements of a strong justification as well as practice in analyzing case studies. For our research, both Cohort 1 and 2 students were asked to analyze the case study of “Ashley X” ( Table S2 ), a young girl with a severe neurological impairment whose parents wished to limit her growth through a combination of interventions so that they could better care for her. Students were asked to respond to the ethical question: “Should one or more medical interventions be used to limit Ashley’s growth and physical maturation? If so, which interventions should be used and why?” In their answer, students were encouraged to develop a well-reasoned written position by responding to five questions that reflected elements of a strong justification. One difficulty in evaluating a multifaceted science-related learning task (analyzing a bioethical case study and justifying a position) is that a traditional multiple-choice assessment may not adequately reflect the subtlety and depth of student understanding. We used a rubric to assess student responses to each of the following questions (Q) on a scale of 1 to 4; these questions represent key elements of a strong justification for a bioethical argument:

  • Q1: Student Position: What is your decision?
  • Q2: Factual Support: What facts support your decision? Is there missing information that could be used to make a better decision?
  • Q3: Interests and Views of Others: Who will be impacted by the decision and how will they be impacted?
  • Q4: Ethical Considerations: What are the main ethical considerations?
  • Q5: Evaluating Alternative Options: What are some strengths and weaknesses of alternate solutions?

In keeping with our focus on the process of reasoning rather than on having students draw any particular conclusion, we did not assess students on which position they took, but on how well they stated and justified the position they chose.

We used a rubric scoring guide to assess student learning, aligned with the complex cognitive challenges posed by the task ( Table S3 ). Assessing complex aspects of student learning is often difficult, especially evaluating how students represent their knowledge and competence in the domain of bioethical reasoning. Using a scoring rubric helped us more authentically score dimensions of students’ learning and their depth of thinking. An outside scorer, who had previously participated in CURE workshops, had secondary science teaching experience, and held a Master’s degree in Bioethics, blindly scored all student pre- and post-tests. Development of the rubric was an iterative process, refined after analyzing a subset of surveys. Once finalized, we confirmed the consistency and reliability of the rubric and grading process by re-testing a subset of student surveys randomly selected from all participating classes. The Cronbach alpha reliability result was 0.80 [19] .
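The reliability statistic above is Cronbach's alpha, α = k/(k−1) · (1 − Σσᵢ² / σ_total²), where k is the number of rubric items. A minimal sketch in Python using only the standard library; the score matrix below is hypothetical (the actual student data are not reproduced here), invented purely to illustrate the computation over five rubric items (Q1–Q5) scored 1–4:

```python
from statistics import pvariance

def cronbach_alpha(item_scores):
    """Cronbach's alpha for internal consistency.
    item_scores: one inner list per rubric item, each aligned across
    the same respondents (so all inner lists have equal length)."""
    k = len(item_scores)
    sum_item_vars = sum(pvariance(item) for item in item_scores)
    totals = [sum(per_student) for per_student in zip(*item_scores)]
    return k / (k - 1) * (1 - sum_item_vars / pvariance(totals))

# hypothetical rubric scores (1-4) for Q1-Q5 across six students
scores = [
    [4, 3, 2, 4, 1, 3],
    [3, 3, 2, 4, 2, 3],
    [4, 2, 1, 3, 2, 4],
    [3, 3, 2, 4, 1, 3],
    [4, 2, 2, 3, 2, 3],
]
print(round(cronbach_alpha(scores), 2))  # → 0.92
```

Values of 0.70 and above are conventionally read as acceptable internal consistency, so the study's 0.80 sits comfortably in that range.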

The rubric closely followed the framework introduced through the curricular materials and reinforced through other case study analyses. For example, under Q2, Factual Support , a student rated 4 out of 4 if their response demonstrated the following:

  • The justification uses relevant scientific reasons to support the student’s answer to the ethical question.
  • The student demonstrates a solid understanding of the context in which the case occurs, including a thoughtful description of important missing information.
  • The student shows logical, organized thinking. Both facts supporting the decision and missing information are presented at levels exceeding standard (as described above).

An example of a student response that received the highest rating for Q2 asking for factual support is: “Her family has a history of breast cancer and fibrocystic breast disease. She is bed-bound and completely dependent on her parents. Since she is bed-bound, she has a higher risk of blood clots. She has the mentality of an infant. Her parents’ requests offer minimal side effects. With this disease, how long is she expected to live? If not very long then her parents don’t have to worry about growth. Are there alternative measures?”

In contrast, a student rated a 1 for responses that had the following characteristics:

  • Factual information relevant to the case is incompletely described or is missing.
  • Irrelevant information may be included and the student demonstrates some confusion.

An example of a student response that rated a 1 for Q2 is: “She is unconscious and doesn’t care what happens.”

All data were entered into SPSS (Statistical Package for the Social Sciences) and analyzed for means, standard deviations, and statistically significant differences. An Analysis of Variance (ANOVA) was used to test for significant overall differences between the two cohort groups. Pre-test and post-test composite scores were calculated for each student by adding individual scores for each item on the pre- and post-tests. The composite score on the post-test was identical in form and scoring to the composite score on the pre-test. The effect of the CURE treatment on post-test composite scores is referred to as the Main Effect, and was determined by comparing the post-test composite scores of the Cohort 1 (CURE) and Cohort 2 (Comparison) groups. In addition, Cohort 1 and Cohort 2 mean scores for each test question (Questions 1–5) were compared within and between cohorts using t-tests.
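The main-effect test described above is a standard one-way ANOVA; with two groups it is equivalent to a squared independent-samples t-test. A minimal pure-Python sketch of the computation (the scores below are invented for illustration, not study data; in the study each composite is the sum of the five rubric item scores):

```python
# Illustrative sketch (not the SPSS analysis itself): one-way ANOVA comparing
# two independent groups of post-test composite scores.
def one_way_anova(group_a, group_b):
    """Return (F, df_between, df_within) for two independent groups."""
    groups = (group_a, group_b)
    n_total = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n_total
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand_mean) ** 2 for g, m in zip(groups, means))
    ss_within = sum((x - m) ** 2 for g, m in zip(groups, means) for x in g)
    df_between = len(groups) - 1       # k - 1
    df_within = n_total - len(groups)  # N - k
    f_stat = (ss_between / df_between) / (ss_within / df_within)
    return f_stat, df_between, df_within
```

With invented scores, `one_way_anova([10, 11, 12], [8, 9, 10])` returns F = 6.0 with df (1, 4).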

CURE student perceptions of curriculum effect.

During prior program evaluations, we asked teachers to identify what they believed to be the main impacts of bioethics instruction on students. From this earlier work, we identified several themes. These themes, listed below, were further tested in our current study by asking students in the treatment group to assess themselves in these five areas after participation in the lesson, using a retrospective pre-test design to measure self-reported changes in perceptions and abilities [20] .

  • Interest in the science content of class (before/after) participating in the Ethics unit.
  • Ability to analyze issues related to science and society and make well-justified decisions (before/after) participating in the Ethics unit.
  • Awareness of ethics and ethical issues (before/after) participating in the Ethics unit.
  • Understanding of the connection between science and society (before/after) participating in the Ethics unit.
  • Ability to listen to and discuss different viewpoints (before/after) participating in the Ethics unit.

After Cohort 1 (CURE) students participated in the Bioethics 101 curriculum, we asked them to indicate the extent to which they had changed in each of the theme areas we had identified using Likert-scale items on a retrospective pre-test design [21] , with 1 = None and 5 = A lot! We used paired t-tests to examine self-reported changes in their perceptions and abilities. The retrospective design avoids response-shift bias that results from overestimation or underestimation of change since both before and after information is collected at the same time [20] .
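The paired t statistic used for these retrospective items can be sketched as follows; the matched pairs come from the same post-unit survey, one pair per student. The ratings in the example are invented, not study data:

```python
# Illustrative sketch: paired t statistic for retrospective before/after
# Likert ratings collected on the same survey.
from math import sqrt
from statistics import mean, stdev

def paired_t(before, after):
    """t = mean(differences) / (sd(differences) / sqrt(n))."""
    diffs = [a - b for b, a in zip(before, after)]
    return mean(diffs) / (stdev(diffs) / sqrt(len(diffs)))
```

With invented ratings, `paired_t([2, 3, 2, 4], [4, 4, 3, 5])` gives t = 5.0 on 3 degrees of freedom.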

Student Demographics

Demographic information is provided in Table 3 . Of those students who reported their gender, a larger number were female (N = 258) than male (N = 169), 60% and 40%, respectively, though female students represented a larger proportion of Cohort 1 than Cohort 2. Students ranged in age from 14 to 18 years old; the average age of the students in both cohorts was 15. Students were enrolled in a variety of science classes (mostly Biology or Honors Biology). Because NIH recognizes a difference between race and ethnicity, students were asked to respond to both demographic questions. Students in both cohorts were from a variety of ethnic and racial backgrounds.


https://doi.org/10.1371/journal.pone.0036791.t003

Pre- and Post-Test Results for CURE and Comparison Students

Post-test composite means for each cohort (1 and 2) and group (A and B) are shown in Table 4 . Students receiving CURE instruction earned significantly higher (p<0.001) composite mean scores than students in comparison classrooms. Cohort 1 (CURE) students (N = 323) post-test composite means were 10.73, while Cohort 2 (Comparison) students (N = 108) had post-test composite means of 9.16. The ANOVA results ( Table 5 ) showed significant differences in the ability to craft strong justifications between Cohort 1 (CURE) and Cohort 2 (Comparison) students, F(1, 429) = 26.64, p<0.001.


https://doi.org/10.1371/journal.pone.0036791.t004


https://doi.org/10.1371/journal.pone.0036791.t005

We also examined whether the pre-test had a priming effect on students’ scores, because it provides an opportunity to practice or think about the content. The pre-test would not have this effect on the comparison group because they were not exposed to CURE teaching or materials. If the pre-test provided a practice or priming effect, CURE students who received the pre-test would show higher post-test performance than CURE students who did not. For this comparison, F(1, 321) = 0.10, p = 0.92. This result suggests that the differences between the CURE and comparison groups are attributable to the treatment condition and not to a priming effect of the pre-test.

After differences in main effects were investigated, we analyzed differences between and within cohorts on individual items (Questions 1–5) using t-tests. Mean scores for individual questions for each cohort are shown in Figure 1 . There were no significant differences between Cohort 1 (CURE) and Cohort 2 (Comparison) on pre-test scores. In fact, for Q5, the mean pre-test score for the Cohort 2 (Comparison) group was slightly higher (1.8) than for the Cohort 1 (CURE) group (1.6). On the post-test, the Cohort 1 (CURE) students significantly outscored the Cohort 2 (Comparison) students on all questions; Q1, Q3, and Q4 were significant at p<0.001, Q2 was significant at p<0.01, and Q5 was significant at p<0.05. The largest post-test difference between Cohort 1 (CURE) students and Cohort 2 (Comparison) students was for Q3, with a difference of 0.6; all the other questions showed differences of 0.3 or less. Comparing Cohort 1 (CURE) post-test performance on individual questions yields the following results: scores were highest for Q1 (mean = 2.8), followed by Q3 (mean = 2.2), Q2 (mean = 2.1), and Q5 (mean = 1.9). The lowest Cohort 1 (CURE) post-test scores were associated with Q4 (mean = 1.8).


Mean scores for individual items of the pre-test for each cohort revealed no differences between groups for any of the items (Cohort 1, CURE, N = 323; Cohort 2, Comparison, N = 108). Post-test gains of Cohort 1 (CURE) relative to Cohort 2 (Comparison) were statistically significant for all questions. (Question (Q) 1) What is your decision? (Q2) What facts support your decision? Is there missing information that could be used to make a better decision? (Q3) Who will be impacted by the decision and how will they be impacted? (Q4) What are the main ethical considerations? and (Q5) What are some strengths and weaknesses of alternate solutions? Specifically: (Q1), (Q3), (Q4) were significant at p<0.001 (***); (Q2) was significant at p<0.01 (**); and (Q5) was significant at p<0.05 (*). Lines represent standard deviations.

https://doi.org/10.1371/journal.pone.0036791.g001

Overall, across all four groups, mean scores for Q1 were highest (2.6), while scores for Q4 were lowest (1.6). When comparing within-Cohort scores on the pre-test versus post-test, Cohort 2 (Comparison Group) showed little to no change, while CURE students improved on all test questions.

CURE Student Perceptions of Curriculum Effect

After using our resources, Cohort 1 (CURE) students showed highly significant gains (p<0.001) in all areas examined: interest in science content, ability to analyze socio-scientific issues and make well-justified decisions, awareness of ethical issues, understanding of the connection between science and society, and the ability to listen to and discuss viewpoints different from their own ( Figure 2 ). Overall, students gave the highest score to their ability to listen to and discuss viewpoints different from their own after participating in the CURE unit (mean = 4.2). Also highly rated were the changes in understanding of the connection between science and society (mean = 4.1) and the awareness of ethical issues (mean = 4.1); these two perceptions also showed the largest change pre-post (from 2.8 to 4.1 and 2.7 to 4.1, respectively).


Mean scores for individual items of the retrospective items on the post-test for Cohort 1 students revealed significant gains (p<0.001) in all self-reported items: Interest in science (N = 308), ability to Analyze issues related to science and society and make well-justified decisions (N = 306), Awareness of ethics and ethical issues (N = 309), Understanding of the connection between science and society (N = 308), and the ability to Listen and discuss different viewpoints (N = 308). Lines represent standard deviations.

https://doi.org/10.1371/journal.pone.0036791.g002

NWABR’s teaching materials support both general ethics and bioethics education and specific topics such as embryonic stem cell research. These resources were developed to provide teachers with classroom strategies, ethics background, and decision-making frameworks. Teachers are then prepared to share their understanding with their students, and to support their students in using analysis tools and participating in effective classroom discussions. Our current research grew out of a desire to measure the effectiveness of our professional development and teaching resources in fostering student ability to analyze a complex bioethical case study and to justify their positions.

Consistent with the findings of SSI researchers and our own prior anecdotal observations of teacher classrooms and student work, we found that students improve in their analytical skill when provided with reasoning frameworks and background in concepts such as beneficence, respect, and justice. Our research demonstrates that structured reasoning approaches can be effectively taught at the secondary level and that they can improve student thinking skills. After teachers participated in a two-week professional development workshop and utilized our Bioethics 101 curriculum, within a relatively short time period (five lessons spanning approximately one to two weeks), students grew significantly in their ability to analyze a complex case and justify their position compared to students not exposed to the program. Often, biology texts present a controversial issue and ask students to “justify their position,” but teachers have shared with us that students frequently do not understand what makes a position or argument well-justified. By providing students with opportunities to evaluate sample justifications, and by explicitly introducing a set of elements that students should include in their justifications, we have facilitated the development of this important cognitive skill.

The first part of our research examined the impact of CURE instruction on students’ ability to analyze a case study. Although students grew significantly in all areas, the highest scores for the Cohort 1 (CURE) students were found in response to Q1 of the case analysis, which asked them to clearly state their own position, and represented a relatively easy cognitive task. This question also received the highest score in the comparison group. Not surprisingly, students struggled most with Q4 and Q5, which asked for the ethical considerations and the strengths and weaknesses of different solutions, respectively, and which tested specialized knowledge and sophisticated analytical skills. The area in which we saw the most growth in Cohort 1 (CURE) (both in comparison to the pre-test and in relation to the comparison group) was in students’ ability to identify stakeholders in a case and state how they might be impacted by a decision (Q3). Teachers have shared with us that secondary students are often focused on their own needs and perspectives; stepping into the perspectives of others helps enlarge their understanding of the many views that can be brought to bear upon a socio-scientific issue.

Many of our teachers go far beyond these introductory lessons, revisiting key concepts throughout the year as new topics are presented in the media or as new curricular connections arise. Although we have observed this phenomenon for many years, it has been difficult to evaluate these types of interventions, as so many teachers implement the concepts and ideas differently in response to their unique needs. Some teachers have used the Bioethics 101 curriculum as a means for setting the tone and norms for the entire year in their classes and fostering an atmosphere of respectful discussion. These teachers note that the “opportunity cost” of investing time in teaching basic bioethical concepts, decision-making strategies, and justification frameworks pays off over the long run. Students’ understanding of many different science topics is enhanced by their ability to analyze issues related to science and society and make well-justified decisions. Throughout their courses, teachers are able to refer back to the core ideas introduced in Bioethics 101, reinforcing the wide utility of the curriculum.

The second part of our research focused on changes in students’ self-reported attitudes and perceptions as a result of CURE instruction. Obtaining accurate and meaningful data to assess student self-reported perceptions can be difficult, especially when a program is distributed across multiple schools. The traditional pretest-posttest design assumes that students are using the same internal standard to judge attitudes or perceptions. Considerable empirical evidence suggests that program effects based on pre-posttest self-reports are masked because people either overestimate or underestimate their pre-program perceptions [20] , [22] – [26] . Moore and Tananis [27] report that response shift can occur in educational programs, especially when they are designed to increase students’ awareness of a specific construct that is being measured. The retrospective pre-test design (RPT), which was used in this study, has gained increasing prominence as a convenient and valid method for measuring self-reported change. Because both before and after information is collected at the same time, RPT has been shown to reduce response-shift bias, providing a more accurate assessment of actual effect. It is also convenient to implement, provides comparison data, and may be more appropriate in some situations [26] . Using student self-reported measures concerning perceptions and attitudes is also a meta-cognitive strategy that allows students to think about their learning and justify where they believe they are at the end of a project or curriculum compared to where they were at the beginning.

Our approach resulted in a significant increase in students’ own perceived growth in several areas related to awareness, understanding, and interest in science. Our finding that student interest in science can be significantly increased through a case-study based bioethics curriculum has implications for instruction. Incorporating ethical dilemmas into the classroom is one strategy for increasing student motivation and engagement with science content. Students noted the greatest changes in their own awareness of ethical issues and in understanding the connection between science and society. Students gave the highest overall rating to their ability to listen to and discuss viewpoints different from their own after participation in the bioethics unit. This finding also has implications for our future citizenry; in an increasingly diverse and globalized society, students need to be able to engage in civil and rational dialogue with others who may not share their views.

Conducting research studies about ethical learning in secondary schools is challenging; recruiting teachers for Cohort 2 and obtaining consent from students, parents, and teachers for participation was particularly difficult, and many teachers faced constraints from district regulations about curriculum content. Additional studies are needed to clarify the extent to which our curricular materials alone, without accompanying teacher professional development, can improve student reasoning skills.

Teacher pre-service training programs rarely incorporate discussion of how to address ethical issues in science with prospective educators. Likewise, with some notable exceptions, such as the work of the University of Pennsylvania High School Bioethics Project, the Genetic Science Learning Center at the University of Utah, and the Kennedy Institute of Ethics at Georgetown University, relatively few resources exist for high school curricular materials in this area. Teachers have shared with us that they know that such issues are important and engaging for students, but they do not have the experience in either ethical theory or in managing classroom discussion to feel comfortable teaching bioethics topics. After participating in our workshops or using our teaching materials, teachers shared that they are better prepared to address such issues with their students, and that students are more engaged in science topics and are better able to see the real-world context of what they are learning.

Preparing students for a future in which they have access to personalized genetic information, or need to vote on proposals for stem cell research funding, necessitates providing them with the tools required to reason through a complex decision containing both scientific and ethical components. Students begin to realize that, although there may not be an absolute “right” or “wrong” decision to be made on an ethical issue, neither is ethics purely relative (“my opinion versus yours”). They come to realize that all arguments are not equal; there are stronger and weaker justifications for positions. Strong justifications are built upon accurate scientific information and solid analysis of ethical and contextual considerations. An informed citizenry that can engage in reasoned dialogue about the role science should play in society is critical to ensure the continued vitality of the scientific enterprise.

“I now bring up ethical issues regularly with my students, and use them to help students see how the concepts they are learning apply to their lives…I am seeing positive results from my students, who are more clearly able to see how abstract science concepts apply to them.” – CURE Teacher

“In ethics, I’ve learned to start thinking about the bigger picture. Before, I based my decisions on how they would affect me. Also, I made decisions depending on my personal opinions, sometimes ignoring the facts and just going with what I thought was best. Now, I know that to make an important choice, you have to consider the other people involved, not just yourself, and take all information and facts into account.” – CURE Student

Supporting Information

Bioethics 101 Lesson Overview.

https://doi.org/10.1371/journal.pone.0036791.s001

Case Study for Assessment.

https://doi.org/10.1371/journal.pone.0036791.s002

Grading Rubric for Pre- and Post-Test: Ashley’s Case.

https://doi.org/10.1371/journal.pone.0036791.s003

Acknowledgments

We thank Susan Adler, Jennifer M. Pang, Ph.D., Leena Pranikay, and Reitha Weeks, Ph.D., for their review of the manuscript, and Nichole Beddes for her assistance scoring student work. We also thank Carolyn Cohen of Cohen Research and Evaluation, former CURE Evaluation Consultant, who laid some of the groundwork for this study through her prior work with us. We also wish to thank the reviewers of our manuscript for their thoughtful feedback and suggestions.

Author Contributions

Conceived and designed the experiments: JTC LJC. Performed the experiments: LJC. Analyzed the data: LJC JTC DNK. Contributed reagents/materials/analysis tools: JCG. Wrote the paper: JTC LJC DNK JCG. Served as Principal Investigator on the CURE project: JTC. Provided overall program leadership: JTC. Led the curriculum and professional development efforts: JTC JCG. Raised funds for the CURE program: JTC.

  • 1. Bell P (2004) Promoting students’ argument construction and collaborative debate in the science classroom. Mahwah, NJ: Erlbaum.
  • 3. Toulmin S (1958) The Uses of Argument. Cambridge: Cambridge University Press.
  • 6. Zeidler DL, Sadler TD, Simmons ML, Howes EV (2005) Beyond STS: A research-based framework for socioscientific issues education. Wiley InterScience. pp. 357–377.
  • 8. AAAS (1990) Science for All Americans. New York: Oxford University Press.
  • 9. National Research Council (1996) National Science Education Standards. Washington, DC: National Academies Press.
  • 10. National Research Council (2011) A Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas. Washington, DC: National Academies Press.
  • 11. National Board for Professional Teaching Standards (2007) Adolescence and Young Adulthood Science Standards. Arlington, VA.
  • 17. Beauchamp T, Childress JF (2001) Principles of biomedical ethics. New York: Oxford University Press.
  • 18. Chowning JT, Griswold JC (2010) Bioethics 101. Seattle, WA: NWABR.
  • 26. Klatt J, Taylor-Powell E (2005) Synthesis of literature relative to the retrospective pretest design. Presentation to the 2005 Joint CES/AEA Conference, Toronto.

Skills for Life: Fostering Critical Thinking

Chiropr Osteopat

Fostering critical thinking skills: a strategy for enhancing evidence based wellness care

Jennifer R. Jamison

1 School of Chiropractic, Murdoch University, South Street, Perth, Western Australia, 6849, Australia

This is an Open Access article distributed under the terms of the Creative Commons Attribution License ( http://creativecommons.org/licenses/by/2.0 ), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Chiropractic has traditionally regarded itself as a wellness profession. As wellness care is postulated to play a central role in the future growth of chiropractic, the development of a wellness ethos acceptable within conventional health care is desirable.

This paper describes a unit which prepares chiropractic students for the role of "wellness coaches". Emphasis is placed on providing students with exercises in critical thinking in an effort to prepare them for the challenge of interfacing with an increasingly evidence based health care system.

This case study describes how health may be promoted and disease prevented through development of personalized wellness programs. As critical thinking is essential to the provision of evidence based wellness care, diverse learning opportunities for developing and refining critical thinking skills have been created. Three of the learning opportunities are an intrinsic component of the subject and, taken together, contributed over 50% of the final grade of the unit. They include a literature review, developing a client wellness contract and peer evaluation. In addition to these 3 compulsory exercises, students were also given an opportunity to develop their critical appraisal skills by undertaking voluntary self- and unit evaluation. Several opportunities for informal self-appraisal were offered in a structured self-study guide, while unit appraisal was undertaken by means of a questionnaire and group discussion at which the Head of School was present.

Formal assessment showed all students capable of preparing a wellness program consistent with current thinking in contemporary health care. The small group of students who appraised the unit seemed to value the diversity of learning experiences provided. Opportunities for voluntary unit and self-appraisal were used to varying degrees.

Unit evaluation provided useful feedback that led to substantial changes in unit structure.

Students have demonstrated themselves capable of applying critical thinking in construction of evidence based wellness programs. With respect to unit design, selective use of learning opportunities highlighted the desirability of using obligatory learning opportunities to ensure exposure to core constructs while student feedback was found to provide useful information for enriching unit review.

It is hoped inclusion of critical thinking learning opportunities in the undergraduate chiropractic curriculum will contribute to the development of an evidence based ethos in chiropractic care.

Health care has long been regarded as an art and a science. In contemporary conventional health care the 'science' dimension has increasingly come to dominate the 'art' of health care. At the undergraduate level this has been expressed as enhanced emphasis in the training of future physicians on searching and critically evaluating the available literature utilizing electronic and other databases [ 1 ]. At the level of the health care system allopathic disciplines are encouraging critical and empirical evaluation of alternative medical techniques [ 2 , 3 ]. Evidence based medicine (EBM) has become the new health care mantra and is largely pursued through critical evaluation of individual research studies, systematic reviews of studies in a particular area or practice, evidence-based practice guidelines outlining standards for the profession, and evidence-based systems of care focusing on implementation [ 4 ]. In each of these pursuits critical thinking emerges as a requisite skill.

Despite chiropractic's philosophy of vitalism contrasting sharply with the "mechanistic" foundations of orthodox medicine, there are some in the chiropractic profession who welcome this development. Not only may the development of evidence-based guidelines in chiropractic practice insulate against malpractice lawsuits, they may also improve relations between chiropractic and the health care system and better enable the chiropractic profession to achieve its foremost goal of serving as a portal of entry into the health care system, with chiropractors functioning as primary contact practitioners.

In addition to chiropractic functioning at the community-health care system interface [ 5 ], the chiropractic profession considers itself a provider of wellness care, and this is subsumed under the mantle of maintenance care [ 6 ]. "Maintenance" or "wellness" care involves regular, ongoing visits that are not correlated directly with symptomatology. However, George B. McClelland, DC, Chairman of the ACA Board of Governors, has stated "Philosophically the idea of regular spinal manipulative therapy opposes the concept of wellness" [ 7 ]. Furthermore it has been suggested that: "...the proposition of chiropractic as a "wellness profession" is not defensible." [ 8 ]. Conventional health care would concur, given that there are those in the chiropractic profession whose practice of wellness care is limited to correcting subluxations. While the notion that mechanical and functional disorders of the spine, expressed as subluxations, can degrade health, and that correction of spinal disorders by adjustments may restore health, is fundamental to chiropractic thinking, there is no scientifically acceptable data to support this belief. Furthermore, wellness care calls for a holistic approach, and the desirability for the chiropractic profession to explore a more comprehensive approach to wellness care is apparent given the Institute of Alternative Futures report Future of Chiropractic Revisited: 2005 to 2010 , which suggested possible growth scenarios for chiropractic as "wellness coaches" or as "healthy life doctors" with a wellness mindset.

If chiropractic is to evolve as a wellness profession in an increasingly evidence based health care system, it would seem necessary that it critically appraise its current wellness practices and adopt a schema in which its practitioners serve as motivators and educators. One initiative which may contribute to this end is to include in undergraduate education units which encourage critical thinking in the context of health promotion and disease prevention. Murdoch University provides its third-year chiropractic students with just such a learning opportunity.

Critical thinking skills are being thoughtfully incorporated into the curricula of nursing [ 9 , 10 ] and medical programs [ 11 ], at both undergraduate and postgraduate levels [ 12 - 14 ].

Critical thinking is regarded as purposeful, self-regulatory judgment. In addition to evaluating whether arguments are strong, weak or relevant, critical thinking involves inferring degrees of truth from given data; recognizing unstated assumptions underlying assertions; deducing whether conclusions necessarily follow from given statements and interpreting and weighing evidence to decide if generalizations are warranted [ 15 ]. It is commonly accepted that critical thinking can be taught. Diverse learning opportunities have been shown to facilitate the development and acquisition of this skill ranging from concept mapping [ 10 ], through critical questioning workshops [ 11 ] and systematic literature reviews [ 13 ] to problem based learning [ 14 ]. Problem based learning programs create scenarios in which prior knowledge is activated in a meaningful context thereby encouraging elaboration and organization of knowledge [ 16 ]. Students in problem based curricula demonstrate an enhanced ability to apply science based concepts to their explanations [ 17 ]. While problem based learning appears to be particularly useful for refining reasoning skills, integration of critical thinking in all areas of learning has been found a useful strategy for fostering this ability [ 18 ].

This paper describes how a preclinical unit has been structured to include diverse learning opportunities for applying critical thinking skills in the context of wellness. It illustrates how students can be given opportunities to practice critical thinking as a prelude to practicing evidence based health care.

Case Presentation

Unit design

Health Promotion and Nutritional Management is a subject taught in the third year of a 5-year chiropractic program at Murdoch University. The broad aims of this unit are to:

1. Provide the student with a strategy for implementing personal wellness programs in clinical practice.

2. Enable the student to critically explore the contribution of lifestyle interventions, including the use of nutrients in therapeutic doses, in health promotion, disease prevention and management.

3. Alert the student to the early signs and symptoms suggestive of some lifestyle modifiable diseases prevalent in primary practice.

The learning objectives are to:

• Enhance wellness through recruitment of wellness triggers; identification and reduction of lifestyle risk factors; promotion of fitness; and early diagnosis and management, using lifestyle interventions and nutritional therapy, of selected diseases prevalent in primary practice.

• Empower patients to take increased personal responsibility for their health care through formulation of wellness contracts: performing a personal health status appraisal; screening patients to ascertain their risk of prevalent diseases; negotiating health goals through examination of the patient's perceived and professionally assessed health needs; determining potential barriers, including cultural and socio-economic factors, to implementation of health promotion and disease prevention strategies; negotiating a health promotion and disease prevention plan; implementing a personalized health management program; and monitoring patient progress and modifying the health contract as required.

• Analyze the patient's preferred interaction style and adapt one's mode of clinical care as required.

• Critically appraise relevant literature and apply evidence-based problem solving to promote wellness.

• Implement a self-care wellness program.

The unit provides a classroom learning experience which runs for 6 weeks, and a structured self-learning guide, complemented by WebCT, a computer-based learning platform, which runs for 13 weeks of the semester. The unit has been designed to enhance active learning and encourage independent learning, and it provides 5 distinct opportunities for developing and refining critical thinking skills. These range from client health assessment, peer evaluation and literature review, which together contribute almost 60% of the final grade, to voluntary self-assessment and, finally, unit evaluation.

1 – Self-Assessment

The self-assessment learning experiences are embedded in the structured self-study learning guide. The learning guide has been structured to provide students with an opportunity to undertake continuous formative self-assessment. Figure 1 shows the template used in the structured self-directed learning guide and depicts the guideposts to the self-assessment critical appraisal opportunities provided by the challenge and review questions and self-care tasks. The factual content of the unit is covered in 25 discrete topics, each of which contains a unique learning template. For each topic the student is provided with self-assessment opportunities to:

Figure 1. Acquiring good habits.

• Critically review their learning by completing challenge and review questions based on the content of that topic. The student has the opportunity to monitor their grasp and recall of factual information.

• Apply the information provided in that topic to their lifestyle and formulate a personal wellness program. The student is given the opportunity to preview construction of a wellness program in a non-threatening environment and simultaneously embrace a self-care system based on a lifetime of health choices.

2 – A Client Wellness Program

Students who chose to prepare a personal wellness program are particularly well prepared when required to formulate the formal client wellness program. Formulating a wellness program for a client involves a number of critical thinking steps. Students are required to undertake critical appraisal of a client's lifestyle with respect to their good and bad habits and, given the client's family history, ascertain the client's health risk. They are then required to identify health needs and, in negotiation with the client, develop a list of wellness goals. The next steps are to make the client aware of diverse strategies for achieving these goals, and to help them select and then implement those strategies appropriate to their lifestyle. The student is then required to monitor the client's wellness program and adapt it as needed to meet ongoing client successes, failures and changing needs. See Figure 2.

Figure 2. Preparation of client wellness program.

3 – Peer Evaluation

The peer evaluation task is closely linked to the wellness program. Students are asked to appraise the wellness contract prepared by another student. They are encouraged to analyze all aspects of the program with a view to making useful suggestions on how the program may be improved. See Figure 3. Marks are scored for constructive criticisms that provide feedback which enhances the learning of the program originator and potentially improves the wellness outlook of the client.

Figure 3. Guidelines for peer assessment.

4 – Literature Appraisal

The ability to assess the scientific validity of information is increasingly recognized as an essential competence in a profession that is embracing the notion of evidence based practice. It is therefore imperative that students are given opportunities to critically evaluate the literature. For this exercise students are required to rank evidence according to the system developed by the Canadian Task Force and the US Preventive Services Task Force [19,20]. The guidelines for the nutritional literature review, included as part of the students' formal assessment in this unit, can be found in Figure 4 [21].

Figure 4. Critiquing the research literature.
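The ranking exercise lends itself to a simple lookup. The sketch below maps common study-design labels to the levels of evidence used by the Canadian Task Force and the US Preventive Services Task Force; the level descriptions are paraphrased from the published hierarchy, while the design labels and the function name are assumptions made for this illustration:

```python
# Hypothetical sketch of the evidence-ranking step students perform.
# Levels follow the Canadian Task Force / US Preventive Services Task Force
# hierarchy; the design labels chosen here are assumptions for illustration.
DESIGN_TO_LEVEL = {
    "randomized controlled trial": "I",        # at least one properly randomized trial
    "controlled trial without randomization": "II-1",
    "cohort study": "II-2",                    # well-designed analytic studies
    "case-control study": "II-2",
    "multiple time series": "II-3",
    "expert opinion": "III",                   # authorities, descriptive studies, committees
}

def rank_evidence(design: str) -> str:
    """Return the task-force evidence level for a study design, if recognized."""
    return DESIGN_TO_LEVEL.get(design.strip().lower(), "unranked")
```

A student critiquing a nutrition paper would first identify the study design and then report the corresponding level alongside their appraisal.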

Along with the client wellness program and its critique, the students' literature review contributes over half of the total grade for the unit.

5 – Unit Appraisal

In contrast to peer, client and literature assessment, students are given an optional opportunity to critically appraise the unit itself. Unit appraisal takes two forms: an informal questionnaire survey of student opinion initiated by the lecturer (see Figure 5), and a formal group discussion. All students are invited to participate in the group discussion, which forms part of the School's formal assessment of the unit. The Head of School is present for, and leads, these discussions.

Figure 5. The questionnaire.

Summative student assessment found students could competently prepare a client wellness program. Analysis of client wellness programs submitted for formal assessment confirmed that students had mastered the skills required to achieve this objective. All students demonstrated the ability to appraise their client's lifestyle and to prepare and monitor a wellness program. Most students were demonstrably competent to ascertain their client's individual disease risk or health hazard based on family history and lifestyle. All but 2 students commented on the preferred behaviour style of their client and took this into consideration when formulating their wellness program. A few students took their own preferred behaviour style into consideration and analysed how this might be modified to best suit the client.

In contrast to their success at developing a wellness program, formal assessment of the peer appraisal assignment suggested students found critiquing a wellness program more demanding than constructing one. While all students provided satisfactory comment on the structure and content of another's wellness program, some faltered when required to provide useful information for refining the initial program.

Formal assessment of the students' critical appraisal of the literature found all students capable of searching the literature and extracting relevant papers. Furthermore, most students were able to compare and discuss conflicting research reports, and many showed themselves capable of commenting on potential biases resulting from flaws in research design. However, few categorized the level of evidence provided according to the schema proposed by the Canadian and US Preventive Services Task Forces.

In contrast to the above compulsory critical thinking opportunities, few students availed themselves of the optional opportunity for unit assessment. The unit survey provided insight into the students' appraisal of the unit as a whole, as well as specific feedback on their evaluation of the various critical thinking opportunities. Of a class of some 60 students, a total of 22 completed the survey. Consistent with the ethos of independent learning, attendance is optional except when students are required to present their critique of the nutrition literature. The unit survey was completed by 17 students who voluntarily attended lectures and by a further 5 students who were required to do their class presentation on the day of the survey.

Half the participating students selected lectures as their most preferred learning style, a finding verified when ranked preferences were analyzed on a Likert-type scale. Figure 6 describes the overall unit rating. Eighteen students regarded the unit as highly relevant to their future practice as a chiropractor, 3 were uncertain and 1 felt it was irrelevant. The students' self-assessment of their critical reading/learning opportunity is reported in Figure 7, which provides an overview of the perceived usefulness of the study guide, the essential reading and the study questions. Linking study questions with the unit's content provided an opportunity for active learning and critical interpretation of new information, as well as for self-assessment. Two students indicated they had not attempted any of the study questions.

Figure 6. Overall unit rating.

Figure 7. Appraisal of the structured self-study guide.

A Likert-type scale was used to ascertain which of the learning experiences students perceived as most valuable. Students who indicated they had not performed an activity, or who had attended less than half of the sessions offered for it, were deemed unqualified to comment and were excluded from analysis of that activity. A score of 5 per student was allocated to each activity rated as an excellent learning experience, 4 for an activity rated as good, 3 for a fair learning experience and 2 for activities rated as a waste of time. The score derived was then divided by the number of respondents to that item, and the final score was used to rank learning experiences. On this arbitrary scale the most valued learning experiences, the WebCT challenge and the study questions, each achieved a score of 3.8; the least appreciated, the student presentation, a score of 2.57. Figure 8 shows how students appraised the popular WebCT challenge compared to the self-care and student presentation learning experiences. The WebCT challenge provided students with a formative self-assessment opportunity to evaluate their acquisition of factual knowledge, which would later be tested in the formal summative examination of the unit. Despite this imperative, 7 students had not used the WebCT challenge; similarly, 7 had not implemented any self-care tasks.

Figure 8. Appraisal of diverse learning opportunities.
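The scoring scheme described above can be expressed compactly. In this sketch the rating labels mirror the questionnaire categories (excellent = 5, good = 4, fair = 3, waste of time = 2); the response tallies are invented for illustration and are not the actual survey data:

```python
# Sketch of the Likert-type ranking scheme: map each qualified respondent's
# rating to a score, average per activity, and rank activities by the mean.
SCORES = {"excellent": 5, "good": 4, "fair": 3, "waste of time": 2}

def mean_rating(responses):
    """Mean score over the qualified respondents for one activity."""
    return sum(SCORES[r] for r in responses) / len(responses)

# Invented tallies, for illustration only.
ratings = {
    "WebCT challenge": ["excellent", "good", "good", "fair"],
    "self-care tasks": ["good", "fair", "fair"],
    "student presentation": ["fair", "waste of time", "waste of time"],
}
ranked = sorted(ratings, key=lambda a: mean_rating(ratings[a]), reverse=True)
```

With these made-up tallies the WebCT challenge ranks first and the student presentation last, mirroring the pattern the survey reported.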

This trend extended to student presentations. Five respondents indicated they had attended less than half the possible student presentations. Student presentations emerged, both in the questionnaire and in the small group evaluation of the unit, as being regarded as 'a waste of time'. Clarification identified that although students found the literature search and data analysis useful, the classroom format was regarded as 'boring' and too time consuming. This perspective was confirmed by the group of 6 students who attended the formal unit assessment conducted by the Head of School. Despite the negative classroom learning experience, the students attending the formal unit evaluation indicated they regarded the ability to critically analyze the literature as an important component of their training. Furthermore, as shown in Figure 9, two out of 3 respondents felt they had the analytical skills to assess the scientific validity of information if provided with details of the research methods used, a perception that was verified on formal assessment.

Figure 9. Perceived ability after 5 weeks: students' perception of learning.

Based on the learning they had experienced during the first 5 weeks of the semester, students were asked whether they believed themselves capable of preparing a client wellness contract. Figure 9 shows the majority of students judged themselves capable of evaluating a client's good habits, determining and changing a client's bad habits, and assessing and performing a non-invasive health hazard appraisal. Formal assessment confirmed their optimism. In contrast, the confidence of respondents with regard to their ability to undertake peer evaluation (see Figure 10) was not confirmed on formal assessment.

Figure 10. Perceived ability after 5 weeks: confidence to undertake critical appraisal.

While it is unclear whether the correction of subluxations makes a unique contribution to wellness, it is apparent that care beyond an adjustment is required if chiropractors are to take the role of 'wellness coaches' or 'healthy life doctors' in conventional health care. Wellness is a growth industry, and the scientific basis of many wellness practices is uncertain. Critical thinking is fundamental to, and regarded as an important educational objective in, the preparation of health professionals as evidence based carers [22]. Problem based learning scenarios have been found to be conducive to developing critical thinking skills in the classroom [14-18] and on the internet [23]. This paper has described how classroom interaction has been combined with paper based and internet self-study opportunities to create varied learning opportunities that enhance critical thinking in a wellness context.

Upon completion of the unit, formal assessment found students capable of formulating and administering a client wellness program, undertaking peer review and critically appraising the literature. These findings were largely consistent with the perceptions of the small group of students who chose to evaluate the unit. While the small size of the participating group precludes any extrapolation of the unit evaluation results to the whole class, the exercise did provide useful information for future planning. Marked discrepancies emerged with respect to the preferred learning opportunities of different students in the respondent group. Given that the majority of students completing the unit assessment were voluntarily attending a classroom learning experience, it was perhaps not surprising that overall they indicated a clear preference for lecture-based learning. It seems not unreasonable to surmise that at least some of their colleagues, who chose to omit classroom learning, preferred a more independent scenario. When structuring a unit it may therefore be prudent to provide diverse learning scenarios for acquiring similar knowledge, skills and attitudes, to cater for the learning needs of different individuals. Another red flag which emerged from this study is the need to incorporate compulsory learning opportunities. Although WebCT and the self-study questions were the learning opportunities most favored by the majority of respondents, there were those who had not utilized them, even though students were aware that these self-assessment learning experiences covered content in a format similar to the proposed end-of-semester examination. As some students chose to omit these learning experiences despite this incentive, compulsory completion of selected learning tasks seems advantageous. In unit planning, it would certainly seem desirable to ensure that knowledge and skills considered fundamental to chiropractic practice are included in diverse obligatory tasks.

Consistent with the ethos of student centred learning, student unit evaluation provides useful feedback for future planning. In this instance, unit modifications in response to criticisms leveled at the format of the student presentations promise to enrich the unit for future students. While retaining the central theme of demonstrating proficiency in critically appraising the literature, the delivery mode will be modified from student presentation to student debate. For example, instead of being asked to discuss the scientific basis for the use of Echinacea, the challenge will be for 2 teams to use scientifically justifiable arguments for and against the statement "Echinacea can be used to prevent the common cold".

This paper described diverse learning experiences designed to enhance critical thinking skills in the context of wellness. Using various modalities in diverse problem-solving formats, the classroom, the internet and a study guide have been combined to create structured, independent self-learning situations. Results of summative student assessment showed students capable of developing a personalized client wellness program consistent with current thinking in conventional health care. By providing a diversity of critical thinking learning opportunities, the more fundamental of which are compulsory, it is hoped that this unit will contribute to the graduation of chiropractors better prepared to interface as 'wellness coaches' or 'healthy life doctors' within an evidence based health care system.

References

1. Cohen L. McMaster's pioneer in evidence-based medicine now spreading his message in England. CMAJ. 1996;154:388–90.
2. Vickers AJ. Message to complementary and alternative medicine: evidence is a better friend than power. BMC Complement Altern Med. 2001;1:1. doi:10.1186/1472-6882-1-1.
3. Koutouvidis N. CAM and EBM: arguments for convergence. J R Soc Med. 2004;97:39–40. doi:10.1258/jrsm.97.1.39.
4. Villanueva-Russell Y. Evidence-based medicine and its implications for the profession of chiropractic. Soc Sci Med. 2005;60:545–61. doi:10.1016/j.socscimed.2004.05.017.
5. Jamison JR. The chiropractic doctor. J Manipulative Physiol Ther. 1985;8:245–9.
6. Jamison JR, Rupert R. Maintenance care: towards a global description. J Canadian Chiropractic Assoc. 2001;25:100–105.
7. Personal communication.
8. Nelson CF, Lawrence D, Triano JJ, Bronfort G, Perle SM, Metz D, et al. Chiropractic as spine care: a model for the profession. Chiropractic & Osteopathy. 2005;13:9. doi:10.1186/1746-1340-13-9.
9. Baker CM. Problem-based learning for nursing: integrating lessons from other disciplines with nursing experiences. J Prof Nurs. 2000;16:258–66. doi:10.1053/jpnu.2000.9461.
10. Wheeler LA, Collins SK. The influence of concept mapping on critical thinking in baccalaureate nursing students. J Prof Nurs. 2003;19:339–46. doi:10.1016/S8755-7223(03)00134-0.
11. Loy GL, Gelula MH, Vontver LA. Teaching students to question. Am J Obstet Gynecol. 2004;191:1752–6. doi:10.1016/j.ajog.2004.07.071.
12. Abraham RR, Upadhya S, Torke S, Ramnarayan K. Clinically oriented physiology teaching: strategy for developing critical-thinking skills in undergraduate medical students. Adv Physiol Educ. 2004;28:102–4. doi:10.1152/advan.00001.2004.
13. Lang TA. The value of systematic reviews as research activities in medical education. Acad Med. 2004;79:1067–72. doi:10.1097/00001888-200411000-00011.
14. Morales-Mann ET, Kaitell CA. Problem-based learning in a new Canadian curriculum. J Adv Nurs. 2001;33:13–9. doi:10.1046/j.1365-2648.2001.01633.x.
15. Thorpe K, Loo RC. Critical-thinking types among nursing and management undergraduates. Nurse Education Today. 2003;23:566–574. doi:10.1016/S0260-6917(03)00102-3.
16. Charlin B, Mann K, Hansen P. The many faces of problem-based learning: a framework for understanding and comparison. Medical Teacher. 1998;20:323–330. doi:10.1080/01421599880300.
17. Hmelo CE. Cognitive consequences of problem-based learning for the early development of medical expertise. Teaching & Learning in Medicine. 1998;10:92–100. doi:10.1207/S15328015TLM1002_7.
18. Chenoweth L. Facilitating the process of critical thinking for nursing. Nurse Educ Today. 1998;18:281–92. doi:10.1016/S0260-6917(98)80045-2.
19. Canadian Task Force. The periodic health examination. Can Med Assoc J. 1979;121:1193–254.
20. Report of the US Preventive Services Task Force: Guide to clinical preventive services. 2nd ed. Baltimore: Williams & Wilkins; 1996.
21. Jamison JR. Clinical guide to nutrition and dietary supplements in disease management. Edinburgh: Churchill Livingstone; 2003. p. 6, Box 1.1.
22. Kawashima A, Petrini MA. Study of critical thinking skills in nursing students and nurses in Japan. Nurse Educ Today. 2004;24:286–92. doi:10.1016/j.nedt.2004.02.001.
23. McGrath D. Teaching on the front lines: using the Internet and problem-based learning to enhance classroom teaching. Holist Nurs Pract. 2002;16:5–13.
Why is critical thinking important?

What do lawyers, accountants, teachers, and doctors all have in common?


What is critical thinking?

The Oxford English Dictionary defines critical thinking as “The objective, systematic, and rational analysis and evaluation of factual evidence in order to form a judgment on a subject, issue, etc.” Critical thinking involves the use of logic and reasoning to evaluate available facts and/or evidence to come to a conclusion about a certain subject or topic. We use critical thinking every day, from decision-making to problem-solving, in addition to thinking critically in an academic context!

Why is critical thinking important for academic success?

You may be asking “why is critical thinking important for students?” Critical thinking appears in a diverse set of disciplines and impacts students’ learning every day, regardless of major.

Critical thinking skills are often associated with the value of studying the humanities. In majors such as English, students will be presented with a certain text—whether it’s a novel, short story, essay, or even film—and will have to use textual evidence to make an argument and then defend their argument about what they’ve read. However, the importance of critical thinking does not only apply to the humanities. In the social sciences, an economics major, for example, will use what they’ve learned to figure out solutions to issues as varied as land and other natural resource use, how much people should work, and how to develop human capital through education. Problem-solving and critical thinking go hand in hand. Biology is a popular major within LAS, and graduates of the biology program often pursue careers in the medical sciences. Doctors use critical thinking every day, tapping into the knowledge they acquired from studying the biological sciences to diagnose and treat different diseases and ailments.

Students in the College of LAS take many courses that require critical thinking before they graduate. You may be asked in an economics class to use statistical data analysis to evaluate the impact on home improvement spending when the Fed increases interest rates (read more about real-world experience with Datathon). If you’ve ever been asked “How often do you think about the Roman Empire?”, you may find yourself thinking about the Roman Empire more than you thought—maybe in an English course, where you’ll use text from Shakespeare’s Antony and Cleopatra to make an argument about Roman imperial desire. No matter what the context is, critical thinking will be involved in your academic life and can take many different forms.

The benefits of critical thinking in everyday life

Building better communication

One of the most important life skills that students learn as early as elementary school is how to give a presentation. Many classes require students to give presentations, because being well-spoken is a key skill in effective communication. This is where critical thinking benefits come into play: using the skills you’ve learned, you’ll be able to gather the information needed for your presentation, narrow down what information is most relevant, and communicate it in an engaging way. 

Typically, the first step in creating a presentation is choosing a topic. For example, your professor might assign a presentation on the Gilded Age and provide a list of figures from the 1870s—1890s to choose from. You’ll use your critical thinking skills to narrow down your choices. You may ask yourself:

  • What figure am I most familiar with?
  • Who am I most interested in? 
  • Will I have to do additional research? 

After choosing your topic, your professor will usually ask a guiding question to help you form a thesis: an argument that is backed up with evidence. Critical thinking benefits this process by allowing you to focus on the information that is most relevant in support of your argument. By focusing on the strongest evidence, you will communicate your thesis clearly.

Finally, once you’ve finished gathering information, you will begin putting your presentation together. Creating a presentation requires a balance of text and visuals. Graphs and tables are popular visuals in STEM-based projects, but digital images and graphics are effective as well. Critical thinking benefits this process because the right images and visuals create a more dynamic experience for the audience, giving them the opportunity to engage with the material.

Presentation skills go beyond the classroom. Students at the University of Illinois will often participate in summer internships to get professional experience before graduation. Many summer interns are required to present about their experience and what they learned at the end of the internship. Jobs frequently also require employees to create presentations of some kind—whether it’s an advertising pitch to win an account from a potential client, or quarterly reporting, giving a presentation is a life skill that directly relates to critical thinking. 

Fostering independence and confidence

An important life skill many people start learning as college students and then finessing once they enter the “adult world” is how to budget. There will be many different expenses to keep track of, including rent, bills, car payments, and groceries, just to name a few! After developing your critical thinking skills, you’ll put them to use to consider your salary and budget your expenses accordingly. Here’s an example:

  • You earn a salary of $75,000 a year. Assume all amounts are before taxes.
  • Your largest monthly expense, say rent at $1,800 a month, costs 1,800 x 12 = $21,600 a year. 75,000 – 21,600 = 53,400, which leaves you with $53,400.
  • A $320 monthly expense, say a car payment, adds up to 320 x 12 = $3,840 a year. 53,400 – 3,840 = 49,560.
  • Another $726 a month, say for bills and groceries, comes to 726 x 12 = $8,712 a year. 49,560 – 8,712 = 40,848.
  • You’re left with $40,848 for miscellaneous expenses, and you use your critical thinking skills to decide what to do with it. You think ahead towards your retirement and decide to put $500 a month into a Roth IRA, leaving $34,848. Since you love coffee, you try to figure out if you can afford a daily coffee run. On average, a cup of coffee will cost you $7, and 7 x 365 = $2,555 a year for coffee. 34,848 – 2,555 = 32,293.
  • You have $32,293 left. You will use your critical thinking skills to figure out how much you want to put into savings, how much to set aside to treat yourself from time to time, and how much to keep for emergency funds. With the benefits of critical thinking, you will be well-equipped to budget your lifestyle once you enter the working world.
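The budget walk-through above reduces to a few lines of arithmetic. In this sketch the salary, monthly amounts, and coffee price come from the example; the expense labels are assumptions added for readability:

```python
# Sketch of the budgeting example; expense labels are illustrative guesses.
salary = 75_000          # annual salary, before taxes

monthly_expenses = {     # dollars per month
    "housing": 1_800,
    "car payment": 320,
    "bills and groceries": 726,
    "Roth IRA": 500,
}
coffee_per_cup = 7       # a daily coffee run

remaining = salary
for item, monthly in monthly_expenses.items():
    remaining -= monthly * 12
    print(f"after {item}: ${remaining:,} left")

remaining -= coffee_per_cup * 365
print(f"after coffee: ${remaining:,} left")   # $32,293 for savings and extras
```

Laying the numbers out this way makes it easy to test a decision: change one monthly amount and rerun to see how much is left for everything else.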

Enhancing decision-making skills

Choosing the right university for you

One of the biggest decisions you’ll make in your life is what college or university to go to. There are many factors to consider when making this decision, and critical thinking importance will come into play when determining these factors.

Many high school seniors apply to colleges with the hope of being accepted into a certain program, whether it’s biology, psychology, political science, English, or something else entirely. Some students apply with certain schools in mind due to overall rankings. Students also consider the campus a school is set in. While some universities such as the University of Illinois are nestled within college towns, New York University is right in Manhattan, in a big city setting. Some students dream of going to large universities, and other students prefer smaller schools. The diversity of a university’s student body is also a key consideration. For many 17- and 18-year-olds, college is a time to meet peers from diverse racial and socio-economic backgrounds and learn about life experiences different than one’s own.

With all these factors in mind, you’ll use critical thinking to decide which are most important to you—and which school is the right fit for you.

Develop your critical thinking skills at the University of Illinois

At the University of Illinois, you will not only learn how to think critically but also put critical thinking into practice. In the College of LAS, you can choose from more than 70 majors, in each of which you will learn the importance and benefits of critical thinking skills. The College of Liberal Arts & Sciences at U of I offers a wide range of undergraduate and graduate programs in life, physical, and mathematical sciences; humanities; and social and behavioral sciences. No matter which program you choose, you will develop critical thinking skills as you work through the courses in your major. And in those courses, one of the first questions your professors may ask is, “What is the goal of critical thinking?” You will be able to respond with confidence that the goal of critical thinking is to help shape people into more informed, more thoughtful members of society.

With such a vast representation of disciplines, an education in the College of LAS will prepare you for a career where you will apply critical thinking skills to real life, both in and outside of the classroom, from your undergraduate experience to your professional career. If you’re interested in becoming a part of a diverse set of students and developing skills for lifelong success, apply to LAS today!

Read more first-hand stories from our amazing students at the LAS Insider blog.
