What influences students’ abilities to critically evaluate scientific investigations?

Ashley B. Heim*, Cole Walsh, David Esparza, Michelle K. Smith, N. G. Holmes

Affiliations: Department of Ecology and Evolutionary Biology, Cornell University, Ithaca, NY, United States of America; Laboratory of Atomic and Solid State Physics, Cornell University, Ithaca, NY, United States of America

* E-mail: [email protected]

  • Published: August 30, 2022
  • https://doi.org/10.1371/journal.pone.0273337

Abstract

Critical thinking is the process by which people make decisions about what to trust and what to do. Many undergraduate courses, such as those in biology and physics, include critical thinking as an important learning goal. Assessing critical thinking, however, is non-trivial, with mixed recommendations for how to assess critical thinking as part of instruction. Here we evaluate the efficacy of assessment questions to probe students’ critical thinking skills in the context of biology and physics. We use two research-based standardized critical thinking instruments known as the Biology Lab Inventory of Critical Thinking in Ecology (Eco-BLIC) and Physics Lab Inventory of Critical Thinking (PLIC). These instruments provide experimental scenarios and pose questions asking students to evaluate what to trust and what to do regarding the quality of experimental designs and data. Using more than 3000 student responses from over 20 institutions, we sought to understand what features of the assessment questions elicit student critical thinking. Specifically, we investigated (a) how students critically evaluate aspects of research studies in biology and physics when they are individually evaluating one study at a time versus comparing and contrasting two and (b) whether individual evaluation questions are needed to encourage students to engage in critical thinking when comparing and contrasting. We found that students are more critical when making comparisons between two studies than when evaluating each study individually. Also, compare-and-contrast questions are sufficient for eliciting critical thinking, with students providing similar answers regardless of whether the individual evaluation questions are included. This research offers new insight into the types of assessment questions that elicit critical thinking at the introductory undergraduate level; specifically, we recommend instructors incorporate more compare-and-contrast questions related to experimental design in their courses and assessments.

Citation: Heim AB, Walsh C, Esparza D, Smith MK, Holmes NG (2022) What influences students’ abilities to critically evaluate scientific investigations? PLoS ONE 17(8): e0273337. https://doi.org/10.1371/journal.pone.0273337

Editor: Dragan Pamucar, University of Belgrade, Faculty of Organisational Sciences, Serbia

Received: December 3, 2021; Accepted: August 6, 2022; Published: August 30, 2022

Copyright: © 2022 Heim et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Data Availability: All raw data files are available from the Cornell Institute for Social and Economic Research (CISER) data and reproduction archive (https://archive.ciser.cornell.edu/studies/2881).

Funding: This work was supported by the National Science Foundation (nsf.gov) under grants DUE-1909602 (MS & NH) and DUE-1611482 (NH). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Competing interests: The authors have declared that no competing interests exist.

Introduction

Critical thinking and its importance.

Critical thinking, defined here as “the ways in which one uses data and evidence to make decisions about what to trust and what to do” [ 1 ], is a foundational learning goal for almost any undergraduate course and can be integrated at many points in the undergraduate curriculum. Beyond the classroom, critical thinking skills are important so that students are able to effectively evaluate data presented to them in a society where information is so readily accessible [ 2 , 3 ]. Furthermore, critical thinking is consistently ranked by employers as one of the most necessary outcomes of post-secondary education for career advancement [ 4 ]. In the workplace, those with critical thinking skills are more competitive because employers assume they can make evidence-based decisions informed by multiple perspectives, keep an open mind, and acknowledge personal limitations [ 5 , 6 ]. Despite the importance of critical thinking skills, there are mixed recommendations on how to elicit and assess critical thinking during and as a result of instruction. In response, here we evaluate the degree to which different critical thinking questions elicit students’ critical thinking skills.

Assessing critical thinking in STEM

Across STEM (i.e., science, technology, engineering, and mathematics) disciplines, several standardized assessments probe critical thinking skills. These assessments focus on aspects of critical thinking and ask students to evaluate experimental methods [ 7 – 11 ], form hypotheses and make predictions [ 12 , 13 ], evaluate data [ 2 , 12 – 14 ], or draw conclusions based on a scenario or figure [ 2 , 12 – 14 ]. Many of these assessments are open-response, so they can be difficult to score, and several are not freely available.

In addition, there is an ongoing debate regarding whether critical thinking is a domain-general or context-specific skill. That is, can someone transfer their critical thinking skills from one domain or context to another (domain-general), or do their critical thinking skills only apply in their domain or context of expertise (context-specific)? Research on the effectiveness of teaching critical thinking has found mixed results, primarily due to a lack of consensus on definitions of and assessment tools for critical thinking [ 15 , 16 ]. Some argue that critical thinking is domain-general—or what Ennis refers to as the “general approach”—because it is an overlapping skill that people use in various aspects of their lives [ 17 ]. In contrast, others argue that critical thinking must be elicited in a context-specific domain, as prior knowledge is needed to make informed decisions in one’s discipline [ 18 , 19 ]. Current assessments include domain-general components [ 2 , 7 , 8 , 14 , 20 , 21 ], asking students to evaluate, for instance, experiments on the effectiveness of dietary supplements in athletes [ 20 ], and context-specific components that measure students’ abilities to think critically in domains such as neuroscience [ 9 ] and biology [ 10 ].

Others maintain the view that critical thinking is a context-specific skill for the purpose of undergraduate education, but argue that it should be content accessible [ 22 – 24 ], as “thought processes are intertwined with what is being thought about” [ 23 ]. From this viewpoint, the context of the assessment would need to be embedded in a relatively accessible context to assess critical thinking independent of students’ content knowledge. Thus, to effectively elicit critical thinking among students, instructors should use assessments that present students with accessible domain-specific information needed to think deeply about the questions being asked [ 24 , 25 ].

Within the context of STEM, current critical thinking assessments primarily ask students to evaluate a single experimental scenario (e.g., [ 10 , 20 ]), though compare-and-contrast questions about more than one scenario can be a powerful way to elicit critical thinking [ 26 , 27 ]. Generally included in the “Analysis” level of Bloom’s taxonomy [ 28 – 30 ], compare-and-contrast questions encourage students to recognize, distinguish between, and relate features across scenarios and discern relevant patterns or trends, rather than compile lists of important features [ 26 ]. For example, a compare-and-contrast assessment may ask students to compare the hypotheses and research methods used in two different experimental scenarios, instead of having them evaluate the research methods of a single experiment. Alternatively, students may inherently recall and use experimental scenarios based on their prior experiences and knowledge as they evaluate an individual scenario. In addition, evaluating a single experimental scenario individually may act as metacognitive scaffolding [ 31 , 32 ]—a process which “guides students by asking questions about the task or suggesting relevant domain-independent strategies” [ 32 ]—to support students in their compare-and-contrast thinking.

Purpose and research questions

Our primary objective of this study was to better understand what features of assessment questions elicit student critical thinking using two existing instruments in STEM: the Biology Lab Inventory of Critical Thinking in Ecology (Eco-BLIC) and Physics Lab Inventory of Critical Thinking (PLIC). We focused on biology and physics since critical thinking assessments were already available for these disciplines. Specifically, we investigated (a) how students critically evaluate aspects of research studies in biology and physics when they are individually evaluating one study at a time or comparing and contrasting two studies and (b) whether individual evaluation questions are needed to encourage students to engage in critical thinking when comparing and contrasting.

Providing undergraduates with ample opportunities to practice critical thinking skills in the classroom is necessary for evidence-based critical thinking in their future careers and everyday life. While most critical thinking instruments in biology and physics contexts have undergone some form of validation to ensure they accurately measure the intended construct, to our knowledge none have explored how different question types influence students’ critical thinking. This research offers new insight into the types of questions that elicit critical thinking, which can be applied by educators and researchers across disciplines to measure cognitive student outcomes and incorporate more effective critical thinking opportunities in the classroom.

Materials and methods

Ethics statement

The procedures for this study were approved by the Institutional Review Board of Cornell University (Eco-BLIC: #1904008779; PLIC: #1608006532). Informed consent was obtained from all participating students via online consent forms at the beginning of the study, and students did not receive compensation for participating unless their instructor offered credit for completing the assessment.

Participants and assessment distribution

We administered the Eco-BLIC to undergraduate students across 26 courses at 11 institutions (six doctoral-granting, three Master’s-granting, and two Baccalaureate-granting) in Fall 2020 and Spring 2021 and received 1612 usable responses. Additionally, we administered the PLIC to undergraduate students across 21 courses at 11 institutions (six doctoral-granting, one Master’s-granting, three four-year colleges, and one two-year college) in Fall 2020 and Spring 2021 and received 1839 usable responses. We recruited participants via convenience sampling by emailing instructors of primarily introductory ecology-focused courses or introductory physics courses who expressed potential interest in implementing our instrument in their course(s). Both instruments were administered online via Qualtrics, and students were allowed to complete the assessments outside of class. The demographic distribution of the response data, all self-reported by students, is presented in Table 1 ; the values in this table represent all responses we received.

https://doi.org/10.1371/journal.pone.0273337.t001

Instrument description

Question types.

Though the content and concepts featured in the Eco-BLIC and PLIC are distinct, both instruments share a similar structure and set of question types. The Eco-BLIC—which was developed using a structure similar to that of the PLIC [ 1 ]—includes two predator-prey scenarios based on relationships between (a) smallmouth bass and mayflies and (b) great-horned owls and house mice. Within each scenario, students are presented with a field-based study and a laboratory-based study focused on a common research question about feeding behaviors of smallmouth bass or house mice, respectively. The prompts for these two Eco-BLIC scenarios are available in S1 and S2 Appendices. The PLIC focuses on two research groups conducting different experiments to test the relationship between oscillation periods of masses hanging on springs [ 1 ]; the prompts for this scenario can be found in S3 Appendix . The descriptive prompts in both the Eco-BLIC and PLIC also include a figure presenting data collected by each research group, from which students are expected to draw conclusions. The research scenarios (e.g., field-based group and lab-based group on the Eco-BLIC) are written so that each group has both strengths and weaknesses in their experimental designs.

After reading the prompt for the first experimental group (Group 1) in each instrument, students are asked to identify possible claims from Group 1’s data (data evaluation questions). Students next evaluate the strengths and weaknesses of various study features for Group 1 (individual evaluation questions). Examples of these individual evaluation questions are in Table 2 . They then suggest next steps the group should pursue (next steps items). Students are then asked to read the prompt describing the second experimental group’s study (Group 2) and again answer questions about the possible claims, strengths and weaknesses, and next steps of Group 2’s study (data evaluation questions, individual evaluation questions, and next steps items). Once students have independently evaluated Groups 1 and 2, they answer a series of questions comparing the study approaches of Group 1 versus Group 2 (group comparison items). In this study, we focus our analysis on the individual evaluation questions and group comparison items.


https://doi.org/10.1371/journal.pone.0273337.t002

Instrument versions.

To determine whether the individual evaluation questions impacted the assessment of students’ critical thinking, students were randomly assigned to take one of two versions of the assessment via Qualtrics branch logic: 1) a version that included the individual evaluation and group comparison items or 2) a version with only the group comparison items, with the individual evaluation questions removed. We calculated the median time it took students to answer each of these versions for both the Eco-BLIC and PLIC.
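
As a rough sketch of this assignment procedure (the routing was done by Qualtrics branch logic inside the survey platform, not by analysis code; the version labels and the 50/50 split shown here are illustrative assumptions):

```r
# Illustrative random assignment of respondents to one of the two
# instrument versions, mirroring the Qualtrics branch logic.
set.seed(1)  # for a reproducible example
n_students <- 10
version <- sample(c("individual + group comparison", "group comparison only"),
                  size = n_students, replace = TRUE)
table(version)
```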

Think-aloud interviews.

We also conducted one-on-one think-aloud interviews with students to elicit feedback on the assessment questions (Eco-BLIC n = 21; PLIC n = 4). Students were recruited via convenience sampling at our home institution and were primarily majoring in biology or physics. All interviews were audio-recorded and screen captured via Zoom and lasted approximately 30–60 minutes. We asked participants to discuss their reasoning for answering each question as they progressed through the instrument. We did not analyze these interviews in detail, but rather used them to extract relevant examples of critical thinking that helped to explain our quantitative findings. Multiple think-aloud interviews were conducted with students using previous versions of the PLIC [ 1 ], though these data are not discussed here.

Data analyses.

Our analyses focused on (1) investigating the alignment between students’ responses to the individual evaluation questions and the group comparison items and (2) comparing student responses between the two instrument versions. If individual evaluation and group comparison items elicit critical thinking in the same way, we would expect to see the same frequency of responses for each question type, as per Fig 1 . For example, if students evaluated one study feature of Group 1 as a strength and the same study feature for Group 2 as a strength, we would expect that students would respond that both groups were highly effective for this study feature on the group comparison item (i.e., data represented by the purple circle in the top right quadrant of Fig 1 ). Alternatively, if students evaluated one study feature of Group 1 as a strength and the same study feature for Group 2 as a weakness, we would expect that students would indicate that Group 1 was more effective than Group 2 on the group comparison item (i.e., data represented by the green circle in the lower right quadrant of Fig 1 ).


The x- and y-axes represent rankings on the individual evaluation questions for Groups 1 and 2 (or field and lab groups), respectively. The colors in the legend at the top of the figure denote responses to the group comparison items. In this idealized example, all pie charts are the same size to indicate that the student answers are equally proportioned across all answer combinations.

https://doi.org/10.1371/journal.pone.0273337.g001
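
To make the expected alignment concrete, here is an illustrative mapping from individual ratings to the comparison response we would expect; the midpoint split of the 1–4 rating scale is our assumption for illustration, not the authors’ scoring rule:

```r
# Maps a student's individual evaluations of Groups 1 and 2
# (1 = weakness ... 4 = strength) to the group-comparison response
# we would expect if both question types elicit the same thinking.
expected_comparison <- function(rating_group1, rating_group2) {
  g1_strong <- rating_group1 > 2.5  # illustrative midpoint split
  g2_strong <- rating_group2 > 2.5
  if (g1_strong && g2_strong)  return("both groups highly effective")
  if (g1_strong && !g2_strong) return("Group 1 more effective")
  if (!g1_strong && g2_strong) return("Group 2 more effective")
  "neither group effective"
}

expected_comparison(4, 4)  # "both groups highly effective"
expected_comparison(4, 1)  # "Group 1 more effective"
```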

We ran descriptive statistics to summarize student responses and examine distributions and frequencies of the data on the Eco-BLIC and PLIC. We also conducted chi-square goodness-of-fit tests to analyze differences in student responses between versions within the relevant questions from the same instrument. In all of these tests, we used a Bonferroni correction to reduce the chance of false positives and account for multiple comparisons. We generated figures—primarily multi-pie chart graphs and heat maps—to visualize differences between individual evaluation and group comparison items and between versions of each instrument with and without individual evaluation questions, respectively. All data analyses were conducted, and all figures generated, in the R statistical computing environment (v. 4.1.1) and Microsoft Excel.
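
A minimal sketch of one such test in R follows. The response counts are hypothetical, and the Bonferroni division by nine items is an example (0.05 / 9 ≈ 0.006, consistent with the form of the adjusted thresholds reported in the Results); this is an illustration, not the authors’ exact analysis code:

```r
# Hypothetical counts for one group-comparison item: responses from
# the version with individual evaluation questions (observed) are
# tested against the proportions from the version without them.
with_individual    <- c(group1 = 210, group2 = 180, both = 320, neither = 90)
without_individual <- c(group1 = 200, group2 = 190, both = 310, neither = 100)

test <- chisq.test(x = with_individual,
                   p = without_individual / sum(without_individual))
test$p.value

# Bonferroni correction: with 9 items tested per scenario, the
# adjusted significance threshold is 0.05 / 9, roughly 0.006.
test$p.value < 0.05 / 9
```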

Results

We asked students to evaluate different experimental set-ups on the Eco-BLIC and PLIC in two ways. Students first evaluated the strengths and weaknesses of study features for each scenario individually (individual evaluation questions, Table 2 ) and, subsequently, answered a series of questions to compare and contrast the study approaches of both research groups side-by-side (group comparison items, Table 2 ). Through analyzing the individual evaluation questions, we found that students generally ranked experimental features (i.e., those related to study set-up, data collection and summary methods, and analysis and outcomes) of the independent research groups as strengths ( Fig 2 ), evidenced by mean scores greater than 2 on a scale from 1 (weakness) to 4 (strength).


Each box represents the interquartile range (IQR). Lines within each box represent the median. Circles represent outliers of mean scores for each question.

https://doi.org/10.1371/journal.pone.0273337.g002

Individual evaluation versus compare-and-contrast evaluation

Our results indicate that when students consider Group 1 or Group 2 individually, they mark most study features as strengths (consistent with the means in Fig 2 ), shown by the large circles in the upper right quadrant across the three experimental scenarios ( Fig 3 ). However, the proportion of colors on each pie chart shows that students select a range of responses when comparing the two groups [e.g., Group 1 being more effective (green), Group 2 being more effective (blue), both groups being effective (purple), and neither group being effective (orange)]. We infer that students were more discerning (i.e., more selective) when they were asked to compare the two groups across the various study features ( Fig 3 ). In short, students think about the groups differently if they are rating either Group 1 or Group 2 in the individual evaluation questions versus directly comparing Group 1 to Group 2.


The x- and y-axes represent students’ rankings on the individual evaluation questions for Groups 1 and 2 on each assessment, respectively, where 1 indicates weakness and 4 indicates strength. The overall size of each pie chart represents the proportion of students who responded with each pair of ratings. The colors in the pie charts denote the proportion of students who chose each option on the group comparison items. (A) Eco-BLIC bass-mayfly scenario. (B) Eco-BLIC owl-mouse scenario. (C) PLIC oscillation periods of masses hanging on springs scenario.

https://doi.org/10.1371/journal.pone.0273337.g003

These results are further supported by student responses from the think-aloud interviews. For example, one interview participant responding to the bass-mayfly scenario of the Eco-BLIC explained that accounting for bias/error in both the field and lab groups in this scenario was a strength (i.e., 4). This participant mentioned that Group 1, who performed the experiment in the field, “[had] outliers, so they must have done pretty well,” and that Group 2, who collected organisms in the field but studied them in lab, “did a good job of accounting for bias.” However, when asked to compare between the groups, this student argued that Group 2 was more effective at accounting for bias/error, noting that “they controlled for more variables.”

Another individual who was evaluating “repeated trials for each mass” in the PLIC expressed a similar pattern. In response to ranking this feature of Group 1 as a strength, they explained: “Given their uncertainties and how small they are, [the group] seems like they’ve covered their bases pretty well.” Similarly, they evaluated this feature of Group 2 as a strength as well, simply noting: “Same as the last [group], I think it’s a strength.” However, when asked to compare between Groups 1 and 2, this individual argued that Group 1 was more effective because they conducted more trials.

Individual evaluation questions to support compare and contrast thinking

Given that students were more discerning when they directly compared two groups for both biology and physics experimental scenarios, we next sought to determine if the individual evaluation questions for Group 1 or Group 2 were necessary to elicit or helpful to support student critical thinking about the investigations. To test this, students were randomly assigned to one of two versions of the instrument. Students in one version saw individual evaluation questions about Group 1 and Group 2 and then saw group comparison items for Group 1 versus Group 2. Students in the second version only saw the group comparison items. We found that students assigned to both versions responded similarly to the group comparison questions, indicating that the individual evaluation questions did not promote additional critical thinking. We visually represent these similarities across versions with and without the individual evaluation questions in Fig 4 as heat maps.


The x-axis denotes students’ responses on the group comparison items (i.e., whether they ranked Group 1 as more effective, Group 2 as more effective, both groups as highly effective, or neither group as effective/both groups were minimally effective). The y-axis lists each of the study features that students compared between the field and lab groups. White and lighter shades of red indicate a lower percentage of student responses, while brighter red indicates a higher percentage of student responses. (A) Eco-BLIC bass-mayfly scenario. (B) Eco-BLIC owl-mouse scenario. (C) PLIC oscillation periods of masses hanging on springs scenario.

https://doi.org/10.1371/journal.pone.0273337.g004

We ran chi-square goodness-of-fit tests comparing student responses across the two instrument versions and found no significant differences on the Eco-BLIC bass-mayfly scenario ( Fig 4A ; based on an adjusted p-value of 0.006) or the owl-mouse scenario ( Fig 4B ; based on an adjusted p-value of 0.004). There were only three significant differences (out of 53 items) in how students responded to questions on the two versions of the PLIC ( Fig 4C ; based on an adjusted p-value of 0.0005). The items that students responded to differently ( p <0.0005) across versions were items where the two groups were identical in their design; namely, the equipment used (i.e., stopwatches), the variables measured (i.e., time and mass), and the number of bounces of the spring per trial (i.e., five bounces). We calculated Cramer’s C (Vc; [ 33 ]), a measure commonly applied to chi-square goodness-of-fit tests, to understand the magnitude of significant results. We found that the effect sizes for these three items were small (Vc = 0.11, Vc = 0.10, and Vc = 0.06, respectively).
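
The magnitude calculation can be sketched as follows. We use one common chi-square effect-size formulation for goodness-of-fit tests, sqrt(chi2 / (N(k - 1))), with hypothetical counts, so this is an illustration rather than the authors’ exact computation:

```r
# One common chi-square effect size for a goodness-of-fit test:
# sqrt(chi2 / (N * (k - 1))), with N total responses and k response
# categories; values near 0.1 are conventionally considered small.
effect_size_gof <- function(observed, expected_p) {
  test <- chisq.test(x = observed, p = expected_p)
  N <- sum(observed)
  k <- length(observed)
  sqrt(unname(test$statistic) / (N * (k - 1)))
}

# Hypothetical counts for one PLIC item:
effect_size_gof(c(120, 90, 260, 60), expected_p = c(0.25, 0.20, 0.45, 0.10))
```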

The trend that students answer the Group 1 versus Group 2 comparison questions similarly, regardless of whether they responded to the individual evaluation questions, is further supported by student responses from the think-aloud interviews. For example, one participant who did not see the individual evaluation questions for the owl-mouse scenario of the Eco-BLIC independently explained that sampling mice from other fields was a strength for both the lab and field groups. They explained that for the lab group, “I think that [the mice] coming from multiple nearby fields is good…I was curious if [mouse] behavior was universal.” For the field group, they reasoned, “I also noticed it was just from a single nearby field…I thought that was good for control.” However, this individual ultimately reasoned that the field group was “more effective for sampling methods…it’s better to have them from a single field because you know they were exposed to similar environments.” Thus, even without individual evaluation questions available, students can still make individual evaluations when comparing and contrasting between groups.

We also determined that removing the individual evaluation questions decreased the duration of time students needed to complete the Eco-BLIC and PLIC. On the Eco-BLIC, the median time to completion for the version with individual evaluation and group comparison questions was approximately 30 minutes, while the version with only the group comparisons had a median time to completion of 18 minutes. On the PLIC, the median time to completion for the version with individual evaluation questions and group comparison questions was approximately 17 minutes, while the version with only the group comparisons had a median time to completion of 15 minutes.

Discussion

To determine how to elicit critical thinking in a streamlined manner using introductory biology and physics material, we investigated (a) how students critically evaluate aspects of experimental investigations in biology and physics when they are individually evaluating one study at a time versus comparing and contrasting two and (b) whether individual evaluation questions are needed to encourage students to engage in critical thinking when comparing and contrasting.

Students are more discerning when making comparisons

We found that students were more discerning when comparing the two groups in the Eco-BLIC and PLIC than when evaluating each group individually. While students tended to independently evaluate study features of each group as strengths ( Fig 2 ), there was greater variation in their responses about which group was more effective when directly comparing the two groups ( Fig 3 ). Literature evaluating the role of contrasting cases provides plausible explanations for our results. In that work, contrasting between two cases supports students in identifying deep features of the cases, compared with evaluating one case after the other [ 34 – 37 ]. When presented with a single example, students may deem certain study features unimportant or irrelevant, but comparing study features side-by-side allows students to recognize the distinct features of each case [ 38 ]. We infer, therefore, that students were better able to recognize the strengths and weaknesses of the two groups in each of the assessment scenarios when evaluating the groups side by side, rather than in isolation [ 39 , 40 ]. This result is somewhat surprising, however, as students could have used their knowledge of experimental designs as a contrasting case when evaluating each group. Future work, therefore, should evaluate whether experts use their vast knowledge base of experimental studies as discerning contrasts when evaluating each group individually. This work would help determine whether our results suggest that students do not have a sufficient experiment-base to use as contrasts or whether they simply do not use their experiment-base when evaluating the individual groups. Regardless, our study suggests that critical thinking assessments should ask students to compare and contrast experimental scenarios, rather than just evaluate individual cases.

Individual evaluation questions do not influence answers to compare and contrast questions

We found that individual evaluation questions were unnecessary for eliciting or supporting students’ critical thinking on the two assessments. Students responded to the group comparison items similarly whether or not they had received the individual evaluation questions. The exception to this pattern was that students responded differently to three group comparison items on the PLIC when individual evaluation questions were provided. These three questions constituted a small portion of the PLIC and showed a small effect size. Furthermore, removing the individual evaluation questions decreased the median time for students to complete the Eco-BLIC and PLIC. It is plausible that spending more time thinking about the experimental methods while responding to the individual evaluation questions would then prepare students to be better discerners on the group comparison questions. However, the overall trend is that individual evaluation questions do not have a strong impact on how students evaluate experimental scenarios, nor do they set students up to be better critical thinkers later. This finding aligns with prior research suggesting that students tend to disregard details when they evaluate a single case, rather than comparing and contrasting multiple cases [ 38 ], further supporting our findings about the effectiveness of the group comparison questions.

Practical implications

Individual evaluation questions were not effective for engaging students in critical thinking, nor for preparing them for subsequent questions that elicit their critical thinking. Thus, researchers and instructors could make critical thinking assessments more effective and less time-consuming by encouraging comparisons between cases. Additionally, the study raises the question of whether instructors should incorporate more experimental case studies throughout their courses and assessments so that students have a richer experiment-base to use as contrasts when evaluating individual experimental scenarios. To help students discern information about experimental design, we suggest that instructors consider providing them with multiple experimental studies (i.e., cases) and asking them to compare and contrast between these studies.

Future directions and limitations

When designing critical thinking assessments, questions should ask students to make meaningful comparisons that require them to consider the important features of the scenarios. One challenge of relying on compare-and-contrast questions in the Eco-BLIC and PLIC to elicit students’ critical thinking is ensuring that students are comparing similar yet distinct study features across experimental scenarios, and that these comparisons are meaningful [ 38 ]. For example, though sample size differs between the experimental scenarios in our instruments, it is a significant feature with implications for other aspects of the research, such as statistical analyses and the behaviors of the animals. One limitation of our study is that we exclusively focused on experimental method evaluation questions (i.e., what to trust), and we are unsure whether the same principles hold for other dimensions of critical thinking (i.e., what to do). Future research should explore whether questions that are not in a compare-and-contrast format also effectively elicit critical thinking, and if so, to what degree.

As our question schema in the Eco-BLIC and PLIC were designed for introductory biology and physics content, it is unknown how effective this question schema would be for upper-division biology and physics undergraduates who we would expect to have more content knowledge and prior experiences for making comparisons in their respective disciplines [ 18 , 41 ]. For example, are compare-and-contrast questions still needed to elicit critical thinking among upper-division students, or would critical thinking in this population be more effectively assessed by incorporating more sophisticated data analyses in the research scenarios? Also, if students with more expert-like thinking have a richer set of experimental scenarios to inherently use as contrasts when comparing, we might expect their responses on the individual evaluation questions and group comparisons to better align. To further examine how accessible and context-specific the Eco-BLIC and PLIC are, novel scenarios could be developed that incorporate topics and concepts more commonly addressed in upper-division courses. Additionally, if instructors offer students more experience comparing and contrasting experimental scenarios in the classroom, would students be more discerning on the individual evaluation questions?

While a single consensus definition of critical thinking does not currently exist [ 15 ], continuing to explore critical thinking in other STEM disciplines beyond biology and physics may offer more insight into the context-specific nature of critical thinking [ 22 , 23 ]. Future studies should investigate critical thinking patterns in other STEM disciplines (e.g., mathematics, engineering, chemistry) through designing assessments that encourage students to evaluate aspects of at least two experimental studies. As undergraduates are often enrolled in multiple courses simultaneously and thus have domain-specific knowledge in STEM, would we observe similar patterns in critical thinking across additional STEM disciplines?

Lastly, we want to emphasize that we cannot infer every aspect of critical thinking from students’ responses on the Eco-BLIC and PLIC. However, we suggest that student responses on the think-aloud interviews provide additional qualitative insight into how and why students were making comparisons in each scenario and their overall critical thinking processes.

Conclusions

Overall, we found that comparing and contrasting two different experiments is an effective and efficient way to elicit context-specific critical thinking in introductory biology and physics undergraduates using the Eco-BLIC and the PLIC. Students are more discerning (i.e., critical) and engage more deeply with the scenarios when making comparisons between two groups. Further, students do not evaluate features of experimental studies differently when individual evaluation questions are provided or removed. These novel findings hold true across both introductory biology and physics, based on student responses on the Eco-BLIC and PLIC, respectively—though there is much more to explore regarding critical thinking processes of students across other STEM disciplines and in more advanced stages of their education. Undergraduate students in STEM need to be able to think critically for career advancement, and the Eco-BLIC and PLIC are two means of measuring students’ critical thinking in biology and physics experimental contexts via comparing and contrasting. This research offers new insight into the types of questions that elicit critical thinking, which can be applied by educators and researchers across disciplines to teach and measure cognitive student outcomes. Specifically, we recommend instructors incorporate more compare-and-contrast questions related to experimental design in their courses to efficiently elicit undergraduates’ critical thinking.

Supporting information

S1 Appendix. Eco-BLIC bass-mayfly scenario prompt.

https://doi.org/10.1371/journal.pone.0273337.s001

S2 Appendix. Eco-BLIC owl-mouse scenario prompt.

https://doi.org/10.1371/journal.pone.0273337.s002

S3 Appendix. PLIC scenario prompt.

https://doi.org/10.1371/journal.pone.0273337.s003

Acknowledgments

We thank the members of the Cornell Discipline-based Education Research group for their feedback on this article, as well as our advisory board (Jenny Knight, Meghan Duffy, Luanna Prevost, and James Hewlett) and the AAALab for their ideas and suggestions. We also greatly appreciate the instructors who shared the Eco-BLIC and PLIC in their classes and the students who participated in this study.

References

2. Stein B, Haynes A, Redding M, Ennis T, Cecil M. Assessing critical thinking in STEM and beyond. In: Innovations in e-learning, instruction technology, assessment, and engineering education. Dordrecht, Netherlands: Springer; 2007. pp. 79–82.

19. Carmichael M, Reid A, Karpicke JD. Assessing the impact of educational video on student engagement, critical thinking and learning. Sage Publishing; 2018. Retrieved from: https://au.sagepub.com/en-gb/oce/press/what-impact-does-videohave-on-student-engagement

26. Krishna Rao MR. Infusing critical thinking skills into content of AI course. In: Proceedings of the 10th annual SIGCSE conference on Innovation and technology in computer science education; 2005. pp. 173–177.

28. Bloom BS. Taxonomy of educational objectives. Vol. 1: Cognitive domain. New York, NY: McKay; 1956.

33. Cramér H. Mathematical methods of statistics. Princeton, NJ: Princeton University Press; 1946.

38. Schwartz DL, Tsang JM, Blair KP. The ABCs of how we learn: 26 scientifically proven approaches, how they work, and when to use them. New York, NY: WW Norton & Company; 2016.

41. Szenes E, Tilakaratna N, Maton K. The knowledge practices of critical thinking. In: The Palgrave handbook of critical thinking in higher education. New York, NY: Palgrave Macmillan; 2015. pp. 573–579.


The Art and Science of Critical Thinking in Research: A Guide to Academic Excellence

Dr. Sowndarya Somasundaram

Critical thinking is a fundamental skill in research and academia that involves analyzing, evaluating, and interpreting information in a systematic and logical manner. It is the process of objectively evaluating evidence, arguments, and ideas to arrive at well-reasoned conclusions or make informed decisions.

The art and science of critical thinking in research is a multifaceted and dynamic process that requires intellectual rigor, creativity, and an open mind.

In research, critical thinking is essential for developing research questions, designing research studies, collecting and analyzing data, and interpreting research findings. It allows researchers to evaluate the quality and validity of research studies, identify gaps in the literature, and make evidence-based decisions.

Critical thinking in research also involves being open to alternative viewpoints and being willing to revise one’s own conclusions based on new evidence. It requires intellectual humility and a willingness to challenge one’s own assumptions and biases.

Why Is Critical Thinking Important in Research?

Critical thinking is important in research for the following reasons:

Rigor and accuracy

It helps researchers to approach their work with rigor and accuracy, ensuring that the research methods and findings are reliable and valid.

Evaluation of evidence

Critical thinking helps researchers to evaluate the evidence they encounter and determine its relevance and reliability to the research question or hypothesis.

Identification of biases and assumptions

Critical thinking helps researchers to identify their own biases and assumptions and those of others, which can influence the research process and findings.

Problem-solving

It helps researchers to identify and solve problems that may arise during the research process, such as inconsistencies in data or unexpected results.

Development of new ideas

Critical thinking can help researchers develop new ideas and theories based on their analysis of the evidence.

Communication

Critical thinking helps researchers to communicate their findings and ideas in a clear and logical manner, making it easier for others to understand and build on their work.

Therefore, critical thinking is essential for conducting rigorous and impactful research that can advance our understanding of the world around us.

It helps researchers to approach their work with a critical and objective perspective, evaluating evidence and developing insights that can contribute to the advancement of knowledge in their field.

How to Develop Critical Thinking Skills in Research

Developing critical thinking skills in research requires a specific set of strategies. Here are some ways to develop critical thinking skills in research:

Evaluate the credibility of sources

In research, it is important to evaluate the credibility of sources to determine if the information is reliable and valid. To develop your critical thinking skills, practice evaluating the sources you encounter and assessing their credibility.

Assess the quality of evidence

Critical thinking in research involves assessing the quality of evidence and determining if it supports the research question or hypothesis. Practice evaluating the quality of evidence and understanding how it impacts the research findings.

Consider alternative explanations

To develop critical thinking skills in research, practice considering alternative explanations for the findings. Evaluate the evidence and consider if there are other explanations that could account for the results.

Challenge assumptions

Critical thinking in research involves challenging assumptions and exploring alternative perspectives. Practice questioning assumptions and considering different viewpoints to develop your critical thinking skills.

Seek out feedback

Seek out feedback from colleagues, advisors, or peers on your research methods and findings. This can help you identify areas where you need to improve your critical thinking skills and provide valuable insights for your research.

Practice analyzing data

Critical thinking in research involves analyzing and interpreting data. Practice analyzing different types of data to develop your critical thinking skills.

Attend conferences and seminars

Attend conferences and seminars in your field to learn about the latest research and to engage in critical discussions with other researchers. This can help you develop your critical thinking skills and keep up-to-date with the latest research in your field.

By consistently practicing these strategies, you can develop your critical thinking skills in research and become a more effective and insightful researcher.

The Art and Science of Critical Thinking in Research

The art and science of critical thinking in research is a vital skill for academic excellence. Here’s a guide to academic excellence through the art and science of critical thinking in research:

Define the research problem

The first step in critical thinking is to define the research problem or question. This involves identifying the key concepts, understanding the context, and formulating a clear and concise research question or hypothesis. Clearly define the research question or problem you are trying to address. This will help you focus your thinking and avoid unnecessary distractions.

Conduct a comprehensive literature review

A thorough review of relevant literature is essential in critical thinking. It helps you understand the existing knowledge and research in the field, identify research gaps, and evaluate the quality and reliability of the evidence. It also allows you to identify different perspectives and theories related to the research problem.

Evaluate evidence and sources

Critical thinking requires careful evaluation of evidence and sources. This includes assessing the credibility, reliability, and validity of research studies, data sources, and information. It also involves identifying potential biases, limitations, and assumptions in the evidence and sources. Use reputable, peer-reviewed sources and critically analyze the evidence and arguments presented in those sources.

Analyze and synthesize information

Critical thinking involves analyzing and synthesizing information from various sources. This includes identifying patterns, trends, and relationships among different pieces of information. It also requires organizing and integrating information to develop a coherent and logical argument.

Question assumptions

Challenge your assumptions and biases. Be aware of your own biases and preconceived notions, and critically examine them to avoid potential bias in your research.

Evaluate arguments and reasoning

Critical thinking involves evaluating the strength and validity of arguments and reasoning. This includes identifying logical fallacies, evaluating the coherence and consistency of arguments, and assessing the evidence and support for arguments. It also involves considering alternative viewpoints and perspectives.

Apply critical thinking tools

Use critical thinking tools such as SWOT analysis (Strengths, Weaknesses, Opportunities, Threats), mind maps, concept maps, and flowcharts to organize and analyze information in a structured and systematic manner.

Apply critical thinking skills in research design and methodology

Critical thinking is essential in research design and methodology. This includes making informed decisions about research approaches, sampling methods, data collection, and data analysis techniques. It also involves anticipating potential limitations and biases in the research design and methodology.

Consider multiple perspectives

Avoid tunnel vision by considering multiple perspectives and viewpoints on the issue at hand. This will help you gain a more comprehensive understanding of the topic and make informed decisions based on a broader range of information.

Ask critical questions

Some sample critical questions in research are listed below.

1. What is the research question, and is it clearly defined?

2. What are the assumptions underlying the research question?

3. What is the methodology being used, and is it appropriate for the research question?

4. What are the limitations of the study, and how might they affect the results?

5. How representative is the sample being studied, and are there any biases in the selection process?

6. What are the potential sources of error or bias in the data collection process?

7. Are the statistical analyses used appropriate, and do they support the conclusions drawn from the data?

8. What are the implications of the research findings, and do they have practical significance?

9. Are there any ethical considerations that arise from the research, and have they been adequately addressed?

10. Are there any alternative explanations for the results, and have they been considered and ruled out?

Communicate effectively

Critical thinking requires effective communication skills to articulate and present research findings and arguments clearly and convincingly.

This includes writing clearly and concisely, using appropriate evidence and examples, and presenting information in a logical and organized manner. It also involves listening and responding critically to feedback and engaging in constructive discussions and debates.

Practice self-reflection

Critical thinking involves self-reflection and self-awareness. Reflect on your own thinking and decision-making process throughout the research. It requires regularly evaluating your own biases, assumptions, and limitations in your thinking process. It also involves being mindful of your emotions and personal beliefs that may influence your critical thinking and decision-making.

Embrace creativity and open-mindedness

Critical thinking involves being open to new ideas, perspectives, and approaches. It requires creativity in generating and evaluating alternative solutions or interpretations.

It also involves being willing to revise your conclusions or change your research direction based on new information. Avoid confirmation bias and strive for objectivity in your research.

Seek feedback and engage in peer review

Critical thinking benefits from feedback and peer review. Seeking feedback from mentors, colleagues, or peer reviewers can help identify potential flaws or weaknesses in your research or arguments. Engaging in peer review also provides an opportunity to critically evaluate the work of others and learn from their perspectives.

By following these best practices and techniques, you can cultivate critical thinking skills that will enhance the quality and rigor of your research, leading to more successful outcomes.

Critical thinking is an essential component of research that enables researchers to evaluate information, identify biases, and draw valid conclusions.

It involves defining research problems, conducting literature reviews, evaluating evidence and sources, analyzing and synthesizing information, evaluating arguments and reasoning, applying critical thinking in research design and methodology, communicating effectively, embracing creativity and open-mindedness, practicing self-reflection, seeking feedback, and engaging in peer review.

By cultivating and applying critical thinking skills in research, you can enhance the quality and rigor of your work and contribute to the advancement of knowledge in your field.

Remember to continuously practice and refine your critical thinking skills as they are valuable not only in research but also in various aspects of life. Happy researching!


Thinking critically on critical thinking: why scientists’ skills need to spread


Rachel Grieve, Lecturer in Psychology, University of Tasmania

Disclosure statement

Rachel Grieve does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

University of Tasmania provides funding as a member of The Conversation AU.



MATHS AND SCIENCE EDUCATION: We’ve asked our authors about the state of maths and science education in Australia and its future direction. Today, Rachel Grieve discusses why we need to spread science-specific skills into the wider curriculum.

When we think of science and maths, stereotypical visions of lab coats, test-tubes, and formulae often spring to mind.

But more important than these stereotypes are the methods that underpin the work scientists do – namely generating and systematically testing hypotheses. A key part of this is critical thinking.

It’s a skill that often feels in short supply these days, but you don’t necessarily need to study science or maths in order to gain it. It’s time to take critical thinking out of the realm of maths and science and broaden it into students’ general education.

What is critical thinking?

Critical thinking is a reflective and analytical style of thinking, with its basis in logic, rationality, and synthesis. It means delving deeper and asking questions like: why is that so? Where is the evidence? How good is that evidence? Is this a good argument? Is it biased? Is it verifiable? What are the alternative explanations?

Critical thinking moves us beyond mere description and into the realms of scientific inference and reasoning. This is what enables discoveries to be made and innovations to be fostered.

For many scientists, critical thinking becomes (seemingly) intuitive, but like any skill set, critical thinking needs to be taught and cultivated. Unfortunately, educators are unable to deposit this information directly into their students’ heads. While the theory of critical thinking can be taught, critical thinking itself needs to be experienced first-hand.

So what does this mean for educators trying to incorporate critical thinking within their curricula? We can teach students the theoretical elements of critical thinking. Take, for example, working through [statistical problems](http://wdeneys.org/data/COGNIT_1695.pdf) like this one:

In a 1,000-person study, four people said their favourite series was Star Trek and 996 said Days of Our Lives. Jeremy is a randomly chosen participant in this study, is 26, and is doing graduate studies in physics. He stays at home most of the time and likes to play videogames. What is most likely? a. Jeremy’s favourite series is Star Trek b. Jeremy’s favourite series is Days of Our Lives

Some critical thought applied to this problem allows us to know that Jeremy is most likely to prefer Days of Our Lives.
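To make the base-rate reasoning explicit, here is a minimal sketch of the calculation behind that answer. The two profile-match rates are invented for illustration (they are not from the study), and they are deliberately chosen to favour the Star Trek stereotype:

```python
# Base-rate check for the Jeremy problem via Bayes' rule.
# The profile-match rates are hypothetical, picked to favour the
# "physics student = Star Trek fan" stereotype as much as plausible.
star_trek_fans = 4     # participants from the study
days_fans = 996        # participants from the study

p_profile_given_trek = 0.50   # assume half of Star Trek fans fit Jeremy's profile
p_profile_given_days = 0.05   # assume only 1 in 20 Days fans fit it

# Expected number of profile-matching participants in each group
trek_matches = star_trek_fans * p_profile_given_trek   # 2.0
days_matches = days_fans * p_profile_given_days        # 49.8

p_trek_given_profile = trek_matches / (trek_matches + days_matches)
print(f"P(Star Trek fan | profile) = {p_trek_given_profile:.3f}")  # ~0.039
```

Even with assumptions stacked in favour of the stereotype, fewer than 4% of profile-matching participants would be Star Trek fans: the base rate dominates.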

Can you teach it?

It’s well established that statistical training is associated with improved decision-making. But the idea of “teaching” critical thinking is itself an oxymoron: critical thinking can really only be learned through practice. Thus, it is not surprising that student engagement with the critical thinking process itself is what pays the dividends for students.

As such, educators try to connect students with the subject matter outside the lecture theatre or classroom. For example, problem based learning is now widely used in the health sciences, whereby students must figure out the key issues related to a case and direct their own learning to solve that problem. Problem based learning has clear parallels with real life practice for health professionals.

Critical thinking goes beyond what might be on the final exam and life-long learning becomes the key. This is a good thing, as practice helps to improve our ability to think critically over time.

Just for scientists?

For those engaging with science, learning the skills needed to be a critical consumer of information is invaluable. But should these skills remain in the domain of scientists? Clearly not: for those engaging with life, being a critical consumer of information is also invaluable, allowing informed judgement.

Being able to actively consider and evaluate information, identify biases, examine the logic of arguments, and tolerate ambiguity until the evidence is in would allow many people from all backgrounds to make better decisions. While these decisions can be trivial (does that miracle anti-wrinkle cream really do what it claims?), in many cases reasoning and decision-making can have a substantial impact, with some decisions having life-altering effects. A timely case-in-point is immunisation.

Pushing critical thinking from the realms of science and maths into the broader curriculum may lead to far-reaching outcomes. With increasing access to information on the internet, giving individuals the skills to critically think about that information may have widespread benefit, both personally and socially.

The value of science education might not always be in the facts, but in the thinking.

This is the sixth part of our series Maths and Science Education.


Humanities LibreTexts

1.5: The Scientific Method


  • Golden West College via NGE Far Press


The procedure that scientists use is also a standard form of argument. Its conclusions only give you the likelihood or the probability that something is true (if your theory or hypothesis is confirmed), and not the certainty that it’s true. But when it is done correctly, the conclusions it reaches are very well-grounded in experimental evidence. Here’s a rough outline of how the procedure works.

  • Observation: Something is observed in the world which invokes your curiosity.
  • Theory: An idea is proposed which could explain why the thing which you observed happened, or why it is what it is. This is the part of the procedure where scientists can get quite creative and imaginative.
  • Prediction: A test is planned which could prove or disprove the theory. As part of the plan, the scientist will offer a proposition in this form: “If my theory is true, then the experiment will have [whatever] result.”
  • Experiment: The test is performed, and the results are recorded.
  • Successful Result: If the prediction you made came true, then the theory devised is strengthened, not proved or made certain. The theory is “verified.” And then we go back and make more predictions and do more and more tests, to see if the theory can get stronger yet.
  • Failed Result: If the prediction did not come true, then the theory is falsified, and there are strong reasons to believe the theory is false. Nothing is ever certain (the sun may not actually rise tomorrow, for example, even though we all know it will), but we will assume that we were wrong if observations do not match our theories. When our predictions fail, we go back and devise a new theory to put to the test, and a new prediction to go with it.

Actually, a failed experimental result is really a kind of success, because falsification tells us what doesn’t work. And that frees up the scientist to pursue other, more promising theories. Scientists often test more than one theory at the same time, so that they can eventually arrive at the “last theory standing.” In this way, scientists can use a form of disjunctive syllogism (a deductive argument form) to arrive at definitive conclusions about what theory is the best explanation for the observation. Here’s how that part of the procedure works.

(P1) Either Theory 1 is true, or Theory 2 is true, or Theory 3 is true, or Theory 4 is true. (And so on, for however many theories are being tested.)

(P2) By experimental observation, Theories 1 and 2 and 3 were falsified.

(C) Therefore, Theory 4 is true.

Or, at least, Theory 4 is strengthened to the point where it would be quite absurd to believe anything else. After all, there might be other theories that we haven’t thought of, or tested yet. But until we think of them, and test them, we’re going to go with the best theory we’ve got.
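The “last theory standing” procedure can also be sketched as a simple elimination. This is a toy illustration only (the theories and results are hypothetical): keep whichever theories predicted what was actually observed.

```python
# Toy "last theory standing": discard theories whose predictions were
# falsified by the experiment; whatever survives is the best so far.
predictions = {
    "Theory 1": "result A",
    "Theory 2": "result B",
    "Theory 3": "result C",
    "Theory 4": "result D",
}
observed = "result D"  # what the experiment actually showed

surviving = [t for t, predicted in predictions.items() if predicted == observed]
print(surviving)  # ['Theory 4'] -- strengthened, not proved; untested theories may remain
```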

There’s a bit more to scientific method than this. There are paradigms and paradigm shifts, epistemic values, experimental controls and variables, and the various ways that scientists negotiate with each other as they interpret experimental results. There are also a few differences between the experimental methods used by physical scientists (such as chemists), and social scientists (such as anthropologists). Scientific method is the most powerful and successful form of knowing that has been employed. Every advance in engineering, medicine, and technology has been made possible by people applying science to their problems. It is adventurous, curious, rigorously logical, and inspirational – it is even possible to be artistic about scientific discoveries. And the best part about science is that anyone can do it. Science can look difficult because there’s a lot of jargon involved, and a lot of math. But even the most complicated quantum physics and the most far-reaching astronomy follows the same method, in principle, as that primary school project in which you played with magnets or built a model volcano.

Doing the Scientific Method Yourself

We do the scientific method every day all the time when we learn or predict things. What will happen if you don’t text/call/message your significant other for longer than you normally do? Test it and find out! What will happen if you put 5 packets of ketchup on a hot dog? Find out! Pick some variables, make a prediction of what will happen when you change the variables, and then observe the results when you make those changes. Were you correct? What have you learned? What experiment would you like to do to test your new understandings? Follow the method mentioned in this chapter and see what you can learn.


Social Sci LibreTexts

6.1: Finding and Evaluating Research Sources




Rebekah Bennetch, Corey Owen, and Zachary Keesey

Learning Objectives

By the end of this chapter, you should be able to:

  • Distinguish between popular sources and scholarly sources
  • Identify different types of sources that can be used for technical projects
  • Evaluate research sources with a critical lens by considering their authority, content, and purpose

Key Terms and Concepts

  • scholarly sources
  • popular sources
  • authority, content, and purpose

Finding and Evaluating Research Sources

While going through Bitzer and Aristotle’s rhetorical theory, you will need to start looking up some sources you can use for your report. Perhaps you already know of some sources that you can pull from. If you don’t have any sources in mind, or don’t know how to even start looking for them, that’s okay too. We’ll talk more about strategies for conducting research in the next chapter. Right now, we want to make sure you know how to properly evaluate the sources you do find when you start your research.

Popular vs Scholarly Sources

In this “information age,” when so much information is available at our fingertips on the Internet, it is crucial to be able to critically search through the reams of information in order to select credible sources that can provide reliable data to support your ideas and convince your audience. In the era of “fake news,” deliberate misinformation, and “alternative facts,” developing the skill to evaluate the credibility of sources is critical.

Sources can be broken up into two categories: popular and scholarly. How would you define the difference between the two? Can you come up with some examples of both? Ideally, when you think of both categories, you should think of items such as those in the figure below.

Figure #1: Examples of Popular vs Scholarly Sources. [1]

Why are scholarly sources more desirable than popular sources? Is it ever a good idea to use one source type over the other in the report writing process? Let’s look at this a little more deeply. Watch the video below to help distinguish between the two.

[Video: “Scholarly and Popular Sources – UVic Libraries Research Help”. Watch online at https://pb.libretexts.org/effective/?p=124 or tinyurl.com/scholarsource]

Scholarly articles published in academic journals are usually required sources in professional communication; they are also an especially integral part of engineering projects and technical reports. Since you are researching in a professional field and preparing for the workplace, there are many credible kinds of sources you will draw on in a professional context. How many can you list? Table #1 below lists several types of sources you may find useful in researching your projects.

Table #1: Typical Research Sources for Technical Projects


Critically Evaluating Sources

Clearly, there are a lot of places where you can pull sources from. However—and this is a big however—it is essential that you critically evaluate your sources for authority, content, and purpose.

Anyone can put anything on the internet, and people with strong web and document design skills can make this information look very professional and credible, even if it isn’t. Since a majority of research is currently done online, and many sources are available electronically, developing your critical evaluation skills is crucial to finding valid, credible evidence to support and develop your ideas. In fact, this has become such a challenging issue that sites like the List of Predatory Journals regularly update their online lists of journals that subvert the peer review process and simply publish for profit.

When evaluating research sources and presenting your own research, be careful to critically evaluate the authority, content, and purpose of the material, using the questions in Table #2.

Table #2: Evaluating the Authority, Content and Purpose of the Information

Going through all those questions may seem like a tedious, unnecessary process, but it is essential that you consider these questions as you acquire sources for your reports. Not doing so can negatively impact your credibility as a professional.

Consider it this way: let’s say you are presenting a report to a potential client. If they find out you used sources that lack authority, are not relevant or recent, or do not really serve their purpose, how will they view you? How will that affect their view of the company you work for?

Ultimately, critical thinking lies at the heart of evaluating sources. You want to be rigorous in your selection of evidence, because once you use it in your paper, it will either bolster your own credibility or undermine it.

Key Takeaways

  • Not all sources are created equal when it comes to research. There are two main categories: popular sources and scholarly sources. Of the two, scholarly sources are more credible because they have to go through a peer-review process in order to be published.
  • However, you still want to make sure you evaluate the sources you find for authority, content, and purpose, regardless of where you found them.

Government of Canada, Statistics Canada [online]. Available: http://www.statcan.gc.ca/eng/start

Kurland, D. (2000). What is critical thinking? How the language really works: The fundamentals of critical reading and effective writing. https://www.criticalreading.com/critical_thinking.htm

List of predatory journals. (n.d.). Stop Predatory Journals. https://predatoryjournals.com/journals/

Attributions

This chapter is adapted from Technical Writing Essentials (on BCcampus) by Suzan Last, and is used under a Creative Commons Attribution 4.0 International License.

Cover images from journals are used to illustrate the difference between popular and scholarly journals, and are for noncommercial, educational use only.


Scientific Thinking and Research: Essential Guide to Methodological Approaches


Scientific thinking and research are integral to the advancement of human knowledge and understanding. Scientific thinking is a type of knowledge-seeking process that encompasses various cognitive aspects such as asking questions, testing hypotheses, making observations, recognizing patterns, and making inferences. This process goes beyond mere facts and figures; it involves critical thinking, problem-solving, and creativity, which contribute to scientific discovery and innovation.

Moreover, research in the sciences serves as a practical application of scientific thinking, focusing on systematically investigating the natural world to extend, correct, or confirm existing scientific knowledge. The scientific method, which forms the core of scientific research, entails designing experiments, collecting and analyzing data, and drawing conclusions based on empirical evidence. This rigorous approach increases the likelihood of generating accurate, unbiased, and reliable findings across various scientific disciplines.

Key Takeaways

  • Scientific thinking involves critical thinking and problem-solving skills to advance human knowledge.
  • Research in the sciences applies scientific thinking through the systematic investigation of the natural world.
  • Combining scientific thinking and research leads to accurate, unbiased, and reliable findings in different disciplines.

Principles of Scientific Thinking

Observation and Experiment

Scientific thinking begins with the observation of the natural world and the phenomena occurring around us. Through observation, scientists gather data and gain insights into how things work. To better understand these phenomena, scientists often design and conduct experiments. Controlled experiments allow them to isolate variables, manipulate conditions, and analyze the effects of these manipulations on the outcomes.
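As a rough illustration of that design step, the sketch below randomly assigns subjects to a control group and a treatment group, so that on average the groups differ only in the one manipulated condition. The subject names and group sizes are placeholders:

```python
import random

# Random assignment for a controlled experiment: shuffle the subjects,
# then split them so only the manipulated condition separates the groups.
subjects = [f"plant_{i}" for i in range(12)]   # hypothetical subjects
random.shuffle(subjects)

control = subjects[:6]     # kept under normal conditions
treatment = subjects[6:]   # receives the single manipulated variable

print("control:  ", control)
print("treatment:", treatment)
```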

Hypothesis and Testing

A central aspect of scientific thinking involves formulating hypotheses. A hypothesis is a testable statement or question based on prior knowledge, observations, and evidence. Once a hypothesis is established, scientists use various methods to test its validity. Testing hypotheses is crucial in refining scientific knowledge and can involve experimentation, conducting case studies, or analyzing existing data. This step incorporates critical thinking and evaluation of the evidence in order to ascertain if the hypothesis holds true or needs further investigation.

Induction and Deduction

Inductive and deductive reasoning are key components of scientific thinking. Induction involves drawing general conclusions from specific observations. For example, a researcher might observe patterns in the data and use these patterns to make broader predictions about a phenomenon. On the other hand, deduction involves applying general principles to specific cases. In deduction, a scientist might start with a broader theory and then test it by making predictions about what should happen in particular situations. Both forms of reasoning help scientists develop theories and models that explain the world around them and allow them to assess the validity of hypotheses.

Scientific Theories

Scientific theories are well-substantiated explanations of some aspect of the natural world. They are the result of rigorous testing, refinement, and a convergence of evidence from multiple sources. A robust scientific theory should be falsifiable, meaning it can be potentially disproven through further experimentation or analysis. The process of developing and refining theories is ongoing, and as new evidence emerges, theories may be adjusted or replaced. Thomas Kuhn’s concept of scientific revolutions highlights how scientific theories can undergo significant shifts based on new insights and perspectives.

Internalizing the principles of scientific thinking enables individuals to evaluate the evidence and uncertainty in various situations, apply critical thinking skills, and develop a deeper understanding of the world and its intricate workings.

Scientific Research

Authorship and Articles

Scientific research involves the process of generating new knowledge and understanding by following a systematic approach. When researchers work on a problem, they publish their findings in the form of articles. These articles are authored by researchers and include the details of the study, the results obtained, and the analysis of the data. Publishing research articles allows other scientists to build upon the findings and advance the field of knowledge.

Experimental Methods

Effective experimental methods are crucial in conducting scientific research. Researchers use various experimental designs and techniques to investigate hypotheses, test theories, and obtain accurate and replicable results. Some common experimental methods include controlled experiments, field experiments, and natural experiments. These experiments help to establish causal relationships between variables and provide valuable information for problem-solving and advancing scientific knowledge.

Case Studies

In some situations, researchers may choose to conduct case studies as part of their research process. Case studies involve an in-depth examination of a particular event, phenomenon, or individual. They provide detailed information on specific instances, allowing researchers to gain a deeper understanding of complex phenomena and identify patterns that may be overlooked in broader studies. While case studies cannot establish causal relationships, they offer valuable insights into real-world situations and can be a valuable complement to other research approaches.

Scientific Argumentation

Scientific argumentation is a critical aspect of the research process, as it enables researchers to communicate their findings, defend their conclusions, and engage in constructive debates. It is essential for researchers to present their arguments clearly and systematically, avoiding confusion and ambiguity. A strong scientific argument consists of a well-structured, evidence-based reasoning that logically connects the premises and conclusions. Through scientific argumentation, researchers contribute to the ongoing dialogue in their field, fostering the development and refinement of theories and ideas.

Importance of Scientific Thinking in Education

Children’s Cognitive Development

Scientific thinking plays a crucial role in children’s cognitive development. By engaging in learning processes such as observation, reflection, and motivation, children enhance their critical thinking and problem-solving abilities. Through these processes, they learn to make sense of the world and understand complex concepts. It has been shown that the use of scientific thinking contributes significantly to young children’s development.

Incorporating scientific thinking into early education helps children progress into adulthood with a solid foundation in evidence-based decision-making. A strong emphasis on the scientific method in science education equips children with tools to combat the spread of misinformation and pseudoscience. As a result, an educated population can better understand the significance of scientific advancements and their applications to everyday life.

Teaching Science

The methods used in teaching science have evolved to ensure students develop a deep understanding of the subject. An approach based on constructivism fosters active learning and allows students to construct their knowledge through experiences and reflection. This method is particularly effective for middle school science education, where students can leverage their prior knowledge and engage in hands-on experiments.

Teachers play a vital role in nurturing scientific thinking and encouraging students to question, analyze, and interpret data. Providing activities and environments that promote collaboration among peers can also enhance the development of 21st-century skills like teamwork, communication, and adaptability. As science education continues to evolve and improve, the significance of critical thinking in the field becomes increasingly apparent.

To address the importance of scientific thinking in education, educators and researchers must work together to implement effective strategies and curricula. By fostering students’ scientific thinking abilities, educators pave the way for a generation prepared to tackle complex problems and contribute positively to society’s progress.

Scientific Thinking in Different Disciplines

Scientific thinking is a crucial aspect of many disciplines, as it allows for systematic inquiry and the acquisition of new knowledge. This section explores how scientific thinking manifests in three specific disciplines: psychology and creativity, engineering and problem solving, and philosophy and critical thinking.

Psychology and Creativity

In the field of psychology, scientific thinking focuses on understanding the human mind and its processes. Psychologists employ a variety of methodologies to develop and test hypotheses, often through experiments or observations of human behavior. A key aspect of scientific thinking in psychology involves creativity and curiosity, as researchers must be open to new ideas and novel ways of interpreting data. Embracing creativity enables psychologists to develop innovative theories, experimental designs, and therapeutic interventions, ultimately contributing to the advancement of the field.

Engineering and Problem Solving

Engineering is another discipline where scientific thinking is essential, as it relies on systematic, evidence-based approaches to solve complex problems. Engineers use their expertise in various fields like mechanical, electrical, and civil engineering, combined with critical thinking skills, to design, build, and maintain infrastructure and machinery. Problem-solving abilities are vital in engineering, as they enable engineers to identify challenges, evaluate potential solutions, and optimize designs for performance, sustainability, and cost-effectiveness. In this discipline, scientific thinking is directly applied to real-world challenges, enhancing the functionality of our society and the built environment.

Philosophy and Critical Thinking

Scientific thinking is also influential in the discipline of philosophy, as it encourages rigorous analysis and critical evaluation of ideas. Philosophers examine concepts, principles, and theories related to a wide range of areas, such as ethics, metaphysics, and epistemology. Through their work, they aim to improve our understanding of the world and inform the development of ethical and practical guidelines for personal conduct and decision-making. Scientific thinking in philosophy often involves critical thinking and logical reasoning, allowing philosophers to assess the validity of arguments, identify underlying assumptions, and uncover inconsistencies or fallacies in philosophical debates.

In each of these disciplines, scientific thinking plays a significant role in shaping the investigative methods, analytical approaches, and intellectual development of their respective fields. By fostering curiosity, creativity, critical thinking, and problem-solving, scientific thinking drives progress and innovation across a diverse spectrum of academic and professional pursuits.

Assessing Scientific Thinking

Assessing scientific thinking involves evaluating a person’s ability to reason, question beliefs, and maintain an open-minded attitude when exploring complex concepts. It is crucial to develop methods to evaluate one’s scientific thinking skills, as they play a significant role in understanding scientific principles and contribute to an individual’s overall scientific literacy.

Several assessment methods are applied to gauge scientific thinking. One common method involves analyzing a person’s ability to apply inductive and deductive reasoning when presented with scientific information. Inductive reasoning involves making general conclusions based on specific observations, while deductive reasoning entails using established principles to evaluate and predict outcomes. By analyzing an individual’s ability to make logical inferences and predictions, educators can evaluate their scientific thinking skills.

Another important aspect to assess is an individual’s beliefs. When faced with contrasting ideas and theories, it is crucial for a person to evaluate their beliefs, weigh the evidence, and adopt a theory-evidence coordination approach. This method encourages the balance between existing beliefs and newly acquired information, allowing for a comprehensive understanding of scientific concepts.

To foster open-mindedness, assessments can evaluate a person’s capacity to consider multiple perspectives and alternate explanations. Asking individuals to compare and contrast different scientific theories, recognize patterns, and make connections between unrelated concepts can help in measuring their flexibility in thinking.

Furthermore, the ability to design and conduct experiments reflects strong scientific thinking. Assessing an individual’s experimental skills, such as formulating hypotheses, selecting appropriate variables, and interpreting results, can provide valuable insight into their overall scientific thinking abilities.

In summary, assessing scientific thinking requires a multifaceted approach, considering an individual’s reasoning, beliefs, open-mindedness, and experimental capabilities. By developing effective methods to evaluate scientific thinking skills, educators can help individuals to become confident, knowledgeable, and adept at understanding complex scientific concepts.

Fundamental Concepts in Science

Force and Energy

Force and energy are fundamental concepts in the development of science. Force is a push or a pull that causes an object with mass to accelerate or change its motion, while energy is the ability to do work, such as moving an object or causing it to change its state. Exploration of these concepts has led to various predictions and conjectures, often guided by the scientific method.

Scientific curiosity has driven researchers to investigate the interaction between force and energy, leading to the discovery of numerous natural laws and phenomena. For example, Johannes Kepler’s laws of planetary motion laid the groundwork for later understanding of the relationship between gravitational force and energy.

Equilibrium and Magnetism

Equilibrium is a state in which opposing forces or actions are balanced, resulting in a stable system. Magnetism, on the other hand, is a force that acts on moving charged particles and magnetic materials, causing mutual attraction or repulsion. These concepts have a pervasive presence in the scientific world, as they govern a wide range of phenomena.

The study of equilibrium and magnetism has prompted scientists to make numerous observations and develop theories to help explain various anomalies. For instance, magnetism plays a crucial role in understanding the behavior of electrical currents and the way they interact with magnetic fields.

Atoms and the Universe

Atoms are the basic building blocks of matter, and their study is essential for comprehending the composition and structure of the universe. Scientific thinking has enabled researchers to delve deeper into the nature of atoms, leading to several important discoveries.

Atoms consist of a nucleus, which contains protons and neutrons, and electrons that orbit the nucleus. Understanding the internal structure and behavior of atoms has been crucial in advancing our knowledge of the universe and its origins.

The scientific method, characterized by observation, experimentation, and hypothesis testing, has been essential in the study of atoms and the universe. It has allowed researchers to make accurate predictions and refine our understanding of fundamental forces and particles.

Frequently Asked Questions

What are the main characteristics of scientific research?

Scientific research is characterized by systematic investigation, empirical evidence, objectivity, and replicability. It aims to understand and predict natural phenomena through the application of the scientific method, which involves observing, hypothesizing, experimenting, and analyzing data. The research should be transparent, and the results must be open to peer review.

Can you provide examples of scientific thinking in everyday life?

Scientific thinking can be applied to everyday situations, such as problem-solving, decision-making, and evaluating information. For example, when you notice your plants are not thriving, you may hypothesize that they need more sunlight or water. You can then design an experiment by altering the amount of light or water, and monitor the results. This process mirrors the scientific method in everyday life.

How can one develop scientific thinking skills?

Developing scientific thinking skills requires practice, curiosity, and openness to new ideas. Engaging in activities such as critical reading, observation, and data analysis is essential to cultivate these skills. You can try to:

  • Ask meaningful questions that can be investigated empirically
  • Analyze information critically – evaluate the credibility of sources and the validity of the arguments
  • Familiarize yourself with scientific concepts to understand the underlying principles
  • Design and conduct experiments, analyze the results, and draw conclusions based on the evidence
  • Be open to changing your mind when presented with new evidence

What are the different types of scientific thinking?

Different types of scientific thinking include inductive reasoning, deductive reasoning, and abductive reasoning.

  • Inductive reasoning involves drawing general conclusions from specific observations. For example, you might notice that birds fly and conclude that all birds can fly.
  • Deductive reasoning is the process of drawing logical conclusions from previously established facts. For example, knowing that mammals have hair and that cats are mammals, you deduce that cats have hair (see the sketch after this list).
  • Abductive reasoning involves forming a hypothesis to explain observed data or phenomena that are not entirely understood in the existing theories.
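As a toy illustration of the deductive case above, the sketch below applies one general rule to specific facts in a single forward-chaining step; the facts and the rule encoding are invented for this example:

```python
# Toy deduction: derive new facts by applying a general rule
# ("every mammal has hair") to specific known facts.
facts = {("cat", "is_mammal"), ("dog", "is_mammal")}
rules = [("is_mammal", "has_hair")]  # if X is_mammal, then X has_hair

derived = set(facts)
for premise, conclusion in rules:
    for subject, predicate in facts:
        if predicate == premise:
            derived.add((subject, conclusion))

print(("cat", "has_hair") in derived)  # True -- deduced, not directly observed
```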

What are the three principles of scientific thinking?

The three principles of scientific thinking are empiricism, parsimony, and comprehensiveness.

  • Empiricism emphasizes the importance of observational data and the critical examination of evidence when forming knowledge and understanding.
  • Parsimony is the principle of selecting the simplest explanation that accurately accounts for the observed data or phenomena.
  • Comprehensiveness refers to the quality of a theory or explanation that can account for all, or most, of the available evidence and data.

What are the five steps of the scientific thinking process?

The steps of the scientific thinking process are:

  • Asking a question: Identify a problem or phenomenon that can be investigated.
  • Conducting background research: Study existing literature and research on the topic to better understand it and identify possible explanations.
  • Developing a hypothesis: Formulate a testable statement or prediction based on your research.
  • Designing and carrying out an experiment: Develop a procedure to test the hypothesis, collect data, and analyze the results.
  • Drawing conclusions: Determine whether the data supports the hypothesis, and consider the implications of the findings for further research. (Steps 4 and 5 are sketched in code below.)
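As a minimal sketch of steps 4 and 5, the snippet below analyzes invented data for the plant-sunlight example mentioned earlier: it computes the observed difference between groups, then uses a simple one-sided permutation test to estimate how often chance alone would produce a gap at least that large. All numbers are hypothetical:

```python
import random
from statistics import mean

# Hypothetical growth data (cm): does extra sunlight increase growth?
control   = [2.1, 1.9, 2.4, 2.0, 2.2, 1.8]   # normal light
treatment = [2.6, 2.9, 2.4, 3.1, 2.7, 2.8]   # extra light

observed = mean(treatment) - mean(control)

# One-sided permutation test: shuffle the group labels many times and
# count how often a random split yields a difference at least as large.
pooled = control + treatment
extreme = 0
trials = 10_000
for _ in range(trials):
    random.shuffle(pooled)
    diff = mean(pooled[6:]) - mean(pooled[:6])
    if diff >= observed:
        extreme += 1

p_value = extreme / trials
print(f"observed difference = {observed:.2f} cm, p = {p_value:.4f}")
# A small p-value supports the hypothesis; either way, the conclusion
# feeds back into new questions and further experiments.
```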


Scientific Thinking and Critical Thinking in Science Education 

Two Distinct but Symbiotically Related Intellectual Processes

  • Open access
  • Published: 05 September 2023


  • Antonio García-Carmona, ORCID: orcid.org/0000-0001-5952-0340


Scientific thinking and critical thinking are two intellectual processes that are considered key to the basic and comprehensive education of citizens. For this reason, their development is also regarded as among the main objectives of science education. However, in the literature on the two types of thinking in the context of science education, the terms are frequently used interchangeably to refer to the same cognitive and metacognitive skills, usually leaving unclear what their differences are and what they have in common. The present work therefore aimed to elucidate the differences and relationships between these two types of thinking. The conclusion reached was that, while they differ in the purposes of their application and in some skills or processes, they also share others and are related symbiotically in a metaphorical sense; i.e., each one makes sense or develops appropriately when it is nourished or enriched by the other. Finally, an orientative proposal is presented for an integrated development of the two types of thinking in science classes.


Education is not the learning of facts, but the training of the mind to think. Albert Einstein

1 Introduction

In consulting technical reports, theoretical frameworks, research, and curricular reforms related to science education, one commonly finds appeals to scientific thinking and critical thinking as essential educational processes or objectives. This is confirmed in some studies that include exhaustive reviews of the literature in this regard, such as those of Bailin ( 2002 ), Costa et al. ( 2020 ), and Santos ( 2017 ) on critical thinking, and of Klarh et al. ( 2019 ) and Lehrer and Schauble ( 2006 ) on scientific thinking. However, conceptualizing and differentiating between both types of thinking based on the above-mentioned documents of science education is generally difficult. In many cases, they are referred to without defining them, or they are used interchangeably to represent virtually the same thing. Thus, for example, the document A Framework for K-12 Science Education points out that “Critical thinking is required, whether in developing and refining an idea (an explanation or design) or in conducting an investigation” (National Research Council (NRC), 2012 , p. 46). The same document also refers to scientific thinking when it suggests that basic scientific education should “provide students with opportunities for a range of scientific activities and scientific thinking , including, but not limited to inquiry and investigation, collection and analysis of evidence, logical reasoning, and communication and application of information” (NRC, 2012 , p. 251).

A few years earlier, the report Science Teaching in Schools in Europe: Policies and Research (European Commission/Eurydice, 2006 ) included the dimension “scientific thinking” as part of standardized national science tests in European countries. This dimension consisted of three basic abilities: (i) to solve problems formulated in theoretical terms , (ii) to frame a problem in scientific terms , and (iii) to formulate scientific hypotheses . In contrast, critical thinking was not even mentioned in such a report. However, in subsequent similar reports by the European Commission/Eurydice ( 2011 , 2022 ), there are some references to the fact that the development of critical thinking should be a basic objective of science teaching, although these reports do not define it at any point.

The ENCIENDE report on early-year science education in Spain also includes an explicit allusion to critical thinking among its recommendations: “Providing students with learning tools means helping them to develop critical thinking , to form their own opinions, to distinguish between knowledge founded on the evidence available at a certain moment (evidence which can change) and unfounded beliefs” (Confederation of Scientific Societies in Spain (COSCE), 2011 , p. 62). However, the report makes no explicit mention to scientific thinking. More recently, the document “ Enseñando ciencia con ciencia ” (Teaching science with science) (Couso et al., 2020 ), sponsored by Spain’s Ministry of Education, also addresses critical thinking:

(…) with the teaching approach through guided inquiry students learn scientific content, learn to do science (procedures), learn what science is and how it is built, and this (...) helps to develop critical thinking , that is, to question any statement that is not supported by evidence. (Couso et al., 2020 , p. 54)

On the other hand, in referring to what is practically the same thing, the European report Science Education for Responsible Citizenship speaks of scientific thinking when it establishes that one of the challenges of scientific education should be: “To promote a culture of scientific thinking and inspire citizens to use evidence-based reasoning for decision making” (European Commission, 2015 , p. 14). However, the Pisa 2024 Strategic Vision and Direction for Science report does not mention scientific thinking but does mention critical thinking in noting that “More generally, (students) should be able to recognize the limitations of scientific inquiry and apply critical thinking when engaging with its results” (Organization for Economic Co-operation and Development (OECD), 2020 , p. 9).

The new Spanish science curriculum for basic education (Royal Decree 217/ 2022 ) does make explicit reference to scientific thinking. For example, one of the STEM (Science, Technology, Engineering, and Mathematics) competency descriptors for compulsory secondary education reads:

Use scientific thinking to understand and explain the phenomena that occur around them, trusting in knowledge as a motor for development, asking questions and checking hypotheses through experimentation and inquiry (...) showing a critical attitude about the scope and limitations of science. (p. 41,599)

Furthermore, when developing the curriculum for the subjects of physics and chemistry, the same provision clarifies that “The essence of scientific thinking is to understand what are the reasons for the phenomena that occur in the natural environment to then try to explain them through the appropriate laws of physics and chemistry” (Royal Decree 217/ 2022 , p. 41,659). However, within the science subjects (i.e., Biology and Geology, and Physics and Chemistry), critical thinking is not mentioned as such. It is only more or less directly alluded to with such expressions as “critical analysis”, “critical assessment”, “critical reflection”, “critical attitude”, and “critical spirit”, with no attempt to conceptualize it as is done with regard to scientific thinking.

The above is just a small sample of the concepts of scientific thinking and critical thinking only being differentiated in some cases, while in others they are presented as interchangeable, using one or the other indistinctly to talk about the same cognitive/metacognitive processes or practices. In fairness, however, it has to be acknowledged—as said at the beginning—that it is far from easy to conceptualize these two types of thinking (Bailin, 2002 ; Dwyer et al., 2014 ; Ennis, 2018 ; Lehrer & Schauble, 2006 ; Kuhn, 1993 , 1999 ) since they feed back on each other, partially overlap, and share certain features (Cáceres et al., 2020 ; Vázquez-Alonso & Manassero-Mas, 2018 ). Neither is there unanimity in the literature on how to characterize each of them, and rarely have they been analyzed comparatively (e.g., Hyytinen et al., 2019 ). For these reasons, I believed it necessary to address this issue with the present work in order to offer some guidelines for science teachers interested in delving deeper into these two intellectual processes to promote them in their classes.

2 An Attempt to Delimit Scientific Thinking in Science Education

For many years, cognitive science has been interested in studying what scientific thinking is and how it can be taught in order to improve students’ science learning (Klarh et al., 2019 ; Zimmerman & Klarh, 2018 ). To this end, Kuhn et al. propose taking a characterization of science as argument (Kuhn, 1993 ; Kuhn et al., 2008 ). They argue that this is a suitable way of linking the activity of how scientists think with that of the students and of the public in general, since science is a social activity which is subject to ongoing debate, in which the construction of arguments plays a key role. Lehrer and Schauble ( 2006 ) link scientific thinking with scientific literacy, paying special attention to the different images of science. According to those authors, these images would guide the development of the said literacy in class. The images of science that Lehrer and Schauble highlight as characterizing scientific thinking are: (i) science-as-logical reasoning (role of domain-general forms of scientific reasoning, including formal logic, heuristics, and strategies applied in different fields of science), (ii) science-as-theory change (science is subject to permanent revision and change), and (iii) science-as-practice (scientific knowledge and reasoning are components of a larger set of activities that include rules of participation, procedural skills, epistemological knowledge, etc.).

Based on a literature review, Jirout ( 2020 ) defines scientific thinking as an intellectual process whose purpose is the intentional search for information about a phenomenon or facts by formulating questions, checking hypotheses, carrying out observations, recognizing patterns, and making inferences (a detailed description of all these scientific practices or competencies can be found, for example, in NRC, 2012 ; OECD, 2019 ). Therefore, for Jirout, the development of scientific thinking would involve bringing into play the basic science skills/practices common to the inquiry-based approach to learning science (García-Carmona, 2020 ; Harlen, 2014 ). For other authors, scientific thinking would include a whole spectrum of scientific reasoning competencies (Krell et al., 2022 ; Moore, 2019 ; Tytler & Peterson, 2004 ). However, these competencies usually cover the same science skills/practices mentioned above. Indeed, a conceptual overlap between scientific thinking, scientific reasoning, and scientific inquiry is often found in science education goals (Krell et al., 2022 ), although, according to Lehrer and Schauble ( 2006 ), scientific thinking is a broader construct that encompasses the other two.

It could be said that scientific thinking is a particular way of searching for information using science practices (Klarh et al., 2019 ; Zimmerman & Klarh, 2018 ; Vázquez-Alonso & Manassero-Mas, 2018 ). This intellectual process provides the individual with the ability to evaluate the robustness of evidence for or against a certain idea, in order to explain a phenomenon (Clouse, 2017 ). But the development of scientific thinking also requires metacognition processes. According to what Kuhn ( 2022 ) argues, metacognition is fundamental to the permanent control or revision of what an individual thinks and knows, as well as that of the other individuals with whom they interact, when engaging in scientific practices. In short, scientific thinking demands a good connection between reasoning and metacognition (Kuhn, 2022 ).

From that perspective, Zimmerman and Klarh ( 2018 ) have synthesized a taxonomy categorizing scientific thinking, relating cognitive processes with the corresponding science practices (Table 1 ). It has to be noted that this taxonomy was prepared in line with the categorization of scientific practices proposed in the document A Framework for K-12 Science Education (NRC, 2012 ). This is why one needs to understand that, for example, the cognitive process of elaboration and refinement of hypotheses is not explicitly associated with the scientific practice of hypothesizing but only with the formulation of questions. Indeed, the K-12 Framework document does not establish hypothesis formulation as a basic scientific practice. Lederman et al. ( 2014 ) justify it by arguing that not all scientific research necessarily allows or requires the verification of hypotheses, for example, in cases of exploratory or descriptive research. However, the aforementioned document (NRC, 2012 , p. 50) does refer to hypotheses when describing the practice of developing and using models , appealing to the fact that they facilitate the testing of hypothetical explanations .

In the literature, there are also other interesting taxonomies characterizing scientific thinking for educational purposes. One of them is that of Vázquez-Alonso and Manassero-Mas ( 2018 ) who, instead of science practices, refer to skills associated with scientific thinking. Their characterization basically consists of breaking down into greater detail the content of those science practices that would be related to the different cognitive and metacognitive processes of scientific thinking. Also, unlike Zimmerman and Klarh’s ( 2018 ) proposal, Vázquez-Alonso and Manassero-Mas’s ( 2018 ) proposal explicitly mentions metacognition as one of the aspects of scientific thinking, which they call meta-process. In my opinion, the proposal of the latter authors, which breaks scientific thinking down into a broader range of skills/practices, may be more conducive to addressing this intellectual process in science classes, as teachers would have more options to choose from depending on their teaching interests, the educational needs of their students, and/or the learning objectives pursued. Table 2 presents an adapted characterization of Vázquez-Alonso and Manassero-Mas’s ( 2018 ) proposal to address scientific thinking in science education.

3 Contextualization of Critical Thinking in Science Education

Theorization and research about critical thinking also have a long tradition in the field of the psychology of learning (Ennis, 2018 ; Kuhn, 1999 ), and its application extends far beyond science education (Dwyer et al., 2014 ). Indeed, the development of critical thinking is commonly accepted as being an essential goal of people’s overall education (Ennis, 2018 ; Hitchcock, 2017 ; Kuhn, 1999 ; Willingham, 2008 ). However, its conceptualization is not simple and there is no unanimous position taken on it in the literature (Costa et al., 2020 ; Dwyer et al., 2014 ), especially when trying to relate it to scientific thinking. Thus, while Tena-Sánchez and León-Medina ( 2022 ) and McBain et al. ( 2020 ) consider critical thinking to be the basis of, or to form part of, scientific thinking, Dowd et al. ( 2018 ) understand scientific thinking to be just a subset of critical thinking. However, Vázquez-Alonso and Manassero-Mas ( 2018 ) do not seek to determine whether critical thinking encompasses scientific thinking or vice versa. They consider that both types of knowledge share numerous skills/practices and the progressive development of one fosters the development of the other as a virtuous circle of improvement. Other authors, such as Schafersman ( 1991 ), even go so far as to say that critical thinking and scientific thinking are the same thing. In addition, some views on the relationship between critical thinking and scientific thinking seem to be context-dependent. For example, Hyytinen et al. ( 2019 ) point out that in the perspective of scientific thinking as a component of critical thinking, the former is often used to designate evidence-based thinking in the sciences, a view that tends to dominate in Europe but not in the USA. Perhaps because of this lack of consensus, the two types of thinking are often confused, overlapping, or conceived as interchangeable in education.

Even without a unanimous or consensus vision, there are some interesting theoretical frameworks and definitions for the development of critical thinking in education. One of the most popular definitions of critical thinking is that proposed by The National Council for Excellence in Critical Thinking (1987, cited in Inter-American Teacher Education Network, 2015, p. 6), which conceives of it as “the intellectually disciplined process of actively and skillfully conceptualizing, applying, analyzing, synthesizing, and/or evaluating information gathered from, or generated by, observation, experience, reflection, reasoning, or communication, as a guide to belief and action”. In other words, critical thinking can be regarded as a reflective and reasonable kind of thinking that gives people the ability to evaluate multiple defensible statements or positions and then decide which is the most defensible (Clouse, 2017; Ennis, 2018). It thus requires, in addition to a basic scientific competency, notions of epistemology (Kuhn, 1999) in order to understand how knowledge is constructed. Similarly, it requires skills for metacognition (Hyytinen et al., 2019; Kuhn, 1999; Magno, 2010), since critical thinking “entails awareness of one’s own thinking and reflection on the thinking of self and others as objects of cognition” (Dean & Kuhn, 2003, p. 3).

In science education, one of the most suitable scenarios or resources, though not the only one, Footnote 5 for addressing all these aspects of critical thinking is the analysis of socioscientific issues (SSI) (Taylor et al., 2006; Zeidler & Nichols, 2009). Without wishing to expand on this here, I will only say that interesting works can be found in the literature analyzing how the discussion of SSIs can favor the development of critical thinking skills (see, e.g., López-Fernández et al., 2022; Solbes et al., 2018). For example, López-Fernández et al. (2022) focused their teaching-learning sequence on the following critical thinking skills: information analysis, argumentation, decision making, and communication of decisions. Some authors even add the nature of science (NOS) to this framework (i.e., SSI-NOS-critical thinking), as Yacoubian and Khishfe (2018) do, in order to develop critical thinking, which can in turn favor the understanding of NOS (Yacoubian, 2020). In effect, as I argued in another work on the COVID-19 pandemic as an SSI, in which special emphasis was placed on critical thinking, an informed understanding of how science works would have helped the public understand why scientists were changing their criteria to face the pandemic in the light of new data and its reinterpretations, or why it was not possible to obtain an effective and safe medical treatment for the disease any faster (García-Carmona, 2021b).

In the recent literature, there have also been some proposals intended to characterize critical thinking in the context of science education. Table 3 presents two of these by way of example. As can be seen, the two proposals share various components for the development of critical thinking (respect for evidence, critically analyzing/assessing the validity/reliability of information, adoption of independent opinions/decisions, participation, etc.), but that of Blanco-López et al. (2017) is more clearly contextualized in science education. Their proposal likewise includes some further aspects (or at least makes them more explicit), such as developing epistemological Footnote 6 knowledge of science (vision of science, etc.) and of its interactions with technology, society, and the environment (STSA relationships), as well as communication skills. It therefore offers a wider range of options for choosing critical thinking skills/processes to promote in science classes. However, neither proposal refers to metacognitive skills, which are also essential for developing critical thinking (Kuhn, 1999).

3.1 Critical thinking vs. scientific thinking in science education: differences and similarities

In accordance with the above, it could be said that scientific thinking is nourished by critical thinking, especially when deciding between several possible interpretations and explanations of the same phenomenon, since this generally takes place in a context of debate in the scientific community (Acevedo-Díaz & García-Carmona, 2017). Thus, the scientific attitude perhaps most clearly linked to critical thinking is the skepticism with which scientists tend to receive new ideas (Normand, 2008; Sagan, 1987; Tena-Sánchez & León-Medina, 2022), especially if these run contrary to well-established scientific knowledge (Bell, 2009). A good example of this was the OPERA experiment (García-Carmona & Acevedo-Díaz, 2016a), which initially seemed to find that neutrinos could travel faster than light. Had it been correct, this finding would have invalidated Albert Einstein’s theory of relativity (it was later proved wrong). In response, the Nobel laureate in physics Sheldon L. Glashow went so far as to state that:

the result obtained by the OPERA collaboration cannot be correct. If it were, we would have to give up so many things, it would be such a huge sacrifice... But if it is, I am officially announcing it: I will shout to Mother Nature: I’m giving up! And I will give up Physics. (BBVA Foundation, 2011 )

Indeed, scientific thinking is ultimately focused on obtaining evidence that may support one idea or explanation of a phenomenon and, consequently, allow others that are less convincing or precise to be discarded. Therefore, when, with the evidence available, science has more than one equally defensible position with respect to a problem, the investigation is considered inconclusive (Clouse, 2017). In certain cases, this gives rise to scientific controversies (Acevedo-Díaz & García-Carmona, 2017), which are not always resolved exclusively on the basis of epistemic or rational factors (Elliott & McKaughan, 2014; Vallverdú, 2005). Hence, it is also necessary to integrate non-epistemic practices into the framework of scientific thinking (García-Carmona, 2021a; García-Carmona & Acevedo-Díaz, 2018), practices that transcend purely rational or cognitive processes, including, for example, those related to emotional or affective issues (Sinatra & Hofer, 2021). From an educational point of view, this suggests that for students to become more authentically immersed in the scientific way of working or thinking, they should also learn to feel as scientists do when they carry out their work (Davidson et al., 2020). Davidson et al. (2020) call this epistemic affect, and they suggest that it could be approached in science classes by teaching students to manage their frustrations when they fail to achieve the expected results, Footnote 7 or, for example, to moderate their enthusiasm over favorable results in a scientific inquiry by activating a certain skepticism that encourages them to do more testing. And, as mentioned above, for some authors, a skeptical attitude is one of the attitudes that best reflects the application of critical thinking within the framework of scientific thinking (Normand, 2008; Sagan, 1987; Tena-Sánchez & León-Medina, 2022).

On the other hand, critical thinking also draws on many of the skills or practices of scientific thinking, as discussed above. However, in contrast to scientific thinking, the coexistence of two or more defensible ideas is not, in principle, a problem for critical thinking, since its purpose is not so much to invalidate some ideas or explanations with respect to others as to provide individuals with the foundations on which to align themselves with the idea/argument they find most defensible among the several that are possible (Ennis, 2018). For example, science with its methods has managed to explain the greenhouse effect, the phenomenon of the tides, and the transmission mechanism of the coronavirus. To do so, it had to discard other possible explanations that proved less valid in the investigations carried out. These are therefore issues resolved by the scientific community that generate hardly any discussion at present. However, taking a position for or against the production of energy in nuclear power plants transcends the scope of scientific thinking, since both positions are, in principle, equally defensible. Indeed, within the scientific community itself there are supporters and detractors of each position, based on the same scientific knowledge. Consequently, it is critical thinking, which requires the management of scientific knowledge and skills, a basic understanding of epistemic (rational or cognitive) and non-epistemic (social, ethical/moral, economic, psychological, cultural, etc.) aspects of the nature of science, as well as metacognitive skills, that helps individuals forge a personal foundation on which to position themselves one way or the other, or to maintain an uncertain, undecided opinion.

In view of the above, one can summarize that scientific thinking and critical thinking are two intellectual processes that differ in purpose but are related symbiotically (i.e., neither would make sense without the other, and each feeds the other) and that, in their performance, share a fair number of features, actions, or mental skills. According to Cáceres et al. (2020) and Hyytinen et al. (2019), the intellectual skills most clearly common to both types of thinking are searching for relationships between evidence and explanations, as well as investigating and using logical thinking to make inferences. To this common space, I would also add skills for metacognition, in accordance with what has been discussed about both types of thinking (Kuhn, 1999, 2022).

In order to compile in a compact way all that has been argued so far, Table 4 presents my overview of the relationship between scientific thinking and critical thinking. I would point out that I do not intend the compilation to be exhaustive (more elements could doubtless be added to the different sections), but rather, above all, to represent the aspects that distinguish the two and those they share, as well as the mutual enrichment (or symbiosis) between them.

4 A Proposal for the Integrated Development of Critical Thinking and Scientific Thinking in Science Classes

Once the differences, common aspects, and relationships between critical thinking and scientific thinking have been discussed, it is relevant to establish some specific proposal for fostering them in science classes. Table 5 includes a possible script for addressing various skills or processes of both types of thinking in an integrated manner. However, before giving guidance on how such skills/processes could be approached, I would like to clarify that, although all of them could be dealt with in the context of a single school activity, I will not do so in this way. First, because I think this could give the impression that the proposal is valid only if applied all at once in a specific learning situation, which might also discourage science teachers from implementing it in class due to lack of time or training. Second, I think it more interesting to conceive of the proposal as a set of thinking skills or actions that can be dealt with throughout the different science contents, selecting (if so decided) only some of them, according to the educational needs or characteristics of the learning situation posed in each case. Therefore, in the orientations for each point of the script, or for groupings of points, I will use different examples and/or contexts. Likewise, these orientations in the form of comments, although founded in the literature, should be considered only as possibilities, among many others.

Motivation and predisposition to reflect and discuss (point i) demands, on the one hand, the choice of issues that are attractive to students. This can be achieved, for example, by asking the students directly what current issues related to science and its impacts or repercussions they would like to learn about, and then deciding which issue to focus on (García-Carmona, 2008). Alternatively, the teacher may put forward the issue directly in class, trying to ensure that it is current and present in the media, social networks, etc., or judging what may be of interest to the students on the basis of their own teaching experience. In this way, each student is encouraged to feel questioned or concerned as a citizen by the issue that is going to be addressed (García-Carmona, 2008). Also of possible interest is the analysis of contemporary, as yet unresolved socioscientific affairs (Solbes et al., 2018), such as climate change, science and social justice, transgenic foods, homeopathy, and alcohol and drug use in society. Everyday questions that demand a decision can also be investigated, such as “What car to buy?” (Moreno-Fontiveros et al., 2022) or “How can we prevent the arrival of another pandemic?” (Uskola & Puig, 2023).

On the other hand, it is essential that the discussion of the chosen issue is planned through an instructional process that generates an environment conducive to reflection and debate, with a view to engaging the students’ participation. This can be achieved, for example, by setting up a role-play game (Blanco-López et al., 2017), especially if the issue is socioscientific, or by critical and reflective reading of advertisements with scientific content (Campanario et al., 2001) or of science-related news in the daily media (García-Carmona, 2014, 2021a; Guerrero-Márquez & García-Carmona, 2020; Oliveras et al., 2013) for subsequent discussion, all in a collaborative learning setting and with a clear democratic spirit.

Respect for scientific evidence (point ii) should be an indispensable condition in any analysis and discussion from the prisms of scientific and critical thinking (Erduran, 2021). Although scientific knowledge may be impregnated with subjectivity during its construction and is revisable in the light of new evidence (the tentativeness of scientific knowledge), once it is accepted by the scientific community it is as objective as possible (García-Carmona & Acevedo-Díaz, 2016b). Therefore, promoting trust in and respect for scientific evidence should be one of the primary educational challenges in combating pseudoscientists and science deniers (Díaz & Cabrera, 2022), whose arguments are based on false beliefs and assumptions, anecdotes, and conspiracy theories (Normand, 2008). Nevertheless, promoting respect for scientific evidence is no simple task (Fackler, 2021) since science deniers, for example, consider science unreliable because it is imperfect (McIntyre, 2021). Hence the need to promote a basic understanding of NOS (point iii) as a fundamental pillar for the development of both scientific thinking and critical thinking. A good way to do this would be through explicit and reflective discussion of controversies from the history of science (Acevedo-Díaz & García-Carmona, 2017) or contemporary controversies (García-Carmona, 2021b; García-Carmona & Acevedo-Díaz, 2016a).

Also, with respect to point iii of the proposal, it is necessary to draw on basic scientific knowledge when developing scientific and critical thinking skills (Willingham, 2008). Without this, it will be impossible to develop a minimally serious and convincing argument on the issue being analyzed. For example, if one does not know the transmission mechanism of a certain disease, it will likely be very difficult to understand or justify certain patterns of social behavior in the face of it. In general, possessing appropriate scientific knowledge on the issue in question helps one make the best interpretation of the available data and evidence (OECD, 2019).

The search for information from reliable sources, and its analysis and interpretation (points iv to vi), are essential practices both in purely scientific contexts (e.g., learning about the behavior of a given physical phenomenon from the literature or through inquiry) and in the application of critical thinking (e.g., when one wishes to take a personal, but informed, position on a particular socioscientific issue). With regard to determining the credibility of information with scientific content on the Internet, Osborne et al. (2022) propose, among other strategies, checking whether the source is free of conflicts of interest, i.e., whether or not it is biased by ideological, political, or economic motives. It should also be checked whether the source and the author(s) of the information are sufficiently reputable.

Regarding the interpretation of data and evidence, several studies have shown the difficulties that students often have with this practice in the context of inquiry activities (e.g., Gobert et al., 2018; Kanari & Millar, 2004; Pols et al., 2021) or when analyzing science news in the press (Norris et al., 2003). It has also been found that they have significant difficulties in choosing the most appropriate data to support their arguments in causal analyses (Kuhn & Modrek, 2022). However, it must be recognized that making interpretations or inferences from data is not a simple task; among other reasons, because their construction is influenced by multiple factors, both epistemic (prior knowledge, experimental designs, etc.) and non-epistemic (personal expectations, ideology, sociopolitical context, etc.), which means that such interpretations are not always the same for all scientists (García-Carmona, 2021a; García-Carmona & Acevedo-Díaz, 2018). For this reason, this scientific practice constitutes one of the phases or processes that generates the most debate or discussion in a scientific community, for as long as no consensus is reached. To improve students’ practice of making inferences, Kuhn and Lerman (2021) propose activities that help them develop their own epistemological norms to connect their statements causally with the available evidence.

Point vii refers, on the one hand, to an essential scientific practice: the elaboration of evidence-based scientific explanations, which generally account, in a reasoned way, for the causality, properties, and/or behavior of phenomena (Brigandt, 2016). In addition, point vii concerns the practice of argumentation. Unlike scientific explanation, argumentation tries to justify an idea, explanation, or position with the clear purpose of persuading those who defend different ones (Osborne & Patterson, 2011). As noted above, the complexity of most socioscientific issues implies that they have no single valid solution or response. Therefore, the content of the arguments used to defend one position or another is not always based solely on purely rational factors such as data and scientific evidence. Some authors defend the need to also deal with non-epistemic aspects of the nature of science when teaching it (García-Carmona, 2021a; García-Carmona & Acevedo-Díaz, 2018), since many scientific and socioscientific controversies are resolved by factors that go beyond the purely epistemic (Vallverdú, 2005).

To defend an idea or position taken on an issue, it is not enough to have scientific evidence that supports it. It is also essential to have skills for the communication and discussion of ideas (point viii). The history of science shows how the difficulties some scientists had in communicating their ideas scientifically led to those ideas not being accepted at the time. A good example for students to become aware of this is the historical case of Semmelweis and puerperal fever (Aragón-Méndez et al., 2019). Reflective reading of this case makes it possible to conclude that this doctor’s proposal that gynecologists disinfect their hands when passing from one parturient to another, to avoid the contagion that provoked the fever, was rejected by the medical community not only for epistemic reasons but also because of the difficulties he had in communicating his idea. The history of science also reveals that, at certain historical moments, some scientific interpretations were imposed on others thanks to the rhetorical skills of their proponents, even though none of the explanations convincingly accounted for the phenomenon studied. An example is the controversy between Pasteur and Liebig over the phenomenon of fermentation (García-Carmona & Acevedo-Díaz, 2017), whose reading and discussion in science class would also be recommended in the context of this critical and scientific thinking skill. With the COVID-19 pandemic, for example, the arguments of some charlatans in the media and on social networks managed to gain a certain influence over the population, even though they were scientifically muddled nonsense (García-Carmona, 2021b). Therefore, the reflective reading of news on current SSIs such as this also constitutes a good resource for the same educational purpose. In general, according to Spektor-Levy et al. (2009), scientific communication skills should be addressed explicitly in class, in a progressive and continuous manner, including tasks of information seeking, reading, scientific writing, representation of information, and representation of the knowledge acquired.

Finally (point ix), a good scientific/critical thinker must be aware of what they know and of what they have doubts about or do not know, continuously practicing metacognitive exercises to this end (Dean & Kuhn, 2003; Hyytinen et al., 2019; Magno, 2010; Willingham, 2008). At the same time, they must recognize the weaknesses and strengths of their peers’ arguments in a debate, be self-critical when necessary, and revise their own ideas and arguments to improve and reorient them (self-regulation). I see one of the keys to both scientific and critical thinking as the capacity or willingness to change one’s mind without this being frowned upon; indeed, quite the opposite, since one assumes it occurs because the arguments have become enriched and more solidly founded. In other words, scientific and critical thinking do not sit well with arrogance or haughtiness towards the rectification of ideas or opinions.

5 Final Remarks

For decades, scientific thinking and critical thinking have received particular attention from different disciplines, such as psychology, philosophy, and pedagogy, as well as from specific areas of the latter such as science education. The two types of thinking represent intellectual processes whose development in students, and in society in general, is considered indispensable for the exercise of responsible citizenship in accord with the demands of today’s society (European Commission, 2006, 2015; NRC, 2012; OECD, 2020). As has been shown, however, the task of conceptualizing them is complex, and teaching students to think scientifically and critically is a difficult educational challenge (Willingham, 2008).

Aware of this, and after many years dedicated to science education, I felt the need to organize my ideas regarding these two types of thinking. In consulting the literature, I found that, in many publications, scientific thinking and critical thinking are presented or perceived as interchangeable or indistinguishable, a conclusion also shared by Hyytinen et al. (2019). Rarely have their differences, relationships, or common features been explicitly studied. I therefore considered it a matter needing to be addressed because, in science education, the development of scientific thinking is an inherent objective, but, when critical thinking is added to the learning objectives, more than reasonable doubts arise about when one or the other is to be used, or both at the same time. The present work was motivated by this, with the intention of making a contribution of my own, albeit grounded in the relevant literature, to advancing the question raised. It converges on conceiving scientific thinking and critical thinking as two intellectual processes that overlap and feed into each other in many aspects but differ with respect to certain cognitive skills and in terms of their purpose. Thus, in the case of scientific thinking, the aim is to choose the best possible explanation of a phenomenon based on the available evidence, which therefore involves rejecting alternative explanatory proposals that are shown to be less coherent or convincing; whereas, from the perspective of critical thinking, the purpose is to choose the most defensible idea/option among others that are also defensible, using both scientific and extra-scientific (i.e., moral, ethical, political, etc.) arguments. With this in mind, I have described a proposal to guide their development in the classroom, integrating them under a conception that I have called, metaphorically, a symbiotic relationship between two modes of thinking.

Footnotes

1. Critical thinking is mentioned literally in other subjects of the curricular provisions, such as Education in Civics and Ethical Values or Geography and History (Royal Decree 217/2022).

2. García-Carmona (2021a) conceives of them as activities that require the comprehensive application of procedural skills, cognitive and metacognitive processes, and both scientific knowledge and knowledge of the nature of scientific practice.

3. Kuhn (2021) argues that the relationship between scientific reasoning and metacognition is especially fostered by what she calls inhibitory control, which basically consists of breaking down the whole of a thought into parts in such a way that attention is inhibited on some of those parts to allow a focused examination of the intended mental content.

4. Specifically, Tena-Sánchez and León-Medina (2022) assume that critical thinking is at the basis of the rational or scientific skepticism that leads to questioning any claim that does not have empirical support.

5. As discussed in the introduction, the inquiry-based approach is also considered conducive to addressing critical thinking in science education (Couso et al., 2020; NRC, 2012).

6. Epistemic skills should not be confused with epistemological knowledge (García-Carmona, 2021a). The former refers to skills used to construct, evaluate, and use knowledge; the latter, to understanding of the origin, nature, scope, and limits of scientific knowledge.

7. For this purpose, it can be very useful to address in class, with the help of the history and philosophy of science, the fact that scientists get more wrong than right in their research, and that error is always an opportunity to learn (García-Carmona & Acevedo-Díaz, 2018).

References

Acevedo-Díaz, J. A., & García-Carmona, A. (2017). Controversias en la historia de la ciencia y cultura científica [Controversies in the history of science and scientific culture]. Los Libros de la Catarata.

Aragón-Méndez, M. D. M., Acevedo-Díaz, J. A., & García-Carmona, A. (2019). Prospective biology teachers’ understanding of the nature of science through an analysis of the historical case of Semmelweis and childbed fever. Cultural Studies of Science Education , 14 (3), 525–555. https://doi.org/10.1007/s11422-018-9868-y

Bailin, S. (2002). Critical thinking and science education. Science & Education, 11 (4), 361–375. https://doi.org/10.1023/A:1016042608621

BBVA Foundation (2011). El Nobel de Física Sheldon L. Glashow no cree que los neutrinos viajen más rápido que la luz [Physics Nobel laureate Sheldon L. Glashow does not believe neutrinos travel faster than light]. https://www.fbbva.es/noticias/nobel-fisica-sheldon-l-glashow-no-cree-los-neutrinos-viajen-mas-rapido-la-luz/ . Accessed 5 February 2023.

Bell, R. L. (2009). Teaching the nature of science: Three critical questions. In Best Practices in Science Education . National Geographic School Publishing.

Blanco-López, A., España-Ramos, E., & Franco-Mariscal, A. J. (2017). Estrategias didácticas para el desarrollo del pensamiento crítico en el aula de ciencias [Teaching strategies for the development of critical thinking in the teaching of science]. Ápice. Revista de Educación Científica, 1 (1), 107–115. https://doi.org/10.17979/arec.2017.1.1.2004

Brigandt, I. (2016). Why the difference between explanation and argument matters to science education. Science & Education, 25 (3-4), 251–275. https://doi.org/10.1007/s11191-016-9826-6

Cáceres, M., Nussbaum, M., & Ortiz, J. (2020). Integrating critical thinking into the classroom: A teacher’s perspective. Thinking Skills and Creativity, 37 , 100674. https://doi.org/10.1016/j.tsc.2020.100674

Campanario, J. M., Moya, A., & Otero, J. (2001). Invocaciones y usos inadecuados de la ciencia en la publicidad [Invocations and misuses of science in advertising]. Enseñanza de las Ciencias, 19 (1), 45–56. https://doi.org/10.5565/rev/ensciencias.4013

Clouse, S. (2017). Scientific thinking is not critical thinking. https://medium.com/extra-extra/scientific-thinking-is-not-critical-thinking-b1ea9ebd8b31

Confederación de Sociedades Científicas de España [COSCE]. (2011). Informe ENCIENDE: Enseñanza de las ciencias en la didáctica escolar para edades tempranas en España [ENCIENDE report: Science teaching in schools for early ages in Spain]. COSCE.

Costa, S. L. R., Obara, C. E., & Broietti, F. C. D. (2020). Critical thinking in science education publications: the research contexts. International Journal of Development Research, 10 (8), 39438. https://doi.org/10.37118/ijdr.19437.08.2020

Couso, D., Jiménez-Liso, M. R., Refojo, C., & Sacristán, J. A. (coords.) (2020). Enseñando ciencia con ciencia [Teaching science with science]. FECYT & Fundación Lilly / Penguin Random House.

Davidson, S. G., Jaber, L. Z., & Southerland, S. A. (2020). Emotions in the doing of science: Exploring epistemic affect in elementary teachers' science research experiences. Science Education, 104 (6), 1008–1040. https://doi.org/10.1002/sce.21596

Dean, D., & Kuhn, D. (2003). Metacognition and critical thinking. ERIC document. Reproduction No. ED477930 . https://files.eric.ed.gov/fulltext/ED477930.pdf

Díaz, C., & Cabrera, C. (2022). Desinformación científica en España [Scientific disinformation in Spain]. FECYT/IBERIFIER. https://www.fecyt.es/es/publicacion/desinformacion-cientifica-en-espana

Dowd, J. E., Thompson, R. J., Jr., Schiff, L. A., & Reynolds, J. A. (2018). Understanding the complex relationship between critical thinking and science reasoning among undergraduate thesis writers. CBE—Life Sciences . Education, 17 (1), ar4. https://doi.org/10.1187/cbe.17-03-0052

Dwyer, C. P., Hogan, M. J., & Stewart, I. (2014). An integrated critical thinking framework for the 21st century. Thinking Skills and Creativity, 12 , 43–52. https://doi.org/10.1016/j.tsc.2013.12.004

Elliott, K. C., & McKaughan, D. J. (2014). Non-epistemic values and the multiple goals of science. Philosophy of Science, 81 (1), 1–21. https://doi.org/10.1086/674345

Ennis, R. H. (2018). Critical thinking across the curriculum: A vision. Topoi, 37 (1), 165–184. https://doi.org/10.1007/s11245-016-9401-4

Erduran, S. (2021). Respect for evidence: Can science education deliver it? Science & Education, 30 (3), 441–444. https://doi.org/10.1007/s11191-021-00245-8

European Commission. (2015). Science education for responsible citizenship . Publications Office https://op.europa.eu/en/publication-detail/-/publication/a1d14fa0-8dbe-11e5-b8b7-01aa75ed71a1

European Commission / Eurydice. (2011). Science education in Europe: National policies, practices and research . Publications Office. https://op.europa.eu/en/publication-detail/-/publication/bae53054-c26c-4c9f-8366-5f95e2187634

European Commission / Eurydice. (2022). Increasing achievement and motivation in mathematics and science learning in schools . Publications Office. https://eurydice.eacea.ec.europa.eu/publications/mathematics-and-science-learning-schools-2022

European Commission/Eurydice. (2006). Science teaching in schools in Europe. Policies and research . Publications Office. https://op.europa.eu/en/publication-detail/-/publication/1dc3df34-acdf-479e-bbbf-c404fa3bee8b

Fackler, A. (2021). When science denial meets epistemic understanding. Science & Education, 30 (3), 445–461. https://doi.org/10.1007/s11191-021-00198-y

García-Carmona, A. (2008). Relaciones CTS en la educación científica básica. II. Investigando los problemas del mundo [STS relationships in basic science education II. Researching the world problems]. Enseñanza de las Ciencias, 26 (3), 389–402. https://doi.org/10.5565/rev/ensciencias.3750

García-Carmona, A. (2014). Naturaleza de la ciencia en noticias científicas de la prensa: Análisis del contenido y potencialidades didácticas [Nature of science in press articles about science: Content analysis and pedagogical potential]. Enseñanza de las Ciencias, 32 (3), 493–509. https://doi.org/10.5565/rev/ensciencias.1307

García-Carmona, A., & Acevedo-Díaz, J. A. (2016a). Learning about the nature of science using newspaper articles with scientific content. Science & Education, 25 (5–6), 523–546. https://doi.org/10.1007/s11191-016-9831-9

García-Carmona, A., & Acevedo-Díaz, J. A. (2016b). Concepciones de estudiantes de profesorado de Educación Primaria sobre la naturaleza de la ciencia: Una evaluación diagnóstica a partir de reflexiones en equipo [Preservice elementary teachers' conceptions of the nature of science: a diagnostic evaluation based on team reflections]. Revista Mexicana de Investigación Educativa, 21 (69), 583–610. https://www.redalyc.org/articulo.oa?id=14045395010

García-Carmona, A., & Acevedo-Díaz, J. A. (2017). Understanding the nature of science through a critical and reflective analysis of the controversy between Pasteur and Liebig on fermentation. Science & Education, 26 (1–2), 65–91. https://doi.org/10.1007/s11191-017-9876-4

García-Carmona, A., & Acevedo-Díaz, J. A. (2018). The nature of scientific practice and science education. Science & Education, 27 (5–6), 435–455. https://doi.org/10.1007/s11191-018-9984-9

García-Carmona, A. (2020). From inquiry-based science education to the approach based on scientific practices. Science & Education, 29 (2), 443–463. https://doi.org/10.1007/s11191-020-00108-8

García-Carmona, A. (2021a). Prácticas no-epistémicas: ampliando la mirada en el enfoque didáctico basado en prácticas científicas [Non-epistemic practices: extending the view in the didactic approach based on scientific practices]. Revista Eureka sobre Enseñanza y Divulgación de las Ciencias, 18 (1), 1108. https://doi.org/10.25267/Rev_Eureka_ensen_divulg_cienc.2021.v18.i1.1108

García-Carmona, A. (2021b). Learning about the nature of science through the critical and reflective reading of news on the COVID-19 pandemic. Cultural Studies of Science Education, 16 (4), 1015–1028. https://doi.org/10.1007/s11422-021-10092-2

Guerrero-Márquez, I., & García-Carmona, A. (2020). La energía y su impacto socioambiental en la prensa digital: temáticas y potencialidades didácticas para una educación CTS [Energy and its socio-environmental impact in the digital press: issues and didactic potentialities for STS education]. Revista Eureka sobre Enseñanza y Divulgación de las Ciencias, 17(3), 3301. https://doi.org/10.25267/Rev_Eureka_ensen_divulg_cienc.2020.v17.i3.3301

Gobert, J. D., Moussavi, R., Li, H., Sao Pedro, M., & Dickler, R. (2018). Real-time scaffolding of students’ online data interpretation during inquiry with Inq-ITS using educational data mining. In M. E. Auer, A. K. M. Azad, A. Edwards, & T. de Jong (Eds.), Cyber-physical laboratories in engineering and science education (pp. 191–217). Springer.

Harlen, W. (2014). Helping children’s development of inquiry skills. Inquiry in Primary Science Education, 1 (1), 5–19. https://ipsejournal.files.wordpress.com/2015/03/3-ipse-volume-1-no-1-wynne-harlen-p-5-19.pdf

Hitchcock, D. (2017). Critical thinking as an educational ideal. In On reasoning and argument (pp. 477–497). Springer.

Hyytinen, H., Toom, A., & Shavelson, R. J. (2019). Enhancing scientific thinking through the development of critical thinking in higher education. In M. Murtonen & K. Balloo (Eds.), Redefining scientific thinking for higher education . Palgrave Macmillan.

Jiménez-Aleixandre, M. P., & Puig, B. (2022). Educating critical citizens to face post-truth: the time is now. In B. Puig & M. P. Jiménez-Aleixandre (Eds.), Critical thinking in biology and environmental education, Contributions from biology education research (pp. 3–19). Springer.

Jirout, J. J. (2020). Supporting early scientific thinking through curiosity. Frontiers in Psychology, 11 , 1717. https://doi.org/10.3389/fpsyg.2020.01717

Kanari, Z., & Millar, R. (2004). Reasoning from data: How students collect and interpret data in science investigations. Journal of Research in Science Teaching, 41 (7), 748–769. https://doi.org/10.1002/tea.20020

Klahr, D., Zimmerman, C., & Matlen, B. J. (2019). Improving students’ scientific thinking. In J. Dunlosky & K. A. Rawson (Eds.), The Cambridge handbook of cognition and education (pp. 67–99). Cambridge University Press.

Krell, M., Vorholzer, A., & Nehring, A. (2022). Scientific reasoning in science education: from global measures to fine-grained descriptions of students’ competencies. Education Sciences, 12 , 97. https://doi.org/10.3390/educsci12020097

Kuhn, D. (1993). Science as argument: Implications for teaching and learning scientific thinking. Science Education, 77 (3), 319–337. https://doi.org/10.1002/sce.3730770306

Kuhn, D. (1999). A developmental model of critical thinking. Educational Researcher, 28 (2), 16–46. https://doi.org/10.3102/0013189X028002016

Kuhn, D. (2022). Metacognition matters in many ways. Educational Psychologist, 57 (2), 73–86. https://doi.org/10.1080/00461520.2021.1988603

Kuhn, D., Iordanou, K., Pease, M., & Wirkala, C. (2008). Beyond control of variables: What needs to develop to achieve skilled scientific thinking? Cognitive Development, 23 (4), 435–451. https://doi.org/10.1016/j.cogdev.2008.09.006

Kuhn, D., & Lerman, D. (2021). Yes but: Developing a critical stance toward evidence. International Journal of Science Education, 43 (7), 1036–1053. https://doi.org/10.1080/09500693.2021.1897897

Kuhn, D., & Modrek, A. S. (2022). Choose your evidence: Scientific thinking where it may most count. Science & Education, 31 (1), 21–31. https://doi.org/10.1007/s11191-021-00209-y

Lederman, J. S., Lederman, N. G., Bartos, S. A., Bartels, S. L., Meyer, A. A., & Schwartz, R. S. (2014). Meaningful assessment of learners' understandings about scientific inquiry—The views about scientific inquiry (VASI) questionnaire. Journal of Research in Science Teaching, 51 (1), 65–83. https://doi.org/10.1002/tea.21125

Lehrer, R., & Schauble, L. (2006). Scientific thinking and science literacy. In K. A. Renninger, I. E. Sigel, W. Damon, & R. M. Lerner (Eds.), Handbook of child psychology: Child psychology in practice (pp. 153–196). John Wiley & Sons, Inc.

López-Fernández, M. D. M., González-García, F., & Franco-Mariscal, A. J. (2022). How can socio-scientific issues help develop critical thinking in chemistry education? A reflection on the problem of plastics. Journal of Chemical Education, 99 (10), 3435–3442. https://doi.org/10.1021/acs.jchemed.2c00223

Magno, C. (2010). The role of metacognitive skills in developing critical thinking. Metacognition and Learning, 5 , 137–156. https://doi.org/10.1007/s11409-010-9054-4

McBain, B., Yardy, A., Martin, F., Phelan, L., van Altena, I., McKeowen, J., Pembertond, C., Tosec, H., Fratuse, L., & Bowyer, M. (2020). Teaching science students how to think. International Journal of Innovation in Science and Mathematics Education, 28 (2), 28–35. https://openjournals.library.sydney.edu.au/CAL/article/view/14809/13480

McIntyre, L. (2021). Talking to science deniers and sceptics is not hopeless. Nature, 596 (7871), 165–165. https://doi.org/10.1038/d41586-021-02152-y

Moore, C. (2019). Teaching science thinking. Using scientific reasoning in the classroom . Routledge.

Moreno-Fontiveros, G., Cebrián-Robles, D., Blanco-López, A., & España-Ramos, E. (2022). Decisiones de estudiantes de 14/15 años en una propuesta didáctica sobre la compra de un coche [Fourteen/fifteen-year-old students’ decisions in a teaching proposal on the buying of a car]. Enseñanza de las Ciencias, 40 (1), 199–219. https://doi.org/10.5565/rev/ensciencias.3292

National Research Council [NRC]. (2012). A framework for K-12 science education . National Academies Press.

Inter-American Teacher Education Network. (2015). Critical thinking toolkit . OAS/ITEN.

Normand, M. P. (2008). Science, skepticism, and applied behavior analysis. Behavior Analysis in Practice, 1 (2), 42–49. https://doi.org/10.1007/BF03391727

Norris, S. P., Phillips, L. M., & Korpan, C. A. (2003). University students’ interpretation of media reports of science and its relationship to background knowledge, interest, and reading difficulty. Public Understanding of Science, 12 (2), 123–145. https://doi.org/10.1177/09636625030122001

Oliveras, B., Márquez, C., & Sanmartí, N. (2013). The use of newspaper articles as a tool to develop critical thinking in science classes. International Journal of Science Education, 35 (6), 885–905. https://doi.org/10.1080/09500693.2011.586736

Organisation for Economic Co-operation and Development [OECD]. (2019). PISA 2018. Assessment and Analytical Framework . OECD Publishing. https://doi.org/10.1787/b25efab8-en

Organisation for Economic Co-operation and Development [OECD]. (2020). PISA 2024: Strategic Vision and Direction for Science. https://www.oecd.org/pisa/publications/PISA-2024-Science-Strategic-Vision-Proposal.pdf

Osborne, J., Pimentel, D., Alberts, B., Allchin, D., Barzilai, S., Bergstrom, C., Coffey, J., Donovan, B., Kivinen, K., Kozyreva, A., & Wineburg, S. (2022). Science Education in an Age of Misinformation . Stanford University.

Osborne, J. F., & Patterson, A. (2011). Scientific argument and explanation: A necessary distinction? Science Education, 95 (4), 627–638. https://doi.org/10.1002/sce.20438

Pols, C. F. J., Dekkers, P. J. J. M., & De Vries, M. J. (2021). What do they know? Investigating students’ ability to analyse experimental data in secondary physics education. International Journal of Science Education, 43 (2), 274–297. https://doi.org/10.1080/09500693.2020.1865588

Royal Decree 217/2022, of 29 March, which establishes the organisation and minimum teaching of Compulsory Secondary Education. Spanish Official State Gazette, 76 , 41571–41789. https://www.boe.es/eli/es/rd/2022/03/29/217

Sagan, C. (1987). The burden of skepticism. Skeptical Inquirer, 12 (1), 38–46. https://skepticalinquirer.org/1987/10/the-burden-of-skepticism/

Santos, L. F. (2017). The role of critical thinking in science education. Journal of Education and Practice, 8 (20), 160–173. https://eric.ed.gov/?id=ED575667

Schafersman, S. D. (1991). An introduction to critical thinking. https://facultycenter.ischool.syr.edu/wp-content/uploads/2012/02/Critical-Thinking.pdf . Accessed 10 May 2023.

Sinatra, G. M., & Hofer, B. K. (2021). How do emotions and attitudes influence science understanding? In Science denial: why it happens and what to do about it (pp. 142–180). Oxford Academic.

Solbes, J., Torres, N., & Traver, M. (2018). Use of socio-scientific issues in order to improve critical thinking competences. Asia-Pacific Forum on Science Learning & Teaching, 19 (1), 1–22. https://www.eduhk.hk/apfslt/

Spektor-Levy, O., Eylon, B. S., & Scherz, Z. (2009). Teaching scientific communication skills in science studies: Does it make a difference? International Journal of Science and Mathematics Education, 7 (5), 875–903. https://doi.org/10.1007/s10763-009-9150-6

Taylor, P., Lee, S. H., & Tal, T. (2006). Toward socio-scientific participation: changing culture in the science classroom and much more: Setting the stage. Cultural Studies of Science Education, 1 (4), 645–656. https://doi.org/10.1007/s11422-006-9028-7

Tena-Sánchez, J., & León-Medina, F. J. (2022). Y aún más al fondo del “bullshit”: El papel de la falsificación de preferencias en la difusión del oscurantismo en la teoría social y en la sociedad [And even deeper into “bullshit”: The role of preference falsification in the diffusion of obscurantism in social theory and in society]. Scio, 22 , 209–233. https://doi.org/10.46583/scio_2022.22.949

Tytler, R., & Peterson, S. (2004). From “try it and see” to strategic exploration: Characterizing young children's scientific reasoning. Journal of Research in Science Teaching, 41 (1), 94–118. https://doi.org/10.1002/tea.10126

Uskola, A., & Puig, B. (2023). Development of systems and futures thinking skills by primary pre-service teachers for addressing epidemics. Research in Science Education , 1–17. https://doi.org/10.1007/s11165-023-10097-7

Vallverdú, J. (2005). ¿Cómo finalizan las controversias? Un nuevo modelo de análisis: la controvertida historia de la sacarina [How do controversies end? A new model of analysis: the controversial history of saccharin]. Revista Iberoamericana de Ciencia, Tecnología y Sociedad, 2 (5), 19–50. http://www.revistacts.net/wp-content/uploads/2020/01/vol2-nro5-art01.pdf

Vázquez-Alonso, A., & Manassero-Mas, M. A. (2018). Más allá de la comprensión científica: educación científica para desarrollar el pensamiento [Beyond understanding of science: science education for teaching fair thinking]. Revista Electrónica de Enseñanza de las Ciencias, 17 (2), 309–336. http://reec.uvigo.es/volumenes/volumen17/REEC_17_2_02_ex1065.pdf

Willingham, D. T. (2008). Critical thinking: Why is it so hard to teach? Arts Education Policy Review, 109 (4), 21–32. https://doi.org/10.3200/AEPR.109.4.21-32

Yacoubian, H. A. (2020). Teaching nature of science through a critical thinking approach. In W. F. McComas (Ed.), Nature of Science in Science Instruction (pp. 199–212). Springer.

Yacoubian, H. A., & Khishfe, R. (2018). Argumentation, critical thinking, nature of science and socioscientific issues: a dialogue between two researchers. International Journal of Science Education, 40 (7), 796–807. https://doi.org/10.1080/09500693.2018.1449986

Zeidler, D. L., & Nichols, B. H. (2009). Socioscientific issues: Theory and practice. Journal of Elementary Science Education, 21 (2), 49–58. https://doi.org/10.1007/BF03173684

Zimmerman, C., & Klahr, D. (2018). Development of scientific thinking. In J. T. Wixted (Ed.), Stevens’ handbook of experimental psychology and cognitive neuroscience (Vol. 4 , pp. 1–25). John Wiley & Sons, Inc..

Conflict of Interest

The author declares no conflict of interest.

Funding for open access publishing: Universidad de Sevilla/CBUA

Author information

Authors and Affiliations

Departamento de Didáctica de las Ciencias Experimentales y Sociales, Universidad de Sevilla, Seville, Spain

Antonio García-Carmona

Corresponding author

Correspondence to Antonio García-Carmona .

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .

About this article

García-Carmona, A. Scientific Thinking and Critical Thinking in Science Education . Sci & Educ (2023). https://doi.org/10.1007/s11191-023-00460-5

Accepted : 30 July 2023

Published : 05 September 2023

DOI : https://doi.org/10.1007/s11191-023-00460-5

Keywords: Cognitive skills · Critical thinking · Metacognitive skills · Science education · Scientific thinking

Critical Thinking, Intelligence, and Unsubstantiated Beliefs: An Integrative Review

Associated Data

This research did not involve collection of original data, and hence there are no new data to make available.

Abstract

A review of the research shows that critical thinking is a more inclusive construct than intelligence, going beyond what general cognitive ability can account for. For instance, critical thinking can more completely account for many everyday outcomes, such as how thinkers reject false conspiracy theories, paranormal and pseudoscientific claims, psychological misconceptions, and other unsubstantiated claims. Deficiencies in the components of critical thinking (in specific reasoning skills, dispositions, and relevant knowledge) contribute to unsubstantiated belief endorsement in ways that go beyond what standardized intelligence tests test. Specifically, people who endorse unsubstantiated claims less tend to show better critical thinking skills, possess more relevant knowledge, and are more disposed to think critically. They tend to be more scientifically skeptical and possess a more rational–analytic cognitive style, while those who accept unsubstantiated claims more tend to be more cynical and adopt a more intuitive–experiential cognitive style. These findings suggest that for a fuller understanding of unsubstantiated beliefs, researchers and instructors should also assess specific reasoning skills, relevant knowledge, and dispositions which go beyond what intelligence tests test.

1. Introduction

Why do some people believe implausible claims, such as the QAnon conspiracy theory that a cabal of liberals is kidnapping and trafficking many thousands of children each year, despite the lack of any credible supporting evidence? Are believers less intelligent than non-believers? Do they lack knowledge of such matters? Are they more gullible or less skeptical than non-believers? Or, more generally, are they failing to think critically?

Understanding the factors contributing to the acceptance of unsubstantiated claims is important, not only for the development of theories of intelligence and critical thinking but also because many unsubstantiated beliefs are false, and some are even dangerous. Endorsing them can have a negative impact on individuals and on society at large. For example, false beliefs about the COVID-19 pandemic, such as the belief that 5G cell towers induced the spread of the COVID-19 virus, led some British citizens to set fire to 5G towers (Jolley and Paterson 2020). Other believers in COVID-19 conspiracy theories endangered their own and their children’s lives when they refused to socially distance and be vaccinated with highly effective vaccines, despite the admonitions of scientific experts (Bierwiaczonek et al. 2020). Further endangering the population at large, those who believe the false conspiracy theory that human-caused global warming is a hoax likely fail to respond adaptively to this serious global threat (van der Linden 2015). Parents who uncritically accept pseudoscientific claims, such as the false belief that facilitated communication is an effective treatment for childhood autism, may forego more effective treatments (Lilienfeld 2007). Moreover, people in various parts of the world still persecute others who they believe are witches possessing supernatural powers. Likewise, many people still believe in demonic possession, which has been associated with mental disorders (Nie and Olson 2016). Compounding the problems created by these various unsubstantiated beliefs, numerous studies now show that when someone accepts one of these types of unfounded claims, they tend to accept others as well; see Bensley et al. (2022) for a review.

Studying the factors that contribute to unfounded beliefs is important not only because of their real-world consequences but also because it can facilitate a better understanding of such beliefs and how they are related to critical thinking and intelligence. This article focuses on important ways in which critical thinking (CT) and intelligence differ, especially in terms of how a comprehensive model of CT differs from the view of intelligence as general cognitive ability. I argue that this model of CT more fully accounts for how people can accurately decide whether a claim is unsubstantiated than can views of intelligence that emphasize general cognitive ability. In addition to general cognitive ability, thinking critically about unsubstantiated claims involves the deployment of specific reasoning skills, dispositions related to CT, and specific knowledge, which go beyond the contribution of general cognitive ability.

Accordingly, this article begins with an examination of the constructs of critical thinking and intelligence. It then discusses theories proposing that understanding thinking in the real world requires going beyond general cognitive ability, focusing specifically on factors related to critical thinking, such as specific reasoning skills, dispositions, metacognition, and relevant knowledge. I review research showing that this alternative, multidimensional view of CT can better account for individual differences in the tendency to endorse multiple types of unsubstantiated claims than can general cognitive ability alone.

2. Defining Critical Thinking and Intelligence

Critical thinking is an almost universally valued educational objective in the US and in many other countries that seek to improve it. In contrast, intelligence, although much valued, has often been viewed as a more stable characteristic, less amenable to improvement through specific short-term interventions such as traditional instruction or, more recently, practice on computer-implemented training programs. According to Wechsler’s influential definition, intelligence is a person’s “aggregate or global capacity to act purposefully, to think rationally, and to deal effectively with his environment” (Wechsler 1944, p. 3).

Consistent with this definition, intelligence has long been associated with general cognitive or intellectual ability and the potential to learn and reason well. Intelligence (IQ) tests measure general cognitive abilities, such as knowledge of words, memory skills, analogical reasoning, speed of processing, and the ability to solve verbal and spatial problems. General intelligence or “g” is a composite of these abilities statistically derived from various cognitive subtests on IQ tests which are positively intercorrelated. There is considerable overlap between g and the concept of fluid intelligence (Gf) in the prominent Cattell–Horn–Carroll model (McGrew 2009), which refers to “the ability to solve novel problems, the solution of which does not depend on previously acquired skills and knowledge,” and crystalized intelligence (Gc), which refers to experience, existing skills, and general knowledge (Conway and Kovacs 2018, pp. 50–51). Although g or general intelligence is based on a higher order factor, inclusive of fluid and crystallized intelligence, it is technically not the same as general cognitive ability, a commonly used, related term. However, in this article, I use “general cognitive ability” and “cognitive ability” because they are the imprecise terms frequently used in the research reviewed.

Although IQ scores have been found to predict performance in basic real-world domains, such as academic performance and job success ( Gottfredson 2004 ), an enduring question for intelligence researchers has been whether g and intelligence tests predict the ability to adapt well in other real-world situations, which concerns the second part of Wechsler’s definition. So, in addition to searching for the underlying structure of intelligence, researchers have been perennially concerned with how the general abilities associated with intelligence can be applied to help a person adapt to real-world situations. The issue is largely a question of how cognitive ability and intelligence help people solve real-world problems, cope adaptively, and deal successfully with various environmental demands ( Sternberg 2019 ).

Based on broad conceptual definitions of intelligence and critical thinking, both should aid adaptive functioning in the real world, presumably because both involve rational approaches. Their common association with rationality gives each term a positive connotation. Complicating the definition of each, however, is the fact that rationality itself continues to have a variety of meanings. In this article, in agreement with Stanovich et al. ( 2018 ), rationality is defined in the normative sense used in cognitive science, as the distance between a person’s response and some normative standard of optimal behavior. As such, degree of rationality falls on a continuous scale, not a categorical one.
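To make the continuous character of this definition concrete, one illustrative formalization (mine, not a formula proposed by Stanovich et al.) is

$$ \text{irrationality}(p) = d\left(r_p, r^{*}\right), $$

where $r_p$ is person $p$’s response, $r^{*}$ is the normative standard of optimal behavior, and $d$ is some distance measure. Smaller distances correspond to greater rationality, so rationality varies by degree rather than by category.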

Despite disagreements surrounding the conceptual definitions of intelligence, critical thinking, and rationality, a commonality in these terms is that they are value-laden and normative. In the case of intelligence, people are judged based on norms from standardized intelligence tests, especially in academic settings. Although scores on CT tests are seldom, and could hardly be, used to judge individuals in this way, the normative and value-laden basis of CT is apparent in people’s informal judgments. They often judge others who have made poor decisions to be irrational or to have failed to think critically.

This value-laden aspect of CT is also apparent in formal definitions of CT. Halpern and Dunn ( 2021 ) defined critical thinking as “the use of those cognitive skills or strategies that increase the probability of a desirable outcome. It is used to describe thinking that is purposeful, reasoned, and goal-directed.” The positive conception of CT as helping a person adapt well to one’s environment is clearly implied in “desirable outcome”.

Robert Ennis ( 1987 ) has offered a simpler yet useful definition of critical thinking that also has normative implications. According to Ennis, “critical thinking is reasonable, reflective thinking focused on deciding what to believe or do” ( Ennis 1987, p. 102 ). This definition implies that CT helps people know what to believe (a goal of epistemic rationality) and how to act (a goal of instrumental rationality). This is conveyed by associating critical thinking with the positive terms “reasonable” and “reflective”. Dictionaries commonly define “reasonable” as “rational”, “logical”, “intelligent”, and “good”, all terms with positive connotations.

For critical thinkers, being reasonable involves using logical rules, standards of evidence, and other criteria that must be met for a product of thinking to be considered good. Critical thinkers use these to evaluate how strongly reasons or evidence supports one claim versus another, drawing conclusions which are supported by the highest quality evidence ( Bensley 2018 ). If no high-quality evidence is available for consideration, it would be unreasonable to draw a strong conclusion. Unfortunately, people’s beliefs are too often based on acceptance of unsubstantiated claims. This is a failure of CT, but is it also a failure of intelligence?

3. Does Critical Thinking “Go Beyond” What Is Meant by Intelligence?

Despite the conceptual overlap between intelligence and CT at a general level, one way that CT can be distinguished from the common view of intelligence as general cognitive ability is in terms of what each can account for. Although intelligence tests, especially measures of general cognitive ability, have reliably predicted academic and job performance, they may not be sufficient to predict other everyday outcomes for which CT measures have made successful predictions and have added to the variance accounted for in performance. For instance, replicating a study by Butler ( 2012 ), Butler et al. ( 2017 ) obtained a negative correlation ( r = −0.33) between scores on the Halpern Critical Thinking Assessment (HCTA) and a measure of 134 negative real-world outcomes not expected to befall critical thinkers, such as engaging in unprotected sex or posting a message on social media that the person later regretted. They found that higher HCTA scores not only predicted better life decisions but also accounted for variance in these outcomes beyond a measure of general cognitive ability. These results suggest that CT accounts for real-world outcomes and explains additional variance beyond general cognitive ability.
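The phrase “accounted for additional variance” refers to incremental validity, typically tested with hierarchical regression. A minimal sketch in Python (simulated data with hypothetical variable names, not Butler et al.’s actual analysis) illustrates the logic:

```python
# Minimal sketch of incremental validity via hierarchical regression:
# does a CT score add variance (delta R^2) beyond cognitive ability?
# All data are simulated; variable names are hypothetical.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 300
cog_ability = rng.normal(size=n)                   # e.g., an IQ proxy
ct_score = 0.4 * cog_ability + rng.normal(size=n)  # CT overlaps with ability
outcome = -0.3 * cog_ability - 0.3 * ct_score + rng.normal(size=n)

# Step 1: cognitive ability alone.
X1 = sm.add_constant(cog_ability)
r2_step1 = sm.OLS(outcome, X1).fit().rsquared

# Step 2: add the CT measure.
X2 = sm.add_constant(np.column_stack([cog_ability, ct_score]))
r2_step2 = sm.OLS(outcome, X2).fit().rsquared

print(f"R^2 ability only: {r2_step1:.3f}")
print(f"R^2 ability + CT: {r2_step2:.3f}")
print(f"Delta R^2 (incremental validity of CT): {r2_step2 - r2_step1:.3f}")
```

The quantity of interest is the change in R² when the CT measure enters the model after cognitive ability.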

Some theorists maintain that standardized intelligence tests do not capture the variety of abilities that people need to adapt well in the real world. For example, Gardner ( 1999 ) has proposed that additional forms of intelligence are needed, such as spatial, musical, and interpersonal intelligences, in addition to the linguistic and logical–mathematical intelligences more typically associated with general cognitive ability and academic success. In other theorizing, Sternberg ( 1988 ) has proposed three additional types of intelligence: analytical, practical, and creative intelligence, to more fully capture the variety of intelligent abilities on which people differ. Critical thinking is considered part of analytical skills, which involve evaluating the quality and applicability of ideas, products, and options ( Sternberg 2022 ). Regarding adaptive intelligence, Sternberg ( 2019 ) has emphasized how adaptive aspects of intelligence are needed to solve real-world problems at both the individual and species levels. According to Sternberg, core components of intelligence have evolved in humans, but intelligence takes different forms in different cultures, with each culture valuing its own skills for adaptation. Thus, the construct of intelligence must go beyond core cognitive ability to encompass the specific abilities needed for adaptive behavior in specific cultures and settings.

Two other theories propose adding further components to intelligent and rational thinking. Ackerman ( 2022 ) has emphasized the importance of acquiring domain-specific knowledge for intelligent functioning in the wide variety of tasks found in everyday life. He has argued that declarative, procedural, and tacit knowledge, as well as non-ability variables, are needed to better predict job performance and the performance of other everyday activities. Taking another approach, Halpern and Dunn ( 2021 ) have proposed that critical thinking is essentially the adaptive application of intelligence for solving real-world problems. Elsewhere, Butler and Halpern ( 2019 ) have argued that dispositions such as open-mindedness are another aspect of CT and that domain-specific knowledge and specific CT skills are needed to solve real-world problems.

Examples are readily available for how CT goes beyond what IQ tests test to include specific rules for reasoning and the relevant knowledge needed to execute real-world tasks. Take scientific reasoning, which can be viewed as a specialized form of CT. Drawing a well-reasoned inductive conclusion about a theory or analyzing the quality of a research study both require that a thinker possess relevant specialized knowledge related to the question and specific skills for reasoning about scientific methodology. In contrast, IQ tests are deliberately designed to be nonspecialized in assessing Gc, broadly sampling vocabulary and general knowledge in order to be fair and unbiased ( Stanovich 2009 ). Specialized knowledge and reasoning skills are also needed in non-academic domains: jurors must possess specialized knowledge to understand expert forensic testimony and specific reasoning skills to interpret the law and make well-reasoned judgments about a defendant’s guilt or innocence.

Besides lacking specific reasoning skills and domain-relevant knowledge, people may fail to think critically because they are not disposed to use their reasoning skills to examine such claims, or because they want to preserve their favored beliefs. Critical thinking dispositions are attitudes or traits that make it more likely that a person will think critically. Theorists have proposed numerous CT dispositions (e.g., Bensley 2018 ; Butler and Halpern 2019 ; Dwyer 2017 ; Ennis 1987 ). Some commonly identified CT dispositions especially relevant to this discussion are open-mindedness, skepticism, intellectual engagement, and the tendency to take a reflective, rational–analytic approach. Critical thinking dispositions are clearly value-laden and prescriptive: a good thinker should be open-minded, skeptical, reflective, and intellectually engaged, and should value a rational–analytic approach to inquiry. Conversely, corresponding negative dispositions, such as close-mindedness and gullibility, could obstruct CT.

Without the appropriate disposition, individuals will not use their reasoning skills to think critically about questions. For example, the brilliant mystery writer Sir Arthur Conan Doyle, who was trained as a physician and created the hyper-rational detective Sherlock Holmes, was not disposed to think critically about some unsubstantiated claims. Conan Doyle was no doubt highly intelligent in cognitive ability terms, but he was not sufficiently skeptical (disposed to think critically) about spiritualism. He believed that he was talking to his dearly departed son through a medium, despite the warnings of his magician friend Harry Houdini, who told him that mediums used trickery in their seances. Perhaps influenced by his Irish father’s belief in the “wee folk”, Conan Doyle also believed, on the basis of children’s photos, that fairies inhabited the English countryside, despite the advice of experts who said the photos could be faked. Yet he was skeptical of a new tuberculosis remedy proposed by Koch when he reported on it, even though his wife suffered from the disease. So, in his professional capacities, Conan Doyle used his CT skills, but in certain other domains, where he was motivated to accept unsubstantiated claims, he failed to think critically, insufficiently disposed to skeptically challenge implausible claims.

This example makes two important points. First, Conan Doyle’s superior intelligence was not enough for him to reject implausible claims about the world; in general, motivated reasoning can lead people, even those considered highly intelligent, to accept claims with no good evidentiary support. Second, we could not adequately explain cases like this one by considering only the person’s intelligence, or even their reasoning skills, without also considering the person’s dispositions. General cognitive ability alone is not sufficient; CT dispositions must also be considered.

Supporting this conclusion, Stanovich and West ( 1997 ) examined the influence of dispositions, beyond the contribution of cognitive ability, on a CT task. They gave college students an argument evaluation test in which participants first rated their agreement with several claims about real social and political issues made by a fictitious person. Participants were then given evidence against each claim and finally asked to rate the quality of a counterargument made by the same fictitious person. Participants’ ratings of the counterarguments were compared to the median ratings of expert judges on the quality of the rebuttals. Stanovich and West also administered a new measure of rational disposition called the Actively Open-minded Thinking (AOT) scale, along with the SAT as a proxy for cognitive ability. The AOT was a composite of items from several other scales that would be expected to measure CT disposition. Both SAT and AOT scores were significant predictors of higher argument analysis scores, and even after partialing out cognitive ability, actively open-minded thinking remained significant. These results suggest that general cognitive ability alone was not sufficient to account for thinking critically about real-world issues and that CT disposition was needed to go beyond it.
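“Partialing out” cognitive ability amounts to computing a partial correlation: correlating the disposition and the criterion after removing the variance each shares with the ability measure. A minimal Python sketch (simulated data with hypothetical names, not the original analysis):

```python
# Minimal sketch of "partialing out": the partial correlation of a disposition
# (AOT) with argument-evaluation scores, controlling an ability proxy (SAT).
# All data are simulated; variable names are hypothetical.
import numpy as np

def partial_corr(x, y, z):
    """Correlation of x and y after removing the linear effect of z."""
    def residualize(v, z):
        Z = np.column_stack([np.ones_like(z), z])
        beta, *_ = np.linalg.lstsq(Z, v, rcond=None)
        return v - Z @ beta
    rx, ry = residualize(x, z), residualize(y, z)
    return np.corrcoef(rx, ry)[0, 1]

rng = np.random.default_rng(2)
sat = rng.normal(size=400)
aot = 0.3 * sat + rng.normal(size=400)
arg_eval = 0.3 * sat + 0.3 * aot + rng.normal(size=400)

print(f"Zero-order r(AOT, argument eval): {np.corrcoef(aot, arg_eval)[0, 1]:.2f}")
print(f"Partial r controlling SAT:        {partial_corr(aot, arg_eval, sat):.2f}")
```

If the partial correlation remains substantial, the disposition predicts the criterion beyond what the ability measure explains.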

Further examining the roles of CT dispositions and cognitive ability in reasoning, Stanovich and West ( 2008 ) studied myside bias, a bias closely related to one-sided thinking and confirmation bias. A critical thinker would be expected not to show myside bias and instead to evaluate evidence fairly on all sides of a question. Stanovich and West ( 2007 ) found that college students often showed myside bias when asked their opinions about real-world policy issues, such as the health risks of smoking and drinking alcohol. For example, compared to non-smokers, smokers judged the health risks of smoking to be lower. When participants were divided into higher and lower cognitive ability groups based on SAT scores, the two groups showed little difference in myside bias. Moreover, on the hazards-of-drinking issue, participants who drank less had higher scores on the CT disposition measure.

Other research supports the need for both reasoning ability and CT disposition in predicting real-world outcomes. Ren et al. ( 2020 ) found that CT disposition, as measured by a Chinese critical thinking disposition inventory, and a CT skill measure together explained significant additional variance in academic performance beyond the contribution of cognitive ability alone, as measured by a test of fluid intelligence. Further supporting the claim that CT requires both cognitive ability and CT disposition, Ku and Ho ( 2010 ) found that a CT disposition measure significantly predicted scores on a CT test beyond the significant contribution of verbal intelligence in high school and college students from Hong Kong.

The contribution of dispositions to thinking points to another way that CT goes beyond the application of general cognitive ability: the motivation for reasoning. If all reasoning is motivated ( Kunda 1990 ), then CT is motivated, too, a point implicit in the Halpern and Dunn ( 2021 ) and Ennis ( 1987 ) definitions. Critical thinking is motivated in the sense of being purposeful and directed towards the goal of arriving at an accurate conclusion. For instance, the CT disposition of truth-seeking guides a person towards the CT goal of arriving at an accurate conclusion.

Also, according to Kunda ( 1990 ), a second type of motivated reasoning can lead to faulty conclusions, often by directing a person towards the goal of maintaining favored beliefs and preconceptions, as in illusory correlation, belief perseverance, and confirmation bias. Corresponding to this second type, negative dispositions, such as close-mindedness and self-serving motives, can incline thinkers towards faulty conclusions. This is especially relevant to the present discussion because poorer reasoning, thinking errors, and the inappropriate use of heuristics are all CT failures related to the endorsement of unsubstantiated claims. The term “thinking errors” is a generic term referring to logical fallacies, informal reasoning fallacies, argumentation errors, and inappropriate uses of cognitive heuristics ( Bensley 2018 ). Heuristics are cognitive shortcuts, commonly used to simplify judgment tasks and reduce mental effort; when used inappropriately, however, they often result in biased judgments.

Stanovich ( 2009 ) has argued that IQ tests do not test people’s use of heuristics, whereas biased responding associated with heuristic use has been found to correlate negatively with CT performance ( West et al. 2008 ). In the same study, West et al. found that college students’ cognitive ability, as measured by SAT performance, was not correlated with thinking biases associated with the use of heuristics. Similarly, although Stanovich and West ( 2008 ) found that susceptibility to biases such as the conjunction fallacy, framing effects, base-rate neglect, affect bias, and myside bias was uncorrelated with cognitive ability (using the SAT as a proxy), other types of thinking errors were correlated with SAT scores.

Likewise, two types of knowledge are related to the two forms of motivated reasoning. Inaccurate knowledge, such as misconceptions, can derail reasoning from moving towards a correct conclusion, as when a person reasons from false premises; reasoning from accurate knowledge is more likely to produce an accurate conclusion. Taking inaccurate knowledge and thinking errors into account is important for understanding the endorsement of unsubstantiated claims because these are also related to negative dispositions, such as close-mindedness and cynicism, none of which are measured by intelligence tests.

Critical thinking questions are often situated in real-world examples, or in simulations of them, designed to detect thinking errors and bias. As described in Halpern and Butler ( 2018 ), an item like one on the Halpern Critical Thinking Assessment (HCTA) provides respondents with a mock newspaper story about research showing that first-graders who attended preschool were better able to learn how to read; the question then asks whether preschool should be made mandatory. A correct response requires recognizing that correlation does not imply causation, thereby avoiding a common reasoning error people make in thinking about research implications in everyday life. Another CT skills test, “Analyzing Psychological Statements” (APS), assesses the ability to recognize thinking errors and to apply argumentation skills and psychological knowledge to evaluate psychology-related examples and simulations of real-life situations ( Bensley 2021 ). Besides identifying thinking errors in brief samples of thinking, its questions ask respondents to distinguish arguments from non-arguments, find assumptions in arguments, evaluate kinds of evidence, and draw a conclusion from a brief psychological argument. An important implication of the studies just reviewed is that efforts to understand CT can be further informed by assessing thinking errors and biases, which, as the next discussion shows, are related to individual differences in thinking dispositions and cognitive style.

4. Dual-Process Theory Measures and Unsubstantiated Beliefs

Dual-process theory (DPT) and measures associated with it have been widely used in the study of the endorsement of unsubstantiated beliefs, especially as they relate to cognitive style. According to a cognitive style version of DPT, people have two modes of processing: a fast, intuitive–experiential (I-E) style and a slower, reflective, rational–analytic (R-A) style. The intuitive style is associated with reliance on hunches, feelings, personal experience, and cognitive heuristics which simplify processing, while the R-A style is associated with more elaborate and effortful processing ( Bensley et al. 2022 ; Epstein 2008 ). As such, the rational–analytic cognitive style is consistent with CT dispositions, such as those promoting the effortful analysis of evidence, objective truth, and logical consistency. In fact, CT is sometimes referred to as “critical-analytic” thinking ( Byrnes and Dunbar 2014 ) and has been associated with analytical intelligence ( Sternberg 1988 ) and with rational thinking, as discussed before.

People use both modes of processing, but they show individual differences in which mode they tend to rely upon, with the intuitive–experiential mode being the default ( Bensley et al. 2022 ; Morgan 2016 ; Pacini and Epstein 1999 ), and they accept unsubstantiated claims differentially based on their predominant cognitive style ( Bensley et al. 2022 ; Epstein 2008 ). Specifically, individuals who rely more on an I-E cognitive style tend to endorse unsubstantiated claims more strongly, while individuals who rely more on an R-A cognitive style tend to endorse them less. Note, however, that other theorists view the two processes and cognitive styles somewhat differently (e.g., Kahneman 2011 ; Stanovich et al. 2018 ).

Researchers have often assessed the contribution of these two cognitive styles to endorsement of unsubstantiated claims, using variants of three measures: the Cognitive Reflection Test (CRT) of Frederick ( 2005 ), the Rational–Experiential Inventory of Epstein and his colleagues ( Pacini and Epstein 1999 ), and the related Need for Cognition scale of Cacioppo and Petty ( 1982 ). The CRT is a performance-based test which asks participants to solve problems that appear to require simple mathematical calculations, but which actually require more reflection. People typically do poorly on the CRT, which is thought to indicate reliance on an intuitive cognitive style, while better performance is thought to indicate reliance on the slower, more deliberate, and reflective cognitive style. The positive correlation of the CRT with numeracy scores suggests it also has a cognitive skill component ( Patel et al. 2019 ). The Rational–Experiential Inventory (REI) of Pacini and Epstein ( 1999 ) contains one scale designed to measure an intuitive–experiential cognitive style and a second scale intended to measure a rational–analytic (R-A) style. The R-A scale was adapted from the Need for Cognition (NFC) scale of Cacioppo and Petty ( 1982 ), another scale associated with rational–analytic thinking and expected to be negatively correlated with unsubstantiated beliefs. The NFC was found to be related to open-mindedness and intellectual engagement, two CT dispositions ( Cacioppo et al. 1996 ).
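The best-known CRT item illustrates why poor performance is read as intuitive processing: a bat and a ball together cost $1.10, and the bat costs $1.00 more than the ball ( Frederick 2005 ). The intuitive answer, 10 cents, fails the stated constraint, which a moment of reflection reveals. Letting $b$ be the ball’s cost in dollars:

$$ b + (b + 1.00) = 1.10 \;\Rightarrow\; 2b = 0.10 \;\Rightarrow\; b = 0.05. $$

A 10-cent ball would make the bat $1.10 and the total $1.20, so the reflective answer is 5 cents.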

The cognitive styles associated with DPT also relate to CT dispositions. Thinking critically requires that individuals be disposed to use their reasoning skills to reject unsubstantiated claims ( Bensley 2018 ) and that they be inclined to take a rational–analytic approach rather than relying on their intuitions and feelings. For instance, Bensley et al. ( 2014 ) found that students who endorsed more psychological misconceptions adopted a more intuitive cognitive style, were less disposed to take a rational–scientific approach to psychology, and scored lower on a psychological critical thinking skills test. Further supporting this connection, West et al. ( 2008 ) found that participants who tended to use cognitive heuristics more, thought to be related to intuitive processing and bias, scored lower on a critical thinking measure. As the Bensley et al. ( 2014 ) results suggest, in addition to assessing reasoning skills and dispositions, comprehensive CT assessment research should assess knowledge and unsubstantiated beliefs because these are related to failures of critical thinking.

5. Assessing Critical Thinking and Unsubstantiated Beliefs

Assessing endorsement of unsubstantiated claims provides another way to assess CT outcomes related to everyday thinking, one which goes beyond what intelligence tests test ( Bensley and Lilienfeld 2020 ). From the perspective of the multidimensional model of CT, endorsement of unsubstantiated claims could result from deficiencies in a person’s CT reasoning skills, a lack of relevant knowledge, or the engagement of inappropriate dispositions. Suppose an individual endorses an unsubstantiated claim, such as the conspiracy theory that human-caused global warming is a hoax. The person may lack the specific reasoning skills needed to evaluate the conspiracy critically; Lantian et al. ( 2020 ) found that scores on a CT skills test were negatively correlated with conspiracy theory beliefs. The person may lack relevant scientific knowledge, such as knowing that each year humans pump about 40 billion metric tons of carbon dioxide into the atmosphere and that carbon dioxide is a greenhouse gas which traps heat in the atmosphere. Or the person may not be scientifically skeptical, or may be too cynical and mistrustful of scientists or governmental officials.

Although endorsing unsubstantiated beliefs is clearly a failure of CT, problems arise in deciding which claims are unsubstantiated, especially when considering conspiracy theories. Typically, the claims which critical thinkers should reject as unsubstantiated are those not supported by objective evidence. But of the many conspiracy theories proposed, few are rigorously examined. Moreover, some conspiracy theories which authorities might initially deny turn out to be real, such as the MK-Ultra theory that the CIA was secretly conducting mind-control research on American citizens.

A way out of this quagmire is to place unsubstantiated beliefs on a continuum which depends on the quality of evidence. This has led to the definition of unsubstantiated claims as assertions which have not been supported by high-quality evidence ( Bensley 2023 ). Substantiated claims, by contrast, have the kind of evidentiary support that critical thinkers are expected to value in drawing reasonable conclusions. Instead of insisting that a claim must be demonstrably false to be rejected, we adopt a more tentative acceptance or rejection of claims, based on how much good evidence supports them. Many claims are unsubstantiated because they have not yet been carefully examined and so totally lack support, or because they are supported only by low-quality evidence such as personal experience, anecdotes, or non-scientific authority. Other claims are more clearly unsubstantiated because they contradict the findings of high-quality research; a critical thinker should be highly skeptical of these.

Psychological misconceptions are one type of claim that can be more clearly unsubstantiated. Psychological misconceptions are commonsense psychological claims (folk theories) about the mind, brain, and behavior that are contradicted by the bulk of high-quality scientific research. Bensley et al. ( 2014 ) developed the Test of Psychological Knowledge and Misconceptions (TOPKAM), a 40-item, forced-choice measure in which each item pairs a statement of a psychological misconception with a response option stating the evidence-based alternative. They found that higher scores on the APS, the argument analysis test applying psychological concepts to analyze real-world examples, were associated with more correct answers on the TOPKAM. Other studies have found positive correlations between CT skills tests and other measures of psychological misconceptions ( McCutcheon et al. 1992 ; Kowalski and Taylor 2004 ). Bensley et al. ( 2014 ) also found that higher correct TOPKAM scores were positively correlated with scores on the Inventory of Thinking Dispositions in Psychology (ITDP) of Bensley ( 2021 ), a measure of the disposition to take a rational and scientific approach to psychology, but were negatively correlated with an intuitive cognitive style.

Bensley et al. ( 2021 ) conducted a multidimensional study, assessing beginning psychology students starting a CT course on their endorsement of psychological misconceptions, recognition of thinking errors, CT dispositions, and metacognition, before and after CT instruction. Two classes received explicit instruction involving considerable practice in argument analysis and scientific reasoning skills, with one class focusing more on recognizing psychological misconceptions and the other focusing more on recognizing various thinking errors. Both classes were assessed before and after instruction on the TOPKAM and on the Test of Thinking Errors (TOTE), a test of the ability to recognize, in real-world examples, 17 different types of thinking errors, such as confirmation bias, inappropriate use of the availability and representativeness heuristics, reasoning from ignorance/possibility, the gambler’s fallacy, and hasty generalization ( Bensley et al. 2021 ). Correct TOPKAM and TOTE scores were positively correlated, and after CT instruction both were positively correlated with the APS, the CT test of argument analysis skills.

Bensley et al. found that after explicit CT skill instruction, students improved significantly on both the TOPKAM and the TOTE, with those focusing on recognizing misconceptions improving the most. Also, the students who improved the most on the TOTE scored higher on the REI rational–analytic scale and on the ITDP, while those improving the most on the TOPKAM scored higher on the ITDP. The students receiving explicit CT instruction in recognizing misconceptions also significantly improved the accuracy of their metacognitive monitoring in estimating their TOPKAM scores after instruction.
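Metacognitive monitoring accuracy here concerns the gap between students’ estimated and actual test scores. A minimal sketch of one common way to quantify it (simulated numbers; not necessarily the scoring Bensley et al. used):

```python
# Minimal sketch: quantifying metacognitive monitoring accuracy as the gap
# between estimated and actual test scores. Signed bias and absolute accuracy
# are one common operationalization; the numbers below are hypothetical.
import numpy as np

estimated = np.array([32, 28, 35, 30, 25], dtype=float)  # students' estimates
actual    = np.array([28, 27, 30, 31, 20], dtype=float)  # actual scores

bias = (estimated - actual).mean()                # > 0 indicates overconfidence
abs_accuracy = np.abs(estimated - actual).mean()  # lower = better calibrated

print(f"Mean signed bias: {bias:.1f} points")
print(f"Mean absolute discrepancy: {abs_accuracy:.1f} points")
```

On this operationalization, improved monitoring accuracy would appear as a smaller mean absolute discrepancy after instruction.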

Given that the two classes did not differ before instruction in GPA or SAT scores (a proxy for general cognitive ability), CT instruction provides a good account of the improvement in recognizing thinking errors and misconceptions without recourse to intelligence. However, SAT scores were positively correlated with both TOTE and APS scores, suggesting that cognitive ability contributed to CT skill performance. These results replicated the earlier findings of Bensley and Spero ( 2014 ), who showed that explicit CT instruction improved performance on both CT skills tests and metacognitive monitoring accuracy while controlling for SAT scores, which were positively correlated with CT skills test performance.

Taken together, these findings suggest that cognitive ability contributes to performance on CT tasks but that CT instruction goes beyond it to further improve performance. As the results of Bensley et al. ( 2021 ) show, and as discussed next, thinking errors and bias from heuristics are CT failures that should also be assessed because they are related to endorsement of unsubstantiated beliefs and cognitive style.

6. Dual-Processing Theory and Research on Unsubstantiated Beliefs

Consistent with DPT, numerous other studies have obtained significant positive correlations between intuitive cognitive style and paranormal belief, often using the REI intuitive–experiential scale and the Revised Paranormal Belief Scale (RPBS) of Tobacyk ( 2004 ) (e.g., Genovese 2005 ; Irwin and Young 2002 ; Lindeman and Aarnio 2006 ; Pennycook et al. 2015 ; Rogers et al. 2018 ; Saher and Lindeman 2005 ). Studies have also found positive correlations between superstitious belief and intuitive cognitive style (e.g., Lindeman and Aarnio 2006 ; Maqsood et al. 2018 ). REI intuitive–experiential thinking style was also positively correlated with belief in complementary and alternative medicine ( Lindeman 2011 ), with conspiracy theory belief ( Alper et al. 2020 ), and with endorsement of psychological misconceptions ( Bensley et al. 2014 ; Bensley et al. 2022 ).

Additional evidence for DPT comes from studies in which REI R-A and NFC scores were negatively correlated with measures of unsubstantiated beliefs, although studies correlating these scales with paranormal belief and conspiracy theory belief have shown mixed results. Supporting a relationship, REI rational–analytic and NFC scores significantly and negatively predicted paranormal belief ( Lobato et al. 2014 ; Pennycook et al. 2012 ). Other studies have also obtained a negative correlation between NFC and paranormal belief ( Lindeman and Aarnio 2006 ; Rogers et al. 2018 ; Stahl and van Prooijen 2018 ), but both Genovese ( 2005 ) and Pennycook et al. ( 2015 ) found that NFC was not significantly correlated with paranormal belief. Swami et al. ( 2014 ) found that although REI R-A scores were negatively correlated with conspiracy theory belief, NFC scores were not.

Researchers often refer to people who are doubtful of paranormal and other unfounded claims as “skeptics” and so have tested whether measures related to skepticism are associated with less endorsement of unsubstantiated claims. They typically view skepticism as a stance towards unsubstantiated claims taken by rational people who reject them (e.g., Lindeman and Aarnio 2006 ; Stahl and van Prooijen 2018 ), rather than as a disposition inclining a person to think critically about unsubstantiated beliefs ( Bensley 2018 ).

Fasce and Pico ( 2019 ) conducted one of the few studies using a measure related to skeptical disposition, the Critical Thinking Disposition Scale (CTDS) of Sosu ( 2013 ), in relation to endorsement of unsubstantiated claims. They found that scores on the CTDS were negatively correlated with scores on the RPBS but not significantly correlated with either a measure of pseudoscience belief or one of conspiracy theory belief. The CRT, however, was negatively correlated with both the RPBS and the pseudoscience measure. Because Fasce and Pico ( 2019 ) did not examine correlations with the Reflective Skepticism subscale of the CTDS, its contribution apart from the full-scale CTDS could not be determined.

To test skepticism as a disposition more directly, we recently assessed college students on how well three new measures predicted endorsement of psychological misconceptions, paranormal claims, and conspiracy theories ( Bensley et al. 2022 ). The dispositional measures included a measure of general skeptical attitude; the Scientific Skepticism Scale (SSS), which focused more on waiting to accept claims until high-quality scientific evidence supported them; and the Cynicism Scale (CS), which focused on doubting the sincerity of the motives of scientists and people in general. We found that although the general skepticism scale did not predict any of the unsubstantiated belief measures, SSS scores were a significant negative predictor of both paranormal belief and conspiracy theory belief. REI R-A scores were a less consistent negative predictor, REI I-E scores were a more consistent positive predictor, and, surprisingly, CS scores were the most consistent positive predictor of unsubstantiated beliefs.

Researchers commonly assume that people who accept implausible, unsubstantiated claims are gullible or insufficiently skeptical. For instance, van Prooijen ( 2019 ) has argued that conspiracy theory believers are more gullible (less skeptical) than non-believers and tend to accept unsubstantiated claims more readily. van Prooijen reviewed several studies supporting the claim that more gullible people tend to endorse conspiracy theories more; however, he did not report any studies in which a gullible disposition was directly measured.

Recently, we directly tested the gullibility hypothesis in relation to scientific skepticism ( Bensley et al. 2023 ), using the Gullibility Scale of Teunisse et al. ( 2019 ), on which people skeptical of the paranormal had been shown to obtain lower scores. We found that Gullibility Scale and Cynicism Scale scores were positively correlated, and both were significant positive predictors of unsubstantiated beliefs in general, consistent with an intuitive–experiential cognitive style. In contrast, scores on the Cognitive Reflection Test, the Scientific Skepticism Scale, and the REI rational–analytic scale were all positively intercorrelated and were significant negative predictors of unsubstantiated beliefs in general, consistent with a rational–analytic, reflective cognitive style. Scientific skepticism scores negatively predicted general endorsement of unsubstantiated claims beyond the REI R-A scale, but neither the CTDS nor its Reflective Skepticism subscale was a significant predictor. These results replicated findings from the Bensley et al. ( 2022 ) study and supported an elaborated dual-process model of unsubstantiated belief. The SSS was not only a substantial negative predictor; it was also negatively correlated with the Gullibility Scale, as expected.

These results suggest that both CT-related dispositions and CT skills are related to the endorsement of unsubstantiated beliefs. However, a measure of general cognitive ability or intelligence must be examined along with measures of CT and unsubstantiated beliefs to determine whether CT goes beyond intelligence in predicting unsubstantiated beliefs. In one of the few studies that included a measure of cognitive ability, Stahl and van Prooijen ( 2018 ) found that dispositional characteristics helped account for acceptance of conspiracies and paranormal belief beyond cognitive ability. Using the Importance of Rationality Scale (IRS), a rational–analytic scale designed to measure skepticism towards unsubstantiated beliefs, they found that the IRS was negatively correlated with paranormal belief and belief in conspiracy theories. In separate hierarchical regressions, cognitive ability was the strongest negative predictor of both paranormal and conspiracy belief; IRS scores in combination with cognitive ability negatively predicted endorsement of paranormal belief but did not significantly predict conspiracy theory belief. These results provide partial support that a measure of rational–analytic cognitive style related to skeptical disposition adds to the variance accounted for beyond cognitive ability in negatively predicting unsubstantiated belief.

In another study that included a measure of cognitive ability, Cavojova et al. ( 2019 ) examined how CT-related dispositions and the Scientific Reasoning Scale (SRS) were related to a measure of paranormal, pseudoscientific, and conspiracy theory beliefs. The SRS of Drummond and Fischhoff ( 2017 ) likely measures CT skill in that it measures the ability to evaluate scientific research and evidence. As expected, the unsubstantiated belief measure was negatively correlated with the SRS and with a cognitive ability measure similar to Raven’s Progressive Matrices. Unsubstantiated beliefs were positively correlated with dogmatism (the opposite of open-mindedness) but not with REI rational–analytic cognitive style. The SRS was a significant negative predictor of both unsubstantiated belief and susceptibility to bias beyond the contribution of cognitive ability, but neither dogmatism nor analytic thinking was a significant predictor. Nevertheless, this study provides some support that a measure related to CT reasoning skill accounts for variance in unsubstantiated belief beyond cognitive ability.

The failure of this study to show a correlation between rational–analytic cognitive style and unsubstantiated beliefs, when some other studies have found significant correlations with it and related measures, has implications for the multidimensional assessment of unsubstantiated beliefs. One implication is that the REI rational–analytic scale may not be a strong predictor of unsubstantiated beliefs. In fact, we have recently found the Scientific Skepticism Scale to be a stronger negative predictor ( Bensley et al. 2022 ; Bensley et al. 2023 ), which suggests that other measures related to rational–analytic thinking styles should also be examined. This could help triangulate the contribution of self-report cognitive style measures to the endorsement of unsubstantiated claims, recognizing that the use of self-report measures has a checkered history in psychological research. A second implication is that, once again, measures of critical thinking skill and cognitive ability were negative predictors of unsubstantiated belief, and so they, too, should be included in future assessments of unsubstantiated beliefs.

7. Discussion

This review has provided several lines of evidence supporting the claim that CT goes beyond cognitive ability in accounting for certain real-world outcomes. Participants who thought critically reported fewer of the negative life outcomes not expected to befall critical thinkers. People who endorsed unsubstantiated claims less showed better CT skills, more accurate domain-specific knowledge, and less susceptibility to thinking errors and bias, and they were more disposed to think critically. More specifically, they tended to be more scientifically skeptical and to adopt a rational–analytic cognitive style, whereas those who endorsed such claims more tended to be more cynical and to adopt an intuitive–experiential cognitive style. These characteristics go beyond what standardized intelligence tests test. In some studies, the CT measures accounted for additional variance beyond that contributed by general cognitive ability.

That is not to say that measures of general cognitive ability are not useful. As noted by Gottfredson ( 2004 ), g is a highly successful predictor of academic and job performance, and more is known about g and Gf than about many other psychological constructs. On average, g is closely related to Gf, which is highly correlated with working memory (r = 0.70) and can correlate as highly as r = 0.77 (r² = 0.60) based on a correlated two-factor model ( Gignac 2014 ). Because modern working memory theory is itself a powerful theory ( Chai et al. 2018 ), this lends construct validity to the fluid intelligence construct. Although cognitive scientists have clearly made progress in understanding the executive processes underlying intelligence, they have not yet identified the specific cognitive components of intelligence ( Sternberg 2022 ). Moreover, theorists have acknowledged that intelligence must also include components beyond g, including domain-specific knowledge ( Ackerman 2022 ; Conway and Kovacs 2018 ), which are not yet clearly understood.

This review also pointed to limitations in the research that should be addressed. So far, few studies of unsubstantiated beliefs have included measures of intelligence, and those that have often used proxies for intelligence test scores, such as SAT scores. Future studies, besides using more and better measures of intelligence, could benefit from including more specifically focused measures, such as measures of Gf and Gc. Also, more research should be conducted to develop additional high-quality measures of CT, including ones that assess specific reasoning skills and knowledge relevant to thinking about a subject, which could help resolve perennial questions about the domain-general versus domain-specific nature of intelligence and CT. Overall, the results of this review encourage taking a multidimensional approach to investigating the complex constructs of intelligence, CT, and unsubstantiated belief. Supporting these recommendations were the studies in which the improvement accrued from explicit CT skill instruction could be more fully understood when CT skills, relevant knowledge, CT dispositions, metacognitive monitoring accuracy, and a proxy for intelligence were all assessed.

8. Conclusions

Critical thinking, broadly conceived, offers ways to understand real-world outcomes of thinking beyond what general cognitive ability can provide and what intelligence tests test. A multidimensional view of CT, which includes specific reasoning and metacognitive skills, CT dispositions, and relevant knowledge, can add to our understanding of why some people endorse unsubstantiated claims more than others do. Although general cognitive ability and domain-general knowledge often contribute to performance on CT tasks, thinking critically about real-world questions also involves applying rules, criteria, and knowledge specific to the question under consideration, as well as the appropriate dispositions and cognitive styles for deploying these.

Despite the advantages of this multidimensional approach to CT for more fully understanding everyday thinking and irrationality, it presents challenges for researchers and instructors. It implies the need to assess and instruct multidimensionally, including not only measures of reasoning skills but also measures addressing thinking errors and biases, dispositions, the knowledge relevant to a task, and the accuracy of metacognitive judgments. As noted by Dwyer ( 2023 ), adopting a more complex conceptualization of CT beyond just skills is needed, but it presents challenges for those seeking to improve students’ CT. Nevertheless, the research reviewed suggests that taking this multidimensional approach can enhance our understanding of the endorsement of unsubstantiated claims beyond what standardized intelligence tests contribute. More research is needed to resolve remaining controversies and to develop evidence-based applications of the findings.

Funding Statement

This research received no external funding.

Institutional Review Board Statement

This research involved no new testing of participants and hence did not require Institutional Review Board approval.

Informed Consent Statement

This research involved no new testing of participants and hence did not require an Informed Consent Statement.

Data Availability Statement

Conflicts of Interest

The author declares no conflict of interest.

Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

  • Ackerman Phillip L. Intelligence … Moving beyond the lowest common denominator. American Psychologist. 2022; 78: 283–97. doi: 10.1037/amp0001057.
  • Alper Sinan, Bayrak Fatih, Yilmaz Onurcan. Psychological correlates of COVID-19 conspiracy beliefs and preventive measures: Evidence from Turkey. Current Psychology. 2020; 40: 5708–17. doi: 10.1007/s12144-020-00903-0.
  • Bensley D. Alan. Critical Thinking in Psychology and Everyday Life: A Guide to Effective Thinking. Worth Publishers; New York: 2018.
  • Bensley D. Alan. The Critical Thinking in Psychology Assessment Battery (CTPAB) and Test Guide. 2021. Unpublished manuscript. Frostburg State University, Frostburg, MD, USA.
  • Bensley D. Alan. “I can’t believe you believe that”: Identifying unsubstantiated claims. Skeptical Inquirer. 2023; 47: 53–56.
  • Bensley D. Alan, Spero Rachel A. Improving critical thinking skills and metacognitive monitoring through direct infusion. Thinking Skills and Creativity. 2014; 12: 55–68. doi: 10.1016/j.tsc.2014.02.001.
  • Bensley D. Alan, Lilienfeld Scott O. Assessment of unsubstantiated beliefs. Scholarship of Teaching and Learning in Psychology. 2020; 6: 198–211. doi: 10.1037/stl0000218.
  • Bensley D. Alan, Masciocchi Christopher M., Rowan Krystal A. A comprehensive assessment of explicit critical thinking instruction on recognition of thinking errors and psychological misconceptions. Scholarship of Teaching and Learning in Psychology. 2021; 7: 107. doi: 10.1037/stl0000188.
  • Bensley D. Alan, Watkins Cody, Lilienfeld Scott O., Masciocchi Christopher, Murtagh Michael, Rowan Krystal. Skepticism, cynicism, and cognitive style predictors of the generality of unsubstantiated belief. Applied Cognitive Psychology. 2022; 36: 83–99. doi: 10.1002/acp.3900.
  • Bensley D. Alan, Rodrigo Maria, Bravo Maria, Jocoy Kathleen. Dual-Process Theory and Cognitive Style Predictors of the General Endorsement of Unsubstantiated Claims. 2023. Unpublished manuscript. Frostburg State University, Frostburg, MD, USA.
  • Bensley D. Alan, Lilienfeld Scott O., Powell Lauren. A new measure of psychological misconceptions: Relations with academic background, critical thinking, and acceptance of paranormal and pseudoscientific claims. Learning and Individual Differences. 2014; 36: 9–18. doi: 10.1016/j.lindif.2014.07.009.
  • Bierwiaczonek Kinga, Kunst Jonas R., Pich Olivia. Belief in COVID-19 conspiracy theories reduces social distancing over time. Applied Psychology: Health and Well-Being. 2020; 12: 1270–85. doi: 10.1111/aphw.12223.
  • Butler Heather A. Halpern Critical Thinking Assessment predicts real-world outcomes of critical thinking. Applied Cognitive Psychology. 2012; 26: 721–29. doi: 10.1002/acp.2851.
  • Butler Heather A., Halpern Diane F. Is critical thinking a better model of intelligence? In: Sternberg Robert J., editor. The Nature of Intelligence. Cambridge University Press; Cambridge: 2019. pp. 183–96.
  • Butler Heather A., Pentoney Christopher, Bong Maebelle P. Predicting real-world outcomes: Critical thinking ability is a better predictor of life decisions than intelligence. Thinking Skills and Creativity. 2017; 25: 38–46. doi: 10.1016/j.tsc.2017.06.005.
  • Byrnes James P., Dunbar Kevin N. The nature and development of critical-analytic thinking. Educational Psychology Review. 2014; 26: 477–93. doi: 10.1007/s10648-014-9284-0.
  • Cacioppo John T., Petty Richard E. The need for cognition. Journal of Personality and Social Psychology. 1982; 42: 116–31. doi: 10.1037/0022-3514.42.1.116.
  • Cacioppo John T., Petty Richard E., Feinstein Jeffrey A., Jarvis W. Blair G. Dispositional differences in cognitive motivation: The life and times of individuals varying in need for cognition. Psychological Bulletin. 1996; 119: 197. doi: 10.1037/0033-2909.119.2.197.
  • Cavojova Vladimira, Srol Jakub, Jurkovic Marek. Why should we think like scientists? Scientific reasoning and susceptibility to epistemically suspect beliefs and cognitive biases. Applied Cognitive Psychology. 2019; 34: 85–95. doi: 10.1002/acp.3595.
  • Chai Wen Jia, Abd Hamid Aini Ismafairus, Abdullah Jafri Malin. Working memory from the psychological and neurosciences perspectives: A review. Frontiers in Psychology. 2018; 9: 401. doi: 10.3389/fpsyg.2018.00401.
  • Conway Andrew R., Kovacs Kristof. The nature of the general factor of intelligence. In: Sternberg Robert J., editor. The Nature of Human Intelligence. Cambridge University Press; Cambridge: 2018. pp. 49–63.
  • Drummond Caitlin, Fischhoff Baruch. Development and validation of the Scientific Reasoning Scale. Journal of Behavioral Decision Making. 2017; 30: 26–38. doi: 10.1002/bdm.1906.
  • Dwyer Christopher P. Critical Thinking: Conceptual Perspectives and Practical Guidelines. Cambridge University Press; Cambridge: 2017.
  • Dwyer Christopher P. An evaluative review of barriers to critical thinking in educational and real-world settings. Journal of Intelligence. 2023; 11: 105. doi: 10.3390/jintelligence11060105.
  • Ennis Robert H. A taxonomy of critical thinking dispositions and abilities. In: Baron Joan, Sternberg Robert, editors. Teaching Thinking Skills: Theory and Practice. W. H. Freeman; New York: 1987.
  • Epstein Seymour. Intuition from the perspective of cognitive-experiential self-theory. In: Plessner Henning, Betsch Tilmann, editors. Intuition in Judgment and Decision Making. Erlbaum; Washington, DC: 2008. pp. 23–37.
  • Fasce Angelo, Pico Alfonso. Science as a vaccine: The relation between scientific literacy and unwarranted beliefs. Science & Education. 2019; 28: 109–25. doi: 10.1007/s11191-018-00022-0.
  • Frederick Shane. Cognitive reflection and decision making. Journal of Economic Perspectives. 2005; 19: 25–42. doi: 10.1257/089533005775196732.
  • Gardner Howard. Intelligence Reframed: Multiple Intelligences for the 21st Century. Basic Books; New York: 1999.
  • Genovese Jeremy E. C. Paranormal beliefs, schizotypy, and thinking styles among teachers and future teachers. Personality and Individual Differences. 2005; 39: 93–102. doi: 10.1016/j.paid.2004.12.008.
  • Gignac Gilles E. Fluid intelligence shares closer to 60% of its variance with working memory capacity and is a better indicator of general intelligence. Intelligence. 2014; 47: 122–33. doi: 10.1016/j.intell.2014.09.004.
  • Gottfredson Linda S. Life, death, and intelligence. Journal of Cognitive Education and Psychology. 2004; 4: 23–46. doi: 10.1891/194589504787382839.
  • Halpern Diane F., Dunn Dana. Critical thinking: A model of intelligence for solving real-world problems. Journal of Intelligence. 2021; 9: 22. doi: 10.3390/jintelligence9020022.
  • Halpern Diane F., Butler Heather A. Is critical thinking a better model of intelligence? In: Sternberg Robert J., editor. The Nature of Human Intelligence. Cambridge University Press; Cambridge: 2018. pp. 183–96.
  • Irwin Harvey J., Young J. M. Intuitive versus reflective processes in the formation of paranormal beliefs. European Journal of Parapsychology. 2002; 17: 45–55.
  • Jolley Daniel, Paterson Jenny L. Pylons ablaze: Examining the role of 5G COVID-19 conspiracy beliefs and support for violence. British Journal of Social Psychology. 2020; 59: 628–40. doi: 10.1111/bjso.12394.
  • Kahneman Daniel. Thinking, Fast and Slow. Farrar, Straus and Giroux; New York: 2011.
  • Kowalski Patricia, Taylor Annette J. Ability and critical thinking as predictors of change in students’ psychological misconceptions. Journal of Instructional Psychology. 2004; 31: 297–303.
  • Ku Kelly Y. L., Ho Irene T. Dispositional factors predicting Chinese students’ critical thinking performance. Personality and Individual Differences. 2010; 48: 54–58. doi: 10.1016/j.paid.2009.08.015.
  • Kunda Ziva. The case for motivated reasoning. Psychological Bulletin. 1990; 108: 480–98. doi: 10.1037/0033-2909.108.3.480.
  • Lantian Anthony, Bagneux Virginie, Delouvee Sylvain, Gauvrit Nicolas. Maybe a free thinker but not a critical one: High conspiracy belief is associated with low critical thinking ability. Applied Cognitive Psychology. 2020; 35: 674–84. doi: 10.1002/acp.3790.
  • Lilienfeld Scott O. Psychological treatments that cause harm. Perspectives on Psychological Science. 2007; 2: 53–70. doi: 10.1111/j.1745-6916.2007.00029.x.
  • Lindeman Marjaana. Biases in intuitive reasoning and belief in complementary and alternative medicine. Psychology and Health. 2011; 26: 371–82. doi: 10.1080/08870440903440707.
  • Lindeman Marjaana, Aarnio Kia. Paranormal beliefs: Their dimensionality and correlates. European Journal of Personality. 2006; 20: 585–602.
  • Lobato Emilio J., Mendoza Jorge, Sims Valerie, Chin Matthew. Explaining the relationship between conspiracy theories, paranormal beliefs, and pseudoscience acceptance among a university population. Applied Cognitive Psychology. 2014; 28: 617–25. doi: 10.1002/acp.3042.
  • Maqsood Alisha, Jamil Farhat, Khalid Ruhi. Thinking styles and belief in superstitions: Moderating role of gender in young adults. Pakistan Journal of Psychological Research. 2018; 33: 335–48.
  • McCutcheon Lynn E., Apperson Jennifer M., Hanson Esther, Wynn Vincent. Relationships among critical thinking skills, academic achievement, and misconceptions about psychology. Psychological Reports. 1992; 71: 635–39. doi: 10.2466/pr0.1992.71.2.635.
  • McGrew Kevin S. CHC theory and the human cognitive abilities project: Standing on the shoulders of the giants of psychometric intelligence research. Intelligence. 2009; 37: 1–10. doi: 10.1016/j.intell.2008.08.004.
  • Morgan Jonathan. Religion and dual-process cognition: A continuum of styles or distinct types? Religion, Brain & Behavior. 2016; 6: 112–29. doi: 10.1080/2153599X.2014.966315.
  • Nie Fanhao, Olson Daniel V. A. Demonic influence: The negative mental health effects of belief in demons. Journal for the Scientific Study of Religion. 2016; 55: 498–515. doi: 10.1111/jssr.12287.
  • Pacini Rosemary, Epstein Seymour. The relation of rational and experiential information processing styles to personality, basic beliefs, and the ratio-bias phenomenon. Journal of Personality and Social Psychology. 1999; 76: 972–87. doi: 10.1037/0022-3514.76.6.972.
  • Patel Niraj, Baker S. Glenn, Scherer Laura D. Evaluating the cognitive reflection test as a measure of intuition/reflection, numeracy, and insight problem solving, and the implications for understanding real-world judgments and beliefs. Journal of Experimental Psychology: General. 2019; 148: 2129–53. doi: 10.1037/xge0000592.
  • Pennycook Gordon, Cheyne James Allan, Barr Nathaniel, Koehler Derek J., Fugelsang Jonathan A. On the reception and detection of pseudo-profound bullshit. Judgment and Decision Making. 2015; 10: 549–63. doi: 10.1017/S1930297500006999.
  • Pennycook Gordon, Cheyne James Allan, Seli Paul, Koehler Derek J., Fugelsang Jonathan A. Analytic cognitive style predicts religious and paranormal belief. Cognition. 2012; 123: 335–46. doi: 10.1016/j.cognition.2012.03.003.
  • Ren Xuezhu, Tong Yan, Peng Peng, Wang Tengfei. Critical thinking predicts academic performance beyond general cognitive ability: Evidence from adults and children. Intelligence. 2020; 82: 101487. doi: 10.1016/j.intell.2020.101487.
  • Rogers Paul, Fisk John E., Lowrie Emma. Paranormal belief, thinking style preference and susceptibility to confirmatory conjunction errors. Consciousness and Cognition. 2018; 65: 182–95. doi: 10.1016/j.concog.2018.07.013.
  • Saher Marieke, Lindeman Marjaana. Alternative medicine: A psychological perspective. Personality and Individual Differences. 2005; 39: 1169–78. doi: 10.1016/j.paid.2005.04.008.
  • Sosu Edward M. The development and psychometric validation of a Critical Thinking Disposition Scale. Thinking Skills and Creativity. 2013; 9: 107–19. doi: 10.1016/j.tsc.2012.09.002.
  • Stahl Tomas, van Prooijen Jan-Willem. Epistemic rationality: Skepticism toward unfounded beliefs requires sufficient cognitive ability and motivation to be rational. Personality and Individual Differences. 2018; 122: 155–63. doi: 10.1016/j.paid.2017.10.026.
  • Stanovich Keith E. What Intelligence Tests Miss: The Psychology of Rational Thought. Yale University Press; New Haven: 2009.
  • Stanovich Keith E., West Richard F. Reasoning independently of prior belief and individual differences in actively open-minded thinking. Journal of Educational Psychology. 1997; 89: 342–57. doi: 10.1037/0022-0663.89.2.342.
  • Stanovich Keith E., West Richard F. Natural myside bias is independent of cognitive ability. Thinking & Reasoning. 2007; 13: 225–47.
  • Stanovich Keith E., West Richard F. On the failure of cognitive ability to predict myside and one-sided thinking bias. Thinking & Reasoning. 2008; 14: 129–67. doi: 10.1080/13546780701679764.
  • Stanovich Keith E., West Richard F., Toplak Maggie E. The Rationality Quotient: Toward a Test of Rational Thinking. The MIT Press; Cambridge, MA: 2018. [ Google Scholar ]
  • Sternberg Robert J. The Triarchic Mind: A New Theory of Intelligence. Penguin Press; London: 1988. [ Google Scholar ]
  • Sternberg Robert J. A theory of adaptive intelligence and its relation to general intelligence. Journal of Intelligence. 2019; 7 :23. doi: 10.3390/jintelligence7040023. [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Sternberg Robert J. The search for the elusive basic processes underlying human intelligence: Historical and contemporary perspectives. Journal of Intelligence. 2022; 10 :28. doi: 10.3390/jintelligence10020028. [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Swami Viren, Voracek Martin, Stieger Stefan, Tran Ulrich S., Furnham Adrian. Analytic thinking reduces belief in conspiracy theories. Cognition. 2014; 133 :572–85. doi: 10.1016/j.cognition.2014.08.006. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Teunisse Alessandra K., Case Trevor I., Fitness Julie, Sweller Naomi. I should have known better: Development of a self-report measure of gullibility. Personality and Social Psychology Bulletin. 2019; 46 :408–23. doi: 10.1177/0146167219858641. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Tobacyk Jerome J. A revised paranormal belief scale. The International Journal of Transpersonal Studies. 2004; 23 :94–98. doi: 10.24972/ijts.2004.23.1.94. [ CrossRef ] [ Google Scholar ]
  • van der Linden Sander. The conspiracy-effect: Exposure to conspiracy theories (about global warming) leads to decreases pro-social behavior and science acceptance. Personality and Individual Differences. 2015; 87 :173–75. doi: 10.1016/j.paid.2015.07.045. [ CrossRef ] [ Google Scholar ]
  • van Prooijen Jan-Willem. Belief in conspiracy theories: Gullibility or rational skepticism? In: Forgas Joseph P., Baumeister Roy F., editors. The Social Psychology of Gullibility: Fake News, Conspiracy Theories, and Irrational Beliefs. Routledge; London: 2019. pp. 319–32. [ Google Scholar ]
  • Wechsler David. The Measurement of Intelligence. 3rd ed. Williams & Witkins; Baltimore: 1944. [ Google Scholar ]
  • West Richard F., Toplak Maggie E., Stanovich Keith E. Heuristics and biases as measures of critical thinking: Associations with cognitive ability and thinking dispositions. Journal of Educational Psychology. 2008; 100 :930–41. doi: 10.1037/a0012842. [ CrossRef ] [ Google Scholar ]
