What Is Critical Thinking? | Definition & Examples

Published on May 30, 2022 by Eoghan Ryan. Revised on May 31, 2023.

Critical thinking is the ability to effectively analyze information and form a judgment.

To think critically, you must be aware of your own biases and assumptions when encountering information, and apply consistent standards when evaluating sources.

Critical thinking skills help you to:

  • Identify credible sources
  • Evaluate and respond to arguments
  • Assess alternative viewpoints
  • Test hypotheses against relevant criteria

Table of contents

  • Why is critical thinking important?
  • Critical thinking examples
  • How to think critically
  • Other interesting articles
  • Frequently asked questions about critical thinking

Why is critical thinking important?

Critical thinking is important for making judgments about sources of information and forming your own arguments. It emphasizes a rational, objective, and self-aware approach that can help you to identify credible sources and strengthen your conclusions.

Critical thinking is important in all disciplines and throughout all stages of the research process. The types of evidence used in the sciences and in the humanities may differ, but critical thinking skills are relevant to both.

In academic writing , critical thinking can help you to determine whether a source:

  • Is free from research bias
  • Provides evidence to support its research findings
  • Considers alternative viewpoints

Outside of academia, critical thinking goes hand in hand with information literacy to help you form opinions rationally and engage independently and critically with popular media.


Critical thinking examples

Critical thinking can help you to identify reliable sources of information that you can cite in your research paper. It can also guide your own research methods and inform your own arguments.

Outside of academia, critical thinking can help you to be aware of both your own and others’ biases and assumptions.

Academic examples

However, when you compare the findings of the study with other current research, you determine that the results seem improbable. You analyze the paper again, consulting the sources it cites.

You notice that the research was funded by the pharmaceutical company that created the treatment. Because of this, you view its results skeptically and determine that more independent research is necessary to confirm or refute them.

Example: Poor critical thinking in an academic context
You’re researching a paper on the impact wireless technology has had on developing countries that previously did not have large-scale communications infrastructure. You read an article that seems to confirm your hypothesis: the impact is mainly positive. Rather than evaluating the research methodology, you accept the findings uncritically.

Nonacademic examples

However, you decide to compare this review article with consumer reviews on a different site. You find that these reviews are not as positive. Some customers have had problems installing the alarm, and some have noted that it activates for no apparent reason.

You revisit the original review article. You notice that the words “sponsored content” appear in small print under the article title. Based on this, you conclude that the review is advertising and is therefore not an unbiased source.

Example: Poor critical thinking in a nonacademic context
You support a candidate in an upcoming election. You visit an online news site affiliated with their political party and read an article that criticizes their opponent. The article claims that the opponent is inexperienced in politics. You accept this without evidence, because it fits your preconceptions about the opponent.

How to think critically

There is no single way to think critically. How you engage with information will depend on the type of source you’re using and the information you need.

However, you can engage with sources in a systematic and critical way by asking certain questions when you encounter information. Like the CRAAP test, these questions focus on the currency, relevance, authority, accuracy, and purpose of a source of information.

When encountering information, ask:

  • Who is the author? Are they an expert in their field?
  • What do they say? Is their argument clear? Can you summarize it?
  • When did they say this? Is the source current?
  • Where is the information published? Is it an academic article? Is it peer-reviewed?
  • Why did the author publish it? What is their motivation?
  • How do they make their argument? Is it backed up by evidence? Does it rely on opinion, speculation, or appeals to emotion? Do they address alternative arguments?
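As a loose illustration, the questions above can be treated as a reusable checklist applied to every source you encounter. The field names and function below are invented for this sketch and are not part of any real tool:

```python
# A toy source-evaluation checklist modeled on the CRAAP-style questions above.
# All identifiers here are illustrative, not from any real library.

CHECKLIST = {
    "who":   "Is the author an expert in their field?",
    "what":  "Is the argument clear? Can you summarize it?",
    "when":  "Is the source current?",
    "where": "Is it published in a peer-reviewed venue?",
    "why":   "Is the author's motivation for publishing sound?",
    "how":   "Is the argument backed by evidence rather than opinion or emotion?",
}

def review(answers):
    """Return the questions still unanswered (or answered 'no') for a source."""
    return [q for key, q in CHECKLIST.items() if not answers.get(key, False)]

# Example: a source with a known expert author and a recent publication date,
# but with the other checks still unresolved.
flags = review({"who": True, "when": True})
print(f"{len(flags)} unresolved checks")
```

The point of the sketch is only that the questions are fixed while the answers vary per source, which is what makes the evaluation systematic rather than ad hoc.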

Critical thinking also involves being aware of your own biases, not only those of others. When you make an argument or draw your own conclusions, you can ask similar questions about your own writing:

  • Am I only considering evidence that supports my preconceptions?
  • Is my argument expressed clearly and backed up with credible sources?
  • Would I be convinced by this argument coming from someone else?

Other interesting articles

If you want to know more about ChatGPT, AI tools, citation, and plagiarism, make sure to check out some of our other articles with explanations and examples.

  • ChatGPT vs human editor
  • ChatGPT citations
  • Is ChatGPT trustworthy?
  • Using ChatGPT for your studies
  • What is ChatGPT?
  • Chicago style
  • Paraphrasing

 Plagiarism

  • Types of plagiarism
  • Self-plagiarism
  • Avoiding plagiarism
  • Academic integrity
  • Consequences of plagiarism
  • Common knowledge


Frequently asked questions about critical thinking

Critical thinking refers to the ability to evaluate information and to be aware of biases or assumptions, including your own.

Like information literacy, it involves evaluating arguments, identifying and solving problems in an objective and systematic way, and clearly communicating your ideas.

Critical thinking skills include the ability to:

  • Identify credible sources
  • Evaluate and respond to arguments
  • Assess alternative viewpoints
  • Test hypotheses against relevant criteria

You can assess information and arguments critically by asking certain questions about the source. You can use the CRAAP test, focusing on the currency, relevance, authority, accuracy, and purpose of a source of information.

Ask questions such as:

  • Who is the author? Are they an expert?
  • How do they make their argument? Is it backed up by evidence?

A credible source should pass the CRAAP test and follow these guidelines:

  • The information should be up to date and current.
  • The author and publication should be a trusted authority on the subject you are researching.
  • The sources the author cited should be easy to find, clear, and unbiased.
  • For a web source, the URL and layout should signify that it is trustworthy.

Information literacy refers to a broad range of skills, including the ability to find, evaluate, and use sources of information effectively.

Being information literate means that you:

  • Know how to find credible sources
  • Use relevant sources to inform your research
  • Understand what constitutes plagiarism
  • Know how to cite your sources correctly

Confirmation bias is the tendency to search for, interpret, and recall information in a way that aligns with our pre-existing values, opinions, or beliefs. We tend to recall information best when it amplifies what we already believe and, relatedly, to forget information that contradicts our opinions.

Although selective recall is a component of confirmation bias, it should not be confused with recall bias.

On the other hand, recall bias refers to the differences in the ability between study participants to recall past events when self-reporting is used. This difference in accuracy or completeness of recollection is not related to beliefs or opinions. Rather, recall bias relates to other factors, such as the length of the recall period, age, and the characteristics of the disease under investigation.

Cite this Scribbr article

If you want to cite this source, you can copy and paste the citation or click the “Cite this Scribbr article” button to automatically add the citation to our free Citation Generator.

Ryan, E. (2023, May 31). What Is Critical Thinking? | Definition & Examples. Scribbr. Retrieved April 12, 2024, from https://www.scribbr.com/working-with-sources/critical-thinking/



Eight Types of Evidence – Strengths and Weaknesses

Overview: The ability to distinguish sources of evidence allows students to better evaluate and generate information in support of arguments.

Evidence is a huge component of reasoning and argument. Understanding how evidence works and how it might be questioned, probed, or attacked significantly boosts students’ reasoning ability. The following material offers a vocabulary that can operate as a toolkit for any task that requires analysis or generation of evidence.

1. Personal Experience – It happened to you. You know what bronchitis feels like because you had it last year, and it was terrible.

Strengths – Emotionally intense and relevant, collected by your very own senses.

Weaknesses  – The way you interpret your own experiences is very personal and based on your own expectations and biases. Also, your senses have all sorts of flaws, as does your memory. You remember events and moments that are bizarre, intense, or otherwise of interest to you, which is a small sliver of the world around you.

2. Personal Observation – You saw or measured the event. You haven’t had a migraine, but your mom gets them and you have witnessed how painful and awful they can be.

Strengths – Collected by the senses; scientific measurement techniques can carefully and cleverly isolate the information you are seeking.

Weaknesses – The same as Personal Experience; in addition, scientific measurements can be corrupted by factors you didn’t anticipate.

3. Testimonial – The experience or observation of someone else; a witness. My friend saw a guy with pink eye yesterday. He said it was pretty gross.

Strengths – They were there, and there is emotional weight in hearing someone’s story or claim. We want to believe one another because lying is so dangerous to our social fabric.

Weaknesses – The person might be mistaken (see weaknesses of Personal Experience), lying, or leaving out important details.

4. Appeal to Authority – The experience or observations of a learned and/or respected person; an expert. My brother is a doctor and treated a guy with a broken arm. He told me that broken bones don’t always hurt as much as you would expect.

Strengths – This person presumably has a lot of access to information, a depth of experience, and a professional reputation on the line.

Weaknesses – Same as Testimonial; also, the person’s expertise could be based on a depth of experience in a field separate from the one we’re dealing with. (See Appeal to Questionable Authority Fallacy.)

5. Case Examples – Historical, literary, or other recorded examples. They could be the statements of witnesses or experts, or they could be more general events that we cite to support our claim. War is terrible for soldiers on the ground. You can read all about it in many Civil War diaries.

Strengths – The same data is available to everyone; you can carefully seek out and find examples that support your claim (see Confirmation Bias); vivid examples carry emotional weight.

Weaknesses – Examples might be isolated and/or unrepresentative of “normal” experience (see Hasty Generalization ).

6. Research Studies   – Large sample of carefully gathered  information scrutinized with statistical tools and peer-reviewed by other experts.

Strengths – Large samples protect against Hasty Generalizations, and the same data is available to everyone.

Weaknesses  – There is a long list of potential pitfalls to good research. They include poor design, poor data gathering, and poor data analysis. There are conflicting studies which cite different parts of the same data, and there are weak studies published to push a political agenda.

7. Analogy – Citing a similar circumstance; if it worked in that ugly situation, it will work in this ugly situation. If cigarettes give mice cancer, they probably give humans cancer.

Strengths – Much of life follows general rules; if something works in one place, there’s a pretty good chance it will work in another place.

Weaknesses – Places can be different! You have to look at salient details (a.k.a. the details that actually contribute to whatever it is you are looking at). If a flying squirrel can fall from a tall building and survive, I should be able to do the same thing. We are both mammals! (See Bad Analogy Fallacy.)

8. Intuition – Your gut feeling, presumably based on years of experience. It feels true. The inferences that pop into your head first are likely to be based on intuition rather than research studies or other types of evidence. If you hear a bump in the night, the weight of your experience will offer a causal inference, and if that inference isn’t dangerous (“it was just the wind!”), you will likely just go back to sleep.

Strengths  – For most issues, our experience is a good guide to life. We have built a pretty good picture of the world, and we can generally rely on it to stay consistent. Malcolm Gladwell explains the power of quick inferences in his book Blink , and Daniel Kahneman describes it as “fast thinking” in his book Thinking, Fast and Slow .

Weaknesses – Your experience is personal and unique. Other people have had different experiences and will therefore have different gut feelings. There is no way to prove that your intuition is correct. If people trust it, it’s because you have been right many times in the past, and they therefore trust you to be right again (see Appeal to Questionable Authority). All of the fallacies and biases that lead us to make weak inferences are relevant here.

Again, this list is adapted from Asking the Right Questions  by Neil Browne and Stuart Keeley, which offers a more in-depth look at each type of evidence. I’ve simplified and adapted their work to serve as an introduction to students new to this approach.



Social Sci LibreTexts

6.2: Defining Evidence


  • Jim Marteney
  • Los Angeles Valley College via ASCCC Open Educational Resources Initiative (OERI)

What is evidence? According to Rieke and Sillars, “Evidence refers to specific instances, statistics, and testimony, when they support a claim in such a way as to cause the decision maker(s) to grant adherence to that claim.” 1


Evidence is information that answers the question “How do you know?” of a contention you have made. Please take that question very literally. It is often hard to tell the difference at first between telling someone what you know and telling them how you know it. To become an effective arguer in almost any context, you need to be able to ask this question repeatedly and test the answers you hear to determine the strength of the evidence.

Only experts can use phrases like "I think," "I feel," or "I believe," as they have the qualifications that allow you to accept their observations. The rest of us need to use evidence to support our arguments. As a critical thinker, you should rely much more on what a person can prove than on what a person "feels."

Evidence is a term commonly used to describe the supporting material used when persuading others. Evidence gives an objective support to your arguments, and makes your arguments more than a mere collection of personal opinions or prejudices. No longer are you saying, “I believe” or “I think” or “In my opinion.” Now you can support your assertions with evidence. Because you are asking your audience to take a risk when you attempt to persuade them, audiences will demand support for your assertions. Evidence needs to be carefully chosen to serve the needs of the claim and to reach the target audience.

An argument is designed to persuade a resistant audience to accept a claim via the presentation of evidence for the contentions being argued. Evidence establishes the amount of accuracy your arguments have. Evidence is one element of proof (the second is reasoning), that is used as a means of moving your audience toward the threshold necessary for them to grant adherence to your arguments.

Quality argumentation depends in part on the quantity and diversity of evidence. The arguer should expect audiences not to be persuaded by limited evidence or by a lack of variety and scope, that is, evidence drawn from only one source rather than from diverse sources. On the other hand, too much evidence, particularly when not carefully crafted, may leave the audience overwhelmed and without focus. Evidence in support of the different contentions in the argument needs to make the argument reasonable enough to be accepted by the target audience.

Challenge of Too Much Evidence

I attended a lecture years ago where the guest speaker told us that we have access to more information in one edition of the New York Times than a man in the Middle Ages had in his entire lifetime. The challenge is not finding information; the challenge is sorting through information to find quality evidence to use in our arguments and decision-making. In his book, Data Smog: Surviving the Information Glut, David Shenk expresses his concern in the first chapter:

“Information has also become a lot cheaper--to produce, to manipulate, to disseminate. All of this has made us information-rich, empowering Americans with the blessings of applied knowledge. It has also, though, unleashed the potential of information-gluttony...How much of the information in our midst is useful, and how much of it gets in the way? ...

As we have accrued more and more of it, information has emerged not only as a currency, but also as a pollutant."

  • In 1971 the average American was targeted by at least 560 daily advertising messages. Twenty years later, that number had risen sixfold, to 3,000 messages per day.
  • In the office, an average of 60 percent of each person's time is now spent processing documents.
  • Paper consumption per capita in the United States tripled from 1940 to 1980 (from 200 to 600 pounds), and tripled again from 1980 to 1990 (to 1,800 pounds).
  • In the 1980s, third-class mail (used to send publications) grew thirteen times faster than population growth.
  • Two-thirds of business managers surveyed report tension with colleagues, loss of job satisfaction and strained personal relationships as a result of information overload.
  • More than 1,000 telemarketing companies employ four million Americans, and generate $650 billion in annual sales.

Let us call this unexpected, unwelcome part of our atmosphere "data smog," an expression for the noxious muck and druck of the information age. Data smog gets in the way; it crowds out quiet moments, and obstructs much-needed contemplation. It spoils conversation, literature, and even entertainment. It thwarts skepticism, rendering us less sophisticated as consumers and citizens. It stresses us out.” 2

We need ways of sorting through this information and the first method is understanding the different types of evidence that we encounter.

Sources of Evidence

The first aspect of evidence we need to explore is the actual source of evidence, or where we find evidence. There are two categories of sources: primary and secondary.

Primary Sources

A primary source provides direct or firsthand evidence about an event, object, person, or work of art. Primary sources include historical and legal documents, eyewitness accounts, results of experiments, statistical data, pieces of creative writing, audio and video recordings, speeches, and art objects. Interviews, surveys, fieldwork, and Internet communications via email, blogs, tweets, and newsgroups are also primary sources. In the natural and social sciences, primary sources are often empirical studies—research where an experiment was performed or a direct observation was made. The results of empirical studies are typically found in scholarly articles or papers delivered at conferences. 3

Included in primary sources:

  • Original, first-hand accounts of events, activity or time period
  • Factual accounts instead of interpretations of accounts or experiments
  • Results of an experiment
  • Reports of scientific discoveries
  • Results of scientifically based polls

Secondary Sources

Secondary sources describe, discuss, interpret, comment upon, analyze, evaluate, summarize, and process primary sources. Secondary source materials can be articles in newspapers or popular magazines, book or movie reviews, or articles found in scholarly journals that discuss or evaluate someone else's original research. 4

Included in secondary sources:

  • Analysis and interpretation of the accounts of primary sources
  • Secondhand accounts of an activity or historical event
  • Analysis and interpretation of scientific or social research results

The key difference between the two sources is how far the author of the evidence is removed from the original event. You want to ask, “Is the author giving you a firsthand account, or a secondhand account?”

Types of Evidence

There are five types of evidence critical thinkers can use to support their arguments: precedent evidence, statistical evidence, testimonial evidence, hearsay evidence, and common knowledge evidence.

Precedent evidence is an act or event which establishes expectations for future conduct. There are two forms of precedent evidence: legal and personal.

Legal precedent is one of the most powerful and most difficult types of evidence to challenge. Courts establish legal precedent. Once a court makes a ruling, that ruling becomes the legal principle upon which other courts base their actions. Legislatures can also establish precedent through the laws they pass and the laws they choose not to pass. Once a principle of law has been established by a legislative body, it is very difficult to reverse.

Personal precedents are the habits and traditions you maintain. They occur as a result of watching the personal actions of others in order to understand the expectations for future behaviors. Younger children in a family watch how the older children are treated in order to see what precedents are being established. Newly hired workers watch what veteran workers do in terms of breaks and lunchtime so that their own actions may be consistent. The first months of a marriage are essentially a time to establish precedent. Who does the cooking, who takes out the garbage, who cleans, and which side of the bed each person gets are precedents established early in a marriage. Once these precedents are displayed, an expectation of the other’s behavior is established. Such precedent is very difficult to alter.

To use either type of precedent as evidence, the arguer refers to how the past event relates to the current situation. In a legal situation, the argument is that the ruling in the current case should be the same as it was in the past, because they represent similar situations. In a personal situation, if you were allowed to stay out all night by your parents "just once," you can use that "just once" as precedent evidence when asking that your curfew be abolished.

Statistical evidence consists primarily of polls, surveys, and experimental results from the laboratory. This type of evidence is the numerical reporting of specific instances. Statistical evidence provides a means for communicating a large number of specific instances without citing each one. Statistics can be manipulated and misused to make the point of the particular advocate.

Don’t accept statistics just because they are numbers. People often fall into the trap of believing whatever a number says, because numbers seem accurate. Statistics are the product of a process subject to human prejudice, bias, and error. Questions on a survey can be biased, the people surveyed can be selectively chosen, comparisons may be made of non-comparable items, and reports of findings can be slanted. Take a look at all the polls that predict an election outcome. You will find variances and differences in the results.

Statistics have to be interpreted. In a debate over the use of lie detector tests to determine guilt or innocence in court, the pro-side cited a study which found that 98% of lie detector tests were accurate. The pro-side interpreted this to mean that lie detector tests were an effective means for determining guilt or innocence. However, the con-side interpreted the statistic to mean that two out of every 100 defendants in this country would be found guilty and punished for a crime they did not commit.
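The two readings of the lie detector statistic come from the same arithmetic; here is a minimal sketch, assuming purely for illustration that the reported accuracy applies uniformly to every test:

```python
# One statistic, two framings, using the numbers from the debate above.
accuracy = 0.98            # the study's reported accuracy
defendants = 100           # consider every 100 defendants tested

correct = accuracy * defendants          # pro-side framing: accurate results
erroneous = (1 - accuracy) * defendants  # con-side framing: erroneous results

print(f"Accurate results per {defendants} tests: {correct:.0f}")
print(f"Erroneous results per {defendants} tests: {erroneous:.0f}")
```

Neither framing changes the number; the critical-thinking work lies in asking what the error rate means for the people on the wrong side of it.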


The great baseball announcer Vin Scully once described a journalist’s misuse of statistics by saying that “he uses statistics like a drunk uses a lamppost, not for illumination but for support.”

Statistics are often no more reliable than other forms of evidence, although people often think they are. Advocates need to carefully analyze how they use statistics when attempting to persuade others. Likewise, the audience needs to question statistics that don't make sense to them.

Testimonial evidence is used for the purpose of assigning motives, assessing responsibilities, and verifying actions for past, present and future events. Testimony is an opinion of reality as stated by another person. There are three forms of testimonial evidence: eyewitness, expert-witness, and historiography.

Eyewitness testimony is a personal declaration as to the accuracy of an event. That is, the person actually saw an event take place and is willing to bear witness to that event. Studies have confirmed that eyewitness testimony, even with all of its problems, is a powerful form of evidence. There seems to be almost something "magical" about a person swearing to "tell the whole truth and nothing but the truth."

Expert-witness evidence calls upon someone qualified to make a personal declaration about the nature of the fact in question. Courts of law make use of experts in such fields as forensics, ballistics, and psychology. The critical thinker uses the credibility of another person to support an argument through statements about the facts or opinions of the situation.

What or who qualifies as an expert witness? Does being a former military officer make someone an expert in military tactics? Often an advocate will simply pick someone who they know the audience will accept. But as an audience we should demand that advocates justify the expertise of their witnesses. As we acquire more knowledge, our standards of what constitutes an expert should rise. We need to distinguish between sources that are merely familiar or likable, such as well-known athletes and entertainers who urge you to buy a particular product, and those who really have the qualities that allow them to make a judgment about a subject in the argumentative environment.

Although expert witness testimony is an important source of evidence, such experts can disagree. In a recent House Energy and Commerce subcommittee hearing, two experts gave opposite testimony, on the same day, on a bill calling for a label on all aspirin containers warning of the drug's often fatal link to Reye's syndrome. The head of the American Academy of Pediatrics gave testimony supporting the link, but Dr. Joseph White, president of The Aspirin Foundation of America, said there was insufficient evidence linking aspirin to Reye's syndrome.

Historiography is the third form of testimonial evidence. In their book, Argumentation and Advocacy, Windes and Hastings write, "Historiographers are concerned in large part with the discovery, use, and verification of evidence. The historian traces influences, assigns motives, evaluates roles, allocates responsibilities, and juxtaposes events in an attempt to reconstruct the past. That reconstruction is no wiser, no more accurate or dependable than the dependability of the evidence the historian uses for his reconstruction." 5

Keep in mind that there are many different ways of determining how history happens. Remember, historians may disagree over why almost any event happened. In the search for how things happen, we get ideas about how to understand our present world's events and what to do about them, if anything.

Primary sources are essential to the study of history. They are the basis for what we know about the distant past and the recent past. Historians must depend on other evidence from the era to determine who said what, who did what, and why.

How successful is the historian in recreating “objective reality?" As noted historian Arthur Schlesinger, Jr. says,

“The sad fact is that, in many cases, the basic evidence for the historian’s reconstruction of the really hard cases does not exist, and the evidence that does exist is often incomplete, misleading, or erroneous. Yet, it is the character of the evidence which establishes the framework within which he writes. He cannot imagine scenes for which he has no citation, invent dialogue for which he has no text, assume relationships for which he has no warrant.”

Historical reconstruction must be done by a qualified individual to be classified as historical evidence. Critical thinkers will find it useful to consider the following three criteria for evaluating historical evidence.

Around 1,000 books are published internationally every day and the total of all printed knowledge doubles every 5 years.

More information is estimated to have been produced in the last 30 years than in the previous 5,000.

----The Reuters Guide to Good Information Strategy 2000

Was the author an eyewitness to what is being described, or is the author considered an authority on the subject? Eyewitness accounts can be the most objective and valuable, but they may also be tainted with bias. If the author professes to be an authority, he or she should present his or her qualifications.

Does the author have a hidden agenda? The author may purposely or unwittingly tell only part of the story. An excerpt may seem to be a straightforward account of the situation, yet the author has selected certain facts, details, and language that advance professional, personal, or political goals or beliefs. An account may be entirely factual, yet its hidden agenda may be to make money for the author or to get even with people the author dislikes.

Does the author have a bias? The author's views may be based on personal prejudice rather than a reasoned conclusion based on facts. Critical thinkers need to notice when an author uses exaggerated language, or fails to acknowledge or dismisses opponents' arguments. Historians may have biases based on their political allegiances: a conservative historian will view events differently than a liberal one. It is important to know the political persuasion of a historian in order to gauge the bias he or she might bring to the specific topic being written about.


Sometimes we think we know our history, but historian Daniel Boorstin puts the ultimate validity and accuracy of historical testimony in perspective when he writes, "Education is learning what you didn't even know you didn't know." Modern techniques of preserving data should make the task of recreating the past easier and add to our education.

Hearsay evidence (also called rumor or gossip evidence) can be defined as an assertion or set of assertions widely repeated from person to person, though its accuracy is unconfirmed by firsthand observation. "Rumor is not always wrong," wrote Tacitus, the Roman historian. A given rumor may be spontaneous or premeditated in origin. It may consist of opinion represented as fact, a nugget of accuracy garbled or misrepresented to the point of falsehood, exaggerations, or outright, intentional lies. Yet, hearsay may well be the "best available evidence" in certain situations where the original source of the information cannot be produced.

Rumor, gossip, or hearsay evidence carries a proportionately higher risk of distortion and error than other types of evidence. However, outside the courtroom it can be as effective as any other form of evidence in proving your point. Large companies often rely on this type of evidence because they lack the capability to deliver other types.

A recent rumor claimed that actor Morgan Freeman had died. A Facebook page announcing the actor's passing was created and soon gained more than 60,000 followers, many of whom left condolences and messages of tribute. There was only one problem: Morgan Freeman was very much alive (which is not so much a problem, especially for Morgan Freeman). The Internet is a very effective tool when it comes to spreading rumors.

Common knowledge evidence is also a way to support one’s arguments. This type of evidence is most useful in providing support for arguments which lack any real controversy. Many claims are supported by evidence that comes as no particular surprise to anyone.

Basing an argument on common knowledge is the easiest method of securing belief in an idea, because an audience will accept it without further challenge. As Communication Professors Patterson and Zarefsky explain:

“Many argumentative claims we make are based on knowledge generally accepted by most people as true. For example, if you claimed that millions of Americans watch television each day, the claim would probably be accepted without evidence. Nor would you need to cite opinions or survey results to get most people to accept the statement that millions of people smoke cigarettes." 6

Credibility of Evidence or How Good Is It?

In order to tell us how you know something, you need to tell us where the information came from. If you personally observed the case you are telling us about, you need to tell us that you observed it, and when and where. If you read about it, you need to tell us where you read about it. If you are accepting the testimony of an expert, you need to tell us who the expert is and why she is an expert in this field. The specific identity, name or position and qualifications of your sources are part of the answer to the question “How do you know?” You need to give your audience that information.

Keep in mind that it is the person, the individual human being, who wrote an article or expressed an idea who brings authority to the claim. Sometimes that authority may be reinforced by the publication in which the claim appeared, sometimes not. But when you quote or paraphrase a source you are quoting or paraphrasing the author, not the magazine or journal. The credibility of the evidence you use can be enhanced by:

Specific Reference to Source: Does the advocate indicate the particular individual or group making the statements used for evidence? Does the advocate tell you enough about the source that you could easily find it yourself?

Qualifications of the Source: Does the advocate give you reason to believe that the source is competent and well-informed in the area in question?

Bias of the Source: Even if an expert, is the source likely to be biased on the topic? Could we easily predict the source’s position merely from knowledge of his job, her political party, or organizations he or she works for?

Factual Support: Does the source offer factual support for the position taken or simply state personal opinions as fact?

Evaluating Internet Sources of Evidence

We currently obtain a significant amount of the evidence we use in arguments from the Internet. Some people are still under the impression that if they read it on the Internet, it must be accurate. But we all know that some Internet sources are better than others, and we need to be able to evaluate websites to obtain the best information possible. Here are two approaches to evaluating websites.

Who, What, When, Where, and Why

This first test is based on the traditional five “W’s.” These questions, like critical thinking itself, go back to Greek and Roman times: the Roman statesman Cicero, consul in 63 BC, is credited with asking them.

Journalists are taught to answer these five questions when writing an article for publication. They ask them to provide an accurate account of events to their viewers or readers, and we can ask the same questions to begin gauging the quality of an online source.

Who wrote the post? What are their qualifications?

What is actually being said on the website? How accurate is the content?

When was the website’s latest post?

Where is the source of the post? Does the URL suggest it is from an academic source or an individual?

Why is the website published? Is the website there to inform or entertain?

There is a second, more popular method of evaluating websites that involves a more in-depth analysis. This method is known as the CRAAP test.

The C.R.A.A.P. Test

C.R.A.A.P. is an acronym standing for Currency, Relevance, Authority, Accuracy, and Purpose. Developed by the Meriam Library at California State University, Chico, the test uses each of these five areas to evaluate websites.

Currency: How recent is the website? If you are conducting research on a historical subject, a website with no recent additions could still be useful. If, however, you are researching a current news story, technology, or scientific topic, you will want a site that has been updated recently.

Questions to Ask:

  • When was the content of the website published or posted?
  • Has the information been revised or updated recently?
  • Have more recent articles on your subject been published?
  • Does your topic require the most current information possible, or will older posts and sources be acceptable?
  • Are the web links included in the website functional?
Relevance: This test asks how important the information is to the specific topic you are researching. You will want to determine whether you are the intended audience and whether the information provided fits your research needs.
  • Does the content relate to your research topic or the question you are answering?
  • Who is the intended audience?
  • Is the information at an appropriate level for the purpose of your work? In other words, is it college level or targeted to a younger or less educated audience?
  • Have you compared this site to a variety of other resources?
  • Would you be comfortable citing this source in your research project?

Authority: Here we determine whether the source of the website has credentials on the subject that make you comfortable using the content. If you are looking for an accurate interpretation of news events, you will want to know whether the author of the website is a qualified journalist or a random individual reposting content.

  • Who is the author/publisher/source/sponsor of the website?
  • What are the author’s credentials or organizational affiliations?
  • Does the author have the qualifications to write on this particular topic?
  • Can you find information about the author from reference sources or the Internet?
  • Is the author quoted or referred to on other respected sources or websites?
  • Is there contact information, such as a publisher or email address?
  • Does the URL reveal anything about the author or source?

Accuracy: In this test we attempt to determine the reliability and accuracy of the website's content. You need to determine whether you can trust the information presented or whether it is merely slanted personal belief.

  • Where does the information in the website come from?
  • Is the information supported by evidence, or is it just opinion?
  • Has the information presented been reviewed by qualified sources?
  • Can you verify any of the content through another source or from personal knowledge?
  • Are there statements in the website you know to be false?
  • Does the language or tone of the website appear unbiased and free of emotional or loaded language?
  • Are there spelling, grammar or typographical errors in the content of the website?

Purpose: Finally, we examine the purpose of the website. We need to determine whether the website was created to inform, to entertain, or to sell a product or service. If we want accurate, high-quality evidence, we should avoid a site that is trying to sell us something. Although a company selling solar power may have some factual information about solar energy on its site, the site is geared to selling you its product; the information is not there to educate you about all aspects of solar power.

  • What is the purpose of the content of this website? Is the purpose to inform, teach, sell, entertain or persuade?
  • Do the authors/sponsors of the website make their intentions or purpose clear?
  • Is the content in the website considered facts, opinion, or even propaganda?
  • Does the point of view appear objective and impartial?
  • Does the author omit important facts or data that might disprove the claim being made in the post?
  • Are alternative points of view presented?
  • Does the content of the website contain political, ideological, cultural, religious, institutional or personal biases?
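The five C.R.A.A.P. criteria lend themselves to a simple scoring checklist. Below is a minimal sketch in Python; the 0–5 rating scale, the equal weighting, and the 3.5 cutoff are illustrative assumptions of mine, not part of the Meriam Library's test.

```python
# Hypothetical C.R.A.A.P. scoring sketch: rate a source 0-5 on each
# criterion after answering the questions above, then average the ratings.
# The scale, weights, and cutoff below are illustrative assumptions.

CRITERIA = ["currency", "relevance", "authority", "accuracy", "purpose"]

def craap_score(ratings):
    """Return (average score, verdict) from a dict of 0-5 ratings."""
    missing = [c for c in CRITERIA if c not in ratings]
    if missing:
        raise ValueError(f"missing ratings for: {missing}")
    score = sum(ratings[c] for c in CRITERIA) / len(CRITERIA)
    verdict = "usable" if score >= 3.5 else "use with caution"
    return score, verdict

# Example: a recently updated, well-sourced site run by a vendor
print(craap_score({"currency": 5, "relevance": 4, "authority": 4,
                   "accuracy": 5, "purpose": 3}))  # (4.2, 'usable')
```

In practice the point is not the number itself but the discipline of answering each question before citing the source.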

Questions used here are adapted from the Meriam Library at California State University, Chico; the University of Maryland University College Library; and the Creighton University Library.


  • Rieke, Richard D. and Malcolm Sillars. Argumentation and Critical Decision Making. New York: HarperCollins Rhetoric and Society Series, 1993.
  • Shenk, David. Data Smog: Surviving the Information Glut. San Francisco: HarperEdge, 1997.
  • Ithaca College, "Primary and Secondary Sources," libguides.ithaca.edu/research101/primary (accessed October 31, 2019).
  • Windes, Russel R. and Arthur Hastings. Argumentation and Advocacy. New York: Random House, 1965.
  • Patterson, J. W. and David Zarefsky. Contemporary Debate. Boston: Houghton Mifflin, 1983.

Warren Berger

A Crash Course in Critical Thinking

What you need to know—and read—about one of the essential skills needed today.

Posted April 8, 2024 | Reviewed by Michelle Quirk

  • In research for "A More Beautiful Question," I did a deep dive into the current crisis in critical thinking.
  • Many people may think of themselves as critical thinkers, but they actually are not.
  • Here is a series of questions you can ask yourself to try to ensure that you are thinking critically.

Conspiracy theories. Inability to distinguish facts from falsehoods. Widespread confusion about who and what to believe.

These are some of the hallmarks of the current crisis in critical thinking—which just might be the issue of our times. Because if people aren’t willing or able to think critically as they choose potential leaders, they’re apt to choose bad ones. And if they can’t judge whether the information they’re receiving is sound, they may follow faulty advice while ignoring recommendations that are science-based and solid (and perhaps life-saving).

Moreover, as a society, if we can’t think critically about the many serious challenges we face, it becomes more difficult to agree on what those challenges are—much less solve them.

On a personal level, critical thinking can enable you to make better everyday decisions. It can help you make sense of an increasingly complex and confusing world.

In the new expanded edition of my book A More Beautiful Question (AMBQ), I took a deep dive into critical thinking. Here are a few key things I learned.

First off, before you can get better at critical thinking, you should understand what it is. It’s not just about being a skeptic. When thinking critically, we are thoughtfully reasoning, evaluating, and making decisions based on evidence and logic. And—perhaps most important—while doing this, a critical thinker always strives to be open-minded and fair-minded. That’s not easy: It demands that you constantly question your assumptions and biases and that you always remain open to considering opposing views.

In today’s polarized environment, many people think of themselves as critical thinkers simply because they ask skeptical questions—often directed at, say, certain government policies or ideas espoused by those on the “other side” of the political divide. The problem is, they may not be asking these questions with an open mind or a willingness to fairly consider opposing views.

When people do this, they’re engaging in “weak-sense critical thinking”—a term popularized by the late Richard Paul, a co-founder of The Foundation for Critical Thinking. “Weak-sense critical thinking” means applying the tools and practices of critical thinking—questioning, investigating, evaluating—but with the sole purpose of confirming one’s own bias or serving an agenda.

In AMBQ, I lay out a series of questions you can ask yourself to try to ensure that you’re thinking critically. Here are some of the questions to consider:

  • Why do I believe what I believe?
  • Are my views based on evidence?
  • Have I fairly and thoughtfully considered differing viewpoints?
  • Am I truly open to changing my mind?

Of course, becoming a better critical thinker is not as simple as just asking yourself a few questions. Critical thinking is a habit of mind that must be developed and strengthened over time. In effect, you must train yourself to think in a manner that is more effortful, aware, grounded, and balanced.

For those interested in giving themselves a crash course in critical thinking—something I did myself, as I was working on my book—I thought it might be helpful to share a list of some of the books that have shaped my own thinking on this subject. As a self-interested author, I naturally would suggest that you start with the new 10th-anniversary edition of A More Beautiful Question, but beyond that, here are the top eight critical-thinking books I’d recommend.

The Demon-Haunted World: Science as a Candle in the Dark, by Carl Sagan

This book simply must top the list, because the late scientist and author Carl Sagan continues to be such a bright shining light in the critical thinking universe. Chapter 12 includes the details on Sagan’s famous “baloney detection kit,” a collection of lessons and tips on how to deal with bogus arguments and logical fallacies.


Clear Thinking: Turning Ordinary Moments Into Extraordinary Results, by Shane Parrish

The creator of the Farnam Street website and host of the “Knowledge Project” podcast explains how to contend with biases and unconscious reactions so you can make better everyday decisions. It contains insights from many of the brilliant thinkers Shane has studied.

Good Thinking: Why Flawed Logic Puts Us All at Risk and How Critical Thinking Can Save the World, by David Robert Grimes

A brilliant, comprehensive 2021 book on critical thinking that, to my mind, hasn’t received nearly enough attention. The scientist Grimes dissects bad thinking, shows why it persists, and offers the tools to defeat it.

Think Again: The Power of Knowing What You Don't Know, by Adam Grant

Intellectual humility—being willing to admit that you might be wrong—is what this book is primarily about. But Adam, the renowned Wharton psychology professor and bestselling author, takes the reader on a mind-opening journey with colorful stories and characters.

Think Like a Detective: A Kid's Guide to Critical Thinking, by David Pakman

The popular YouTuber and podcast host Pakman—normally known for talking politics—has written a terrific primer on critical thinking for children. The illustrated book presents critical thinking as a “superpower” that enables kids to unlock mysteries and dig for truth. (I also recommend Pakman’s second kids’ book, Think Like a Scientist.)

Rationality: What It Is, Why It Seems Scarce, Why It Matters, by Steven Pinker

The Harvard psychology professor Pinker tackles conspiracy theories head-on but also explores concepts involving risk/reward, probability and randomness, and correlation/causation. And if that strikes you as daunting, be assured that Pinker makes it lively and accessible.

How Minds Change: The Surprising Science of Belief, Opinion and Persuasion, by David McRaney

David is a science writer who hosts the popular podcast “You Are Not So Smart” (and his ideas are featured in A More Beautiful Question ). His well-written book looks at ways you can actually get through to people who see the world very differently than you (hint: bludgeoning them with facts definitely won’t work).

A Healthy Democracy's Best Hope: Building the Critical Thinking Habit, by M. Neil Browne and Chelsea Kulhanek

Neil Browne, author of the seminal Asking the Right Questions: A Guide to Critical Thinking, has been a pioneer in presenting critical thinking as a question-based approach to making sense of the world around us. His newest book, co-authored with Chelsea Kulhanek, breaks down critical thinking into “11 explosive questions”—including the “priors question” (which challenges us to question assumptions), the “evidence question” (focusing on how to evaluate and weigh evidence), and the “humility question” (which reminds us that a critical thinker must be humble enough to consider the possibility of being wrong).


Warren Berger is a longtime journalist and author of A More Beautiful Question .


Organizing Your Social Sciences Research Paper

Applying Critical Thinking

Critical thinking refers to deliberately scrutinizing and evaluating theories, concepts, or ideas using reasoned reflection and analysis. The act of thinking critically implies moving beyond simply understanding information to questioning its source, its production, and its presentation in order to expose potential bias or researcher subjectivity [i.e., being influenced by personal opinions and feelings rather than by external determinants]. Applying critical thinking to investigating a research problem involves actively challenging assumptions and questioning the choices and potential motives underpinning how the author designed the study, conducted the research, and arrived at particular conclusions or recommended courses of action.

Mintz, Steven. "How the Word "Critical" Came to Signify the Leading Edge of Cultural Analysis." Higher Ed Gamma Blog , Inside Higher Ed, February 13, 2024; Van Merriënboer, Jeroen JG and Paul A. Kirschner. Ten Steps to Complex Learning: A Systematic Approach to Four-component Instructional Design . New York: Routledge, 2017.

Thinking Critically

Applying Critical Thinking to Research and Writing

Professors like to use the term critical thinking; in fact, the idea of being critical permeates much of higher education writ large. In the classroom, the idea of thinking critically is often mentioned by professors when students ask how they should approach a research and writing assignment [other approaches your professor might mention include interdisciplinarity, comparative, gendered, global, etc.]. However, critical thinking is more than just an approach to research and writing. It is an acquired skill used in becoming a complex learner capable of discerning important relationships among the elements of, as well as integrating multiple ways of understanding applied to, the research problem. Critical thinking is a lens through which you holistically interrogate a topic.

Given this, thinking critically encompasses a variety of inter-related connotations applied to college-level research and writing*:

  • Integrated and Multi-Dimensional. Critical thinking is not focused on any one element of research but rather is applied holistically throughout the process of identifying the research problem, reviewing the literature, applying methods of analysis, describing the results, discussing their implications, and, if appropriate, offering recommendations for further research. The act of thinking critically is also non-linear [i.e., it involves going back and changing prior thoughts when new evidence emerges]; it permeates the entire research endeavor, from contemplating what to write to proofreading the final product.
  • Humanize Research. Thinking critically can help humanize the research problem by extending the scope of your analysis beyond the boundaries of traditional approaches to studying the topic. Traditional approaches can include, for example, sampling homogeneous populations, considering only certain factors related to investigating a phenomenon, or limiting the way you frame or represent the context of your study. Critical thinking can help reveal opportunities to incorporate the experiences of others into the research, creating a more representative examination of the research problem.
  • Normative. This refers to the idea that critical thinking can be used to challenge prior assumptions in ways that advocate for social justice, equity, and inclusion, and which can lead to research having a more transformative and expansive impact. In this respect, critical thinking can be a method for breaking away from dominant cultural norms so as to produce research outcomes that illuminate previously hidden aspects of exploitation and injustice.
  • Power Dynamics. Research in the social and behavioral sciences often includes examining aspects of power and influence that shape social relations, organizations, institutions, and the production and maintenance of knowledge. This approach encompasses studying how power operates, how it can be acquired, and how power and influence can be maintained. Critical thinking can reveal how societal structures perpetuate power and influence in ways that marginalize and oppress certain groups or communities within the contexts of history, politics, economics, culture, and other factors.
  • Reflection. A key aspect of critical thinking is practicing reflexivity: the act of turning ideas and concepts back onto yourself in order to reveal and clarify your own beliefs, assumptions, and perspectives. Being critically reflexive is important because it can reveal hidden biases you may have that could unintentionally influence how you interpret and validate information. The more reflexive you are, the better able and more comfortable you are about opening yourself up to new modes of understanding.
  • Rigorous Questioning. Thinking critically is guided by asking questions that lead to addressing complex concepts, principles, theories, or problems more effectively and that help distinguish what is known from what is not known [or that may be hidden]. In this way, critical thinking involves deliberately framing inquiries not just as research questions, but as a way to focus on systematic, disciplined, in-depth questioning concerning the research problem and your positionality as a researcher.
  • Social Change. An overarching goal of critical thinking applied to research and writing is to identify and challenge sources of inequality, exploitation, oppression, and marginalization that contribute to maintaining the status quo within institutions of society. This can include entities, such as schools, courts, businesses, government agencies, and religious centers, that have been created and maintained through certain ways of thinking within the dominant culture.

In writing a research paper, the act of critical thinking applies most directly to the literature review and discussion sections of your paper. In reviewing the literature, it is important to reflect upon specific aspects of a study, such as determining if the research design effectively establishes cause and effect relationships or provides insight into explaining why certain phenomena do or do not occur, assessing whether the method of gathering data or information supports the objectives of the study, and evaluating if the assumptions used to arrive at a specific conclusion are evidence-based and relevant to addressing the research problem. An assessment of whether a source is helpful to investigating the research problem also involves critically analyzing how the research challenges conventional approaches to investigations that perpetuate inequalities or hide the voices of others.

Critical thinking also applies to the discussion section of your paper because this is where you interpret the findings of your study and explain its significance. This involves more than summarizing findings and describing outcomes. It includes reflecting on their importance and providing reasoned explanations why the research study is important in filling a gap in the literature or expanding knowledge and understanding about the topic in ways that inform practice. Critical reflection helps you think introspectively about your own beliefs concerning the significance of the findings but in ways that avoid biased judgment and decision making.

* Mintz, Steven. "How the Word "Critical" Came to Signify the Leading Edge of Cultural Analysis." Higher Ed Gamma Blog , Inside Higher Ed, February 13, 2024; Suter, W. Newton. Introduction to Educational Research: A Critical Thinking Approach. 2nd edition. Thousand Oaks, CA: SAGE Publications, 2012


  • Last Updated: Apr 11, 2024 1:27 PM
  • URL: https://libguides.usc.edu/writingguide


Supplement to Critical Thinking

How can one assess, for purposes of instruction or research, the degree to which a person possesses the dispositions, skills and knowledge of a critical thinker?

In psychometrics, assessment instruments are judged according to their validity and reliability.

Roughly speaking, an instrument is valid if it measures accurately what it purports to measure, given standard conditions. More precisely, the degree of validity is “the degree to which evidence and theory support the interpretations of test scores for proposed uses of tests” (American Educational Research Association 2014: 11). In other words, a test is not valid or invalid in itself. Rather, validity is a property of an interpretation of a given score on a given test for a specified use. Determining the degree of validity of such an interpretation requires collection and integration of the relevant evidence, which may be based on test content, test takers’ response processes, a test’s internal structure, relationship of test scores to other variables, and consequences of the interpretation (American Educational Research Association 2014: 13–21). Criterion-related evidence consists of correlations between scores on the test and performance on another test of the same construct; its weight depends on how well supported is the assumption that the other test can be used as a criterion. Content-related evidence is evidence that the test covers the full range of abilities that it claims to test. Construct-related evidence is evidence that a correct answer reflects good performance of the kind being measured and an incorrect answer reflects poor performance.

An instrument is reliable if it consistently produces the same result, whether across different forms of the same test (parallel-forms reliability), across different items (internal consistency), across different administrations to the same person (test-retest reliability), or across ratings of the same answer by different people (inter-rater reliability). Internal consistency should be expected only if the instrument purports to measure a single undifferentiated construct, and thus should not be expected of a test that measures a suite of critical thinking dispositions or critical thinking abilities, assuming that some people are better in some of the respects measured than in others (for example, very willing to inquire but rather closed-minded). Otherwise, reliability is a necessary but not a sufficient condition of validity; a standard example of a reliable instrument that is not valid is a bathroom scale that consistently under-reports a person’s weight.
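The distinction between reliability and validity can be made concrete with a small computation. The sketch below is illustrative only — the simulated 8-item scale and all numbers are assumptions, not drawn from any instrument discussed here. It estimates internal consistency with Cronbach's alpha and reproduces the bathroom-scale example: readings that agree perfectly with one another while systematically missing the true value.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Internal consistency of a scale: rows = respondents, cols = items."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)    # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Simulated scale: one latent trait plus item-specific noise.
rng = np.random.default_rng(0)
trait = rng.normal(size=200)
items = trait[:, None] + 0.5 * rng.normal(size=(200, 8))
alpha = cronbach_alpha(items)

# A bathroom scale that consistently under-reports by 5 kg is perfectly
# reliable (repeated readings agree exactly) but not valid.
true_weight = np.array([60.0, 75.0, 90.0])
reading_1 = true_weight - 5.0
reading_2 = true_weight - 5.0
print(round(alpha, 2), bool(np.allclose(reading_1, reading_2)))
```

Because the simulated items share a single underlying trait, alpha comes out high; the scale readings agree with each other on every administration yet are uniformly wrong, which is exactly why reliability is necessary but not sufficient for validity.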

Assessing dispositions is difficult if one uses a multiple-choice format with known adverse consequences of a low score. It is pretty easy to tell what answer to the question “How open-minded are you?” will get the highest score and to give that answer, even if one knows that the answer is incorrect. If an item probes less directly for a critical thinking disposition, for example by asking how often the test taker pays close attention to views with which the test taker disagrees, the answer may differ from reality because of self-deception or simple lack of awareness of one’s personal thinking style, and its interpretation is problematic, even if factor analysis enables one to identify a distinct factor measured by a group of questions that includes this one (Ennis 1996). Nevertheless, Facione, Sánchez, and Facione (1994) used this approach to develop the California Critical Thinking Dispositions Inventory (CCTDI). They began with 225 statements expressive of a disposition towards or away from critical thinking (using the long list of dispositions in Facione 1990a), validated the statements with talk-aloud and conversational strategies in focus groups to determine whether people in the target population understood the items in the way intended, administered a pilot version of the test with 150 items, and eliminated items that failed to discriminate among test takers or were inversely correlated with overall results or added little refinement to overall scores (Facione 2000). They used item analysis and factor analysis to group the measured dispositions into seven broad constructs: open-mindedness, analyticity, cognitive maturity, truth-seeking, systematicity, inquisitiveness, and self-confidence (Facione, Sánchez, and Facione 1994). The resulting test consists of 75 agree-disagree statements and takes 20 minutes to administer. 
A repeated disturbing finding is that North American students taking the test tend to score low on the truth-seeking sub-scale (on which a low score results from agreeing to such statements as the following: “To get people to agree with me I would give any reason that worked”. “Everyone always argues from their own self-interest, including me”. “If there are four reasons in favor and one against, I’ll go with the four”.) Development of the CCTDI made it possible to test whether good critical thinking abilities and good critical thinking dispositions go together, in which case it might be enough to teach one without the other. Facione (2000) reports that administration of the CCTDI and the California Critical Thinking Skills Test (CCTST) to almost 8,000 post-secondary students in the United States revealed a statistically significant but weak correlation between total scores on the two tests, and also between paired sub-scores from the two tests. The implication is that both abilities and dispositions need to be taught, that one cannot expect improvement in one to bring with it improvement in the other.
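The item-screening step that Facione (2000) describes — dropping statements that discriminate poorly or correlate inversely with overall results — can be sketched with a corrected item-total correlation, a standard flag for such items. The data below are simulated for illustration; none of the numbers come from the CCTDI itself.

```python
import numpy as np

def corrected_item_total(items: np.ndarray) -> np.ndarray:
    """Correlation of each item with the sum of the *remaining* items.
    Items with low or negative values are candidates for elimination."""
    total = items.sum(axis=1)
    out = []
    for j in range(items.shape[1]):
        rest = total - items[:, j]   # exclude the item itself from the total
        out.append(np.corrcoef(items[:, j], rest)[0, 1])
    return np.array(out)

rng = np.random.default_rng(1)
trait = rng.normal(size=300)
good = trait[:, None] + 0.7 * rng.normal(size=(300, 5))  # coherent items
bad = rng.normal(size=(300, 1))                          # pure-noise item
r = corrected_item_total(np.hstack([good, bad]))
print(np.round(r, 2))
```

The five coherent items show substantial positive item-total correlations, while the noise item's correlation hovers near zero — the pattern that would mark it for removal in a pilot study.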

A more direct way of assessing critical thinking dispositions would be to see what people do when put in a situation where the dispositions would reveal themselves. Ennis (1996) reports promising initial work with guided open-ended opportunities to give evidence of dispositions, but no standardized test seems to have emerged from this work. There are however standardized aspect-specific tests of critical thinking dispositions. The Critical Problem Solving Scale (Berman et al. 2001: 518) takes as a measure of the disposition to suspend judgment the number of distinct good aspects attributed to an option judged to be the worst among those generated by the test taker. Stanovich, West and Toplak (2011: 800–810) list tests developed by cognitive psychologists of the following dispositions: resistance to miserly information processing, resistance to myside thinking, absence of irrelevant context effects in decision-making, actively open-minded thinking, valuing reason and truth, tendency to seek information, objective reasoning style, tendency to seek consistency, sense of self-efficacy, prudent discounting of the future, self-control skills, and emotional regulation.

It is easier to measure critical thinking skills or abilities than to measure dispositions. The following currently available standardized tests purport to measure them: the Watson-Glaser Critical Thinking Appraisal (Watson & Glaser 1980a, 1980b, 1994), the Cornell Critical Thinking Tests Level X and Level Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005), the Ennis-Weir Critical Thinking Essay Test (Ennis & Weir 1985), the California Critical Thinking Skills Test (Facione 1990b, 1992), the Halpern Critical Thinking Assessment (Halpern 2016), the Critical Thinking Assessment Test (Center for Assessment & Improvement of Learning 2017), the Collegiate Learning Assessment (Council for Aid to Education 2017), the HEIghten Critical Thinking Assessment (https://territorium.com/heighten/), and a suite of critical thinking assessments for different groups and purposes offered by Insight Assessment (https://www.insightassessment.com/products). The Critical Thinking Assessment Test (CAT) is unique among them in being designed for use by college faculty to help them improve their development of students’ critical thinking skills (Haynes et al. 2015; Haynes & Stein 2021). Also, for some years the United Kingdom body OCR (Oxford Cambridge and RSA Examinations) awarded AS and A Level certificates in critical thinking on the basis of an examination (OCR 2011). Many of these standardized tests have received scholarly evaluations at the hands of, among others, Ennis (1958), McPeck (1981), Norris and Ennis (1989), Fisher and Scriven (1997), Possin (2008, 2013a, 2013b, 2013c, 2014, 2020) and Hatcher and Possin (2021). 
Their evaluations provide a useful set of criteria that such tests ideally should meet, as does the description by Ennis (1984) of problems in testing for competence in critical thinking: the soundness of multiple-choice items, the clarity and soundness of instructions to test takers, the information and mental processing used in selecting an answer to a multiple-choice item, the role of background beliefs and ideological commitments in selecting an answer to a multiple-choice item, the tenability of a test’s underlying conception of critical thinking and its component abilities, the set of abilities that the test manual claims are covered by the test, the extent to which the test actually covers these abilities, the appropriateness of the weighting given to various abilities in the scoring system, the accuracy and intellectual honesty of the test manual, the interest of the test to the target population of test takers, the scope for guessing, the scope for choosing a keyed answer by being test-wise, precautions against cheating in the administration of the test, clarity and soundness of materials for training essay graders, inter-rater reliability in grading essays, and clarity and soundness of advance guidance to test takers on what is required in an essay. Rear (2019) has challenged the use of standardized tests of critical thinking as a way to measure educational outcomes, on the grounds that  they (1) fail to take into account disputes about conceptions of critical thinking, (2) are not completely valid or reliable, and (3) fail to evaluate skills used in real academic tasks. He proposes instead assessments based on discipline-specific content.

There are also aspect-specific standardized tests of critical thinking abilities. Stanovich, West and Toplak (2011: 800–810) list tests of probabilistic reasoning, insights into qualitative decision theory, knowledge of scientific reasoning, knowledge of rules of logical consistency and validity, and economic thinking. They also list instruments that probe for irrational thinking, such as superstitious thinking, belief in the superiority of intuition, over-reliance on folk wisdom and folk psychology, belief in “special” expertise, financial misconceptions, overestimation of one’s introspective powers, dysfunctional beliefs, and a notion of self that encourages egocentric processing. They regard these tests along with the previously mentioned tests of critical thinking dispositions as the building blocks for a comprehensive test of rationality, whose development (they write) may be logistically difficult and would require millions of dollars.

A superb example of assessment of an aspect of critical thinking ability is the Test on Appraising Observations (Norris & King 1983, 1985, 1990a, 1990b), which was designed for classroom administration to senior high school students. The test focuses entirely on the ability to appraise observation statements and in particular on the ability to determine in a specified context which of two statements there is more reason to believe. According to the test manual (Norris & King 1985, 1990b), a person’s score on the multiple-choice version of the test, which is the number of items that are answered correctly, can justifiably be given either a criterion-referenced or a norm-referenced interpretation.

On a criterion-referenced interpretation, those who do well on the test have a firm grasp of the principles for appraising observation statements, and those who do poorly have a weak grasp of them. This interpretation can be justified by the content of the test and the way it was developed, which incorporated a method of controlling for background beliefs articulated and defended by Norris (1985). Norris and King synthesized from judicial practice, psychological research and common-sense psychology 31 principles for appraising observation statements, in the form of empirical generalizations about tendencies, such as the principle that observation statements tend to be more believable than inferences based on them (Norris & King 1984). They constructed items in which exactly one of the 31 principles determined which of two statements was more believable. Using a carefully constructed protocol, they interviewed about 100 students who responded to these items in order to determine the thinking that led them to choose the answers they did (Norris & King 1984). In several iterations of the test, they adjusted items so that selection of the correct answer generally reflected good thinking and selection of an incorrect answer reflected poor thinking. Thus they have good evidence that good performance on the test is due to good thinking about observation statements and that poor performance is due to poor thinking about observation statements. Collectively, the 50 items on the final version of the test require application of 29 of the 31 principles for appraising observation statements, with 13 principles tested by one item, 12 by two items, three by three items, and one by four items. Thus there is comprehensive coverage of the principles for appraising observation statements. Fisher and Scriven (1997: 135–136) judge the items to be well worked and sound, with one exception. 
The test is clearly written at a grade 6 reading level, meaning that poor performance cannot be attributed to difficulties in reading comprehension by the intended adolescent test takers. The stories that frame the items are realistic, and are engaging enough to stimulate test takers’ interest. Thus the most plausible explanation of a given score on the test is that it reflects roughly the degree to which the test taker can apply principles for appraising observations in real situations. In other words, there is good justification of the proposed interpretation that those who do well on the test have a firm grasp of the principles for appraising observation statements and those who do poorly have a weak grasp of them.

To get norms for performance on the test, Norris and King arranged for seven groups of high school students in different types of communities and with different levels of academic ability to take the test. The test manual includes percentiles, means, and standard deviations for each of these seven groups. These norms allow teachers to compare the performance of their class on the test to that of a similar group of students.
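A norm-referenced interpretation of this kind amounts to locating a raw score within the distribution of a comparison group. A minimal sketch, using simulated scores rather than Norris and King's actual norm data:

```python
import numpy as np
from scipy import stats

# Hypothetical norm group on a 50-item test (standing in for one of the
# seven groups of high school students); all values are invented.
rng = np.random.default_rng(2)
norm_group = rng.normal(loc=32, scale=6, size=150).round().clip(0, 50)

mean = norm_group.mean()
sd = norm_group.std(ddof=1)

# Percentile rank of a student scoring 40, relative to this norm group.
pct = stats.percentileofscore(norm_group, 40)
print(round(mean, 1), round(sd, 1), round(pct))
```

A teacher could compare a class mean against the tabulated group mean and standard deviation, or report individual students' percentile ranks, exactly as the test manual's norms are intended to be used.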

Copyright © 2022 by David Hitchcock <hitchckd@mcmaster.ca>


The Stanford Encyclopedia of Philosophy is copyright © 2023 by The Metaphysics Research Lab , Department of Philosophy, Stanford University

Library of Congress Catalog Data: ISSN 1095-5054

Critical thinking and evidence-based practice

Affiliation.

  • 1 Faculty of Nursing, University of Alberta, Edmonton, Alberta, Canada. [email protected]
  • PMID: 16311232
  • DOI: 10.1016/j.profnurs.2005.10.002

Critical thinking (CT) is vital to evidence-based nursing practice. Evidence-based practice (EBP) supports nursing care and can contribute positively to patient outcomes across a variety of settings and geographic locations. The nature of EBP, its relevance to nursing, and the skills needed to support it should be required components of baccalaureate education and must be introduced early in students' development as independent, self-directed learners and as professional nurses. Among the knowledge, skills, and processes needed to support EBP, CT is paramount. The development of CT can prepare nurses with the necessary skills and dispositions (habits of mind, attitudes, and traits) to support EBP. The intents of this study were to explore the importance of CT as an essential skill to support EBP and to describe some of the strategies and processes considered key to the ongoing development of CT.

MeSH terms

  • Attitude of Health Personnel
  • Education, Nursing, Baccalaureate / organization & administration*
  • Evidence-Based Medicine* / education
  • Evidence-Based Medicine* / organization & administration
  • Health Knowledge, Attitudes, Practice
  • Health Services Needs and Demand
  • Mentors / psychology
  • Nurses / psychology
  • Nursing Process / organization & administration*
  • Nursing Research* / education
  • Nursing Research* / organization & administration
  • Organizational Innovation
  • Philosophy, Nursing
  • Problem-Based Learning
  • Professional Competence / standards
  • Social Support
  • Teaching / organization & administration


Evidence-based practice beliefs and implementations: a cross-sectional study among undergraduate nursing students

Nesrin N. Abu-Baker

1 Faculty of Nursing, Community and Mental Health Nursing Department, Jordan University of Science & Technology, P.O Box 3030, Irbid, 22110 Jordan

Salwa AbuAlrub

2 Faculty of Irbid College, Department of Applied Sciences, Al-Balqa Applied University, P.O. Box 1293, Irbid, Jordan

Rana F. Obeidat

3 Faculty of Nursing, Zarqa University, 247D Khawarezmi Building, Zarqa, Jordan

Kholoud Assmairan

4 Faculty of Nursing, Al-Albayt University, P.O Box 130040, Mafraq, 25113 Jordan

Associated Data

Data are available from the corresponding author upon reasonable request and with permission of Jordan University of Science and Technology.

Integrating evidence-based practice (EBP) into the daily practice of healthcare professionals has the potential to improve the practice environment as well as patient outcomes. It is essential for nurses to build their body of knowledge, standardize practice, and improve patient outcomes. This study aims to explore nursing students’ beliefs and implementations of EBP, to examine the differences in students’ beliefs and implementations by prior training in EBP, and to examine the relationship between beliefs and implementations.

A cross-sectional survey design was used with a convenience sample of 241 nursing students from two public universities. Students were asked to answer the questions in the Evidence-Based Practice Belief and Implementation scales.

This study revealed that the students reported a mean total belief score of 54.32 out of 80 ( SD  = 13.63). However, they reported a much lower implementation score of 25.34 out of 72 ( SD  = 12.37). Students who received EBP training reported significantly higher total belief and implementation scores than those who did not. Finally, there was no significant relationship between belief and implementation scores ( p  > .05).

To advance nursing science, enhance practice for future nurses, and improve patient outcomes, it is critical to teach nursing students not only the value of evidence-based knowledge, but also how to access this knowledge, appraise it, and apply it correctly as needed.

Evidence-based practice (EBP) integrates the clinical expertise, the latest and best available research evidence, as well as the patient’s unique values and circumstances [ 1 ]. This form of practice is essential for nurses as well as the nursing profession as it offers a wide variety of benefits: It helps nurses to build their own body of knowledge, minimize the gap between nursing education, research, and practice, standardize nursing practices [ 2 ], improve clinical patient outcomes, improve the quality of healthcare, and decrease healthcare costs [ 3 ]. Thus, clinical decision-making by nurses should be based on the best and most up-to-date, available research evidence [ 4 ].

Earlier studies of EBP implementation by nurses in their everyday clinical practice have shown that it is suboptimal [ 5 – 7 ]. Implementation of EBP is defined as its application in clinical practice [ 8 ]. Findings from previous studies indicate that nurses’ implementation of EBP can be promoted by improving their belief about EBP. Belief is the perception of the value and benefits of EBP and the perceived self-confidence in one’s knowledge and skills of EBP [ 8 ]. Nurses with a strong belief in EBP implement it more than nurses with a weak belief in the same [ 7 , 9 ].

Preparing nurses for practice and ensuring that they have met a set of minimum core competencies at the point of graduation is achieved through their undergraduate education [ 10 ]. Several formal entities such as the Institute of Medicine (IOM) [ 4 ] and the Accreditation Commission for Education in Nursing (ACEN) [ 11 ] consider EBP as one of the core competencies that should be included in health care clinicians’ education. However, this does not necessarily guarantee the actual implementation of EBP in everyday clinical practice [ 12 ]. It is essential to educate undergraduate nursing students on EBP to improve their knowledge about it, to strengthen their belief regarding its benefits to patients and nurses, and to enhance their self-efficacy in implementing EBP. In order to effect this change, it is crucial to improve the education process and to focus more on the knowledge and implementation of EBP.

There is consistent evidence showing that while undergraduate nursing students hold positive beliefs about EBP and its value in patient care, they also report many challenges regarding its actual implementation in clinical practice. For instance, a mixed-methods study indicated that 118 American undergraduate nursing students found it difficult to distinguish between EBP and research. Students were able to search for evidence, but were less able to integrate evidence to plan EBP changes or disseminate best practices [ 13 ]. Additionally, a correlational study was conducted in Jordan using a sample of 612 senior nursing students. The study reported that students held positive attitudes towards research and 75% of them agreed on using nursing research in clinical practice. Students strongly believed in the usefulness of research. However, they did not believe strongly in their ability to conduct research [ 14 ]. A cross-sectional study was conducted among 188 Saudi undergraduate nursing students. Students reported positive beliefs about EBP; however, they reported a low mean score in EBP implementation (22.57 out of 72). Several significant factors have been reported as influencing EBP implementation, such as age, gender, awareness, and training on EBP [ 15 ]. A comparative survey included 1383 nursing students from India, Saudi Arabia, Nigeria, and Oman. The study reported that having no authority to change patient care policies, the slow publication of evidence, and the lack of time in the clinical area to implement the evidence were major barriers to implementing EBP, according to the participating students [ 16 ].

In Jordan, evidence-based knowledge with critical thinking is one of the seven standards for the professional practice of registered nurses that were released by the Jordan Nursing Council [ 17 ]. Despite the plethora of studies on undergraduate nursing students’ beliefs about EBP and its implementation in everyday clinical practice, this topic has not been fully addressed among Jordanian undergraduate nursing students. Thus, the purpose of this study is to explore the self-reported beliefs and implementations of EBP among undergraduate nursing students in Jordan. The specific aims of this study were to (1) explore nursing students’ beliefs and implementations of EBP, (2) examine the differences in students’ beliefs and implementations by prior training of EBP, and (3) examine the relationship between nursing students’ beliefs and implementations of EBP.

Design and setting

A cross-sectional, correlational research survey design was used to meet the study aims. Recruitment of study participants was undertaken at two governmental universities in the northern part of Jordan. The two universities offer a four-year undergraduate nursing program aimed at graduating competent general nurses with baccalaureate degrees. The nursing research course is included as a compulsory course in the undergraduate nursing curricula in both universities.

Population and sample

The target population of this study was undergraduate nursing students in Jordan. The accessible population was undergraduate nursing students currently enrolled in the four-year BSN program in two governmental universities in the northern region of Jordan. We calculated the sample size using the G*Power software (2014). Using a conventional power estimate of 0.8, with alpha set at 0.05, and a medium effect size, it was estimated that for a Pearson correlation test, a total of 100 participants would need to be recruited to examine the relationship between the beliefs and implementations of EBP. To counteract anticipated non-response and to enhance the power of the study, 300 students were approached. The inclusion criteria of the study participants were as follows: a) senior nursing students in the 3rd- or 4th-year level, b) students currently taking a clinical course with training in a clinical setting/hospital, and c) students who have successfully passed the nursing research course.
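For readers who want to check the order of magnitude of such a sample-size estimate, a standard normal-approximation formula for detecting a Pearson correlation, based on Fisher's z transformation, gives roughly 85 participants for a medium effect (r = 0.3) at α = 0.05 and power 0.8 — on the same order as the 100 targeted here. This is a back-of-envelope cross-check, not the authors' G*Power calculation:

```python
from math import ceil, log

from scipy.stats import norm

def n_for_correlation(r: float, alpha: float = 0.05, power: float = 0.8) -> int:
    """Approximate sample size to detect a correlation r (two-tailed test),
    using the normal approximation via Fisher's z transformation."""
    z_a = norm.ppf(1 - alpha / 2)          # critical value for two-tailed alpha
    z_b = norm.ppf(power)                  # quantile for desired power
    c = 0.5 * log((1 + r) / (1 - r))       # Fisher z of the target correlation
    return ceil(((z_a + z_b) / c) ** 2 + 3)

print(n_for_correlation(0.3))   # medium effect -> 85
```

Larger target correlations require smaller samples, which the formula makes explicit: the required n falls roughly with the inverse square of Fisher's z of r.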

Measurement

A structured questionnaire composed of two parts was used for data collection. The first part aimed to gather the demographic data of the participants: gender, age, study year level, university, and any previous EBP training received in the nursing research course. The second part contained the EBP Belief Scale and EBP Implementation scale developed by Melnyk et al. (2008) [ 18 ]. Both scales had previous satisfactory psychometric properties with a Cronbach’s alpha of more than 0.9 and good construct validity. The Evidence-Based Practice Belief Scale (EBPB) consists of 16 statements that describe the respondent’s beliefs of EBP. Students were asked to report on a five-point Likert scale their agreement or disagreement with each of the 16 statements in the scale. Response options on this scale ranged from strongly disagree (1 point) to strongly agree (5 points). All statements were positive except for two statements (statements 11 and 13), which were reversed before calculating the total and mean scores. Total scores on the EBPB ranged from 16 to 80, with a higher total score indicating a more positive belief toward EBP. In the current study, the scale showed satisfactory internal consistency reliability with a Cronbach’s Alpha of .92 for the total scale.

The Evidence-Based Practice Implementation Scale (EBPI) consists of 18 statements related to the respondent’s actual implementation of EBP in the clinical setting. Students were asked to report the frequency of the application of these statements over the past 8 weeks. The answers were ranked on a Likert scale that ranged from 0 to 4 points (0 = 0 times, 1 = 1–3 times, 2 = 4–5 times, 3 = 6–8 times, and 4 = more than 8 times). The total score ranged from 0 to 72, with a higher total score indicating more frequent utilization of EBP.
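The scoring rules for the two scales reduce to a reverse-code-then-sum computation. A minimal sketch (the response vectors are made up for illustration):

```python
def score_ebpb(responses, reverse_items=(11, 13)):
    """Total EBP Belief score: 16 items rated 1-5; items 11 and 13
    (1-indexed) are reverse-coded (1<->5, 2<->4). Range: 16-80."""
    assert len(responses) == 16
    return sum((6 - r) if i in reverse_items else r
               for i, r in enumerate(responses, start=1))

def score_ebpi(responses):
    """Total EBP Implementation score: sum of 18 frequency codes 0-4.
    Range: 0-72."""
    assert len(responses) == 18
    return sum(responses)

# A respondent answering "agree" (4) on every belief item: the two
# reversed items contribute 6 - 4 = 2 each, so the total is 14*4 + 2*2.
print(score_ebpb([4] * 16))   # 60
print(score_ebpi([2] * 18))   # 36
```

Note how the reverse-coded items pull the total down for uniformly "agreeing" respondents — the point of including negatively worded statements in the scale.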

Both scales were introduced to the participating students in their original language of English because English is the official language of teaching and instruction in all schools of nursing in Jordan.

Ethical considerations

The Institutional Review Board (IRB) at the first author’s university granted ethical approval for this study (Reference #19/122/2019). The code of ethics was addressed in the cover letter of the questionnaire. The principal investigator met the potential eligible students, provided them with an explanation about the study purpose and procedures, and gave them 5 min to read the questionnaires and to decide whether to participate in the study or not. Students who agreed to participate in the study were assured of voluntary participation and the right to withdraw from the study at any time. Questionnaires were collected anonymously without any identifying information from the participating students. The principal investigator explained to participating students that the return of completed questionnaires is an implicit consent to participate in the study. Permission to use the EBP belief scale and the EBP implementation scale for the purpose of this study was obtained from the authors of the instrument.

Data collection procedure

After ethical approval was granted to conduct the study, data was collected during the second semester of the academic year 2018/2019 (i.e., January through June 2019). The questionnaires were distributed to the nursing students during the classroom lectures after taking permission from the lecturer. The researchers explained the purpose, the significance of the study, the inclusion criteria, and the right of the students to refuse participation in the study. Students were screened for eligibility to participate. Students who met the eligibility criteria and agreed to participate were provided with the study package that included a cover letter and the study questionnaire. Students were given 20 min to complete the questionnaire and return it to the principal investigator who was available to answer students’ questions during the data collection process.

Data analysis

Descriptive statistics (e.g., means, standard deviations, frequencies, and percentages) were performed to describe the demographic characteristics of the participating students and the main study variables. For the belief scale, the two agreement categories (4 = agree, 5 = strongly agree) were collapsed into one category to indicate a positive belief. For the implementation scale, the three categories (2 = 4–5 times, 3 = 6–8 times, and 4 = more than 8 times in the past 8 weeks) were collapsed into one category (≥ 4 times) to indicate frequent implementation. Pearson’s correlation test was used to determine the relationship between the total scores of the EBP belief and implementation scales. A chi-square test was used to examine the difference between trained and untrained students in terms of agreement with each EBP belief (disagreement vs. agreement) and in terms of frequency of each EBP implementation (less than 4 times vs. 4 times or more in the past 8 weeks). Finally, an independent-samples t-test was used to examine the difference between trained and untrained students in terms of the total mean scores of EBP beliefs. The Statistical Package for the Social Sciences (SPSS) software (version 22) was used for data analysis.
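The three inferential tests named here are one-liners in standard statistical software. The sketch below uses simulated scores, not the study's data; the group sizes loosely echo the reported sample, while the means and the 2×2 counts are invented for illustration:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Independent-samples t-test: simulated total belief scores for
# trained (n = 163) vs. untrained (n = 78) students.
trained = rng.normal(56, 13, 163)
untrained = rng.normal(52, 13, 78)
t, p_t = stats.ttest_ind(trained, untrained)

# Pearson correlation between simulated belief and implementation totals.
belief = rng.normal(54, 13, 241)
implementation = rng.normal(25, 12, 241)
r, p_r = stats.pearsonr(belief, implementation)

# Chi-square test of agreement (agree vs. disagree) by training status,
# from an invented 2x2 contingency table of counts.
table = np.array([[99, 64],
                  [32, 46]])
chi2, p_c, dof, expected = stats.chi2_contingency(table)
print(dof)   # 1 degree of freedom for a 2x2 table
```

Each call returns the test statistic and a p-value, which is all that is needed to reproduce the kind of results reported in Tables 2–4.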

Among the 300 approached students, 35 students did not meet the inclusion criteria and 24 students refused to participate. Thus, a total of 241 undergraduate nursing students from both universities completed the study questionnaire for a response rate of 91%. The mean age of the participants was 22.09 years ( SD  = 1.55). The majority of the participants were females (73.4%) and in the fourth year of the undergraduate nursing program (85.1%). Further, more than half of the participants (67.6%) stated that they received EBP training before (Table  1 ).

Distribution of the sample by demographic variables ( n  = 241)

The total mean score of the EBP belief scale was 54.32 out of 80 (SD = 13.63). Overall, between 50.5 and 73.4% of students agreed or strongly agreed on the 16 statements on the EBP belief scale, which indicates positive beliefs. However, students held a more positive belief regarding the importance and the usefulness of EBP in quality patient care than in their ability to implement EBP. For example, while the majority of students believed that “EBP results in the best clinical care for patients” and that “evidence-based guidelines can improve clinical care” (73.4 and 72.2%, respectively), only about 54% of them cited that they “knew how to implement EBP sufficiently enough to make practice changes” or were “confident about their ability to implement EBP where they worked”. Students who received previous training on EBP reported more agreement (i.e., more positive beliefs) toward all items of EBP compared to those who did not receive training; however, the difference between the two groups was not always significant. For example, 60.7% of trained students believed that “they are sure that they can implement EBP” compared to 41% of untrained students, χ2(1, n = 241) = 8.26, p = .004. Furthermore, 58.3% of trained students were “clear about the steps of EBP” compared to 41% of untrained students, χ2(1, n = 241) = 6.30, p = .021 (Table 2).

Responses to evidence-based practice belief scale by trained and untrained students (n = 241)

The two agreement categories (agree and strongly agree) were collapsed together to indicate positive belief

In contrast, students reported a much lower total score on the EBP implementation scale: 25.34 out of 72 (SD = 12.37). Fewer than half of the students reported implementing each of the listed EBPs four times or more in the last 8 weeks. For example, only about one-third of all students reported that they "used evidence to change their clinical practice", "generated a PICO question about clinical practice", "read and critically appraised a clinical research study", or "accessed the database for EBP" four times or more in the past eight weeks (32.4, 33.6, 31.9, and 31.6%, respectively). The only EBP implemented by more than half of the students (54.8%) four times or more in the past 8 weeks was "collecting data on a patient problem". Students who had previous EBP training reported more frequent implementation of all listed EBPs than those who had not, although the difference between the two groups was not always significant. For example, 50.9% of trained students reported that they "shared an EBP guideline with a colleague" four times or more in the past 8 weeks compared to 30.8% of untrained students (χ²(1, n = 241) = 8.68, p = .003). Almost 50% of trained students "shared evidence from a research study with a patient/family member" four times or more in the past 8 weeks, compared to 28.2% of untrained students (χ²(1, n = 241) = 9.95, p = .002) (Table 3).

Responses to evidence-based practice implementation scale by trained and untrained students (n = 241)

The three categories (4–5 times, 6–8 times, and more than 8 times) were collapsed together as (≥ 4 times) to indicate frequent implementation

There was a significant difference in students' total scores on the EBP belief scale with respect to previous EBP training. Students who had received previous EBP training had a significantly higher mean score on the EBP belief scale than students who had not (t(239) = 2.04, p = .042). Similarly, there was a significant difference in the total EBP implementation score by previous training: students who had received previous EBP training had a significantly higher mean score on the EBP implementation scale than those who had not (t(239) = 3.08, p = .002) (Table 4).

Independent samples t-test between students who received EBP training and students who did not receive EBP training in terms of beliefs and implementation of EBP (n = 241)

Finally, results of the Pearson correlation test revealed no significant association between the total score of the EBP belief scale and the total score of the EBP implementation scale (r = 0.106, p = 0.101).

This study aimed to explore self-reported beliefs about EBP and its implementation among undergraduate nursing students in Jordan. Jordanian undergraduate nursing students valued EBP and its importance in delivering quality patient care: over 70% of them believed that EBP results in the best clinical care for patients and that evidence-based guidelines can improve clinical care. However, a lower percentage of students believed in their ability to implement EBP where they worked, and an even lower percentage actually implemented EBP frequently in their everyday clinical practice. For example, only one-third of the students accessed a database for EBP, read and critically appraised a clinical research study, or used evidence to change their clinical practice four times or more in the last 8 weeks. Our results are consistent with previous studies among Jordanian nursing students, which also showed that students had positive attitudes towards research and its usefulness for providing quality patient care but insufficient ability to utilize research evidence in clinical practice [14]. Further, a recent study showed that nursing students in Jordan had low knowledge about EBP regardless of their admitting university [19]. These results indicate that there could be a gap in the education of undergraduate nursing students in Jordan about EBP. Thus, schools of nursing in Jordan need to critically review their current educational strategies on EBP and improve them to enhance students' knowledge of EBP as well as their ability to implement evidence in clinical practice.

The results of the current study revealed that despite the nursing students' positive beliefs, their implementation of EBP was very low, and there was no significant relationship between the total EBP belief score and the total EBP implementation score. Our results are consistent with those reported among Saudi and American nursing students, who also held positive beliefs about EBP but implemented it infrequently in their everyday clinical practice [13, 15]. Moreover, in line with previous studies showing that EBP training is a significant predictor of beliefs and implementation [15], students in this study who had previously received EBP training had significantly higher total belief and implementation scores than those who had not. This finding is expected, as EBP training has been shown to improve knowledge, self-efficacy in implementation, and, by extension, implementation practices among nurses and nursing students [20–22]. Notably, we asked students whether they had received EBP training during the nursing research course taught at their universities. More than one-third of the participating students reported that they had not received previous EBP training even though all of them had successfully passed the nursing research course offered at their universities. One possible explanation for this finding is inconsistency in the way the nursing research course is taught: EBP does not appear to be consistently included in its content. Thus, nursing schools in Jordan need to revise their curricula to ensure that EBP is included and taught to all students before graduation.

The results of the current study have several international implications involving academic education and nursing curricula. There is a pressing need to enhance the education process and to focus more on EBP knowledge and skills. Incorporating EBP into nursing curricula, especially in the undergraduate program, is critical, as it is the first step in preparing students for their professional roles as registered nurses. Sin and Bliquez (2017) stated that creative and enjoyable strategies are fundamental to encouraging students' commitment to and learning about EBP [23]. One effective strategy is teaching the EBP process: asking a clinical question, acquiring and searching for the evidence, appraising and then applying that evidence, and finally evaluating the effectiveness of its application in clinical practice [8]. A thematic review demonstrated that various interactive and clinically integrated teaching strategies have been emphasized to enhance EBP knowledge and skills [24].

Gaining knowledge about undergraduate nursing students' beliefs and their ability to implement EBP in clinical settings is essential for nursing educators at both the national and international levels. This knowledge can help them evaluate and improve the current strategies used to educate undergraduate students about EBP. Furthermore, academic administrators and teachers should design their courses to apply EBP concepts, and they should promote EBP training courses, workshops, and seminars. For example, the research course should focus more on this topic and include clinical scenarios that involve the application of EBP. In addition, clinical courses should include assignments that integrate EBP into students' clinical cases. The scale used in this study could be employed in clinical courses to evaluate students' practical EBP skills. Finally, nursing instructors, leaders, and practitioners should continually update their EBP knowledge and skills through continuing education and workshops. Since they are role models and instructors, they should be competent enough to teach and evaluate their students, and they should cooperate to facilitate the implementation of EBP in clinical settings and overcome any barriers.

Study limitations and recommendations

This study sheds light on the existing gap between belief in and implementation of EBP among nursing students. However, the convenience sampling, the use of only two universities, and the potential for self-report bias are limitations of this study. In addition, the researchers did not investigate the type of EBP training the students had received. More studies on EBP are needed in Jordan and the Middle Eastern region using larger random samples in different settings. It is also recommended to investigate the barriers, other than a lack of training, that prevent nursing students from implementing EBP. Furthermore, qualitative studies might help examine and understand students' perceptions and provide suggestions to bridge the gap between education and practice. Finally, future experimental studies are needed to test the effect of specific interventions on enhancing EBP implementation among nursing students.

Evidence-based practice is essential for nursing students worldwide. However, holding strong beliefs about EBP and its benefits does not necessarily mean that it is frequently implemented. Providing training courses on EBP is an essential step toward enhancing its implementation. To advance nursing science and enhance the care provided by future nurses, it is therefore vital to incorporate EBP within nursing curricula. It is also critical to teach nursing students the value of evidence-based knowledge as well as how to access, appraise, and apply it correctly as needed. This can be achieved through rigorous cooperation among nursing administrators, clinicians, teachers, and students to enhance the implementation process.

Acknowledgments

Abbreviations

Authors' contributions

All authors (NA, SA, RO, and KA) contributed actively to the conception and design, and/or the collection and analysis of the data, and/or the drafting of the paper. NA and RO also made critical revisions for important content and finalized the final version of the manuscript. All authors approved the final version of the manuscript.

This study was funded by Jordan University of Science and Technology Grant # (20190141). The funding source had no role in the design of the study and collection, analysis, and interpretation of data or in writing the manuscript.

Availability of data and materials

Ethics approval and consent to participate.

Obtained from the Institutional Review Board (IRB) at Jordan University of Science and Technology (Reference # 19/122/2019). All participants were asked to sign a consent form before data collection.

Consent for publication

Not applicable.

Competing interests

The authors declare no competing or conflict of interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Contributor Information

Nesrin N. Abu-Baker, Email: nesrin@just.edu.jo.

Salwa AbuAlrub, Email: [email protected] .

Rana F. Obeidat, Email: robeidat@buffalo.edu.

Kholoud Assmairan, Email: kholoudalsmeran@yahoo.com.

Promoting critical thinking through an evidence-based skills fair intervention

Journal of Research in Innovative Teaching & Learning

ISSN: 2397-7604

Article publication date: 23 November 2020

Issue publication date: 1 April 2022

Purpose

The lack of critical thinking in new graduates has been a concern to the nursing profession. The purpose of this study was to investigate the effects of an innovative, evidence-based skills fair intervention on nursing students' achievements and perceptions of critical thinking skills development.

Design/methodology/approach

The explanatory sequential mixed-methods design was employed for this study.

Findings

The findings indicated participants perceived the intervention as a strategy for developing critical thinking.

Originality/value

The study provides educators with helpful information for planning their own teaching practice.

Keywords: Critical thinking, Evidence-based practice, Skills fair intervention

Gonzalez, H.C. , Hsiao, E.-L. , Dees, D.C. , Noviello, S.R. and Gerber, B.L. (2022), "Promoting critical thinking through an evidence-based skills fair intervention", Journal of Research in Innovative Teaching & Learning , Vol. 15 No. 1, pp. 41-54. https://doi.org/10.1108/JRIT-08-2020-0041

Emerald Publishing Limited

Copyright © 2020, Heidi C. Gonzalez, E-Ling Hsiao, Dianne C. Dees, Sherri R. Noviello and Brian L. Gerber

Published in Journal of Research in Innovative Teaching & Learning . Published by Emerald Publishing Limited. This article is published under the Creative Commons Attribution (CC BY 4.0) licence. Anyone may reproduce, distribute, translate and create derivative works of this article (for both commercial and non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this licence may be seen at http://creativecommons.org/licences/by/4.0/legalcode

Introduction

Critical thinking (CT) has been defined as the "cognitive skills of analyzing, applying standards, discriminating, information seeking, logical reasoning, predicting, and transforming knowledge" (Scheffer and Rubenfeld, 2000, p. 357). Critical thinking is the basis for all professional decision-making (Moore, 2007). The lack of critical thinking in student nurses and new graduates has been a concern to the nursing profession: it negatively affects the quality of care and is directly related to the high error rates among novice nurses that compromise patient safety (Arli et al., 2017; Saintsing et al., 2011). It has been reported that as many as 88% of novice nurses commit medication errors, with 30% of these errors due to a lack of critical thinking (Ebright et al., 2004). Failure to rescue is another type of error common among novice nurses, reported as high as 37% (Saintsing et al., 2011). Failure to recognize trends or complications promptly, or to take action to stabilize the patient, occurs when health-care providers do not recognize the early warning signs of distress (Garvey and CNE series, 2015). Internationally, this lack of preparedness and critical thinking contributes to the reported 35–60% attrition rate of new graduate nurses in their first two years of practice (Goodare, 2015). The high attrition rate of new nurses carries professional and economic costs of $82,000 or more per nurse and negatively affects patient care (Twibell et al., 2012). Facione and Facione (2013) reported that the failure to utilize critical thinking skills not only interferes with learning but also results in poor decision-making and unclear communication between health-care professionals, which ultimately leads to patient deaths.

Due to the importance of critical thinking, many nursing programs strive to infuse critical thinking into their curriculum to better prepare graduates for the realities of clinical practice that involves ever-changing, complex clinical situations and bridge the gap between education and practice in nursing ( Benner et al. , 2010 ; Kim et al. , 2019 ; Park et al. , 2016 ; Newton and Moore, 2013 ; Nibert, 2011 ). To help develop students' critical thinking skills, nurse educators must change the way they teach nursing, so they can prepare future nurses to be effective communicators, critical thinkers and creative problem solvers ( Rieger et al. , 2015 ). Nursing leaders also need to redefine teaching practice and educational guidelines that drive innovation in undergraduate nursing programs.

Evidence-based practice has been advocated to promote critical thinking and help reduce the research–practice gap (Profetto-McGrath, 2005; Stanley and Dougherty, 2010). Evidence-based practice was defined as "the conscientious, explicit, and judicious use of current best evidence in making decisions about the care of the individual patient" (Sackett et al., 1996, p. 71). A skills fair intervention, one application of evidence-based practice, can be used to engage students, promote active learning and develop critical thinking (McCausland and Meyers, 2013; Roberts et al., 2009). A skills fair intervention promotes consistent teaching of psychomotor skills to novice nurses, which has been shown to decrease anxiety, clarify expectations for students in the clinical setting and increase students' critical thinking skills (Roberts et al., 2009). The researchers of this study had an opportunity to create an active, innovative skills fair intervention for a baccalaureate nursing program in one southeastern state. This intervention incorporated evidence-based practice rationale with critical thinking prompts using Socratic questioning, evidence-based practice videos tied to the psychomotor skill rubrics, group work, guided discussions, expert demonstration followed by guided practice and blended learning in an attempt to promote and develop critical thinking in nursing students (Hsu and Hsieh, 2013; Oermann et al., 2011; Roberts et al., 2009). The study examined the effects of this intervention on senior baccalaureate nursing students' achievements and their perceptions of critical thinking development.

Literature review

Critical thinking is the ability to use reasoned judgment, focusing equally on processes and outcomes rather than emotions (Paul and Elder, 2008). Critical thinking skills are desired in almost every discipline and play a major role in decision-making and daily judgments. The roots of critical thinking date back to Socrates 2,500 years ago and can be traced to the ancient philosopher Aristotle (Paul and Elder, 2012). Socrates challenged others by asking inquisitive questions in an attempt to test their knowledge. In the 1980s, critical thinking gained nationwide recognition as a behavioral science concept in the educational system (Robert and Petersen, 2013). Many researchers in both education and nursing have attempted to define, measure and teach critical thinking for decades. However, a theoretical definition has yet to be accepted and established by the nursing profession (Romeo, 2010). The terms critical literacy, CT, reflective thinking, systems thinking, clinical judgment and clinical reasoning are used synonymously in the reviewed literature (Clarke and Whitney, 2009; Dykstra, 2008; Jones, 2010; Swing, 2014; Turner, 2005).

Watson and Glaser (1980) viewed critical thinking not only as cognitive skills but also as a combination of skills, knowledge and attitudes. Paul (1993) , the founder of the Foundation for Critical Thinking, offered several definitions of critical thinking and identified three essential components of critical thinking: elements of thought, intellectual standards and affective traits. Brunt (2005) stated critical thinking is a process of being practical and considered it to be “the process of purposeful thinking and reflective reasoning where practitioners examine ideas, assumptions, principles, conclusions, beliefs, and actions in the contexts of nursing practice” (p. 61). In an updated definition, Ennis (2011) described critical thinking as, “reasonable reflective thinking focused on deciding what to believe or do” (para. 1).

The most comprehensive attempt to define critical thinking was under the direction of Facione and sponsored by the American Philosophical Association ( Scheffer and Rubenfeld, 2000 ). Facione (1990) surveyed 53 experts from the arts and sciences using the Delphi method to define critical thinking as a “purposeful, self-regulatory judgment which results in interpretation, analysis, evaluation, and inference, as well as an explanation of the evidential, conceptual, methodological, criteriological, or contextual considerations upon which judgment, is based” (p. 2).

To come to a consensus definition for critical thinking, Scheffer and Rubenfeld (2000) also conducted a Delphi study. Their study consisted of an international panel of nurses who completed five rounds of sequenced questions to arrive at a consensus definition. Critical thinking was defined as “habits of mind” and “cognitive skills.” The elements of habits of mind included “confidence, contextual perspective, creativity, flexibility, inquisitiveness, intellectual integrity, intuition, open-mindedness, perseverance, and reflection” ( Scheffer and Rubenfeld, 2000 , p. 352). The elements of cognitive skills were recognized as “analyzing, applying standards, discriminating, information seeking, logical reasoning, predicting, and transforming knowledge” ( Scheffer and Rubenfeld, 2000 , p. 352). In addition, Ignatavicius (2001) defined the development of critical thinking as a long-term process that must be practiced, nurtured and reinforced over time. Ignatavicius believed that a critical thinker required six cognitive skills: interpretation, analysis, evaluation, inference, explanation and self-regulation ( Chun-Chih et al. , 2015 ). According to Ignatavicius (2001) , the development of critical thinking is difficult to measure or describe because it is a formative rather than summative process.

Fero et al. (2009) noted that patient safety may be compromised if a nurse cannot provide clinically competent care due to a lack of critical thinking. The Institute of Medicine (2001) recommended five health care competencies: patient-centered care, interdisciplinary team care, evidence-based practice, informatics and quality improvement. Understanding the development and attainment of critical thinking is the key to gaining these competencies (Scheffer and Rubenfeld, 2000). The development of a strong scientific foundation for nursing practice depends on habits such as contextual perspective, inquisitiveness, creativity, analysis and reasoning skills. Therefore, how these critical thinking habits develop in nursing students needs to be explored through additional research (Fero et al., 2009). Despite critical thinking being listed since the 1980s as an accreditation outcome criterion for baccalaureate programs by the National League for Nursing, very little improvement has been observed in practice (McMullen and McMullen, 2009). James (2013) reported that the number of patient harm incidents associated with hospital care is much higher than previously thought: between 210,000 and 440,000 patients each year go to the hospital for care and end up suffering some preventable harm that contributes to their death. The preventable errors in James' study are attributable to sources beyond nursing care alone, but having a nurse in place who can advocate and think critically for patients will make a positive impact on improving patient safety (James, 2013; Robert and Peterson, 2013).

Adopting teaching practices that promote CT is a crucial component of nursing education. Research by Nadelson and Nadelson (2014) suggested that evidence-based practice is best learned when integrated into multiple areas of the curriculum. Evidence-based practice developed its roots through evidence-based medicine, and its philosophical origins extend back to the mid-19th century (Longton, 2014). Florence Nightingale, the pioneer of modern nursing, used evidence-based practice during the Crimean War when she recognized a connection between poor sanitary conditions and rising mortality rates among wounded soldiers (Rahman and Applebaum, 2011). In professional nursing practice today, a commonly used definition of evidence-based practice is derived from Dr. David Sackett: the conscientious, explicit and judicious use of current best evidence in making decisions about the care of the individual patient (Sackett et al., 1996, p. 71). For professional nurses, it is imperative for patient safety that they remain inquisitive and ask whether the care provided is based on the available evidence. One of the core beliefs of the American Nephrology Nurses' Association's (2019) 2019–2020 Strategic Plan is that "ANNA must support research to develop evidence-based practice, as well as to advance nursing science, and that as individual members, we must support, participate in, and apply evidence-based research that advances our own skills, as well as nursing science" (p. 1). Longton (2014) reported that the lack of evidence-based practice in nursing resulted in negative outcomes for patients. In fact, when evidence-based practice was implemented, changes in policies and procedures occurred that resulted in decreased reports of patient harm and associated health-care costs. The Institute of Medicine (2011) recommended that nurses be leaders in the transformation of the health-care system and achieve higher levels of education, providing the ability to critically analyze data to improve the quality of patient care. Student nurses must be taught to connect and integrate CT and evidence-based practice throughout their program of study and to continue that practice throughout their careers.

One type of evidence-based practice that can be used to engage students, promote active learning and develop critical thinking is skills fair intervention ( McCausland and Meyers, 2013 ; Roberts et al. , 2009 ). Skills fair intervention promoted a consistent teaching approach of the psychomotor skills to the novice nurse that decreased anxiety, gave clarity of expectations to the students in the clinical setting and increased students' critical thinking skills ( Roberts et al. , 2009 ). The skills fair intervention used in this study is a teaching strategy that incorporated CT prompts, Socratic questioning, group work, guided discussions, return demonstrations and blended learning in an attempt to develop CT in nursing students ( Hsu and Hsieh, 2013 ; Roberts et al. , 2009 ). It melded evidence-based practice with simulated CT opportunities while students practiced essential psychomotor skills.

Research methodology

Context – skills fair intervention.

According to Roberts et al. (2009), psychomotor skills decline over time, within as little as two weeks even among licensed, experienced professionals, and may need to be relearned after two months without performing a skill. Applying this concept to student nurses, for whom each skill is new, it is no wonder their competency diminishes after a summer break from nursing school. The skills fair intervention in this study was a one-day event to assist baccalaureate students who had taken the summer off from their nursing studies, and all faculty participated in operating the stations. It incorporated evidence-based practice rationale with critical thinking prompts using Socratic questioning, evidence-based practice videos tied to the psychomotor skill rubrics, group work, guided discussions, expert demonstration followed by guided practice and blended learning in an attempt to promote and develop critical thinking in baccalaureate students.

Students were scheduled and placed randomly into eight teams based on attributes of critical thinking as described by Wittmann-Price (2013): Team A – Perseverance, Team B – Flexibility, Team C – Confidence, Team D – Creativity, Team E – Inquisitiveness, Team F – Reflection, Team G – Analyzing and Team H – Intuition. The students rotated every 20 minutes through eight stations: Medication Administration: Intramuscular and Subcutaneous Injections; Initiating Intravenous Therapy; Ten-Minute Focused Physical Assessment; Foley Catheter Insertion; Nasogastric Intubation; Skin Assessment/Braden Score and Restraints; Vital Signs; and a Safety Station. When the students completed all eight stations, they went to the "Check-Out" booth to complete a short evaluation of their perceptions of the effectiveness of the intervention. When the evaluations were complete, each of the eight critical thinking attribute teams placed their index cards into a hat, and a student won a small prize. All Junior 2, Senior 1 and Senior 2 students were required to attend the Skills Fair. The Skills Fair Team strove to make the event as festive as possible, engaging nursing students with balloons, candy, tri-boards, signs and fun pre- and post-fair activities. The Skills Fair rubrics, scheduling and instructions were shared electronically with students and faculty before the intervention to ensure adequate preparation and continuous resource availability as students moved forward into their future clinical settings.

Research design

Institutional review board (IRB) approval was obtained from XXX University to conduct this study and protect human subject rights. An explanatory sequential mixed-methods design was employed. The design was chosen to identify what effect a skills fair intervention had on senior baccalaureate nursing students' achievements on the Kaplan Critical Thinking Integrated Test (KCTIT) and then to follow up with individual interviews to explore those test results in more depth. In total, 52 senior nursing students completed the KCTIT; 30 of them participated in the skills fair intervention and 22 did not. The KCTIT is a computerized 85-item exam in which each question is worth one point, so a score of 85 equates to 100%. It has high reliability and validity (Kaplan Nursing, 2012; Swing, 2014); reported reliability values for the KCTIT ranged from 0.72 to 0.89. A t-test was used to analyze the test results.

A total of 11 participants were purposefully selected based on a range of six high achievers and five low achievers on the KCTIT for open-ended one-on-one interviews. Each interview was conducted individually and lasted for about 60 minutes. An open-ended interview protocol was used to guide the flow of data collection. The interviewees' ages ranged from 21 to 30 years, with an average of 24 years. One of 11 interviewees was male. Among them, seven were White, three were Black and one was Indian American. The data collected were used to answer the following research questions: (1) What was the difference in achievements on the KCTIT among senior baccalaureate nursing students who participated in the skills fair intervention and students who did not participate? (2) What were the senior baccalaureate nursing students' perceptions of internal and external factors impacting the development of critical thinking skills during the skills fair intervention? and (3) What were the senior baccalaureate nursing students' perceptions of the skills fair intervention as a critical thinking developmental strategy?

Inductive content analysis was used to analyze the interview data, starting with a close reading of the transcripts and memo writing for initial coding, followed by an analysis of patterns and relationships among the data for focused coding. Intercoder reliability for the qualitative data analysis was established with a nursing expert. The lead researcher and the expert read the transcripts several times and assigned codes to significant units of text relevant to the research questions. The codes were compared for differences and similarities and sorted into subcategories and categories. Then, headings and subheadings based on similar comments were used to develop central themes and patterns. The process of establishing intercoder reliability helped to increase the dependability, confirmability and credibility of the findings (Graneheim and Lundman, 2004). In addition, methods of credibility, confirmability, dependability and transferability were applied to increase the trustworthiness of this study (Graneheim and Lundman, 2004). First, reflexivity was maintained by keeping journals and memos; this practice allowed the lead researcher to reflect on personal views and minimize bias. Data saturation was reached by following the recommended number of participants as well as through repeated immersion in the data during analysis until no new data surfaced. Member checking was accomplished by returning the transcript and the interpretation to the participants to check the accuracy and truthfulness of the findings. Finally, proper documentation was maintained to allow accurate cross-referencing throughout the study.

Quantitative results

The quantitative results showed no significant difference in KCTIT scores between senior nursing students who participated in the skills fair intervention and those who did not, t(50) = −0.174, p = 0.86. The mean scores of the nonparticipant group ( M  = 67.59, SD = 5.81) and the participant group ( M  = 67.88, SD = 5.99) were almost equal.
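The reported statistic can be approximately reconstructed from the summary statistics alone. The paper does not give per-group sample sizes, so the 26/26 split below is an assumption chosen to make the degrees of freedom match the reported t(50); under it, the pooled-variance t statistic lands near the reported −0.174.

```python
import math

# Summary statistics as reported; the per-group n values are an assumption
# (26 + 26 = 52) chosen so that df = n1 + n2 - 2 equals the reported 50.
m1, sd1, n1 = 67.59, 5.81, 26   # nonparticipant group
m2, sd2, n2 = 67.88, 5.99, 26   # participant group

# Pooled-variance independent-samples t statistic.
df = n1 + n2 - 2
sp2 = ((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / df
t = (m1 - m2) / math.sqrt(sp2 * (1 / n1 + 1 / n2))

print(df, round(t, 3))
# |t| is far below the two-tailed critical value of roughly 2.009 for
# df = 50, so the null hypothesis of equal group means is retained.
```

The small residual gap to the published −0.174 would be explained by the true (unreported) group sizes being slightly unequal.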

Qualitative results

Initial coding.

The results from the initial coding and generated themes are listed in Table 1 . First, the participants perceived the skills fair intervention as “promoting experience” and “confidence” by practicing previously learned knowledge and reinforcing it with active learning strategies. Second, the participants perceived the skills fair intervention as a relaxed, nonthreatening learning environment due to the festive atmosphere, especially in comparison to other learning experiences in the nursing program. The nonthreatening environment of the skills fair intervention allowed students to learn without fear. Third, the majority of participants believed their critical thinking was strengthened after participating. Several participants believed their perception of critical thinking was “enhanced” or “reinforced” rather than significantly changed.

Focused coding results

The final themes were derived from the analysis of patterns and relationships among the content of the data using inductive content analysis ( Saldana, 2009 ). Two areas were examined across the focused coding process: (1) factors impacting critical thinking skills development during the skills fair intervention and (2) the skills fair intervention as a critical thinking skills developmental strategy.

Factors impacting critical thinking skills development . The factors impacting the development of critical thinking during the skills fair intervention were divided into two themes: internal factors and external factors. The internal factors were characteristics innate to the students. The identified internal factors were (1) confidence and anxiety levels, (2) attitude and (3) age. The external factors were the outside influences that affected the students. The external factors were (1) experience and practice, (2) faculty involvement, (3) positive learning environment and (4) faculty prompts.

I think that confidence and anxiety definitely both have a huge impact on your ability to be able to really critically think. If you start getting anxious and panicking you cannot think through the process like you need too. I do not really think gender or age necessarily would have anything to do with critical thinking.
Definitely the confidence level, I think, the more advanced you get in the program, your confidence just keeps on growing. Level of anxiety, definitely… I think the people who were in the Skills Fair for the first time, had more anxiety because they did not really know to think, they did not know how strict it was going to be, or if they really had to know everything by the book. I think the Skills Fair helped everyone's confidence levels, but especially the Jr. 2's.

Attitude was an important factor in the development of critical thinking skills during the skills fair intervention as participants believed possessing a pleasant and positive attitude meant a student was eager to learn, participate, accept responsibility for completing duties and think seriously. Participant 6 believed attitude contributed to performance in the Skills Fair.

I feel like, certain things bring critical thinking out in you. And since I'm a little bit older than some of the other students, I have had more life experiences and am able to figure stuff out better. Older students have had more time to learn by trial and error, and this and that.
Like when I had clinical with you, you'd always tell us to know our patients' medications. To always know and be prepared to answer questions – because at first as a Junior 1 we did not do that in the clinical setting… and as a Junior 2, I did not really have to know my medications, but with you as a Senior 1, I started to realize that the patients do ask about their meds, so I was making sure that I knew everything before they asked it. And just having more practice with IVs – at first, I was really nervous, but when I got to my preceptorship – I had done so many IVs and with all of the practice, it just built up my confidence with that skill so when I performed that skill during the Fair, I was confident due to my clinical experiences and able to think and perform better.
I think teachers will always affect the ability to critically think just because you want [to] get the right answer because they are there and you want to seem smart to them [Laugh]. Also, if you are leading in the wrong direction of your thinking – they help steer you back to [in] the right direction so I think that was very helpful.
You could tell the faculty really tried to make it more laid back and fun, so everybody would have a good experience. The faculty had a good attitude. I think making it fun and active helped keep people positive. You know if people are negative and not motivated, nothing gets accomplished. The faculty did an amazing job at making the Skills Fair a positive atmosphere.

However, for some of the participants, a positive learning environment depended on their fellow students. The students were randomly assigned alphabetically to groups, and the groups were assigned to starting stations at the Skills Fair. The participants claimed some students did not want to participate and displayed cynicism toward the intervention. The participants believed this cynicism undermined the positive learning environment, making critical thinking more difficult during the Skills Fair.

Okay, when [instructor name] was demonstrating the Chevron technique right after we inserted the IV catheter and we were trying to secure the catheter, put on the extension set, and flush the line at what seemed to be all at the same time. I forgot about how you do not want to put the tape right over the hub of the catheter because when you go back in and try to assess the IV site – you're trying to assess whether or not it is patent or infiltrated – you have to visualize the insertion site. That was one of the things that I had been doing wrong because I was just so excited that I got the IV in the vein in the first place – that I did not think much about the tape or the tegaderm for sterility. So I think an important part of critical thinking is to be able to recognize when you've made a mistake and stop, stop yourself from doing it in the future (see Table 2 ).

Skills fair intervention as a developmental strategy for critical thinking . The participants identified the skills fair intervention as effective as a developmental strategy for critical thinking, as revealed in two themes: (1) develops alternative thinking and (2) thinking before doing (see Table 3 ).

Develops alternative thinking . The participants perceived that the skills fair intervention helped enhance critical thinking and confidence by developing alternative thinking. Alternative thinking was described as quickly generating alternative solutions to problems based on the latest evidence and using that information to determine what actions were warranted to prevent complications and injury. Learning the rationale behind skills helped participants make better connections between knowledge and practice and apply that knowledge to prevent complications and errors, ensuring the safety of patients. The participants stated that the rationale provided during the skills fair intervention, such as the evidence and critical thinking prompts included in the rubrics, helped reinforce this connection. The participants also shared that they developed alternative thinking after participating in the skills fair intervention by noticing trends in data to prevent potential complications in response to faculty prompts. Participant 1 stated her instructor prompted her alternative thinking through questioning about noticing trends to prevent potential complications. She said the following:

Another way critical thinking occurred during the skills fair was when [instructor name] was teaching and prompted us about what it would be like to care for a patient with a fractured hip – I think this was at the 10-minute focused assessment station, but I could be wrong. I remember her asking, “What do you need to be on the look-out for? What can go wrong?” I automatically did not think critically very well and was only thinking circulation in the leg, dah, dah, dah. But she was prompting us to think about mobility alterations and its effect on perfusion and oxygenation. She was trying to help us build those connections. And I think that's a lot of the aspects of critical thinking that gets overlooked with the nursing student – trouble making connections between our knowledge and applying it in practice.

Thinking before doing . The participants perceived that thinking before doing, which included considering how and why certain procedures are performed, required self-examination prior to taking action. The hands-on situational learning in the skills fair intervention allowed the participants to better notice assessment data and think at a higher level, as their previous learning of the skills had been perceived as memorization of steps. This higher level of learning allowed participants to consider different future outcomes and analyze pertinent data before taking action.

I think what helped me the most is considering outcomes of my actions before I do anything. For instance, if you're thinking, “Okay. Well, I need to check their blood pressure before I administer this blood pressure medication – or the blood pressure could potentially bottom out.” I really do not want my patient to bottom out and get hypotensive because I administered a medication that was ordered, but not safe to give. I could prevent problems from happening if I know what to be on alert for and act accordingly. So ultimately knowing that in the clinical setting, I can prevent complications from happening and I save myself, my license, and promote patient safety. I think knowing that I've seen the importance of critical thinking already in practice has helped me value and understand why I should be critically thinking. Yes, we use the 5-rights of medication safety – but we also have to think. For instance, if I am going to administer insulin – what do I need to know or do to give this safely? What is the current blood sugar? Has the patient been eating? When is the next meal scheduled? Is the patient NPO for a procedure? Those are examples of questions to consider and the level of thinking that needs to take place prior to taking actions in the clinical setting.

Although the quantitative data showed no significant difference in KCTIT scores between the participant and nonparticipant groups, during the interviews some participants attributed this result to the test not being part of a course grade and believed students “did not try very hard to score well.” However, the participants who attended interviews did identify the skills fair intervention as a developmental strategy for critical thinking by helping them develop alternative thinking and thinking before doing. The findings are supported in the literature as (1) nurses must recognize signs of clinical deterioration and take action promptly to prevent potential complications ( Garvey and CNE series, 2015 ) and (2) nurses must analyze pertinent data and consider all possible solutions before deciding on the most appropriate action for each patient ( Papathanasiou et al. , 2014 ).

The skills fair intervention also enhanced the development of self-confidence by allowing participants to practice previously learned skills in a controlled, safe environment. The nonthreatening environment of the skills fair intervention allowed students to learn without fear, and the majority of participants believed their critical thinking was strengthened after participating. The interview data also revealed a combination of internal and external factors that influenced the development of critical thinking during the skills fair intervention, including confidence and anxiety levels, attitude, age, experience and practice, faculty involvement, positive learning environment and faculty prompts. These factors should be considered when addressing the promotion and development of critical thinking.

Conclusions, limitations and recommendations

A major concern in the nursing profession is the lack of critical thinking in student nurses and new graduates, which influences the decision-making of novice nurses and directly affects patient care and safety ( Saintsing et al. , 2011 ). Nurse educators must use evidence-based practice to prepare students to think critically within the complicated and constantly evolving environment of health care today ( Goodare, 2015 ; Newton and Moore, 2013 ). Evidence-based practice has been advocated to promote critical thinking ( Profetto-McGrath, 2005 ; Stanley and Dougherty, 2010 ). The skills fair intervention can be one type of evidence-based practice used to promote critical thinking ( McCausland and Meyers, 2013 ; Roberts et al. , 2009 ). The intervention used in this study incorporated evidence-based practice rationale, critical thinking prompts using Socratic questioning and evidence-based practice videos into the psychomotor skill rubrics, along with group work, guided discussions, expert demonstration followed by guided practice and blended learning, in an attempt to promote and develop critical thinking in nursing students.

The explanatory sequential mixed-methods design was employed to investigate the effects of the innovative skills fair intervention on senior baccalaureate nursing students' achievements and their perceptions of critical thinking skills development. Although the quantitative results showed no significant difference in KCTIT scores between students who participated in the skills fair intervention and those who did not, those who attended the interviews perceived that their critical thinking was reinforced after the skills fair intervention and believed it was an effective developmental strategy for critical thinking, as it developed alternative thinking and thinking before doing. This information is useful for nurse educators who plan their own teaching practice to promote critical thinking and improve patient outcomes. The findings also offer schools and educators information to help review their current approaches to educating nursing students. As the findings make evident, developing critical thinking skills is crucial for becoming a safe, professional nurse. Internal and external factors impacting the development of critical thinking during the skills fair intervention were identified, including confidence and anxiety levels, attitude, age, experience and practice, faculty involvement, positive learning environment and faculty prompts. These factors should be considered when addressing the promotion and development of critical thinking.

There were several limitations to this study. One major limitation was the students' limited exposure to the skills fair intervention, as it was a one-day learning intervention. Another limitation was the sample selection and size. The skills fair intervention was limited to a single baccalaureate nursing program in one southeastern state. As such, the findings cannot be generalized, as the sample may not be representative of baccalaureate nursing programs in general. In addition, this study did not measure students' critical thinking achievement prior to the skills fair intervention; therefore, no baseline measurement of critical thinking was available for a before-and-after comparison. Other factors in the nursing program, such as anxiety or motivation, could have affected the students' scores on the KCTIT and were not taken into account in this study.

The recommendations for future research are to expand the topic by including other regions, larger samples and other baccalaureate nursing programs. In addition, future research should consider other participant perceptions, such as nurse educators, to better understand the development and growth of critical thinking skills among nursing students. Finally, based on participant perceptions, future research should include a more rigorous skills fair intervention to develop critical thinking and explore the link between confidence and critical thinking in nursing students.

Table 1. Initial coding results

Table 2. Factors impacting critical thinking skill development during skills fair intervention

Table 3. Skills fair intervention as a developmental strategy for critical thinking

American Nephrology Nurses Association (ANNA) ( 2019 ), “ Learning, leading, connecting, and playing at the intersection of nephrology and nursing-2019–2020 strategic plan ”, viewed 3 Aug 2019, available at: https://www.annanurse.org/download/reference/association/strategicPlan.pdf .

Arli , S.D. , Bakan , A.B. , Ozturk , S. , Erisik , E. and Yildirim , Z. ( 2017 ), “ Critical thinking and caring in nursing students ”, International Journal of Caring Sciences , Vol. 10 No. 1 , pp. 471 - 478 .

Benner , P. , Sutphen , M. , Leonard , V. and Day , L. ( 2010 ), Educating Nurses: A Call for Radical Transformation , Jossey-Bass , San Francisco .

Brunt , B. ( 2005 ), “ Critical thinking in nursing: an integrated review ”, The Journal of Continuing Education in Nursing , Vol. 36 No. 2 , pp. 60 - 67 .

Chun-Chih , L. , Chin-Yen , H. , I-Ju , P. and Li-Chin , C. ( 2015 ), “ The teaching-learning approach and critical thinking development: a qualitative exploration of Taiwanese nursing students ”, Journal of Professional Nursing , Vol. 31 No. 2 , pp. 149 - 157 , doi: 10.1016/j.profnurs.2014.07.001 .

Clarke , L.W. and Whitney , E. ( 2009 ), “ Walking in their shoes: using multiple-perspectives texts as a bridge to critical literacy ”, The Reading Teacher , Vol. 62 No. 6 , pp. 530 - 534 , doi: 10.1598/RT.62.6.7 .

Dykstra , D. ( 2008 ), “ Integrating critical thinking and memorandum writing into course curriculum using the internet as a research tool ”, College Student Journal , Vol. 42 No. 3 , pp. 920 - 929 , doi: 10.1007/s10551-010-0477-2 .

Ebright , P. , Urden , L. , Patterson , E. and Chalko , B. ( 2004 ), “ Themes surrounding novice nurse near-miss and adverse-event situations ”, The Journal of Nursing Administration: The Journal of Nursing Administration , Vol. 34 , pp. 531 - 538 , doi: 10.1097/00005110-200411000-00010 .

Ennis , R. ( 2011 ), “ The nature of critical thinking: an outline of critical thinking dispositions and abilities ”, viewed 3 May 2017, available at: https://education.illinois.edu/docs/default-source/faculty-documents/robert-ennis/thenatureofcriticalthinking_51711_000.pdf .

Facione , P.A. ( 1990 ), Critical Thinking: A Statement of Expert Consensus for Purposes of Educational Assessment and Instruction , The California Academic Press , Millbrae .

Facione , N.C. and Facione , P.A. ( 2013 ), The Health Sciences Reasoning Test: Test Manual , The California Academic Press , Millbrae .

Fero , L.J. , Witsberger , C.M. , Wesmiller , S.W. , Zullo , T.G. and Hoffman , L.A. ( 2009 ), “ Critical thinking ability of new graduate and experienced nurses ”, Journal of Advanced Nursing , Vol. 65 No. 1 , pp. 139 - 148 , doi: 10.1111/j.1365-2648.2008.04834.x .

Garvey , P.K. and CNE series ( 2015 ), “ Failure to rescue: the nurse's impact ”, Medsurg Nursing , Vol. 24 No. 3 , pp. 145 - 149 .

Goodare , P. ( 2015 ), “ Literature review: ‘are you ok there?’ The socialization of student and graduate nurses: do we have it right? ”, Australian Journal of Advanced Nursing , Vol. 33 No. 1 , pp. 38 - 43 .

Graneheim , U.H. and Lundman , B. ( 2004 ), “ Qualitative content analysis in nursing research: concepts, procedures, and measures to achieve trustworthiness ”, Nurse Education Today , Vol. 24 No. 2 , pp. 105 - 112 , doi: 10.1016/j.nedt.2003.10.001 .

Hsu , L. and Hsieh , S. ( 2013 ), “ Factors affecting metacognition of undergraduate nursing students in a blended learning environment ”, International Journal of Nursing Practice , Vol. 20 No. 3 , pp. 233 - 241 , doi: 10.1111/ijn.12131 .

Ignatavicius , D. ( 2001 ), “ Six critical thinking skills for at-the-bedside success ”, Dimensions of Critical Care Nursing , Vol. 20 No. 2 , pp. 30 - 33 .

Institute of Medicine ( 2001 ), Crossing the Quality Chasm: A New Health System for the 21st Century , National Academy Press , Washington .

James , J. ( 2013 ), “ A new, evidence-based estimate of patient harms associated with hospital care ”, Journal of Patient Safety , Vol. 9 No. 3 , pp. 122 - 128 , doi: 10.1097/PTS.0b013e3182948a69 .

Jones , J.H. ( 2010 ), “ Developing critical thinking in the perioperative environment ”, AORN Journal , Vol. 91 No. 2 , pp. 248 - 256 , doi: 10.1016/j.aorn.2009.09.025 .

Kaplan Nursing ( 2012 ), Kaplan Nursing Integrated Testing Program Faculty Manual , Kaplan Nursing , New York, NY .

Kim , J.S. , Gu , M.O. and Chang , H.K. ( 2019 ), “ Effects of an evidence-based practice education program using multifaceted interventions: a quasi-experimental study with undergraduate nursing students ”, BMC Medical Education , Vol. 19 , doi: 10.1186/s12909-019-1501-6 .

Longton , S. ( 2014 ), “ Utilizing evidence-based practice for patient safety ”, Nephrology Nursing Journal , Vol. 41 No. 4 , pp. 343 - 344 .

McCausland , L.L. and Meyers , C.C. ( 2013 ), “ An interactive skills fair to prepare undergraduate nursing students for clinical experience ”, Nursing Education Perspectives , Vol. 34 No. 6 , pp. 419 - 420 , doi: 10.5480/1536-5026-34.6.419 .

McMullen , M.A. and McMullen , W.F. ( 2009 ), “ Examining patterns of change in the critical thinking skills of graduate nursing students ”, Journal of Nursing Education , Vol. 48 No. 6 , pp. 310 - 318 , doi: 10.3928/01484834-20090515-03 .

Moore , Z.E. ( 2007 ), “ Critical thinking and the evidence-based practice of sport psychology ”, Journal of Clinical Sport Psychology , Vol. 1 , pp. 9 - 22 , doi: 10.1123/jcsp.1.1.9 .

Nadelson , S. and Nadelson , L.S. ( 2014 ), “ Evidence-based practice article reviews using CASP tools: a method for teaching EBP ”, Worldviews on Evidence-Based Nursing , Vol. 11 No. 5 , pp. 344 - 346 , doi: 10.1111/wvn.12059 .

Newton , S.E. and Moore , G. ( 2013 ), “ Critical thinking skills of basic baccalaureate and accelerated second-degree nursing students ”, Nursing Education Perspectives , Vol. 34 No. 3 , pp. 154 - 158 , doi: 10.5480/1536-5026-34.3.154 .

Nibert , A. ( 2011 ), “ Nursing education and practice: bridging the gap ”, Advance Healthcare Network , viewed 3 May 2017, available at: https://www.elitecme.com/resource-center/nursing/nursing-education-practice-bridging-the-gap/ .

Oermann , M.H. , Kardong-Edgren , S. , Odom-Maryon , T. , Hallmark , B.F. , Hurd , D. , Rogers , N. and Smart , D.A. ( 2011 ), “ Deliberate practice of motor skills in nursing education: CPR as exemplar ”, Nursing Education Perspectives , Vol. 32 No. 5 , pp. 311 - 315 , doi: 10.5480/1536-5026-32.5.311 .

Papathanasiou , I.V. , Kleisiaris , C.F. , Fradelos , E.C. , Kakou , K. and Kourkouta , L. ( 2014 ), “ Critical thinking: the development of an essential skill for nursing students ”, Acta Informatica Medica , Vol. 22 No. 4 , pp. 283 - 286 , doi: 10.5455/aim.2014.22.283-286 .

Park , M.Y. , Conway , J. and McMillan , M. ( 2016 ), “ Enhancing critical thinking through simulation ”, Journal of Problem-Based Learning , Vol. 3 No. 1 , pp. 31 - 40 , doi: 10.24313/jpbl.2016.3.1.31 .

Paul , R. ( 1993 ), Critical Thinking: How to Prepare Students for a Rapidly Changing World , The Foundation for Critical Thinking , Santa Rosa .

Paul , R. and Elder , L. ( 2008 ), “ Critical thinking: the art of socratic questioning, part III ”, Journal of Developmental Education , Vol. 31 No. 3 , pp. 34 - 35 .

Paul , R. and Elder , L. ( 2012 ), Critical Thinking: Tools for Taking Charge of Your Learning and Your Life , 3rd ed. , Pearson/Prentice Hall , Boston .

Profetto-McGrath , J. ( 2005 ), “ Critical thinking and evidence-based practice ”, Journal of Professional Nursing , Vol. 21 No. 6 , pp. 364 - 371 , doi: 10.1016/j.profnurs.2005.10.002 .

Rahman , A. and Applebaum , R. ( 2011 ), “ What's all this about evidence-based practice? The roots, the controversies, and why it matters ”, American Society on Aging , viewed 3 May 2017, available at: https://www.asaging.org/blog/whats-all-about-evidence-based-practice-roots-controversies-and-why-it-matters .

Rieger , K. , Chernomas , W. , McMillan , D. , Morin , F. and Demczuk , L. ( 2015 ), “ The effectiveness and experience of arts‐based pedagogy among undergraduate nursing students: a comprehensive systematic review protocol ”, JBI Database of Systematic Reviews and Implementation Reports , Vol. 13 No. 2 , pp. 101 - 124 , doi: 10.11124/jbisrir-2015-1891 .

Robert , R.R. and Petersen , S. ( 2013 ), “ Critical thinking at the bedside: providing safe passage to patients ”, Medsurg Nursing , Vol. 22 No. 2 , pp. 85 - 118 .

Roberts , S.T. , Vignato , J.A. , Moore , J.L. and Madden , C.A. ( 2009 ), “ Promoting skill building and confidence in freshman nursing students with a skills-a-thon ”, Educational Innovations , Vol. 48 No. 8 , pp. 460 - 464 , doi: 10.3928/01484834-20090518-05 .

Romeo , E. ( 2010 ), “ Quantitative research on critical thinking and predicting nursing students' NCLEX-RN performance ”, Journal of Nursing Education , Vol. 49 No. 7 , pp. 378 - 386 , doi: 10.3928/01484834-20100331-05 .

Sackett , D. , Rosenberg , W. , Gray , J. , Haynes , R. and Richardson , W. ( 1996 ), “ Evidence-based medicine: what it is and what it isn't ”, British Medical Journal , Vol. 312 No. 7023 , pp. 71 - 72 , doi: 10.1136/bmj.312.7023.71 .

Saintsing , D. , Gibson , L.M. and Pennington , A.W. ( 2011 ), “ The novice nurse and clinical decision-making: how to avoid errors ”, Journal of Nursing Management , Vol. 19 No. 3 , pp. 354 - 359 .

Saldana , J. ( 2009 ), The Coding Manual for Qualitative Researchers , Sage , Los Angeles .

Scheffer , B. and Rubenfeld , M. ( 2000 ), “ A consensus statement on critical thinking in nursing ”, Journal of Nursing Education , Vol. 39 No. 8 , pp. 352 - 359 .

Stanley , M.C. and Dougherty , J.P. ( 2010 ), “ Nursing education model. A paradigm shift in nursing education: a new model ”, Nursing Education Perspectives , Vol. 31 No. 6 , pp. 378 - 380 , doi: 10.1043/1536-5026-31.6.378 .

Swing , V.K. ( 2014 ), “ Early identification of transformation in the proficiency level of critical thinking skills (CTS) for the first-semester associate degree nursing (ADN) student ”, doctoral thesis , Capella University , Minneapolis , viewed 3 May 2017, ProQuest Dissertations & Theses database .

Turner , P. ( 2005 ), “ Critical thinking in nursing education and practice as defined in the literature ”, Nursing Education Perspectives , Vol. 26 No. 5 , pp. 272 - 277 .

Twibell , R. , St Pierre , J. , Johnson , D. , Barton , D. , Davis , C. and Kidd , M. ( 2012 ), “ Tripping over the welcome mat: why new nurses don't stay and what the evidence says we can do about it ”, American Nurse Today , Vol. 7 No. 6 , pp. 1 - 10 .

Watson , G. and Glaser , E.M. ( 1980 ), Watson Glaser Critical Thinking Appraisal , Psychological Corporation , San Antonio .

Wittmann-Price , R.A. ( 2013 ), “ Facilitating learning in the classroom setting ”, in Wittmann-Price , R.A. , Godshall , M. and Wilson , L. (Eds), Certified Nurse Educator (CNE) Review Manual , Springer Publishing , New York, NY , pp. 19 - 70 .


Winnipeg Free Press, April 13, 2024

Critical thinking for democracy


In an era of rapidly changing technology, the ways in which we engage with media and acquire information continually evolve. Being informed about political updates is no longer limited to local radio, newspapers, and television. The emergence of the internet and cellular devices has revolutionized how we become informed of local and global developments.


There are myriad benefits to these technological advancements, including exposure to pluralistic voices, diverse and contrasting perspectives on contentious topics, and speedy access to new information. Despite these benefits, however, digital media also raises serious concerns.

Topical discussions regarding the proliferation of conspiracy theories, disinformation, and fake news are all relevant in our 21st-century society. Public distrust has only been exacerbated as a former U.S. president has repeatedly proclaimed the free media to be “the enemy of the people.”

There is evidence that our electoral processes, the heart of social democracies, are increasingly subject to interference by adversarial, autocratic governments. Social media audiences are particularly vulnerable to such concerted manipulation efforts and are perhaps at the crux of the disinformation issue.

As social media technologies have become so readily accessible and routinely utilized, we are highly vulnerable to purposeful disinformation by unregulated entities. This is particularly true for younger generations, as reports suggest that youth heavily rely upon social media platforms for their daily news and political updates.

Further, the impact that the rise of artificial intelligence may have on how we consume daily news remains largely unknown. Coupling these developments with escalating distrust in public institutions and electoral processes raises genuine alarm for the vitality of our social democracies.

Now more than ever, acquiring literacy and critical thinking skills is of paramount importance. Distinguishing truth from falsehood can be a difficult task for adults, and the challenge is understandably more pronounced for children and youth. Our public schools must continue to prioritize pedagogies that cultivate critical thinking capacities among young people.

The United States education department concluded in a 2017 national review that approximately 130 million Americans read at or below a sixth-grade level. While this statistic may be shocking, numerous articles published since have examined how low national literacy rates cost the U.S. economy trillions of dollars annually. Literacy rates in Canada are generally higher, but efforts to improve literacy among children and youth are ongoing.

High literacy proficiency may contribute to a strong economy, but, more importantly, being able to read, write, and critically engage with information has countless benefits to individuals, groups, and our collective society.

Not only do students need strong literacy skills to succeed in their future employment, they also deserve a quality education that enhances their humanistic growth and potential. As we learn to read, write, and engage with text, we also learn to distinguish the trustworthy from the unreliable. Literacy skills and critical thinking skills are not quite synonymous, but they are strongly correlated with one another.

As such, in the era of “fake news” and concerted efforts to disseminate disinformation, our public schools need to further prioritize pedagogies to foster critical thinking skills among young people. Although we may be unable to prevent the proliferation of disinformation on social media, we may ameliorate these challenges through educational interventions.

Teachers and parents alike should encourage youth to question the credibility of their sources of information, to analyze authors’ levels of education and/or experience related to the topic, and to consider multiple sources before arriving at conclusions.

We should respect the perspectives of authorities in fields such as medicine, law, ecology, and education. Understandably, all humans are fallible, so we should corroborate one expert’s position with those of others studying the same discipline.

In short, the rise in disinformation is a genuine threat to the social function of our democracy. That critical thinking skills are important may be somewhat axiomatic; however, given growing public distrust of the media, cultivating such skills among youth is timely. We should engage in explicit dialogue with youth about the credibility of internet and social media sources, encourage them to critically interrogate texts, and investigate with them the strengths and limitations of our public institutions. A critically reflective society is conducive not only to the vitality of our democratic infrastructures, but also to our individual and collective humanistic growth.

Jordan Laidlaw is a public school teacher and a Ph.D. candidate in educational administration at the University of Manitoba.


Scientific Thinking and Critical Thinking in Science Education 

Two Distinct but Symbiotically Related Intellectual Processes

  • Open access
  • Published: 05 September 2023


  • Antonio García-Carmona   ORCID: orcid.org/0000-0001-5952-0340 1  


Scientific thinking and critical thinking are two intellectual processes that are considered key in the basic and comprehensive education of citizens. For this reason, their development is also among the main objectives of science education. However, in the literature on the two types of thinking in the context of science education, the two terms are quite frequently used interchangeably to refer to the same cognitive and metacognitive skills, usually leaving unclear what their differences and common aspects are. The present work was therefore aimed at elucidating the differences and relationships between these two types of thinking. The conclusion reached was that, while they differ in the purposes of their application and in some skills or processes, they also share others and are related symbiotically in a metaphorical sense; that is, each one makes sense or develops appropriately when nourished or enriched by the other. Finally, an orientative proposal is presented for the integrated development of the two types of thinking in science classes.


Education is not the learning of facts, but the training of the mind to think. Albert Einstein

1 Introduction

In consulting technical reports, theoretical frameworks, research, and curricular reforms related to science education, one commonly finds appeals to scientific thinking and critical thinking as essential educational processes or objectives. This is confirmed in some studies that include exhaustive reviews of the literature in this regard, such as those of Bailin ( 2002 ), Costa et al. ( 2020 ), and Santos ( 2017 ) on critical thinking, and of Klahr et al. ( 2019 ) and Lehrer and Schauble ( 2006 ) on scientific thinking. However, conceptualizing and differentiating between both types of thinking based on the above-mentioned documents of science education is generally difficult. In many cases, they are referred to without being defined, or they are used interchangeably to represent virtually the same thing. Thus, for example, the document A Framework for K-12 Science Education points out that “Critical thinking is required, whether in developing and refining an idea (an explanation or design) or in conducting an investigation” (National Research Council (NRC), 2012 , p. 46). The same document also refers to scientific thinking when it suggests that basic scientific education should “provide students with opportunities for a range of scientific activities and scientific thinking , including, but not limited to inquiry and investigation, collection and analysis of evidence, logical reasoning, and communication and application of information” (NRC, 2012 , p. 251).

A few years earlier, the report Science Teaching in Schools in Europe: Policies and Research (European Commission/Eurydice, 2006 ) included the dimension “scientific thinking” as part of standardized national science tests in European countries. This dimension consisted of three basic abilities: (i) to solve problems formulated in theoretical terms , (ii) to frame a problem in scientific terms , and (iii) to formulate scientific hypotheses . In contrast, critical thinking was not even mentioned in such a report. However, in subsequent similar reports by the European Commission/Eurydice ( 2011 , 2022 ), there are some references to the fact that the development of critical thinking should be a basic objective of science teaching, although these reports do not define it at any point.

The ENCIENDE report on early-year science education in Spain also includes an explicit allusion to critical thinking among its recommendations: “Providing students with learning tools means helping them to develop critical thinking , to form their own opinions, to distinguish between knowledge founded on the evidence available at a certain moment (evidence which can change) and unfounded beliefs” (Confederation of Scientific Societies in Spain (COSCE), 2011 , p. 62). However, the report makes no explicit mention to scientific thinking. More recently, the document “ Enseñando ciencia con ciencia ” (Teaching science with science) (Couso et al., 2020 ), sponsored by Spain’s Ministry of Education, also addresses critical thinking:

(…) with the teaching approach through guided inquiry students learn scientific content, learn to do science (procedures), learn what science is and how it is built, and this (...) helps to develop critical thinking , that is, to question any statement that is not supported by evidence. (Couso et al., 2020 , p. 54)

On the other hand, in referring to what is practically the same thing, the European report Science Education for Responsible Citizenship speaks of scientific thinking when it establishes that one of the challenges of scientific education should be: “To promote a culture of scientific thinking and inspire citizens to use evidence-based reasoning for decision making” (European Commission, 2015 , p. 14). However, the Pisa 2024 Strategic Vision and Direction for Science report does not mention scientific thinking but does mention critical thinking in noting that “More generally, (students) should be able to recognize the limitations of scientific inquiry and apply critical thinking when engaging with its results” (Organization for Economic Co-operation and Development (OECD), 2020 , p. 9).

The new Spanish science curriculum for basic education (Royal Decree 217/ 2022 ) does make explicit reference to scientific thinking. For example, one of the STEM (Science, Technology, Engineering, and Mathematics) competency descriptors for compulsory secondary education reads:

Use scientific thinking to understand and explain the phenomena that occur around them, trusting in knowledge as a motor for development, asking questions and checking hypotheses through experimentation and inquiry (...) showing a critical attitude about the scope and limitations of science. (p. 41,599)

Furthermore, when developing the curriculum for the subjects of physics and chemistry, the same provision clarifies that “The essence of scientific thinking is to understand what are the reasons for the phenomena that occur in the natural environment to then try to explain them through the appropriate laws of physics and chemistry” (Royal Decree 217/ 2022 , p. 41,659). However, within the science subjects (i.e., Biology and Geology, and Physics and Chemistry), critical thinking is not mentioned as such. Footnote 1 It is only more or less directly alluded to with such expressions as “critical analysis”, “critical assessment”, “critical reflection”, “critical attitude”, and “critical spirit”, with no attempt to conceptualize it as is done with regard to scientific thinking.

The above is just a small sample: the concepts of scientific thinking and critical thinking are differentiated only in some cases, while in others they are presented as interchangeable, with one or the other used indistinctly to refer to the same cognitive/metacognitive processes or practices. In fairness, however, it has to be acknowledged—as said at the beginning—that it is far from easy to conceptualize these two types of thinking (Bailin, 2002 ; Dwyer et al., 2014 ; Ennis, 2018 ; Lehrer & Schauble, 2006 ; Kuhn, 1993 , 1999 ) since they feed back on each other, partially overlap, and share certain features (Cáceres et al., 2020 ; Vázquez-Alonso & Manassero-Mas, 2018 ). Neither is there unanimity in the literature on how to characterize each of them, and rarely have they been analyzed comparatively (e.g., Hyytinen et al., 2019 ). For these reasons, I believed it necessary to address this issue with the present work in order to offer some guidelines for science teachers interested in delving deeper into these two intellectual processes to promote them in their classes.

2 An Attempt to Delimit Scientific Thinking in Science Education

For many years, cognitive science has been interested in studying what scientific thinking is and how it can be taught in order to improve students’ science learning (Klahr et al., 2019 ; Zimmerman & Klahr, 2018 ). To this end, Kuhn et al. propose taking a characterization of science as argument (Kuhn, 1993 ; Kuhn et al., 2008 ). They argue that this is a suitable way of linking the activity of how scientists think with that of students and of the public in general, since science is a social activity which is subject to ongoing debate, in which the construction of arguments plays a key role. Lehrer and Schauble ( 2006 ) link scientific thinking with scientific literacy, paying special attention to the different images of science. According to those authors, these images would guide the development of the said literacy in class. The images of science that Lehrer and Schauble highlight as characterizing scientific thinking are: (i) science-as-logical reasoning (the role of domain-general forms of scientific reasoning, including formal logic, heuristics, and strategies applied in different fields of science), (ii) science-as-theory change (science is subject to permanent revision and change), and (iii) science-as-practice (scientific knowledge and reasoning are components of a larger set of activities that include rules of participation, procedural skills, epistemological knowledge, etc.).

Based on a literature review, Jirout ( 2020 ) defines scientific thinking as an intellectual process whose purpose is the intentional search for information about a phenomenon or facts by formulating questions, checking hypotheses, carrying out observations, recognizing patterns, and making inferences (a detailed description of all these scientific practices or competencies can be found, for example, in NRC, 2012 ; OECD, 2019 ). Therefore, for Jirout, the development of scientific thinking would involve bringing into play the basic science skills/practices common to the inquiry-based approach to learning science (García-Carmona, 2020 ; Harlen, 2014 ). For other authors, scientific thinking would include a whole spectrum of scientific reasoning competencies (Krell et al., 2022 ; Moore, 2019 ; Tytler & Peterson, 2004 ). However, these competencies usually cover the same science skills/practices mentioned above. Indeed, a conceptual overlap between scientific thinking, scientific reasoning, and scientific inquiry is often found in science education goals (Krell et al., 2022 ), although, according to Lehrer and Schauble ( 2006 ), scientific thinking is a broader construct that encompasses the other two.

It could be said that scientific thinking is a particular way of searching for information using science practices Footnote 2 (Klahr et al., 2019 ; Zimmerman & Klahr, 2018 ; Vázquez-Alonso & Manassero-Mas, 2018 ). This intellectual process provides the individual with the ability to evaluate the robustness of evidence for or against a certain idea, in order to explain a phenomenon (Clouse, 2017 ). But the development of scientific thinking also requires metacognitive processes. According to Kuhn ( 2022 ), metacognition is fundamental to the permanent control or revision of what individuals think and know, as well as of what the other individuals with whom they interact think and know, when engaging in scientific practices. In short, scientific thinking demands a good connection between reasoning and metacognition (Kuhn, 2022 ). Footnote 3

From that perspective, Zimmerman and Klahr ( 2018 ) have synthesized a taxonomy categorizing scientific thinking, relating cognitive processes with the corresponding science practices (Table 1 ). It has to be noted that this taxonomy was prepared in line with the categorization of scientific practices proposed in the document A Framework for K-12 Science Education (NRC, 2012 ). This is why one needs to understand that, for example, the cognitive process of elaboration and refinement of hypotheses is not explicitly associated with the scientific practice of hypothesizing but only with the formulation of questions. Indeed, the K-12 Framework document does not establish hypothesis formulation as a basic scientific practice. Lederman et al. ( 2014 ) justify this by arguing that not all scientific research necessarily allows or requires the verification of hypotheses, for example, in cases of exploratory or descriptive research. However, the aforementioned document (NRC, 2012 , p. 50) does refer to hypotheses when describing the practice of developing and using models , appealing to the fact that they facilitate the testing of hypothetical explanations .

In the literature, there are also other interesting taxonomies characterizing scientific thinking for educational purposes. One of them is that of Vázquez-Alonso and Manassero-Mas ( 2018 ) who, instead of science practices, refer to skills associated with scientific thinking . Their characterization basically consists of breaking down in greater detail the content of those science practices that would be related to the different cognitive and metacognitive processes of scientific thinking. Also, unlike Zimmerman and Klahr’s ( 2018 ) proposal, Vázquez-Alonso and Manassero-Mas’s ( 2018 ) explicitly mentions metacognition as one of the aspects of scientific thinking, which they call the meta-process . In my opinion, the latter authors’ proposal, which breaks scientific thinking down into a broader range of skills/practices, may be more conducive to addressing it in science classes, as teachers would have more options to choose from when deciding which components of this intellectual process to address, depending on their teaching interests, the educational needs of their students, and/or the learning objectives pursued. Table 2 presents a characterization adapted from Vázquez-Alonso and Manassero-Mas’s ( 2018 ) proposal for addressing scientific thinking in science education.

3 Contextualization of Critical Thinking in Science Education

Theorization and research about critical thinking also have a long tradition in the field of the psychology of learning (Ennis, 2018 ; Kuhn, 1999 ), and the concept's application extends far beyond science education (Dwyer et al., 2014 ). Indeed, the development of critical thinking is commonly accepted as an essential goal of people’s overall education (Ennis, 2018 ; Hitchcock, 2017 ; Kuhn, 1999 ; Willingham, 2008 ). However, its conceptualization is not simple and there is no unanimous position on it in the literature (Costa et al., 2020 ; Dwyer et al., 2014 ), especially when trying to relate it to scientific thinking. Thus, while Tena-Sánchez and León-Medina ( 2022 ) Footnote 4 and McBain et al. ( 2020 ) consider critical thinking to be the basis of, or to form part of, scientific thinking, Dowd et al. ( 2018 ) understand scientific thinking to be just a subset of critical thinking. Vázquez-Alonso and Manassero-Mas ( 2018 ), however, do not seek to determine whether critical thinking encompasses scientific thinking or vice versa. They consider that the two types of thinking share numerous skills/practices and that the progressive development of one fosters the development of the other in a virtuous circle of improvement. Other authors, such as Schafersman ( 1991 ), even go so far as to say that critical thinking and scientific thinking are the same thing. In addition, some views on the relationship between critical thinking and scientific thinking seem to be context-dependent. For example, Hyytinen et al. ( 2019 ) point out that in the perspective of scientific thinking as a component of critical thinking, the former is often used to designate evidence-based thinking in the sciences, although this view tends to dominate in Europe but not in the USA. Perhaps because of this lack of consensus, the two types of thinking are often confused, overlapping, or conceived as interchangeable in education.

Even with such a lack of a unanimous or consensus vision, there are some interesting theoretical frameworks and definitions for the development of critical thinking in education. One of the most popular definitions of critical thinking is that proposed by The National Council for Excellence in Critical Thinking (1987, cited in Inter-American Teacher Education Network, 2015 , p. 6). This conceives of it as “the intellectually disciplined process of actively and skillfully conceptualizing, applying, analyzing, synthesizing, and/or evaluating information gathered from, or generated by, observation, experience, reflection, reasoning, or communication, as a guide to belief and action”. In other words, critical thinking can be regarded as a reflective and reasonable class of thinking that provides people with the ability to evaluate multiple statements or positions that are defensible and then decide which is the most defensible (Clouse, 2017 ; Ennis, 2018 ). It thus requires, in addition to a basic scientific competency, notions about epistemology (Kuhn, 1999 ) to understand how knowledge is constructed. Similarly, it requires skills for metacognition (Hyytinen et al., 2019 ; Kuhn, 1999 ; Magno, 2010 ) since critical thinking “entails awareness of one’s own thinking and reflection on the thinking of self and others as objects of cognition” (Dean & Kuhn, 2003 , p. 3).

In science education, one of the most suitable scenarios or resources, but not the only one, Footnote 5 to address all these aspects of critical thinking is the analysis of socioscientific issues (SSI) (Taylor et al., 2006 ; Zeidler & Nichols, 2009 ). Without wishing to expand on this here, I will only say that interesting works can be found in the literature analyzing how the discussion of SSIs can favor the development of critical thinking skills (see, e.g., López-Fernández et al., 2022 ; Solbes et al., 2018 ). For example, López-Fernández et al. ( 2022 ) focused their teaching-learning sequence on the following critical thinking skills: information analysis, argumentation, decision making, and communication of decisions. Some authors even add the nature of science (NOS) to this framework (i.e., SSI-NOS-critical thinking); for example, Yacoubian and Khishfe ( 2018 ) do so in order to develop critical thinking, which can also favor the understanding of NOS (Yacoubian, 2020 ). In effect, as I argued in another work on the COVID-19 pandemic as an SSI, in which special emphasis was placed on critical thinking, an informed understanding of how science works would have helped the public understand why scientists were changing their criteria to face the pandemic in the light of new data and its reinterpretations, or why it was not possible to go faster in obtaining an effective and safe medical treatment for the disease (García-Carmona, 2021b ).

In the recent literature, there have also been some proposals intended to characterize critical thinking in the context of science education. Table 3 presents two of these by way of example. As can be seen, both proposals share various components for the development of critical thinking (respect for evidence, critically analyzing/assessing the validity/reliability of information, adoption of independent opinions/decisions, participation, etc.), but that of Blanco et al. ( 2017 ) is more clearly contextualized in science education. Likewise, their proposal includes some additional aspects (or at least does so more explicitly), such as developing epistemological Footnote 6 knowledge of science (vision of science…) and of its interactions with technology, society, and environment (STSA relationships), as well as communication skills. It therefore offers a wider range of options for choosing critical thinking skills/processes to promote in science classes. However, neither proposal refers to metacognitive skills, which are also essential for developing critical thinking (Kuhn, 1999 ).

3.1 Critical thinking vs. scientific thinking in science education: differences and similarities

In accordance with the above, it could be said that scientific thinking is nourished by critical thinking, especially when deciding between several possible interpretations and explanations of the same phenomenon, since this generally takes place in a context of debate in the scientific community (Acevedo-Díaz & García-Carmona, 2017 ). Thus, the scientific attitude that is perhaps most clearly linked to critical thinking is the skepticism with which scientists tend to welcome new ideas (Normand, 2008 ; Sagan, 1987 ; Tena-Sánchez and León-Medina, 2022 ), especially if they are contrary to well-established scientific knowledge (Bell, 2009 ). A good example of this was the OPERA experiment (García-Carmona & Acevedo-Díaz, 2016a ), which initially seemed to find that neutrinos could move faster than light. Such a finding would have invalidated Albert Einstein’s theory of relativity (it was later proved wrong). In response, Nobel laureate in physics Sheldon L. Glashow went so far as to state that:

the result obtained by the OPERA collaboration cannot be correct. If it were, we would have to give up so many things, it would be such a huge sacrifice... But if it is, I am officially announcing it: I will shout to Mother Nature: I’m giving up! And I will give up Physics. (BBVA Foundation, 2011 )

Indeed, scientific thinking is ultimately focused on getting evidence that may support an idea or explanation about a phenomenon, and consequently allow others that are less convincing or precise to be discarded. Therefore, when, with the evidence available, science has more than one equally defensible position with respect to a problem, the investigation is considered inconclusive (Clouse, 2017 ). In certain cases, this gives rise to scientific controversies (Acevedo-Díaz & García-Carmona, 2017 ) which are not always resolved based exclusively on epistemic or rational factors (Elliott & McKaughan, 2014 ; Vallverdú, 2005 ). Hence, it is also necessary to integrate non-epistemic practices into the framework of scientific thinking (García-Carmona, 2021a ; García-Carmona & Acevedo-Díaz, 2018 ), practices that transcend the purely rational or cognitive processes, including, for example, those related to emotional or affective issues (Sinatra & Hofer, 2021 ). From an educational point of view, this suggests that for students to become more authentically immersed in the way of working or thinking scientifically, they should also learn to feel as scientists do when they carry out their work (Davidson et al., 2020 ). Davidson et al. ( 2020 ) call this epistemic affect , and they suggest that it could be approached in science classes by teaching students to manage their frustrations when they fail to achieve the expected results, Footnote 7 or, for example, to moderate their enthusiasm over favorable results in a scientific inquiry by activating a certain skepticism that encourages them to do more testing. And, as mentioned above, for some authors, having a skeptical attitude is one of the actions that best illustrates the application of critical thinking within the framework of scientific thinking (Normand, 2008 ; Sagan, 1987 ; Tena-Sánchez and León-Medina, 2022 ).

On the other hand, critical thinking also draws on many of the skills or practices of scientific thinking, as discussed above. However, in contrast to scientific thinking, the coexistence of two or more defensible ideas is not, in principle, a problem for critical thinking, since its purpose is not so much to invalidate some ideas or explanations with respect to others, but rather to provide individuals with the foundations on which to position themselves with the idea/argument they find most defensible among several that are possible (Ennis, 2018 ). For example, science with its methods has managed to explain the greenhouse effect, the phenomenon of the tides, or the transmission mechanism of the coronavirus. For this, it had to discard other possible explanations as they were less valid in the investigations carried out. These are therefore issues resolved by the scientific community that generate hardly any discussion at the present time. However, taking a position for or against the production of energy in nuclear power plants transcends the scope of scientific thinking, since both positions are, in principle, equally defensible. Indeed, within the scientific community itself there are supporters and detractors of the two positions, based on the same scientific knowledge. Consequently, it is critical thinking, which requires the management of knowledge and scientific skills, a basic understanding of epistemic (rational or cognitive) and non-epistemic (social, ethical/moral, economic, psychological, cultural, ...) aspects of the nature of science, as well as metacognitive skills, that helps individuals forge a personal foundation on which to position themselves in one place or another, or maintain an uncertain, undecided opinion.

In view of the above, one can summarize that scientific thinking and critical thinking are two different intellectual processes in terms of purpose, but are related symbiotically (i.e., one would make no sense without the other, and each feeds on the other) and that, in their performance, they share a fair number of features, actions, or mental skills. According to Cáceres et al. ( 2020 ) and Hyytinen et al. ( 2019 ), the intellectual skills most clearly common to both types of thinking would be searching for relationships between evidence and explanations , as well as investigating and logical thinking to make inferences . To this common space, I would also add skills for metacognition, in accordance with what has been discussed about both types of thinking (Kuhn, 1999 , 2022 ).

In order to compile in a compact way all that has been argued so far, in Table 4 , I present my overview of the relationship between scientific thinking and critical thinking. I would like to point out that I do not intend the compilation to be exhaustive (possibly more elements could be added in the different sections), but rather to represent above all the aspects that distinguish the two types of thinking, the aspects they share, and the mutual enrichment (or symbiosis) between them.

4 A Proposal for the Integrated Development of Critical Thinking and Scientific Thinking in Science Classes

Once the differences, common aspects, and relationships between critical thinking and scientific thinking have been discussed, it would be relevant to establish some type of specific proposal to foster them in science classes. Table 5 includes a possible script to address various skills or processes of both types of thinking in an integrated manner. However, before giving guidance on how such skills/processes could be approached, I would like to clarify that while all of them could be dealt with in the context of a single school activity, I will not do so in this way. First, because I think that it can give the impression that the proposal is only valid if it is applied all at once in a specific learning situation, which can also discourage science teachers from implementing it in class due to lack of time or training to do so. Second, because I think it can be more interesting to conceive of the proposal as a set of thinking skills or actions that can be dealt with throughout the different science contents, selecting only some of them (if so decided), according to the educational needs or characteristics of the learning situation posed in each case. Therefore, in the orientations for each point of the script, or for groupings of these, I will use different examples and/or contexts. Likewise, these orientations, although founded in the literature, should be considered only as possibilities among many others.

Motivation and predisposition to reflect and discuss (point i) demand, on the one hand, that issues be chosen which are attractive to the students. This can be achieved, for example, by asking the students directly what current issues, related to science and its impacts or repercussions, they would like to learn about, and then deciding which issue to focus on (García-Carmona, 2008). Alternatively, the teacher can put forward the issue directly in class, trying to ensure that it is current and present in the media, social networks, etc., or choosing what they think may interest their students based on their teaching experience. In this way, each student is encouraged to feel questioned or concerned as a citizen by the issue to be addressed (García-Carmona, 2008). Also of possible interest is the analysis of contemporary, as yet unresolved socioscientific affairs (Solbes et al., 2018), such as climate change, science and social justice, transgenic foods, homeopathy, and alcohol and drug use in society. Everyday questions that demand a decision can also be investigated, such as "What car should we buy?" (Moreno-Fontiveros et al., 2022) or "How can we prevent the arrival of another pandemic?" (Uskola & Puig, 2023).

On the other hand, it is essential that the discussion of the chosen issue be planned through an instructional process that generates an environment conducive to reflection and debate, with a view to engaging the students' participation. This can be achieved, for example, by setting up a role-play game (Blanco-López et al., 2017), especially if the issue is socioscientific, or by critical and reflective reading of advertisements with scientific content (Campanario et al., 2001) or of science-related news in the daily media (García-Carmona, 2014, 2021a; Guerrero-Márquez & García-Carmona, 2020; Oliveras et al., 2013), etc., for subsequent discussion; all this in a collaborative learning setting and with a clear democratic spirit.

Respect for scientific evidence (point ii) should be an indispensable condition in any analysis and discussion from the prisms of scientific and critical thinking (Erduran, 2021). Although scientific knowledge may be impregnated with subjectivity during its construction and is revisable in the light of new evidence (the tentativeness of scientific knowledge), once it is accepted by the scientific community it is as objective as possible (García-Carmona & Acevedo-Díaz, 2016b). Therefore, promoting trust in and respect for scientific evidence should be one of the primary educational challenges in combating pseudoscientists and science deniers (Díaz & Cabrera, 2022), whose arguments are based on false beliefs and assumptions, anecdotes, and conspiracy theories (Normand, 2008). Nevertheless, promoting respect for scientific evidence is no simple task (Fackler, 2021), since science deniers, for example, consider science unreliable because it is imperfect (McIntyre, 2021). Hence the need to promote a basic understanding of NOS (point iii) as a fundamental pillar for the development of both scientific thinking and critical thinking. A good way to do this is through explicit and reflective discussion of controversies from the history of science (Acevedo-Díaz & García-Carmona, 2017) or of contemporary controversies (García-Carmona, 2021b; García-Carmona & Acevedo-Díaz, 2016a).

Also with respect to point iii of the proposal, basic scientific knowledge must be brought to bear in the development of scientific and critical thinking skills (Willingham, 2008). Without it, it is impossible to develop a minimally serious and convincing argument about the issue being analyzed. For example, if one does not know the transmission mechanism of a certain disease, it will likely be very difficult to understand or justify certain patterns of social behavior in the face of it. In general, possessing appropriate scientific knowledge about the issue in question helps one to make the best interpretation of the available data and evidence on that issue (OECD, 2019).

The search for information from reliable sources, together with its analysis and interpretation (points iv to vi), is an essential practice both in purely scientific contexts (e.g., learning about the behavior of a given physical phenomenon from the literature or through enquiry) and in the application of critical thinking (e.g., when one wishes to take a personal, but informed, position on a particular socioscientific issue). With regard to determining the credibility of information with scientific content on the Internet, Osborne et al. (2022) propose, among other strategies, checking whether the source is free of conflicts of interest, i.e., whether or not it is biased by ideological, political, or economic motives. It should also be checked whether the source and the author(s) of the information are sufficiently reputable.

Regarding the interpretation of data and evidence, several studies have shown the difficulties that students often have with this practice in the context of enquiry activities (e.g., Gobert et al., 2018; Kanari & Millar, 2004; Pols et al., 2021) or when analyzing science news in the press (Norris et al., 2003). Students also have significant difficulties in choosing the most appropriate data to support their arguments in causal analyses (Kuhn & Modrek, 2022). However, it must be recognized that making interpretations or inferences from data is not a simple task, among other reasons because their construction is influenced by multiple factors, both epistemic (prior knowledge, experimental designs, etc.) and non-epistemic (personal expectations, ideology, sociopolitical context, etc.), which means that such interpretations are not always the same for all scientists (García-Carmona, 2021a; García-Carmona & Acevedo-Díaz, 2018). For this reason, this scientific practice constitutes one of the phases or processes that generates the most debate within a scientific community until a consensus is reached. To improve students' practice of making inferences, Kuhn and Lerman (2021) propose activities that help them develop their own epistemological norms for connecting their statements causally with the available evidence.

Point vii refers, on the one hand, to an essential scientific practice: the elaboration of evidence-based scientific explanations, which generally account, in a reasoned way, for the causality, properties, and/or behavior of phenomena (Brigandt, 2016). On the other hand, point vii concerns the practice of argumentation. Unlike scientific explanation, argumentation tries to justify an idea, explanation, or position with the clear purpose of persuading those who defend different ones (Osborne & Patterson, 2011). As noted above, the complexity of most socioscientific issues implies that they have no single valid solution or response. Therefore, the content of the arguments used to defend one position or another is not always based solely on purely rational factors such as data and scientific evidence. Some authors defend the need to also deal with non-epistemic aspects of the nature of science when teaching it (García-Carmona, 2021a; García-Carmona & Acevedo-Díaz, 2018), since many scientific and socioscientific controversies are resolved by factors that go beyond the purely epistemic (Vallverdú, 2005).

To defend an idea or position taken on an issue, it is not enough to have scientific evidence that supports it. It is also essential to have skills for the communication and discussion of ideas (point viii). The history of science shows how the difficulties some scientists had in communicating their ideas led to those ideas not being accepted at the time. A good example for students to become aware of this is the historical case of Semmelweis and puerperal fever (Aragón-Méndez et al., 2019). Its reflective reading makes it possible to conclude that this doctor's proposal that gynecologists disinfect their hands when passing from one parturient to another, to avoid the contagions that provoked the fever, was rejected by the medical community not only for epistemic reasons but also because of the difficulties he had in communicating his idea. The history of science also reveals that some scientific interpretations were imposed over others at certain historical moments owing to the rhetorical skills of their proponents, even though none of the competing explanations could convincingly account for the phenomenon studied. An example is the controversy between Pasteur and Liebig about fermentation (García-Carmona & Acevedo-Díaz, 2017), whose reading and discussion in science class would also be recommended in the context of this critical and scientific thinking skill. With the COVID-19 pandemic, for example, the arguments of some charlatans in the media and on social networks managed to gain a certain influence over the population, even though scientifically they were muddled nonsense (García-Carmona, 2021b). Therefore, the reflective reading of news on current SSIs such as this one also constitutes a good resource for the same educational purpose. In general, according to Spektor-Levy et al. (2009), scientific communication skills should be addressed explicitly in class, in a progressive and continuous manner, including tasks of information seeking, reading, scientific writing, representation of information, and representation of the knowledge acquired.

Finally (point ix), a good scientific/critical thinker must be aware of what they know and of what they have doubts about or do not know, continuously practicing metacognitive exercises to this end (Dean & Kuhn, 2003; Hyytinen et al., 2019; Magno, 2010; Willingham, 2008). At the same time, they must recognize the weaknesses and strengths of their peers' arguments in a debate, in order to be self-critical when necessary, as well as to revise their own ideas and arguments so as to improve and reorient them (self-regulation). I see one of the keys to both scientific and critical thinking as being the capacity or willingness to change one's mind, without this being frowned upon; indeed, quite the opposite, since one assumes that it occurs because the arguments have been enriched and more solidly founded. In other words, scientific and critical thinking do not sit well with arrogance or haughtiness towards the rectification of ideas or opinions.

5 Final Remarks

For decades, scientific thinking and critical thinking have received particular attention from different disciplines such as psychology, philosophy, and pedagogy, as well as from specific areas of the latter such as science education. The two types of thinking represent intellectual processes whose development in students, and in society in general, is considered indispensable for the exercise of responsible citizenship in accord with the demands of today's society (European Commission, 2006, 2015; NRC, 2012; OECD, 2020). As has been shown, however, the task of conceptualizing them is complex, and teaching students to think scientifically and critically is a difficult educational challenge (Willingham, 2008).

Aware of this, and after many years dedicated to science education, I felt the need to organize my ideas regarding these two types of thinking. In consulting the literature, I found that in many publications scientific thinking and critical thinking are presented or perceived as interchangeable or indistinguishable, a conclusion also shared by Hyytinen et al. (2019). Rarely have their differences, relationships, or common features been explicitly studied. I therefore considered this a matter needing to be addressed because, in science education, the development of scientific thinking is an inherent objective; but when critical thinking is added to the learning objectives, more than reasonable doubts arise about when one or the other is being used, or both at the same time. The present work came about motivated by this, with the intention of making a particular contribution, grounded in the relevant literature, to advancing the question raised. It converges in conceiving of scientific thinking and critical thinking as two intellectual processes that overlap and feed into each other in many respects, but that differ with regard to certain cognitive skills and their purposes. Thus, in the case of scientific thinking, the aim is to choose the best possible explanation of a phenomenon based on the available evidence, which involves rejecting alternative explanatory proposals that prove less coherent or convincing. From the perspective of critical thinking, by contrast, the purpose is to choose the most defensible idea/option among others that are also defensible, using both scientific and extra-scientific (i.e., moral, ethical, political, etc.) arguments. With this in mind, I have described a proposal to guide their development in the classroom, integrating them under a conception that I have called, metaphorically, a symbiotic relationship between two modes of thinking.

Critical thinking is mentioned explicitly in other subjects of the curricular provisions, such as Education in Civics and Ethical Values or Geography and History (Royal Decree 217/2022).

García-Carmona ( 2021a ) conceives of them as activities that require the comprehensive application of procedural skills, cognitive and metacognitive processes, and both scientific knowledge and knowledge of the nature of scientific practice .

Kuhn (2022) argues that the relationship between scientific reasoning and metacognition is especially fostered by what she calls inhibitory control, which basically consists of breaking down a whole thought into parts in such a way that attention to some of those parts is inhibited, allowing a focused examination of the intended mental content.

Specifically, Tena-Sánchez and León-Medina (2022) assume that critical thinking is at the basis of the rational or scientific skepticism that leads to questioning any claim lacking empirical support.

As discussed in the introduction, the inquiry-based approach is also considered conducive to addressing critical thinking in science education (Couso et al., 2020 ; NRC, 2012 ).

Epistemic skills should not be confused with epistemological knowledge (García-Carmona, 2021a ). The former refers to skills to construct, evaluate, and use knowledge, and the latter to understanding about the origin, nature, scope, and limits of scientific knowledge.

For this purpose, it can be very useful to address in class, with the help of the history and philosophy of science, that scientists get more wrong than right in their research, and that error is always an opportunity to learn (García-Carmona & Acevedo-Díaz, 2018 ).

Acevedo-Díaz, J. A., & García-Carmona, A. (2017). Controversias en la historia de la ciencia y cultura científica [Controversies in the history of science and scientific culture]. Los Libros de la Catarata.

Aragón-Méndez, M. D. M., Acevedo-Díaz, J. A., & García-Carmona, A. (2019). Prospective biology teachers’ understanding of the nature of science through an analysis of the historical case of Semmelweis and childbed fever. Cultural Studies of Science Education , 14 (3), 525–555. https://doi.org/10.1007/s11422-018-9868-y

Bailin, S. (2002). Critical thinking and science education. Science & Education, 11 (4), 361–375. https://doi.org/10.1023/A:1016042608621

BBVA Foundation (2011). El Nobel de Física Sheldon L. Glashow no cree que los neutrinos viajen más rápido que la luz [Physics Nobel laureate Sheldon L. Glashow does not believe that neutrinos travel faster than light]. https://www.fbbva.es/noticias/nobel-fisica-sheldon-l-glashow-no-cree-los-neutrinos-viajen-mas-rapido-la-luz/ . Accessed 5 February 2023.

Bell, R. L. (2009). Teaching the nature of science: Three critical questions. In Best Practices in Science Education . National Geographic School Publishing.

Blanco-López, A., España-Ramos, E., & Franco-Mariscal, A. J. (2017). Estrategias didácticas para el desarrollo del pensamiento crítico en el aula de ciencias [Teaching strategies for the development of critical thinking in the teaching of science]. Ápice. Revista de Educación Científica, 1 (1), 107–115. https://doi.org/10.17979/arec.2017.1.1.2004

Brigandt, I. (2016). Why the difference between explanation and argument matters to science education. Science & Education, 25 (3-4), 251–275. https://doi.org/10.1007/s11191-016-9826-6

Cáceres, M., Nussbaum, M., & Ortiz, J. (2020). Integrating critical thinking into the classroom: A teacher’s perspective. Thinking Skills and Creativity, 37 , 100674. https://doi.org/10.1016/j.tsc.2020.100674

Campanario, J. M., Moya, A., & Otero, J. (2001). Invocaciones y usos inadecuados de la ciencia en la publicidad [Invocations and misuses of science in advertising]. Enseñanza de las Ciencias, 19 (1), 45–56. https://doi.org/10.5565/rev/ensciencias.4013

Clouse, S. (2017). Scientific thinking is not critical thinking. https://medium.com/extra-extra/scientific-thinking-is-not-critical-thinking-b1ea9ebd8b31

Confederación de Sociedades Científicas de España [COSCE]. (2011). Informe ENCIENDE: Enseñanza de las ciencias en la didáctica escolar para edades tempranas en España [ENCIENDE report: Science education in schools for early ages in Spain]. COSCE.

Costa, S. L. R., Obara, C. E., & Broietti, F. C. D. (2020). Critical thinking in science education publications: the research contexts. International Journal of Development Research, 10 (8), 39438. https://doi.org/10.37118/ijdr.19437.08.2020

Couso, D., Jiménez-Liso, M.R., Refojo, C. & Sacristán, J.A. (coords.) (2020). Enseñando ciencia con ciencia [Teaching science with science]. FECYT & Fundacion Lilly / Penguin Random House

Davidson, S. G., Jaber, L. Z., & Southerland, S. A. (2020). Emotions in the doing of science: Exploring epistemic affect in elementary teachers' science research experiences. Science Education, 104 (6), 1008–1040. https://doi.org/10.1002/sce.21596

Dean, D., & Kuhn, D. (2003). Metacognition and critical thinking. ERIC document. Reproduction No. ED477930 . https://files.eric.ed.gov/fulltext/ED477930.pdf

Díaz, C., & Cabrera, C. (2022). Desinformación científica en España . FECYT/IBERIFIER https://www.fecyt.es/es/publicacion/desinformacion-cientifica-en-espana

Dowd, J. E., Thompson, R. J., Jr., Schiff, L. A., & Reynolds, J. A. (2018). Understanding the complex relationship between critical thinking and science reasoning among undergraduate thesis writers. CBE—Life Sciences Education, 17 (1), ar4. https://doi.org/10.1187/cbe.17-03-0052

Dwyer, C. P., Hogan, M. J., & Stewart, I. (2014). An integrated critical thinking framework for the 21st century. Thinking Skills and Creativity, 12 , 43–52. https://doi.org/10.1016/j.tsc.2013.12.004

Elliott, K. C., & McKaughan, D. J. (2014). Non-epistemic values and the multiple goals of science. Philosophy of Science, 81 (1), 1–21. https://doi.org/10.1086/674345

Ennis, R. H. (2018). Critical thinking across the curriculum: A vision. Topoi, 37 (1), 165–184. https://doi.org/10.1007/s11245-016-9401-4

Erduran, S. (2021). Respect for evidence: Can science education deliver it? Science & Education, 30 (3), 441–444. https://doi.org/10.1007/s11191-021-00245-8

European Commission. (2015). Science education for responsible citizenship . Publications Office https://op.europa.eu/en/publication-detail/-/publication/a1d14fa0-8dbe-11e5-b8b7-01aa75ed71a1

European Commission / Eurydice. (2011). Science education in Europe: National policies, practices and research . Publications Office. https://op.europa.eu/en/publication-detail/-/publication/bae53054-c26c-4c9f-8366-5f95e2187634

European Commission / Eurydice. (2022). Increasing achievement and motivation in mathematics and science learning in schools . Publications Office. https://eurydice.eacea.ec.europa.eu/publications/mathematics-and-science-learning-schools-2022

European Commission/Eurydice. (2006). Science teaching in schools in Europe. Policies and research . Publications Office. https://op.europa.eu/en/publication-detail/-/publication/1dc3df34-acdf-479e-bbbf-c404fa3bee8b

Fackler, A. (2021). When science denial meets epistemic understanding. Science & Education, 30 (3), 445–461. https://doi.org/10.1007/s11191-021-00198-y

García-Carmona, A. (2008). Relaciones CTS en la educación científica básica. II. Investigando los problemas del mundo [STS relationships in basic science education II. Researching the world problems]. Enseñanza de las Ciencias, 26 (3), 389–402. https://doi.org/10.5565/rev/ensciencias.3750

García-Carmona, A. (2014). Naturaleza de la ciencia en noticias científicas de la prensa: Análisis del contenido y potencialidades didácticas [Nature of science in press articles about science: Content analysis and pedagogical potential]. Enseñanza de las Ciencias, 32 (3), 493–509. https://doi.org/10.5565/rev/ensciencias.1307

García-Carmona, A., & Acevedo-Díaz, J. A. (2016a). Learning about the nature of science using newspaper articles with scientific content. Science & Education, 25 (5–6), 523–546. https://doi.org/10.1007/s11191-016-9831-9

García-Carmona, A., & Acevedo-Díaz, J. A. (2016b). Concepciones de estudiantes de profesorado de Educación Primaria sobre la naturaleza de la ciencia: Una evaluación diagnóstica a partir de reflexiones en equipo [Preservice elementary teachers' conceptions of the nature of science: a diagnostic evaluation based on team reflections]. Revista Mexicana de Investigación Educativa, 21 (69), 583–610. https://www.redalyc.org/articulo.oa?id=14045395010

García-Carmona, A., & Acevedo-Díaz, J. A. (2017). Understanding the nature of science through a critical and reflective analysis of the controversy between Pasteur and Liebig on fermentation. Science & Education, 26 (1–2), 65–91. https://doi.org/10.1007/s11191-017-9876-4

García-Carmona, A., & Acevedo-Díaz, J. A. (2018). The nature of scientific practice and science education. Science & Education, 27 (5–6), 435–455. https://doi.org/10.1007/s11191-018-9984-9

García-Carmona, A. (2020). From inquiry-based science education to the approach based on scientific practices. Science & Education, 29 (2), 443–463. https://doi.org/10.1007/s11191-020-00108-8

García-Carmona, A. (2021a). Prácticas no-epistémicas: ampliando la mirada en el enfoque didáctico basado en prácticas científicas [Non-epistemic practices: extending the view in the didactic approach based on scientific practices]. Revista Eureka sobre Enseñanza y Divulgación de las Ciencias, 18 (1), 1108. https://doi.org/10.25267/Rev_Eureka_ensen_divulg_cienc.2021.v18.i1.1108

García-Carmona, A. (2021b). Learning about the nature of science through the critical and reflective reading of news on the COVID-19 pandemic. Cultural Studies of Science Education, 16 (4), 1015–1028. https://doi.org/10.1007/s11422-021-10092-2

Guerrero-Márquez, I., & García-Carmona, A. (2020). La energía y su impacto socioambiental en la prensa digital: temáticas y potencialidades didácticas para una educación CTS [Energy and its socio-environmental impact in the digital press: issues and didactic potentialities for STS education]. Revista Eureka sobre Enseñanza y Divulgación de las Ciencias, 17(3), 3301. https://doi.org/10.25267/Rev_Eureka_ensen_divulg_cienc.2020.v17.i3.3301

Gobert, J. D., Moussavi, R., Li, H., Sao Pedro, M., & Dickler, R. (2018). Real-time scaffolding of students’ online data interpretation during inquiry with Inq-ITS using educational data mining. In M. E. Auer, A. K. M. Azad, A. Edwards, & T. de Jong (Eds.), Cyber-physical laboratories in engineering and science education (pp. 191–217). Springer.

Harlen, W. (2014). Helping children’s development of inquiry skills. Inquiry in Primary Science Education, 1 (1), 5–19. https://ipsejournal.files.wordpress.com/2015/03/3-ipse-volume-1-no-1-wynne-harlen-p-5-19.pdf

Hitchcock, D. (2017). Critical thinking as an educational ideal. In On reasoning and argument (pp. 477–497). Springer.

Hyytinen, H., Toom, A., & Shavelson, R. J. (2019). Enhancing scientific thinking through the development of critical thinking in higher education. In M. Murtonen & K. Balloo (Eds.), Redefining scientific thinking for higher education . Palgrave Macmillan.

Jiménez-Aleixandre, M. P., & Puig, B. (2022). Educating critical citizens to face post-truth: the time is now. In B. Puig & M. P. Jiménez-Aleixandre (Eds.), Critical thinking in biology and environmental education, Contributions from biology education research (pp. 3–19). Springer.

Jirout, J. J. (2020). Supporting early scientific thinking through curiosity. Frontiers in Psychology, 11 , 1717. https://doi.org/10.3389/fpsyg.2020.01717

Kanari, Z., & Millar, R. (2004). Reasoning from data: How students collect and interpret data in science investigations. Journal of Research in Science Teaching, 41 (7), 748–769. https://doi.org/10.1002/tea.20020

Klahr, D., Zimmerman, C., & Matlen, B. J. (2019). Improving students’ scientific thinking. In J. Dunlosky & K. A. Rawson (Eds.), The Cambridge handbook of cognition and education (pp. 67–99). Cambridge University Press.

Krell, M., Vorholzer, A., & Nehring, A. (2022). Scientific reasoning in science education: from global measures to fine-grained descriptions of students’ competencies. Education Sciences, 12 , 97. https://doi.org/10.3390/educsci12020097

Kuhn, D. (1993). Science as argument: Implications for teaching and learning scientific thinking. Science education, 77 (3), 319–337. https://doi.org/10.1002/sce.3730770306

Kuhn, D. (1999). A developmental model of critical thinking. Educational Researcher, 28 (2), 16–46. https://doi.org/10.3102/0013189X028002016

Kuhn, D. (2022). Metacognition matters in many ways. Educational Psychologist, 57 (2), 73–86. https://doi.org/10.1080/00461520.2021.1988603

Kuhn, D., Iordanou, K., Pease, M., & Wirkala, C. (2008). Beyond control of variables: What needs to develop to achieve skilled scientific thinking? Cognitive Development, 23 (4), 435–451. https://doi.org/10.1016/j.cogdev.2008.09.006

Kuhn, D., & Lerman, D. (2021). Yes but: Developing a critical stance toward evidence. International Journal of Science Education, 43 (7), 1036–1053. https://doi.org/10.1080/09500693.2021.1897897

Kuhn, D., & Modrek, A. S. (2022). Choose your evidence: Scientific thinking where it may most count. Science & Education, 31 (1), 21–31. https://doi.org/10.1007/s11191-021-00209-y

Lederman, J. S., Lederman, N. G., Bartos, S. A., Bartels, S. L., Meyer, A. A., & Schwartz, R. S. (2014). Meaningful assessment of learners' understandings about scientific inquiry—The views about scientific inquiry (VASI) questionnaire. Journal of Research in Science Teaching, 51 (1), 65–83. https://doi.org/10.1002/tea.21125

Lehrer, R., & Schauble, L. (2006). Scientific thinking and science literacy. In K. A. Renninger, I. E. Sigel, W. Damon, & R. M. Lerner (Eds.), Handbook of child psychology: Child psychology in practice (pp. 153–196). John Wiley & Sons, Inc.

López-Fernández, M. D. M., González-García, F., & Franco-Mariscal, A. J. (2022). How can socio-scientific issues help develop critical thinking in chemistry education? A reflection on the problem of plastics. Journal of Chemical Education, 99 (10), 3435–3442. https://doi.org/10.1021/acs.jchemed.2c00223

Magno, C. (2010). The role of metacognitive skills in developing critical thinking. Metacognition and Learning, 5 , 137–156. https://doi.org/10.1007/s11409-010-9054-4

McBain, B., Yardy, A., Martin, F., Phelan, L., van Altena, I., McKeowen, J., Pembertond, C., Tosec, H., Fratuse, L., & Bowyer, M. (2020). Teaching science students how to think. International Journal of Innovation in Science and Mathematics Education, 28 (2), 28–35. https://openjournals.library.sydney.edu.au/CAL/article/view/14809/13480

McIntyre, L. (2021). Talking to science deniers and sceptics is not hopeless. Nature, 596 (7871), 165–165. https://doi.org/10.1038/d41586-021-02152-y

Moore, C. (2019). Teaching science thinking. Using scientific reasoning in the classroom . Routledge.

Moreno-Fontiveros, G., Cebrián-Robles, D., Blanco-López, A., & España-Ramos, E. (2022). Decisiones de estudiantes de 14/15 años en una propuesta didáctica sobre la compra de un coche [Fourteen/fifteen-year-old students’ decisions in a teaching proposal on the buying of a car]. Enseñanza de las Ciencias, 40 (1), 199–219. https://doi.org/10.5565/rev/ensciencias.3292

National Research Council [NRC]. (2012). A framework for K-12 science education . National Academies Press.

Inter-American Teacher Education Network [ITEN]. (2015). Critical thinking toolkit . OAS/ITEN.

Normand, M. P. (2008). Science, skepticism, and applied behavior analysis. Behavior Analysis in Practice, 1 (2), 42–49. https://doi.org/10.1007/BF03391727

Norris, S. P., Phillips, L. M., & Korpan, C. A. (2003). University students’ interpretation of media reports of science and its relationship to background knowledge, interest, and reading difficulty. Public Understanding of Science, 12 (2), 123–145. https://doi.org/10.1177/09636625030122001

Oliveras, B., Márquez, C., & Sanmartí, N. (2013). The use of newspaper articles as a tool to develop critical thinking in science classes. International Journal of Science Education, 35 (6), 885–905. https://doi.org/10.1080/09500693.2011.586736

Organisation for Economic Co-operation and Development [OECD]. (2019). PISA 2018. Assessment and Analytical Framework . OECD Publishing. https://doi.org/10.1787/b25efab8-en

Organisation for Economic Co-operation and Development [OECD]. (2020). PISA 2024: Strategic Vision and Direction for Science. https://www.oecd.org/pisa/publications/PISA-2024-Science-Strategic-Vision-Proposal.pdf

Osborne, J., Pimentel, D., Alberts, B., Allchin, D., Barzilai, S., Bergstrom, C., Coffey, J., Donovan, B., Kivinen, K., Kozyreva, A., & Wineburg, S. (2022). Science Education in an Age of Misinformation . Stanford University.

Osborne, J. F., & Patterson, A. (2011). Scientific argument and explanation: A necessary distinction? Science Education, 95 (4), 627–638. https://doi.org/10.1002/sce.20438

Pols, C. F. J., Dekkers, P. J. J. M., & De Vries, M. J. (2021). What do they know? Investigating students’ ability to analyse experimental data in secondary physics education. International Journal of Science Education, 43 (2), 274–297. https://doi.org/10.1080/09500693.2020.1865588

Royal Decree 217/2022. (2022). of 29 March, which establishes the organisation and minimum teaching of Compulsory Secondary Education (Vol. 76 , pp. 41571–41789). Spanish Official State Gazette. https://www.boe.es/eli/es/rd/2022/03/29/217

Sagan, C. (1987). The burden of skepticism. Skeptical Inquirer, 12 (1), 38–46. https://skepticalinquirer.org/1987/10/the-burden-of-skepticism/

Santos, L. F. (2017). The role of critical thinking in science education. Journal of Education and Practice, 8 (20), 160–173. https://eric.ed.gov/?id=ED575667

Schafersman, S. D. (1991). An introduction to critical thinking. https://facultycenter.ischool.syr.edu/wp-content/uploads/2012/02/Critical-Thinking.pdf . Accessed 10 May 2023.

Sinatra, G. M., & Hofer, B. K. (2021). How do emotions and attitudes influence science understanding? In Science denial: why it happens and what to do about it (pp. 142–180). Oxford Academic.

Solbes, J., Torres, N., & Traver, M. (2018). Use of socio-scientific issues in order to improve critical thinking competences. Asia-Pacific Forum on Science Learning & Teaching, 19 (1), 1–22. https://www.eduhk.hk/apfslt/

Spektor-Levy, O., Eylon, B. S., & Scherz, Z. (2009). Teaching scientific communication skills in science studies: Does it make a difference? International Journal of Science and Mathematics Education, 7 (5), 875–903. https://doi.org/10.1007/s10763-009-9150-6

Taylor, P., Lee, S. H., & Tal, T. (2006). Toward socio-scientific participation: changing culture in the science classroom and much more: Setting the stage. Cultural Studies of Science Education, 1 (4), 645–656. https://doi.org/10.1007/s11422-006-9028-7

Tena-Sánchez, J., & León-Medina, F. J. (2022). Y aún más al fondo del “bullshit”: El papel de la falsificación de preferencias en la difusión del oscurantismo en la teoría social y en la sociedad [And even deeper into “bullshit”: The role of preference falsification in the diffusion of obscurantism in social theory and in society]. Scio, 22 , 209–233. https://doi.org/10.46583/scio_2022.22.949

Tytler, R., & Peterson, S. (2004). From “try it and see” to strategic exploration: Characterizing young children's scientific reasoning. Journal of Research in Science Teaching, 41 (1), 94–118. https://doi.org/10.1002/tea.10126

Uskola, A., & Puig, B. (2023). Development of systems and futures thinking skills by primary pre-service teachers for addressing epidemics. Research in Science Education , 1–17. https://doi.org/10.1007/s11165-023-10097-7

Vallverdú, J. (2005). ¿Cómo finalizan las controversias? Un nuevo modelo de análisis: la controvertida historia de la sacarina [How do controversies end? A new model of analysis: the controversial history of saccharin]. Revista Iberoamericana de Ciencia, Tecnología y Sociedad, 2 (5), 19–50. http://www.revistacts.net/wp-content/uploads/2020/01/vol2-nro5-art01.pdf

Vázquez-Alonso, A., & Manassero-Mas, M. A. (2018). Más allá de la comprensión científica: educación científica para desarrollar el pensamiento [Beyond understanding of science: science education for teaching fair thinking]. Revista Electrónica de Enseñanza de las Ciencias, 17 (2), 309–336. http://reec.uvigo.es/volumenes/volumen17/REEC_17_2_02_ex1065.pdf

Willingham, D. T. (2008). Critical thinking: Why is it so hard to teach? Arts Education Policy Review, 109 (4), 21–32. https://doi.org/10.3200/AEPR.109.4.21-32

Yacoubian, H. A. (2020). Teaching nature of science through a critical thinking approach. In W. F. McComas (Ed.), Nature of Science in Science Instruction (pp. 199–212). Springer.

Yacoubian, H. A., & Khishfe, R. (2018). Argumentation, critical thinking, nature of science and socioscientific issues: a dialogue between two researchers. International Journal of Science Education, 40 (7), 796–807. https://doi.org/10.1080/09500693.2018.1449986

Zeidler, D. L., & Nichols, B. H. (2009). Socioscientific issues: Theory and practice. Journal of elementary science education, 21 (2), 49–58. https://doi.org/10.1007/BF03173684

Zimmerman, C., & Klahr, D. (2018). Development of scientific thinking. In J. T. Wixted (Ed.), Stevens’ handbook of experimental psychology and cognitive neuroscience (Vol. 4 , pp. 1–25). John Wiley & Sons, Inc.


Conflict of Interest

The author declares no conflict of interest.

Funding for open access publishing: Universidad de Sevilla/CBUA

Author information

Authors and Affiliations

Departamento de Didáctica de las Ciencias Experimentales y Sociales, Universidad de Sevilla, Seville, Spain

Antonio García-Carmona


Corresponding author

Correspondence to Antonio García-Carmona .

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

García-Carmona, A. Scientific Thinking and Critical Thinking in Science Education. Sci & Educ (2023). https://doi.org/10.1007/s11191-023-00460-5
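The citation above can also be written as a BibTeX entry. The entry key and the expanded journal title (Science & Education) are assumptions; the author, title, year, DOI, and publication date are taken from the metadata on this page:

```bibtex
@article{garcia-carmona-2023-scientific,
  author  = {Garc{\'i}a-Carmona, Antonio},
  title   = {Scientific Thinking and Critical Thinking in Science Education},
  journal = {Science \& Education},
  year    = {2023},
  doi     = {10.1007/s11191-023-00460-5},
  note    = {Published online 05 September 2023}
}
```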


Accepted : 30 July 2023

Published : 05 September 2023



  • Cognitive skills
  • Critical thinking
  • Metacognitive skills
  • Science education
  • Scientific thinking

Computer Science > Artificial Intelligence

Title: A Survey on the Integration of Generative AI for Critical Thinking in Mobile Networks

Abstract: In the near future, mobile networks are expected to broaden their services and coverage to accommodate a larger user base and diverse user needs. Thus, they will increasingly rely on artificial intelligence (AI) to manage network operation and control costs, undertaking complex decision-making roles. This shift will necessitate the application of techniques that incorporate critical thinking abilities, including reasoning and planning. Symbolic AI techniques already facilitate critical thinking based on existing knowledge. Yet, their use in telecommunications is hindered by the high cost of mostly manual curation of this knowledge and high computational complexity of reasoning tasks. At the same time, there is a spurt of innovations in industries such as telecommunications due to Generative AI (GenAI) technologies, operating independently of human-curated knowledge. However, their capacity for critical thinking remains uncertain. This paper aims to address this gap by examining the current status of GenAI algorithms with critical thinking capabilities and investigating their potential applications in telecom networks. Specifically, the aim of this study is to offer an introduction to the potential utilization of GenAI for critical thinking techniques in mobile networks, while also establishing a foundation for future research.
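The abstract's contrast between symbolic AI and GenAI turns on one point: a symbolic reasoner is only as capable as its hand-curated knowledge base. The toy forward-chaining sketch below illustrates both the reasoning capability and the curation burden; all facts, rule names, and the load-balancing scenario are invented for illustration and do not come from the paper.

```python
# Minimal forward chaining: repeatedly apply if-then rules until no new
# facts can be derived. Every fact and rule must be supplied by a human,
# which is the curation cost the abstract refers to.

def forward_chain(facts, rules):
    """Derive the closure of `facts` under `rules` (premises -> conclusion)."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if set(premises) <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

# Hand-curated, telecom-flavored knowledge base (hypothetical).
facts = {"cell_load_high", "neighbor_load_low"}
rules = [
    (["cell_load_high", "neighbor_load_low"], "handover_candidates_exist"),
    (["handover_candidates_exist"], "trigger_load_balancing"),
]

derived = forward_chain(facts, rules)
print("trigger_load_balancing" in derived)  # True
```

The two-rule chain already shows the trade-off: the deduction is sound and explainable, but scaling it to a real network would require curating thousands of such rules by hand.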


Astronomers Just Used A Bizarre New Method To Capture Evidence of Elusive Stellar Winds

The researchers used creative methods to observe the sea of charged particles from three extremely bright stars.


For the first time, astronomers have used X-ray data to observe the sea of charged particles that radiates off a trio of bright stars.

This outward flow, called stellar wind, plays a critical role in both a planet’s habitability and how stars evolve. But astronomers don’t yet fully understand how stellar wind works. In the new study, published Friday in the journal Nature Astronomy, the team of astrophysicists captured evidence of the stellar winds wafting away from three extremely bright stars some 11 to 15 light years away. The signatures they found could help them better understand stellar winds’ critical role in space.


Artist’s rendition of a solar storm hitting Mars and stripping ions from the planet’s upper atmosphere.

The answer is blowin’ in the wind

When Kristina Kislyakova, an astrophysicist at the University of Vienna in Austria and the lead author of the new study, first went looking for ways to capture stellar wind, she started too small: she searched for X-ray emissions around exoplanets to observe non-solar stellar wind.

The idea is that stellar winds make themselves known upon whatever they strike. But the team eventually had to search for bigger objects on the receiving end. “I did a small estimate for a hot Jupiter, and it turned out that, for planets, you would see nothing. The emission would of course be strong, but since they are so far away, the signal wouldn’t be strong enough to detect from the Earth,” she says.


Infrared image of the massive giant star Zeta Ophiuchi and its shockwave as its astrosphere collides with the interstellar medium.

The next step was to explore astrospheres, the especially vivid bubbles of charged gas and magnetic fields that surround stars as they orbit the center of the Milky Way. Charged particles from a star will extract electrons from neutral matter, and this ultimately creates X-ray emissions astronomers can observe. Comets in our Solar System, she says, are bright in X-ray emissions for the same reason.

“We were lucky enough, after many, many trials and errors, because it was quite tricky to work with this data, to search for these faint signals. We found signals from three stars, and we were very surprised,” she says.

The X-ray data gained from the stellar winds of 70 Ophiuchi, Epsilon Eridani, and 61 Cygni is crucial. We cannot send a satellite to a different star to study its charged stellar wind particles, Kislyakova says. The only way we can see the energetic breeze from these stars is through indirect methods or, now, the more direct method showcased in the new work.



COMMENTS

  1. What Is Critical Thinking?

    Critical thinking is important in all disciplines and throughout all stages of the research process. The types of evidence used in the sciences and in the humanities may differ, but critical thinking skills are relevant to both. In academic writing, critical thinking can help you to determine whether a source: Is free from research bias ...

  2. Eight Types of Evidence

    Strengths - Collected by the senses, scientific measurement techniques can carefully and cleverly isolate the information you are seeking. Weaknesses - The same as Personal Experience, scientific measurements can be corrupted by factors you didn't anticipate. 3. Testimonial - The experience or observation of someone else; a witness.

  3. 6.2: Defining Evidence

Types of Evidence. There are five types of evidence critical thinkers can use to support their arguments: precedent evidence, statistical evidence, testimonial evidence, hearsay evidence, and common knowledge evidence. Precedent evidence is an act or event which establishes expectations for future conduct. There are two forms of precedent evidence: legal and personal.

  4. A Crash Course in Critical Thinking

    Neil Browne, author of the seminal Asking the Right Questions: A Guide to Critical Thinking, has been a pioneer in presenting critical thinking as a question-based approach to making sense of the ...

  5. The Role of Evidence Evaluation in Critical Thinking: Fostering

    A central component of such critical thinking is reasoning with and about evidence. In the U.S. context, the Framework for K-12 Science Education (National Research Council, 2012 ) argues that a common feature of science knowledge building across domains is "a commitment to data and evidence as the foundation for developing claims" (p. 26).

  6. Critical thinking

    Critical thinking is the analysis of available facts, evidence, observations, and arguments in order to form a judgement by the application of rational, skeptical, and unbiased analyses and evaluation. The application of critical thinking includes self-directed, self-disciplined, self-monitored, and self-corrective habits of the mind, thus a critical thinker is a person who practices the ...

  7. Critical Thinking

    Critical thinking is a widely accepted educational goal. Its definition is contested, but the competing definitions can be understood as differing conceptions of the same basic concept: careful thinking directed to a goal. ... People seek or interpret evidence in ways that are partial to their existing beliefs and expectations, often ...

  8. Evidenced-Based Thinking for Scientific Thinking

As Hyytinen, Toom, and Shavelson discussed in Chapter 3 of this book, critical thinking can be defined in many ways (Lai, 2011) and involves complex skills to follow reasons and evidence, question information, tolerate new ideas and clarity of thought, and interpret information and perspectives (Pascarella & Terenzini, 2005). It is one important dimension of scientific thinking because with ...

  9. Critical Thinking

    Critical Theory refers to a way of doing philosophy that involves a moral critique of culture. A "critical" theory, in this sense, is a theory that attempts to disprove or discredit a widely held or influential idea or way of thinking in society. Thus, critical race theorists and critical gender theorists offer critiques of traditional ...

  10. PDF Chapter 5 The Role of Evidence Evaluation in Critical Thinking

    The Role of Evidence Evaluation in Critical Thinking: Fostering Epistemic Vigilance Ravit Golan Duncan, Veronica L. Cavera, and Clark A. Chinn 5.1 Introduction: Promoting Reasoning in Epistemically Unfriendly Contexts The current times, with a global pandemic, have brought into focus the dangers of

  11. Critical Thinking, Evidence-Based Practice, and Mental Health

    In this chapter I suggest that values, knowledge, and skills related to critical thinking and their overlap with the philosophy and evolving technology of evidence-based practice (EBP) as described in original sources (Sackett et al. 2000; Gray 2001 a; Guyatt and Rennie 2002) should contribute to an informed dialogue regarding controversial issues in the area of mental health and to honoring ...

  12. Bridging critical thinking and transformative learning: The role of

    It is no longer about statements but about people navigating everyday problems to arrive at solutions. Whether it is the abilities and dispositions involved in understanding arguments, evaluating data, or proportioning beliefs to the available evidence, the tools of critical thinking help individuals proceed from a problem to a solution.

  13. Applying Critical Thinking

    Applying Critical Thinking to Research and Writing. Professors like to use the term critical thinking; in fact, the idea of being critical permeates much of higher education writ large. ... and evaluating if the assumptions used to arrive at a specific conclusion are evidence-based and relevant to addressing the research problem. An assessment ...

  14. PDF The Importance of Critical Thinking in Evidenced-Based Practice

critical thinking might be better understood. A progression of logical questions about a research article will also be offered to show the practical use of critical thinking in evaluating best evidence. I am indebted to Eileen Gambrill (1999) in this chapter for her work on critical thinking. Ways of Knowing: Theory Building Through Observation

  15. Critical Thinking

    The Critical Thinking Assessment Test (CAT) is unique among them in being designed for use by college faculty to help them improve their development of students' critical thinking skills (Haynes et al. 2015; Haynes & Stein 2021). ... selection of the correct answer generally reflected good thinking and selection of an incorrect answer ...

  16. How to Organize and Evaluate Evidence with Critical Thinking

1. Identify your purpose and question. 2. Gather and sort evidence. 3. Analyze and interpret evidence ...

  17. What Are Critical Thinking Skills and Why Are They Important?

    According to the University of the People in California, having critical thinking skills is important because they are [ 1 ]: Universal. Crucial for the economy. Essential for improving language and presentation skills. Very helpful in promoting creativity. Important for self-reflection.

  18. Critical thinking and evidence-based practice

    Critical thinking (CT) is vital to evidence-based nursing practice. Evidence-based practice (EBP) supports nursing care and can contribute positively to patient outcomes across a variety of settings and geographic locations. The nature of EBP, its relevance to nursing, and the skills needed to support it should be required components of ...

  19. Critical Thinking and Problem-Solving

    Critical thinking involves asking questions, defining a problem, examining evidence, analyzing assumptions and biases, avoiding emotional reasoning, avoiding oversimplification, considering other interpretations, and tolerating ambiguity. Dealing with ambiguity is also seen by Strohm & Baukus (1995) as an essential part of critical thinking ...

  20. Critical Thinking and Evidence-Based Practice

    Critical thinking (CT) is vital to evidence-based nursing practice. Evidence-based practice (EBP) supports nursing care and can contribute positively to patient outcomes across a variety of settings and geographic locations. The nature of EBP, its relevance to nursing, and the skills needed to support it should be required components of ...

  21. Evidence-based practice beliefs and implementations: a cross-sectional

    In Jordan, evidence-based knowledge with critical thinking is one of the seven standards for the professional practice of registered nurses that were released by the Jordan Nursing Council . Despite the plethora of studies on undergraduate nursing students' beliefs about EBP and its implementation in everyday clinical practice, this topic has ...

  22. Critical thinking

So the critical thinking aspect of our New Profession Map, on which it is a focus area, is part of the analytics and creating value section in our core knowledge. Much of the evidence-based practice work also sits in that area, and evidence-based practice is about asking good questions, which is a key aspect of critical thinking.

  23. Promoting critical thinking through an evidence-based skills fair

    One type of evidence-based practice that can be used to engage students, promote active learning and develop critical thinking is skills fair intervention ( McCausland and Meyers, 2013; Roberts et al., 2009 ). Skills fair intervention promoted a consistent teaching approach of the psychomotor skills to the novice nurse that decreased anxiety ...

  24. Evidence-based practice for effective decision-making

    Evidence-based practice is an approach for improving decision-making which takes account of the best available evidence and critical thinking. ... Critical thinking: throughout this process, question assumptions and carefully consider where there are gaps in knowledge.

  25. Teaching critical thinking: An evidence-based guide

Teaching critical thinking may boost inventiveness and raise IQ. Richard Herrnstein and his colleagues gave over 400 seventh graders explicit instruction in critical thinking: a program that covered hypothesis testing, basic logic, and the evaluation of complex arguments, inventiveness, decision making, and other topics.

  26. Critical thinking for democracy

    Recognizing critical thinking skills to be important may be somewhat axiomatic. However, cognizant of growing public distrust with the media, cultivating such skills among youth is timely.

  27. Scientific Thinking and Critical Thinking in Science Education

    On the other hand, in referring to what is practically the same thing, the European report Science Education for Responsible Citizenship speaks of scientific thinking when it establishes that one of the challenges of scientific education should be: "To promote a culture of scientific thinking and inspire citizens to use evidence-based reasoning for decision making" (European Commission ...

  28. A Survey on the Integration of Generative AI for Critical Thinking in

    In the near future, mobile networks are expected to broaden their services and coverage to accommodate a larger user base and diverse user needs. Thus, they will increasingly rely on artificial intelligence (AI) to manage network operation and control costs, undertaking complex decision-making roles. This shift will necessitate the application of techniques that incorporate critical thinking ...

  29. Astronomers Just Used A Bizarre New Method To Capture Evidence of

    For the first time, astronomers have used X-ray data to observe the sea of charged particles that radiates off a trio of bright stars. This outward flow, called stellar wind, plays a critical role ...

  30. "Debug Your Thinking" Recommended by MVP Laila Bougria

    In her keynote, Laila navigated the topic of critical thinking by drawing parallels between troubleshooting and debugging issues in our systems. Critical thinking can, in many ways, align with how we debug systems: - We need to slow down and take time to understand the problem we're presented with.