Teaching with a Mountain View


How To Encourage Critical Thinking in Math

By Mary Montero


Critical thinking in math helps students learn to analyze and evaluate math concepts, identify patterns and relationships, and explore different strategies.

Critical thinking is more than just a buzzword. It’s an essential skill that helps students develop problem-solving abilities and make logical connections between different concepts. By encouraging critical thinking in math, students learn to approach problems more thoughtfully: to analyze and evaluate math concepts, identify patterns and relationships, and explore different strategies for finding a solution. Critical thinking also takes a great deal of persistence. Those are critical life skills!

When you think about it, students are typically asked to solve math problems and find the answer. Showing their work is frequently stressed too, which is important, but it isn’t the end goal. Instead, students need to be able to look at math in different ways in order to develop a truly complete understanding of math concepts. Mathematics requires logical reasoning, problem-solving, and abstract thinking.


What Does Critical Thinking in Math Look Like?

When I think about critical thinking in math, I focus on:

  • Solving problems through logical thinking. Students learn how to break down complex problems, analyze the different parts, and understand how they fit together logically.
  • Identifying patterns and making connections. Students learn how to identify patterns across different math concepts, make connections between seemingly unrelated topics, and develop a more in-depth understanding of how math works.
  • Evaluating and comparing solutions. Students learn to evaluate which solution is best for a given problem and identify any flaws in their own or others’ reasoning when looking at different solutions.

Mathematician Posters

These FREE Marvelous Mathematician posters have been a staple in my classroom for the last 8+ years! I first started using a version from MissMathDork and adapted them for my classroom over the years. 


I print, laminate, and add magnetic stickers on the back. At the beginning of the year, I only put one or two up at a time depending on our area of focus. Now, they are all hanging on my board, and I’ll pull out different ones depending on our area of focus. They are so empowering to my mathematicians and help them stay on track!

A Marvelous Mathematician:

  • knows that quicker doesn’t mean better
  • looks for patterns
  • knows mistakes happen and keeps going
  • makes sense of the most important details
  • embraces challenges and works through frustrations
  • uses proper math vocabulary to explain their thinking
  • shows their work and models their thinking
  • discusses solutions and evaluates reasonableness
  • gives context by labeling answers
  • applies mathematical knowledge to similar situations
  • checks for errors (computational and conceptual)

Critical Thinking Math Activities

Here are a few of my favorite critical thinking activities. 

Square Of Numbers

I love to incorporate challenge problems (use Nrich and Openmiddle to get started) because they teach my students so much more than how to solve a math problem. They learn important lessons in teamwork, persistence, resiliency, and growth mindset. We talk about strategies for tackling difficult problems and the importance of not giving up when things get hard.

This square of numbers challenge was a hit!

ALL kids need to experience and learn to embrace challenge. Oftentimes, the kids I see have rarely faced an academic challenge. Things have just come easily to them, so when they don’t, they can lack strategies to help them. In fact, they will often give up before they even get started.

I tell them it’s my job to make sure I’m helping them stretch and grow their brain by giving them challenges. They don’t love it at first, but they eventually do! 

This domino challenge was another one from Nrich. I’m always on the hunt for problems like this! How would you guide students toward an answer?


Fifteen Cards

This is a well-loved math puzzle with my students, and it’s amazing for encouraging students to consider all options when solving a math problem.


We have number cards 1-15 (one of each number) and only seven are laid out. With the given clues, students need to figure out which seven cards should be put out and in what order. My students love these, and after they’ve done a few, they enjoy creating their own, too! Use products, differences, and quotients to increase the challenge.

This is also adapted from Nrich, which is an AMAZING resource for math enrichment!

This is one of my favorite fraction lessons that I’ve done for years! Huge shout out to Meg from The Teacher Studio for this one. I give each child a slip of paper with this figure and they have to silently write their answer and justification. Then I tally up the answers and have students take a side and DEBATE with their reasoning! It’s an AMAZING conversation, and I highly recommend trying it with your students. 

Sometimes we leave it hanging overnight and work on visual models to make some proofs. 


Logic Puzzles

Logic puzzles are always a hit too! You can enrich and extend your math lessons with these ‘Math Mystery’ logic puzzles that are the perfect challenge for 4th, 5th, and 6th grades. The puzzles are skills-based, so they integrate well with almost ANY math lesson. You can use them to supplement instruction or challenge your fast-finishers and gifted students… all while encouraging critical thinking about important math skills!


Three levels are included, so they’re perfect to use for differentiation.

  • Introductory logic puzzles are great for beginners (4th grade and up!)
  • Advanced logic puzzles are great for students needing an extra challenge
  • Extra Advanced logic puzzles are perfect for expert solvers… we dare you to figure these puzzles out! 

Do you have a group of students who are ready for more of a fraction challenge? My well-loved fraction puzzlers are absolutely perfect for fraction enrichment. They’ll motivate your students to excel at even the most challenging tasks! 


Math Projects

Math projects are another way to differentiate while building critical thinking skills. Math projects hold so much learning power with their real-world connections, differentiation options, collaborative learning opportunities, and numerous avenues for cross-curricular learning too.

If you’re new to math projects, I shared my best tips and tricks for using math projects in this blog post. They’re perfect for cumulative review, seasonal practice, centers, early finisher work, and more.


I use both concept-based math projects to focus on specific standards and seasonal math projects that integrate several skills.


Error Analysis

Finally, error analysis is always a challenging way to encourage critical thinking. When we use error analysis, we encourage students to analyze their own mistakes to prevent making the same mistakes in the future.

For my gifted students, I use error analysis tasks as an assessment when they have shown mastery of a unit during other tasks. For students in the regular classroom needing enrichment, I usually have them complete the tasks in a center or with a partner.

For students needing extra support, we complete error analysis in small groups.  We go step-by-step through the concept and they are always able to eventually identify what the error is. It is so empowering to students when they finally figure out the error AND it helps prevent them from making the same error in the future!

My FREE addition error analysis is a good place to start, no matter the grade level. I show them the process of walking through the problem and how best to complete an error analysis task.

When you’re ready for more, this bundle of error analysis tasks contains more than 240 tasks to engage and enrich your students in critical thinking practice.


If you want to dig even deeper, visit this conceptual vs computational error analysis post to learn more about using error analysis in the classroom. 


Related Critical Thinking Posts

  • How to Increase Critical Thinking and Creativity in Your “Spare” Time
  • More Tips to Increase Critical Thinking

Critical thinking is essential for students to develop a deeper understanding of math concepts, problem-solving skills, and a stronger ability to reason logically. When you learn how to encourage critical thinking in math, you’re setting your students up for success not only in more advanced math subjects they’ll encounter, but also in life. 

How do you integrate critical thinking in your classroom? Come share your ideas with us in our FREE Inspired In Upper Elementary Facebook group.


Mary Montero

I’m so glad you are here. I’m a current gifted and talented teacher in a small town in Colorado, and I’ve been in education since 2009. My passion (other than my family and cookies) is for making teachers’ lives easier and classrooms more engaging.



©2023 Teaching With a Mountain View . All Rights Reserved | Designed by Ashley Hughes


6 Tips for Teaching Math Problem-Solving Skills

Solving word problems is tougher than computing with numbers, but elementary teachers can guide students to do the deep thinking involved.


A growing concern among educators is students’ ability to problem-solve, especially with complex, multistep problems. Data shows that students struggle more when solving word problems than they do with computation, so problem-solving should be considered separately from computation. Why?

Consider this. When we’re on the way to a new destination and we plug in our location to a map on our phone, it tells us what lane to be in and takes us around any detours or collisions, sometimes even buzzing our watch to remind us to turn. When I experience this as a driver, I don’t have to do the thinking. I can think about what I’m going to cook for dinner, not paying much attention to my surroundings other than to follow those directions. If I were to be asked to go there again, I wouldn’t be able to remember, and I would again seek help.

If we can switch to giving students strategies that require them to think, instead of giving them too much support throughout the journey to the answer, we may give them the ability to read the map themselves and find several ways to get there.

Here are six ways we can start letting students do this thinking so that they can go through rigorous problem-solving again and again, paving their own way to the solution. 

1. Link problem-solving to reading

When we can remind students that they already have many comprehension skills and strategies they can easily use in math problem-solving, it can ease the anxiety surrounding the math problem. For example, providing them with strategies to practice, such as visualizing, acting out the problem with math tools like counters or base 10 blocks, drawing a quick sketch of the problem, retelling the story in their own words, etc., can really help them to utilize the skills they already have to make the task less daunting.

We can break these skills into specific short lessons so students have a bank of strategies to try on their own. Here's an example of an anchor chart that they can use for visualizing. Breaking up comprehension into specific skills can increase student independence and help teachers to be much more targeted in their problem-solving instruction. This allows students to build confidence and break down the barriers between reading and math to see they already have so many strengths that are transferable to all problems.

2. Avoid boxing students into choosing a specific operation

It can be so tempting to tell students to look for certain words that might mean a certain operation. This might even be thoroughly successful in kindergarten and first grade, but just like when our map tells us where to go, that limits students from becoming deep thinkers. It also expires once they get into the upper grades, where those words could be in a problem multiple times, creating more confusion when students are trying to follow a rule that may not exist in every problem.

We can encourage a variety of ways to solve problems instead of choosing the operation first. In first grade, a problem might say, “Joceline has 13 stuffed animals and Jordan has 17. How many more does Jordan have?” Some students might choose to subtract, but a lot of students might just count to find the amount in between. If we tell them that “how many more” means to subtract, we’re taking the thinking out of the problem altogether, allowing them to go on autopilot without truly solving the problem or using their comprehension skills to visualize it. 
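To see that the two strategies really do agree, here’s a quick sketch of both approaches to the stuffed-animal problem (the code framing is mine, not from the article):

```python
def by_subtraction(smaller, larger):
    # One student subtracts directly: 17 - 13.
    return larger - smaller

def by_counting_up(smaller, larger):
    # Another student counts up from 13 to 17, keeping a tally.
    count = 0
    current = smaller
    while current < larger:
        current += 1
        count += 1
    return count

print(by_subtraction(13, 17))   # 4
print(by_counting_up(13, 17))   # 4
```

Both paths land on 4, which is exactly why forcing one operation takes the thinking out of the problem.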

3. Revisit ‘representation’

The word “representation” can be misleading. It seems like something to do after the process of solving. When students think they have to go straight to solving, they may not realize that they need a step in between to be able to support their understanding of what’s actually happening in the problem first.

Using an anchor chart like one of these ( lower grade , upper grade ) can help students to choose a representation that most closely matches what they’re visualizing in their mind. Once they sketch it out, it can give them a clearer picture of different ways they could solve the problem.

Think about this problem: “Varush went on a trip with his family to his grandmother’s house. It was 710 miles away. On the way there, three people took turns driving. His mom drove 214 miles. His dad drove 358 miles. His older sister drove the rest. How many miles did his sister drive?”

If we were to show this student the anchor chart, they would probably choose a number line or a strip diagram to help them understand what’s happening.

If we tell students they must always draw base 10 blocks in a place value chart, that doesn’t necessarily match the concept of this problem. When we ask students to match our way of thinking, we rob them of critical thinking practice and sometimes confuse them in the process. 
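For what it’s worth, the part-whole reasoning that a strip diagram makes visible for the road-trip problem can be written out in a few lines (my illustration, not the author’s):

```python
# The whole trip is 710 miles, split into three parts.
total_miles = 710
mom = 214
dad = 358

# The strip diagram shows the unknown part as whatever remains
# after the known parts are taken out of the whole.
sister = total_miles - (mom + dad)
print(sister)  # 138
```

The sister drove 138 miles; the representation step is what tells students the unknown is "whole minus known parts" rather than a signal to stack up base 10 blocks.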

4. Give time to process

Sometimes as educators, we can feel rushed to get to everyone and everything that’s required. When solving a complex problem, students need time to just sit with a problem and wrestle with it, maybe even leaving it and coming back to it after a period of time.

This might mean we need to give them fewer problems but go deeper with those problems we give them. We can also speed up processing time when we allow for collaboration and talk time with peers on problem-solving tasks. 

5. Ask questions that let students do the thinking

Questions or prompts during problem-solving should be very open-ended to promote thinking. Telling a student to reread the problem or to think about what tools or resources would help them solve it is a way to get them to try something new but not take over their thinking.

These skills are also transferable across content, and students will be reminded, “Good readers and mathematicians reread.” 

6. Spiral concepts so students frequently use problem-solving skills

When students don’t have to switch gears in between concepts, they’re not truly using deep problem-solving skills. They already kind of know what operation it might be or that it’s something they have at the forefront of their mind from recent learning. Being intentional within their learning stations and assessments about having a variety of rigorous problem-solving skills will refine their critical thinking abilities while building more and more resilience throughout the school year as they retain content learning in the process. 

Problem-solving skills are so abstract, and it can be tough to pinpoint exactly what students need. Sometimes we have to go slow to go fast. Slowing down and helping students have tools when they get stuck and enabling them to be critical thinkers will prepare them for life and allow them multiple ways to get to their own destination.

Wonder Math

How to Improve Problem-Solving Skills: Mathematics and Critical Thinking


In today’s rapidly changing world, problem-solving has become a quintessential skill. When we discuss the topic, it’s natural to ask, “What is problem-solving?” and “How can we enhance this skill, particularly in children?” The discipline of mathematics offers a rich platform to explore these questions. Through math, not only do we delve into numbers and equations, but we also explore how to improve problem-solving skills and how to develop critical thinking skills in math. Let’s embark on this enlightening journey together.

What is Problem-Solving?

At its core, problem-solving involves identifying a challenge and finding a solution. But it’s not always as straightforward as it sounds. So, what is problem-solving? True problem-solving requires a combination of creative thinking and logical reasoning. Mathematics, in many ways, embodies this blend. When a student approaches a math problem, they must discern the issue at hand, consider various methods to tackle it, and then systematically execute their chosen strategy.

But what is problem-solving in a broader context? It’s a life skill. Whether we’re deciding the best route to a destination, determining how to save for a big purchase, or even figuring out how to fix a broken appliance, we’re using problem-solving.

How to Develop Critical Thinking Skills in Math

Critical thinking goes hand in hand with problem-solving. But exactly how to develop critical thinking skills in math might not be immediately obvious. Here are a few strategies:

  • Contextual Learning: Teaching math within a story or real-life scenario makes it relevant. When students see math as a tool to navigate the world around them, they naturally begin to think critically about solutions.
  • Open-ended Questions: Instead of merely seeking the “right” answer, encourage students to explain their thought processes. This nudges them to think deeply about their approach.
  • Group Discussions: Collaborative learning can foster different perspectives, prompting students to consider multiple ways to solve a problem.
  • Challenging Problems: Occasionally introducing problems that are a bit beyond a student’s current skill level can stimulate critical thinking. They will have to stretch their understanding and think outside the box.

What are the Six Basic Steps of the Problem-Solving Process?

Understanding how to improve problem-solving skills often comes down to familiarizing oneself with the systematic approach to challenges. So, what are the six basic steps of the problem-solving process?

  • Identification: Recognize and define the problem.
  • Analysis: Understand the problem’s intricacies and nuances.
  • Generation of Alternatives: Think of different ways to approach the challenge.
  • Decision Making: Choose the most suitable method to address the problem.
  • Implementation: Put the chosen solution into action.
  • Evaluation: Reflect on the solution’s effectiveness and learn from the outcome.
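As a rough illustration (the savings scenario and all the numbers are my own, not Wonder Math’s), the six steps might play out on a small everyday problem like this:

```python
import math

# 1. Identification: define the problem precisely.
goal, saved, per_week = 180, 40, 20   # need $180, have $40, can save $20/week

# 2. Analysis: the gap is what remains after current savings.
gap = goal - saved                     # 140

# 3. Generation of alternatives: list candidate plans.
plans = {"save weekly": per_week, "save weekly plus chores": per_week + 10}

# 4. Decision making: choose the plan that reaches the goal fastest.
weeks = {name: math.ceil(gap / rate) for name, rate in plans.items()}
best = min(weeks, key=weeks.get)

# 5. Implementation: carry out the chosen plan (here, just report it).
print(best, weeks[best])               # save weekly plus chores 5

# 6. Evaluation: check that the plan actually covers the gap.
assert weeks[best] * plans[best] >= gap
```

The point isn’t the code itself but that each step is a distinct decision a student can name and revisit.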

By embedding these steps into mathematical education, we provide students with a structured framework. When they wonder about how to improve problem-solving skills or how to develop critical thinking skills in math, they can revert to this process, refining their approach with each new challenge.

Making Math Fun and Relevant

At Wonder Math, we believe that the key to developing robust problem-solving skills lies in making math enjoyable and pertinent. When students see math not just as numbers on a page but as a captivating story or a real-world problem to be solved, their engagement skyrockets. And with heightened engagement comes enhanced understanding.

As educators and parents, it’s crucial to continuously ask ourselves: how can we demonstrate to our children what problem-solving is? How can we best teach them how to develop critical thinking skills in math? And how can we instill in them an understanding of the six basic steps of the problem-solving process?

The answer, we believe, lies in active learning, contextual teaching, and a genuine passion for the beauty of mathematics.

The Underlying Beauty of Mathematics

Often, people perceive mathematics as a rigid discipline confined to numbers and formulas. However, this is a limited view. Math, in essence, is a language that describes patterns, relationships, and structures. It’s a medium through which we can communicate complex ideas, describe our universe, and solve intricate problems. Understanding this deeper beauty of math can further emphasize how to develop critical thinking skills in math.

Why Mathematics is the Ideal Playground for Problem-Solving

Math provides endless opportunities for problem-solving. From basic arithmetic puzzles to advanced calculus challenges, every math problem offers a chance to hone our problem-solving skills. But why is mathematics so effective in this regard?

  • Structured Challenges: Mathematics presents problems in a structured manner, allowing learners to systematically break them down. This format mimics real-world scenarios where understanding the structure of a challenge can be half the battle.
  • Multiple Approaches: Most math problems can be approached in various ways. This teaches learners flexibility in thinking and the ability to view a single issue from multiple angles.
  • Immediate Feedback: Unlike many real-world problems where solutions might take time to show results, in math, students often get immediate feedback. They can quickly gauge if their approach works or if they need to rethink their strategy.

Enhancing the Learning Environment

To genuinely harness the power of mathematics in developing problem-solving skills, the learning environment plays a crucial role. A student who is afraid of making mistakes will hesitate to try out different approaches, stunting their critical thinking growth.

However, in a nurturing, supportive environment where mistakes are seen as learning opportunities, students thrive. They become more willing to take risks, try unconventional solutions, and learn from missteps. This mindset, where failure is not feared but embraced as a part of the learning journey, is pivotal for developing robust problem-solving skills.

Incorporating Technology

In our digital age, technology offers innovative ways to explore math. Interactive apps and online platforms can provide dynamic problem-solving scenarios, making the process even more engaging. These tools can simulate real-world challenges, allowing students to apply their math skills in diverse contexts, further answering the question of how to improve problem-solving skills.

More than Numbers 

In summary, mathematics is more than just numbers and formulas—it’s a world filled with challenges, patterns, and beauty. By understanding its depth and leveraging its structured nature, we can provide learners with the perfect platform to develop critical thinking and problem-solving skills. The key lies in blending traditional techniques with modern tools, creating a holistic learning environment that fosters growth, curiosity, and a lifelong love for learning.

Join us on this transformative journey at Wonder Math. Let’s make math an adventure, teaching our children not just numbers and equations, but also how to improve problem-solving skills and navigate the world with confidence. Enroll your child today and witness the magic of mathematics unfold before your eyes!

FAQ: Mathematics and Critical Thinking

1. What is problem-solving in the context of mathematics?

Problem-solving in mathematics refers to the process of identifying a mathematical challenge and systematically working through methods and strategies to find a solution.

2. Why is math considered a good avenue for developing problem-solving skills?

Mathematics provides structured challenges and allows for multiple approaches to find solutions. This promotes flexibility in thinking and encourages learners to view problems from various angles.

3. How does contextual learning enhance problem-solving abilities?

By teaching math within a story or real-life scenario, it becomes more relevant for the learner. This helps them see math as a tool to navigate real-world challenges, thereby promoting critical thinking.

4. What are the six basic steps of the problem-solving process in math?

The six steps are: Identification, Analysis, Generation of Alternatives, Decision Making, Implementation, and Evaluation.

5. How can parents support their children in developing mathematical problem-solving skills?

Parents can provide real-life contexts for math problems, encourage open discussions about different methods, and ensure a supportive environment where mistakes are seen as learning opportunities.

6. Are there any tools or apps that can help in enhancing problem-solving skills in math?

Yes, there are various interactive apps and online platforms designed specifically for math learning. These tools provide dynamic problem-solving scenarios and simulate real-world challenges, making the learning process engaging.

7. How does group discussion foster critical thinking in math?

Group discussions allow students to hear different perspectives and approaches to a problem. This can challenge their own understanding and push them to think about alternative methods.

8. Is it necessary to always follow the six steps of the problem-solving process sequentially?

While the six steps provide a structured approach, real-life problem-solving can sometimes be more fluid. It’s beneficial to know the steps, but adaptability and responsiveness to the situation are also crucial.

9. How does Wonder Math incorporate active learning in teaching mathematics?

Wonder Math integrates mathematics within engaging stories and real-world scenarios, making it fun and relevant. This active learning approach ensures that students are not just passive recipients but active participants in the learning process.

10. What if my child finds a math problem too challenging and becomes demotivated?

It’s essential to create a supportive environment where challenges are seen as growth opportunities. Remind them that every problem is a chance to learn, and it’s okay to seek help or approach it differently.


Teaching Mathematical Reasoning: Critical Math Thinking Through Problem-Solving and Modeling

  • Mathematical problem-solving : This approach makes students think conceptually about problems before applying tools they’ve learned.
  • Mathematical modeling : Modeling projects give students experience in weighing several factors against one another and using mathematical knowledge to make decisions.

What is mathematical reasoning? The short answer is that it is reasoning with math, and in a sense, it’s the skill that underlies all other math skills.

I. Mathematical Problem-Solving

An emphasis on open-ended mathematical problem-solving can help develop mathematical reasoning skills and address a problem teachers have long been concerned about: too much “rote” learning in math. 

Too often students spend time in math class memorizing procedures and applying them mindlessly to problems. This leads to errors when students are confronted with unfamiliar problems. It also contributes to a widespread misperception of math as boring and lacking relevance to everyday life. 

On the other hand, attempting to remedy this problem by giving students open-ended problems has its own drawbacks. Without the conceptual and methodological tools to solve these problems, students become frustrated and disengaged. It can end up being an inefficient way to spend class time.

Although learning fundamental math skills like algorithms for adding, subtracting, multiplying, and dividing is absolutely critical for students in the early grades, the deeper mathematical problem-solving skills are the ones we really want students to graduate with. How can we ensure they do?


Evidence suggests that skills in mathematical problem-solving lead to more general improvements in outcomes related to math. They help students acquire a deeper understanding of mathematical reasoning and concepts. 

For instance, the commutative property, which most students learn applies to addition and multiplication problems (changing the order of the operations doesn’t affect the outcome), also applies to other logical and practical situations. A familiarity with some of these situations fosters deeper conceptual understanding, and deeper conceptual understanding leads to better critical thinking.
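A tiny check (my example) makes the point concrete: commutativity holds for addition and multiplication but fails for subtraction, which is part of what makes the property worth noticing rather than memorizing.

```python
a, b = 7, 12

assert a + b == b + a          # addition commutes
assert a * b == b * a          # multiplication commutes
assert a - b != b - a          # subtraction does not: order matters
print(a - b, b - a)            # -5 5
```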

And learning these skills helps students improve outcomes related to critical thinking more generally. For example, students who become skilled in mathematical problem-solving tend to also:

  • Create beneficial habits of mind: persistence, thoroughness, creativity in solution-finding, and improved self-monitoring.
  • Break down hard problems into easier parts, or reframe problems so that they can think about them more clearly.
  • Apply tactics that work well beyond math: visualizing a situation to understand it more clearly, simplifying a problem to address its essence, branching through possibilities, and testing key assumptions with “what if” example cases.
  • Elevate the value of discussion and argumentation over simple appeals to authority.

Small-group mathematical problem solving targets skills that traditional mathematics instruction doesn’t. Instead of just finding a match between an algorithm and a question, students must: adapt or create an algorithm; evaluate and debate the merits of different solution paths; and verify their solution through additional evidence.

Small-group mathematical problem solving targets skills that traditional mathematics instruction doesn’t.

Lesson Plan Outline

An example that might be appropriate for fifth grade is something like the following: A farmer has some pigs and some chickens. He finds that together they have 70 heads and 200 legs. How many pigs and how many chickens does he have?

Divide the class into student groups of three to four. Have students spend a few minutes reading over the problem individually. Then let student groups discuss possible solution paths while the teacher walks around the classroom, monitoring the groups. Finally, the teacher leads a whole-class discussion about the problem, asking questions like:

  • So how did you go about thinking about the problem?
  • And what was the answer you got?
  • Show us how you got your answer and why you think it’s right. This might mean that a student goes up to the board to illustrate something if a verbal explanation is inadequate.
  • Does anyone else have a different way of thinking about the problem? If there are other ways of solving the problem that students didn’t come up with, teachers can introduce these other ways themselves.

This process continues until the class has thoroughly explored the problem space, revealing multiple solution paths and exploring variations on the problem or contrasting problem-types.

Of course, the usefulness of a question like this depends on what students already know. If students don’t already know that chickens have two legs and pigs have four, they’re just going to be confused by the problem (and the explanation of the solution). It also requires some other basic skills: for instance, knowing that if one chicken has two legs, four chickens would have eight.

As a way of evaluating student growth, teachers could also include some of these open-ended problems in homework assignments or for extra credit.
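One way to check the farmer problem’s answer is a brute-force search over the possible pig counts. This sketch (the function name and structure are illustrative, not part of the lesson plan) mirrors the guess-and-check strategy many students will use:

```python
def solve(heads, legs):
    """Brute-force search: try every possible number of pigs."""
    for pigs in range(heads + 1):
        chickens = heads - pigs
        if 4 * pigs + 2 * chickens == legs:  # pigs have 4 legs, chickens have 2
            return pigs, chickens
    return None

pigs, chickens = solve(70, 200)
print(pigs, chickens)  # 30 pigs and 40 chickens: 4*30 + 2*40 = 200 legs
```

The same answer falls out algebraically: with p + c = 70 and 4p + 2c = 200, subtracting twice the first equation from the second gives 2p = 60, so p = 30 and c = 40.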

Developing Math Problem-Solving Skills

Teachers should keep in mind the following as they bring mathematical problem-solving activities into their classrooms:

  • Problem selection . Teachers have to select grade-appropriate problems. A question like “John is taller than Mary. Mary is taller than Peter. Who is the shortest of the three children?” may be considered an exercise for older students (that is, a question where the solution steps are already known) but a genuine problem for younger students. It’s also helpful when problems can be extended in various ways. Adding variation and complexity to a problem lets students explore a class of related problems in greater depth.
  • Managing student expectations . Introducing open-ended math problems to students who haven’t experienced them before can be confusing. Students who are used to applying algorithms to problems can be unsure what teachers expect them to do with open-ended problems, because no ready-made algorithm is available.
  • Asking why . Asking students to explain the rationale behind their answer is critical to improving their thinking. Teachers need to make clear that these rationales or justifications are even more important than the answer itself. These justifications give us confidence that an answer is right. That is, if the student can’t justify her answer, it almost doesn’t matter if it’s correct, because there’s no way of verifying it.


II. Mathematical Modeling

Another approach is mathematical modeling. Usually used for students in middle or high school, mathematical modeling brings math tools to bear on real-world problems, keeping students engaged and helping them to develop deeper mathematical reasoning and critical thinking skills.

Math modeling is an extremely common practice in the professional world. Investors model returns and the effects of various events on the market; business owners model revenue and expenses, buying behavior, and more; ecologists model population growth, rainfall, water levels, and soil composition, among many other things. 

But, despite these many applications and the contributions it can make to general mathematical reasoning and critical thinking skills, mathematical modeling is rarely a main component of the math curriculum. Although textbook examples occasionally refer to real-world phenomena, the modeling process is not commonly practiced in the classroom.

Modeling involves engaging students in a big, messy real-world problem. The goals are for students to:

  • refine their understanding of the situation by asking questions and making assumptions,
  • leverage mathematical tools to solve the problem,
  • make their own decisions about how to go about solving the problem,
  • explain whether and how their methods and solutions make sense,
  • and test or revise their solutions if necessary.

Mathematical modeling typically takes place over the course of several class sessions and involves working collaboratively with other students in small groups.

Modeling is not just about getting to a “right” answer — it’s about considering factors beyond mathematics as well.

Modeling also offers the opportunity to integrate other material across the curriculum and to “think mathematically” in several different contexts. Modeling is not just about getting to a “right” answer — it’s about considering factors beyond mathematics as well. For example, students deal with questions like:

  • What is a “fair” split? 
  • What level of risk should someone tolerate?
  • What tradeoffs should a society make?

In other words, students come to see mathematics as the socially indispensable tool that it is, rather than an abstract (and sometimes frustrating) school subject.

Mathematical Modeling and Critical Thinking

Research suggests that the ability to solve abstractly framed academic math problems is not necessarily related to mathematical reasoning more broadly: that is, the ability to use math well in everyday life or to integrate mathematical thinking into one’s decision-making. Students may be able to follow procedures when given certain cues, but unable to reason about underlying concepts. 

It’s also very common to hear complaints from students about math: that they aren’t “ math people ,” that math is irrelevant, or that math is simply boring.

Mathematical modeling is one approach to resolving both these problems. It asks students to move between the concreteness of real — or at least relatively realistic — situations and the abstraction of mathematical models. Well-chosen problems can engage student interest. And the practice emphasizes revision, step-by-step improvement, and tradeoffs over single solution paths and single right-or-wrong answers.


Mathematical modeling often begins with a general question, one that may initially seem only loosely related to mathematics:

  • how to design an efficient elevator system, given certain constraints;
  • what the best gas station is to visit in our local area;
  • how to distinguish between two kinds of flies, given some data about their physical attributes.

Then, over the course of the modeling process, students develop more specific questions or cases, adding constraints or assumptions to simplify the problem. Along the way, students identify the important variables — what’s changing, and what’s not changing? Which variables are playing the biggest role in the desired outcomes?

Students with little experience in modeling can leap too quickly into looking for a generalized solution, before they have developed a feel for the problem. They may also need assistance in developing those specific cases. During this part of the process, it can be easiest to use well-defined values for some variables. These values may then become variables later on.

After students explore some simplifying cases, they work on extensions of these cases to reach ever more general solutions.

A key part of this activity is letting students be creative — students will often come up with unusual or especially innovative solutions.

Throughout the modeling process, the teacher may need to point out missing assumptions or constraints, or offer other ways of reframing the problem. For any given modeling problem, some solutions are usually more obvious than others, which leads to common stages students may reach as they solve the problem. But a key part of this activity is letting students be creative — students will often come up with unusual or especially innovative solutions.

A sample problem, from the Guidelines for Assessment and Instruction in Mathematical Modeling Education is below:

[Sample problem image not reproduced: it asks whether it is worth driving to a more distant gas station with cheaper gas.]

This problem involves variables that aren’t necessarily immediately apparent to students: for instance, the size of the gas tank, and how much gas is purchased per trip. As students work through this specific case, they can consider other hypothetical scenarios to generalize their solution: if the station is 10 miles away, how cheap would the gas have to be to make the trip worth it? What about the time spent in the car — is there a value to put on that?
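A first pass at “mathematizing” the gas-station question might compare the total cost of each option. All the numbers and the fuel-consumption figure below are illustrative assumptions, not values from the guidelines:

```python
def trip_cost(price_per_litre, round_trip_km, litres_bought,
              consumption_l_per_km=0.08):  # assumed fuel use; varies by car
    """Cost of the fill-up plus the fuel burned driving to the station."""
    detour_fuel = round_trip_km * consumption_l_per_km
    return price_per_litre * (litres_bought + detour_fuel)

near = trip_cost(1.80, round_trip_km=0, litres_bought=40)   # station next door
far = trip_cost(1.65, round_trip_km=20, litres_bought=40)   # cheaper but farther
print(f"nearby station: {near:.2f}, cheaper distant station: {far:.2f}")
```

Under these assumptions the distant station still wins, but the model deliberately omits the value of time spent driving — exactly the kind of extension the modeling process invites students to add.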

Many modeling problems can be arbitrarily extended in various directions. Instead of just considering the best gas station to go to for a single car, for instance, students can explore the behavior of a fleet of trucks on set routes or seasonal changes to gas prices.

It’s also possible to include shorter modeling activities, where students work together in pairs or small groups to extend a problem or interpret the meaning of a solution.

These kinds of modeling activities are not reserved solely for older students. One example of a modeling problem for students in elementary school might be something like: what should go in a lunchbox? Students can talk about what kinds of things are important to them for lunch, “mathematize” the problem by counting student preferences or coming up with an equation (e.g., lunch = sandwich + vegetable + dessert + drink); and even explore geometrically how to fit such items into a lunchbox of a certain size.

Teaching Mathematical Modeling: Further Key Factors

Mathematical modeling activities can be challenging for both teachers and students. 

Often, mathematical modeling activities stretch over several class periods. Fitting modeling activities in, especially if standardized tests are focused on mathematical content, can be challenging. One approach is to design modeling activities that support the overall content goals.

The teacher’s role during mathematical modeling is more like a facilitator than a lecturer. Mathematical modeling activities are considerably more open-ended than typical math activities, and require active organization, monitoring, and regrouping by the teacher. Deciding when to let students persevere on a problem for a bit longer and when to stop the class to provide additional guidance is a key skill that only comes with practice.

The teacher’s role during math modeling is more like a facilitator than a lecturer.

Students — especially students who have traditionally been successful in previous math classes — may also experience frustration when encountering modeling activities for the first time. Traditional math problems involve applying the right procedure to a well-defined problem. But expertise at this kind of mathematical reasoning differs markedly from tackling yet-to-be-defined problems with many possible solutions, each of which has tradeoffs and assumptions. Students might feel unprepared or even that they’re being treated unfairly.

Students also have to have some knowledge about the situation to reason mathematically about it. If the question is about elevators, for example, they need to know that elevators in tall buildings might go to different sets of floors; that elevators have a maximum capacity; that elevators occasionally break and need to be repaired. 

Finally, the mathematical question needs to be tailored to students’ experience and interests. Asking a group of students who don’t drive how to efficiently purchase gas won’t garner student interest. Teachers should use their familiarity with their students to find and design compelling modeling projects. This is a chance for both students and teachers to be creative.


Sources and Resources

O’Connell, S. (2000). Introduction to Problem Solving: Strategies for The Elementary Classroom . Heinemann. A handbook for teachers with tips on how to implement small-group problem solving.

Youcubed.org , managed by Jo Boaler.  A community with lots of resources for small-group problem solving instruction.

Yackel, E., Cobb, P., & Wood, T. (1991). Small group interactions as a source of learning opportunities in second-grade mathematics . Journal for research in mathematics education , 390-408. Education research that illustrates how small-group problem solving leads to different kinds of learning opportunities than traditional instruction.

Guidelines for Assessment and Instruction in Mathematical Modeling Education , 2nd ed. (2019). Consortium for Mathematics and its Applications & Society for Industrial and Applied Mathematics.  An extensive guide for teaching mathematical modeling at all grade levels.

Hernández, M. L., Levy, R., Felton-Koestler, M. D., & Zbiek, R. M. (March/April 2017). Mathematical modeling in the high school curriculum . The variable , 2(2). A discussion of the advantages of mathematical modeling at the high school level.


Does mathematics training lead to better logical thinking and reasoning? A cross-sectional assessment from students to professors

Clio Cresswell

1 School of Mathematics and Statistics, The University of Sydney, Sydney, Australia

Craig P. Speelman

2 School of Arts and Humanities, Edith Cowan University, Joondalup, Australia

Associated Data

All relevant data are within the paper and its Supporting Information files.

Mathematics is often promoted as endowing those who study it with transferable skills such as an ability to think logically and critically or to have improved investigative skills, resourcefulness and creativity in problem solving. However, there is scant evidence to back up such claims. This project tested participants with increasing levels of mathematics training on 11 well-studied rational and logical reasoning tasks aggregated from various psychological studies. These tasks, which included the Cognitive Reflection Test and the Wason Selection Task, are of particular interest as they have typically and reliably eluded participants in all studies, and results have been uncorrelated with general intelligence, education levels and other demographic information. The results in this study revealed that in general the greater the mathematics training of the participant, the more tasks were completed correctly, and that performance on some tasks was also associated with performance on others not traditionally associated. A ceiling effect also emerged. The work is deconstructed from the viewpoint of adding to the platform from which to approach the greater, and more scientifically elusive, question: are any skills associated with mathematics training innate or do they arise from skills transfer?

Introduction

Mathematics is often promoted as endowing those who study it with a number of broad thinking skills, such as an ability to think logically, analytically, critically and abstractly, and a capacity to weigh evidence with impartiality. This view of mathematics as providing transferable skills can be found across educational institutions, governments and corporations worldwide, and it is material to the place of mathematics in curricula.

Consider the UK government’s commissioned inquiry into mathematics education, “Making Mathematics Count”, which offers the justification that “mathematical training disciplines the mind, develops logical and critical reasoning, and develops analytical and problem-solving skills to a high degree” [ 1 p11]. The Australian Mathematical Sciences Institute very broadly states in its policy document “Vision for a Maths Nation” that “Not only is mathematics the enabling discipline, it has a vital productive role planning and protecting our well-being” (emphasis in original) [ 2 ]. In Canada, British Columbia’s new 2016 K-9 curriculum expressly mentions as part of its “Goals and Rationale”: “The Mathematics program of study is designed to develop deep mathematical understanding and fluency, logical reasoning, analytical thought, and creative thinking.” [ 3 ]. Universities, too, often make such specific claims with respect to their teaching programs. “Mathematics and statistics will help you to think logically and clearly, and apply a range of problem-solving strategies” is claimed by The School of Mathematical Sciences at Monash University, Australia [ 4 ]. The School of Mathematics and Statistics at The University of Sydney, Australia, directly attributes to particular course objectives and outcomes skills that include “enhance your problem-solving skills” as part of studies in first year [ 5 ], “develop logical thinking” as part of studies in second year (a statement in fact drafted by the lead author) [ 6 ], and “be fluent in analysing and constructing logical arguments” as part of studies in third year [ 7 ]. The University of Cambridge’s Faculty of Mathematics, UK, provides a dedicated document, “Transferable Skills in the Mathematical Tripos”, as part of its undergraduate mathematics course information, which again lists “analytic ability; creativity; initiative; logical and methodical reasoning; persistence” [ 8 ].

In contrast, psychological research, which has been empirically investigating the concept of transferability of skills since the early 1900s, points quite oppositely to reasoning skills as being highly domain specific [ 9 ]. Therefore, support for claims that studying mathematics engenders more than specific mathematics knowledge is highly pertinent. And yet it is largely absent. The 2014 Centre for Curriculum Redesign (CCR) four part paper “Mathematics for the 21st Century: What Should Students Learn?” concludes in its fourth paper titled “Does mathematics education enhance higher-order thinking skills?” with a call to action “… there is not sufficient evidence to conclude that mathematics enhances higher order cognitive functions. The CCR calls for a much stronger cognitive psychology and neuroscience research base to be developed on the effects of studying mathematics” [ 10 ].

Inglis and Simpson [ 11 ], bringing up this very issue, examined the ability of first-year undergraduate students from a high-ranking UK university mathematics department, on the “Four Cards Problem” thinking task, also known as the Wason Selection Task. It is stated as follows.

Each of the following cards has a letter on one side and a number on the other.

[Image of four cards; in the standard version of the task the visible faces are D, K, 3 and 7.]

Here is a rule: “if a card has a D on one side, then it has a 3 on the other”. Your task is to select all those cards, but only those cards, which you would have to turn over in order to find out whether the rule is true or false. Which cards would you select?

This task involves understanding conditional inference, namely understanding the rule “If P then Q” and, with this, deducing the answer “P and not Q”, or “D and 7”. Such logical deduction presents as a good candidate for testing a potential ability of the mathematically trained. The task has also been substantially investigated in the domain of the psychology of reasoning [ 12 p8], revealing across a wide range of publications that only around 10% of the general population reach the correct result. The predominant mistake is to pick “D and 3”; in the original study by Wason [ 13 ] it is suggested that this answer was picked by 65% of people. This poor success rate, along with a standard mistake, has fuelled interest in the task as well as attempts to understand why it occurs. A prevailing theory is the so-named matching bias effect: the effect of disproportionately concentrating on items specifically mentioned in the situation, as opposed to reasoning according to logical rules.
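As an illustrative aside (not part of the original studies), the task’s logic can be enumerated directly: a card needs turning only if some hidden value could falsify the rule. The standard card faces D, K, 3 and 7 are assumed:

```python
letters, numbers = "DK", "37"

def could_falsify(visible):
    """Could the hidden side of this card violate "if D then 3"?"""
    # The hidden side is a number if we see a letter, and vice versa.
    hidden_options = numbers if visible in letters else letters
    for hidden in hidden_options:
        letter = visible if visible in letters else hidden
        number = visible if visible in numbers else hidden
        if letter == "D" and number != "3":  # rule violated
            return True
    return False

must_turn = [card for card in "DK37" if could_falsify(card)]
print(must_turn)  # ['D', '7'] — exactly "P and not Q"
```

The K card and the 3 card drop out because nothing on their hidden sides can break the rule, which is precisely what the matching bias leads participants to overlook.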

Inglis and Simpson’s results distinguished mathematically trained individuals with respect to this task. The participants were under time constraint, and 13% of the first-year undergraduate mathematics students sampled reached the correct response, compared to 4% of the non-mathematics (arts) students who were included. Of note also was that 24% of the mathematics students, as opposed to 45% of the non-mathematics students, chose the standard mistake. The study indeed unveiled that mathematically trained individuals were significantly less affected by the matching bias effect on this problem than individuals without mathematics training. However, the achievement of the mathematically trained group was still far from masterful, and their lower rate of the standard mistake compared with non-mathematically trained people is suggestive. Mathematical training appears to engender a different thinking style, but it remains unclear what the difference is.

Inglis, Simpson and colleagues proceeded to follow up their results with a number of studies concentrated on conditional inference in general [ 14 , 15 ]. A justification for this single investigatory pathway is that if transfer of knowledge is present, something subtle to test for in the first place, a key consideration should be the generalisation of learning rather than the application of skills learned in one context to another (where experimenter bias in the choice of contexts is more likely to be an issue). For this they typically used sixteen “if P then Q” comprehension tasks, and their samples across a number of studies have included 16-year-old pre-university mathematics students (from England and Cyprus), mathematics honours students in their first year of undergraduate university study, third-year university mathematics students, and associated control groups. The studies have encompassed controls for general intelligence and thinking disposition prior to training, as well as follow-ups of up to two years to address the issue of causation. The conclusive thinking pattern that has emerged is a tendency of the mathematical groups towards a greater likelihood of rejecting the invalid denial-of-the-antecedent and affirmation-of-the-consequent inferences. At the same time, and this was validated by a second separate study, the English mathematics group actually became less likely to endorse the valid modus tollens inference. So again, mathematical training appears to engender a different thinking style, but there are subtleties and it remains unclear what the exact difference is.

This project was designed to broaden the search on the notion that mathematics training leads to increased reasoning skills. We focused on a range of reasoning problems considered in psychological research to be particularly insightful into decision making, critical thinking and logical deduction, with their distinction in that the general population generally struggles with answering them correctly. An Australian sample adds diversity to the current enquiries that have been European focussed. Furthermore, in an effort to identify the impact of mathematics training through a possible gradation effect, different levels of mathematically trained individuals were tested for performance.

Well-studied thinking tasks from a variety of psychological studies were chosen. Their descriptions, associated success rates and other pertinent details follow. All were chosen because the correct answer typically eludes participants in favour of a standard mistake.

The three-item Cognitive Reflection Test (CRT) was used as introduced by Frederick [ 16 ]. This test was devised in line with the theory that there are two general types of cognitive activity: one that operates quickly and without reflection, and another that requires not only conscious thought and effort, but also an ability to reflect on one’s own cognition, including a step of suppressing the first type in order to reach the correct answer. The three items in the test invite an incorrect “gut” response, and further cognitive skill is deemed required to reach the correct answer (although see [ 17 ] for evidence that correct responses can result from “intuition”, which could be related to intelligence [ 18 ]).

Lily pads

In a lake, there is a patch of lily pads. Every day, the patch doubles in size. If it takes 48 days for the patch to cover the entire lake, how long would it take for the patch to cover half of the lake?

Widgets

If it takes 5 machines 5 minutes to make 5 widgets, how long would it take 100 machines to make 100 widgets?

Bat and ball

A bat and a ball cost $1.10 in total. The bat costs a dollar more than the ball. How much does the ball cost?

The solutions are: 47 days for the Lily Pads problem, 5 minutes for the Widgets problem and 5 cents for the Bat and Ball problem. The considered intuitive, but wrong, answers are 24 days, 100 minutes and 10 cents, respectively. These wrong answers are attributed to participants becoming so focused on the numbers that they ignore the exponential growth pattern in the Lily Pads problem, merely complete a pattern in numbers in the Widgets problem, and neglect the relationship “more than” in the Bat and Ball problem [ 19 ]. The original study by Frederick [ 16 ] provides a composite measure of performance on these three items, with only 17% of those studied (n = 3428) reaching the perfect score. The CRT has since been studied extensively [ 19 – 21 ]. Research using the CRT tends not to report performance on the individual items of the test, but rather a composite measure of performance. Attridge and Inglis [ 22 ] used the CRT as a test of the thinking disposition of mathematics students, as one way to attempt to disentangle the issue of filtering according to prior thinking styles from the transference of knowledge in successful problem solving. They repeat-tested 16-year-old pre-university mathematics students and English literature students without mathematics subjects at a one-year interval and found no difference between the groups.
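Each CRT answer can be verified with a line of arithmetic (a quick check, not part of the original test):

```python
# Bat and ball: bat = ball + 1.00 and bat + ball = 1.10, so 2 * ball = 0.10.
ball = (1.10 - 1.00) / 2  # 0.05, i.e. 5 cents

# Widgets: each machine makes 1 widget per 5 minutes, so 100 machines
# make 100 widgets in the same 5 minutes.
time_needed = 5 * 100 / 100

# Lily pads: the patch doubles daily, so it covered half the lake one day
# before covering all of it.
half_lake_day = 48 - 1

print(ball, time_needed, half_lake_day)
```

The intuitive answers (10 cents, 100 minutes, 24 days) all fail these checks: a 10-cent ball makes the total $1.20, and a patch covering half the lake on day 24 would cover the whole lake on day 25.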

Three problems were included that test the ability to reason about probability. All three problems were originally discussed by Kahneman and Tversky [ 23 ], with the typically poor performance on these problems explained by participants relying not on probability knowledge, but on a short-cut method of thinking known as the representativeness heuristic. In the late 1980s, Richard Nisbett and colleagues showed that graduate-level training in statistics, while not revealing any improvement in logical reasoning, did correlate with higher-quality statistical answers [ 24 ]. Their studies led in particular to the conclusion that comprehension of what is known as the law of large numbers did show improvement with training. The first of our next three problems targeted this law directly.

Hospitals

A certain town is served by two hospitals. In the larger hospital, about 45 babies are born each day, and in the smaller hospital, about 15 babies are born each day. As you know, about 50 percent of all babies are boys. However, the exact percentage varies from day to day. Sometimes it may be higher than 50 percent, sometimes lower. For a period of one year, each hospital recorded the number of days on which more than 60 percent of the babies born were boys. Which hospital do you think recorded more such days? (Circle one letter.)

  • (a) the larger hospital
  • (b) the smaller hospital
  • (c) about the same (that is, within 5 percent of each other)

Kahneman and Tversky [ 23 ] reported that, of 50 participants, 12 chose (a), 10 chose (b), and 28 chose (c). The correct answer is (b), for the reason that small samples are more likely to exhibit extreme events than large samples from the same population. The larger the sample, the more likely it will exhibit characteristics of the parent population, such as the proportion of boys to girls. However, people tend to discount or be unaware of this feature of sampling statistics, which Kahneman and Tversky refer to as the law of large numbers. Instead, according to Kahneman and Tversky, people tend to adhere to a fallacious law of small numbers, where even small samples are expected to exhibit properties of the parent population, as illustrated by the proportion of participants choosing the answer (c) in their 1972 study. Such thinking reflects use of the representativeness heuristic, whereby someone will judge the likelihood of an uncertain event based on how similar it is to characteristics of the parent population of events.
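The sampling effect behind the hospital problem can be made vivid with a short simulation (an illustrative sketch; the birth counts per day come from the problem, everything else is assumed):

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

def extreme_days(babies_per_day, days=365, threshold=0.60):
    """Count days on which more than `threshold` of the births are boys."""
    count = 0
    for _ in range(days):
        boys = sum(random.random() < 0.5 for _ in range(babies_per_day))
        if boys / babies_per_day > threshold:
            count += 1
    return count

large = extreme_days(45)  # larger hospital
small = extreme_days(15)  # smaller hospital
print(f"larger hospital: {large} days, smaller hospital: {small} days")
```

The smaller hospital reliably records more extreme days, because the proportion of boys in a sample of 15 fluctuates much more than in a sample of 45 — the point of the law of large numbers.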

Birth order

All families of six children in a city were surveyed. In 72 families the exact order of births of boys and girls was GBGBBG.

  • (a) What is your estimate of the number of families surveyed in which the exact order of births was BGBBBB?
  • (b) In the same survey set, which, if any, of the following two sequences would be more likely: BBBGGG or GBBGBG?

All of the events listed in the problem have an equal probability, so the correct answer to (a) is 72, and to (b) is “neither is more likely”. Kahneman and Tversky [ 23 ] reported that 75 of 92 participants judged the sequence in (a) as less likely than the given sequence. A similar number (unspecified by Kahneman and Tversky, but the statistical effect was reported to be of the same order as in (a)) reported that GBBGBG was the more likely sequence. Again, Kahneman and Tversky suggested that these results reflected use of the representativeness heuristic. In the context of this problem, the heuristic would have taken the following form: some birth orders appear less patterned than others, and less patterned is to be associated with the randomness of birth order, making them more likely.

Coin tosses

In a sequence of coin tosses (the coin is fair) which of the following outcomes would be most likely (circle one letter):

  • (a) H T H T H T H T
  • (b) H H H H T T T T
  • (c) T T H H T T H H
  • (d) H T T H T H H T
  • (e) all of the above are equally likely

The correct answer in this problem is (e). Kahneman and Tversky [ 23 ] reported that participants tend to choose less patterned looking sequences (e.g., H T T H T H H T) as more likely than more systematic looking sequences (e.g., H T H T H T H T). This reasoning again reflects the representativeness heuristic.

Three further questions from the literature were included to test problem solving skill.

Two drivers

Two drivers set out on a 100-mile race that is marked off into two 50-mile sections. Driver A travels at exactly 50 miles per hour during the entire race. Driver B travels at exactly 45 mph during the first half of the race (up to the 50-mile marker) and travels at exactly 55 mph during the last half of the race (up to the finish line). Which of the two drivers would win the race? (Circle one letter.)

  • (a) Driver A would win the race
  • (b) Driver B would win the race
  • (c) the two drivers would arrive at the same time (within a few seconds of one another)

This problem was developed by Pelham and Neter [ 25 ]. The correct answer is (a), which can be determined by calculating driving times for each driver, using time = distance/velocity. Pelham and Neter argue, however, that (c) is intuitively appealing, on the basis that both drivers appear to have the same overall average speed. Pelham and Neter reported that 67% of their sample gave this incorrect response to the problem, and a further 13% selected (b).
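The calculation is quick; a sketch using time = distance/velocity (times in hours). The intuition of equal average speed fails because Driver B spends more time at 45 mph than at 55 mph, so B's time-weighted average speed is below 50 mph:

```python
# Driving times via time = distance / speed, in hours.
time_a = 100 / 50             # Driver A: constant 50 mph over 100 miles
time_b = 50 / 45 + 50 / 55    # Driver B: 45 mph then 55 mph over the halves

assert time_a == 2.0
assert time_b > time_a        # B takes ~2.02 hours, so Driver A wins
```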

Petrol station

Imagine that you are driving along the road and you notice that your car is running low on petrol. You see two petrol stations next to each other, both advertising their petrol prices. Station A’s price is 65c/litre; Station B’s price is 60c/litre. Station A’s sign also announces: “5c/litre discount for cash!” Station B’s sign announces “5c/litre surcharge for credit cards.” All other factors being equal (for example, cleanliness of the stations, number of cars waiting at each etc), to which station would you choose to go, and why?

This problem was adapted from one described by Galotti [ 26 ], and is inspired by research reported by Thaler [ 27 ]. According to Thaler’s research, most people prefer Station A, even though both stations are offering the same deal: 60c/litre for cash, and 65c/litre for credit. Tversky and Kahneman [ 28 ] explain this preference by invoking the concept of framing effects. In the context of this problem, such an effect would involve viewing the outcomes as changes from some initial point. The initial point frames the problem, and provides a context for viewing the outcome. Thus, depending on the starting point, outcomes in this problem can be viewed as either a gain (in Station A, you gain a discount if you use cash) or a loss (in Station B, you are charged more (a loss) for using credit). Given that people are apparently more concerned about a loss than a gain [ 29 ], the loss associated with Station B makes it the less attractive option, and hence the preference for Station A. The correct answer, though, is that the stations are offering the same deal and so no station should be preferred.
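The equivalence of the two offers can be made explicit; a minimal sketch (prices in cents per litre):

```python
# Effective per-litre prices under each payment method.
station_a = {"cash": 65 - 5, "credit": 65}   # advertised 65c, 5c cash discount
station_b = {"cash": 60, "credit": 60 + 5}   # advertised 60c, 5c credit surcharge

# Both stations offer the identical deal, so neither should be preferred.
assert station_a == station_b
```

Only the framing differs: Station A presents the cash price as a gain (a discount), Station B presents the credit price as a loss (a surcharge).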

And finally, a question described by Stanovich [ 30 , 31 ] as testing our predisposition for cognitive operations that require the least computational effort.

Jack looking at Anne

Jack is looking at Anne, but Anne is looking at George. Jack is married, but George is not. Is a married person looking at an unmarried person? (Circle one letter.)

  • (a) Yes
  • (b) No
  • (c) Cannot be determined

Stanovich reported that over 80% of people choose the “lazy” answer (c). The correct answer is (a).
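The answer follows from a case split on the one unknown, Anne's marital status; a sketch:

```python
def married_looks_at_unmarried(anne_married):
    """Whether some married person looks at some unmarried person,
    given Anne's (unknown) marital status."""
    married = {"Jack": True, "Anne": anne_married, "George": False}
    looking = [("Jack", "Anne"), ("Anne", "George")]
    return any(married[a] and not married[b] for a, b in looking)

# If Anne is married, she looks at unmarried George; if she is not,
# married Jack looks at her. Either way the answer is "Yes": option (a).
assert all(married_looks_at_unmarried(m) for m in (True, False))
```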

The above questions survey, in a clear problem solving setting: an ability to engage advanced cognitive processing in order to critically evaluate, and possibly override, initial gut reasoning; an ability to reason about probability within the framework of the law of large numbers and the relationship between randomness and patterning; an ability to isolate salient features of a problem; and, with the last question in particular, an ability to map logical relations. It might be hypothesised, in line with the knowledge base provided by mathematics training and the associated claims of broad and enhanced problem-solving abilities in general, that participants with greater degrees of such training would outperform others on these questions. This hypothesis was investigated in this study. In addition, given that no previous study on this issue has examined the variety of problems used here, we also undertook an exploratory analysis to investigate whether there exist any associations between the problems in terms of their likelihood of correct solution. Similarities between problems might indicate which problem solving domains could be susceptible to the effects of mathematics training.

A questionnaire was constructed containing the problems described in the previous sections plus the Four Cards Problem as tested by Inglis and Simpson [ 11 ] for comparison. The order of the problems was as follows: 1) Lily Pads; 2) Hospitals; 3) Widgets; 4) Four Cards; 5) Bat and Ball; 6) Birth Order; 7) Petrol Station; 8) Coin Tosses; 9) Two Drivers; 10) Jack looking at Anne. It was administered to five groups distinctive in mathematics training levels chosen from a high-ranking Australian university, where the teaching year is separated into two teaching semesters and where being a successful university applicant requires having been highly ranked against peers in terms of intellectual achievement:

  • Introductory—First year, second semester university students with weak high school mathematical results, enrolled in the current unit only as a compulsory component of their chosen degree; the unit does not enable any future mathematical pathway. A typical student might be enrolled in a Biology or Geography major;
  • Standard—First year, second semester university students with fair to good high school mathematical results, enrolled in the current mathematics unit as a compulsory component of their chosen degree, with the possibility of including some further mathematical units in their degree pathway. A typical student might be enrolled in an IT or Computer Science major;
  • Advanced1—First year, second semester university mathematics students with a very strong interest in, as well as background in, mathematics; all higher year mathematical units are included as possible future pathways. A typical student might be enrolled in a Mathematics or Physics major;
  • Advanced2—Second year, second semester university mathematics students with a strong interest in, as well as background in, mathematics; typically a direct follow on from the previously mentioned Advanced1 cohort;
  • Academic—Research academics in the mathematical sciences.

Participants

123 first year university students volunteered during “help on demand” tutorial times containing up to 30 students. These are course allocated times that are supervised yet self-directed by students; this minimised disruption and discouraged coercion. 44 second year university students completed the questionnaire during a weekly one-hour time slot dedicated to putting the latest mathematical concepts into practice with the lecturer (where, by contrast with tutorial times, the lecturer does most of the work and all enrolled students are invited). All these university students completed the questionnaire in normal classroom conditions; they were not placed under strict examination conditions. The lead author walked around to prevent discussion and coercion, and there was minimal disruption. 30 research academics responded to local advertising and answered the questionnaire in their workplace while supervised.

The questionnaires were voluntary, anonymous and confidential. Participants were free to withdraw from the study at any time and without any penalty. No participant took this option however. The questionnaires gathered demographic information which included age, level of education attained and current qualification pursued, name of last qualification and years since obtaining it, and an option to note current speciality for research academics. Each problem task was placed on a separate page. Participants were not placed under time constraint, but while supervised, were asked to write their start and finish times on the front page of the survey to note approximate completion times. Speed of completion was not incentivised. Participants were not allowed to use calculators. A final “Comments Page” gave the option for feedback including specifically if the participants had previously seen any of the questions. Questionnaires were administered in person and supervised to avoid collusion or consulting of external sources.

The responses were coded four ways: A) correct; B) standard error (the errors discussed above in The Study); C) other error; D) left blank.

The ethical aspects of the study were approved by the Human Research Ethics Committee of the University of Sydney, protocol number [2016/647].

The first analysis examined the total number of correct responses provided by the participants as a function of group. Scores ranged from 1 to 11 out of a total possible of 11 (Problem 6 had 2 parts) ( Fig 1 ). An ANOVA of this data indicated a significant effect of group (F(4, 192) = 20.426, p < .001, partial η 2 = .299). Pairwise comparisons using Tukey’s HSD test indicated that the Introductory group performed significantly worse than the Advanced1, Advanced2 and Academic groups. There were no significant differences between the Advanced1, Advanced2 and Academic groups.

Fig 1. Error bars are one standard error of the mean.

Overall solution time, while recorded manually and approximately, was positively correlated with group, such that the more training someone had received, the longer their solution times (r(180) = 0.247, p = .001). However, as can be seen in Fig 2 , this relationship is not strong.

Fig 2.

A series of chi-squared analyses, and their Bayesian equivalents, were performed on each problem, to determine whether the distribution of response types differed as a function of group. To minimise the number of cells in which expected values in some of these analyses were less than 5, the Standard Error, Other Error and Blank response categories were collapsed into one category (Incorrect Response). For three of the questions, the expected values of some cells did fall below 5, and this was due to most people getting the problem wrong (Four Cards), or most people correctly responding to the problem (Bat and Ball, Coin Tosses). In these cases, the pattern of results was so clear that a statistical analysis was barely required. Significant chi-squared results were examined further with pairwise posthoc comparisons (see Table 1 ).

Superscripts label the groups (e.g., Introductory = a). Within the table, these letters refer to which other group a particular group was significantly different to according to a series of pairwise post hoc chi squared analyses (Bonferroni corrected α = .005) (e.g., ‘d’ in the Introductory column indicates the Introductory and the Advanced2 (d) group were significantly different for a particular problem).

The four cards problem

The three groups with the least training in mathematics were far less likely than the other groups to give the correct solution (χ 2 (4) = 31.06, p < .001; BF 10 = 45,045) ( Table 1 ). People in the two most advanced groups (Advanced2 and Academic) were more likely to solve the card problem correctly, although still fewer than half of the people in these groups did so. Further, these people were less likely to give the standard incorrect solution, so that most of those who were incorrect suggested some more cognitively elaborate answer, such as turning over all cards. The proportions of people in the Advanced2 and Academic groups (39% and 37%) who solved the problem correctly far exceeded the typical proportion observed with this problem (10%). Of note, also, is the relatively high proportion of those in the higher training groups who, when they made an error, did not make the standard error, a result similar to the one reported by Inglis and Simpson [ 11 ].

The cognitive reflection test

In the Lily Pads problem, although most people in the Standard, Advanced1, Advanced2 and Academic groups were likely to select the correct solution, it was also the case that the less training someone had received in mathematics, the more likely they were to select an incorrect solution (χ 2 (4) = 27.28, p < .001; BF 10 = 15,554), with the standard incorrect answer being the next most prevalent response for the two lower ability mathematics groups ( Table 1 ).

Performance on the Widgets problem was similar to performance on the Lily Pads problem in that most people in the Standard, Advanced1, Advanced2 and Academic groups were likely to select the correct solution, but the less training someone had received in mathematics, the more likely they were to select an incorrect solution (χ 2 (4) = 23.76, p < .001; BF 10 = 516) ( Table 1 ). As with the Lily Pads and Widgets problems, people in the Standard, Advanced1, Advanced2 and Academic groups were highly likely to solve the Bat and Ball problem (χ 2 (4) = 35.37, p < .001; BF 10 = 208,667). Errors were more likely from the least mathematically trained people (Introductory, Standard) than from the other groups ( Table 1 ).

To compare performance on the CRT with previously published results, performance on its three problems (Lily Pads, Widgets, Bat and Ball) was combined. The number of people in each condition who solved 0, 1, 2, or 3 problems correctly is presented in Table 2 . The Introductory group was evenly distributed amongst the four categories, with 26% solving all three problems correctly. Around 70% of each of the remaining groups solved all three problems correctly, which is vastly superior to the 17% reported by Frederick [ 16 ].

Responses to the Hospitals problem were almost universally split between correct responses and standard errors in the Standard, Advanced1, Advanced2 and Academic groups. Although this pattern of responses was also evident in the Introductory group, this group also exhibited more non-standard errors and non-responses than the other groups. However, the differences between the groups were not significant (χ 2 (4) = 4.93, p = .295; BF 10 = .068) ( Table 1 ). Nonetheless, the performance of all groups exceeded the 20% correct response rate reported by Kahneman and Tversky [ 23 ].

The two versions of the Birth Order problem showed similar results, with correct responses being more likely in the groups with more training (i.e., Advanced1, Advanced2 and Academic), and responses being shared amongst the various categories in the Introductory and Standard groups (version (a): χ 2 (4) = 24.54, p < .001, BF 10 = 1,303; version (b): χ 2 (4) = 25.77, p < .001, BF 10 = 2,970) ( Table 1 ). Nonetheless, performance on both versions of the problem in this study was significantly better than the 82% error rate reported by Kahneman and Tversky [ 23 ].

The Coin Tosses problem was performed well by all groups, with very few people in any condition committing errors. There were no obvious differences between the groups (χ 2 (4) = 3.70, p = .448; BF 10 = .160) ( Table 1 ). Kahneman and Tversky [ 23 ] reported that people tend to make errors on this type of problem by choosing less patterned looking sequences, but they did not report relative proportions of people making errors versus giving correct responses. Clearly the sample in this study did not perform like those in Kahneman and Tversky’s study.

Responses on the Two Drivers problem were clearly distinguished by a high chance of error in the Introductory and Standard groups (over 80%), and a fairly good chance of being correct in the Advanced1, Advanced2 and Academic groups (χ 2 (4) = 46.16, p < .001; BF 10 = 1.32 x 10 8 ) ( Table 1 ). Academics were the standout performers on this problem, although over a quarter of this group produced an incorrect response. Thus, the first two groups performed similarly to the participants in the Pelham and Neter [ 25 ] study, 80% of whom gave an incorrect response.

Responses on the Petrol Station problem were marked by good performance from the Academic group (73% providing a correct response), with just over half of each of the other groups correctly solving the problem. This difference was not significant (χ 2 (4) = 4.68, p = .322; BF 10 = .059) ( Table 1 ). Errors were fairly evenly balanced between standard and other, except for the Academic group, who were more likely to provide a creative answer if they made an error. Thaler [ 27 ] reported that most people get this problem wrong. In this study, however, on average, most people got this problem correct, although this average was boosted by the Academic group.

Responses on the Jack looking at Anne problem generally were standard errors, except for the Advanced2 and Academic groups, which were evenly split between standard errors and correct responses (χ 2 (4) = 18.03, p = .001; BF 10 = 46) ( Table 1 ). Thus, apart from these two groups, the error rate in this study was similar to that reported by Stanovich [ 30 ], where 80% of participants were incorrect.

A series of logistic regression analyses were performed in order to examine whether the likelihood of solving a particular problem correctly could be predicted on the basis of whether other problems were solved correctly. Each analysis involved selecting performance (correct or error) on one problem as the outcome variable, and performance on the other problems as predictor variables. Training (amount of training) was also included as a predictor variable in each analysis. A further logistic regression was performed with training as the outcome variable, and performance on all of the problems as predictor variables. The results of these analyses are summarised in Table 3 . There were three multi-variable relationships observed in these analyses, which can be interpreted as the likelihood of solving one problem in each group being associated with solving the others in the set. These sets were: (1) Lily Pads, Widgets and Petrol Station; (2) Hospitals, Four Cards and Two Drivers; (3) Birth Order and Coin Tosses. Training also featured in each of these sets, moderating the relationships as per the results presented above for each problem.

P = Problem (1 = Four Cards; 2 = Lily Pads; 3 = Widgets; 4 = Bat & Ball; 5 = Hospitals; 6a = Birth Order (a); 6b = Birth Order (b); 7 = Coin Tosses; 8 = Two Drivers; 9 = Petrol Station; 10 = Jack looking at Anne).

training = Amount of training condition.

p = significance level of logistic regression model.

% = percentage of cases correctly classified by the logistic regression model.

✓ = significant predictor, α < .05.

* = logistic regression for the training outcome variable is multinomial, whereas all other logistic regressions are binomial.

The final “Comments Page” revealed that participants overwhelmingly enjoyed the questions. Any analysis of previous exposure to the tasks proved impossible, as there was little to no alignment in participants’ degrees of recall, if any, or even in their perceptions of what exposure entailed. For example, some participants confused being exposed to the particular tasks with being habitually exposed to puzzles, or even to mathematics problems more broadly.

In general, the amount of mathematics training a group had received predicted their performance on the overall set of problems: the greater the training, the more problems were answered correctly, and the slower the recorded response times. There was no obvious difference between the Advanced1, Advanced2 and Academic groups on either of these measures; however, there were clear differences between these groups and the Introductory and Standard groups, with the former exhibiting clearly superior accuracy. Time records were taken only approximately, so as to avoid adding time pressure as a variable, but the fact that the Advanced1, Advanced2 and Academic groups recorded more time in their consideration of the problems may suggest that a “pause and consider” approach to such problems is characteristic of the advanced groups. This is in line with what was suggested by an eye-movement tracking study of mathematically trained students attempting the Four Cards Problem, where participants who had not chosen the standard error had spent longer considering the card linked to the matching bias effect [ 14 ]. It is important to note, however, that longer response times may reflect cognitive processes other than deliberation [ 32 ].

Performance on some problems was associated with performance on other problems. That is, if someone correctly answered a problem in one of these sets, they were also highly likely to correctly answer the other problems in the set. These sets were: (1) Lily Pads, Widgets and Petrol Station; (2) Hospitals, Four Cards and Two Drivers; (3) Birth Order and Coin Tosses. This differs from how these problems have typically been clustered a priori in the research literature: (I) Lily Pads, Widgets and Bat and Ball (CRT); (II) Hospitals and Two Drivers (explained below); (III) Hospitals, Birth Order and Coin Tosses (representativeness heuristic); (IV) Birth Order and Coin Tosses (probability theory). Consideration of these problem groupings follows.

Correctly answering all three problems in (I) entailed not being distracted by particular pieces of information in the problems, so as to stay focused on uncovering the real underlying relationships. The Lily Pads and Widgets problems can mislead if attention is overly focused on the numbers, and conversely, the Petrol Station problem can mislead if there is too much focus on the idea of a discount. While the Lily Pads and Widgets problems are traditionally paired with the Bat and Ball problem in the CRT, it may be that performance on the Bat and Ball problem did not appear as part of this set due to an added level of difficulty. With the problems in (I), avoiding being distracted by certain parts of the questions at the expense of others leads almost directly to the correct answer. With the Bat and Ball problem, however, further steps in mathematical reasoning still need to occur: finding two numbers that sum to one amount while differing by another.
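That extra algebraic step in the Bat and Ball problem can be made explicit; a sketch of the arithmetic (amounts in dollars):

```python
# Let the ball cost x; the bat costs x + 1.00 and the pair costs 1.10.
# Solving x + (x + 1.00) = 1.10 gives x = (1.10 - 1.00) / 2.
ball = (1.10 - 1.00) / 2
bat = ball + 1.00

assert abs(ball - 0.05) < 1e-9          # the ball costs 5 cents, not 10
assert abs(ball + bat - 1.10) < 1e-9    # and the pair still totals $1.10
```

The intuitive answer of 10 cents fails the constraint: a 10-cent ball plus a $1.10 bat totals $1.20.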

With the problems in (II), it is of interest that the Two Drivers problem was created specifically to be paired with the Hospitals problem to test for motivation in problem solving [ 23 ]. Within this framework, more transparent versions of these problems were successfully devised to manipulate difficulty. The Two Drivers problem was amended to have Driver B travelling at exactly 5 mph during the first half of the race and at exactly 95 mph during the last half. The Hospitals problem was amended so that the smaller hospital would have “only 2” babies born each day, and so that for a period of one year the hospitals recorded the number of days on which all of the babies born were boys. Could the association in (II) be pointing to how participants overcome initial fictitious mathematical rules? Perhaps they reframe the question in simpler terms to see the pattern. The Four Cards Problem also elicited a high number of incorrect answers where, in association with mathematical training, the standard incorrect solution was avoided in favour of more cognitively elaborate ones. Indeed, a gradation effect appeared across the groups whereby the standard error of the “D and 3” cards becomes “D only” ( Table 4 ). Adrian Simpson and Derrick Watson found a comparable result across their two groups [14 p61]. This could again be pointing to participants having avoided an initial fictitious rule of simply concentrating on items directly found in the question, and then seeking to reframe the question to unearth the logical rule to be deduced. An added level of difficulty with this question may be why participants become trapped in a false answer. The eye-movement tracking study mentioned above supports this theory.
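The effect of the amendment on the Two Drivers problem is easy to verify; a sketch of the amended times (in hours):

```python
# In the amended race the slow half dominates Driver B's total time.
time_a = 100 / 50             # Driver A: 2 hours
time_b = 50 / 5 + 50 / 95     # Driver B: 10 hours plus ~0.53 hours

assert time_a == 2.0
assert time_b > 10            # Driver A's win is now transparent
```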

The problems in (III) fit naturally together as part of basic probability theory, a topic participants would have assimilated, or not, as part of various education curricula. While the equal likelihood of all possible outcomes of a coin toss may be culturally assimilated, the same may not be as straightforward for birth gender outcomes, where such assumptions could be swayed by biological hypotheses or folk wisdom [ 33 ]. The gradation of the results in terms of mathematical training does not support this possibility.

The effect of training on performance accuracy was more obvious in some problems compared to others, and to some extent, this was related to the type of problem. For instance, most of the problems in which performance was related to training (Four Cards, CRT [Lily Pads, Widgets, Bat and Ball], Two Drivers, Jack looking at Anne) could be classed as relying on logical and/or critical thinking. The one exception was the Birth Order problems, which are probability related.

In contrast, two of the three problems in which training did not appear to have much impact on performance (Hospitals and Coin Tosses) require domain-specific knowledge. The Hospitals problem requires a degree of knowledge about sampling statistics, a topic of quite distinct flavour that not all mathematically trained individuals become familiar with. On the other hand, that all groups performed well on the Coin Tosses problem is in line with basic probability having been presented at high school. While the treatment of patterning as negatively correlated with randomness is similar to that appearing in the Birth Order question, in the Birth Order question this aspect is arguably more concealed. These results and problem grouping (III) could be pointing to an area for improvement in teaching, where the small gap in knowledge required to go from answering the Coin Tosses problem correctly to achieving similarly on the Birth Order problem could be easily addressed. A more formal introduction to sampling statistics in mathematical training could potentially bridge this gap, and could further be extended towards improving performance on the Hospitals problem.

The other problem where performance was unrelated to training, the Petrol Station problem, cannot be characterised similarly. It is more of a logical/critical thinking type problem, where there remains some suggestion that training may have impacted performance, as the Academic group seemed to perform better than the rest of the sample. An alternate interpretation of this result is therefore that this problem should not be isolated but grouped with the other problems where performance is affected by training.

Although several aspects of the data suggest that mathematics training improves the chances that someone will solve problems of the sort examined here, differences in the performance of participants in the Advanced1, Advanced2 and Academic groups were not obvious. This is despite the fact that large differences exist in the amount of training across these three groups: the first two groups were undergraduate students, while the Academic group all had PhDs and many were experienced academic staff. One interpretation of this result is that current mathematics training can only take someone so far in terms of improving their abilities with these problems. There is a point of demarcation to consider in terms of mathematical knowledge between the Advanced1, Advanced2 and Academic groups as compared to the Introductory and Standard groups. In Australia, students are able to drop mathematical study at ages 15–16 years, or to choose between a number of increasingly involved levels of mathematics. For the university in this study, students are filtered upon entry into mathematics courses according to their current knowledge status. All our groups involved students who had opted for post-compulsory mathematics at high school. And since our testing occurred in second semester, some of the mathematical knowledge shortfalls present upon arrival had been bridged in first semester; students must pass a first semester course to be allowed entry into the second semester course. A breakdown of the mathematics background of each group is as follows:

  • The Introductory group’s mathematics high school syllabus studied prior to first semester course entry covered: Functions, Trigonometric Functions, Calculus (Introduction to Differentiation, Applications of the Derivative, Antiderivatives, Areas and the Definite Integral), Financial Mathematics, Statistical Analysis. The Introductory group then explored concepts in mathematical modelling with emphasis on the importance of calculus in their first semester of mathematical studies.
  • The Standard group’s mathematics high school syllabus studied prior to first semester course entry covered: Functions, Trigonometric Functions, Calculus (Rates of Change, Integration including the method of substitution, trigonometric identities and inverse trigonometric functions, Areas and Volumes of solids of revolution, some differential equations), Combinatorics, Proof (with particular focus on Proof by Mathematical Induction), Vectors (with application to projectile motion), Statistical Analysis. In first semester their mathematical studies then covered a number of topics the Advanced1 group studied prior to gaining entrance at university; further details on this are given below.
  • The Advanced1 group’s mathematics high school syllabus studied prior to first semester course entry covered: the same course content the Standard group covered at high school plus extra topics on Proof (develop rigorous mathematical arguments and proofs, specifically in the context of number and algebra and further develop Proof by Mathematical Induction), Vectors (3 dimensional vectors, vector equations of lines), Complex Numbers, Calculus (Further Integration techniques with partial fractions and integration by parts), Mechanics (Application of Calculus to Mechanics with simple harmonic motion, modelling motion without and with resistance, projectiles and resisted motion). The Standard group cover these topics in their first semester university studies in mathematics with the exclusion of further concepts of Proof or Mechanics. In first semester the Advanced1 group have built on their knowledge with an emphasis on both theoretical and foundational aspects, as well as developing the skill of applying mathematical theory to solve practical problems. Theoretical topics include a host of theorems relevant to the study of Calculus.

In summary, at the point of our study, the Advanced1 group had more knowledge and practice on rigorous mathematical arguments and proofs in the context of number and algebra, and more in-depth experience with Proofs by Induction, but the bulk of extra knowledge rests with a much deeper knowledge of Calculus. They have had longer experience with a variety of integration techniques, and have worked with a variety of applications of calculus to solve practical problems, including a large section on mechanics at high school. In first semester at university there has been a greater focus on theoretical topics including a host of theorems and associated proofs relevant to the topics studied. As compared to the Introductory and Standard groups, the Advanced1 group have only widened the mathematics knowledge gap since their choice of post-compulsory mathematics at high school. The Advanced2 group come directly from an Advanced1 cohort. And the Academics group would have reached the Advanced1 group’s proficiency as part of their employment. So, are specific reasoning skills resulting from this level of abstract reasoning? Our findings suggest this should certainly be an area of investigation and links in interestingly with other research work. In studying one of the thinking tasks in particular (the Four Cards Problem) and its context of conditional inference more specifically, Inglis and Simpson [ 15 ] found a clear difference between undergraduates in mathematics and undergraduates in other university disciplines, yet also showed a lack of development over first-year university studies on conditional inference measures. A follow up study by Attridge and Inglis [ 22 ] then zeroed in on post-compulsory high school mathematical training and found that students with such training did develop their conditional reasoning to a greater extent than their control group over the course of a year, despite them having received no explicit tuition in conditional logic. 
The development, though, whilst demonstrated as not being the result of a domain-general change in cognitive capacity or thinking disposition, and most likely associated with the domain-specific study of mathematics, revealed a complex pattern of endorsing more of some inferences and fewer of others. The present study focused on a much broader problem set associated with logical and critical thinking, and it too suggests a more complex picture of how mathematics training may contribute to problem solving styles. An intricate pattern concerning the impact of mathematical training on problem solving techniques is emerging, and it requires consideration.

There is also a final interpretation to consider: that people in the Advanced1, Advanced2 and Academic groups did not gain anything from their mathematics training in terms of their ability to solve these problems. Instead, given that studies have found no correlation between many of these problems and what is currently measured as intelligence [ 30 ], they might simply be people of a particular intelligence or thinking disposition to start with, who have been able to use that intelligence not only to solve these problems, but also to survive the challenges of their mathematics training.

That the CRT has traditionally been used as a measure of baseline thinking disposition, and that performance on it has been found to be immutable across the groups tested, is of particular interest, since our results show a possible training effect on these questions. The CRT is tied to a willingness to engage in effortful thinking, which presents as an ability suitable for training. It is beyond the scope of this study, but a thorough review of CRT testing would suggest a broader appreciation of, and a better framework for understanding, thinking disposition, ability and potential ability.

Mathematical training appears to be associated with certain thinking skills, but there are clearly subtleties that need to be extricated. The thinking tasks here add to the foundational results, the aim being a firmer platform on which to eventually base more targeted and illustrative inquiry. If thinking skills can be fostered, could first year university mathematics teaching be improved so that all students sampled from that group reach the Advanced1 group’s level of reasoning? Do university mathematics courses become purely about domain-specific knowledge from this point on? Intensive training has been shown to impact the brain and cognition across a number of domains, from music [ 34 ], to video gaming [ 35 ], to Braille reading [ 36 ]. The hypothesis that mathematics, with its highly specific practice, fits within this list remains legitimate, but simply uncharted. With our current level of understanding, it is worth appreciating the careful wording of the NYU Courant Institute on ‘Why Study Math?’, where there is no assumption of causation: “Mathematicians need to have good reasoning ability in order to identify, analyze, and apply basic logical principles to technical problems.” [ 37 ].

Limitations

One possible limitation of the current study is that the problems may have been too easy for the more advanced people, and so we observed a ceiling effect (i.e., some people obtained 100% correct on all problems). This was most obvious in the Advanced1, Advanced2 and Academic groups. It is possible that participants in these groups had developed logical and critical thinking skills throughout their mathematical training that were sufficient to cope with most of the problems used in this study, which would support the contention that training in mathematics leads to the development of logical and critical thinking skills useful in a range of domains. Another interpretation is that participants in these groups already possessed the necessary thinking skills for solving the problems in this study, which is why they were able to cope with the material in the advanced units they were enrolled in, or to complete a PhD in mathematics and hold down an academic position in a mathematics department. This would then suggest that training in mathematics had no effect on abstract thinking skills—people in this study possessed them to varying extents prior to their studies. This issue might be settled in a future study that used a greater number of problems of varying difficulty to maximise the chances of finding a difference between the three groups with the most training. Alternatively, a longitudinal study that followed people through their mathematics training could determine whether their logical and critical thinking abilities changed throughout their course.

A further limitation of the study may be that several of the reasoning biases examined in this study were measured by only one problem each (i.e., Four Cards Problem, Two Drivers, Petrol Station, Jack looking at Anne). A more reliable measure of these biases could be achieved by including more problems that tap into these biases. This would, however, increase the time required of participants during data collection, and in the context of this study, would mean a different mode of testing would likely be required.

Broad sweeping intuitive claims about the transferable skills endowed by a study of mathematics require evidence. Our study uniquely covers a wide range of participants, from those with limited mathematics training through to research academics in the mathematical sciences. It furthermore considered performance on 11 well-studied thinking tasks that typically elude participants in psychological studies and on which results have been uncorrelated with general intelligence, education levels and other demographic information [ 15 , 16 , 30 ]. We identified different performances on these tasks with respect to different groups, based on level of mathematical training. This included the CRT, which has developed into a method of measuring baseline thinking disposition. We identified different distributions of types of errors for the mathematically trained. We furthermore identified a performance threshold that exists in first year university for those with high-level mathematics training. This study thus provides insight into possible changes and adjustments to mathematics courses that would help them fulfil their advertised goal of delivering improved rational and logical reasoning to a greater number of students.

It is central to any education program to have a clear grasp of the nature of what it delivers and how, but arguably especially so for the core discipline that is mathematics. In 2014 the Office of the Chief Scientist of Australia released a report, “Australia’s STEM workforce: a survey of employers”, in which the transferable skills attributed to mathematics were among those that employers deemed most valuable [ 38 ]. A better understanding of what mathematics delivers in this space is an opportunity to truly capitalise on this historically culture-crossing subject.

Supporting information

Acknowledgments.

The authors would like to thank Jacqui Ramagge for her proof reading and input, as well as support towards data collection.

Funding Statement

The authors received no specific funding for this work.

Data Availability

  • PLoS One. 2020; 15(7): e0236153.

Decision Letter 0

17 Mar 2020

PONE-D-20-01159

Does mathematics training lead to better logical thinking and reasoning? A cross-sectional assessment from students to professors

Dear Professor Speelman,

Thank you for submitting your manuscript to PLOS ONE. I have sent it to two expert reviewers and have received their comments back. As you can see at the bottom of this email, both reviewers are positive about your manuscript but raise some issues that you would need to address before the manuscript can be considered for publication. Notably, reviewer #1 points out that the manuscript should include a discussion on the reasons why individuals with math training may have improved reasoning skills (e.g., logical intuitions versus deliberate thinking). The reviewer also rightly mentions that your sample sizes are limited, notably for the most advanced groups. This should be discussed and acknowledged. Reviewer #2 has a number of conceptual and methodological points that you will also have to address. The reviewer provides very thorough comments and I will not reiterate the points here. However, note that both reviewers suggest that you need to improve the figures and I agree with them.   

We would appreciate receiving your revised manuscript by May 01 2020 11:59PM. When you are ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter.

To enhance the reproducibility of your results, we recommend that if applicable you deposit your laboratory protocols in protocols.io, where a protocol can be assigned its own identifier (DOI) such that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). This letter should be uploaded as separate file and labeled 'Response to Reviewers'.
  • A marked-up copy of your manuscript that highlights changes made to the original version. This file should be uploaded as separate file and labeled 'Revised Manuscript with Track Changes'.
  • An unmarked version of your revised paper without tracked changes. This file should be uploaded as separate file and labeled 'Manuscript'.

Please note while forming your response, if your article is accepted, you may have the opportunity to make the peer review history publicly available. The record will include editor decision letters (with reviews) and your responses to reviewer comments. If eligible, we will contact you to opt in or out.

We look forward to receiving your revised manuscript.

Kind regards,

Jérôme Prado

Academic Editor

Journal Requirements:

When submitting your revision, we need you to address these additional requirements:

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at http://www.plosone.org/attachments/PLOSOne_formatting_sample_main_body.pdf and http://www.plosone.org/attachments/PLOSOne_formatting_sample_title_authors_affiliations.pdf

2. Please include additional information regarding the survey or questionnaire used in the study and ensure that you have provided sufficient details that others could replicate the analyses. For instance, if you developed a questionnaire as part of this study and it is not under a copyright more restrictive than CC-BY, please include a copy, in both the original language and English, as Supporting Information. Please also let us know if it would be possible to provide the anonymized data points necessary to replicate the statistical analyses, for instance, as shown in fig 1 and 2. If so, please deposit those to a suitable data repository or include them in the Supporting Information files.

3. Thank you for stating the following financial disclosure:

"The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript."

  • Please provide an amended Funding Statement that declares *all* the funding or sources of support received during this specific study (whether external or internal to your organization) as detailed online in our guide for authors at http://journals.plos.org/plosone/s/submit-now .  
  • Please state what role the funders took in the study.  If any authors received a salary from any of your funders, please state which authors and which funder. If the funders had no role, please state: "The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript."

Please include your amended statements within your cover letter; we will change the online submission form on your behalf.


Reviewers' comments:

Reviewer #1: I think this is a very good and interesting manuscript trying to answer an important research question. I propose some changes that I believe should be applied before publication.

1. Each reasoning bias is measured with only one problem. In reasoning research, it is rather common to measure each type of reasoning problem with a series of structurally equivalent reasoning problems, so the results will be independent of contexts effects and will be generalizable to that type of problem. Here, the authors only measured each reasoning bias with one single problem and this might be problematic (see, for example: Fiedler & Hertel, 1994). I think this can be addressed by simply discussing it in the limitation section.

2. This is rather a minor issue, but the discussion of the CRT problems is not up-to-date (page 7). The most recent experiments on dual process theory suggest that people who are able to correctly solve these reasoning problems (including the CRT) do so intuitively, and not because they engaged in careful deliberation (Bago & De Neys, 2019). Intelligence makes people have better intuitive responses (Thompson, Pennycook, Trippas & Evans, 2018). Similarly, this problem persists in the discussion of reaction times (page 25). Longer reaction times do not necessarily mean that people engaged in deliberation (see: Evans, Kyle, Dillon & Rand, 2015). Response time might be driven by decision conflict or response rationalization. These issues could be clarified with some changes in the wording or some footnotes on pages 7 and 25. Furthermore, it would be interesting to have a discussion on how mathematical education helps people overcome their biases. Is it because it creates better intuitions, or because it helps people engage in deliberation? This is an interesting question that the manuscript does not discuss. It is up to the authors whether they discuss this latter point now, but the changes on pages 7 and 25 should be made.

3. A more serious problem is the rather small sample size (especially in the more advanced groups). This small sample size makes the appearance of both false negatives and false positives more likely. Perhaps the authors could compute Bayes Factors for the chi-square or logistic regression tests, so we can actually see how strong the evidence is for or against the null. This is especially important as the authors ran a great number of exploratory analyses (Table 3), and some of those results might need to be interpreted with great caution (depending on the Bayes Factor).
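One way to act on this suggestion — shown here only as an illustrative sketch with hypothetical counts, not the paper's data — is the BIC approximation to the Bayes factor for a test of independence (Wagenmakers, 2007), which needs only the G-test statistic, the degrees of freedom, and the sample size:

```python
import math

def bf01_contingency(table):
    """Approximate BF01 (evidence for independence, i.e. the null)
    for an r x c contingency table via the BIC approximation:
    BF01 ~ exp((df * ln(n) - G2) / 2), where G2 is the
    log-likelihood-ratio (G-test) statistic."""
    rows = [sum(r) for r in table]
    cols = [sum(c) for c in zip(*table)]
    n = sum(rows)
    g2 = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = rows[i] * cols[j] / n  # count expected under independence
            if observed > 0:
                g2 += 2 * observed * math.log(observed / expected)
    df = (len(rows) - 1) * (len(cols) - 1)
    return math.exp((df * math.log(n) - g2) / 2)

# Hypothetical correct/incorrect counts across three training groups.
table = [[10, 15, 20],
         [30, 25, 20]]
bf01 = bf01_contingency(table)
print(round(bf01, 2))  # prints 8.02
```

BF01 > 1 favours the null (independence); BF01 < 1 favours an association. A fully Bayesian alternative is the `contingencyTableBF` function in the BayesFactor R package.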

The graphs do not look good; they should comply with APA formatting. At the very least, the axis titles should be meaningful, and measurement units should be written there.

The presentation order of the problems is quite unusual; why isn’t it random? Why did the authors decide on this order?

Reviewer #2: The study reported in this paper compared five groups of participants with varying levels of mathematical expertise on a set of reasoning tasks. The study is interesting and informative. It extends the current literature on this topic (which is reviewed very nicely in the introduction). However, there are some issues with the current analysis and interpretation that should be resolved prior to publication. I have therefore recommended major revisions. My comments are organised in the order in which they came up in the paper and they explain my responses to the questions above.

1. Line 114 – “general population” a bit misleading – they were also students but from other disciplines.

2. Line 124 onwards reads:

“The ultimate question to consider here is: are any skills associated with mathematics training innate or do they arise from skills transfer? Though to investigate how mathematical training affects reasoning skills, randomised sampling and randomised intervention to reveal causal relationships are clearly not viable. With so many possible confounding variables and logistical issues, it is even questionable what conclusions such studies might provide. Furthermore, a firm baseline from which to propose more substantive investigations is still missing.”

I find this paragraph slightly problematic because the current study doesn’t inform us on this ultimate question, so it makes the outline of the current study in the following paragraph feel unsatisfactory. I think the current study is important but prefacing it with this paragraph underplays that importance. And I think a randomised controlled study, although not viable, would give the answers we need because the random allocation to groups would allow us to rule out any confounding variables. Finally, the last sentence in this paragraph is unclear to me.

3. In the descriptions of the five participant groups the authors refer to the group’s level of interest in mathematics, but this seems like an overgeneralisation to me. Surely the introductory group could contain a biology student who also happens to be good at mathematics and very much enjoys it? I would be more comfortable with the descriptions if the parts about interest level were removed.

4. How many of the 123 first year students were in each of the three first year groups?

5. Line 313 – the standard group is referred to as “university mathematics students”, but they are not taking mathematics degrees.

6. Line 331 - what is a practice class?

7. Were the data collection settings quiet? From the description it sounds like groups of participants were completing the study at the same time in the same room, but the authors should make this explicit for the sake of the method being reproducible. E.g. how many students were in the room at the time?

8. Line 355-356 – the authors should not use the term “marginally worse” because this is statistically inappropriate – in a frequentist approach results are either significant or non-significant.

9. Line 340 – “approximate completion times were noted.”

This doesn’t sound rigorous enough to justify analysing them. Their analysis is interesting, but the authors should remind readers clearly whenever the response times are analysed or discussed that their recording was only manual and approximate.

10. I suggest replacing Figure 1 with a bar chart showing standard error of the mean on the error bars. A table with mean score out of 11 and the standard deviation for each group may also be useful. Figure 2 should be a scatterplot rather than a box and whisker plot.
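For reference, the standard error of the mean the reviewer asks for is the sample standard deviation divided by the square root of n. A minimal sketch, using hypothetical total-correct scores rather than the study's data:

```python
import math

def sem(xs):
    """Standard error of the mean: sample standard deviation / sqrt(n)."""
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / (n - 1)  # sample variance (n - 1)
    return math.sqrt(var / n)

# Hypothetical total-correct scores (out of 11) for one group.
scores = [4, 6, 5, 7, 5, 6, 4, 8]
print(round(sem(scores), 3))  # prints 0.498
```

These values would then be supplied as the error-bar lengths on each group's bar in the suggested chart.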

11. Was the 0-11 total correct score approximately normally distributed across the full sample?

12. Chi square analysis requires at least 5 cases in each cell, was this met? It seems not since Table 1 shows lots of cells in the “no response” row having 0% of cases.
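The rule of thumb the reviewer invokes concerns the expected counts under independence (not the observed counts), and it can be checked directly. A small sketch with hypothetical counts:

```python
def expected_counts(table):
    """Expected cell counts under independence: E_ij = row_i * col_j / n."""
    rows = [sum(r) for r in table]
    cols = [sum(c) for c in zip(*table)]
    n = sum(rows)
    return [[ri * cj / n for cj in cols] for ri in rows]

def chi_square_assumption_ok(table, minimum=5):
    """Common rule of thumb: every expected count should be at least 5."""
    return all(e >= minimum for row in expected_counts(table) for e in row)

# Hypothetical response-type counts for five groups. Observed zeros do not
# by themselves violate the rule, but a sparse row drags its expected
# counts toward zero.
table = [[12, 18, 25, 9, 6],
         [0, 1, 0, 2, 0]]
print(chi_square_assumption_ok(table))  # prints False
```

When the rule fails, common remedies are collapsing sparse categories or switching to an exact test.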

13. The chi-square analyses should be followed up with post hoc tests to see exactly where the differences between groups are. The descriptions as they stand aren’t that informative (as readers can just look at Table 1) without being backed up by post hoc tests.

14. For each chi square analysis in the text, I would find it easier to read if the test statistics came at the top of the paragraph, before the description.

15. Line 381-383 – “Of note, also, is the relatively low proportion of those in the higher training groups who, when they made an error, did not make the standard error, a similar result to the one reported by Inglis and Simpson [11]."

I think this is supposed to say that a low proportion did make the standard error or that a high proportion did not make the standard error.

16. Line 403 - p values this small should be reported as p < .001 rather than p = .000 since they aren’t actually 0.

17. Line 476 – “…if a particular outcome variable was predicted significantly by a particular predictor variable, the converse relationship was also observed”

Isn’t that necessarily the case with regression analyses, like with correlations?

18. I don’t think the logistic regression analyses add much to the paper and at the moment they come across as potential p-hacking since they don’t clearly relate to the research question. To me they make the paper feel less focused. Having said that, there is some interesting discussion of them in the Discussion section. I’d recommend adding some justification to the introduction for why it is interesting to look at the relationships among tasks (without pretending to have made any specific hypotheses about the relationships, of course).

19. Line 509 would be clearer if it read “between these groups and the introductory and standard groups”

20. Lines 597 – 620 - This is an interesting discussion, especially the suggestion that advanced calculus may be responsible for the development. No development in reasoning skills from the beginning of a mathematics degree onwards was also found by Inglis and Simpson (2009), who suggested that the initial difference between mathematics and non-mathematics undergraduates could have been due to pre-university study of mathematics. Attridge & Inglis (2013) found evidence that this was the case (they found no difference between mathematics and non-mathematics students at age 16 but a significant difference at the end of the academic year, where the mathematics students had improved and the non-mathematics students had not).

Could the authors add some discussion of whether something similar may have been the case with their Australian sample? E.g. do students in Australia choose whether, or to what extent, to study mathematics towards the end of high school? If not, the description of the groups suggests that there were at least differences in high school mathematics attainment between groups 1-3, even if they studied the same mathematics curriculum. Do the authors think that this difference in attainment could have led to the differences between groups in the current study?

21. Line 617 – “Intensive training has been shown to impact the brain and cognition across a number of domains from music, to video gaming, to Braille reading [31].”

Reference 31 appears to only relate to music. Please add references for video gaming and Braille reading.

22. I recommend editing the figures from SPSS’s default style or re-making them in Excel or DataGraph to look more attractive.

23. I cannot find the associated datafile anywhere in the submission. Apologies if this is my mistake.

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files to be viewed.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/ . PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email us at figures@plos.org . Please note that Supporting Information files do not need this step.

Author response to Decision Letter 0

20 Apr 2020

All responses are detailed against the specific reviewers' comments in the Response to Reviewers document

Submitted filename: Response to Reviewers.docx

Decision Letter 1

11 Jun 2020

PONE-D-20-01159R1

Does mathematics training lead to better logical thinking and reasoning? A cross-sectional assessment from students to professors.

Dear Dr. Speelman,

Thank you for submitting your revised manuscript to PLOS ONE. I have sent it to reviewer #2 and have now received the reviewer's comments. As you can see, the reviewer thinks that the manuscript is improved but raises some outstanding issues that you would need to address in another round of revision. I notably agree with the reviewer that you should provide the raw data, allowing readers to replicate your analyses. Therefore, I invite you to submit a revised version of your manuscript.

Please submit your revised manuscript by Jul 26 2020 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org . When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.
  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.
  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see:  http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols

Reviewer #2: The manuscript has improved but there are still a few issues that should be resolved prior to publication.

1. On lines 96, 97, 100 and 102, the references to “general population” should be changed to reflect the fact that these participants were non-mathematics (arts) students.

2. Line 306 – change “mathematics students” to “university students”.

3. The method section doesn’t specify the gender split and mean age of the sample.

4. Table 3 - the p values listed as .000 should be changed to <.001.

5. Table 3 - I suggest repeating the list of problem numbers and names in the legend. It may make for a long legend but would make it much easier for the reader to interpret the table.

6. I am not sure what the new post hoc tests are comparing. What I expected was to see group 1 compared to groups 2, 3, 4 and 5, and so on. This would tell us which groups are statistically different from each other. At the moment we only know from the overall chi square tests whether there are any differences among the groups or not, we don’t know specifically which groups are statistically different from each other and which ones are not. We only have the authors’ interpretations based on the observed counts.

7. Line 584 - change “performance was correlated with training” to “performance was related to training” to avoid any confusion since a correlation analysis was not performed.

8. Data file – I had expected the data file to give the raw data rather than summary data, i.e. with each participant in a separate row, and a column indicating their group membership, a column giving their age, a column for sex etc (including all the demographics mentioned in the method), and a column for each reasoning question. This would allow other researchers to replicate the regression analyses and look at other relationships within the dataset. Without being able to replicate all analyses in the paper, the data file does not meet the minimal data set definition for publication in PLOS journals: https://journals.plos.org/plosone/s/data-availability .

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/ . PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org . Please note that Supporting Information files do not need this step.

Author response to Decision Letter 1

16 Jun 2020

Please see "Response to Reviewers" document

Decision Letter 2

PONE-D-20-01159R2

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/ , click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org .

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org .

Additional Editor Comments (optional):

Acceptance letter

Dear Dr. Speelman:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org .

If we can help with anything else, please email us at plosone@plos.org .

Thank you for submitting your work to PLOS ONE and supporting open access.

PLOS ONE Editorial Office Staff

on behalf of

Dr. Jérôme Prado

Critical thinking definition


Critical thinking, as described by Oxford Languages, is the objective analysis and evaluation of an issue in order to form a judgement.

The critical thinking process requires the active and skillful evaluation, assessment, and synthesis of information obtained from, or generated by, observation, knowledge, reflection, acumen, or conversation, as a guide to belief and action, which is why it is often emphasised in education and academia.

Some even view it as a backbone of modern thought.

However, it's a skill, and skills must be trained and encouraged to reach their full potential.

People turn to various approaches to improve their critical thinking, such as:

  • Developing technical and problem-solving skills
  • Engaging in more active listening
  • Actively questioning their assumptions and beliefs
  • Seeking out more diversity of thought
  • Cultivating intellectual curiosity

Is critical thinking useful in writing?

Critical thinking can help in planning your paper and making it more concise, though the connection isn't obvious at first. Here are some of the questions you should ask yourself to bring more critical thinking into your writing:

  • What information should be included?
  • Which information resources should the author look to?
  • What degree of technical knowledge should the report assume its audience has?
  • What is the most effective way to show information?
  • How should the report be organized?
  • How should it be designed?
  • What tone and level of language difficulty should the document have?

Critical thinking applies not only to the outline of your paper; it also raises the question: how can we use critical thinking to solve problems within our writing's topic?

Say you have a PowerPoint on how critical thinking can reduce poverty in the United States. You'll first have to define critical thinking for your audience, then use plenty of critical thinking questions and related terms so they become familiar with your methods and engage in the thinking process themselves.

Are there any services that can help me use more critical thinking?

We understand that it's difficult to learn how to use critical thinking more effectively in just one article, but our service is here to help.

We are a team specializing in writing essays and other assignments for college students and anyone else who needs a helping hand. We cover a wide range of topics, deliver high-quality work on time, and aim to leave our customers completely satisfied with their orders.

The ordering process is fully online, and it goes as follows:

  • Select the topic and the deadline for your essay.
  • Provide us with any details, requirements, statements that should be emphasized, or particular parts of the essay-writing process you struggle with.
  • Leave the email address where your completed order will be sent.
  • Select your preferred payment type, then sit back and relax!

With years of market experience, professionally degreed essay writers, 24/7 online customer support, and low prices, you won't find a service offering a better deal than ours.

Critical Thinking in Mathematics Education

  • Reference work entry
  • First Online: 01 January 2014


  • Eva Jablonka




Author information

Authors and Affiliations

Department of Education and Professional Studies, King’s College London, Waterloo Bridge Wing Franklin-Wilkins Building, SE1 9NH, London, UK

Eva Jablonka


Corresponding author

Correspondence to Eva Jablonka.

Editor information

Editors and Affiliations

Department of Education, Centre for Mathematics Education, London South Bank University, London, UK

Stephen Lerman


Copyright information

© 2014 Springer Science+Business Media Dordrecht

About this entry

Cite this entry

Jablonka, E. (2014). Critical Thinking in Mathematics Education. In: Lerman, S. (eds) Encyclopedia of Mathematics Education. Springer, Dordrecht. https://doi.org/10.1007/978-94-007-4978-8_35


DOI: https://doi.org/10.1007/978-94-007-4978-8_35

Published: 31 July 2014

Publisher Name: Springer, Dordrecht

Print ISBN: 978-94-007-4977-1

Online ISBN: 978-94-007-4978-8

eBook Packages: Humanities, Social Sciences and Law



Student critical thinking ability in solving PISA-Like mathematics problem in the context of Palembang tourism “ Bait Al-Quran Al-Akbar”

Dinda Fitri Humaira, Zulkardi, Ratu Ilma Indra Putri; Student critical thinking ability in solving PISA-Like mathematics problem in the context of Palembang tourism “Bait Al-Quran Al-Akbar”. AIP Conf. Proc. 22 April 2024; 3052 (1): 020025. https://doi.org/10.1063/5.0201017


This research aims to develop the critical thinking skills of junior high school students in the context of Palembang tourism using PISA-like mathematics problems and the PMRI approach. The research subjects were eighth-grade junior high school students in the city of Palembang. The methodology is design research of the formative evaluation type. The data were analyzed qualitatively through tests, interviews, and observations. The results show that students solving PISA-type questions in the Bait Al-Quran Al-Akbar Palembang tourism context demonstrated good critical thinking skills, at 71.42%. This research can teach students to think critically while introducing the tourism and culture of Palembang.


Explained: Importance of critical thinking, problem-solving skills in curriculum

Future careers are no longer about domain expertise or technical skills alone. Rather, critical thinking and problem-solving skills in employees are on the wish list of every big organization today. Curriculums and pedagogies across the globe, and within India, are being redesigned to produce skilled workers who can think critically and analytically.

The reason for this shift in perspective is very simple.

These skills provide a strong foundation for comprehensive learning that extends beyond textbooks or the four walls of the classroom. In a nutshell, critical thinking and problem-solving skills are part of the '21st Century Skills' that can help unlock valuable learning for life.

Over the years, the education system has been moving away from rote learning and other conventional teaching and learning methods.

Educators are aligning their curriculums with a changing scenario that is becoming more tech-driven and demands a fusion of critical skills, life skills, values, and domain expertise. There's no set formula for success.

Rather, there's a defined need for humans to be more creative, innovative, adaptive, agile, risk-taking, and have a problem-solving mindset.

In today's scenario, critical thinking and problem-solving skills have become more important because they open the human mind to multiple possibilities, solutions, and a mindset that is interdisciplinary in nature.

Therefore, many schools and educational institutions are deploying AI and immersive learning experiences via gaming and AR-VR technologies to give their students a more realistic, hands-on learning experience that hones these abilities and helps them overcome any doubt or fear.

ADVANTAGES OF CRITICAL THINKING AND PROBLEM-SOLVING IN CURRICULUM

Ability to relate to the real world:  Instead of theoretical knowledge, critical thinking, and problem-solving skills encourage students to look at their immediate and extended environment through a spirit of questioning, curiosity, and learning. When the curriculum presents students with real-world problems, the learning is immense.

Confidence, agility & collaboration : Critical thinking and problem-solving skills boost self-belief and confidence as students examine, re-examine, and sometimes fail or succeed while attempting to do something.

They are able to understand where they may have gone wrong, attempt new approaches, ask their peers for feedback and even seek their opinion, work together as a team, and learn to face any challenge by responding to it.

Willingness to try new things: When problem-solving skills and critical thinking are encouraged by teachers, they set a robust foundation for young learners to experiment, think out of the box, and be more innovative and creative besides looking for new ways to upskill.

It's important to understand that merely introducing these skills into the curriculum is not enough. Schools and educational institutions must hold upskilling workshops and conduct special training for teachers to ensure they are skilled in and familiar with new teaching and learning techniques and new-age concepts that can be used in the classroom via assignments and projects.

Critical thinking and problem-solving skills are two of the most sought-after skills. Hence, schools should emphasise the upskilling of students as a part of the academic curriculum.

The article is authored by Dr Tassos Anastasiades, Principal- IB, Genesis Global School, Noida. 


COMMENTS

  1. PDF Mathematical Teaching Strategies: Pathways to Critical Thinking and

    When teaching mathematics, critical thinking skills can be used, practiced and enhanced by effective cognitive methods. Critical thinking can enhance creative problem solving options by encouraging students to seek new strategies when solving mathematical problems. Mathematics teachers know the importance of mathematical

  2. How To Encourage Critical Thinking in Math

    Critical thinking is more than just a buzzword… It's an essential skill that helps students develop problem-solving abilities and make logical connections between different concepts. By encouraging critical thinking in math, students learn to approach problems more thoughtfully, they learn to analyze and evaluate math concepts, identify patterns and relationships, and explore different ...

  3. 6 Tips for Teaching Math Problem-Solving Skills

    1. Link problem-solving to reading. When we can remind students that they already have many comprehension skills and strategies they can easily use in math problem-solving, it can ease the anxiety surrounding the math problem. For example, providing them with strategies to practice, such as visualizing, acting out the problem with math tools ...

  4. (PDF) Students' Critical Thinking Skills in Solving Mathematical

    Mathematical Problem-solving and Critical Thinking Skills Using Problem-Based Learning. Universal Journal of Educational Research , 8 (5), 2012 - 2021.

  5. Creative and Critical Thinking in Primary Mathematics

    In mathematics, creative thinking occurs when students generalise. Generalising involves identifying common properties or patterns across more than one case and communicating a rule (conjecture) to describe the common property, pattern or relationship. In order to generalise students need to first analyse the problem to notice things that are ...

  6. Critical Thinking Math Problems: Examples and Activities

    Cite this lesson. Critical thinking is an important factor in understanding math. Discover how critical thinking can help with real-world problem solving, using examples and activities like asking ...

  7. Critical Thinking in Mathematics Education

    Definition. Mainstream educational psychologists view critical thinking (CT) as the strategic use of a set of reasoning skills for developing a form of reflective thinking that ultimately optimizes itself, including a commitment to using its outcomes as a basis for decision-making and problem solving. In such descriptions, CT is established as ...

  8. How to Improve Problem-Solving Skills: Mathematics and Critical Thinking

    How to Develop Critical Thinking Skills in Math. Critical thinking goes hand in hand with problem-solving. But exactly how to develop critical thinking skills in math might not be immediately obvious. Here are a few strategies: Contextual Learning: Teaching math within a story or real-life scenario makes it relevant. When students see math as a ...

  9. Full article: Promoting critical thinking through mathematics and

    Therefore, along with the development of argumentative skills and critical thinking, ... "On the Relationship between Problem Posing, Problem Solving, and Creativity in the Primary School." ... In Mathematical Problem Posing from Research to Effective Practice, edited by F. M. Singer, N. F. Ellerton, and J. Cai, 103-123. New York, NY ...

  10. Mathematics Improves Your Critical Thinking and Problem-Solving

    Mathematics provides a systematic and logical framework for problem-solving and critical thinking. The study of math helps to develop analytical skills, logical reasoning, and problem-solving abilities that can be applied to many areas of life.By using critical thinking skills to solve math problems, we can develop a deeper understanding of concepts, enhance our problem-solving skills, and ...

  11. (PDF) Critical Thinking and Problem Solving Skills in Mathematics of

    PDF | On Nov 29, 2017, Emil Alcantara and others published Critical Thinking and Problem Solving Skills in Mathematics of Grade-7 Public Secondary Students | Find, read and cite all the research ...

  12. Teaching Mathematical Reasoning

    It's also central to being proficient in math and being able to solve math problems. In this article, we outline two approaches to fostering mathematical reasoning skills and improved critical thinking in math: Mathematical problem-solving: This approach makes students think conceptually about problems before applying tools they've learned.

  13. (Pdf) Enhancing the Problem-solving and Critical Thinking Skills of

    Enhancing the problem-solving and critical thinking skills of students using the mathematical investigation approach Pentang, J. (2019). Determining elementary pre- service teachers' problem solving

  14. Development and differences in mathematical problem-solving skills: A

    1. Introduction. Problem-solving skills are a complex set of cognitive, behavioral, and attitudinal components that are situational and dependent on thorough knowledge and experience [1,2].Problem-solving skills are acquired over time and are the most widely applicable cognitive tool [].Problem-solving skills are particularly important in mathematics education [3,4].

  15. Does mathematics training lead to better logical thinking and reasoning

    The School of Mathematics and Statistics at The University of Sydney, Australia, directly attributes as part of particular course objectives and outcomes skills that include "enhance your problem-solving skills" as part of studies in first year , "develop logical thinking" as part of studies in second year, which was a statement drafted ...

  16. (PDF) Developing Critical Thinking Skills of Students in Mathematics

    Critical thinking skills include high-level thinking that leads to problem solving and decision making, and are important and beneficial for every individual, especially in the field of education ...

  17. Using Critical Thinking in Essays and other Assignments

    Critical thinking, as described by Oxford Languages, is the objective analysis and evaluation of an issue in order to form a judgement. Active and skillful approach, evaluation, assessment, synthesis, and/or evaluation of information obtained from, or made by, observation, knowledge, reflection, acumen or conversation, as a guide to belief and ...

  18. Critical Thinking in Mathematics Education

    Educational psychologists frame critical thinking (CT) as a set of generic thinking and reasoning skills, including a disposition for using them, as well as a commitment to using the outcomes of CT as a basis for decision-making and problem solving. In such descriptions, CT is established as a general standard for making judgments and decisions.

  19. Student critical thinking ability in solving PISA-Like mathematics

    The data were analyzed qualitatively through tests, interviews and observations. The results of the research on students' critical thinking skills in solving PISA type questions in the tourism context of Bait Al-Quran Al-Akbar Palembang have good critical thinking skills with a percentage of 71.42%.

  20. Explained: Importance of critical thinking, problem-solving skills in

    In a nutshell, critical thinking and problem-solving skills are a part of '21st Century Skills' that can help unlock valuable learning for life. Over the years, the education system has been ...


  22. PDF Critical Thinking and Problem Solving Skills in Mathematics of Grade-7

    Alcantara & Bacsa, Critical Thinking and Problem Solving Skills in Mathematics… 22 P-ISSN 2350-7756 | E-ISSN 2350-8442 | www.apjmr.com Asia Pacific Journal of Multidisciplinary Research, Vol. 5 ...


  24. (PDF) Developing Students' Critical Thinking Skills in Mathematics

    Critical thinking and problem-solving skills are necessary skills in the 21st century learning. However, the initial tests of students' critical thinking and problem-solving skills showed low scores.