
35 Higher-Order Thinking Questions


Higher-order thinking questions are questions that you can ask in order to stimulate thinking that requires significant knowledge mastery and data manipulation.

Generally, higher-order thinking involves thinking from the top 3 levels of Bloom’s taxonomy: analysis, evaluation, and knowledge creation.

The term “higher-order” is used because these forms of thinking require strong command of information and the ability to work with it to develop complex understanding (Stanley, 2021).

Generally, a higher-order thinking question will be open-ended and require the student to demonstrate their ability to analyze and evaluate information.

Higher-Order Thinking Questions

Below are some useful questions for stimulating higher-order thinking.

Questions for Teachers to Ask Students

  • Encourage compare and contrast: How would you compare and contrast these two concepts/ideas?
  • Seek alternatives: Can you provide an alternative solution to this problem?
  • Apply an ethical lens: What ethical considerations are involved in this situation or decision?
  • Categorize and classify: How would you categorize or classify these items based on their shared characteristics?
  • Sort by priority: How would you prioritize these tasks, and what factors did you consider?
  • Real-world connections: How can you apply this concept to a real-world situation?
  • Rephrase and reframe: How would you rephrase this question or problem from a different perspective?
  • Identify trends: Can you identify any trends or developments that may influence this issue in the future?
  • Seek solutions: How would you design a solution to address this challenge?
  • Use evidence: What evidence supports your point of view or conclusion?
  • Find relationships: Can you explain the relationship between these two events or phenomena?
  • Change a variable: How would this situation change if we altered this variable or factor?
  • Compare to prior knowledge: In what ways does this concept challenge your previous understanding or beliefs?
  • Identify connections: Can you explain how these two seemingly unrelated ideas are connected or interdependent?
  • Re-contextualize: How would you adapt this solution to work in a different context or environment?
  • Identifying consequences: What are the potential consequences of this decision or action?
  • Evaluate: What criteria would you use to evaluate the effectiveness or success of this approach?
  • Interdisciplinary connections: How can you apply principles from another discipline to enhance your understanding of this topic?
  • Distil key factors: What factors may have contributed to this outcome or result, and how might they be addressed?
  • Identifying bias: Can you identify any biases or assumptions in this argument?
  • Find weaknesses: How would you argue against your own position or point of view?
  • Steelman: Can you think of likely criticisms of your position and identify ways you would respond?
  • Make judgments about best practices: Can you develop a set of guidelines or best practices based on this information?
  • Seek next steps: What questions would you ask to further investigate or explore this topic?
  • Reflect on process: What did you learn about how you went about this task, and what changes would you make next time to improve?

Questions for Students to Ask Themselves

  • K-W-L: What do I already know about this topic, what do I still need to learn, and what have I learned today?
  • Compare and contrast with prior knowledge: How does this new information relate to what I already know?
  • Identify assumptions: What assumptions am I making, and are they justified?
  • Organize: How can I organize this information in a way that makes sense to me?
  • Identify trends: What patterns or connections can I identify between these concepts or ideas?
  • Think from another perspective: Am I considering multiple perspectives or viewpoints in my analysis?
  • Brainstorm implications: What are the potential implications of my conclusions or decisions?
  • Hypothesize: How can I use my current knowledge to predict or hypothesize about future events?
  • Identify inconsistency: Can I recognize any logical fallacies or inconsistencies in my reasoning?
  • Seek new strategies: What strategies can I employ to improve my understanding and retention of this material?

Higher-Order Thinking vs Lower-Order Thinking

Benefits of Higher-Order Thinking

Higher-order thinking offers numerous benefits to learners, including:

  • Enhanced problem-solving skills: Higher-order thinking develops a student’s ability to tackle complex problems by breaking them down, analyzing different aspects, and putting the information back together to find new solutions. This is highly valued in 21st Century workplaces (Saifer, 2018).
  • Critical thinking and reasoning: Students who engage in higher-order thinking are better equipped to evaluate information, question assumptions, and identify biases. This helps them to have better media literacy and enables them to form independent conclusions rather than being easily swayed by flawed information (Richland & Simms, 2015).
  • Creativity and innovation: Higher-order thinking fosters creativity by encouraging students to think beyond the obvious. Students are encouraged to explore alternative perspectives and find alternative ways to approach common problems. This creative thinking is highly valuable in various academic and professional fields, including STEM and the arts.
  • Deeper understanding and retention: Lower-order thinking prioritizes memorization, but because the information is not sufficiently contextualized and learned through knowledge construction, it tends to be lost with time. Higher-order thinking, on the other hand, promotes a more profound understanding of subjects. This deeper comprehension leads to better long-term retention of knowledge and a better ability to manipulate information (Ghanizadeh, Al-Hoorie & Jahedizadeh, 2020).
  • Greater self-awareness and metacognition: Higher-order thinking fosters self-reflection and metacognition. Students who have learned skills like critique, identifying flaws and biases, and logical analysis are able to apply those skills to their own thinking to reflect on how they can improve their own rational meaning-making.

How to Stimulate Higher-Order Thinking in the Classroom

  • Cultivate inquisitive minds: Encourage students to ask questions – regularly. Create a classroom culture where questioning is encouraged and there are “no wrong questions.” Encourage questions that delve deeper into subjects, challenge assumptions, or stimulate further curiosity. This will foster their critical thinking by constantly making them peel back the layers of knowledge on any topic (Yen & Halili, 2015).
  • Tackle real-life challenges: Create lesson plans that root the learning content in real-world situations (i.e., situated learning). Require students to apply their knowledge and skills to new situations rather than just on worksheets. By addressing genuine issues that, ideally, are relevant to students’ lives, students can start to work with and manipulate the knowledge they have received in the classroom (Saifer, 2018).
  • Encourage collaboration and active learning: Promote group discussions, debates, and cooperative problem-solving activities. Group work helps with higher-order thinking because students are exposed to diverse perspectives and new ways of doing things from their peers. By seeing others’ thought processes, we can enhance our own (Ghanizadeh, Al-Hoorie & Jahedizadeh, 2020).
  • Reflect and build self-awareness: Nurture the habit of self-reflection in students. Here, we’re referring to the concept of metacognition, or ‘thinking about thinking’. This encourages students to evaluate how they went about learning and continually work on improving their learning process. It plays a vital role in helping students recognize their strengths and weaknesses and refine their learning strategies (Yen & Halili, 2015).
  • Interweave interdisciplinary connections: Combine ideas, concepts, and techniques from various disciplines to encourage a comprehensive understanding of complex subjects. One discipline may shed light on the topic in a way that another discipline is completely blind to. By establishing connections between different fields, students can sharpen their analytical and creative thinking abilities (Richland & Simms, 2015).

Higher-Order Thinking on Bloom’s Taxonomy

Higher and lower-order thinking skills are most famously presented in Bloom’s Taxonomy.

This taxonomy is used to categorize levels of understanding, starting from shallow knowledge and ending with deep understanding.

Below is an image demonstrating the Bloom’s Taxonomy hierarchy of knowledge:

blooms taxonomy, explained below

As shown in the above image, Bloom distils 6 forms of knowledge and understanding. The bottom 3 (remember, understand, and apply) relate to lower-order thinking that doesn’t require deep knowledge. The top 3 (analyze, evaluate, create) represent higher-order thinking.

Each is explained below:

1. Remembering (Lower-Order)

Definition: This is the most fundamental level of understanding that involves remembering basic information regarding a subject matter. This means that students will be able to define concepts, list facts, repeat key arguments, memorize details, or repeat information.

Example Question: “What is 5×5?”

2. Understanding (Lower-Order)

Definition: Understanding means being able to explain. This can involve explaining the meaning of a concept or an idea. This is above remembering because it requires people to know why, but it is not yet at a level of analysis or critique.

Example Question: “Can you show me in a drawing what 5×5 looks like?”

3. Applying (Middle-Order)

Definition: Applying refers to the ability to use information to do work. Ideally, it will occur in situations other than the situation in which it was learned. This represents a deeper level of understanding.

Example Question: “If you buy five chocolates worth $5 each, how much will you have to pay?”

4. Analyzing (Higher-Order)

Definition: This is generally considered to be the first layer of higher-order thinking. It involves conducting an analysis independently. This includes the ability to make connections between ideas, explore the logic of an argument, and compare various concepts.

Example Question: “Based on what you’ve learned, can you identify five key themes?”

5. Evaluating (Higher-Order)

Definition: Evaluating means determining the correctness, morality, or rationality of a perspective. At this level, students can identify the merits of an argument or point of view and weigh the relative strengths of each point. It requires analysis, but steps up to making judgments about what you’re seeing.

Example Question: “Based on all the information you’ve gathered, what do you think is the most ethical course of action?”

6. Creating (Higher-Order)

Definition: The final level of Bloom’s taxonomy is when students can create knowledge by building on what they already know. This may include, for example, formulating a hypothesis and then testing it through rigorous experimentation.

Example Question: “Now you’ve mastered an understanding of accounting, could you make an app that helps an everyday person manage their bookkeeping?”
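To make the hierarchy easier to work with when planning questions, here is a minimal, illustrative sketch of how the six levels, the tier labels used above, and one example question stem per level could be organized. The structure mirrors the taxonomy described in this section, but the code itself (names, stems, and the helper function) is hypothetical rather than taken from any of the cited sources.

```python
# Illustrative sketch (hypothetical): Bloom's six levels, the tier labels used
# in this article, and one example question stem per level.

BLOOM_LEVELS = [
    # (level, tier as labelled above, example question stem)
    ("remember",   "Lower-Order",  "What is 5 x 5?"),
    ("understand", "Lower-Order",  "Can you show me in a drawing what 5 x 5 looks like?"),
    ("apply",      "Middle-Order", "If you buy five chocolates worth $5 each, how much will you pay?"),
    ("analyze",    "Higher-Order", "Based on what you've learned, can you identify five key themes?"),
    ("evaluate",   "Higher-Order", "What do you think is the most ethical course of action?"),
    ("create",     "Higher-Order", "Could you build an app that helps someone manage their bookkeeping?"),
]

def is_higher_order(level: str) -> bool:
    """Return True if the named level falls in the higher-order tier."""
    tiers = {name: tier for name, tier, _ in BLOOM_LEVELS}
    return tiers.get(level.lower()) == "Higher-Order"

if __name__ == "__main__":
    for name, tier, stem in BLOOM_LEVELS:
        print(f"{name:10s} [{tier:12s}] e.g., {stem}")
    print("Is 'evaluate' higher-order?", is_higher_order("evaluate"))  # True
```

A teacher auditing a question bank could, for instance, tag each planned question with its level and use a check like this to see what proportion of a lesson’s questions reach the higher-order tier.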

Higher-order thinking is a necessary skill for the 21st Century. It promotes those thinking skills that are required for high-paying jobs and allows people to think critically, be more media literate, and come to better solutions to problems both in their personal and professional lives. By encouraging this sort of thinking in school, educators can help their students get better grades now and live a better life into the future.

Ghanizadeh, A., Al-Hoorie, A. H., & Jahedizadeh, S. (2020).  Higher order thinking skills in the language classroom: A concise guide . New York: Springer International Publishing.

Richland, L. E., & Simms, N. (2015). Analogy, higher order thinking, and education.  Wiley Interdisciplinary Reviews: Cognitive Science ,  6 (2), 177-192. doi: https://doi.org/10.1002/wcs.1336

Saifer, S. (2018).  HOT skills: Developing higher-order thinking in young learners . London: Redleaf Press.

Stanley, T. (2021). Promoting rigor through higher level questioning: Practical strategies for developing students’ critical thinking. New York: Taylor & Francis.

Yen, T. S., & Halili, S. H. (2015). Effective teaching of higher order thinking (HOT) in education.  The Online Journal of Distance Education and e-Learning ,  3 (2), 41-47.


50+ Higher-Order Thinking Questions To Challenge Your Students

50+ lower order thinking questions too!

Did the character make a good decision? Why or why not?

Want to help your students make strong connections with the material? Ensure you’re using all six levels of cognitive thinking. This means asking lower-order thinking questions as well as higher-order thinking questions. Learn more about each here, and find plenty of examples for each.

What are lower-order and higher-order thinking questions?

An altered form of the Bloom's Taxonomy pyramid, showing the three higher order level skills spread across the top tier together

Source: University of Michigan

Bloom’s Taxonomy is a way of classifying cognitive thinking skills. The six main categories—remember, understand, apply, analyze, evaluate, create—are broken into lower-order thinking skills (LOTS) and higher-order thinking skills (HOTS). LOTS includes remember, understand, and apply. HOTS covers analyze, evaluate, and create.

While both LOTS and HOTS have value, higher-order thinking questions urge students to develop deeper connections with information. They also encourage kids to think critically and develop problem-solving skills. That’s why teachers like to emphasize them in the classroom.

New to higher-order thinking? Learn all about it here. Then use these lower- and higher-order thinking questions to inspire your students to examine subject material on a variety of levels.

Remember (LOTS)

  • Who are the main characters?
  • When did the event take place?
  • What is the setting of the story?


  • Where would you find _________?
  • How do you __________?
  • What is __________?
  • How do you define _________?
  • How do you spell ________?
  • What are the characteristics of _______?
  • List the _________ in proper order.
  • Name all the ____________.
  • Describe the __________.
  • Who was involved in the event or situation?


  • How many _________ are there?
  • What happened first? Next? Last?

Understand (LOTS)

  • Can you explain why ___________?
  • What is the difference between _________ and __________?
  • How would you rephrase __________?
  • What is the main idea?
  • Why did the character/person ____________?


  • What’s happening in this illustration?
  • Retell the story in your own words.
  • Describe an event from start to finish.
  • What is the climax of the story?
  • Who are the protagonists and antagonists?


  • What does ___________ mean?
  • What is the relationship between __________ and ___________?
  • Provide more information about ____________.
  • Why does __________ equal ___________?
  • Explain why _________ causes __________.

Apply (LOTS)

  • How do you solve ___________?
  • What method can you use to __________?
  • What methods or approaches won’t work?


  • Provide examples of _____________.
  • How can you demonstrate your ability to __________?
  • How would you use ___________?
  • Use what you know to __________.
  • How many ways are there to solve this problem?
  • What can you learn from ___________?
  • How can you use ________ in daily life?
  • Provide facts to prove that __________.
  • Organize the information to show __________.


  • How would this person/character react if ________?
  • Predict what would happen if __________.
  • How would you find out _________?

Analyze (HOTS)

  • What facts does the author offer to support their opinion?
  • What are some problems with the author’s point of view?
  • Compare and contrast two main characters or points of view.


  • Discuss the pros and cons of _________.
  • How would you classify or sort ___________?
  • What are the advantages and disadvantages of _______?
  • How is _______ connected to __________?
  • What caused __________?
  • What are the effects of ___________?
  • How would you prioritize these facts or tasks?
  • How do you explain _______?
  • Using the information in a chart/graph, what conclusions can you draw?
  • What does the data show or fail to show?
  • What was a character’s motivation for a specific action?


  • What is the theme of _________?
  • Why do you think _______?
  • What is the purpose of _________?
  • What was the turning point?

Evaluate (HOTS)

  • Is _________ better or worse than _________?
  • What are the best parts of __________?
  • How will you know if __________ is successful?
  • Are the stated facts proven by evidence?
  • Is the source reliable?


  • What makes a point of view valid?
  • Did the character/person make a good decision? Why or why not?
  • Which _______ is the best, and why?
  • What are the biases or assumptions in an argument?
  • What is the value of _________?
  • Is _________ morally or ethically acceptable?
  • Does __________ apply to all people equally?
  • How can you disprove __________?
  • Does __________ meet the specified criteria?


  • What could be improved about _________?
  • Do you agree with ___________?
  • Does the conclusion include all pertinent data?
  • Does ________ really mean ___________?

Create (HOTS)

  • How can you verify ____________?
  • Design an experiment to __________.
  • Defend your opinion on ___________.
  • How can you solve this problem?
  • Rewrite a story with a better ending.


  • How can you persuade someone to __________?
  • Make a plan to complete a task or project.
  • How would you improve __________?
  • What changes would you make to ___________ and why?
  • How would you teach someone to _________?
  • What would happen if _________?
  • What alternative can you suggest for _________?
  • What solutions do you recommend?
  • How would you do things differently?


  • What are the next steps?
  • What factors would need to change in order for __________?
  • Invent a _________ to __________.
  • What is your theory about __________?

What are your favorite higher-order thinking questions? Come share in the WeAreTeachers HELPLINE group on Facebook.

Plus, 100+ critical thinking questions for students to ask about anything.

Use these higher-order thinking questions to challenge students to analyze and evaluate information and use it to create something new.



Higher Order Thinking: Bloom’s Taxonomy

Many students start college using the study strategies they used in high school, which is understandable—the strategies worked in the past, so why wouldn’t they work now? As you may have already figured out, college is different. Classes may be more rigorous (yet may seem less structured), your reading load may be heavier, and your professors may be less accessible. For these reasons and others, you’ll likely find that your old study habits aren’t as effective as they used to be. Part of the reason for this is that you may not be approaching the material in the same way as your professors. In this handout, we provide information on Bloom’s Taxonomy—a way of thinking about your schoolwork that can change the way you study and learn to better align with how your professors think (and how they grade).

Why higher order thinking leads to effective study

Most students report that high school was largely about remembering and understanding large amounts of content and then demonstrating this comprehension periodically on tests and exams. Bloom’s Taxonomy is a framework that starts with these two levels of thinking as important bases for pushing our brains to five other higher order levels of thinking—helping us move beyond remembering and recalling information and move deeper into application, analysis, synthesis, evaluation, and creation—the levels of thinking that your professors have in mind when they are designing exams and paper assignments. Because it is in these higher levels of thinking that our brains truly and deeply learn information, it’s important that you integrate higher order thinking into your study habits.

The following categories can help you assess your comprehension of readings, lecture notes, and other course materials. By creating and answering questions from a variety of categories, you can better anticipate and prepare for all types of exam questions. As you learn and study, start by asking yourself questions and using study methods from the level of remembering. Then, move progressively through the levels to push your understanding deeper—making your studying more meaningful and improving your long-term retention.

Level 1: Remember

This level helps us recall foundational or factual information: names, dates, formulas, definitions, components, or methods.

Level 2: Understand

Understanding means that we can explain main ideas and concepts and make meaning by interpreting, classifying, summarizing, inferring, comparing, and explaining.

Level 3: Apply

Application allows us to recognize or use concepts in real-world situations and to address when, where, or how to employ methods and ideas.

Level 4: Analyze

Analysis means breaking a topic or idea into components or examining a subject from different perspectives. It helps us see how the “whole” is created from the “parts.” It’s easy to miss the big picture by getting stuck at a lower level of thinking and simply remembering individual facts without seeing how they are connected. Analysis helps reveal the connections between facts.

Level 5: Synthesize

Synthesizing means considering individual elements together for the purpose of drawing conclusions, identifying themes, or determining common elements. Here you want to shift from “parts” to “whole.”

Level 6: Evaluate

Evaluating means making judgments about something based on criteria and standards. This requires checking and critiquing an argument or concept to form an opinion about its value. Often there is not a clear or correct answer to this type of question. Rather, it’s about making a judgment and supporting it with reasons and evidence.

Level 7: Create

Creating involves putting elements together to form a coherent or functional whole. Creating includes reorganizing elements into a new pattern or structure through planning. This is the highest and most advanced level of Bloom’s Taxonomy.

Pairing Bloom’s Taxonomy with other effective study strategies

While higher order thinking is an excellent way to approach learning new information and studying, you should pair it with other effective study strategies. Check out some of these links to read up on other tools and strategies you can try:

  • Study Smarter, Not Harder
  • Simple Study Template
  • Using Concept Maps
  • Group Study
  • Evidence-Based Study Strategies Video
  • Memory Tips Video
  • All of our resources

Other UNC resources

If you’d like some individual assistance using higher order questions (or with anything regarding your academic success), check out some of your UNC resources:

  • Academic Coaching: Make an appointment with an academic coach at the Learning Center to discuss your study habits one-on-one.
  • Office Hours: Make an appointment with your professor or TA to discuss course material and how to be successful in the class.




Higher Order Thinking Questions for Your Next Lesson


Higher order thinking questions help students explore and express rigor in their application of knowledge. There are 5 main areas of higher order thinking that promote rigor:

  • Higher Level Thinking
  • Engagement
  • Inquiry
  • Demonstration
  • Quality Over Quantity

Each of these areas encourages students to move beyond rote knowledge and to expand their thinking process. Let’s explore each in more depth.

Higher Level Thinking

Higher level thinking is simply taking our students to the next level by pushing for more than simple recall or comprehension. There are many resources for higher level thinking. Costa’s Levels of Questioning, Bloom’s Taxonomy, and Webb’s Depth of Knowledge are three common references for building higher level thinking. Take a look at this Higher Order Thinking Chart to help you organize these methods and see how to apply them in your own lessons:

Higher Order Thinking Questions Reference


All 3 of these methodologies provide building blocks for increasing the level of thinking. Creating opportunities for students to work within the recalling and remembering level is relatively simple because we are asking students to identify or recall information. However, moving to the higher levels, things become a little more difficult. Here’s a basic list of higher order thinking questions to get you started. Now, let’s take a look at how to do this specifically within the STEAM areas.

Science

Webb (2002) offers some of the following activities for using higher levels in science.

DOK Level 1

Recall or recognize a fact, term, or property. Represent in words or diagrams a scientific concept or relationship. Provide or recognize a standard scientific representation for a simple phenomenon. Perform a routine procedure such as measuring length.

DOK Level 2

Specify and explain the relationship between facts, terms, properties, or variables. Describe and explain examples and non-examples of science concepts. Select a procedure according to specified criteria, and perform it. Formulate a routine problem given data and conditions. Organize, represent and interpret data.

DOK Level 3

Identify research questions and design investigations for a scientific problem. Solve non-routine problems, then develop a scientific model for a complex situation. And finally, form conclusions from experimental data.

DOK Level 4

Based on provided data from a complex experiment that is novel to the student, deduce the fundamental relationship between several controlled variables. Conduct an investigation, from specifying a problem to designing and carrying out an experiment, to analyzing its data and forming conclusions.

Technology

The SBBC Department of Instructional Technology has developed a comprehensive chart of both teacher-directed and student-directed activities pushing students to higher-level thinking skills.

Engineering

Engineering standards are embedded within the Next Generation Science Standards and are engineered with higher-level thinking in mind. The objectives of secondary education engineering are already designed around the Depth of Knowledge levels:

  • Defining and delimiting engineering problems involves stating the problem as clearly as possible in terms of criteria for success, and constraints or limits.
  • Designing solutions to engineering problems begins with generating a number of different possible solutions, then evaluating potential solutions to see which ones best meet the criteria and constraints of the problem.
  • Optimizing the design solution involves a process in which solutions are systematically tested, and the final design is improved by trading off less important features for those more important.

Arts

Similar to many of the STEAM subjects, the arts push students to higher levels due to the nature of artistic creation. Gerald Aungst designed a wonderful reference chart providing concrete examples of how each of the arts can utilize the higher levels of Depth of Knowledge.

Mathematics

The Kentucky Department of Education has a great resource using Webb’s Depth of Knowledge for building higher-level thinking in Mathematics. Examples include:

DOK Level 1

Identify a diagonal in a geometric figure. Multiply two numbers. Find the area of a rectangle. Convert scientific notation to decimal form. Measure an angle.

DOK Level 2

Classify quadrilaterals. Compare two sets of data using the mean, median, and mode of each set. Determine a strategy to estimate the number of jellybeans in a jar. Extend a geometric pattern. Organize a set of data and construct an appropriate display.

DOK Level 3

Write a mathematical rule for a non-routine pattern. Explain how changes in the dimensions affect the area and perimeter/circumference of geometric figures. Determine the equations and solve and interpret a system of equations for a given problem. Provide a mathematical justification when a situation has more than one possible outcome. Interpret information from a series of data displays.

DOK Level 4

Collect data over time taking into consideration a number of variables and analyze the results. Model a social studies situation with many alternatives and select one approach to solve with a mathematical model. Develop a rule for a complex pattern and find a phenomenon that exhibits that behavior. Complete a unit of formal geometric constructions, such as nine-point circles or the Euler line. Construct a non-Euclidean geometry.
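To make one of these concrete, here is a short worked derivation for the Level 3 example above about how changing dimensions affects area and perimeter. It is my own illustration (not taken from the Kentucky resource), using a rectangle whose side lengths are both scaled by a factor k:

```latex
% Author's illustration: scaling a rectangle's sides by a factor k.
A = lw, \qquad P = 2(l + w)                 % original rectangle
A' = (kl)(kw) = k^{2} lw = k^{2} A          % scaled area
P' = 2(kl + kw) = k \cdot 2(l + w) = kP     % scaled perimeter
k = 2 \;\Rightarrow\; A' = 4A, \quad P' = 2P % doubling the sides quadruples the area but only doubles the perimeter
```

A Level 3 response would go beyond the computation to justify the general claim that area scales with the square of the factor while perimeter scales linearly.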

Engagement

Often confused with fun, engagement is the presence of all student minds hard at work: ensuring that all student voices are heard and all students are a part of the learning process.

The Glossary of Education Reform defines engagement as “the degree of attention, curiosity, interest, optimism, and passion that students show when they are learning or being taught, which extends to the level of motivation they have to learn and progress in their education.” The words that stand out most to me are curiosity and interest. If we foster curiosity, then attention, optimism, and passion will follow.

STEAM content areas inherently encourage engagement, and below is a compilation of resources that build Rigor through Engagement in the STEAM classrooms.

Science provides many opportunities for engagement through experimentation and labs. But remember to use focused notes strategies for engagement while presenting information prior to the hands-on experiments. Life Sciences Education offers Biology-specific strategies for student engagement in Structure Matters: Twenty-One Teaching Strategies to Promote Student Engagement and Cultivate Classroom Equity.

Technology, in and of itself, fosters engagement. There are so many tools available for our students and teachers. Thomas Murray presents this list of ways to use technology as a tool of engagement in the classroom.

Teach Engineering has a great chart of engagement activities to use in the engineering classroom. Nova Teachers also provides a long list of engaging, student-centered activities for engineering in Classroom Activities.

The Perpich Professional Development and Resource Center has documented engaging activities for six arts disciplines spanning kindergarten to high school.  

Edutopia provides insight on motivating our students in mathematics in 9 Strategies for Motivating Students in Mathematics. The ASCD provides a quick start guide to Common Core math in Unlocking Engagement through Mathematical Discourse.

Inquiry

Inquiry and curiosity, the original purposes of education, are often pushed aside for test prep through breadth, not depth. Rigor encourages curiosity, and curiosity spawns inquiry, allowing for a more in-depth look at topics and content.

Merriam-Webster defines inquiry as a request for information, an official effort to collect and examine information about something, and the act of asking questions in order to collect information. Translating this into the classroom may seem easy, but there is more to inquiry than simply getting students to ask questions. Thirteen.org offers comprehensive overviews of bringing the inquiry process into the classroom.

One key piece of the inquiry process is asking effective questions. How do you do that?

1. Identify your essential question

First, we must identify the “big idea.” What is the larger question around the piece of art your students are engaging with? It’s time to think beyond your lesson plan! Is the true, essential objective of your lesson that students demonstrate that they know Georges Seurat painted “A Sunday on La Grande Jatte” using a technique called pointillism, through identification and the development of a matching product? Or is there something bigger? The essential questions for National Core Visual Arts Standard 1.2 read:

“How does knowing the contexts, histories, and traditions of art forms help us create works of art and design? Why do artists follow or break from established tradition? How do artists determine what resources are needed to formulate artistic investigation?”

Outlining some big ideas and essential questions for your content area that encourage creative, artful thinking can serve to guide you this year.

2. Build an effective questioning toolkit.

This is a great time to look at the essential questions built right into the National Core Arts Standards, and to begin developing some lines of effective questioning that help students meet those standards. What kinds of questions will you ask to encourage inquiry around a piece of art, music, theatre, or dance? How will you guide students to the big idea with smaller questions?

3. Give wait time.

When time is at a premium, it’s easy to forget to do this. However, giving students moments of thoughtful silence to formulate their own observation, ideas, hypotheses, and opinions is crucial to developing artistic minds. Every student should have time to think individually before discussion, so that they all have something to share. Challenge yourself to give your students just a little bit longer this year!

4. Allow opportunities for all students to engage.

This might mean giving students time to turn and talk with a partner. It might mean instituting a “no hands up” policy that allows you to choose who will respond. This gives students the opportunity to continue thinking while responses are made. Encouraging discussion among all students is difficult to do within time constraints, but it is vitally important to ensure that every child is thinking critically and artfully.

5. Dig deeper.

Follow up student responses in a way that encourages deeper thinking. Ask students to explain their thinking using support and evidence from the piece of art. This is a standard and a skill that crosses all curricular lines, so by encouraging this, we are achieving standards in every content area. What better use of time is there than this?

Demonstration

Actions speak louder than words in all areas of life, and education is no different. Being able to recall and regurgitate rote information was helpful in the pre-Google era, but now we need our students to show us they understand, not just tell us.

The foundation of Demonstration is the age-old mantra: don’t tell me… show me.  Beyond showing, if we have students demonstrate through real-world application we can engage them even more. We can then provide a rigorous platform for their knowledge.

Through experiments, science naturally promotes demonstration.  However, if we involve real-world applications, it becomes exciting and engaging.  Have students solve realistic problems with limited resources, or propose solutions to issues on a global scale.

Technology offers many opportunities for students to demonstrate their knowledge through real-life application.  Using Project-Based Learning in the technology classroom creates engaging lessons with rigorous application of demonstration.  The following sites offer ideas for demonstration through technology projects:

Bringing Real-World Project Management into Technology Lessons

Top 10 Innovative Projects

20 Ideas for Engaging Projects

Engineering uses the design process to build and create, which is innately demonstration.  However, if you have students determine their own projects they will be engaged and excited about demonstrating their knowledge.  Have students journal issues they encounter for one week.  Then, have them choose one to solve by building/creating a product that helps solve the issue.  Check out these sites for engaging engineering projects:

100 Engineering Projects for Kids

Purdue EPICS High School Projects

Hands-On Engineering STEM Projects

The arts are built on creation, but often it is the teacher demonstrating and the students mimicking the process. We teach a dance or a piece of music, and the students copy. Instead, have students demonstrate their knowledge of skills by doing the creating themselves:

High School Art Lessons

Arts Integration Lessons

Artsonia Art Lessons

Math is an area where it is difficult to step away from the traditional methods of instruction.  The teacher demonstrates, the class practices as a whole, and then practices individually.  What would happen if we taught math through projects and allowed students to demonstrate what they know? Here are a couple of sites that bring demonstration into the math classroom:

NASA’s Exploring Math

Authentic Assessment Examples

Demonstration is a great way to bring engagement through rigor into the classroom.  Don’t be afraid to share objectives, standards, and goals with students to have them determine how best they can demonstrate their knowledge.

Quality Over Quantity

Rigor does not mean more – it means better. Students don’t need more work; they need better work. Furthermore, they need exciting work that makes them want to work.

When teachers are asked why they provide curriculum, units, lesson plans, and homework, the answer that comes back is often “I’m not sure.”

Here are some helpful ways to focus on the quality of instruction, rather than simply the quantity of what’s provided:

When mapping out your semester or yearly curriculum, work backwards. Take a look at the standards you plan to cover, and think project-based when you ask yourself how students should demonstrate the standard(s). Keep the big picture in mind when you finally create the project. As you design the major units needed in order to accomplish the big picture, continue to ask why.

Continue working backward as you move into the larger units of your curriculum.  What important information do students need to know or understand in order to achieve the big picture?  How and what will students do in order to prove they have learned?

As you begin designing the day-to-day lessons, keep asking why. For everything from the opening activity to the exit slip, make sure you are asking why. Don’t do something because you think you are supposed to; be sure each and every activity/task has a purpose. Not only you, but also your students, should know why the activities are being completed and the overall purpose.

Homework is probably the largest area where quality over quantity needs to be investigated. Why do we give homework? Because we are supposed to, because our teachers gave us homework, because it helps. But does it really? Alfie Kohn provided 8 conclusions in his 2006 book The Homework Myth:

  • At best, most homework studies show only an association, not a causal relationship.
  • Do we really know how much homework kids do?
  • Homework studies confuse grades and test scores with learning.
  • Homework matters less the longer you look.
  • Even where they do exist, positive effects are often quite small.
  • There is no evidence of any academic benefit from homework in elementary school.
  • The results of national and international exams raise further doubts about homework’s role.
  • Incidental research raises further doubts about homework.

So as you assign homework, keep asking yourself why. If you are assigning it because you think you have to, then stop.

As you continue to work through lesson planning, curriculum design, and providing high-quality instruction, keep in mind these examples of higher-order thinking questions and examples. The more we engage students in rigorous and purposeful content that encourages inquiry and critical thinking, the more they will be prepared for the 21st century.


After reading The Diary of Anne Frank, a student is asked, “Who is Anne Frank?” To answer the question, the student simply recalls the information he or she memorized from the reading.

With the implementation of Common Core, students are expected to become critical thinkers instead of just recalling facts and ideas from text. In order for students to reach this potential and be prepared for success, educators must engage students during instruction by asking higher-order questions.

Higher-order Questions (HOQ)

Higher-order questions are those that the students cannot answer just by simple recollection or by reading the information “verbatim” from the text. Higher-order questions put advanced cognitive demand on students. They encourage students to think beyond literal questions.

Higher-order questions promote critical thinking skills because these types of questions expect students to apply, analyze, synthesize, and evaluate information instead of simply recalling facts. For instance, application questions require students to transfer knowledge learned in one context to another; analysis questions expect students to break the whole into component parts, such as analyzing mood, setting, and characters, expressing opinions, making inferences, and drawing conclusions; synthesis questions have students use old ideas to create new ones using information from a variety of sources; and evaluation questions require students to make judgments, explain reasons for judgments, compare and contrast information, and develop reasoning using evidence from the text.

Higher-order Questions Research

According to research, teachers who effectively use a variety of higher-order questions can overcome the brain’s natural tendency to develop mental routines and patterns that limit information, which is called neural pruning. As a result, students’ brains may become more open-minded, which strengthens the brain.

According to an article in Educational Leadership (March 1997), researchers Thomas Cardellichio and Wendy Field discovered that higher-order questions increase neural branching, the opposite of neural pruning. In addition, these researchers found that teachers can promote the process of neural branching through seven types of questions.

  • Hypothetical thinking. This form of thinking is used to create new information. It causes a person to develop an answer based on generalizations related to that situation. These questions follow general forms such as What if this happened? What if this were not true?, etc.
  • Reversal thinking. This type of thinking expects students to turn a question around and look for opposite ideas. For example, What happens if I reverse the addends in a math problem? What caused this? How does it change if I go backward?, etc.
  • Application of different symbol systems. This way of thinking applies a symbol system to a situation for which it is not usually used, such as writing a math equation to show how animal interactions are related.
  • Analogy. This process of thinking compares unrelated situations, such as how the Pythagorean Theorem is related to cooking. These questions typically ask How is this like ___?
  • Analysis of point of view. This way of thinking requires students to consider and question other people’s perspectives, beliefs, or opinions in order to extend their minds. For instance, a teacher may ask a student, What else could account for this? or How many other ways could someone look at this?
  • Completion. This form of thinking requires students to finish an incomplete project or situation that would normally be completed. For example, removing the end of a story and expecting the students to create their own ending.
  • Web analysis. With web analysis, students must synthesize how events are related in complex ways instead of simply relying on the brain’s natural ability to develop a simple pattern. For example, How extensive were the effects of _____? or Track the relationship of events following from ___ are types of web analysis questions.

The researchers concluded that this type of questioning can lead to better critical thinking skills. “They can analyze, synthesize, evaluate, and interpret the text they are reading at complex levels. They can process text at deep levels, make judgments, and detect shades of meaning. They can make critical interpretations and demonstrate high levels of insight and sophistication in their thinking. They are able to make inferences, draw relevant and insightful conclusions, use their knowledge in new situations, and relate their thinking to other situations and to their own background knowledge. These students fare well on standardized tests and are considered to be advanced. They will indeed be prepared to function as outstanding workers and contributors in a fast-paced workplace where the emphasis is on using information rather than just knowing facts.”

Higher-order Questions and Explicit Direct Instruction

The Explicit Direct Instruction (EDI) model incorporates a variety of higher-order questions in order to encourage and increase critical thinking skills.

The LEARNING OBJECTIVE component in EDI is the only question that is at a low level of Bloom’s Taxonomy. The reason is that the content during this portion of the lesson is not at a high level; also, the students have not yet been taught the high-level content. Typically, the question asked of students is “What are we doing today?” or “What is our Learning Objective?”

The CONCEPT DEVELOPMENT component includes a variety of higher-order concept-related questions because the content is at a high level. Here is a list of higher-order questions that are asked during this EDI component:

  • In your own words, what is (insert the concept being taught)?
  • Which is an example of ________? Why?
  • What is the difference between the example and the non-example?
  • Why is this an example of ______?
  • Give me an example of ______.
  • Draw an example of ______.
  • Match the examples to the definition of ______.
  • Which picture/poster shows an example of _______?

The SKILL DEVELOPMENT component asks higher-level thinking-process questions after modeling the skill.

  • How did I know how to (insert skill modeled)?
  • How did I know that this was the correct answer?
  • How did I use ______ to ensure that I knew how to find the _____?
  • How did I know how to interpret the answer?

The GUIDED PRACTICE asks higher-level process questions that require the students to show their thought process when performing the skill.

  • How did you know how to __________?
  • How did you know that this was the correct answer?
  • How did you use ______ to ensure that you knew how to find the _____?
  • How did you know how to interpret the answer?
  • Which step was most difficult for you? Why?

The RELEVANCE component includes higher-level evaluation questions.

  • Does anyone have any other reason as to why this is important?
  • Which reason is the most relevant to you? Why?

The CLOSURE component includes high-level questions such as:

  • What did you learn today?
  • How did the lesson meet the Learning Objective?
  • How will this lesson benefit you in the future?

If higher-order questions promote critical thinking skills, as research shows, then higher-order questions should be included throughout instruction. The EDI model offers a good way to do just that!

Cardellichio, T., & Field, W. (1997, March). Seven strategies that encourage neural branching. Educational Leadership.

How do you incorporate higher-order questions during instruction?

Additional Resources:

  • Educeri; Ready Made Lessons with Higher-Order Questions
  • The Secret to Differentiation with EDI: Making Better Decisions at Choice Points
  • How to Differentiate and Scaffold – after teaching including RTI
  • How to Differentiate and Scaffold – Before the Lesson
  • How to Differentiate and Scaffold – while teaching
  • Differentiation Strategies: Teaching Grade-Level Content to ALL Students
  • Explicit Direct Instruction (EDI) Resource Page



How to Lead Students to Engage in Higher Order Thinking

Asking students a series of essential questions at the start of a course signals that deep engagement is a requirement.


I teach multigrade, theme-based courses like Spirituality in Literature and The Natural World in Literature to high school sophomores, juniors, and seniors. And like most English language arts teachers, I’ve taught courses built around the organizing principles of genre (Introduction to Drama), time period and geography (American Literature From 1950), and even assessment instrument (A.P. Literature).

No matter what conceptual framework guides the course I’m teaching, though, I begin and anchor it with what I call a thinking inventory.

Thinking Inventories and Essential Questions

Essential questions—a staple of project-based learning—call on students’ higher order thinking and connect their lived experience with important texts and ideas. A thinking inventory is a carefully curated set of about 10 essential questions of various types, and completing one is the first thing I ask students to do in every course I teach.

Although a thinking inventory is made up of questions, it’s more than a questionnaire. When we say we’re “taking inventory”—whether we’re in a warehouse or a relationship—we mean we’re taking stock of where things stand at a given moment in time, with the understanding that those things are fluid and provisional. With a thinking inventory, we’re taking stock of students’ thinking, experiences, and sense-making at the beginning of the course.

A well-designed thinking inventory formalizes the essential questions of any course and serves as a touchpoint for both teacher and students throughout that course. For a teacher, writing a course’s thinking inventory can help separate the essential from the nonessential when planning. And starting your class with a thinking inventory signals to students that higher order thinking is both required and valued.

How to Design an Effective Thinking Inventory

I tell students the thinking inventory is a document we’ll be living with—revisiting and referring to often—and that they should spend time mulling their answers before writing them down. The inventory should include a variety of essential questions, including ones that invite students to share relevant experiences.

I may ask students about their current knowledge base or life experience (What’s the best example of empathy you’ve ever witnessed?). I may ask them to make predictions or imagine scenarios (How will an American Literature course in 100 years look different from today’s American Literature course?). Or I may ask perennial questions (To what extent is it possible for human beings to change fundamentally?).

Here are a few of the questions I asked students to address at the start of a course called The Outsider in Literature:

  • Who is the most visionary person you know? How do you know they’re visionary? Is there anything about them you want to emulate? Anything about them that frightens you?
  • What are the risks of rebelling? Of not rebelling?  Explain.
  • What would happen if there were no outsiders? How would the world, and your world, be different?
  • Do you think there are any ongoing conflicts between groups that are intractable—that will likely never be resolved? What is the root of the intractability? What would need to happen in order to resolve the conflict? Be specific.
  • Who is the most deviant, threatening outsider you can think of? Tell us what makes them threatening.
  • To what extent do you think that teenagers, as a group, are (by definition) outsiders?

How I Use Thinking Inventories

On the first day of class, I give students the inventory for homework. Because I expect well-thought-out answers and generative thinking, I assign it in chunks over two nights, and we spend at least the second and third class meetings discussing their answers.

Throughout the course, I use the inventory both implicitly and explicitly. I purposefully weave inventory questions into discussions and student writing prompts. More explicitly, I use inventory questions as a framework for pre- and post-reading activities, and as prompts for reading responses, formal writing, and journaling.

The inventory functions as a kind of time stamp that documents each student’s habits of mind, opinions, and ways of framing experience at the start of the year or semester. At the midpoint and at the end of the course, I have students return to their inventory, choose a question they’d now answer differently, and reflect on why and how their thinking has changed.

The Inventory as a Bridge Between Students and Content

By including a variety of essential questions (practical and experiential, conceptual and theoretical) and making a course’s aims explicit, the inventory invites all students into the conversation and the material from day one. It gives a deep thinker with slower processing speed or attention-deficit/hyperactivity disorder, for example, time to orient themselves to the course’s core questions. Meanwhile, the inventory challenges students who see themselves as high achievers to respond authentically to thorny questions that have no right answers.

In addition, using a thinking inventory models how to ask good questions; gives introverts and anxious students an entry point because cold calling becomes warmer (I can ask, “What did you say on your inventory?”); and cultivates a community of learners connected by real, worthwhile inquiry and communal discourse.

Recently, a student reflecting on his inventory at the end of a course wrote that he was taken aback by how intolerant of “loser characters” he’d seemed just a few months prior on his inventory. He noted that he’d been through some upheaval since then. And he ended his paper with the observation that empathy—for people and characters—grows “when you know their backstory.”


Center for Excellence in Teaching and Learning

Critical Thinking and other Higher-Order Thinking Skills

Critical thinking is a higher-order thinking skill. Higher-order thinking skills go beyond basic observation of facts and memorization. They are what we are talking about when we want our students to be evaluative, creative and innovative.

When most people think of critical thinking, they assume that their words (or the words of others) are supposed to be “criticized” and torn apart in argument, when in fact critical thinking simply means thinking that is criteria-based. These criteria require that we distinguish fact from fiction; synthesize and evaluate information; and clearly communicate, solve problems, and discover truths.

Why is Critical Thinking important in teaching?

According to Paul and Elder (2007), “Much of our thinking, left to itself, is biased, distorted, partial, uninformed or downright prejudiced. Yet the quality of our life and that of what we produce, make, or build depends precisely on the quality of our thought.” Critical thinking is therefore the foundation of a strong education.


Using Bloom’s Taxonomy of thinking skills, the goal is to move students from lower- to higher-order thinking:

  • from knowledge (information gathering) to comprehension (confirming)
  • from application (making use of knowledge) to analysis (taking information apart)
  • from evaluation (judging the outcome) to synthesis (putting information together) and creative generation

This provides students with the skills and motivation to become innovative producers of goods, services, and ideas.  This does not have to be a linear process but can move back and forth, and skip steps.

How do I incorporate critical thinking into my course?

The place to begin, and most obvious space to embed critical thinking in a syllabus, is with student-learning objectives/outcomes.  A well-designed course aligns everything else—all the activities, assignments, and assessments—with those core learning outcomes.


Learning outcomes contain an action (verb) and an object (noun), and often start with, “Students will….” Bloom’s taxonomy can help you choose appropriate verbs to clearly state what you want students to exit the course doing, and at what level.

  • Students will define the principal components of the water cycle. (This is an example of a lower-order thinking skill.)
  • Students will evaluate how increased/decreased global temperatures will affect the components of the water cycle. (This is an example of a higher-order thinking skill.)

Both of the above examples are about the water cycle, and both require the foundational knowledge that forms the “facts” of what makes up the water cycle, but the second objective goes beyond facts to actual understanding, application, and evaluation of the water cycle.

Using a tool such as Bloom’s Taxonomy to set learning outcomes helps to prevent vague, non-evaluative expectations. It forces us to think about what we mean when we say, “Students will learn…” What is learning, and how do we know students are learning?


See also: The Best Resources For Helping Teachers Use Bloom’s Taxonomy In The Classroom, by Larry Ferlazzo

Consider designing class activities, assignments, and assessments—as well as student-learning outcomes—using Bloom’s Taxonomy as a guide.

The Socratic style of questioning encourages critical thinking. Socratic questioning “is a systematic method of disciplined questioning that can be used to explore complex ideas, to get to the truth of things, to open up issues and problems, to uncover assumptions, to analyze concepts, to distinguish what we know from what we don’t know, and to follow out logical implications of thought” (Paul and Elder 2007).

Socratic questioning is most frequently employed in the form of scheduled discussions about assigned material, but it can be used on a daily basis by incorporating the questioning process into your daily interactions with students.

In teaching, Paul and Elder (2007) give at least two fundamental purposes to Socratic questioning:

  • To deeply explore student thinking, helping students begin to distinguish what they do and do not know or understand, and to develop intellectual humility in the process
  • To foster students’ abilities to ask probing questions, helping students acquire the powerful tools of dialog, so that they can use these tools in everyday life (in questioning themselves and others)

How do I assess the development of critical thinking in my students?

If the course is carefully designed around student-learning outcomes, and some of those outcomes have a strong critical-thinking component, then final assessment of your students’ success at achieving the outcomes will be evidence of their ability to think critically.  Thus, a multiple-choice exam might suffice to assess lower-order levels of “knowing,” while a project or demonstration might be required to evaluate synthesis of knowledge or creation of new understanding.

Critical thinking is not an “add on,” but an integral part of a course.

  • Make critical thinking deliberate and intentional in your courses—have it in mind as you design or redesign all facets of the course
  • Many students are unfamiliar with this approach and are more comfortable with a simple quest for correct answers, so take some class time to talk with students about the need to think critically and creatively in your course; identify what critical thinking entails, what it looks like, and how it will be assessed.

Additional Resources

  • Barell, John. Teaching for Thoughtfulness: Classroom Strategies to Enhance Intellectual Development. Longman, 1991.
  • Brookfield, Stephen D. Teaching for Critical Thinking: Tools and Techniques to Help Students Question Their Assumptions. Jossey-Bass, 2012.
  • Elder, Linda and Richard Paul. 30 Days to Better Thinking and Better Living through Critical Thinking. FT Press, 2012.
  • Fasko, Jr., Daniel, ed. Critical Thinking and Reasoning: Current Research, Theory, and Practice. Hampton Press, 2003.
  • Fisher, Alec. Critical Thinking: An Introduction. Cambridge University Press, 2011.
  • Paul, Richard and Linda Elder. Critical Thinking: Learn the Tools the Best Thinkers Use. Pearson Prentice Hall, 2006.
  • Faculty Focus article, A Syllabus Tip: Embed Big Questions
  • The Critical Thinking Community
  • The Critical Thinking Community’s The Thinker’s Guides Series and The Art of Socratic Questioning


Chap. 2: Critical Thinking

Critical Thinking and Higher-Order Levels of Cognition (Thinking)

Critical Thinking in College

Most of the reading and writing that you will do in college will require you to move beyond remembering and understanding material. You will be required to apply what you learn, create something new (i.e., an essay or project), and evaluate and analyze texts or information. All of these activities are higher-order levels of thinking that require students to move beyond the lower two levels: remembering and understanding.

Bloom's Taxonomy

To view a clearer image of the Bloom’s Taxonomy chart, go to this website: Bloom’s Taxonomy

Using Bloom’s Taxonomy for Effective Learning

Adapted from the article written by Beth Lewis

The hierarchy of Bloom’s Taxonomy is a widely accepted framework for guiding students through the cognitive learning process; in other words, teachers use this framework to focus on higher-order thinking skills.

You can think of Bloom’s Taxonomy as a pyramid, with simple knowledge-based recall questions at the base. Building up from this foundation, you can ask your students increasingly challenging questions to test their comprehension of the material.

By asking these critical thinking questions or higher-order questions, you are developing all levels of thinking. Students will have improved attention to detail, as well as an increase in their comprehension and problem-solving skills.

There are six levels in the framework. Here is a brief look at each of them, along with a few examples of the questions you might ask at each level.

  • Remember: Recognizing and recalling facts. At this level, students are asked questions to see whether they have remembered key information from a lesson and/or reading assignment. (What is… Where is… How would you describe…?)
  • Understand: Understanding what the facts mean. At this level, students are asked to interpret the facts they have learned. (What is the main idea… How would you summarize…?)
  • Apply: Applying the facts, rules, concepts, and ideas. Questions at this level ask students to apply or use the knowledge learned during the lesson. (How would you use… How would you solve…?)
  • Analyze: Breaking down information into component parts. At the analysis level, students are required to go beyond knowledge and analyze a problem. (What is the theme… How would you classify…?)
  • Evaluate: Judging the value of information and ideas. Here, students are expected to assess the information they have learned and come to a conclusion about it. (What is your opinion of… How would you evaluate… How would you select… What data was used…?)
  • Create: Combining parts to make a new whole. The highest level of critical thinking involves creating new or original work.

Lewis, Beth. “Using Bloom’s Taxonomy for Effective Learning.” ThoughtCo, Feb. 11, 2020, thoughtco.com/blooms-taxonomy-the-incredible-teaching-tool-2081869.


Bloom’s Taxonomy Question Stems For Use In Assessment [With 100+ Examples]

This comprehensive list of pre-created Bloom’s taxonomy question stems ensures students are critically engaging with course material.


Jacob Rutka


One of the most powerful aspects of Bloom’s Taxonomy is that it offers you, as an educator, the ability to construct a curriculum to assess objective learning outcomes, including advanced educational objectives like critical thinking. Pre-created Bloom’s Taxonomy questions can also make planning discussions, learning activities, and formative assessments much easier.

For those unfamiliar with Bloom’s Taxonomy, it consists of a series of hierarchical levels (normally arranged in a pyramid) that build on each other and progress towards higher-order thinking skills. Each level contains verbs, such as “demonstrate” or “design,” that can be measured to gain greater insight into student learning.


Bloom’s Taxonomy (1956)

The original Bloom’s Taxonomy framework consists of six levels that build off of each other as the learning experience progresses. It was developed in 1956 by Benjamin Bloom, an American educational psychologist. Below are descriptions of each level:

  • Knowledge: Identification and recall of course concepts learned
  • Comprehension: Ability to grasp the meaning of the material 
  • Application: Demonstrating a grasp of the material at this level by solving problems and creating projects
  • Analysis: Finding patterns and trends in the course material
  • Synthesis: The combining of ideas or concepts to form a working theory 
  • Evaluation: Making judgments based on the information students have learned as well as their own insights

Revised Bloom’s Taxonomy (2001)

A group of educational researchers and cognitive psychologists developed the new and revised Bloom’s Taxonomy framework in 2001 to be more action-oriented. This way, students work their way through a series of verbs to meet learning objectives. Below are descriptions of each of the levels in revised Bloom’s Taxonomy:

  • Remember: To bring an awareness of the concept to learners’ minds.
  • Understand: To summarize or restate the information in a particular way.
  • Apply: The ability to use learned material in new and concrete situations.
  • Analyze: Understanding the underlying structure of knowledge to be able to distinguish between fact and opinion.
  • Evaluate: Making judgments about the value of ideas, theories, items and materials.
  • Create: Reorganizing concepts into new structures or patterns through generating, producing or planning.


Bloom’s Taxonomy questions are a great way to build and design curriculum and lesson plans. They encourage the development of higher-order thinking and encourage students to engage in metacognition by thinking and reflecting on their own learning. In The Ultimate Guide to Bloom’s Taxonomy Question Stems, you can access more than 100 examples of Bloom’s Taxonomy questions and higher-order thinking questions at all levels of Bloom’s Taxonomy.

Bloom’s Taxonomy (1956) question samples:

  • Knowledge: How many…? Who was it that…? Can you name the…? 
  • Comprehension: Can you write in your own words…? Can you write a brief outline…? What do you think could have happened next…?
  • Application: Choose the best statements that apply… Judge the effects of… What would result …? 
  • Analysis: Which events could have happened…? If … happened, how might the ending have been different? How was this similar to…?
  • Synthesis: Can you design a … to achieve …? Write a poem, song or creative presentation about…? Can you see a possible solution to…?
  • Evaluation: What criteria would you use to assess…? What data was used to evaluate…? How could you verify…?


Revised Bloom’s Taxonomy (2001) question samples:

  • Remember: Who…? What…? Where…? How…?
  • Understand: How would you generalize…? How would you express…? What information can you infer from…?
  • Apply: How would you demonstrate…? How would you present…? Draw a story map… 
  • Analyze: How can you sort the different parts…? What can you infer about…? What ideas validate…? How would you categorize…?
  • Evaluate: What criteria would you use to assess…? What sources could you use to verify…? What information would you use to prioritize…? What are the possible outcomes for…?
  • Create: What would happen if…? List the ways you can…? Can you brainstorm a better solution for…? 

As we know, Bloom’s Taxonomy is a framework used in education to categorize levels of cognitive learning. Here are 12 Bloom’s Taxonomy example questions, two for each of the six levels, starting from the lowest level (Remember) to the highest level (Create):

  • Remember (Knowledge): What are the four primary states of matter? Can you list the main events of the American Civil War?
  • Understand (Comprehension): How would you explain the concept of supply and demand to someone who is new to economics? Can you summarize the main idea of the research article you just read?
  • Apply (Application): Given a real-world scenario, how would you use the Pythagorean theorem to solve a practical problem? Can you demonstrate how to conduct a chemical titration in a laboratory setting?
  • Analyze (Analysis): What are the key factors contributing to the decline of a particular species in an ecosystem? How do the social and economic factors influence voting patterns in a specific region?
  • Evaluate (Evaluation): Compare and contrast the strengths and weaknesses of two different programming languages for a specific project. Assess the effectiveness of a marketing campaign, providing recommendations for improvement.
  • Create (Synthesis): Design a new and innovative product that addresses a common problem in society. Develop a comprehensive lesson plan that incorporates various teaching methods to enhance student engagement in a particular subject.


Higher-level thinking questions are designed to encourage critical thinking, analysis, and synthesis of information. Here are eight examples of higher-level thinking questions that can be used in higher education:

  • Critical Analysis (Analysis): “What are the ethical implications of the decision made by the characters in the novel, and how do they reflect broader societal values?”
  • Problem-Solving (Application): “Given the current environmental challenges, how can we develop sustainable energy solutions that balance economic and ecological concerns?”
  • Evaluation of Evidence (Evaluation): “Based on the data presented in this research paper, do you think the study’s conclusions are valid? Why or why not?”
  • Comparative Analysis (Analysis): “Compare and contrast the economic policies of two different countries and their impact on income inequality.”
  • Hypothetical Scenario (Synthesis): “Imagine you are the CEO of a multinational corporation. How would you navigate the challenges of globalization and cultural diversity in your company’s workforce?”
  • Ethical Dilemma (Evaluation): “In a medical emergency with limited resources, how should healthcare professionals prioritize patients, and what ethical principles should guide their decisions?”
  • Interdisciplinary Connection (Synthesis): “How can principles from psychology and sociology be integrated to address the mental health needs of a diverse student population in higher education institutions?”
  • Creative Problem-Solving (Synthesis): “Propose a novel solution to reduce urban congestion while promoting eco-friendly transportation options. What are the potential benefits and challenges of your solution?”

These questions encourage students to go beyond simple recall of facts and engage in critical thinking, analysis, synthesis, and ethical considerations. They are often used to stimulate class discussions, research projects, and written assignments in higher education settings.



Higher Level Thinking: Synthesis in Bloom's Taxonomy

Putting the Parts Together to Create New Meaning


Bloom’s Taxonomy (1956) was designed with six levels in order to promote higher-order thinking. Synthesis was placed on the fifth level of the Bloom’s Taxonomy pyramid, as it requires students to infer relationships among sources. The high-level thinking of synthesis is evident when students put together the parts or information they have reviewed into a whole in order to create new meaning or a new structure.

The Online Etymology Dictionary records the word synthesis as coming from two sources:

"Latin synthesis  meaning a "collection, set, suit of clothes, composition (of a medication)" and also from the Greek  synthesis  meaning "a composition, a putting together."

The dictionary also records the evolution of the use of synthesis to include "deductive reasoning" in 1610 and "a combination of parts into a whole" in 1733. Today's students may use a variety of sources when they combine parts into a whole. The sources for synthesis may include articles, fiction, posts, or infographics as well as non-written sources, such as films, lectures, audio recordings, or observations.

Types of Synthesis in Writing

Synthesis writing is a process in which a student makes the explicit connection between a thesis (the argument) and evidence from sources with similar or dissimilar ideas. Before synthesis can take place, however, the student must complete a careful examination or close reading of all source material. This is especially important before a student can draft a synthesis essay.

There are two types of synthesis essays:

  • A student may choose to use an explanatory synthesis essay in order to deconstruct or divide evidence into logical parts so that the essay is organized for readers. Explanatory synthesis essays usually include descriptions of objects, places, events, or processes. Descriptions are written objectively because the explanatory synthesis does not present a position. In this type of essay, the information gathered from the sources is arranged in a sequence or other logical manner.
  • In order to present a position or opinion, a student may choose to use an argumentative synthesis. The thesis or position of an argumentative essay is one that can be debated. A thesis or position in this essay can be supported with evidence taken from sources and is organized so that it can be presented in a logical manner. 

The introduction to either synthesis essay contains a one-sentence (thesis) statement that sums up the essay's focus and introduces the sources or texts that will be synthesized. Students should follow citation guidelines when referencing the texts in the essay, including their titles and authors and perhaps a little context about the topic or background information.

The body paragraphs of a synthesis essay can be organized using several different techniques separately or in combination. These techniques can include: using a summary, making comparisons and contrasts, providing examples, proposing cause and effect, or conceding opposing viewpoints. Each of these formats allows the student the chance to incorporate the source materials in either the explanatory or the argumentative synthesis essay.

The conclusion of a synthesis essay may remind readers of the key points or suggestions for further research. In the case of the argumentative synthesis essay, the conclusion answers the "so what" that was proposed in the thesis or may call for action from the reader.

Key Words for the Synthesis Category:

blend, categorize, compile, compose, create, design, develop, form, fuse, imagine, integrate, modify, originate, organize, plan, predict, propose, rearrange, reconstruct, reorganize, solve, summarize, test, theorize, unite.

Synthesis Question Stems With Examples

  • Can you develop a theory for the popularity of a text in English? 
  • Can you predict the outcome of behavior in Psychology I by using polls or exit slips?
  • How could you test the speed of a rubber-band car in physics if a test track is not available?
  • How would you adapt ingredients to create a healthier casserole in Nutrition 103 class?
  • How could you change the plot of Shakespeare's Macbeth so it could be rated "G"?
  • Suppose you could blend iron with another element so that it could burn hotter?
  • What changes would you make to solve a linear equation if you could not use letters as variables?
  • Can you fuse Hawthorne's short story "The Minister's Black Veil" with a soundtrack?
  • Compose a nationalist song using percussion only.
  • If you rearrange the parts in the poem "The Road Not Taken", what would the last line be?

Synthesis Essay Prompt Examples

  • Can you propose a universal course of study in the use of social media that could be implemented across the United States?
  • What steps could be taken in order to minimize food waste from the school cafeteria?
  • What facts can you compile to determine if there has been an increase in racist behavior or an increase in awareness of racist behavior?
  • What could you design to wean young children off video games?
  • Can you think of an original way for schools to promote awareness of global warming or climate change?
  • How many ways can you use technology in the classroom to improve student understanding?
  • What criteria would you use to compare American Literature with English Literature?

Synthesis Performance Assessment Examples

  • Design a classroom that would support educational technology.
  • Create a new toy for teaching the American Revolution. Give it a name and plan a marketing campaign.
  • Write and present a news broadcast about a scientific discovery.
  • Propose a magazine cover for a famous artist using his or her work.
  • Make a mix tape for a character in a novel.
  • Hold an election for the most important element on the periodic table.
  • Put new words to a known melody in order to promote healthy habits.

McREL International

Higher-order questioning inspires higher-level thinking


Teacher: You are floating down the Delaware River and you are seated behind George Washington. What do you hear, feel, smell, and see?

Students: I hear the waves crashing against the boat. I feel anxious and scared. I smell body odor. I see George’s white hair.

The next day, begin with a reminder of their imagined journey on the boat; then review and check for understanding. The students could have simply read a passage and answered questions about George Washington’s river crossing, but this simple immersive exercise promotes deeper relevance, engagement, understanding, problem-solving, comprehension, and retention.

Why does this exercise work so well?

Asking higher-order questions requires more time for students to think and articulate their answers, and can greatly extend classroom conversations and learning. When students are challenged with higher-order questions, they draw from their own experience to formulate their answers. In other words, their understanding becomes personalized. Thought-provoking questions not only encourage deeper discussions in the classroom, but also help students develop skills they can use in real-life decision making. Asking a variety of questions helps students actively and broadly engage with and deepen their understanding of the content. The questions invite students to respond based on their thoughts about the content, relying not just on basic recall but actual experience, helping students learn how to think rather than what to think.


It’s a powerful instructional strategy, but classroom observation data collected with our Power Walkthrough system shows that teachers aren’t using higher-order questioning very often. In fact, we found that teachers are asking questions at the lower three levels of Bloom’s Taxonomy a whopping 71% of the time. Why might that be?

In the interest of time, teachers often perform a quick check for understanding, asking specific questions that require a simple right or wrong answer. Sometimes, teachers don’t know how to ask higher-order questions, or feel that they don’t have adequate time to generate more provocative questions during a lesson. Advance organizers can help students understand the expectations for each lesson and facilitate a higher level of classroom questioning. In Classroom Instruction That Works, 2nd ed. (2012), our McREL colleagues recommend asking inferential questions and using explicit cues to activate your students’ prior knowledge and develop deeper understanding of the content. You can also prepare sentence stems that help you craft higher-order questions on the fly during classroom discussions.

Another way to focus classroom effort on higher-order questions that make learning memorable is to teach your students about the levels of Bloom’s Taxonomy, emphasizing how higher-order questioning promotes deeper learning. Once they have an understanding, they can then articulate at what level their questions are occurring. Try creating a poster of Bloom’s Taxonomy with your students for your classroom. Then, during classroom discussions, place a sticky note at the level of Bloom’s in which the students are working. This will help guide discussions and serve as a reminder for you and your students to stretch learning by reaching for the highest level of discussion. In our experience, most students would rather imagine themselves in a different time and place than sit and read a passage and complete a worksheet.

As a teacher or administrator, how have you inspired classroom curiosity through higher-order-thinking questions? Please share your great ideas.


About McREL.org

McREL is a non-profit, non-partisan education research and development organization that since 1966 has turned knowledge about what works in education into practical, effective guidance and training for teachers and education leaders across the U.S. and around the world.




Teaching Made Practical



Higher Order Questions for Your Text Feature Lessons

Questions for your text feature lesson plans to help promote higher level thinking in 3rd, 4th, and 5th graders

When we question our students about text features, we often focus too much on having students identify different nonfiction text features. While identification is essential, it is equally important to get our 3rd, 4th, and 5th grade students thinking more deeply about text features, moving past knowledge and recall questions and into higher-order thinking questions.

Below, find text feature questions you can include in your upper elementary lesson plans for each of the levels of Bloom's Taxonomy.  You'll find a free pdf printable of these questions at the bottom of the webpage.

Knowledge Questions

  • List all of the text features you found on this page.
  • Circle the heading.
  • Describe the diagram.
  • Draw an example of bold letters.
  • Explain where you would find the table of contents of a book.
  • Point at the bullet points on this page.

Comprehension Questions

  • Explain what a table is in your own words.
  • How are a photograph and an illustration different?
  • How are captions and labels alike?
  • Which text feature best supports the main idea of this paragraph?
  • What text feature should you use to figure out the meaning of a word: an index or a glossary?  Why?
  • Based on the text features in this book, what do you think the book will be about?


Application Questions

  • How could you use the title or headings of this book to predict the main idea?
  • In what other situations would bold letters be useful?
  • What caption would you write for this photograph?
  • What text features would you include if you were writing an article on basketball?
  • Organize the information in this paragraph into a table or chart.
  • Write an appropriate heading for this paragraph.


Analysis Questions

  • How do the text features on this page relate to each other?
  • If you were asked to divide the text features on this page into 2 groups, how would you categorize them?
  • What inference can you make about this book based on its text features?
  • How do the text features on this page relate to the text?
  • Compare and contrast two of the text features on this page.
  • Explain the different parts of this diagram or chart.  What text features are included within the diagram or chart?

Evaluation Questions

  • Which text feature was most useful in helping you understand the text?  Why?
  • Which text feature was least helpful to you in understanding the text?  Why?
  • Where in the text could the author have added a table, chart, or diagram?
  • Which text feature do you think is the most important to nonfiction books?  Why?
  • Why do you think the author chose to add this text feature?
  • Which text feature did the author use most effectively?  Defend your reasoning.

Synthesis Questions

  • Write a nonfiction article that includes at least 6 different text features.
  • Create an additional text feature for this book.
  • How would this book have been different if the author hadn't included any photographs or illustrations?
  • Choose one of the text features on the page and write your own paragraph to support the text feature.
  • What text feature could be added to help you understand the text better?
  • How would the book have been different if the author had not included any headings or titles?

Download a free pdf version of these questions here: Text Feature Questions for Higher Level Thinking



Warren Berger

A Crash Course in Critical Thinking

What you need to know—and read—about one of the essential skills needed today.

Posted April 8, 2024 | Reviewed by Michelle Quirk

  • In research for "A More Beautiful Question," I did a deep dive into the current crisis in critical thinking.
  • Many people may think of themselves as critical thinkers, but they actually are not.
  • Here is a series of questions you can ask yourself to try to ensure that you are thinking critically.

Conspiracy theories. Inability to distinguish facts from falsehoods. Widespread confusion about who and what to believe.

These are some of the hallmarks of the current crisis in critical thinking—which just might be the issue of our times. Because if people aren’t willing or able to think critically as they choose potential leaders, they’re apt to choose bad ones. And if they can’t judge whether the information they’re receiving is sound, they may follow faulty advice while ignoring recommendations that are science-based and solid (and perhaps life-saving).

Moreover, as a society, if we can’t think critically about the many serious challenges we face, it becomes more difficult to agree on what those challenges are—much less solve them.

On a personal level, critical thinking can enable you to make better everyday decisions. It can help you make sense of an increasingly complex and confusing world.

In the new expanded edition of my book A More Beautiful Question ( AMBQ ), I took a deep dive into critical thinking. Here are a few key things I learned.

First off, before you can get better at critical thinking, you should understand what it is. It’s not just about being a skeptic. When thinking critically, we are thoughtfully reasoning, evaluating, and making decisions based on evidence and logic. And—perhaps most important—while doing this, a critical thinker always strives to be open-minded and fair-minded. That’s not easy: It demands that you constantly question your assumptions and biases and that you always remain open to considering opposing views.

In today’s polarized environment, many people think of themselves as critical thinkers simply because they ask skeptical questions—often directed at, say, certain government policies or ideas espoused by those on the “other side” of the political divide. The problem is, they may not be asking these questions with an open mind or a willingness to fairly consider opposing views.

When people do this, they’re engaging in “weak-sense critical thinking”—a term popularized by the late Richard Paul, a co-founder of The Foundation for Critical Thinking . “Weak-sense critical thinking” means applying the tools and practices of critical thinking—questioning, investigating, evaluating—but with the sole purpose of confirming one’s own bias or serving an agenda.

In AMBQ , I lay out a series of questions you can ask yourself to try to ensure that you’re thinking critically. Here are some of the questions to consider:

  • Why do I believe what I believe?
  • Are my views based on evidence?
  • Have I fairly and thoughtfully considered differing viewpoints?
  • Am I truly open to changing my mind?

Of course, becoming a better critical thinker is not as simple as just asking yourself a few questions. Critical thinking is a habit of mind that must be developed and strengthened over time. In effect, you must train yourself to think in a manner that is more effortful, aware, grounded, and balanced.

For those interested in giving themselves a crash course in critical thinking—something I did myself, as I was working on my book—I thought it might be helpful to share a list of some of the books that have shaped my own thinking on this subject. As a self-interested author, I naturally would suggest that you start with the new 10th-anniversary edition of A More Beautiful Question , but beyond that, here are the top eight critical-thinking books I’d recommend.

The Demon-Haunted World: Science as a Candle in the Dark, by Carl Sagan

This book simply must top the list, because the late scientist and author Carl Sagan continues to be such a bright shining light in the critical thinking universe. Chapter 12 includes the details on Sagan’s famous “baloney detection kit,” a collection of lessons and tips on how to deal with bogus arguments and logical fallacies.


Clear Thinking: Turning Ordinary Moments Into Extraordinary Results, by Shane Parrish

The creator of the Farnam Street website and host of the “Knowledge Project” podcast explains how to contend with biases and unconscious reactions so you can make better everyday decisions. The book contains insights from many of the brilliant thinkers Shane has studied.

Good Thinking: Why Flawed Logic Puts Us All at Risk and How Critical Thinking Can Save the World, by David Robert Grimes

A brilliant, comprehensive 2021 book on critical thinking that, to my mind, hasn’t received nearly enough attention. The scientist Grimes dissects bad thinking, shows why it persists, and offers the tools to defeat it.

Think Again: The Power of Knowing What You Don't Know, by Adam Grant

Intellectual humility—being willing to admit that you might be wrong—is what this book is primarily about. But Adam, the renowned Wharton psychology professor and bestselling author, takes the reader on a mind-opening journey with colorful stories and characters.

Think Like a Detective: A Kid's Guide to Critical Thinking, by David Pakman

The popular YouTuber and podcast host Pakman—normally known for talking politics—has written a terrific primer on critical thinking for children. The illustrated book presents critical thinking as a “superpower” that enables kids to unlock mysteries and dig for truth. (I also recommend Pakman’s second kids’ book called Think Like a Scientist.)

Rationality: What It Is, Why It Seems Scarce, Why It Matters, by Steven Pinker

The Harvard psychology professor Pinker tackles conspiracy theories head-on but also explores concepts involving risk/reward, probability and randomness, and correlation/causation. And if that strikes you as daunting, be assured that Pinker makes it lively and accessible.

How Minds Change: The Surprising Science of Belief, Opinion and Persuasion, by David McRaney

David is a science writer who hosts the popular podcast “You Are Not So Smart” (and his ideas are featured in A More Beautiful Question). His well-written book looks at ways you can actually get through to people who see the world very differently than you (hint: bludgeoning them with facts definitely won’t work).

A Healthy Democracy's Best Hope: Building the Critical Thinking Habit, by M. Neil Browne and Chelsea Kulhanek

Neil Browne, author of the seminal Asking the Right Questions: A Guide to Critical Thinking, has been a pioneer in presenting critical thinking as a question-based approach to making sense of the world around us. His newest book, co-authored with Chelsea Kulhanek, breaks down critical thinking into “11 explosive questions”—including the “priors question” (which challenges us to question assumptions), the “evidence question” (focusing on how to evaluate and weigh evidence), and the “humility question” (which reminds us that a critical thinker must be humble enough to consider the possibility of being wrong).

Warren Berger

Warren Berger is a longtime journalist and author of A More Beautiful Question .


eSoft Skills Global Training Solutions

Higher-Level Thinking Skills Questions: Fostering Critical Thinking and Problem-Solving Abilities


Higher-level thinking skills questions are a powerful resource for nurturing deep analysis and critical problem-solving abilities in students. These questions go beyond simple recall and require learners to engage in advanced cognitive processes, such as evaluation, synthesis, and application.

Incorporating higher-level thinking skills questions into instruction offers numerous benefits, including the development of critical thinking, problem-solving, and analytical skills. By challenging students to think critically and solve complex problems, educators can create meaningful learning experiences and promote cognitive growth.

Throughout this article series, we will explore the various types of higher-level thinking skills questions and their role in fostering cognitive development. We will delve into the application of Bloom’s Taxonomy, which provides a framework for classifying cognitive thinking skills. Additionally, we will offer tips for crafting effective higher-level thinking skills questions and discuss the benefits of incorporating them into instructional practices.

Join us on this journey as we uncover the power of higher-level thinking skills questions and their ability to prepare students for success in academic and real-world contexts.


Key Takeaways:

  • Higher-level thinking skills questions foster critical thinking and problem-solving abilities.
  • These questions require advanced cognitive processes, such as evaluation, synthesis, and application.
  • Incorporating higher-level thinking skills questions enhances students’ critical thinking, problem-solving, and analytical skills.
  • Bloom’s Taxonomy provides a framework for classifying cognitive thinking skills into lower-order and higher-order thinking skills.
  • Crafting effective higher-level thinking skills questions involves aligning them with specific learning objectives, utilizing a variety of question stems, and encouraging open-ended responses.

Understanding Bloom’s Taxonomy

Bloom’s Taxonomy is a framework for classifying cognitive thinking skills. It consists of six levels: remember, understand, apply, analyze, evaluate, and create. The first three levels, remember, understand, and apply, are considered lower-order thinking skills (LOTS), while the last three levels, analyze, evaluate, and create, fall under higher-order thinking skills (HOTS).

Higher-order thinking skills questions, which align with the HOTS levels, encourage deeper engagement with the subject matter and promote critical thinking and problem-solving skills.

“Higher-order thinking is the ability to engage in complex cognitive processes such as analysis, evaluation, and creation. These skills are crucial for students to develop as they prepare for future challenges in their academic and professional lives.” – Dr. Jane Johnson

Lower-Order Thinking Skills (LOTS)

The first three levels of Bloom’s Taxonomy, remember, understand, and apply, fall under lower-order thinking skills (LOTS). These levels focus on building foundational knowledge and understanding of the subject matter. Students are expected to recall facts, explain concepts, and apply learned information to solve basic problems.

Higher-Order Thinking Skills (HOTS)

The last three levels of Bloom’s Taxonomy, analyze, evaluate, and create, fall under higher-order thinking skills (HOTS). These levels require students to think critically, examine relationships, and generate new ideas. Students are challenged to evaluate evidence, analyze complex problems, and create innovative solutions.

Remembering and Understanding Questions

Remembering and understanding questions play a crucial role in the learning process, allowing students to recall and comprehend information effectively. These types of questions focus on the lower-order thinking skills necessary for building a solid foundation of knowledge. By answering remembering and understanding questions, students can demonstrate their ability to recognize and recall important information related to the subject matter.

Remembering questions typically require students to retrieve specific details, facts, or concepts from memory. These questions prompt students to describe, list, define, or identify elements of the subject matter. By answering these questions, students reinforce their understanding of key concepts and develop their ability to recall information accurately.

“Remembering is the fundamental building block of learning. Only by remembering can students move forward and engage in higher-order thinking skills.”

Understanding questions delve deeper into the meaning and comprehension of the subject matter. These questions require students to explain concepts, provide examples, or summarize information to showcase their understanding. By answering understanding questions, students can not only demonstrate their comprehension of the material but also identify connections between different ideas and concepts.

To illustrate, consider an example from a history class: a remembering question might ask students to recall three causes of a war they have studied, while an understanding question might ask them to analyze how alliances influenced the course of that war. The remembering question requires students to recall specific facts. The understanding question, on the other hand, entails a deeper analysis, prompting students to demonstrate their comprehension and critical thinking skills.

By incorporating both remembering and understanding questions in instructional activities, educators can facilitate the acquisition and consolidation of knowledge while fostering lower-order thinking skills.

  • Remembering and understanding questions focus on recalling and comprehending information.
  • Remembering questions prompt students to describe, list, define, or identify elements of the subject matter.
  • Understanding questions require students to explain concepts, provide examples, or summarize information to showcase their comprehension.
  • Both types of questions play a crucial role in reinforcing knowledge and building a solid foundation before advancing to higher-order thinking skills.

Applying and Analyzing Questions

Applying and analyzing questions play a crucial role in developing higher-order thinking skills among students. These types of questions require learners to go beyond the surface level of understanding and apply their knowledge to solve problems, make connections, and examine relationships. By engaging with applying and analyzing questions, students can enhance their critical thinking abilities and problem-solving skills, preparing them to navigate complex real-world contexts.

Applying questions prompt students to use their knowledge and understanding in practical scenarios. These questions require learners to utilize their skills and apply them in real-life contexts, enabling them to develop a deeper understanding of how the subject matter can be used or implemented.

For example, in a history class, an applying question could be:

“How can the lessons learned from past conflicts be used to guide diplomatic decisions in modern times?”

This type of question prompts students to consider historical events and their implications in a contemporary context, requiring them to analyze the relevance and application of historical knowledge.

On the other hand, analyzing questions encourage learners to critically evaluate and examine different elements of the subject matter. These questions prompt students to compare and contrast, identify patterns, and make connections between various concepts, fostering a deeper level of understanding.

An example of an analyzing question in a literature class could be:

“Compare and contrast the themes of love and betrayal in two different Shakespearean plays.”

By answering this analyzing question, students are required to delve into the texts, identify common themes and distinct differences, and analyze the underlying messages conveyed by Shakespeare.

Applying and analyzing questions not only facilitate critical thinking but also encourage students to develop their abilities to construct arguments and weigh evidence. These types of questions support learners in honing their analytical skills, enabling them to comprehend complex concepts and evaluate information effectively.

Evaluating and Creating Questions

Evaluating and creating questions are essential for developing higher-order thinking skills in students. These types of questions require learners to go beyond simple recall and comprehension and engage in critical analysis, evaluation, and synthesis.

Evaluating questions prompt students to assess evidence, make judgments, and critique arguments. By answering evaluating questions, students can enhance their ability to think critically, weigh different perspectives, and form well-reasoned opinions. This type of questioning encourages students to evaluate the strengths and weaknesses of various arguments or approaches, fostering a deeper understanding of the subject matter.

Creating questions stimulate students to generate new ideas, propose alternative solutions, and think creatively. By formulating their own questions, students actively engage with the content, develop a deeper understanding, and apply their knowledge in unique ways. This type of questioning promotes innovation, problem-solving skills, and independent thinking.

Engaging in evaluating and creating questions allows students to develop their higher-order thinking skills, such as critical thinking, analysis, and synthesis. Through these processes, students enhance their ability to assess information, make informed judgments, and generate creative solutions.

By incorporating evaluating and creating questions into classroom instruction, educators can cultivate an environment that nurtures higher-order thinking skills. Through meaningful questioning techniques, educators encourage students to think critically, evaluate information, and develop innovative approaches to problem-solving.

Benefits of Higher-Level Thinking Skills Questions

Incorporating higher-level thinking skills questions into instruction offers numerous benefits for students. These types of questions encourage critical thinking, problem-solving, and cognitive development. They promote a deeper understanding of the subject matter and foster cognitive growth. By engaging students in higher-level thinking skills questions, educators can create an engaging and stimulating learning environment that prepares students for success in academic and real-world contexts.

Higher-level thinking skills questions enable students to:

  • Develop critical thinking abilities
  • Enhance problem-solving skills
  • Apply knowledge in practical situations
  • Analyze complex information
  • Evaluate evidence and arguments
  • Generate innovative ideas

By actively engaging with these types of questions, students can acquire a range of high-level cognitive skills, enabling them to tackle challenging tasks and approach learning with depth and critical analysis.

Real-World Application

Higher-level thinking skills questions help students develop the ability to think critically and creatively. These skills are essential for success in the workplace, as they allow individuals to adapt to new challenges, solve complex problems, and generate innovative solutions.

Furthermore, the cultivation of higher-level thinking skills supports students’ overall cognitive development, fostering the acquisition of advanced analytical and evaluative capabilities. These skills are transferable to various academic disciplines and practical situations, preparing students for academic success and future endeavors.

Tips for Crafting Higher-Level Thinking Skills Questions

Crafting effective higher-level thinking skills questions requires careful consideration and planning. By following these tips, educators can create meaningful and challenging questions that promote critical thinking and problem-solving abilities.

  • Align questions with specific learning objectives: Start by identifying the desired learning outcomes and skills you want to develop in your students. Ensure that the questions you craft align with these objectives, allowing students to engage in higher-level cognitive processes.
  • Utilize a variety of question stems: Use different question stems to encourage diverse thinking and prompt students to analyze, evaluate, and create. Some examples of question stems include “What evidence supports your conclusion?”, “How would you solve this problem differently?”, and “Why is this approach effective or ineffective?”.
  • Incorporate real-world scenarios: Make the questions relevant and relatable by connecting them to real-life situations or problems. This helps students see the practical applications of their learning and encourages them to think critically in authentic contexts.
  • Encourage open-ended responses: Avoid simple yes-or-no questions and instead ask questions that require students to provide detailed explanations, justify their reasoning, or propose alternative solutions. Open-ended questions promote deeper thinking and allow students to express their thoughts more fully.
“The formulation of a problem is often more essential than its solution.” – Albert Einstein

Implementing these instructional strategies can enhance student engagement and foster critical thinking skills. By crafting higher-level thinking skills questions that align with learning objectives, utilize various question stems, incorporate real-world scenarios, and encourage open-ended responses, educators can empower students to think critically, solve complex problems, and develop advanced cognitive abilities.

Higher-level thinking skills questions are indispensable for developing students’ critical thinking abilities and problem-solving skills. By integrating these types of questions into instruction, educators can cultivate deep analysis, evaluation, and synthesis, which are essential for success in academic and real-world contexts.

Through the implementation of instructional strategies and the crafting of effective higher-level thinking skills questions, educators can create a stimulating learning environment that promotes cognitive development and prepares students for future challenges.

By engaging in activities that require higher-level thinking, such as answering complex questions and solving intricate problems, students can enhance their analytical thinking and creativity. These skills are invaluable in equipping them with the necessary tools to navigate today’s complex and ever-evolving world.

In conclusion, higher-level thinking skills questions are a cornerstone of education, empowering students to think critically, solve problems, and promote cognitive growth. By incorporating these questions into instructional practices, educators can nurture the next generation of innovative thinkers and problem solvers, poised for success in all aspects of life.

eSoft Skills Team

Writing Multiple-Choice Questions for Higher-level Thinking

Mike Dickinson

Freelance Instructional Designer

We eLearning developers are used to the question, “Which is better, eLearning or classroom instruction?” The answer is, “It depends.” It’s the same answer if one asks, “Which are better, multiple-choice or essay questions?” Either question type is useful for assessing a variety of levels of thinking, depending on how well the designer crafts the questions. Designing multiple-choice questions is not as daunting a task as one might think.

What is higher-level thinking?

What do we mean by higher-level thinking? Benjamin Bloom described six levels of cognitive behavior, listed here from the most basic – Knowledge – at the bottom to the most complex – Evaluation – at the top:

  • Evaluation
  • Synthesis
  • Analysis
  • Application
  • Comprehension
  • Knowledge

Bloom’s taxonomy offers one way of looking at increasingly complex cognitive abilities. For example, Knowledge and Comprehension mean a person can recall facts or paraphrase a concept. Synthesis, on the other hand, means a person can create something new, such as an essay or a painting. (Please see the list of References at the end of this article for the sources of ideas presented here.)

J. P. Guilford offered another way of looking at cognition with his description of convergent and divergent production. Convergent thinking means someone is working with knowledge, processes, concepts, etc., that exist; it has a certain correctness about it. When applied to test questions, convergent thinking means there is a preexisting correct answer. Verbs for convergent thinking include select, identify, calculate, label, and diagnose. Conversely, divergent thinking means there is not a preexisting correct answer. The person must take existing knowledge and create new knowledge. As Marie Hoepfl explains, verbs for divergent thinking objectives include create (a poem or story), compose (a song), etc.

Mapping Guilford’s concepts onto Bloom’s taxonomy, convergent thinking applies to Bloom’s first four levels of cognitive behavior, that is, up through Analysis, and divergent thinking applies to Bloom’s top two levels, Synthesis and Evaluation. See Table 1.

This combination thus suggests that the designer can write multiple-choice questions for Bloom’s first four levels of cognitive behavior (Knowledge, Comprehension, Application, and Analysis) since they require a predictable or calculable answer.

On the other hand, Bloom’s top two levels – Synthesis and Evaluation – being divergent thinking, are best tested with fill-in or essay questions since a predetermined correct answer does not exist. 

It starts with the objectives

Before we look at specific techniques, let’s be clear about one thing. We’re not talking about making multiple-choice tests artificially difficult. Rather, when the learning objectives dictate assessments at higher levels, we need the tools to meet that requirement. In the eLearning world, we are pretty much confined to multiple-choice or similar selected-response questions. Even those instructors who conduct classroom sessions may want to augment essay questions with multiple-choice in order to take advantage of some of the latter’s efficiencies. For example, compared to essay questions, multiple-choice questions can be graded faster and more reliably by people other than the instructor, and by the computer. They can also cover a broader scope of the subject in the same amount of time it would take a student to complete one essay question.

Writing higher-order multiple-choice questions

Let’s look at the way thinking skills progress, using the cold and flu for context (Table 2). At the Knowledge level we are asking the learner to merely identify or select symptoms of a cold. At the Comprehension level we might want the learner to match symptoms with their respective ailment. At the Application level the learner must do something (or determine what they would do in real life) with the knowledge they possess. Notice that even though we’re talking about diagnosis and interpretation, there is still a predetermined correct answer. That is, this still represents convergent thinking.

Now consider Bloom’s two highest levels: Synthesis and Evaluation. These are divergent thinking. At the Synthesis level we would be asking a person to develop a new protocol for treating the cold, and at the Evaluation level we would ask them to assess the effectiveness of that protocol. Neither of those outcomes can be predetermined. Thus they are not suitable for multiple-choice questions; later I’ll suggest a way multiple-choice questions support pseudo assessment of those levels.

Specific techniques

Here are some specific techniques gleaned from the literature and my own experience.

Transform existing items

You can transform existing items that were written for lower cognitive levels such as recall of facts, according to guidelines from Penn State’s Schreyer Institute. One note of caution: even if your question is written at a higher level of knowledge, if you use statements or examples that were mentioned in reading assignments or the presentation, then the student may be doing nothing more than recall.

One way to move up from the knowledge level to the comprehension level is to ask the learner to distinguish whether statements are consistent with a principle, concept, or rule. For example, say you had a knowledge-level question that merely asked the learner to select the common symptoms of the flu from a list. You could transform the question by describing a patient who presents with certain symptoms and asking the learner to determine whether those symptoms are consistent with the flu or not. Scenarios or situations like this are good ways to set up questions to assess higher-level thinking.

You could raise the question another notch by having the learner compare and contrast symptoms. For example, rather than just determining if the patient appears to have the flu, you could have the learner determine whether the patient is likely to have a cold, the flu, or severe allergies. Obviously a question like this requires careful selection of terminology so the question truly distinguishes between those learners with complete vs. partial knowledge.

So, to generalize, if you have an existing question that states the rule and then asks the learner to identify one characteristic of that rule or concept, you can often flip it by presenting the characteristic in the question stem, and then asking the learner to identify the rule or concept.

I used this technique a lot in compliance training. Compliance dilemmas do not present themselves in the real world with their label. They present themselves through people’s actions and words, and then we have to recognize what kind of situation is developing. So in our compliance training we described workplace scenarios and then asked the learner to identify what kind of compliance issue was developing and what was the appropriate response. This not only raised the questions to higher level thinking, it made the training much more realistic than merely categorizing or labeling terms.

Use plausible distractors and new examples  

Another way to transform existing questions is to ensure you are using plausible distractors. You can often do this with anticipated wrong answers. Here is an example:

Calculate the median of the following numbers: 15, 27, 27, 44, 67, 75, and 81.

The student must recall the definition of median and then apply that definition to the list of numbers. You will recall that the median is the number at the midpoint of a distribution. That number is 44 in this question. A common mistake is to confuse the definitions of median and mean (average). Hopefully our instruction will have helped learners understand the difference, but to be sure, the mean (48) is one of the distractors. So is the mode, the number with the most instances, or 27 in this case. So we have one correct choice and four distractors, two of which are plausible if the learner is not clear on the definition of these terms.
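To make the arithmetic behind this item concrete, here is a minimal Python sketch showing how the correct answer and the two anticipated-mistake distractors can be computed; only the numbers from the example above are used.

```python
from statistics import mean, median, mode

# The number set from the example item above.
numbers = [15, 27, 27, 44, 67, 75, 81]

correct = median(numbers)           # 44 -- the intended answer
distractor_mean = mean(numbers)     # 48 -- anticipates confusing median with mean
distractor_mode = mode(numbers)     # 27 -- anticipates confusing median with mode

print(correct, distractor_mean, distractor_mode)
```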

Interpret charts and graphs  

We have probably all experienced questions on standardized tests, such as the SAT or GRE in the U.S., which showed two or three charts and graphs and asked questions that required us to interpret the meaning. If the subject matter allows, this is a good way to increase the level of thinking.

Premise-choice or Multi-logic thinking

Aiken described a premise-choice technique and Morrison and Walsh described multi-logical thinking. I think they are roughly the same technique.

In this kind of question the stem contains two premises and the student must select the correct conclusion or solution. For example, let’s say we need to assess the learner’s knowledge of team-building processes. One premise could be a team development model consisting of four phases, and the second premise could be different ways of communicating in each of those phases.

A knowledge or comprehension level question might name a phase and ask the learner to select characteristics of that phase from a list. A higher level question could describe the observed behaviors of team members and ask the learner to identify the preferred communication process. To answer a question like this, the learner has to first classify the team’s stage and then apply the communication rule.

Premise-choice or multi-logical questions should require a high level of discriminating judgment. These questions often use words in the stem such as best, most important, first, or most correct.

Bury the verb!

Recently I recognized a rather simple way to write multiple-choice questions for higher-level thinking. This method is totally contrary to what my English teacher taught me. Since I live in Texas, I’ll call this the Texas two-step of higher-level assessment. As the name implies, it consists of two steps:

  • Start with the key verb from your learning objective.
  • Then bury that verb by changing it to a noun and putting a convergent verb in front of it.

Often this will mean changing the verb to a “-tion” derivative. Here are three examples using this technique:

  • Select the best description.
  • Identify the most accurate interpretation.
  • Select a correctly constructed sentence.

Depending on whether the verb you bury is convergent or divergent, this technique may be a pseudo measure, but if you must use multiple-choice questions, or if you want to increase the span of their capability, this is a practical way to do it.

Don’t give away the farm!

After going to the trouble of crafting multiple-choice questions for higher levels of thinking, be careful you don’t give away the farm. In my research for this article, I was surprised by the number of poorly written multiple-choice questions I found while randomly searching for ideas among online multiple-choice tests. It’s out of scope for this article, but I urge you to review guidelines for basic multiple-choice item construction. It is easy to find such resources on the Web.

Use higher-order tests for teaching

Finally, don’t overlook the value of higher-level multiple-choice questions for teaching. In areas where the target audience has some degree of prior knowledge, or where their life experience is relevant, I often make online courses denser by using multiple-choice exercises instead of the more traditional present-and-test format. This technique is also useful when there is room for judgment, or the preferred choice is conditional and you want the student to understand how different circumstances can affect the preferred action.

For example, a few years ago I developed a scenario-based online course on preventing sexual harassment. One of the company’s tenets was that a person should try to resolve an issue with another employee directly rather than elevating it to management. We wanted to reinforce that expectation while honoring those who may not feel comfortable taking such a direct route in a sensitive situation. So the course accepted both the preferred and other acceptable choices, with feedback that was supportive and instructive.

Remember to analyze results

Your best intentions notwithstanding, you don’t really know how well a question is going to perform until you have data to analyze after learners have taken the test. You don’t need to do a sophisticated analysis, but as a minimum you should tally up how many times each choice was selected and what proportion of the respondents got the question right. This data can reveal things like questions that are too easy or too difficult, and if distractors are working the way you intended or not. And especially, for questions that appear to be too difficult, you should investigate further to determine if the question is faulty, or if the instruction itself needs improvement.
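As a minimal sketch of the tally described above, the following Python snippet uses made-up responses to a single item; the option labels and keyed answer are hypothetical.

```python
from collections import Counter

# Hypothetical responses to a single multiple-choice item; "C" is the keyed answer.
responses = ["C", "A", "C", "C", "B", "C", "E", "C", "A", "C"]
keyed_answer = "C"

choice_tally = Counter(responses)                        # how often each option was chosen
p_correct = choice_tally[keyed_answer] / len(responses)  # item difficulty (proportion correct)

print("Choice tallies:", dict(choice_tally))
print(f"Proportion correct: {p_correct:.0%}")
```

A disproportionately low proportion correct, or a distractor that is never chosen, signals that the item or the instruction needs another look.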

References

Aiken, Lewis R. (1982). Writing multiple-choice items to measure higher-order educational objectives. Educational and Psychological Measurement, Vol. 42, pp. 803-806.

Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy of educational objectives. The classification of educational goals: Handbook I. Cognitive domain. New York: David McKay.

Bloom’s Taxonomy. Wikipedia, downloaded 11/8/2011, http://en.wikipedia.org/wiki/Bloom's_Taxonomy

Guilford, J. P. (1967). The nature of human intelligence. New York: McGraw-Hill.

Hoepfl, Marie C. (1994). Developing and evaluating multiple-choice tests. The Technology Teacher, April 1994, pp. 25-26.

Morrison, Susan, & Walsh Free, Kathleen (2001). Writing multiple-choice test items that promote and measure critical thinking. Journal of Nursing Education, Vol. 40, No. 1, pp. 17-24.

Schreyer Institute for Teaching Excellence at Penn State. Writing multiple-choice items to assess higher order thinking. Downloaded Nov. 1, 2011.

December 5, 2011

Research article | Open access | Published: 12 April 2024

Feedback sources in essay writing: peer-generated or AI-generated feedback?

Seyyed Kazem Banihashem, Nafiseh Taghizadeh Kerman, Omid Noroozi, Jewoong Moon & Hendrik Drachsler

International Journal of Educational Technology in Higher Education, volume 21, Article number: 23 (2024)


Abstract

Peer feedback is an effective learning strategy, especially in large classes where teachers face high workloads. However, for complex tasks such as writing an argumentative essay, peers may not provide high-quality feedback without support, since doing so requires a high level of cognitive processing, critical thinking skills, and a deep understanding of the subject. With the promising developments in Artificial Intelligence (AI), particularly after the emergence of ChatGPT, there is a global debate about whether AI tools can be seen as a new source of feedback for complex tasks. The answer to this question is not yet clear, as studies are limited and our understanding remains constrained. In this study, we used ChatGPT as a source of feedback for students’ argumentative essay writing tasks and compared the quality of ChatGPT-generated feedback with peer feedback. The participant pool consisted of 74 graduate students from a Dutch university. The study unfolded in two phases: first, students’ essay data were collected as they composed essays on one of the given topics; subsequently, peer feedback and ChatGPT-generated feedback data were collected by engaging peers in a feedback process and using ChatGPT as a feedback source. Two coding schemes, one for essay analysis and one for feedback analysis, were used to measure the quality of essays and feedback. A MANOVA analysis was then employed to determine any distinctions between the feedback generated by peers and ChatGPT. Additionally, Spearman’s correlation was used to explore potential links between essay quality and the feedback generated by peers and ChatGPT. The results showed a significant difference between feedback generated by ChatGPT and peers. While ChatGPT provided more descriptive feedback, including information about how the essay is written, peers provided feedback that identified problems in the essay. The overarching look at the results suggests a potential complementary role for ChatGPT and students in the feedback process. Regarding the relationship between the quality of essays and the quality of the feedback provided by ChatGPT and peers, we found no overall significant relationship. These findings imply that essay quality does not affect the quality of either ChatGPT or peer feedback. The implications of this study are valuable, shedding light on the prospective use of ChatGPT as a feedback source, particularly for complex tasks like argumentative essay writing. We discuss the findings and delve into the implications for future research and practical applications in educational contexts.

Introduction

Feedback is acknowledged as one of the most crucial tools for enhancing learning (Banihashem et al., 2022). The general and well-accepted definition of feedback conceptualizes it as information provided by an agent (e.g., teacher, peer, self, AI, technology) regarding aspects of one’s performance or understanding (e.g., Hattie & Timperley, 2007). Feedback serves to heighten students’ self-awareness concerning their strengths and areas warranting improvement by providing actionable steps required to enhance performance (Ramson, 2003). The literature abounds with numerous studies that illuminate the positive impact of feedback on diverse dimensions of students’ learning journey, including increasing motivation (Amiryousefi & Geld, 2021), fostering active engagement (Zhang & Hyland, 2022), promoting self-regulation and metacognitive skills (Callender et al., 2016; Labuhn et al., 2010), and enriching the depth of learning outcomes (Gan et al., 2021).

Normally, teachers have primarily assumed the role of delivering feedback, providing insights into students’ performance on specific tasks or their grasp of particular subjects (Konold et al., 2004 ). This responsibility has naturally fallen upon teachers owing to their expertise in the subject matter and their competence to offer constructive input (Diezmann & Watters, 2015 ; Holt-Reynolds, 1999 ; Valero Haro et al., 2023 ). However, teachers’ role as feedback providers has been challenged in recent years as we have witnessed a growth in class sizes due to the rapid advances in technology and the widespread use of digital technologies that resulted in flexible and accessible education (Shi et al., 2019 ). The growth in class sizes has translated into an increased workload for teachers, leading to a pertinent predicament. This situation has directly impacted their capacity to provide personalized and timely feedback to each student, a capability that has encountered limitations (Er et al., 2021 ).

In response to this challenge, various solutions have emerged, among which peer feedback has arisen as a promising alternative instructional approach (Er et al., 2021 ; Gao et al., 2024 ; Noroozi et al., 2023 ; Kerman et al., 2024 ). Peer feedback entails a process wherein students assume the role of feedback providers instead of teachers (Liu & Carless, 2006 ). Involving students in feedback can add value to education in several ways. First and foremost, research indicates that students delve into deeper and more effective learning when they take on the role of assessors, critically evaluating and analyzing their peers’ assignments (Gielen & De Wever, 2015 ; Li et al., 2010 ). Moreover, involving students in the feedback process can augment their self-regulatory awareness, active engagement, and motivation for learning (e.g., Arguedas et al., 2016 ). Lastly, the incorporation of peer feedback not only holds the potential to significantly alleviate teachers’ workload by shifting their responsibilities from feedback provision to the facilitation of peer feedback processes but also nurtures a dynamic learning environment wherein students are actively immersed in the learning journey (e.g., Valero Haro et al., 2023 ).

Despite the advantages of peer feedback, furnishing high-quality feedback to peers remains a challenge. Several factors contribute to this challenge. Primarily, generating effective feedback necessitates a solid understanding of feedback principles, an element that peers often lack (Latifi et al., 2023 ; Noroozi et al., 2016 ). Moreover, offering high-quality feedback is inherently a complex task, demanding substantial cognitive processing to meticulously evaluate peers’ assignments, identify issues, and propose constructive remedies (King, 2002 ; Noroozi et al., 2022 ). Furthermore, the provision of valuable feedback calls for a significant level of domain-specific expertise, which is not consistently possessed by students (Alqassab et al., 2018 ; Kerman et al., 2022 ).

In recent times, advancements in technology, coupled with the emergence of fields like Learning Analytics (LA), have presented promising avenues to elevate feedback practices through the facilitation of scalable, timely, and personalized feedback (Banihashem et al., 2023 ; Deeva et al., 2021 ; Drachsler, 2023 ; Drachsler & Kalz, 2016 ; Pardo et al., 2019 ; Zawacki-Richter et al., 2019 ; Rüdian et al., 2020 ). Yet, a striking stride forward in the field of educational technology has been the advent of a novel Artificial Intelligence (AI) tool known as “ChatGPT,” which has sparked a global discourse on its potential to significantly impact the current education system (Ray, 2023 ). This tool’s introduction has initiated discussions on the considerable ways AI can support educational endeavors (Bond et al., 2024 ; Darvishi et al., 2024 ).

In the context of feedback, AI-powered ChatGPT introduces what is referred to as AI-generated feedback (Farrokhnia et al., 2023). While the literature suggests that ChatGPT has the potential to facilitate feedback practices (Dai et al., 2023; Katz et al., 2023), this literature is very limited and mostly not empirical, so our current comprehension of its capabilities in this regard is quite restricted. We therefore lack a comprehensive understanding of how ChatGPT can effectively support feedback practices and to what degree it can improve the timeliness, impact, and personalization of feedback.

More importantly, considering the challenges we raised for peer feedback, the question is whether AI-generated feedback and more specifically feedback provided by ChatGPT has the potential to provide quality feedback. Taking this into account, there is a scarcity of knowledge and research gaps regarding the extent to which AI tools, specifically ChatGPT, can effectively enhance feedback quality compared to traditional peer feedback. Hence, our research aims to investigate the quality of feedback generated by ChatGPT within the context of essay writing and to juxtapose its quality with that of feedback generated by students.

This study carries the potential to make a substantial contribution to the existing body of recent literature on the potential of AI and in particular ChatGPT in education. It can cast a spotlight on the quality of AI-generated feedback in contrast to peer-generated feedback, while also showcasing the viability of AI tools like ChatGPT as effective automated feedback mechanisms. Furthermore, the outcomes of this study could offer insights into mitigating the feedback-related workload experienced by teachers through the intelligent utilization of AI tools (e.g., Banihashem et al., 2022 ; Er et al., 2021 ; Pardo et al., 2019 ).

However, there might be an argument regarding the rationale for conducting this study within the specific context of essay writing. Addressing this potential query, it is crucial to highlight that essay writing stands as one of the most prevalent yet complex tasks for students (Liunokas, 2020). This task is not without its challenges, as evidenced by the extensive body of literature that indicates students often struggle to meet desired standards in their essay composition (e.g., Bulqiyah et al., 2021; Noroozi et al., 2016, 2022; Latifi et al., 2023).

Furthermore, teachers frequently express dissatisfaction with the depth and overall quality of students’ essay writing (Latifi et al., 2023). Often, these teachers lament that their feedback on essays remains superficial due to the substantial time and effort required for critical assessment and individualized feedback provision (Noroozi et al., 2016, 2022). Regrettably, these constraints prevent them from delving deeper into the evaluation process (Kerman et al., 2022).

Hence, directing attention towards the comparison of peer-generated feedback quality and AI-generated feedback quality within the realm of essay writing bestows substantial value upon both research and practical application. This study enriches the academic discourse and informs practical approaches by delivering insights into the adequacy of feedback quality offered by both peers and AI for the domain of essay writing. This investigation serves as a critical step in determining whether the feedback imparted by peers and AI holds the necessary caliber to enhance the craft of essay writing.

The ramifications of addressing this query are noteworthy. Firstly, it stands to significantly alleviate the workload carried by teachers in the process of essay evaluation. By ascertaining the viability of feedback from peers and AI, teachers can potentially reduce the time and effort expended in reviewing essays. Furthermore, this study has the potential to advance the quality of essay compositions. The collaboration between students providing feedback to peers and the integration of AI-powered feedback tools can foster an environment where essays are not only better evaluated but also refined in their content and structure. With this in mind, we aim to tackle the following key questions within the scope of this study:

RQ1. To what extent does the quality of peer-generated and ChatGPT-generated feedback differ in the context of essay writing?

RQ2. Does a relationship exist between the quality of essay writing performance and the quality of feedback generated by peers and ChatGPT?

Context and participants

This study was conducted in the academic year 2022–2023 at a Dutch university specializing in life sciences. In total, 74 graduate students in food sciences participated, of whom 77% were female (N = 57) and 23% were male (N = 17).

Study design and procedure

This empirical study was exploratory in nature and was conducted in two phases. An online module called “Argumentative Essay Writing” (AEW) was designed for students to follow within the Brightspace platform. The purpose of the AEW module was to improve students’ essay writing skills by engaging them in a peer learning process in which students were invited to provide feedback on each other’s essays. After the module was designed, the study was implemented over two weeks, with one phase per week.

In week one (phase one), students were asked to write an essay on given topics. The topics for the essay were controversial and included “Scientists with affiliations to the food industry should abstain from participating in risk assessment processes”, “Powdered infant formula must adhere to strict sterility standards”, and “Safe food consumption is the responsibility of the consumer”. The given controversial topics were directly related to the course content and students’ area of study. Students had one week to write their essays individually and submit them to the Brightspace platform.

In week two (phase two), students were randomly invited to provide two sets of written/asynchronous feedback on their peers’ submitted essays. We gave a prompt to students to be used for giving feedback ( Please provide feedback to your peer and explain the extent to which she/he has presented/elaborated/justified various elements of an argumentative essay. What are the problems and what are your suggestions to improve each element of the essay? Your feedback must be between 250 and 350 words ). To be able to engage students in the online peer feedback activity, we used the FeedbackFruits app embedded in the Brightspace platform. FeedbackFruits functions as an external educational technology tool seamlessly integrated into Brightspace, aimed at enhancing student engagement via diverse peer collaboration approaches. Among its features are peer feedback, assignment evaluation, skill assessment, automated feedback, interactive videos, dynamic documents, discussion tasks, and engaging presentations (Noroozi et al., 2022 ). In this research, our focus was on the peer feedback feature of the FeedbackFruits app, which empowers teachers to design tasks that enable students to offer feedback to their peers.

In addition, we used ChatGPT as another feedback source on peers’ essays. To be consistent with the criteria for peer feedback, we gave the same feedback prompt question with a minor modification to ChatGPT and asked it to give feedback on the peers’ essays ( Please read and provide feedback on the following essay and explain the extent to which she/he has presented/elaborated/justified various elements of an argumentative essay. What are the problems and what are your suggestions to improve each element of the essay? Your feedback must be between 250 and 350 words ).
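The article does not say whether ChatGPT was used through its web interface or programmatically. Purely as an illustration of how the same feedback prompt could be issued in an automated way, the sketch below uses the OpenAI Python client; the model name, client setup, and helper function are assumptions for illustration, not details reported by the authors.

```python
# Illustrative only: the study does not specify how ChatGPT was accessed.
# Assumes the `openai` package (v1+) and an API key in the OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()

FEEDBACK_PROMPT = (
    "Please read and provide feedback on the following essay and explain the extent to which "
    "she/he has presented/elaborated/justified various elements of an argumentative essay. "
    "What are the problems and what are your suggestions to improve each element of the essay? "
    "Your feedback must be between 250 and 350 words."
)

def get_ai_feedback(essay_text: str, model: str = "gpt-3.5-turbo") -> str:
    """Send one essay together with the study's feedback prompt and return the model's reply."""
    response = client.chat.completions.create(
        model=model,  # assumed model name; the paper only refers to "ChatGPT"
        messages=[{"role": "user", "content": f"{FEEDBACK_PROMPT}\n\n{essay_text}"}],
    )
    return response.choices[0].message.content
```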

Following this design, we were able to collect students’ essay data, peer feedback data, and feedback data generated by ChatGPT. In the next step, we used two coding schemes to analyze the quality of the essays and feedback generated by peers and ChatGPT.

Measurements

Coding scheme to assess the quality of essay writing

In this study, a coding scheme proposed by Noroozi et al. ( 2016 ) was employed to assess students’ essay quality. This coding system was constructed based on the key components of high-quality essay composition, encompassing eight elements: introduction pertaining to the subject, taking a clear stance on the subject, presenting arguments in favor of the chosen position, providing justifications for the arguments supporting the position, counter-arguments, justifications for counter-arguments, responses to counter-arguments, and concluding with implications. Each element in the coding system is assigned a score ranging from zero (indicating the lowest quality level) to three (representing the highest quality level). The cumulative scores across all these elements were aggregated to determine the overall quality score of the student’s written essays. Two experienced coders in the field of education collaborated to assess the quality of the written essays, and their agreement level was measured at 75% (Cohen’s Kappa = 0.75 [95% confidence interval: 0.70–0.81]; z = 25.05; p  < 0.001), signifying a significant level of consensus between the coders.
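As a rough sketch of how scores under such a rubric roll up into an overall essay-quality score, the snippet below uses hypothetical ratings; the element labels are paraphrased from the description above, and the 0–3 range per element is the one reported.

```python
# Hypothetical element scores (0 = lowest, 3 = highest) for one essay,
# with element labels paraphrased from the coding scheme described above.
essay_scores = {
    "introduction_to_topic": 2,
    "clear_position": 3,
    "arguments_for_position": 2,
    "justifications_for_arguments": 1,
    "counter_arguments": 2,
    "justifications_for_counter_arguments": 1,
    "responses_to_counter_arguments": 2,
    "conclusion_and_implications": 3,
}

# Overall quality is the sum across the eight elements (possible range 0-24).
overall_quality = sum(essay_scores.values())
print(f"Overall essay quality score: {overall_quality} / {3 * len(essay_scores)}")
```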

Coding scheme to assess the quality of feedback generated by peers and ChatGPT

To assess the quality of feedback provided by both peers and ChatGPT, we employed a coding scheme developed by Noroozi et al. ( 2022 ). This coding framework dissects the characteristics of feedback, encompassing three key elements: the affective component, which considers the inclusion of emotional elements such as positive sentiments like praise or compliments, as well as negative emotions such as anger or disappointment; the cognitive component, which includes description (a concise summary of the essay), identification (pinpointing and specifying issues within the essay), and justification (providing explanations and justifications for the identified issues); and the constructive component, which involves offering recommendations, albeit not detailed action plans for further enhancements. Ratings within this coding framework range from zero, indicating poor quality, to two, signifying good quality. The cumulative scores were tallied to determine the overall quality of the feedback provided to the students. In this research, as each essay received feedback from both peers and ChatGPT, we calculated the average score from the two sets of feedback to establish the overall quality score for the feedback received, whether from peers or ChatGPT. The same two evaluators were involved in the assessment. The inter-rater reliability between the evaluators was determined to be 75% (Cohen’s Kappa = 0.75 [95% confidence interval: 0.66–0.84]; z = 17.52; p  < 0.001), showing a significant level of agreement between them.
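For readers who want to reproduce this kind of inter-rater check, here is a minimal sketch with scikit-learn; the two coders’ ratings are invented for illustration and are not the study’s data.

```python
from sklearn.metrics import cohen_kappa_score

# Invented ratings from two coders on the same feedback segments (0 = poor, 1 = fair, 2 = good).
coder_a = [2, 1, 0, 2, 2, 1, 0, 1, 2, 2]
coder_b = [2, 1, 1, 2, 2, 1, 0, 1, 2, 1]

kappa = cohen_kappa_score(coder_a, coder_b)
print(f"Cohen's kappa: {kappa:.2f}")
```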

The logic behind choosing these coding schemes was as follows: Firstly, from a theoretical standpoint, both coding schemes were developed based on robust and well-established theories. The coding scheme for evaluating essay quality draws on Toulmin’s argumentation model ( 1958 ), a respected framework for essay writing. It encompasses all elements essential for high-quality essay composition and aligns well with the structure of essays assigned in the chosen course for this study. Similarly, the feedback coding scheme is grounded in prominent works on identifying feedback features (e.g., Nelson & Schunn, 2009 ; Patchan et al., 2016 ; Wu & Schunn, 2020 ), enabling the identification of key features of high-quality feedback (Noroozi et al., 2022 ). Secondly, from a methodological perspective, both coding schemes feature a transparent scoring method, mitigating coder bias and bolstering the tool’s credibility.

To ensure the data’s validity and reliability for statistical analysis, two tests were implemented. Initially, the Levene test assessed group homogeneity, followed by the Kolmogorov-Smirnov test to evaluate data normality. The results confirmed both group homogeneity and data normality. For the first research question, gender was considered as a control variable, and the MANCOVA test was employed to compare the variations in feedback quality between peer feedback and ChatGPT-generated feedback. Addressing the second research question involved using Spearman’s correlation to examine the relationships among original argumentative essays, peer feedback, and ChatGPT-generated feedback.
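The checks and tests named here can be sketched with SciPy as follows; the arrays are random placeholders rather than the study’s data, and the MANCOVA step (which the authors ran separately) is omitted.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Placeholder scores standing in for the study's quality measures (n = 74 students).
peer_feedback_quality = rng.normal(loc=6.0, scale=1.0, size=74)
chatgpt_feedback_quality = rng.normal(loc=5.5, scale=1.0, size=74)
essay_quality = rng.normal(loc=12.0, scale=2.0, size=74)

# Homogeneity of variance across the two feedback sources (Levene's test).
levene_stat, levene_p = stats.levene(peer_feedback_quality, chatgpt_feedback_quality)

# Normality check (Kolmogorov-Smirnov test against a fitted normal distribution).
ks_stat, ks_p = stats.kstest(
    peer_feedback_quality,
    "norm",
    args=(peer_feedback_quality.mean(), peer_feedback_quality.std(ddof=1)),
)

# Relationship between essay quality and peer feedback quality (Spearman's correlation).
rho, rho_p = stats.spearmanr(essay_quality, peer_feedback_quality)

print(f"Levene p={levene_p:.3f}  KS p={ks_p:.3f}  Spearman rho={rho:.2f} (p={rho_p:.3f})")
```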

Results

The results showed a significant difference in feedback quality between peer feedback and ChatGPT-generated feedback. Peers provided feedback of higher quality compared to ChatGPT. This difference was mainly due to the descriptive and problem-identification features of the feedback. ChatGPT tended to produce more extensive descriptive feedback, including summary statements such as a description of the essay or of the action taken, while students performed better at pinpointing and identifying issues in the feedback they provided (see Table 1).

Selected examples of feedback generated by peers and ChatGPT are presented in Figure 1, which also shows how the generated feedback was coded using the coding scheme for assessing feedback quality.

[Figure 1: A comparative list of selected examples of peer-generated and ChatGPT-generated feedback]

Overall, the results indicated that there was no significant relationship between the quality of essay writing and the feedback generated by peers and ChatGPT. However, a positive correlation was observed between the quality of the essay and the affective feature of feedback generated by ChatGPT, while a negative relationship was observed between the quality of the essay and the affective feature of feedback generated by peers. This finding means that as the quality of the essay improves, ChatGPT tends to provide more affective feedback, while peers tend to provide less affective feedback (see Table  2 ).

Discussion

This study was an initial effort to explore the potential of ChatGPT as a feedback source in the context of essay writing and to compare the extent to which the quality of feedback generated by ChatGPT differs from the feedback provided by peers. Below we discuss our findings for each research question.

Discussion on the results of RQ1

For the first research question, the results revealed a disparity in feedback quality when comparing peer-generated feedback to feedback generated by ChatGPT. Peer feedback demonstrated higher quality compared to ChatGPT-generated feedback. This discrepancy is attributed primarily to variations in the descriptive and problem-identification features of the feedback.

ChatGPT tended to provide more descriptive feedback, often including elements such as summarizing the content of the essay. This inclination towards descriptive feedback could be related to ChatGPT’s capacity to analyze and synthesize textual information effectively. Research on ChatGPT further supports this notion, demonstrating the AI tool’s capacity to offer a comprehensive overview of the provided content, therefore potentially providing insights and a holistic perspective on the content (Farrokhnia et al., 2023 ; Ray, 2023 ).

ChatGPT’s proficiency in providing extensive descriptive feedback could be seen as a strength. It might be particularly valuable for summarizing complex arguments or providing comprehensive overviews, which could aid students in understanding the overall structure and coherence of their essays.

In contrast, students’ feedback was of higher quality when it came to identifying specific issues and areas for improvement. Peers’ advantage over ChatGPT in identifying problems within the essays could be related to human strengths in cognitive skills, critical thinking abilities, and contextual understanding (e.g., Korteling et al., 2021; Lamb et al., 2019). This means that students, with their contextual knowledge and critical thinking skills, may be better equipped to identify issues within the essays that ChatGPT may overlook.

Furthermore, a detailed look at the findings of the first research question shows that the feedback generated by ChatGPT encompassed all essential components of high-quality feedback, including affective, cognitive, and constructive dimensions (Kerman et al., 2022; Patchan et al., 2016). This suggests that ChatGPT-generated feedback could serve as a viable source of feedback, an observation supported by previous studies recognizing a positive role for AI-generated and automated feedback in enhancing educational outcomes (e.g., Bellhäuser et al., 2023; Gombert et al., 2024; Huang et al., 2023; Xia et al., 2022).

Finally, an overarching look at the results of the first research question suggests a potential complementary role for ChatGPT and students in the feedback process. This means that using these two feedback sources together creates a synergistic relationship that could result in better feedback outcomes.

Discussion on the results of RQ2

Results for the second research question revealed no significant correlation between the quality of the essays and the quality of the feedback generated by either peers or ChatGPT. These findings carry a consequential implication: the inherent quality of the essays under scrutiny exerted negligible influence on the quality of feedback provided by both students and ChatGPT.

In essence, these results point to a notable degree of independence between the writing prowess exhibited in the essays and the efficacy of the feedback received from either source. This disassociation implies that the ability to produce high-quality essays does not inherently translate into a corresponding ability to provide equally insightful feedback, neither for peers nor for ChatGPT. This decoupling of essay quality from feedback quality highlighted the multifaceted nature of these evaluative processes, where proficiency in constructing a coherent essay does not necessarily guarantee an equally adept capacity for evaluating and articulating constructive commentary on peers’ work.

The implications of these findings are both intriguing and defy conventional expectations, as they deviate somewhat from the prevailing literature’s stance. The existing body of scholarly work generally posits a direct relationship between the quality of an essay and the subsequent quality of generated feedback (Noroozi et al., 2016, 2022; Kerman et al., 2022; Valero Haro et al., 2023). This line of thought contends that essays of inferior quality might serve as a catalyst for more pronounced error detection among students, encompassing grammatical intricacies, depth of content, clarity, and coherence, as well as the application of evidence and support. Conversely, when essays are skillfully crafted, the act of pinpointing areas for enhancement becomes a more complex task, potentially necessitating a heightened level of subject comprehension and nuanced evaluation.

However, the present study’s findings challenge this conventional wisdom. The observed decoupling of essay quality from feedback quality suggests a more nuanced interplay between the two facets of assessment. Rather than adhering to the anticipated pattern, wherein weaker essays prompt clearer identification of deficiencies, and superior essays potentially render the feedback process more challenging, the study suggests that the process might be more complex than previously thought. It hints at a dynamic in which the act of evaluating essays and providing constructive feedback transcends a simple linear connection with essay quality.

These findings, while potentially unexpected, are an indication of the complex nature of essay assignments and feedback provision highlighting the complexity of cognitive processes that underlie both tasks, and suggesting that the relationship between essay quality and feedback quality is not purely linear but influenced by a multitude of factors, including the evaluator’s cognitive framework, familiarity with the subject matter, and critical analysis skills.

Despite this general observation, a closer examination of the affective features within the feedback reveals a different pattern. The positive correlation between essay quality and the affective features present in ChatGPT-generated feedback could be related to ChatGPT’s capacity to recognize and appreciate students’ good work. As the quality of the essay increases, ChatGPT might be programmed to offer more positive and motivational feedback to acknowledge students’ progress (e.g., Farrokhnia et al., 2023 ; Ray, 2023 ). In contrast, the negative relationship between essay quality and the affective features in peer feedback may be attributed to the evolving nature of feedback from peers (e.g., Patchan et al., 2016 ). This suggests that as students witness improvements in their peers’ essay-writing skills and knowledge, their feedback priorities may naturally evolve. For instance, students may transition from emphasizing emotional and affective comments to focusing on cognitive and constructive feedback, with the goal of further enhancing the overall quality of the essays.

Limitations and implications for future research and practice

We acknowledge the limitations of this study. Primarily, the data underpinning this investigation was drawn exclusively from a singular institution and a solitary course, featuring a relatively modest participant pool. This confined scope inevitably introduces certain constraints that need to be taken into consideration when interpreting the study’s outcomes and generalizing them to broader educational contexts. Under this constrained sampling, the findings might exhibit a degree of contextual specificity, potentially limiting their applicability to diverse institutional settings and courses with distinct curricular foci. The diverse array of academic environments, student demographics, and subject matter variations existing across educational institutions could potentially yield divergent patterns of results. Therefore, while the current study’s outcomes provide insights within the confines of the studied institution and course, they should be interpreted and generalized with prudence. Recognizing these limitations, for future studies, we recommend considering a large-scale participant pool with a diverse range of variables, including individuals from various programs and demographics. This approach would enrich the depth and breadth of understanding in this domain, fostering a more comprehensive comprehension of the complex dynamics at play.

In addition, this study omitted an exploration into the degree to which students utilize feedback provided by peers and ChatGPT. That is to say that we did not investigate the effects of such feedback on essay enhancements in the revision phase. This omission inherently introduces a dimension of uncertainty and places a constraint on the study’s holistic understanding of the feedback loop. By not addressing these aspects, the study’s insights are somewhat partial, limiting the comprehensive grasp of the potential influences that these varied feedback sources wield on students’ writing enhancement processes. An analysis of the feedback assimilation patterns and their subsequent effects on essay refinement would have unveiled insights into the practical utility and impact of the feedback generated by peers and ChatGPT.

To address this limitation, future investigations could be structured to encompass a more thorough examination of students’ feedback utilization strategies and the resulting implications for the essay revision process. By shedding light on the complex interconnection between feedback reception, its integration into the revision process, and the ultimate outcomes in terms of essay improvement, a more comprehensive understanding of the dynamics involved could be attained.

Furthermore, in this study, we employed identical question prompts for both peers and ChatGPT. However, there is evidence indicating that ChatGPT is sensitive to how prompts are presented to it (e.g., Cao et al., 2023 ; White et al., 2023 ; Zuccon & Koopman, 2023 ). This suggests that variations in the wording, structure, or context of prompts might influence the responses generated by ChatGPT, potentially impacting the comparability of its outputs with those of peers. Therefore, it is essential to carefully consider and control for prompt-related factors in future research when assessing ChatGPT’s performance and capabilities in various tasks and contexts.
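
As an illustration of what controlling for prompt-related factors might look like, the sketch below sends several rewordings of the same feedback request and prints the responses for comparison. It assumes the OpenAI Python SDK and an illustrative model name; it is not the procedure used in this study, which employed identical prompts for peers and ChatGPT.

```python
# Minimal sketch (not the study's procedure): probe sensitivity to prompt wording
# by sending reworded versions of the same feedback request and comparing outputs.
from openai import OpenAI  # assumes the OpenAI Python SDK and an OPENAI_API_KEY environment variable

client = OpenAI()
essay = "..."  # the student's argumentative essay text would go here

prompt_variants = [
    f"Give feedback on the quality of argumentation in this essay:\n{essay}",
    f"As a writing tutor, list the strengths and weaknesses of this essay:\n{essay}",
    f"Evaluate this essay and suggest concrete improvements:\n{essay}",
]

for i, prompt in enumerate(prompt_variants, start=1):
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice, not the one used in the study
        messages=[{"role": "user", "content": prompt}],
        temperature=0,  # minimize sampling variation so differences reflect prompt wording
    )
    print(f"--- Prompt variant {i} ---")
    print(response.choices[0].message.content)
```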

In addition, we acknowledge that ChatGPT can generate inaccurate results. Nevertheless, in the context of this study, our examination of the output generated by ChatGPT did not reveal significant inaccuracies that would warrant inclusion in our findings.

From a methodological perspective, we reported the interrater reliability between the coders to be 75%. While this level of agreement was statistically significant, signifying the reliability of our coders’ analyses, it did not reach the desired level of precision. We acknowledge this as a limitation of the study and suggest enhancing interrater reliability through additional coder training.
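
For researchers who wish to replicate or strengthen this reliability check, the sketch below shows how two common measures, percent agreement and Cohen’s kappa (a chance-corrected complement), might be computed for two coders. The category labels are hypothetical, and neither measure is claimed to be the exact statistic behind the 75% figure reported here.

```python
# Illustrative interrater reliability check for two coders (hypothetical codes, not the study's data).
from sklearn.metrics import cohen_kappa_score

coder_a = ["cognitive", "affective", "cognitive", "constructive", "affective", "cognitive"]
coder_b = ["cognitive", "affective", "constructive", "constructive", "affective", "cognitive"]

# Simple percent agreement between the two coders.
agreement = sum(a == b for a, b in zip(coder_a, coder_b)) / len(coder_a)

# Cohen's kappa corrects raw agreement for agreement expected by chance.
kappa = cohen_kappa_score(coder_a, coder_b)

print(f"Percent agreement: {agreement:.0%}")
print(f"Cohen's kappa: {kappa:.2f}")
```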

In addition, it is worth noting that the advancement of generative AI such as ChatGPT opens new avenues in educational feedback mechanisms. Beyond generating feedback, these AI models have the potential to redefine how feedback is presented and assimilated. In the realm of research on adaptive learning systems, the findings of this study also echo the importance of adaptive learning support empowered by AI and ChatGPT (Rummel et al., 2016). Such support can pave the way for tailored educational experiences that respond dynamically to individual student needs; this concerns not only the feedback’s content but also its delivery, timing, and adaptability. Further exploratory data analyses, such as sequential analysis and data mining, may offer insights into the nuanced ways different adaptive learning supports can foster student discussions (Papamitsiou & Economides, 2014). This involves dissecting the feedback dynamics, understanding how varied feedback types stimulate discourse, and identifying patterns that lead to enhanced student engagement.

Ensuring the reliability and validity of AI-empowered feedback is also crucial. The goal is to ascertain that technology-empowered learning support genuinely enhances students’ learning process in a consistent and unbiased manner. Given that ChatGPT generates varied responses depending on how it is prompted, the call for enhancing methodological rigor through future validation studies becomes both timely and essential. For example, in-depth prompt validation and blind feedback assessment studies could be employed to probe the consistency and quality of ChatGPT’s responses, and comparative analyses with different AI models could also be useful.

From an educational standpoint, our research findings advocate for integrating ChatGPT as a feedback resource alongside peer feedback within higher education environments for essay writing tasks, since peer-generated and ChatGPT-generated feedback can potentially play complementary roles. This approach holds the potential to alleviate the workload burden on teachers, particularly in the context of online courses with a significant number of students.

This study adds value to the young but rapidly growing literature in two distinct ways. From a research perspective, it addresses a significant void in the current literature by responding to the lack of research on AI-generated feedback for complex tasks such as essay writing in higher education. The research bridges this gap by analyzing the effectiveness of ChatGPT-generated feedback compared to peer-generated feedback, thereby establishing a foundation for further exploration in this field. From a practical perspective, the study’s findings offer insights into the potential integration of ChatGPT as a feedback source within higher education contexts. The finding that ChatGPT’s feedback quality could complement peer feedback highlights its applicability for enhancing feedback practices in higher education. This holds particular promise for courses with substantial enrolments and essay-writing components, providing teachers with a feasible alternative for delivering constructive feedback to a larger number of students.

Data availability

The data are available upon reasonable request.

References

Alqassab, M., Strijbos, J. W., & Ufer, S. (2018). Training peer-feedback skills on geometric construction tasks: Role of domain knowledge and peer-feedback levels. European Journal of Psychology of Education , 33 (1), 11–30. https://doi.org/10.1007/s10212-017-0342-0 .


Amiryousefi, M., & Geld, R. (2021). The role of redressing teachers’ instructional feedback interventions in EFL learners’ motivation and achievement in distance education. Innovation in Language Learning and Teaching , 15 (1), 13–25. https://doi.org/10.1080/17501229.2019.1654482 .

Arguedas, M., Daradoumis, A., & Xhafa, F. (2016). Analyzing how emotion awareness influences students’ motivation, engagement, self-regulation and learning outcome. Educational Technology and Society , 19 (2), 87–103. https://www.jstor.org/stable/jeductechsoci.19.2.87 .


Banihashem, S. K., Noroozi, O., van Ginkel, S., Macfadyen, L. P., & Biemans, H. J. (2022). A systematic review of the role of learning analytics in enhancing feedback practices in higher education. Educational Research Review , 100489. https://doi.org/10.1016/j.edurev.2022.100489 .

Banihashem, S. K., Dehghanzadeh, H., Clark, D., Noroozi, O., & Biemans, H. J. (2023). Learning analytics for online game-based learning: A systematic literature review. Behaviour & Information Technology , 1–28. https://doi.org/10.1080/0144929X.2023.2255301 .

Bellhäuser, H., Dignath, C., & Theobald, M. (2023). Daily automated feedback enhances self-regulated learning: A longitudinal randomized field experiment. Frontiers in Psychology , 14 , 1125873. https://doi.org/10.3389/fpsyg.2023.1125873 .

Bond, M., Khosravi, H., De Laat, M., Bergdahl, N., Negrea, V., Oxley, E., & Siemens, G. (2024). A meta systematic review of artificial intelligence in higher education: A call for increased ethics, collaboration, and rigour. International Journal of Educational Technology in Higher Education , 21 (4), 1–41. https://doi.org/10.1186/s41239-023-00436-z .

Bulqiyah, S., Mahbub, M., & Nugraheni, D. A. (2021). Investigating writing difficulties in Essay writing: Tertiary Students’ perspectives. English Language Teaching Educational Journal , 4 (1), 61–73. https://doi.org/10.12928/eltej.v4i1.2371 .

Callender, A. A., Franco-Watkins, A. M., & Roberts, A. S. (2016). Improving metacognition in the classroom through instruction, training, and feedback. Metacognition and Learning , 11 (2), 215–235. https://doi.org/10.1007/s11409-015-9142-6 .

Cao, J., Li, M., Wen, M., & Cheung, S. C. (2023). A study on prompt design, advantages and limitations of ChatGPT for deep learning program repair. arXiv preprint arXiv:2304.08191. https://doi.org/10.48550/arXiv.2304.08191 .

Dai, W., Lin, J., Jin, F., Li, T., Tsai, Y. S., Gasevic, D., & Chen, G. (2023). Can large language models provide feedback to students? A case study on ChatGPT. https://doi.org/10.35542/osf.io/hcgzj .

Darvishi, A., Khosravi, H., Sadiq, S., Gašević, D., & Siemens, G. (2024). Impact of AI assistance on student agency. Computers & Education , 210 , 104967. https://doi.org/10.1016/j.compedu.2023.104967 .

Deeva, G., Bogdanova, D., Serral, E., Snoeck, M., & De Weerdt, J. (2021). A review of automated feedback systems for learners: Classification framework, challenges and opportunities. Computers & Education , 162 , 104094. https://doi.org/10.1016/j.compedu.2020.104094 .

Diezmann, C. M., & Watters, J. J. (2015). The knowledge base of subject matter experts in teaching: A case study of a professional scientist as a beginning teacher. International Journal of Science and Mathematics Education , 13 , 1517–1537. https://doi.org/10.1007/s10763-014-9561-x .

Drachsler, H. (2023). Towards highly informative learning analytics . Open Universiteit. https://doi.org/10.25656/01:26787 .

Drachsler, H., & Kalz, M. (2016). The MOOC and learning analytics innovation cycle (MOLAC): A reflective summary of ongoing research and its challenges. Journal of Computer Assisted Learning , 32 (3), 281–290. https://doi.org/10.1111/jcal.12135 .

Er, E., Dimitriadis, Y., & Gašević, D. (2021). Collaborative peer feedback and learning analytics: Theory-oriented design for supporting class-wide interventions. Assessment & Evaluation in Higher Education , 46 (2), 169–190. https://doi.org/10.1080/02602938.2020.1764490 .

Farrokhnia, M., Banihashem, S. K., Noroozi, O., & Wals, A. (2023). A SWOT analysis of ChatGPT: Implications for educational practice and research. Innovations in Education and Teaching International , 1–15. https://doi.org/10.1080/14703297.2023.2195846 .

Gan, Z., An, Z., & Liu, F. (2021). Teacher feedback practices, student feedback motivation, and feedback behavior: How are they associated with learning outcomes? Frontiers in Psychology , 12 , 697045. https://doi.org/10.3389/fpsyg.2021.697045 .

Gao, X., Noroozi, O., Gulikers, J. T. M., Biemans, H. J., & Banihashem, S. K. (2024). A systematic review of the key components of online peer feedback practices in higher education. Educational Research Review , 100588. https://doi.org/10.1016/j.edurev.2023.100588 .

Gielen, M., & De Wever, B. (2015). Scripting the role of assessor and assessee in peer assessment in a wiki environment: Impact on peer feedback quality and product improvement. Computers & Education , 88 , 370–386. https://doi.org/10.1016/j.compedu.2015.07.012 .

Gombert, S., Fink, A., Giorgashvili, T., Jivet, I., Di Mitri, D., Yau, J., & Drachsler, H. (2024). From the Automated Assessment of Student Essay Content to highly informative feedback: A case study. International Journal of Artificial Intelligence in Education , 1–39. https://doi.org/10.1007/s40593-023-00387-6 .

Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational Research , 77 (1), 81–112. https://doi.org/10.3102/003465430298487 .

Holt-Reynolds, D. (1999). Good readers, good teachers? Subject matter expertise as a challenge in learning to teach. Harvard Educational Review , 69 (1), 29–51. https://doi.org/10.17763/haer.69.1.pl5m5083286l77t2 .

Huang, A. Y., Lu, O. H., & Yang, S. J. (2023). Effects of artificial intelligence–enabled personalized recommendations on learners’ learning engagement, motivation, and outcomes in a flipped classroom. Computers & Education , 194 , 104684. https://doi.org/10.1016/j.compedu.2022.104684 .

Katz, A., Wei, S., Nanda, G., Brinton, C., & Ohland, M. (2023). Exploring the efficacy of ChatGPT in analyzing student teamwork feedback with an existing taxonomy. arXiv preprint arXiv:2305.11882. https://doi.org/10.48550/arXiv.2305.11882 .

Kerman, N. T., Noroozi, O., Banihashem, S. K., Karami, M., & Biemans, H. J. (2022). Online peer feedback patterns of success and failure in argumentative essay writing. Interactive Learning Environments , 1–13. https://doi.org/10.1080/10494820.2022.2093914 .

Kerman, N. T., Banihashem, S. K., Karami, M., Er, E., Van Ginkel, S., & Noroozi, O. (2024). Online peer feedback in higher education: A synthesis of the literature. Education and Information Technologies , 29 (1), 763–813. https://doi.org/10.1007/s10639-023-12273-8 .

King, A. (2002). Structuring peer interaction to promote high-level cognitive processing. Theory into Practice , 41 (1), 33–39. https://doi.org/10.1207/s15430421tip4101_6 .

Konold, K. E., Miller, S. P., & Konold, K. B. (2004). Using teacher feedback to enhance student learning. Teaching Exceptional Children , 36 (6), 64–69. https://doi.org/10.1177/004005990403600608 .

Korteling, J. H., van de Boer-Visschedijk, G. C., Blankendaal, R. A., Boonekamp, R. C., & Eikelboom, A. R. (2021). Human-versus artificial intelligence. Frontiers in Artificial Intelligence , 4 , 622364. https://doi.org/10.3389/frai.2021.622364 .

Labuhn, A. S., Zimmerman, B. J., & Hasselhorn, M. (2010). Enhancing students’ self-regulation and mathematics performance: The influence of feedback and self-evaluative standards. Metacognition and Learning , 5 , 173–194. https://doi.org/10.1007/s11409-010-9056-2 .

Lamb, R., Firestone, J., Schmitter-Edgecombe, M., & Hand, B. (2019). A computational model of student cognitive processes while solving a critical thinking problem in science. The Journal of Educational Research , 112 (2), 243–254. https://doi.org/10.1080/00220671.2018.1514357 .

Latifi, S., Noroozi, O., & Talaee, E. (2023). Worked example or scripting? Fostering students’ online argumentative peer feedback, essay writing and learning. Interactive Learning Environments , 31 (2), 655–669. https://doi.org/10.1080/10494820.2020.1799032 .

Li, L., Liu, X., & Steckelberg, A. L. (2010). Assessor or assessee: How student learning improves by giving and receiving peer feedback. British Journal of Educational Technology , 41 (3), 525–536. https://doi.org/10.1111/j.1467-8535.2009.00968.x .

Liu, N. F., & Carless, D. (2006). Peer feedback: The learning element of peer assessment. Teaching in Higher Education , 11 (3), 279–290. https://doi.org/10.1080/13562510600680582 .

Liunokas, Y. (2020). Assessing students’ ability in writing argumentative essay at an Indonesian senior high school. IDEAS: Journal on English language teaching and learning. Linguistics and Literature , 8 (1), 184–196. https://doi.org/10.24256/ideas.v8i1.1344 .

Nelson, M. M., & Schunn, C. D. (2009). The nature of feedback: How different types of peer feedback affect writing performance. Instructional Science , 37 , 375–401. https://doi.org/10.1007/s11251-008-9053-x .

Noroozi, O., Banihashem, S. K., Taghizadeh Kerman, N., Parvaneh Akhteh Khaneh, M., Babayi, M., Ashrafi, H., & Biemans, H. J. (2022). Gender differences in students’ argumentative essay writing, peer review performance and uptake in online learning environments. Interactive Learning Environments , 1–15. https://doi.org/10.1080/10494820.2022.2034887 .

Noroozi, O., Biemans, H., & Mulder, M. (2016). Relations between scripted online peer feedback processes and quality of written argumentative essay. The Internet and Higher Education , 31, 20-31. https://doi.org/10.1016/j.iheduc.2016.05.002

Noroozi, O., Banihashem, S. K., Biemans, H. J., Smits, M., Vervoort, M. T., & Verbaan, C. L. (2023). Design, implementation, and evaluation of an online supported peer feedback module to enhance students’ argumentative essay quality. Education and Information Technologies , 1–28. https://doi.org/10.1007/s10639-023-11683-y .

Papamitsiou, Z., & Economides, A. A. (2014). Learning analytics and educational data mining in practice: A systematic literature review of empirical evidence. Journal of Educational Technology & Society , 17 (4), 49–64. https://www.jstor.org/stable/jeductechsoci.17.4.49 .

Pardo, A., Jovanovic, J., Dawson, S., Gašević, D., & Mirriahi, N. (2019). Using learning analytics to scale the provision of personalised feedback. British Journal of Educational Technology , 50 (1), 128–138. https://doi.org/10.1111/bjet.12592 .

Patchan, M. M., Schunn, C. D., & Correnti, R. J. (2016). The nature of feedback: How peer feedback features affect students’ implementation rate and quality of revisions. Journal of Educational Psychology , 108 (8), 1098. https://doi.org/10.1037/edu0000103 .

Ramsden, P. (2003). Learning to teach in higher education . Routledge.

Ray, P. P. (2023). ChatGPT: A comprehensive review on background, applications, key challenges, bias, ethics, limitations and future scope. Internet of Things and Cyber-Physical Systems , 3 , 121–154. https://doi.org/10.1016/j.iotcps.2023.04.003 .

Rüdian, S., Heuts, A., & Pinkwart, N. (2020). Educational Text Summarizer: Which sentences are worth asking for? In DELFI 2020 - The 18th Conference on Educational Technologies of the German Informatics Society (pp. 277–288). Bonn, Germany.

Rummel, N., Walker, E., & Aleven, V. (2016). Different futures of adaptive collaborative learning support. International Journal of Artificial Intelligence in Education , 26 , 784–795. https://doi.org/10.1007/s40593-016-0102-3 .

Shi, M. (2019). The effects of class size and instructional technology on student learning performance. The International Journal of Management Education , 17 (1), 130–138. https://doi.org/10.1016/j.ijme.2019.01.004 .


Toulmin, S. (1958). The uses of argument . Cambridge University Press.

Valero Haro, A., Noroozi, O., Biemans, H. J., Mulder, M., & Banihashem, S. K. (2023). How does the type of online peer feedback influence feedback quality, argumentative essay writing quality, and domain-specific learning? Interactive Learning Environments , 1–20. https://doi.org/10.1080/10494820.2023.2215822 .

White, J., Fu, Q., Hays, S., Sandborn, M., Olea, C., Gilbert, H., & Schmidt, D. C. (2023). A prompt pattern catalog to enhance prompt engineering with chatgpt. arXiv preprint arXiv:2302.11382 . https://doi.org/10.48550/arXiv.2302.11382 .

Wu, Y., & Schunn, C. D. (2020). From feedback to revisions: Effects of feedback features and perceptions. Contemporary Educational Psychology , 60 , 101826. https://doi.org/10.1016/j.cedpsych.2019.101826 .

Xia, Q., Chiu, T. K., Zhou, X., Chai, C. S., & Cheng, M. (2022). Systematic literature review on opportunities, challenges, and future research recommendations of artificial intelligence in education. Computers and Education: Artificial Intelligence , 100118. https://doi.org/10.1016/j.caeai.2022.100118 .

Zawacki-Richter, O., Marín, V. I., Bond, M., & Gouverneur, F. (2019). Systematic review of research on artificial intelligence applications in higher education–where are the educators? International Journal of Educational Technology in Higher Education , 16 (1), 1–27. https://doi.org/10.1186/s41239-019-0171-0 .

Zhang, Z. V., & Hyland, K. (2022). Fostering student engagement with feedback: An integrated approach. Assessing Writing , 51 , 100586. https://doi.org/10.1016/j.asw.2021.100586 .

Zuccon, G., & Koopman, B. (2023). Dr ChatGPT, tell me what I want to hear: How prompt knowledge impacts health answer correctness. arXiv preprint arXiv:2302.13793. https://doi.org/10.48550/arXiv.2302.13793 .

Funding

No funding has been received for this research.

Author information

Authors and affiliations

Open Universiteit, Heerlen, The Netherlands

Seyyed Kazem Banihashem & Hendrik Drachsler

Wageningen University and Research, Wageningen, The Netherlands

Seyyed Kazem Banihashem & Omid Noroozi

Ferdowsi University of Mashhad, Mashhad, Iran

Nafiseh Taghizadeh Kerman

The University of Alabama, Tuscaloosa, USA

Jewoong Moon

DIPF Leibniz Institute, Goethe University, Frankfurt, Germany

Hendrik Drachsler


Contributions

S. K. Banihashem led this research experiment. N. T. Kerman contributed to the data analysis and writing. O. Noroozi contributed to designing, writing, and reviewing the manuscript. J. Moon contributed to writing and revising the manuscript. H. Drachsler contributed to writing and revising the manuscript.

Corresponding author

Correspondence to Seyyed Kazem Banihashem.

Ethics declarations

Declaration of AI-assisted technologies in the writing process

The authors used generative AI for language editing and took full responsibility.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Cite this article

Banihashem, S. K., Kerman, N. T., Noroozi, O. et al. Feedback sources in essay writing: peer-generated or AI-generated feedback? Int J Educ Technol High Educ 21, 23 (2024). https://doi.org/10.1186/s41239-024-00455-4


Received: 20 November 2023

Accepted: 18 March 2024

Published: 12 April 2024

DOI: https://doi.org/10.1186/s41239-024-00455-4

Keywords

  • AI-generated feedback
  • Essay writing
  • Feedback sources
  • Higher education
  • Peer feedback


13 Phrases People With High-Level Thinking Often Say, According to Psychologists

High-level thinking sounds fancy, and you know what? It is.

"High-level thinking goes out of the ordinary and beyond the regular regurgitation of facts or parroting of information that we hear on TV or read in the news," says  Dr. Elisabeth Crain, PsyD . , a doctor of psychology and licensed psychotherapist. 

On second thought, high-level thinking sounds heavenly in 2024. One psychologist likens it to being a skilled chef in a bustling kitchen:

"Instead of juggling pots and pans, you're juggling complex ideas, patterns and abstract concepts," says Dr. Gayle MacBride, PhD, LP of Veritas Psychology Partners. "It involves stepping back from the immediate details to view the bigger picture, making connections between seemingly unrelated ideas and employing critical and creative thinking skills to solve problems in innovative ways."

The advantages of being a high-level thinker are immense.

"The benefits of high-level thinking are akin to having a Swiss Army knife for your brain," Dr. MacBride says. "It enhances your problem-solving abilities, makes you more adaptable to change and improves your capacity to understand others' perspectives."

As a result, Dr. MacBride says you're more likely to be creative, make informed decisions, display empathy and be more fun at parties because you're so interesting.

How do you know if you're a high-level thinker? What you say will hold clues. A pair of psychologists shared 13 common phrases of people with high-level thinking.


13 High-Level Thinking Phrases, According to Psychologists

1. "let's look at this from a different angle.".

People with high-level thinking relish looking at ideas and discussions from all sides, even ones different from theirs.

"This phrase shows their willingness to explore alternative perspectives and find novel solutions," Dr. MacBride says.

2. "It is my belief that..."

This one sounds super affirmative for someone who likes to explore different angles. However, high-level thinkers can be open-minded and confident in their opinions.

" People with high-level thinking tend to have a lot of conviction in the things that they say because they’ve been able to establish their own belief system around the thoughts and ideas they have," Dr. Crain says. "These beliefs become part of their identity in a lot of ways, so when they speak, they have a lot of conviction in what they say." 

3. "I believe that..."

This one is almost identical to No. 2, but there's a key difference: It starts with "I," and that's a significant reason why Dr. Crain used it as an example.

"People with high-level thinking tend to use more 'I statements' because their beliefs are tied to their identity," Dr. Crain says.

4. "What's the bigger picture here?"

This phrase exemplifies why people with high-level skills make such strong leaders.

"They're adept at zooming out to see how the pieces fit together, understanding that details are part of larger systems," Dr. MacBride says.

5. "How does this connect to what we were discussing earlier?"

High-level thinkers can connect the dots like nobody's business.

"When high-level thinkers use this phrase, they are demonstrating their ability to draw parallels and link ideas across different contexts," Dr. MacBride says.

6. "What if we approached it this way instead?"

Dr. MacBride loves that this phrase opens the door to problem-solving using creativity and flexibility.

"These are people who don’t just advocate buying new but find creative ways to use what they already have in new ways," Dr. MacBride explains. 

Speaking of which...

7. "Let's see if we can do more with less."

Dr. MacBride says this phrase essentially says the same thing as No. 6, but it's in statement form and a bit more direct. It's an excellent choice for resource-strapped situations.

8. "I can see how you may see it this way. However, to me, it looks more like this..."

This phrase shows a person's ability to see things from all sides without sacrificing firmly held beliefs. 

"Someone who can high-level think will acknowledge other thoughts, beliefs and ideas that don’t necessarily resonate with them," Dr. Crain says. "They can recognize and acknowledge other ways of thinking. They know that thinking is not linear or concrete."

9. "Let's break down the assumptions we're making."

High-level thinkers are willing to push back to help people move forward.

"They challenge the status quo and question underlying beliefs, which can lead to breakthrough insights," Dr. MacBride explains.

10. "Can we find a pattern?"

Again, high-level thinkers are always looking to connect the dots.

"Looking for patterns is a hallmark of strategic thinking, crucial for predicting and planning," Dr. MacBride says.

11. "What are the potential ripple effects?"

People with high-level thinking skills are willing to give even the seemingly best ideas a second (or third and fourth) thought before proceeding, partly because of their empathy toward others.

"Considering the broader consequences of decisions shows an understanding of cause and effect on a complex level," Dr. MacBride says. "They are thoughtful and slow to respond because they are performing some mental gymnastics before they weigh in."

12. "I’d like to listen to what you have to say about it..."

People who can think at a high level don't just talk. They listen.

"People with high-level thinking tend to be really good listeners," Dr. Crain says. "They are very curious about what others have to say."

13. "Can you tell me more about..."

Curiosity is a pillar of high-level thinking.

"They seek further information to concretize what they’re learning," Dr. Crain says. "When we learn about something and understand it, we can then concretize and master it and finally formulate our own thoughts and beliefs around it."


Things People With High-Level Thinking Do *Not* Say

High-level thinking invites discussion and differing viewpoints. It doesn't shut it down. "Something that someone with high-level thinking wouldn’t say is, 'That's just the way it is,'" Dr. MacBride says. "This statement closes the door on questioning, curiosity and the possibility of change. It's antithetical to the very nature of high-level thinking, which thrives on open-ended questions and the potential for innovation."

Similarly, Dr. Crain says you won't hear someone with high-level thinking flat-out tell people they are wrong.

"They respect the fact that others have different ways of thinking, even if they don’t agree with their methods," Dr. Crain says. "They don’t put absolutes on things because they know that ideas are malleable and that they change and shift over time. They do not say things like, 'It is 100% this way.'"


3 Tips for Becoming a High-Level Thinker

1. Embrace curiosity

Forget what curiosity does to cats. This trait fosters high-level thinking.

"Like a child marveling at the world, ask 'why' frequently," Dr. MacBride says. "Dive into topics outside your comfort zone to broaden your perspective. Implement this by adopting a learner's mindset, where you see every experience as an opportunity to grow."

As a result, you'll prevent rushing to judgment before you've seen all angles.

2. Pause and reflect

Dr. Crain recommends pausing after reading new ideas or engaging in a healthy discourse with others.

"Take some time out after you’ve learned things and think about what resonated with you," Dr. Crain says. "Take your learning and thinking to that second or third step by forming your own opinions and beliefs on what you’ve learned."

3. Seek out diverse perspectives

Exit the echo chamber, as comfy as it is.

"Surround yourself with people and situations that challenge your thinking," Dr. MacBride recommends. "This can be through experience, books, podcasts or conversations with individuals from different backgrounds and disciplines."

Sources

  • Dr. Elisabeth Crain, PsyD, a doctor of psychology and licensed psychotherapist
  • Dr. Gayle MacBride, PhD, LP of Veritas Psychology Partners
