How to Read a Paper: Critical Review

Reading a scientific article is a complex task. The worst way to approach this task is to treat it like the reading of a textbook—reading from title to literature cited, digesting every word along the way without any reflection or criticism.

A critical review (sometimes called a critique, critical commentary, critical appraisal, critical analysis) is a detailed commentary on and critical evaluation of a text. You might carry out a critical review as a stand-alone exercise, or as part of your research and preparation for writing a literature review. The following guidelines are designed to help you critically evaluate a research article.

How to Read a Scientific Article

You should begin by skimming the article to identify its structure and features. As you read, look for the author’s main points.

  • Generate questions before, during, and after reading.
  • Draw inferences based on your own experiences and knowledge.
  • To really improve understanding and recall, take notes as you read.

What is meant by critical and evaluation?

  • To be critical does not mean to criticise in an exclusively negative manner. To be critical of a text means you question the information and opinions in it, in an attempt to evaluate or judge its overall worth.
  • An evaluation is an assessment of the strengths and weaknesses of a text. In the case of a research article, this assessment should relate to specific criteria. Before you can judge each section's contribution to the article as a whole, you have to understand its purpose and be aware of the type of information and evidence needed to make it convincing.


Write a Critical Review of a Scientific Journal Article

1. Identify how and why the research was carried out.
2. Establish the research context.
3. Evaluate the research.
4. Establish the significance of the research.


Read the article(s) carefully and use the questions below to help you identify how and why the research was carried out. Look at the introduction, methods, results, and discussion sections:

  • What was the objective of the study?
  • What methods were used to accomplish this purpose (e.g., systematic recording of observations, analysis and evaluation of published research, assessment of theory, etc.)?
  • What techniques were used and how was each technique performed?
  • What kind of data can be obtained using each technique?
  • How are such data interpreted?
  • What kind of information is produced by using the technique?
  • What objective evidence was obtained from the authors’ efforts (observations, measurements, etc.)?
  • What were the results of the study? 
  • How was each technique used to obtain each result?
  • What statistical tests were used to evaluate the significance of the conclusions based on numeric or graphic data?
  • How did each result contribute to answering the question or testing the hypothesis raised in the introduction?
  • How were the results interpreted? How were they related to the original problem (authors’ view of evidence rather than objective findings)? 
  • Were the authors able to answer the question (test the hypothesis) raised?
  • Did the research provide new factual information, a new understanding of a phenomenon in the field, or a new research technique?
  • How was the significance of the work described?
  • Do the authors relate the findings of the study to literature in the field?
  • Did the reported observations or interpretations support or refute observations or interpretations made by other researchers?

These questions were adapted from: Kuyper, B. J. (1991). Bringing up scientists in the art of critiquing research. BioScience, 41(4), 248-250; and Wood, J. M. (2003). Research Lab Guide. MICR*3260 Microbial Adaptation and Development website. Retrieved July 31, 2006.

Once you are familiar with the article, you can establish the research context by asking the following questions:

  • Who conducted the research? What were/are their interests?
  • When and where was the research conducted?
  • Why did the authors do this research?
  • Was this research pertinent only within the authors’ geographic locale, or did it have broader (even global) relevance?
  • Were many other laboratories pursuing related research when the reported work was done? If so, why?
  • For experimental research, what funding sources met the costs of the research?
  • On what prior observations was the research based? What was and was not known at the time?
  • How important was the research question posed by the researchers?


Remember that simply disagreeing with the material is not a critical assessment. For example, stating that the sample size is insufficient is not a critical assessment; describing why the sample size is insufficient for the claims being made in the study would be.

Use the questions below to help you evaluate the quality of the authors’ research:

  • Does the title precisely state the subject of the paper?
  • Read the statement of purpose in the abstract. Does it match the one in the introduction?
  • Could the source of the research funding (see the acknowledgments) have influenced the research topic or conclusions?
  • Check the sequence of statements in the introduction. Does all the information lead coherently to the purpose of the study?
  • Review all methods in relation to the objective(s) of the study. Are the methods valid for studying the problem?
  • Check the methods for essential information. Could the study be duplicated from the methods and information given?
  • Check the methods for flaws. Is the sample selection adequate? Is the experimental design sound?
  • Check the sequence of statements in the methods. Does all the information belong there? Is the sequence of methods clear and pertinent?
  • Was there mention of ethics? Which research ethics board approved the study?
  • Carefully examine the data presented in the tables and diagrams. Does the title or legend accurately describe the content? 
  • Are column headings and labels accurate? 
  • Are the data organized for ready comparison and interpretation? (A table should be self-explanatory, with a title that accurately and concisely describes content and column headings that accurately describe information in the cells.)
  • Review the results as presented in the text while referring to the data in the tables and diagrams. Does the text complement, and not simply repeat data? Are there discrepancies between the results in the text and those in the tables?
  • Check all calculations and presentation of data.
  • Review the results in light of the stated objectives. Does the study reveal what the researchers intended?
  • Does the discussion clearly address the objectives and hypotheses?
  • Check the interpretation against the results. Does the discussion merely repeat the results? 
  • Does the interpretation arise logically from the data or is it too far-fetched? 
  • Have the faults, flaws, or shortcomings of the research been addressed?
  • Is the interpretation supported by other research cited in the study?
  • Does the study consider key studies in the field?
  • What is the significance of the research? Do the authors mention wider implications of the findings?
  • Is there a section on recommendations for future research? Are there other research possibilities or directions suggested? 
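One practical way to "check all calculations and presentation of data" is to recompute a reported statistic from the paper's summary values or raw data, where these are available. The sketch below (plain Python, with made-up numbers rather than data from any real study) recomputes Welch's t statistic for two independent groups:

```python
import math
from statistics import mean, stdev

def welch_t(sample_a, sample_b):
    """Welch's t statistic for two independent samples with unequal variances."""
    na, nb = len(sample_a), len(sample_b)
    var_a, var_b = stdev(sample_a) ** 2, stdev(sample_b) ** 2
    se = math.sqrt(var_a / na + var_b / nb)  # standard error of the difference
    return (mean(sample_a) - mean(sample_b)) / se

# Hypothetical outcome scores for a control and a treated group
control = [4.1, 3.8, 4.5, 4.0, 3.9, 4.2]
treated = [5.0, 4.8, 5.3, 4.9, 5.1, 4.7]
t = welch_t(treated, control)
```

If the t value (and the p value derived from it) does not match what the paper reports, that discrepancy is worth raising in your critique.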

Consider the article as a whole

  • Reread the abstract. Does it accurately summarize the article?
  • Check the structure of the article (first headings and then paragraphing). Is all the material organized under the appropriate headings? Are sections divided logically into subsections or paragraphs?
  • Are stylistic concerns, logic, clarity, and economy of expression addressed?


After you have evaluated the research, consider whether the research has been successful. Has it led to new questions being asked, or new ways of using existing knowledge? Are other researchers citing this paper?

You should consider the following questions:

  • How did other researchers view the significance of the research reported by your authors?
  • Did the research reported in your article result in the formulation of new questions or hypotheses (by the authors or by other researchers)?
  • Have other researchers subsequently supported or refuted the observations or interpretations of these authors?
  • Did the research make a significant contribution to human knowledge?
  • Did the research produce any practical applications?
  • What are the social, political, technological, medical implications of this research?
  • How do you evaluate the significance of the research?

To answer these questions, look at review articles to find out how reviewers view this piece of research. Look at research articles and databases such as Web of Science to see how other people have used this work. What range of journals has cited this article?



This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.

  • Published: 31 January 2022

The fundamentals of critically appraising an article

  • Sneha Chotaliya, Academic Foundation Dentist, London, UK

BDJ Student volume 29, pages 12–13 (2022)

We are often surrounded by an abundance of research and articles, but their quality and validity can vary massively. Not everything will be of good quality, or even valid. An important part of reading a paper is first assessing it. This is a key skill for all healthcare professionals, as anything we read can impact or influence our practice. It is also important to stay up to date with the latest research and findings.



Chotaliya, S. The fundamentals of critically appraising an article. BDJ Student 29, 12–13 (2022). https://doi.org/10.1038/s41406-021-0275-6


Evaluating Research – Process, Examples and Methods

Evaluating Research

Definition:

Evaluating Research refers to the process of assessing the quality, credibility, and relevance of a research study or project. This involves examining the methods, data, and results of the research in order to determine its validity, reliability, and usefulness. Evaluating research can be done by both experts and non-experts in the field, and involves critical thinking, analysis, and interpretation of the research findings.

The Research Evaluation Process

The process of evaluating research typically involves the following steps:

Identify the Research Question

The first step in evaluating research is to identify the research question or problem that the study is addressing. This will help you to determine whether the study is relevant to your needs.

Assess the Study Design

The study design refers to the methodology used to conduct the research. You should assess whether the study design is appropriate for the research question and whether it is likely to produce reliable and valid results.

Evaluate the Sample

The sample refers to the group of participants or subjects who are included in the study. You should evaluate whether the sample size is adequate and whether the participants are representative of the population under study.

Review the Data Collection Methods

You should review the data collection methods used in the study to ensure that they are valid and reliable. This includes assessing both the measures and the procedures used to collect the data.

Examine the Statistical Analysis

Statistical analysis refers to the methods used to analyze the data. You should examine whether the statistical analysis is appropriate for the research question and whether it is likely to produce valid and reliable results.

Assess the Conclusions

You should evaluate whether the data support the conclusions drawn from the study and whether they are relevant to the research question.

Consider the Limitations

Finally, you should consider the limitations of the study, including any potential biases or confounding factors that may have influenced the results.

Evaluating Research Methods

Common methods for evaluating research are as follows:

  • Peer review: Peer review is a process where experts in the field review a study before it is published. This helps ensure that the study is accurate, valid, and relevant to the field.
  • Critical appraisal: Critical appraisal involves systematically evaluating a study based on specific criteria. This helps assess the quality of the study and the reliability of the findings.
  • Replication: Replication involves repeating a study to test the validity and reliability of the findings. This can help identify any errors or biases in the original study.
  • Meta-analysis: Meta-analysis is a statistical method that combines the results of multiple studies to provide a more comprehensive understanding of a particular topic. This can help identify patterns or inconsistencies across studies.
  • Consultation with experts: Consulting with experts in the field can provide valuable insights into the quality and relevance of a study. Experts can also help identify potential limitations or biases in the study.
  • Review of funding sources: Examining the funding sources of a study can help identify any potential conflicts of interest or biases that may have influenced the study design or interpretation of results.
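As an illustration of the meta-analysis idea above, a fixed-effect (inverse-variance) pooled estimate can be sketched in a few lines of Python. The effect sizes and variances below are invented for illustration, not taken from any real studies:

```python
import math

def fixed_effect_meta(effects, variances):
    """Inverse-variance weighted (fixed-effect) pooled estimate and its standard error."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, se

# Hypothetical mean-difference effect sizes from three studies
effects = [0.30, 0.45, 0.20]
variances = [0.02, 0.05, 0.04]
pooled, se = fixed_effect_meta(effects, variances)
ci_95 = (pooled - 1.96 * se, pooled + 1.96 * se)  # approximate 95% confidence interval
```

Precise studies (small variances) receive large weights, so the pooled estimate is pulled toward the most reliable evidence; real meta-analyses also test for heterogeneity between studies before trusting a fixed-effect model.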

Example of Evaluating Research

Here is a sample evaluation, for students, of a hypothetical study:

Title of the Study: The Effects of Social Media Use on Mental Health among College Students

Sample Size: 500 college students

Sampling Technique: Convenience sampling

  • Sample Size: The sample size of 500 college students is moderate and could be considered representative of the college student population. It would be more representative if the sample were larger or if a random sampling technique had been used.
  • Sampling Technique: Convenience sampling is a non-probability sampling technique, which means the sample may not be representative of the population. Because participants are self-selected, this technique may introduce bias, and the results may not be generalizable to other populations.
  • Participant Characteristics: The study does not provide any information about the demographic characteristics of the participants, such as age, gender, race, or socioeconomic status. This information is important because social media use and mental health may vary among different demographic groups.
  • Data Collection Method: The study used a self-administered survey to collect data. Self-administered surveys may be subject to response bias and may not accurately reflect participants’ actual behaviors and experiences.
  • Data Analysis: The study used descriptive statistics and regression analysis to analyze the data. Descriptive statistics summarize the data, while regression analysis examines the relationship between two or more variables. However, the study did not report the statistical significance of the results or the effect sizes.

Overall, while the study provides some insights into the relationship between social media use and mental health among college students, the use of a convenience sampling technique and the lack of information about participant characteristics limit the generalizability of the findings. In addition, the use of self-administered surveys may introduce bias into the study, and the lack of information about the statistical significance of the results limits the interpretation of the findings.
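To see what the missing effect sizes would add, here is a small sketch (in Python, with invented numbers rather than data from the study above) computing Pearson's r and r² for hours of social media use against a well-being score:

```python
from statistics import mean

def pearson_r(x, y):
    """Pearson correlation coefficient; r ** 2 is the share of variance in y explained by x."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical data: daily hours of social media use vs. a well-being score
hours = [1, 2, 3, 4, 5, 6]
score = [8, 7, 7, 5, 4, 3]
r = pearson_r(hours, score)
r_squared = r ** 2  # proportion of variance explained
```

Reporting r (direction and strength) and r² alongside a p value tells the reader how much a relationship actually matters, not just whether it is statistically detectable.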

Note: the example above is a sample for students. Do not copy and paste it directly into your assignment; do your own research for academic purposes.

Applications of Evaluating Research

Here are some of the applications of evaluating research:

  • Identifying reliable sources: By evaluating research, researchers, students, and other professionals can identify the most reliable sources of information to use in their work. They can determine the quality of research studies, including the methodology, sample size, data analysis, and conclusions.
  • Validating findings: Evaluating research can help to validate findings from previous studies. By examining the methodology and results of a study, researchers can determine if the findings are reliable and if they can be used to inform future research.
  • Identifying knowledge gaps: Evaluating research can also help to identify gaps in current knowledge. By examining the existing literature on a topic, researchers can determine areas where more research is needed, and they can design studies to address these gaps.
  • Improving research quality: Evaluating research can help to improve the quality of future research. By examining the strengths and weaknesses of previous studies, researchers can design better studies and avoid common pitfalls.
  • Informing policy and decision-making: Evaluating research is crucial in informing policy and decision-making in many fields. By examining the evidence base for a particular issue, policymakers can make informed decisions that are supported by the best available evidence.
  • Enhancing education: Evaluating research is essential in enhancing education. Educators can use research findings to improve teaching methods, curriculum development, and student outcomes.

Purpose of Evaluating Research

Here are some of the key purposes of evaluating research:

  • Determine the reliability and validity of research findings: By evaluating research, researchers can determine the quality of the study design, data collection, and analysis. They can determine whether the findings are reliable, valid, and generalizable to other populations.
  • Identify the strengths and weaknesses of research studies: Evaluating research helps to identify the strengths and weaknesses of research studies, including potential biases, confounding factors, and limitations. This information can help researchers to design better studies in the future.
  • Inform evidence-based decision-making: Evaluating research is crucial in informing evidence-based decision-making in many fields, including healthcare, education, and public policy. Policymakers, educators, and clinicians rely on research evidence to make informed decisions.
  • Identify research gaps: By evaluating research, researchers can identify gaps in the existing literature and design studies to address these gaps. This process can help to advance knowledge and improve the quality of research in a particular field.
  • Ensure research ethics and integrity: Evaluating research helps to ensure that research studies are conducted ethically and with integrity. Researchers must adhere to ethical guidelines to protect the welfare and rights of study participants and to maintain the trust of the public.

Characteristics to Evaluate in Research

When evaluating research, look for the following characteristics:

  • Research question/hypothesis: A good research question or hypothesis should be clear, concise, and well-defined. It should address a significant problem or issue in the field and be grounded in relevant theory or prior research.
  • Study design: The research design should be appropriate for answering the research question and be clearly described in the study. The study design should also minimize bias and confounding variables.
  • Sampling: The sample should be representative of the population of interest, and the sampling method should be appropriate for the research question and study design.
  • Data collection: The data collection methods should be reliable and valid, and the data should be accurately recorded and analyzed.
  • Results: The results should be presented clearly and accurately, and the statistical analysis should be appropriate for the research question and study design.
  • Interpretation of results: The interpretation of the results should be based on the data and not influenced by personal biases or preconceptions.
  • Generalizability: The study findings should be generalizable to the population of interest and relevant to other settings or contexts.
  • Contribution to the field: The study should make a significant contribution to the field and advance our understanding of the research question or issue.

Advantages of Evaluating Research

Evaluating research has several advantages, including:

  • Ensuring accuracy and validity: By evaluating research, we can ensure that the research is accurate, valid, and reliable. This ensures that the findings are trustworthy and can be used to inform decision-making.
  • Identifying gaps in knowledge: Evaluating research can help identify gaps in knowledge and areas where further research is needed. This can guide future research and help build a stronger evidence base.
  • Promoting critical thinking: Evaluating research requires critical thinking skills, which can be applied in other areas of life. By evaluating research, individuals can develop their critical thinking skills and become more discerning consumers of information.
  • Improving the quality of research: Evaluating research can help improve the quality of research by identifying areas where improvements can be made. This can lead to more rigorous research methods and better-quality research.
  • Informing decision-making: By evaluating research, we can make informed decisions based on the evidence. This is particularly important in fields such as medicine and public health, where decisions can have significant consequences.
  • Advancing the field: Evaluating research can help advance the field by identifying new research questions and areas of inquiry. This can lead to the development of new theories and the refinement of existing ones.

Limitations of Evaluating Research

Limitations of Evaluating Research are as follows:

  • Time-consuming: Evaluating research can be time-consuming, particularly if the study is complex or requires specialized knowledge. This can be a barrier for individuals who are not experts in the field or who have limited time.
  • Subjectivity: Evaluating research can be subjective, as different individuals may have different interpretations of the same study. This can lead to inconsistencies in the evaluation process and make it difficult to compare studies.
  • Limited generalizability: The findings of a study may not be generalizable to other populations or contexts. This limits the usefulness of the study and may make it difficult to apply the findings to other settings.
  • Publication bias: Research that does not find significant results may be less likely to be published, which can create a bias in the published literature. This can limit the amount of information available for evaluation.
  • Lack of transparency: Some studies may not provide enough detail about their methods or results, making it difficult to evaluate their quality or validity.
  • Funding bias: Research funded by particular organizations or industries may be biased towards the interests of the funder. This can influence the study design, methods, and interpretation of results.

About the author


Muhammad Hassan

Researcher, Academic Writer, Web developer



Critically appraising qualitative research

  • Ayelet Kuper, assistant professor,1
  • Lorelei Lingard, associate professor,2
  • Wendy Levinson, Sir John and Lady Eaton professor and chair3
  • 1 Department of Medicine, Sunnybrook Health Sciences Centre, and Wilson Centre for Research in Education, University of Toronto, 2075 Bayview Avenue, Room HG 08, Toronto, ON, Canada M4N 3M5
  • 2 Department of Paediatrics and Wilson Centre for Research in Education, University of Toronto and SickKids Learning Institute; BMO Financial Group Professor in Health Professions Education Research, University Health Network, 200 Elizabeth Street, Eaton South 1-565, Toronto
  • 3 Department of Medicine, Sunnybrook Health Sciences Centre
  • Correspondence to: A Kuper ayelet94{at}post.harvard.edu

Six key questions will help readers to assess qualitative research

Summary points

Appraising qualitative research is different from appraising quantitative research

Qualitative research papers should show appropriate sampling, data collection, and data analysis

Transferability of qualitative research depends on context and may be enhanced by using theory

Ethics in qualitative research goes beyond review boards’ requirements to involve complex issues of confidentiality, reflexivity, and power

Over the past decade, readers of medical journals have gained skills in critically appraising studies to determine whether the results can be trusted and applied to their own practice settings. Criteria have been designed to assess studies that use quantitative methods, and these are now in common use.

In this article we offer guidance for readers on how to assess a study that uses qualitative research methods by providing six key questions to ask when reading qualitative research (box 1). However, the thorough assessment of qualitative research is an interpretive act and requires informed reflective thought rather than the simple application of a scoring system.

Box 1 Key questions to ask when reading qualitative research studies

  • Was the sample used in the study appropriate to its research question?
  • Were the data collected appropriately?
  • Were the data analysed appropriately?
  • Can I transfer the results of this study to my own setting?
  • Does the study adequately address potential ethical issues, including reflexivity?

Overall: is what the researchers did clear?

One of the critical decisions in a qualitative study is whom or what to include in the sample—whom to interview, whom to observe, what texts to analyse. An understanding that qualitative research is based in experience and in the construction of meaning, combined with the specific research question, should guide the sampling process. For example, a study of the experience of survivors of domestic violence that examined their reasons for not seeking help from healthcare providers might focus on interviewing a sample of such survivors (rather than, for example, healthcare providers, social services workers, or academics in the field). The sample should be broad enough to capture the many facets of a phenomenon, and limitations to the sample should be clearly justified. Since the answers to questions of experience and meaning also relate to people’s social affiliations (culture, religion, socioeconomic group, profession, etc), it is also important that the researcher acknowledges these contexts in the selection of a study sample.

In contrast with quantitative approaches, qualitative studies do not usually have predetermined sample sizes. Sampling stops when a thorough understanding of the phenomenon under study has been reached, an end point that is often called saturation. Researchers consider samples to be saturated when encounters (interviews, observations, etc) with new participants no longer elicit trends or themes not already raised by previous participants. Thus, to sample to saturation, data analysis has to happen while new data are still being collected. Multiple sampling methods may be used to broaden the understanding achieved in a study (box 2). These sampling issues should be clearly articulated in the methods section.

Box 2 Qualitative sampling methods for interviews and focus groups 9

Examples are for a hypothetical study of financial concerns among adult patients with chronic renal failure receiving ongoing haemodialysis in a single hospital outpatient unit.

Typical case sampling —sampling the most ordinary, usual cases of a phenomenon

The sample would include patients likely to have had typical experiences for that haemodialysis unit and patients who fit the profile of patients in the unit for factors found on literature review. Other typical cases could be found via snowball sampling (see below)

Deviant case sampling —sampling the most extreme cases of a phenomenon

The sample would include patients likely to have had different experiences of relevant aspects of haemodialysis. For example, if most patients in the unit are 60-70 years old and recently began haemodialysis for diabetic nephropathy, researchers might sample the unmarried university student in his 20s on haemodialysis since childhood, the 32 year old woman with lupus who is now trying to get pregnant, and the 90 year old who newly started haemodialysis due to an adverse reaction to radio-opaque contrast dye. Other deviant cases could be found via theoretical and/or snowball sampling (see below)

Critical case sampling —sampling cases that are predicted (based on theoretical models or previous research) to be especially information-rich and thus particularly illuminating

The nature of this sample depends on previous research. For example, if research showed that marital status was a major determinant of financial concerns for haemodialysis patients, then critical cases might include patients whose marital status changed while on haemodialysis

Maximum-variation sampling —sampling as wide a range of perspectives as possible to capture the broadest set of information and experiences

The sample would include typical, deviant, and critical cases (as above), plus any other perspectives identified

Confirming-disconfirming sampling —sampling both individuals or texts whose perspectives are likely to confirm the researcher’s developing understanding of the phenomenon under study and those whose perspectives are likely to challenge that understanding

The sample would include patients whose experiences would likely either confirm or disconfirm what the researchers had already learnt (from other patients) about financial concerns among patients in the haemodialysis unit. This could be accomplished via theoretical and/or snowball sampling (see below)

Snowball sampling —sampling participants found by asking current participants in a study to recommend others whose experiences would be relevant to the study

Current participants could be asked to provide the names of others in the unit who they thought, when asked about financial concerns, would either share their views (confirming), disagree with their views (disconfirming), have views typical of patients on their unit (typical cases), or have views different from most other patients on their unit (deviant cases)

Theoretical sampling —sampling individuals or texts whom the researchers predict (based on theoretical models or previous research) would add new perspectives to those already represented in the sample

Researchers could use their understanding of known issues for haemodialysis patients that would, in theory, relate to financial concerns to ensure that the relevant perspectives were represented in the study. For example, if, as the research progressed, it turned out that none of the patients in the sample had had to change or leave a job in order to accommodate haemodialysis scheduling, the researchers might (based on previous research) choose to intentionally sample patients who had left their jobs because of the time commitment of haemodialysis (but who could not do peritoneal dialysis) and others who had switched to jobs with more flexible scheduling because of their need for haemodialysis

It is important that a qualitative study carefully describes the methods used in collecting data. The appropriateness of the method(s) selected for the specific research question should be justified, ideally with reference to the research literature. It should be clear that methods were used systematically and in an organised manner. Attention should be paid to specific methodological challenges such as the Hawthorne effect, 1 whereby the presence of an observer may influence participants’ behaviours. By using a technique called thick description, qualitative studies often aim to include enough contextual information to provide readers with a sense of what it was like to have been in the research setting.

Another technique that is often used is triangulation, with which a researcher uses multiple methods or perspectives to help produce a more comprehensive set of findings. A study can triangulate data, using different sources of data to examine a phenomenon in different contexts (for example, interviewing palliative patients who are at home, those who are in acute care hospitals, and those who are in specialist palliative care units); it can also triangulate methods, collecting different types of data (for example, interviews, focus groups, observations) to increase insight into a phenomenon.

Another common technique is the use of an iterative process, whereby concurrent data analysis is used to inform data collection. For example, concurrent analysis of an interview study about lack of adherence to medications among a particular social group might show that early participants seem to be dismissive of the efforts of their local pharmacists; the interview script might then be changed to include an exploration of this phenomenon. The iterative process constitutes a distinctive qualitative tradition, in contrast to the tradition of stable processes and measures in quantitative studies. Iterations should be explicit and justified with reference to the research question and sampling techniques so that the reader understands how data collection shaped the resulting insights.

Qualitative studies should include a clear description of a systematic form of data analysis. Many legitimate analytical approaches exist; regardless of which is used, the study should report what was done, how, and by whom. If an iterative process was used, it should be clearly delineated. If more than one researcher analysed the data (which depends on the methodology used) it should be clear how differences between analyses were negotiated. Many studies make reference to a technique called member checking, wherein the researcher shows all or part of the study’s findings to participants to determine if they are in accord with their experiences. 2 Studies may also describe an audit trail, which might include researchers’ analysis notes, minutes of researchers’ meetings, and other materials that could be used to follow the research process.

The contextual nature of qualitative research means that careful thought must be given to the potential transferability of its results to other sociocultural settings. Though the study should discuss the extent of the findings’ resonance with the published literature, 3 much of the onus of assessing transferability is left to readers, who must decide if the setting of the study is sufficiently similar for its results to be transferable to their own context. In doing so, the reader looks for resonance—the extent that research findings have meaning for the reader.

Transferability may be helped by the study’s discussion of how its results advance theoretical understandings that are relevant to multiple situations. For example, a study of patients’ preferences in palliative care may contribute to theories of ethics and humanity in medicine, thus suggesting relevance to other clinical situations such as the informed consent exchange before treatment. We have explained elsewhere in this series the importance of theory in qualitative research, and there are many who believe that a key indicator of quality in qualitative research is its contribution to advancing theoretical understanding as well as useful knowledge. This debate continues in the literature, 4 but from a pragmatic perspective most qualitative studies in health professions journals emphasise results that relate to practice; theoretical discussions tend to be published elsewhere.

Reflexivity is particularly important within the qualitative paradigm. Reflexivity refers to recognition of the influence a researcher brings to the research process. It highlights potential power relationships between the researcher and research participants that might shape the data being collected, particularly when the researcher is a healthcare professional or educator and the participant is a patient, client, or student. 5 It also acknowledges how a researcher’s gender, ethnic background, profession, and social status influence the choices made within the study, such as the research question itself and the methods of data collection. 6 7

Research articles written in the qualitative paradigm should show evidence both of reflexive practice and of consideration of other relevant ethical issues. Ethics in qualitative research should extend beyond prescriptive guidelines and research ethics boards into a thorough exploration of the ethical consequences of collecting personal experiences and opening those experiences to public scrutiny (a detailed discussion of this problem within a research report may, however, be limited by the practicalities of word count limitations). 8 Issues of confidentiality and anonymity can become quite complex when data constitute personal reports of experience or perception; the need to minimise harm may involve not only protection from external scrutiny but also mechanisms to mitigate potential distress to participants from sharing their personal stories.

In conclusion: is what the researchers did clear?

The qualitative paradigm includes a wide range of theoretical and methodological options, and qualitative studies must include clear descriptions of how they were conducted, including the selection of the study sample, the data collection methods, and the analysis process. The list of key questions for beginning readers to ask when reading qualitative research articles (see box 1) is intended not as a finite checklist, but rather as a beginner’s guide to a complex topic. Critical appraisal of particular qualitative articles may differ according to the theories and methodologies used, and achieving a nuanced understanding in this area is fairly complex.

Further reading

Crabtree F, Miller WL, eds. Doing qualitative research . 2nd ed. Thousand Oaks, CA: Sage, 1999.

Denzin NK, Lincoln YS, eds. Handbook of qualitative research . 2nd ed. Thousand Oaks, CA: Sage, 2000.

Finlay L, Ballinger C, eds. Qualitative research for allied health professionals: challenging choices . Chichester: Wiley, 2006.

Flick U. An introduction to qualitative research . 2nd ed. London: Sage, 2002.

Green J, Thorogood N. Qualitative methods for health research . London: Sage, 2004.

Lingard L, Kennedy TJ. Qualitative research in medical education . Edinburgh: Association for the Study of Medical Education, 2007.

Mauthner M, Birch M, Jessop J, Miller T, eds. Ethics in Qualitative Research . Thousand Oaks, CA: Sage, 2002.

Seale C. The quality of qualitative research . London: Sage, 1999.

Silverman D. Doing qualitative research . Thousand Oaks, CA: Sage, 2000.

Journal articles

Greenhalgh T. How to read a paper: papers that go beyond numbers. BMJ 1997;315:740-3.

Mays N, Pope C. Qualitative research: Rigour and qualitative research. BMJ 1995;311:109-12.

Mays N, Pope C. Qualitative research in health care: assessing quality in qualitative research. BMJ 2000;320:50-2.

Popay J, Rogers A, Williams G. Rationale and standards for the systematic review of qualitative literature in health services research. Qual Health Res 1998;8:341-51.

Internet resources

National Health Service Public Health Resource Unit. Critical appraisal skills programme: qualitative research appraisal tool . 2006. www.phru.nhs.uk/Doc_Links/Qualitative%20Appraisal%20Tool.pdf

Cite this as: BMJ 2008;337:a1035

  • Related to doi:10.1136/bmj.a288
  • doi:10.1136/bmj.39602.690162.47
  • doi:10.1136/bmj.a1020
  • doi:10.1136/bmj.a879
  • doi:10.1136/bmj.a949

This is the last in a series of six articles that aim to help readers to critically appraise the increasing number of qualitative research articles in clinical journals. The series editors are Ayelet Kuper and Scott Reeves.

For a definition of general terms relating to qualitative research, see the first article in this series.

Contributors: AK wrote the first draft of the article and collated comments for subsequent iterations. LL and WL made substantial contributions to the structure and content, provided examples, and gave feedback on successive drafts. AK is the guarantor.

Funding: None.

Competing interests: None declared.

Provenance and peer review: Commissioned; externally peer reviewed.

  • ↵ Holden JD. Hawthorne effects and research into professional practice. J Evaluation Clin Pract 2001;7:65-70.
  • ↵ Hammersley M, Atkinson P. Ethnography: principles in practice. 2nd ed. London: Routledge, 1995.
  • ↵ Silverman D. Doing qualitative research. Thousand Oaks, CA: Sage, 2000.
  • ↵ Mays N, Pope C. Qualitative research in health care: assessing quality in qualitative research. BMJ 2000;320:50-2.
  • ↵ Lingard L, Kennedy TJ. Qualitative research in medical education. Edinburgh: Association for the Study of Medical Education, 2007.
  • ↵ Seale C. The quality of qualitative research. London: Sage, 1999.
  • ↵ Wallerstein N. Power between evaluator and community: research relationships within New Mexico’s healthier communities. Soc Sci Med 1999;49:39-54.
  • ↵ Mauthner M, Birch M, Jessop J, Miller T, eds. Ethics in qualitative research. Thousand Oaks, CA: Sage, 2002.
  • ↵ Kuzel AJ. Sampling in qualitative inquiry. In: Crabtree F, Miller WL, eds. Doing qualitative research. 2nd ed. Thousand Oaks, CA: Sage, 1999:33-45.


Critically Analyzing Information Sources: Critical Appraisal and Analysis


Initial Appraisal: Reviewing the Source

A. Author

  • What are the author's credentials--institutional affiliation (where he or she works), educational background, past writings, or experience? Is the book or article written on a topic in the author's area of expertise? You can use the various Who's Who publications for the U.S. and other countries and for specific subjects and the biographical information located in the publication itself to help determine the author's affiliation and credentials.
  • Has your instructor mentioned this author? Have you seen the author's name cited in other sources or bibliographies? Respected authors are cited frequently by other scholars. For this reason, always note those names that appear in many different sources.
  • Is the author associated with a reputable institution or organization? What are the basic values or goals of the organization or institution?

B. Date of Publication

  • When was the source published? This date is often located on the face of the title page below the name of the publisher. If it is not there, look for the copyright date on the reverse of the title page. On Web pages, the date of the last revision is usually at the bottom of the home page, sometimes every page.
  • Is the source current or out-of-date for your topic? Topic areas of continuing and rapid development, such as the sciences, demand more current information. On the other hand, topics in the humanities often require material that was written many years ago. At the other extreme, some news sources on the Web now note the hour and minute that articles are posted on their site.

C. Edition or Revision

Is this a first edition of this publication or not? Further editions indicate a source has been revised and updated to reflect changes in knowledge, correct omissions, and meet its intended readers' needs. Also, many printings or editions may indicate that the work has become a standard source in the area and is reliable. If you are using a Web source, do the pages indicate revision dates?

D. Publisher

Note the publisher. If the source is published by a university press, it is likely to be scholarly. Although the fact that the publisher is reputable does not necessarily guarantee quality, it does show that the publisher may have high regard for the source being published.

E. Title of Journal

Is this a scholarly or a popular journal? This distinction is important because it indicates different levels of complexity in conveying ideas. If you need help in determining the type of journal, see Distinguishing Scholarly from Non-Scholarly Periodicals. Or you may wish to check your journal title in the latest edition of Katz's Magazines for Libraries (Olin Reference Z 6941 .K21, shelved at the reference desk) for a brief evaluative description.

Critical Analysis of the Content

Having made an initial appraisal, you should now examine the body of the source. Read the preface to determine the author's intentions for the book. Scan the table of contents and the index to get a broad overview of the material it covers. Note whether bibliographies are included. Read the chapters that specifically address your topic. Reading the article abstract and scanning the table of contents of a journal or magazine issue is also useful. As with books, the presence and quality of a bibliography at the end of the article may reflect the care with which the authors have prepared their work.

A. Intended Audience

What type of audience is the author addressing? Is the publication aimed at a specialized or a general audience? Is this source too elementary, too technical, too advanced, or just right for your needs?

B. Objective Reasoning

  • Is the information covered fact, opinion, or propaganda? It is not always easy to separate fact from opinion. Facts can usually be verified; opinions, though they may be based on factual information, evolve from the interpretation of facts. Skilled writers can make you think their interpretations are facts.
  • Does the information appear to be valid and well-researched, or is it questionable and unsupported by evidence? Assumptions should be reasonable. Note errors or omissions.
  • Are the ideas and arguments advanced more or less in line with other works you have read on the same topic? The more radically an author departs from the views of others in the same field, the more carefully and critically you should scrutinize his or her ideas.
  • Is the author's point of view objective and impartial? Is the language free of emotion-arousing words and bias?

C. Coverage

  • Does the work update other sources, substantiate other materials you have read, or add new information? Does it extensively or marginally cover your topic? You should explore enough sources to obtain a variety of viewpoints.
  • Is the material primary or secondary in nature? Primary sources are the raw material of the research process. Secondary sources are based on primary sources. For example, if you were researching Konrad Adenauer's role in rebuilding West Germany after World War II, Adenauer's own writings would be one of many primary sources available on this topic. Others might include relevant government documents and contemporary German newspaper articles. Scholars use this primary material to help generate historical interpretations--a secondary source. Books, encyclopedia articles, and scholarly journal articles about Adenauer's role are considered secondary sources. In the sciences, journal articles and conference proceedings written by experimenters reporting the results of their research are primary documents. Choose both primary and secondary sources when you have the opportunity.

D. Writing Style

Is the publication organized logically? Are the main points clearly presented? Do you find the text easy to read, or is it stilted or choppy? Is the author's argument repetitive?

E. Evaluative Reviews

  • Locate critical reviews of books in a reviewing source, such as Articles & Full Text, Book Review Index, Book Review Digest, and ProQuest Research Library. Is the review positive? Is the book under review considered a valuable contribution to the field? Does the reviewer mention other books that might be better? If so, locate these sources for more information on your topic.
  • Do the various reviewers agree on the value or attributes of the book or has it aroused controversy among the critics?
  • For Web sites, consider consulting this evaluation source from UC Berkeley.


  • Last Updated: Apr 18, 2022 1:43 PM
  • URL: https://guides.library.cornell.edu/critically_analyzing


  • University of Oregon Libraries
  • Research Guides

How to Write a Literature Review

Critically analyze and evaluate

Tip: Read and Annotate PDFs


Ask yourself questions like these about each book or article you include:

  • What is the research question?
  • What is the primary methodology used?
  • How was the data gathered?
  • How is the data presented?
  • What are the main conclusions?
  • Are these conclusions reasonable?
  • What theories are used to support the researcher's conclusions?

Take notes on the articles as you read them and identify any themes or concepts that may apply to your research question.

This sample template (below) may also be useful for critically reading and organizing your articles. Or you can use this online form and email yourself a copy.

  • Sample Template for Critical Analysis of the Literature

Opening an article in PDF format in Acrobat Reader will allow you to use "sticky notes" and "highlighting" to make notes on the article without printing it out. Make sure to save the edited file so you don't lose your notes!

Some citation managers, like Mendeley, also have highlighting and annotation features. Here's a screen capture of a PDF in Mendeley with highlighting, notes, and various colors:

[Screen capture from a UO Librarian's Mendeley Desktop app, showing the note, highlight, and color tools; tips include adding notes and highlights, and using different colors for purposes such as marking quotations.]

  • Learn more about citation management software in the previous step: 4. Manage Your References
  • Last Updated: May 3, 2024 5:17 PM
  • URL: https://researchguides.uoregon.edu/litreview


Critical Analysis and Evaluation

Many assignments ask you to critique and evaluate a source. Sources might include journal articles, books, websites, government documents, portfolios, podcasts, or presentations.

When you critique, you offer both negative and positive analysis of the content, writing, and structure of a source.

When you evaluate, you assess how successful a source is at presenting information, measured against a standard or certain criteria.

Elements of a critical analysis:

opinion + evidence from the article + justification

Your opinion is your thoughtful reaction to the piece.

Evidence from the article offers some proof to back up your opinion.

The justification is an explanation of how you arrived at your opinion or why you think it's true.

How do you critique and evaluate?

When critiquing and evaluating someone else's writing/research, your purpose is to reach an informed opinion about a source. In order to do that, try these three steps:

1. Note your reactions to the piece:

  • How do you feel?
  • What surprised you?
  • What left you confused?
  • What pleased or annoyed you?
  • What was interesting?

2. Ask analytical questions:

  • What is the purpose of this text?
  • Who is the intended audience?
  • What kind of bias is there?
  • What was missing?
  • See our resource on analysis and synthesis (Move From Research to Writing: How to Think) for other examples of questions to ask.

3. Describe the piece in precise evaluative terms, for example:

  • sophisticated
  • interesting
  • undocumented
  • disorganized
  • superficial
  • unconventional
  • inappropriate interpretation of evidence
  • unsound or discredited methodology
  • traditional
  • unsubstantiated
  • unsupported
  • well-researched
  • easy to understand
  • easy to understand
  • Opinion: This article's assessment of the power balance in cities is confusing.
  • Evidence: It first says that the power to shape policy is evenly distributed among citizens, local government, and business (Rajal, 232).
  • Justification: But then it goes on to focus almost exclusively on business. Next, in a much shorter section, it combines the idea of citizens and local government into a single point of evidence. This leaves the reader with the impression that the citizens have no voice at all. It is not helpful in trying to determine the role of the common voter in shaping public policy.

Sample criteria for critical analysis

Sometimes the assignment will specify what criteria to use when critiquing and evaluating a source. If not, consider the following prompts to approach your analysis. Choose the questions that are most suitable for your source.

  • What do you think about the quality of the research? Is it significant?
  • Did the author answer the question they set out to? Did the author prove their thesis?
  • Did you find contradictions to other things you know?
  • What new insight or connections did the author make?
  • How does this piece fit within the context of your course, or the larger body of research in the field?
  • The structure of an article or book is often dictated by standards of the discipline or a theoretical model. Did the piece meet those standards?
  • Did the piece meet the needs of the intended audience?
  • Was the material presented in an organized and logical fashion?
  • Is the argument cohesive and convincing? Is the reasoning sound? Is there enough evidence?
  • Is it easy to read? Is it clear and easy to understand, even if the concepts are sophisticated?

Writing Academically


How to write a successful critical analysis


What do you critically analyse?

In a critical analysis you do not express your own opinion or views on the topic. You need to develop your thesis, position or stance on the topic from the views and research of others. In academic writing you critically analyse other researchers':

  • concepts, terms
  • viewpoints, arguments, positions
  • methodologies, approaches
  • research results and conclusions

This means weighing up the strength of the arguments or research support on the topic, and deciding which has the stronger weight of evidence or support.

Therefore, your thesis argues, with evidence, why a particular theory, concept, viewpoint, methodology, or research result(s) is/are stronger, more sound, or more advantageous than others.

What does ‘analysis’ mean?

A critical analysis means analysing or breaking down the parts of the literature and grouping these into themes, patterns or trends.

In an analysis you need to:

1. Identify and separate out the parts of the topic by grouping the various key theories, main concepts, the main arguments or ideas, and the key research results and conclusions on the topic into themes, patterns or trends of agreement, dispute and omission.

2. Discuss each of these parts by explaining:

i. the areas of agreement/consensus, or similarity

ii. the issues or controversies: in dispute or debate, areas of difference

iii. the omissions, gaps, or areas that are under-researched

3. Discuss the relationship between these parts

4. Examine how each contributes to the whole topic

5. Make conclusions about their significance or importance in the topic

What does ‘critical’ mean?

A critical analysis does not mean writing angry, rude or disrespectful comments, or expressing your views in judgmental terms of black and white, good and bad, or right and wrong.

To be critical, or to critique, means to evaluate. Therefore, to write critically in an academic analysis means to:

  • judge the quality, significance or worth of the theories, concepts, viewpoints, methodologies, and research results
  • evaluate in a fair and balanced manner
  • avoid extreme or emotional language

In practice, this means weighing up and discussing a text's:

  • strengths, advantages, benefits, gains, or improvements
  • disadvantages, weaknesses, shortcomings, limitations, or drawbacks

How to critically analyse a theory, model or framework

The evaluative words used most often to refer to a theory, model or framework are a sound theory or a strong theory.

Evaluating a Theory, Model or Framework

The criteria usually considered in judging the strengths of a theory (and, where they are absent, its weaknesses) are that it is:

  • comprehensive
  • empirically supported
  • parsimonious

Critical analysis examples of theories

The following sentences are examples of the phrases used to explain strengths and weaknesses.

Smith’s (2005) theory appears up to date, practical and applicable across many divergent settings.

Brown’s (2010) theory, although parsimonious and logical, lacks a sufficient body of evidence to support its propositions and predictions

Little scientific evidence has been presented to support the premises of this theory.

One of the limitations with this theory is that it does not explain why…

A significant strength of this model is that it takes into account …

The propositions of this model appear unambiguous and logical.

A key problem with this framework is the conceptual inconsistency between ….

How to critically analyse a concept

Evaluating Concepts

The criteria for judging the strengths and weaknesses of a concept are whether it:

  • has its key variables identified
  • is clear and well-defined

Critical analysis examples of concepts

Many researchers have used the concept of control in different ways.

There is little consensus about what constitutes automaticity.

Putting forth a very general definition of motivation means that it is possible that any behaviour could be included.

The concept of global education lacks clarity, is imprecisely defined and is overly complex.

Some have questioned the usefulness of resilience as a concept because it has been used so often and in so many contexts.

Research suggests that the concept of preoperative fasting is an outdated clinical approach.

How to critically analyse arguments, viewpoints or ideas

The following criteria are used to judge the strengths and weaknesses of an argument, viewpoint or idea:

  • reasons support the argument
  • argument is substantiated by evidence
  • evidence for the argument is relevant
  • evidence for the argument is unbiased, sufficient and important
  • evidence is reputable

Evaluating Arguments, Views or Ideas

Critical analysis examples of arguments, viewpoints or ideas

The validity of this argument is questionable as there is insufficient evidence to support it.

Many writers have challenged Jones’ claim on the grounds that …….

This argument fails to draw on the evidence of others in the field.

This explanation is incomplete because it does not explain why…

The key problem with this explanation is that ……


The existing accounts fail to resolve the contradiction between …

However, there is an inconsistency with this argument. The inconsistency lies in…

Although this argument has been proposed by some, it lacks justification.

However, the body of evidence showing that… contradicts this argument.

How to critically analyse a methodology

The criteria for judging the strengths and weaknesses of a methodology are listed below.

An evaluation of a methodology usually involves a critical analysis of its main sections:

design; sampling (participants); measurement tools and materials; procedure

  • design tests the hypotheses or research questions
  • method valid and reliable
  • potential bias or measurement error, and confounding variables addressed
  • method allows results to be generalized
  • representative sampling of cohort and phenomena; sufficient response rate
  • valid and reliable measurement tools
  • valid and reliable procedure
  • method clear and detailed to allow replication

Evaluating a Methodology

Critical analysis examples of a methodology

The unrepresentativeness of the sample makes these results misleading.

The presence of unmeasured variables in this study limits the interpretation of the results.

Other, unmeasured confounding variables may be influencing this association.

The interpretation of the data requires caution because the effect of confounding variables was not taken into account.

The insufficient control of several response biases in this study means the results are likely to be unreliable.

Although this correlational study shows association between the variables, it does not establish a causal relationship.

Taken together, the methodological shortcomings of this study suggest the need for serious caution in the meaningful interpretation of the study’s results.

How to critically analyse research results and conclusions

The following criteria are used to judge the strengths and weaknesses of research results and conclusions:

  • appropriate choice and use of statistics
  • correct interpretation of results
  • all results explained
  • alternative explanations considered
  • significance of all results discussed
  • consistency of results with previous research discussed
  • results add to existing understanding or knowledge
  • limitations discussed
  • results clearly explained
  • conclusions consistent with results

Evaluating the Results and Conclusions


Some research reports or assessments will require you to critically evaluate a journal article or piece of research. Below is a guide, with examples, of how to critically evaluate research and how to communicate your ideas in writing.

To develop the skill of critical evaluation, read research articles in psychology with an open mind and read actively. Ask questions as you go and see if the answers are provided. Initially, skim through the article to gain an overview of the problem, the design, the methods, and the conclusions. Then read for detail, considering the questions provided below for each section of a journal article.

Title

  • Did the title describe the study?
  • Did the key words of the title serve as key elements of the article?
  • Was the title concise, i.e., free of distracting or extraneous phrases?

Abstract

  • Was the abstract concise and to the point?
  • Did the abstract summarise the study’s purpose/research problem, the independent and dependent variables under study, methods, main findings, and conclusions?
  • Did the abstract provide you with sufficient information to determine what the study is about and whether you would be interested in reading the entire article?

Introduction

  • Was the research problem clearly identified?
  • Is the problem significant enough to warrant the study that was conducted?
  • Did the authors present an appropriate theoretical rationale for the study?
  • Is the literature review informative and comprehensive or are there gaps?
  • Are the variables adequately explained and operationalised?
  • Are hypotheses and research questions clearly stated? Are they directional? Do the author’s hypotheses and/or research questions seem logical in light of the conceptual framework and research problem?
  • Overall, does the literature review lead logically into the Method section?

Method

  • Is the sample clearly described, in terms of size, relevant characteristics (gender, age, SES, etc.), selection and assignment procedures, and whether any inducements were used to solicit subjects (payment, subject credit, free therapy, etc.)?
  • What population do the subjects represent (external validity)?
  • Are there sufficient subjects to produce adequate power (statistical validity)?
  • Have the variables and measurement techniques been clearly operationalised?
  • Do the measures/instruments seem appropriate as measures of the variables under study (construct validity)?
  • Have the authors included sufficient information about the psychometric properties (eg. reliability and validity) of the instruments?
  • Are the materials used in conducting the study or in collecting data clearly described?
  • Are the study’s scientific procedures thoroughly described in chronological order?
  • Is the design of the study identified (or made evident)?
  • Do the design and procedures seem appropriate in light of the research problem, conceptual framework, and research questions/hypotheses?
  • Are there other factors that might explain the differences between groups (internal validity)?
  • Were subjects randomly assigned to groups so there was no systematic bias in favour of one group? Was there a differential drop-out rate from groups so that bias was introduced (internal validity and attrition)?
  • Were all the necessary control groups used? Were participants in each group treated identically except for the administration of the independent variable?
  • Were steps taken to prevent subject bias and/or experimenter bias, eg, blind or double blind procedures?
  • Were steps taken to control for other possible confounds such as regression to the mean, history effects, order effects, etc (internal validity)?
  • Were ethical considerations adhered to, eg, debriefing, anonymity, informed consent, voluntary participation?
  • Overall, does the method section provide sufficient information to replicate the study?

Results

  • Are the findings complete, clearly presented, comprehensible, and well organised?
  • Are data coding and analysis appropriate in light of the study’s design and hypotheses? Are the statistics reported correctly and fully, eg. are degrees of freedom and p values given?
  • Have the assumptions of the statistical analyses been met, eg. does one group have very different variance to the others?
  • Are salient results connected directly to hypotheses? Are there superfluous results presented that are not relevant to the hypotheses or research question?
  • Are tables and figures clearly labelled? Well-organised? Necessary (non-duplicative of text)?
  • If a significant result is obtained, consider effect size. Is the finding meaningful? If a non-significant result is found, could low power be an issue? Were there sufficient levels of the IV?
  • If necessary have appropriate post-hoc analyses been performed? Were any transformations performed; if so, were there valid reasons? Were data collapsed over any IVs; if so, were there valid reasons? If any data was eliminated, were valid reasons given?
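
Several of the statistical questions above (is a significant finding meaningful, and how large is the effect?) can be checked numerically. As a minimal sketch in plain Python, using hypothetical data, Cohen's d for two independent groups is:

```python
from statistics import mean, stdev

def cohens_d(group_a, group_b):
    """Cohen's d: standardised difference between two independent group means."""
    n_a, n_b = len(group_a), len(group_b)
    # Pooled standard deviation across both groups
    pooled_sd = (((n_a - 1) * stdev(group_a) ** 2 +
                  (n_b - 1) * stdev(group_b) ** 2) /
                 (n_a + n_b - 2)) ** 0.5
    return (mean(group_a) - mean(group_b)) / pooled_sd

# Hypothetical scores from two groups
d = cohens_d([5, 6, 7], [1, 2, 3])  # d = 4.0, a very large effect
```

By convention, |d| around 0.2 is a small effect, 0.5 medium, and 0.8 large; a tiny but statistically significant effect is a cue to question whether the finding is meaningful.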

Discussion and Conclusion

  • Are findings adequately interpreted and discussed in terms of the stated research problem, conceptual framework, and hypotheses?
  • Is the interpretation adequate? i.e., does it go too far given what was actually done or not far enough? Are non-significant findings interpreted inappropriately?
  • Is the discussion biased? Are the limitations of the study delineated?
  • Are implications for future research and/or practical application identified?
  • Are the overall conclusions warranted by the data and any limitations in the study? Are the conclusions restricted to the population under study or are they generalised too widely?
  • Is the reference list sufficiently specific to the topic under investigation and current?
  • Are citations used appropriately in the text?

General Evaluation

  • Is the article objective, well written and organised?
  • Does the information provided allow you to replicate the study in all its details?
  • Was the study worth doing? Does the study provide an answer to a practical or important problem? Does it have theoretical importance? Does it represent a methodological or technical advance? Does it demonstrate a previously undocumented phenomenon? Does it explore the conditions under which a phenomenon occurs?

How to turn your critical evaluation into writing

Example from a journal article.



Succeeding in postgraduate study


1 Important points to consider when critically evaluating published research papers

Simple review articles (also referred to as ‘narrative’ or ‘selective’ reviews), systematic reviews and meta-analyses provide rapid overviews and ‘snapshots’ of progress made within a field, summarising a given topic or research area. They can serve as useful guides, or as current and comprehensive ‘sources’ of information, and can act as a point of reference to relevant primary research studies within a given scientific area. Narrative or systematic reviews are often used as a first step towards a more detailed investigation of a topic or a specific enquiry (a hypothesis or research question), or to establish critical awareness of a rapidly-moving field (you will be required to demonstrate this as part of an assignment, an essay or a dissertation at postgraduate level).

The majority of primary ‘empirical’ research papers follow essentially the same structure, abbreviated here as IMRAD: an Introduction, followed by the Methods, then the Results (including figures and tables showing the data described in the paper), and a Discussion. The paper typically ends with a Conclusion, References and Acknowledgements sections.

The Title of the paper provides a concise first impression. The Abstract follows the basic structure of the extended article. It provides an ‘accessible’ and concise summary of the aims, methods, results and conclusions. The Introduction provides useful background information and context, and typically outlines the aims and objectives of the study. The Abstract can serve as a useful summary of the paper, presenting the purpose, scope and major findings. However, simply reading the abstract alone is not a substitute for critically reading the whole article. To really get a good understanding and to be able to critically evaluate a research study, it is necessary to read on.

While most research papers follow the above format, variations do exist. For example, the results and discussion sections may be combined. In some journals the materials and methods may follow the discussion, and in two of the most widely read journals, Science and Nature, the format does vary from the above due to restrictions on the length of articles. In addition, there may be supporting documents that accompany a paper, including supplementary materials such as supporting data, tables, figures, videos and so on. There may also be commentaries or editorials associated with a topical research paper, which provide an overview or critique of the study being presented.

Box 1 Key questions to ask when appraising a research paper

  • Is the study’s research question relevant?
  • Does the study add anything new to current knowledge and understanding?
  • Does the study test a stated hypothesis?
  • Is the design of the study appropriate to the research question?
  • Do the study methods address key potential sources of bias?
  • Were suitable ‘controls’ included in the study?
  • Were the statistical analyses appropriate and applied correctly?
  • Is there a clear statement of findings?
  • Does the data support the authors’ conclusions?
  • Are there any conflicts of interest or ethical concerns?
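
One informal way to apply Box 1 is to encode its questions as a checklist you fill in for each paper. The sketch below is a hypothetical Python illustration (the function name and the True/False/None answer encoding are my own, not from the source):

```python
# The Box 1 questions as a checklist; each answer is True, False, or
# None (meaning not yet assessed).
BOX_1_QUESTIONS = [
    "Is the study's research question relevant?",
    "Does the study add anything new to current knowledge and understanding?",
    "Does the study test a stated hypothesis?",
    "Is the design of the study appropriate to the research question?",
    "Do the study methods address key potential sources of bias?",
    "Were suitable controls included in the study?",
    "Were the statistical analyses appropriate and applied correctly?",
    "Is there a clear statement of findings?",
    "Does the data support the authors' conclusions?",
    "Are there any conflicts of interest or ethical concerns?",
]

def summarise(answers):
    """Tally an appraisal: answers maps each question to True/False/None."""
    yes = sum(1 for a in answers.values() if a is True)
    no = sum(1 for a in answers.values() if a is False)
    unassessed = sum(1 for a in answers.values() if a is None)
    return {"yes": yes, "no": no, "unassessed": unassessed}
```

A tally like this is only a starting point: the questions are not equally weighted, and a single "no" on a critical item (missing controls, say) can outweigh many a "yes".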

There are various strategies used in reading a scientific research paper, and one of these is to start with the title and the abstract, then look at the figures and tables, and move on to the introduction, before turning to the results and discussion, and finally, interrogating the methods.

Another strategy (outlined below) is to begin with the abstract and then the discussion, take a look at the methods, and then the results section (including any relevant tables and figures), before moving on to look more closely at the discussion and, finally, the conclusion. You should choose a strategy that works best for you. However, asking the ‘right’ questions is a central feature of critical appraisal, as with any enquiry, so where should you begin? Here are some critical questions to consider when evaluating a research paper.

Look at the Abstract and then the Discussion : Are these accessible and of general relevance or are they detailed, with far-reaching conclusions? Is it clear why the study was undertaken? Why are the conclusions important? Does the study add anything new to current knowledge and understanding? The reasons why a particular study design or statistical method were chosen should also be clear from reading a research paper. What is the research question being asked? Does the study test a stated hypothesis? Is the design of the study appropriate to the research question? Have the authors considered the limitations of their study and have they discussed these in context?

Take a look at the Methods : Were there any practical difficulties that could have compromised the study or its implementation? Were these considered in the protocol? Were there any missing values and, if so, was the number of missing values too large to permit meaningful analysis? Was the number of samples (cases or participants) too small to establish meaningful significance? Do the study methods address key potential sources of bias? Were suitable ‘controls’ included in the study? If controls are missing or not appropriate to the study design, we cannot be confident that the results really show what is happening in an experiment. Were the statistical analyses appropriate and applied correctly? Do the authors point out the limitations of methods or tests used? Were the methods referenced and described in sufficient detail for others to repeat or extend the study?

Take a look at the Results section and relevant tables and figures : Is there a clear statement of findings? Were the results expected? Do they make sense? What data supports them? Do the tables and figures clearly describe the data (highlighting trends etc.)? Try to distinguish between what the data show and what the authors say they show (i.e. their interpretation).

Moving on to look in greater depth at the Discussion and Conclusion : Are the results discussed in relation to similar (previous) studies? Do the authors indulge in excessive speculation? Are limitations of the study adequately addressed? Were the objectives of the study met and the hypothesis supported or refuted (and is a clear explanation provided)? Does the data support the authors’ conclusions? Maybe there is only one experiment to support a point. More often, several different experiments or approaches combine to support a particular conclusion. A rule of thumb here is that if multiple approaches and multiple lines of evidence from different directions are presented, and all point to the same conclusion, then the conclusions are more credible. But do question all assumptions. Identify any implicit or hidden assumptions that the authors may have used when interpreting their data. Be wary of data that is mixed up with interpretation and speculation! Remember, just because it is published, does not mean that it is right.

Other points you should consider when evaluating a research paper : Are there any financial, ethical or other conflicts of interest associated with the study, its authors and sponsors? Are there ethical concerns with the study itself? Looking at the references, consider whether the authors have preferentially cited their own previous publications (i.e. needlessly), and whether the list of references is recent (ensuring that the analysis is up to date). Finally, from a practical perspective, you should move beyond the text of a research paper: talk to your peers about it, and consult available commentaries, online links to references and other external sources to help clarify any aspects you don’t understand.

The above can be taken as a general guide to help you begin to critically evaluate a scientific research paper, but only in the broadest sense. Do bear in mind that the way that research evidence is critiqued will also differ slightly according to the type of study being appraised, whether observational or experimental, and each study will have additional aspects that would need to be evaluated separately. For criteria recommended for the evaluation of qualitative research papers, see the article by Mildred Blaxter (1996), available online. Details are in the References.

Activity 1 Critical appraisal of a scientific research paper

A critical appraisal checklist, which you can download via the link below, can act as a useful tool to help you to interrogate research papers. The checklist is divided into four sections, broadly covering:

  • some general aspects
  • research design and methodology
  • the results
  • discussion, conclusion and references.

Science perspective – critical appraisal checklist

  • Identify and obtain a research article based on a topic of your own choosing, using a search engine such as Google Scholar or PubMed (for example).
  • The selection criteria for your target paper are as follows: the article must be an open access primary research paper (not a review) containing empirical data, published in the last 2–3 years, and preferably no more than 5–6 pages in length.
  • Critically evaluate the research paper using the checklist provided, making notes on the key points and your overall impression.

Critical appraisal checklists are useful tools to help assess the quality of a study. Assessment of various factors, including the importance of the research question, the design and methodology of a study, the validity of the results and their usefulness (application or relevance), the legitimacy of the conclusions, and any potential conflicts of interest, are an important part of the critical appraisal process. Limitations and further improvements can then be considered.


Critical Reading 101: Get ready to read research papers effectively

If you’re going to be reviewing and analysing lots of research papers, you need a plan: to know what to read and when to stop, to collect the evidence, and to ensure you understand it.

Critical reading is the skill used by researchers to better comprehend and evaluate literature. It requires structure and the right tools to help you to stay focussed on interpreting, questioning and critiquing what you read.


How to critically read a research paper

Here are the fundamentals of what you need to master critical reading:

Plan your investigation, before you begin reading

Start by defining your investigation questions and create a template to record evidence, what you understand and the pros and cons of the sections that are important.

3 things to include:

  • What do you need to know about the paper to understand the goals and context of their research?
  • What evidence are you looking for from the research paper, based on what you want to do with the information or the purpose of your review?
  • What questions will you ask to assess the claims and document the strengths and weaknesses?

It is also a good idea to organise your questions and apply cut-off points, so you know when to stop reading. For example, if you are not seeing anything new, or if you identify a fallacy early on, you need to decide whether it is worth spending any more time on the paper. However far you get, record what you find, as everything you read provides insight.

What you will need: A template to record evidence and your critique consistently
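
Any format works for such a template: a spreadsheet row per paper, a note file, or, if you prefer working programmatically, a small record type. A hypothetical Python sketch (all field names are illustrative, not a standard):

```python
from dataclasses import dataclass, field

@dataclass
class CritiqueRecord:
    """Notes on one paper; field names are illustrative, not a standard."""
    citation: str
    research_question: str = ""
    evidence: list = field(default_factory=list)    # findings relevant to your review
    strengths: list = field(default_factory=list)
    weaknesses: list = field(default_factory=list)
    stop_reason: str = ""   # why you stopped reading early, if you did

record = CritiqueRecord(citation="Smith (2005)")
record.weaknesses.append("Sample too small to establish significance")
```

Whatever form it takes, the point is consistency: recording the same fields for every paper is what later lets you compare findings across your whole reading list.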

Decide where to begin – problem, goal, methodology

If you know what you want to examine, you need to figure out where to look, so you don’t have to read every word to extract what’s important. Use the structure of the paper and skim read sections or use AI to help you to find information of interest.

What you will need: A strategy to find and collate specific information to review

Systematically find, read and question the information

So now you know what you are looking for and where to look, it’s time to start reading. It’s likely you will read important sections of a paper more than once: first to make sure you understand what the information is telling you, and then to analyse and question what is being said. Each time, actively think about what you want to know and the questions you are trying to answer.

For example, if you are reading to comprehend, question whether you understand the concepts, theory or terminology being used. Look up things you don’t know, to avoid misinterpreting what you read.

What you will need: Highlight and note taking tools to help recall important information

Interpret and critique what you found

Test your comprehension and use the template you created to write up your findings straight after reading or analysing sections of the paper. Begin with your understanding of the research, followed by your analysis of the findings. Remember, you are evaluating this from the perspective of what the author wants to achieve before evaluating the pros and cons in relation to what you already know.

Reading, questioning and writing up your findings can be intense, but it will help you to stay focussed, remember the insights, and compare findings as you read more and more papers.

What you will need: Your critique template and the ability to link information as you find connections

This is what you will need to help you to critically read:

  • A template to record evidence and your critique consistently
  • A way to find and collate specific information to review
  • Highlight and note taking tools to help recall important information
  • The ability to link information as you find connections


Author: Enago Read


How to Evaluate Research Papers Critically (A Student Guide)

Blog, Pedagogy, July 17, 2023

Scientists collect data and test ideas about many topics and phenomena (e.g., how and why things happen). Their research is then written up academically in a paper or article and published in scientific journals such as Nature or Science. University students often evaluate research papers critically as part of a course assessment. So here are some tips to help students critically assess research papers and get better grades!


What Is the Critical Evaluation of Research Papers?

When academics ask students to evaluate research papers critically, they want them to appraise the quality of the work. This appraisal involves making a balanced judgement after thinking carefully about the facts presented in the paper.

Moreover, evaluating the positive and negative aspects of the work requires problem-solving, the ability to create new knowledge and effective communication. Thinking about the good and bad points refines your analytic skills and provides more perspectives on the same material.

So, the critical evaluation of research papers tests your knowledge of scientific facts and your ability to think beyond what you have read.

The German Gestalt psychologist Kurt Koffka famously said, “The whole is something else than the sum of its parts.” In the same spirit, the point of criticising research papers is “to create something else than the sum of its parts”.

Evaluate Research Papers Critically by Retelling an Article for a Novice

One way to make it easier to evaluate research papers critically is to read them with a view to retelling the work to a non-expert. This skill involves interpreting and translating the facts so that a novice can understand the essence of the work. It’s not easy. Albert Einstein once said, “Smart people simplify things.” But when one understands the crux of a research paper, it is easier to evaluate the science.

Tips to Help Students Evaluate Research Papers Critically

Asking yourself questions when reading the different parts of a research paper will help you judge the quality of the science. And try to make your answers clear and simple so a non-expert can understand.

Here are some questions to ask yourself when reading research papers to help you evaluate the quality of the science. Remember, criticisms are legitimate if supported by well-reasoned facts.

Introduction

Firstly, examine the introduction and literature review to evaluate the study’s background and relevance to the field. Check if the authors provide a research question or hypothesis. And ask yourself:

  • Is it clear why the authors conducted the research?
  • Does the study add anything new to current knowledge and understanding?
  • Is the statement about what the authors expect to find clear?

Method

Secondly, evaluate the study’s design and methods. Consider practical difficulties that could compromise the research, and ask yourself:

  • Was the number of participants too small to establish a finding?
  • Is it obvious why the authors chose the design or method?
  • Are the methods appropriate for the research question asked?
  • Do the authors describe methods in sufficient detail for others to replicate?

Results

Thirdly, look at the results and data analysis sections. Evaluate whether the results are statistically significant and whether the authors provide sufficient detail on the statistical methods used. Ask yourself:

  • Do the results make sense?
  • Were the findings expected or surprising?
  • Do the figures and tables clearly describe the data?
  • Have the authors interpreted the results accurately?

Discussion

Lastly, assess the validity and reliability of the study. Evaluate the authors’ interpretation of the results by asking yourself:

  • Are the results discussed alongside similar past studies?
  • Is the data interpreted meaningfully or speculatively?
  • Do the authors address the limitations and biases of the study?
  • Are the solutions to any problems reported clearly?

Other Points to Help You Critically Evaluate Research Papers

When comparing and contrasting multiple research papers, ask yourself these three additional questions:

  • Do the different authors agree or disagree with each other?
  • What are the strengths and weaknesses of the articles?
  • How do they expand current knowledge and understanding?

Create Something Other Than the Sum of Its Parts

Students typically receive high grades if their work is insightful and provides more perspectives on the same material. That is to say, a student’s work creates new knowledge different from the sum of its parts!

Moreover, you can provide more perspectives on the same material when evaluating research papers critically by considering the following approaches:

  • Compare and contrast research papers with others on the same topic. This approach will help you identify similarities, differences, and gaps that you could address in your evaluation.
  • Look at the different methodologies used in the papers and evaluate their strengths and weaknesses. Consider alternative methods and assess how they might impact the research findings.
  • Think about the theoretical frameworks applied in the studies. And consider whether alternative assumptions and perspectives could apply to the work.
  • Consider how the research might appear from different cultural and contextual perspectives. For example, a study conducted in one country might be interpreted differently in another country with different cultural norms and values.
  • Evaluate the ethical and social implications of the research and consider how different perspectives might interpret these implications differently. This approach can help you to identify potential biases and assumptions that might be present in the studies.

Criticising research papers involves examining the quality and validity of the science. The steps and approaches above can help students bring more perspectives to their evaluations and develop a deeper understanding of the research and its implications.

In addition, if you are unsure how to write an academic essay at university, check out this post about How to Write an Academic Essay.

Did you enjoy this student guide? If so, please like and share. Thank you!




© Paul Pope. All Rights Reserved. Privacy .
