
To the Editor:


Explanatory Case Study Design—A Clarification

Megan Anne Simons, Jenny Ziviani, Explanatory Case Study Design—A Clarification, Journal of Burn Care & Research, Volume 32, Issue 1, January–February 2011, Page e14, https://doi.org/10.1097/BCR.0b013e3182033569


For the purpose of clarity for the readership, we wish to address the description of the explanatory case study design (ECSD) as a qualitative research method in the invited critique of our original article.1 Yin,2 the primary source for ECSD, described the case study as suitable when the number of variables of interest exceeds the number of data points (i.e., participants). He has positioned the case study as a stand-alone method in which the collection of both quantitative and qualitative data is appropriate.3 We agree that within a qualitative research paradigm, the sample size of seven participants1 would likely be insufficient to satisfy the sampling strategy. However, ECSD is “driven to theory.”3(p1212) The use of multiple case studies (as in the original study) is deemed the equivalent of multiple experiments.2 Generalization from case studies is accomplished using replication logic derived from theoretical propositions (hypotheses) or theories about the case. Results are considered even more potent when two or more cases support the same theory but not an equally plausible, rival theory.2 The problem of generalizing from case studies is the same as that of generalizing from experiments: hypotheses and theory are the vehicles for generalization.3 With this point of clarification, we acknowledge that the findings from the original study are limited to children with lower injury severity (when measured as %TBSA) within a shortened timeframe postburn injury (6 months).


  • Online ISSN 1559-0488
  • Print ISSN 1559-047X
  • Copyright © 2024 American Burn Association


Explanatory Research – Types, Methods, Guide


Explanatory Research

Definition:

Explanatory research is a type of research that aims to uncover the underlying causes and relationships between different variables. It seeks to explain why a particular phenomenon occurs and how it relates to other factors.

This type of research is typically used to test hypotheses or theories and to establish cause-and-effect relationships. Explanatory research often involves collecting data through surveys, experiments, or other empirical methods, and then analyzing that data to identify patterns and correlations. The results of explanatory research can provide a better understanding of the factors that contribute to a particular phenomenon and can help inform future research or policy decisions.

Types of Explanatory Research

There are several types of explanatory research, each with its own approach and focus. Some common types include:

Experimental Research

This involves manipulating one or more variables to observe the effect on other variables. It allows researchers to establish a cause-and-effect relationship between variables and is often used in natural and social sciences.

Quasi-experimental Research

This type of research is similar to experimental research but lacks full control over the variables. It is often used in situations where it is difficult or impossible to manipulate certain variables.

Correlational Research

This type of research aims to identify relationships between variables without manipulating them. It involves measuring and analyzing the strength and direction of the relationship between variables.

Case Study Research

This involves an in-depth investigation of a specific case or situation. It is often used in social sciences and allows researchers to explore complex phenomena and contexts.

Historical Research

This involves the systematic study of past events and situations to understand their causes and effects. It is often used in fields such as history and sociology.

Survey Research

This involves collecting data from a sample of individuals through structured questionnaires or interviews. It allows researchers to investigate attitudes, behaviors, and opinions.

Explanatory Research Methods

There are several methods that can be used in explanatory research, depending on the research question and the type of data being collected. Some common methods include:

Experiments

In experimental research, researchers manipulate one or more variables to observe their effect on other variables. This allows them to establish a cause-and-effect relationship between the variables.

Surveys

Surveys are used to collect data from a sample of individuals through structured questionnaires or interviews. This method can be used to investigate attitudes, behaviors, and opinions.

Correlational studies

This method aims to identify relationships between variables without manipulating them. It involves measuring and analyzing the strength and direction of the relationship between variables.

Case studies

Case studies involve an in-depth investigation of a specific case or situation. This method is often used in social sciences and allows researchers to explore complex phenomena and contexts.

Secondary Data Analysis

This method involves analyzing data that has already been collected by other researchers or organizations. It can be useful when primary data collection is not feasible or when additional data is needed to support research findings.

Data Analysis Methods

Explanatory research data analysis methods are used to explore the relationships between variables and to explain how they interact with each other. Here are some common data analysis methods used in explanatory research:

Correlation Analysis

Correlation analysis is used to identify the strength and direction of the relationship between two or more variables. This method is particularly useful when exploring the relationship between quantitative variables.
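As a sketch of what correlation analysis looks like in practice, the following pure-Python snippet computes a Pearson correlation coefficient. The sleep-hours and exam-score data are hypothetical values chosen purely for illustration:

```python
# Pearson correlation between two quantitative variables, pure Python.
import math

def pearson_r(x, y):
    """Return the Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sd_x = math.sqrt(sum((a - mean_x) ** 2 for a in x))
    sd_y = math.sqrt(sum((b - mean_y) ** 2 for b in y))
    return cov / (sd_x * sd_y)

hours_slept = [5, 6, 7, 8, 9]        # hypothetical predictor
exam_score  = [60, 65, 72, 78, 85]   # hypothetical outcome

r = pearson_r(hours_slept, exam_score)
print(round(r, 3))  # → 0.999: a strong positive linear relationship
```

Values of r near +1 or −1 indicate a strong linear relationship; values near 0 indicate little or no linear relationship. Correlation alone does not establish causation.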

Regression Analysis

Regression analysis is used to identify the relationship between a dependent variable and one or more independent variables. This method is particularly useful when exploring the relationship between a dependent variable and several predictor variables.
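A minimal illustration of regression analysis, assuming hypothetical study-hours and test-score data: a line of the form y = a + b·x is estimated by ordinary least squares.

```python
# Ordinary least-squares fit of a dependent variable on one predictor,
# pure Python. Variable names and data are hypothetical.

def ols_simple(x, y):
    """Return (intercept, slope) of the least-squares line y = a + b*x."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    sxx = sum((a - mean_x) ** 2 for a in x)
    sxy = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return intercept, slope

study_hours = [1, 2, 3, 4, 5]
test_score  = [52, 55, 61, 64, 68]

a, b = ols_simple(study_hours, test_score)
print(f"score ≈ {a:.1f} + {b:.1f} * hours")  # → score ≈ 47.7 + 4.1 * hours
```

The slope estimates how much the dependent variable changes per unit of the predictor; with several predictors, the same idea generalizes to multiple regression.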

Path Analysis

Path analysis is a method used to examine the direct and indirect relationships between variables. It is particularly useful when exploring complex relationships between variables.
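The product-of-coefficients logic behind path analysis can be sketched for the simplest case: a hypothetical single-mediator model (X → M → Y). The snippet below decomposes the total effect of X on Y into a direct path and an indirect path through M; the data are invented for illustration, and real path analyses use dedicated SEM software.

```python
# Minimal path-analysis sketch: one mediation model X -> M -> Y.
# Indirect effect = a * b (product of coefficients), where a is the
# slope of M on X and b is the slope of Y on M adjusting for X.

def cov(u, v):
    """Covariance of two equal-length sequences (population form)."""
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    return sum((a - mu) * (b - mv) for a, b in zip(u, v)) / n

def mediation(x, m, y):
    """Return (direct, indirect) effects of x on y with mediator m."""
    a = cov(x, m) / cov(x, x)                                   # path x -> m
    det = cov(x, x) * cov(m, m) - cov(x, m) ** 2                # non-zero if x, m not collinear
    b = (cov(m, y) * cov(x, x) - cov(x, y) * cov(x, m)) / det   # path m -> y, adjusting for x
    direct = (cov(x, y) * cov(m, m) - cov(m, y) * cov(x, m)) / det
    return direct, a * b

x = [1, 2, 3, 4, 5]       # hypothetical exposure
m = [2, 3, 5, 8, 9]       # hypothetical mediator
y = [3, 5, 8, 12, 14]     # hypothetical outcome

direct, indirect = mediation(x, m, y)
print(f"direct={direct:.3f}, indirect={indirect:.3f}, total={direct + indirect:.3f}")
```

By construction, direct + indirect equals the total effect of X on Y (the slope of Y on X alone), which is the defining identity of this decomposition.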

Structural Equation Modeling (SEM)

SEM is a statistical method used to test and validate theoretical models of the relationships between variables. It is particularly useful when exploring complex models with multiple variables and relationships.

Factor Analysis

Factor analysis is used to identify underlying factors that contribute to the variation in a set of variables. This method is particularly useful when exploring relationships between multiple variables.

Content Analysis

Content analysis is used to analyze qualitative data by identifying themes and patterns in text, images, or other forms of data. This method is particularly useful when exploring the meaning and context of data.
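A toy sketch of one step of content analysis: counting predefined theme keywords in open-ended survey responses. The theme dictionary and responses below are hypothetical, and real coding schemes are developed and validated iteratively rather than fixed in advance.

```python
# Toy content analysis: count occurrences of predefined theme keywords
# in hypothetical open-ended survey responses.
from collections import Counter
import re

THEMES = {
    "access":  {"appointment", "wait", "distance", "cost"},
    "quality": {"listened", "thorough", "rushed", "explained"},
}

responses = [
    "The doctor explained everything and really listened.",
    "Long wait for an appointment and the visit felt rushed.",
    "Cost was a problem, but the nurse was thorough.",
]

counts = Counter()
for text in responses:
    words = set(re.findall(r"[a-z]+", text.lower()))   # tokenize, lowercase
    for theme, keywords in THEMES.items():
        counts[theme] += len(words & keywords)          # keyword hits per theme

print(dict(counts))  # → {'access': 3, 'quality': 4}
```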

Applications of Explanatory Research

The applications of explanatory research include:

  • Social sciences: Explanatory research is commonly used in social sciences to investigate the causes and effects of social phenomena, such as the relationship between poverty and crime, or the impact of social policies on individuals or communities.
  • Marketing: Explanatory research can be used in marketing to understand the reasons behind consumer behavior, such as why certain products are preferred over others or why customers choose to purchase from certain brands.
  • Healthcare: Explanatory research can be used in healthcare to identify the factors that contribute to disease or illness, as well as the effectiveness of different treatments and interventions.
  • Education: Explanatory research can be used in education to investigate the causes of academic achievement or failure, as well as the factors that influence teaching and learning processes.
  • Business: Explanatory research can be used in business to understand the factors that contribute to the success or failure of different strategies, as well as the impact of external factors, such as economic or political changes, on business operations.
  • Public policy: Explanatory research can be used in public policy to evaluate the effectiveness of policies and programs, as well as to identify the factors that contribute to social problems or inequalities.

Explanatory Research Question

An explanatory research question is a type of research question that seeks to explain the relationship between two or more variables, and to identify the underlying causes of that relationship. The goal of explanatory research is to test hypotheses or theories about the relationship between variables, and to gain a deeper understanding of complex phenomena.

Examples of explanatory research questions include:

  • What is the relationship between sleep quality and academic performance among college students, and what factors contribute to this relationship?
  • How do environmental factors, such as temperature and humidity, affect the spread of infectious diseases?
  • What are the factors that contribute to the success or failure of small businesses in a particular industry, and how do these factors interact with each other?
  • How do different teaching strategies impact student engagement and learning outcomes in the classroom?
  • What is the relationship between social support and mental health outcomes among individuals with chronic illnesses, and how does this relationship vary across different populations?

Examples of Explanatory Research

Here are a few real-world examples of explanatory research:

  • Exploring the factors influencing customer loyalty: A business might conduct explanatory research to determine which factors, such as product quality, customer service, or price, have the greatest impact on customer loyalty. This research could involve collecting data through surveys, interviews, or other means and analyzing it using methods such as correlation or regression analysis.
  • Understanding the causes of crime: Law enforcement agencies might conduct explanatory research to identify the factors that contribute to crime in a particular area. This research could involve collecting data on factors such as poverty, unemployment, drug use, and social inequality and analyzing it using methods such as regression analysis or structural equation modeling.
  • Investigating the effectiveness of a new medical treatment: Medical researchers might conduct explanatory research to determine whether a new medical treatment is effective and which variables, such as dosage or patient age, are associated with its effectiveness. This research could involve conducting clinical trials and analyzing data using methods such as path analysis or SEM.
  • Exploring the impact of social media on mental health: Researchers might conduct explanatory research to determine whether social media use has a positive or negative impact on mental health and which variables, such as frequency of use or type of social media, are associated with mental health outcomes. This research could involve collecting data through surveys or interviews and analyzing it using methods such as factor analysis or content analysis.

When to use Explanatory Research

Here are some situations where explanatory research might be appropriate:

  • When exploring a new or complex phenomenon: Explanatory research can be used to understand the mechanisms of a new or complex phenomenon and to identify the variables that are most strongly associated with it.
  • When testing a theoretical model: Explanatory research can be used to test a theoretical model of the relationships between variables and to validate or modify the model based on empirical data.
  • When identifying the causal relationships between variables: Explanatory research can be used to identify the causal relationships between variables and to determine which variables have the greatest impact on the outcome of interest.
  • When conducting program evaluation: Explanatory research can be used to evaluate the effectiveness of a program or intervention and to identify the factors that contribute to its success or failure.
  • When making informed decisions: Explanatory research can be used to provide a basis for informed decision-making in business, government, or other contexts by identifying the factors that contribute to a particular outcome.

How to Conduct Explanatory Research

Here are the steps to conduct explanatory research:

  • Identify the research problem: Clearly define the research question or problem you want to investigate. This should involve identifying the variables that you want to explore, and the potential relationships between them.
  • Conduct a literature review: Review existing research on the topic to gain a deeper understanding of the variables and relationships you plan to explore. This can help you develop a hypothesis or research questions to guide your study.
  • Develop a research design: Decide on the research design that best suits your study. This may involve collecting data through surveys, interviews, experiments, or observations.
  • Collect and analyze data: Collect data from your selected sample and analyze it using appropriate statistical methods to identify any significant relationships between variables.
  • Interpret findings: Interpret the results of your analysis in light of your research question or hypothesis. Identify any patterns or relationships between variables, and discuss the implications of your findings for the wider field of study.
  • Draw conclusions: Draw conclusions based on your analysis and identify any areas for further research. Make recommendations for future research or policy based on your findings.
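The steps above can be sketched end to end. The snippet below assumes hypothetical exposure and outcome data and tests a hypothesised relationship with a permutation test on the correlation coefficient, one simple way to assess significance without distributional assumptions; the 0.05 threshold is the conventional choice, not a requirement.

```python
# End-to-end sketch: hypothesise a relationship, collect (hypothetical)
# data, analyze with a permutation test on the correlation, interpret.
import random

def corr(x, y):
    """Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def permutation_p(x, y, trials=10_000, seed=0):
    """One-sided p-value: fraction of shuffles of y whose correlation
    with x is at least as large as the observed correlation."""
    rng = random.Random(seed)
    observed = corr(x, y)
    y = list(y)
    hits = 0
    for _ in range(trials):
        rng.shuffle(y)
        if corr(x, y) >= observed:
            hits += 1
    return hits / trials

exposure = [2, 4, 4, 5, 7, 8, 9]          # hypothetical predictor
outcome  = [10, 14, 13, 16, 19, 20, 24]   # hypothetical outcome

p = permutation_p(exposure, outcome)
print("reject H0 at alpha=0.05" if p < 0.05 else "fail to reject H0")
```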

Purpose of Explanatory Research

The purpose of explanatory research is to identify and explain the relationships between different variables, as well as to determine the causes of those relationships. This type of research is often used to test hypotheses or theories, and to explore complex phenomena that are not well understood.

Explanatory research can help to answer questions such as “why” and “how” by providing a deeper understanding of the underlying causes and mechanisms of a particular phenomenon. For example, explanatory research can be used to determine the factors that contribute to a particular health condition, or to identify the reasons why certain marketing strategies are more effective than others.

The main purpose of explanatory research is to gain a deeper understanding of a particular phenomenon, with the goal of developing more effective solutions or interventions to address the problem. By identifying the underlying causes and mechanisms of a phenomenon, explanatory research can help to inform decision-making, policy development, and best practices in a wide range of fields, including healthcare, social sciences, business, and education.

Advantages of Explanatory Research

Here are some advantages of explanatory research:

  • Provides a deeper understanding: Explanatory research aims to uncover the underlying causes and mechanisms of a particular phenomenon, providing a deeper understanding of complex phenomena that is not possible with other research designs.
  • Test hypotheses or theories: Explanatory research can be used to test hypotheses or theories by identifying the relationships between variables and determining the causes of those relationships.
  • Provides insights for decision-making: Explanatory research can provide insights that can inform decision-making in a wide range of fields, from healthcare to business.
  • Can lead to the development of effective solutions: By identifying the underlying causes of a problem, explanatory research can help to develop more effective solutions or interventions to address the problem.
  • Can improve the validity of research: By identifying and controlling for potential confounding variables, explanatory research can improve the validity and reliability of research findings.
  • Can be used in combination with other research designs: Explanatory research can be used in combination with other research designs, such as exploratory or descriptive research, to provide a more comprehensive understanding of a phenomenon.

Limitations of Explanatory Research

Here are some limitations of explanatory research:

  • Limited generalizability: Explanatory research typically involves studying a specific sample, which can limit the generalizability of findings to other populations or settings.
  • Time-consuming and resource-intensive: Explanatory research can be time-consuming and resource-intensive, particularly if it involves collecting and analyzing large amounts of data.
  • Limited scope: Explanatory research is typically focused on a narrow research question or hypothesis, which can limit its scope in comparison to other research designs such as exploratory or descriptive research.
  • Limited control over variables: Explanatory research can be limited by the researcher’s ability to control for all possible variables that may influence the relationship between variables of interest.
  • Potential for bias: Explanatory research can be subject to various types of bias, such as selection bias, measurement bias, and recall bias, which can influence the validity of research findings.
  • Ethical considerations: Explanatory research may involve the use of invasive or risky procedures, which can raise ethical concerns and require careful consideration of the potential risks and benefits of the study.

About the author


Muhammad Hassan

Researcher, Academic Writer, Web developer



BMJ Open, vol. 9(8); 2019

Beyond exploratory: a tailored framework for designing and assessing qualitative health research

Katharine A Rendle

1 Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania, USA

Corey M Abramson

2 School of Sociology, University of Arizona, Tucson, Arizona, USA

Sarah B Garrett

3 Philip R. Lee Institute for Health Policy Studies, University of California San Francisco, San Francisco, California, USA

Meghan C Halley

4 Palo Alto Medical Foundation for Health Care Research and Education, Palo Alto, California, USA

Daniel Dohan

Abstract

The objective of this commentary is to develop a framework for assessing the rigour of qualitative approaches that identifies and distinguishes between the diverse objectives of qualitative health research, guided by a narrative review of the published literature on qualitative guidelines and standards from peer-reviewed journals and national funding organisations that support health services research, patient-centered outcomes research and other applied health research fields. In this framework, we identify and distinguish three objectives of qualitative studies in applied health research: exploratory, descriptive and comparative. For each objective, we propose methodological standards that may be used to assess and improve rigour across all study phases—from design to reporting. Similar to hierarchies of quality of evidence within quantitative studies, we argue that standards for qualitative rigour differ, appropriately, for studies with different objectives and should be evaluated as such. Distinguishing between different objectives of qualitative health research improves the ability to appreciate variation in qualitative studies and to develop appropriate evaluations of the rigour and success of qualitative studies in meeting their stated objectives. Researchers, funders and journal editors should consider how further developing and adopting the framework for assessing qualitative rigour outlined here may advance the rigour and potential impact of this important mode of inquiry.

Article summary

  • Qualitative research in health services research (HSR) and allied fields has maintained steady, yet unsettled, interest and value over recent decades.
  • Qualitative methods are epistemologically and theoretically diverse, which is a strength. However, it also means that investigators do not necessarily approach qualitative research using a unified set of evidentiary rules. As such, assessing rigour and quality across studies can be challenging.
  • To help address these challenges, we propose a framework for assessing the rigour of qualitative approaches that identifies and distinguishes between three diverse study objectives. For each type of study, we propose preliminary methodological considerations to help improve rigour across all study phases. As is the case for quantitative studies, we argue that standards for qualitative rigour differ, appropriately, for different kinds of studies.
  • The objective of this commentary is not to resolve all potential conflicts between philosophical assumptions of different qualitative approaches, but rather help to advance a broader and richer understanding of qualitative rigour in relationship to other evidence hierarchies.

In recent decades, the role of qualitative research in health services research (HSR) and allied fields has maintained steady, yet unsettled, interest and value. Evidence of steady interest includes publication of qualitative reviews and guidelines by leading journals including Health Services Research, 1 2 Medical Care Research and Review 3–5 and BMJ, 6 7 and by funders including the Robert Wood Johnson Foundation, 8 National Institutes of Health 9 10 and National Science Foundation. 11 12 In fields such as Patient-Centered Outcomes Research (PCOR) and implementation science, qualitative research has been embraced with particular enthusiasm for its ability to capture, advance and address questions meaningful to patients, clinicians and other healthcare system stakeholders. 2 13 The majority (82%) of inaugural projects awarded by the Patient-Centered Outcomes Research Institute (PCORI) incorporated qualitative research methods. 13 More recently, reflective of the continued prevalence of these approaches in the field, PCORI incorporated qualitative methods into their methodological standards.

Yet, despite this sustained interest, the status of qualitative health research remains unsettled, as illustrated by the BMJ’s changing engagement with the method. After championing qualitative methods in 2008, 7 14–17 BMJ editors in 2016 noted that they tended to assign low priority to qualitative studies because such studies are ‘usually exploratory by their very nature’. 18 This statement came in response to an open letter from scholars arguing that BMJ should adopt formal policies and training for editorial staff on what distinguishes ‘good from poor qualitative research’ rather than de-emphasising the method in toto. 19 In sum, despite sustained interest from the HSR community, the status of qualitative research remains contested. This status reflects debate over the purpose of qualitative research—is it a valuable tool to advance the field or a low-priority exercise in exploration?—and an ongoing desire for guidance on how best to distinguish high-quality from low-quality qualitative research.

Assessing rigour and quality in qualitative research is challenging because qualitative methods are epistemologically diverse. 20–22 This diversity is a strength because it allows for the theoretical and methodological flexibility necessary to fully understand a specific topic from multiple perspectives. 16 However, it also means that investigators do not necessarily approach qualitative research using a unified set of evidentiary rules. 22 Thus, scholars may measure the quality of studies using different or even incompatible yardsticks.

The challenge of diverse epistemologies has become more acute as qualitative health research has expanded beyond its historical roots in phenomenological or grounded theory studies. Contemporary researchers may use qualitative data and methods to improve the descriptive accuracy of health-related phenomena that have already been characterised by exploratory work or are difficult to capture using other approaches. 23 Researchers also use larger scale, comparative qualitative studies in ways that resemble quantitative efforts to identify explanatory pathways. 24 Therefore, assessing the rigour of a specific qualitative study benefits from first identifying the analytic goals and objectives of the study—that is, identifying which yardstick investigators themselves have adopted—and then using this yardstick to examine how the study measures up.

To address these challenges, we propose a tailored framework for designing and informing assessments of different types of qualitative health research common within HSR. The framework recognises that qualitative investigators have different objectives and yardsticks in mind when undertaking studies and that rigour should be assessed accordingly. We distinguish three central types of qualitative study objectives common in applied health research: exploratory, descriptive and comparative. For each objective, we propose preliminary methodological considerations to help improve rigour across all study phases—from design to reporting. As is the case for quantitative studies, we argue that standards for qualitative rigour differ, appropriately, for different kinds of studies. The objective of this commentary is not to resolve all potential conflicts between philosophical assumptions of different qualitative approaches, but rather help to advance a broader and richer understanding of qualitative rigour in relationship to other evidence hierarchies. The proposed framework offers a nuanced set of categories by which to conduct and recognise high-quality qualitative research. The framework also supports efforts to shift debates over the value of qualitative research to discussions on how we can promote rigour across different types of valuable qualitative studies, and underscore how qualitative methods can advance clinical and applied health research.

Designing a tailored framework: methods and results

Our framework is based on a team-based review of published guidelines and standards discussing the scientific conduct of qualitative health research. Guided by expert consensus and a targeted literature scan, we identified and reviewed 17 peer-reviewed articles and expert reports published by journals widely read by the HSR community and by major funders or sponsors of qualitative health research (1–12, 21, 33–36). In contrast to previous reviews, 25 we did not seek to synthesise these guidelines. Rather we drew on them to develop a conceptual framework for designing and informing formal assessments of rigorous qualitative research.

Range of approaches in qualitative research

Qualitative research incorporates a range of methods including in-depth interviews, focus groups, ethnography and many others. 26 Even within a single method, accepted approaches and standards for rigour vary depending on disciplinary and theoretical orientations. Correspondingly, qualitative research cannot be defined by a single theoretical approach or data collection procedure. Rather many, often debated, approaches exist with distinct implications for appropriate standards for data collection, analysis and interpretation.

On one end of the spectrum, qualitative researchers guided by realism subscribe to the assumption that rigorous scientific research can provide an accurate and objective representation of reality, and that objectivity should be a primary goal of all scientific inquiries, including qualitative research. 27 These qualitative researchers generally consider standards such as validity, reliability, reproducibility and generalisability as similarly legitimate yardsticks for qualitative research as they are in quantitative research. 28 On the other end of the spectrum, relativist philosophical approaches to qualitative research typically argue that all research is inherently subjective and/or political, 29 and some relativists criticise the scientific approach specifically because it claims to be objective. 30 31 Much of applied qualitative health research falls somewhere between the two ends of the spectrum. For example, Mays and Pope consider themselves ‘subtle realists’. 6 They acknowledge that all research involves subjectivity and includes political dimensions, but they also contend that qualitative research should, nevertheless, be assessed by a similar set of quality criteria as quantitative studies. Although we recognise the value strictly relativist approaches provide, the framework and design considerations we propose are largely guided by a realist (or subtle realist) orientation. However, in addition to resonating with those who operate under similar orientations, we hope this framework will serve to advance discussions of how best to communicate and assess qualitative research using different theoretical and epistemological standpoints.

Tailored framework for qualitative health research

Given the diversity of approaches, a foundational step to improving the assessment of rigour in qualitative research is to abandon the attempt to develop a single standard for the best practices regardless of study orientation and objective. Instead, standards must begin with an assessment of epistemological assumptions and corresponding study objectives, an approach that is similar to standards for quantitative PCOR research 32 and mixed-methods research. 33 In this vein, we identified and defined three general types of study objectives broadly used in applied qualitative health research (see figure 1). These three types reflect differences in primary study objectives and existing knowledge within a topic area.

Figure 1. Three broad types of qualitative health research.

In table 1, we provide preliminary distinctions on how exploratory, descriptive and comparative studies compare across a range of standards and guidelines that have been proposed for qualitative research. Regardless of study type, researchers should report study details in clear, comprehensive ways, using standardised reporting guidelines whenever possible. 34 35

Table 1. Framework for designing different types of applied qualitative health research and developing evaluative instruments to assess their rigour.

Compared with descriptive or comparative studies, exploratory studies approach the topic of study primarily in an inductive fashion to investigate the areas of potential research interest that remain mostly or wholly unexamined by the scientific community. Investigators undertaking exploratory studies typically have few expectations for what they might find, and their research design and approach may shift dramatically as they learn more about the phenomena of interest. An example of an exploratory study is a study that uses convenience sampling and unstructured interviews to explore what patients think about a new treatment in a single healthcare setting.

At the opposite end of this spectrum, investigators conducting comparative studies aim to use a primarily deductive approach designed to compare and document how well-defined qualitative phenomena are represented in different settings or populations. The qualitative methods employed in a comparative study are typically defined in advance, sampling should be systematic and structured by aims, and investigators enter the field with hypothesised ideas of what findings they may uncover and how to interpret those findings in light of previous research. An example of a comparative study is a multisite ethnography that seeks to compare how patient-provider communication varies by location, and uses random sampling of patient-provider interactions to collect data.

Descriptive studies occupy a middle position, building on previously conducted exploratory work so that researchers can proceed with more focused inquiry. Such studies should include well-defined procedures, including sampling protocols and analytical plans, and investigators should usually articulate expected findings prior to beginning the study. However, as researchers investigate phenomena in new settings or patient populations, it is reasonable to expect descriptive studies to generate surprises. Thus, descriptive studies also feature inductive elements to detect unexpected findings, and must be flexible enough in design to accommodate shifts in research focus and methods based on empirical findings. An example of a descriptive study is a longitudinal study of patients with ovarian cancer that employs semistructured interviews and directed content analysis to examine decision-making across patients in a novel setting.

Our review identified a number of published standards and guidelines for qualitative research. The conceptual framework we present here draws on those extant guidelines through the recognition that qualitative health research includes studies of diverse theoretical and epistemological orientations, each of which has distinct understandings of scientific quality and rigour. Given this intellectual diversity, it is inappropriate to use a single yardstick for all qualitative research. Rather, assessments of qualitative quality must begin with an assessment of a study’s theoretical orientations and research objectives to ensure that rigour is assessed on a study’s own terms. This framework and suggested approaches may help to advance evaluations of qualitative rigour that acknowledge and differentiate between studies that report exploratory, descriptive or comparative study objectives.

Existing standards for conducting health research and grading evidence, such as Grading of Recommendations Assessment, Development and Evaluation (GRADE), 36 do not capture the diversity of qualitative studies—often designating all qualitative studies as providing weak levels of evidence. PCORI’s own methodological standards have been largely silent regarding qualitative methods until recently, 32 leaving applicants without clear direction on how to conduct rigorous qualitative research. Incorporation of tailored qualitative standards could help to clarify and improve the rigour of proposal design, review and completion. The establishment and integration of such standards could also guide journal editors in developing transparent standards for deciding priorities for publication. For example, editors may decide against publication of exploratory or descriptive studies, but prioritise well-executed comparative studies that advance the field in ways quantitative studies could not.

In addition to these immediate applications, implementing standards that incorporate the diversity of objectives within applied qualitative research has the potential to address broader challenges facing qualitative health research. These include: (1) the need to educate broader audiences about the many goals of qualitative research, including but not limited to exploration; (2) the need to create rigorous standards for conducting and reporting various types of qualitative studies to help audiences, editors and funders evaluate studies on their own merits and (3) the challenges of publishing qualitative research in prestigious and high-impact journals that will reach a wide range of practitioners, researchers and lay audiences. We contend that these challenges can be reframed as opportunities to advance the science of qualitative research, and its potential for improving outcomes for patients, providers and communities.

Supplementary Material

Acknowledgments

Portions of this work were presented at the 2016 Academy Health Annual Research Meeting and 2017 Society of Behavioral Medicine Annual Meeting. The authors thank Katherine Gillespie and members of our Stakeholder Advisory Board for their invaluable contributions to the project.

Presented in: Earlier version presented at the 2016 Academy Health Annual Research Meeting in Boston, MA, USA.

Contributors: All authors (KAR, CMA, SBG, MCH and DD) helped to design and conceptualise this work including reviewing guidelines and conceptualising the proposed framework. KAR drafted the manuscript, and CMA, SBG, MCH and DD provided substantial review and writing to revisions.

Funding: This work was funded by the Patient-Centered Outcomes Research Institute (PCORI) Award (ME-1409-22996).

Disclaimer: The views presented in this article are solely the responsibility of the author(s) and do not necessarily represent the views of the Patient-Centered Outcomes Research Institute (PCORI), its Board of Governors or Methodology Committee. Funders had no role in the collection, analysis and interpretation of the data; in the writing of the report; and in the decision to submit the paper for publication.

Competing interests: None declared.

Patient consent for publication: Not required.

Provenance and peer review: Not commissioned; externally peer reviewed.

  • Open access
  • Published: 10 November 2020

Case study research for better evaluations of complex interventions: rationale and challenges

  • Sara Paparini   ORCID: orcid.org/0000-0002-1909-2481 1 ,
  • Judith Green 2 ,
  • Chrysanthi Papoutsi 1 ,
  • Jamie Murdoch 3 ,
  • Mark Petticrew 4 ,
  • Trish Greenhalgh 1 ,
  • Benjamin Hanckel 5 &
  • Sara Shaw 1  

BMC Medicine volume  18 , Article number:  301 ( 2020 ) Cite this article


The need for better methods for evaluation in health research has been widely recognised. The ‘complexity turn’ has drawn attention to the limitations of relying on causal inference from randomised controlled trials alone for understanding whether, and under which conditions, interventions in complex systems improve health services or the public health, and what mechanisms might link interventions and outcomes. We argue that case study research—currently denigrated as poor evidence—is an under-utilised resource for not only providing evidence about context and transferability, but also for helping strengthen causal inferences when pathways between intervention and effects are likely to be non-linear.

Case study research, as an overall approach, is based on in-depth explorations of complex phenomena in their natural, or real-life, settings. Empirical case studies typically enable dynamic understanding of complex challenges and provide evidence about causal mechanisms and the necessary and sufficient conditions (contexts) for intervention implementation and effects. This is essential evidence not just for researchers concerned about internal and external validity, but also research users in policy and practice who need to know what the likely effects of complex programmes or interventions will be in their settings. The health sciences have much to learn from scholarship on case study methodology in the social sciences. However, there are multiple challenges in fully exploiting the potential learning from case study research. First are misconceptions that case study research can only provide exploratory or descriptive evidence. Second, there is little consensus about what a case study is, and considerable diversity in how empirical case studies are conducted and reported. Finally, as case study researchers typically (and appropriately) focus on thick description (that captures contextual detail), it can be challenging to identify the key messages related to intervention evaluation from case study reports.

Whilst the diversity of published case studies in health services and public health research is rich and productive, we recommend further clarity and specific methodological guidance for those reporting case study research for evaluation audiences.


The need for methodological development to address the most urgent challenges in health research has been well-documented. Many of the most pressing questions for public health research, where the focus is on system-level determinants [ 1 , 2 ], and for health services research, where provisions typically vary across sites and are provided through interlocking networks of services [ 3 ], require methodological approaches that can attend to complexity. The need for methodological advance has arisen, in part, as a result of the diminishing returns from randomised controlled trials (RCTs) where they have been used to answer questions about the effects of interventions in complex systems [ 4 , 5 , 6 ]. In conditions of complexity, there is limited value in maintaining the current orientation to experimental trial designs in the health sciences as providing ‘gold standard’ evidence of effect.

There are increasing calls for methodological pluralism [ 7 , 8 ], with the recognition that complex intervention and context are not easily or usefully separated (as is often the situation when using trial design), and that system interruptions may have effects that are not reducible to linear causal pathways between intervention and outcome. These calls are reflected in a shifting and contested discourse of trial design, seen with the emergence of realist [ 9 ], adaptive and hybrid (types 1, 2 and 3) [ 10 , 11 ] trials that blend studies of effectiveness with a close consideration of the contexts of implementation. Similarly, process evaluation has now become a core component of complex healthcare intervention trials, reflected in MRC guidance on how to explore implementation, causal mechanisms and context [ 12 ].

Evidence about the context of an intervention is crucial for questions of external validity. As Woolcock [ 4 ] notes, even if RCT designs are accepted as robust for maximising internal validity, questions of transferability (how well the intervention works in different contexts) and generalisability (how well the intervention can be scaled up) remain unanswered [ 5 , 13 ]. For research evidence to have impact on policy and systems organisation, and thus to improve population and patient health, there is an urgent need for better methods for strengthening external validity, including a better understanding of the relationship between intervention and context [ 14 ].

Policymakers, healthcare commissioners and other research users require credible evidence of relevance to their settings and populations [ 15 ], to perform what Rosengarten and Savransky [ 16 ] call ‘careful abstraction’ to the locales that matter for them. They also require robust evidence for understanding complex causal pathways. Case study research, currently under-utilised in public health and health services evaluation, can offer considerable potential for strengthening faith in both external and internal validity. For example, in an empirical case study of how the policy of free bus travel had specific health effects in London, UK, a quasi-experimental evaluation (led by JG) identified how important aspects of context (a good public transport system) and intervention (that it was universal) were necessary conditions for the observed effects, thus providing useful, actionable evidence for decision-makers in other contexts [ 17 ].

The overall approach of case study research is based on the in-depth exploration of complex phenomena in their natural, or ‘real-life’, settings. Empirical case studies typically enable dynamic understanding of complex challenges rather than restricting the focus on narrow problem delineations and simple fixes. Case study research is a diverse and somewhat contested field, with multiple definitions and perspectives grounded in different ways of viewing the world, and involving different combinations of methods. In this paper, we raise awareness of such plurality and highlight the contribution that case study research can make to the evaluation of complex system-level interventions. We review some of the challenges in exploiting the current evidence base from empirical case studies and conclude by recommending that further guidance and minimum reporting criteria for evaluation using case studies, appropriate for audiences in the health sciences, can enhance the take-up of evidence from case study research.

Case study research offers evidence about context, causal inference in complex systems and implementation

Well-conducted and described empirical case studies provide evidence on context, complexity and mechanisms for understanding how, where and why interventions have their observed effects. Recognition of the importance of context for understanding the relationships between interventions and outcomes is hardly new. In 1943, Canguilhem berated an over-reliance on experimental designs for determining universal physiological laws: ‘As if one could determine a phenomenon’s essence apart from its conditions! As if conditions were a mask or frame which changed neither the face nor the picture!’ ([ 18 ] p126). More recently, a concern with context has been expressed in health systems and public health research as part of what has been called the ‘complexity turn’ [ 1 ]: a recognition that many of the most enduring challenges for developing an evidence base require a consideration of system-level effects [ 1 ] and the conceptualisation of interventions as interruptions in systems [ 19 ].

The case study approach is widely recognised as offering an invaluable resource for understanding the dynamic and evolving influence of context on complex, system-level interventions [ 20 , 21 , 22 , 23 ]. Empirically, case studies can directly inform assessments of where, when, how and for whom interventions might be successfully implemented, by helping to specify the necessary and sufficient conditions under which interventions might have effects and to consolidate learning on how interdependencies, emergence and unpredictability can be managed to achieve and sustain desired effects. Case study research has the potential to address four objectives for improving research and reporting of context recently set out by guidance on taking account of context in population health research [ 24 ], that is to (1) improve the appropriateness of intervention development for specific contexts, (2) improve understanding of ‘how’ interventions work, (3) better understand how and why impacts vary across contexts and (4) ensure reports of intervention studies are most useful for decision-makers and researchers.

However, evaluations of complex healthcare interventions have arguably not exploited the full potential of case study research and can learn much from other disciplines. For evaluative research, exploratory case studies have had a traditional role of providing data on ‘process’, or initial ‘hypothesis-generating’ scoping, but might also have an increasing salience for explanatory aims. Across the social and political sciences, different kinds of case studies are undertaken to meet diverse aims (description, exploration or explanation) and across different scales (from small N qualitative studies that aim to elucidate processes, or provide thick description, to more systematic techniques designed for medium-to-large N cases).

Case studies with explanatory aims vary in terms of their positioning within mixed-methods projects, with designs including (but not restricted to) (1) single N of 1 studies of interventions in specific contexts, where the overall design is a case study that may incorporate one or more (randomised or not) comparisons over time and between variables within the case; (2) a series of cases conducted or synthesised to provide explanation from variations between cases; and (3) case studies of particular settings within RCT or quasi-experimental designs to explore variation in effects or implementation.

Detailed qualitative research (typically done as ‘case studies’ within process evaluations) provides evidence for the plausibility of mechanisms [ 25 ], offering theoretical generalisations for how interventions may function under different conditions. Although RCT designs reduce many threats to internal validity, the mechanisms of effect remain opaque, particularly when the causal pathways between ‘intervention’ and ‘effect’ are long and potentially non-linear: case study research has a more fundamental role here, in providing detailed observational evidence for causal claims [ 26 ] as well as producing a rich, nuanced picture of tensions and multiple perspectives [ 8 ].

Longitudinal or cross-case analysis may be best suited for evidence generation in system-level evaluative research. Turner [ 27 ], for instance, reflecting on the complex processes in major system change, has argued for the need for methods that integrate learning across cases, to develop theoretical knowledge that would enable inferences beyond the single case, and to develop generalisable theory about organisational and structural change in health systems. Qualitative Comparative Analysis (QCA) [ 28 ] is one such formal method for deriving causal claims, using set theory mathematics to integrate data from empirical case studies to answer questions about the configurations of causal pathways linking conditions to outcomes [ 29 , 30 ].
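The core QCA operation described above can be illustrated with a minimal sketch. The condition names and case data below are hypothetical (loosely echoing the bus-travel example, not drawn from any cited study): each case is coded as a binary configuration of conditions, configurations are grouped into a truth table, and each configuration's consistency with the outcome is computed.

```python
from collections import defaultdict

# Hypothetical cases coded 1/0 on two conditions and an outcome.
# GOOD_TRANSPORT: good public transport system; UNIVERSAL: universal intervention.
cases = [
    {"GOOD_TRANSPORT": 1, "UNIVERSAL": 1, "outcome": 1},
    {"GOOD_TRANSPORT": 1, "UNIVERSAL": 1, "outcome": 1},
    {"GOOD_TRANSPORT": 1, "UNIVERSAL": 0, "outcome": 0},
    {"GOOD_TRANSPORT": 0, "UNIVERSAL": 1, "outcome": 0},
    {"GOOD_TRANSPORT": 0, "UNIVERSAL": 0, "outcome": 0},
]

conditions = ["GOOD_TRANSPORT", "UNIVERSAL"]

def truth_table(cases, conditions):
    """Group cases by configuration of conditions and compute each
    configuration's consistency (share of its cases showing the outcome)."""
    rows = defaultdict(list)
    for case in cases:
        config = tuple(case[k] for k in conditions)
        rows[config].append(case["outcome"])
    return {cfg: sum(outs) / len(outs) for cfg, outs in rows.items()}

table = truth_table(cases, conditions)
for cfg, consistency in sorted(table.items(), reverse=True):
    label = ", ".join(f"{k}={v}" for k, v in zip(conditions, cfg))
    print(f"{label}: consistency={consistency:.2f}")
# A configuration with consistency 1.0 is a candidate sufficient
# combination of conditions for the outcome.
```

In this toy data, only the configuration combining both conditions is fully consistent with the outcome, mirroring the logic of identifying configurations of causal pathways; real QCA additionally involves logical minimisation and coverage measures, which this sketch omits.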

Nonetheless, the single N case study, too, provides opportunities for theoretical development [ 31 ], and theoretical generalisation or analytical refinement [ 32 ]. How ‘the case’ and ‘context’ are conceptualised is crucial here. Findings from the single case may seem to be confined to its intrinsic particularities in a specific and distinct context [ 33 ]. However, if such context is viewed as exemplifying wider social and political forces, the single case can be ‘telling’, rather than ‘typical’, and offer insight into a wider issue [ 34 ]. Internal comparisons within the case can offer rich possibilities for logical inferences about causation [ 17 ]. Further, case studies of any size can be used for theory testing through refutation [ 22 ]. The potential lies, then, in utilising the strengths and plurality of case study to support theory-driven research within different methodological paradigms.

Evaluation research in health has much to learn from a range of social sciences where case study methodology has been used to develop various kinds of causal inference. For instance, Gerring [ 35 ] expands on the within-case variations utilised to make causal claims. For Gerring [ 35 ], case studies come into their own with regard to invariant or strong causal claims (such as X is a necessary and/or sufficient condition for Y) rather than for probabilistic causal claims. For the latter (where experimental methods might have an advantage in estimating effect sizes), case studies offer evidence on mechanisms: from observations of X affecting Y, from process tracing or from pattern matching. Case studies also support the study of emergent causation, that is, the multiple interacting properties that account for particular and unexpected outcomes in complex systems, such as in healthcare [ 8 ].
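The within-case logic behind invariant causal claims can be made concrete with a toy check of necessity and sufficiency (illustrative data only; X and Y are placeholder names for a condition and an outcome):

```python
# Hypothetical case observations: is condition X necessary and/or
# sufficient for outcome Y across the observed cases?
cases = [
    {"X": True,  "Y": True},
    {"X": True,  "Y": True},
    {"X": True,  "Y": False},
    {"X": False, "Y": False},
]

# Necessary: Y never occurs without X (every Y-case is also an X-case).
necessary = all(case["X"] for case in cases if case["Y"])

# Sufficient: X always brings about Y (every X-case is also a Y-case).
sufficient = all(case["Y"] for case in cases if case["X"])

print(f"X necessary for Y: {necessary}")    # True in this toy data
print(f"X sufficient for Y: {sufficient}")  # False: one X-case lacks Y
```

The point of the sketch is the form of the claim, not the arithmetic: a single disconfirming case refutes a necessity or sufficiency claim outright, which is why case studies are well suited to testing invariant claims through refutation.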

Finally, efficacy (or beliefs about efficacy) is not the only contributor to intervention uptake, with a range of organisational and policy contingencies affecting whether an intervention is likely to be rolled out in practice. Case study research is, therefore, invaluable for learning about contextual contingencies and identifying the conditions necessary for interventions to become normalised (i.e. implemented routinely) in practice [ 36 ].

The challenges in exploiting evidence from case study research

At present, there are significant challenges in exploiting the benefits of case study research in evaluative health research, which relate to status, definition and reporting. Case study research has been marginalised at the bottom of an evidence hierarchy, seen to offer little by way of explanatory power, if nonetheless useful for adding descriptive data on process or providing useful illustrations for policymakers [ 37 ]. This is an opportune moment to revisit this low status. As health researchers are increasingly charged with evaluating ‘natural experiments’—the use of face masks in the response to the COVID-19 pandemic being a recent example [ 38 ]—rather than interventions that take place in settings that can be controlled, research approaches using methods to strengthen causal inference that do not require randomisation become more relevant.

A second challenge for improving the use of case study evidence in evaluative health research is that, as we have seen, what is meant by ‘case study’ varies widely, not only across but also within disciplines. There is indeed little consensus amongst methodologists as to how to define ‘a case study’. Definitions focus, variously, on small sample size or lack of control over the intervention (e.g. [ 39 ] p194), on in-depth study and context [ 40 , 41 ], on the logic of inference used [ 35 ] or on distinct research strategies which incorporate a number of methods to address questions of ‘how’ and ‘why’ [ 42 ]. Moreover, definitions developed for specific disciplines do not capture the range of ways in which case study research is carried out across disciplines. Multiple definitions of case study reflect the richness and diversity of the approach. However, evidence suggests that a lack of consensus across methodologists results in some of the limitations of published reports of empirical case studies [ 43 , 44 ]. Hyett and colleagues [ 43 ], for instance, reviewing reports in qualitative journals, found little match between methodological definitions of case study research and how authors used the term.

This raises the third challenge we identify: case study reports are typically not written in ways that are accessible or useful for the evaluation research community and policymakers. Case studies may not appear in journals widely read by those in the health sciences, either because space constraints preclude the reporting of rich, thick descriptions, or because of the reported lack of willingness of some biomedical journals to publish research that uses qualitative methods [ 45 ], signalling the persistence of the aforementioned evidence hierarchy. Where they do, however, the term ‘case study’ is used to indicate, interchangeably, a qualitative study, an N of 1 sample, or a multi-method, in-depth analysis of one example from a population of phenomena. Definitions of what constitutes the ‘case’ are frequently lacking and appear to be used as a synonym for the settings in which the research is conducted. Despite offering insights for evaluation, the primary aims may not have been evaluative, so the implications may not be explicitly drawn out. Indeed, some case study reports might properly be aiming for thick description without necessarily seeking to inform about context or causality.

Acknowledging plurality and developing guidance

We recognise that definitional and methodological plurality is not only inevitable, but also a necessary and creative reflection of the very different epistemological and disciplinary origins of health researchers, and the aims they have in doing and reporting case study research. Indeed, to provide some clarity, Thomas [ 46 ] has suggested a typology of subject/purpose/approach/process for classifying aims (e.g. evaluative or exploratory), sample rationale and selection and methods for data generation of case studies. We also recognise that the diversity of methods used in case study research, and the necessary focus on narrative reporting, does not lend itself to straightforward development of formal quality or reporting criteria.

Existing checklists for reporting case study research from the social sciences—for example Lincoln and Guba’s [ 47 ] and Stake’s [ 33 ]—are primarily orientated to the quality of narrative produced, and the extent to which they encapsulate thick description, rather than the more pragmatic issues of implications for intervention effects. Those designed for clinical settings, such as the CARE (CAse REports) guidelines, provide specific reporting guidelines for medical case reports about single, or small groups of patients [ 48 ], not for case study research.

The Design of Case Study Research in Health Care (DESCARTE) model [ 44 ] suggests a series of questions to be asked of a case study researcher (including clarity about the philosophy underpinning their research), study design (with a focus on case definition) and analysis (to improve process). The model resembles toolkits for enhancing the quality and robustness of qualitative and mixed-methods research reporting, and it is usefully open-ended and non-prescriptive. However, even if it does include some reflections on context, the model does not fully address aspects of context, logic and causal inference that are perhaps most relevant for evaluative research in health.

Hence, for evaluative research where the aim is to report empirical findings in ways that are intended to be pragmatically useful for health policy and practice, this may be an opportune time to consider how to best navigate plurality around what is (minimally) important to report when publishing empirical case studies, especially with regards to the complex relationships between context and interventions, information that case study research is well placed to provide.

The conventional scientific quest for certainty, predictability and linear causality (maximised in RCT designs) has to be augmented by the study of uncertainty, unpredictability and emergent causality [ 8 ] in complex systems. This will require methodological pluralism, and openness to broadening the evidence base to better understand both causality in, and the transferability of, system change interventions [ 14 , 20 , 23 , 25 ]. Case study research evidence is essential, yet currently under-exploited in the health sciences. If evaluative health research is to move beyond the current impasse on methods for understanding interventions as interruptions in complex systems, we need to consider in more detail how researchers can conduct and report empirical case studies which do aim to elucidate the contextual factors which interact with interventions to produce particular effects.

To this end, supported by the UK’s Medical Research Council, we are embracing the challenge to develop guidance for case study researchers studying complex interventions. Following a meta-narrative review of the literature, we are planning a Delphi study to inform guidance that will, at minimum, cover the value of case study research for evaluating the interrelationship between context and complex system-level interventions; for situating and defining ‘the case’, and generalising from case studies; as well as provide specific guidance on conducting, analysing and reporting case study research. Our hope is that such guidance can support researchers evaluating interventions in complex systems to better exploit the diversity and richness of case study research.

Availability of data and materials

Not applicable (article based on existing available academic publications)

Abbreviations

QCA: Qualitative comparative analysis

Quasi-experimental design

RCT: Randomised controlled trial

Diez Roux AV. Complex systems thinking and current impasses in health disparities research. Am J Public Health. 2011;101(9):1627–34.


Ogilvie D, Mitchell R, Mutrie N, Petticrew M, Platt S. Evaluating health effects of transport interventions: methodologic case study. Am J Prev Med. 2006;31:118–26.

Walshe C. The evaluation of complex interventions in palliative care: an exploration of the potential of case study research strategies. Palliat Med. 2011;25(8):774–81.


Acknowledgements

Not applicable.

Funding

This work was funded by the Medical Research Council (MRC Award MR/S014632/1 HCS: Case study, Context and Complex interventions (TRIPLE C)). SP was additionally funded by the University of Oxford's Higher Education Innovation Fund (HEIF).

Author information

Authors and affiliations

Nuffield Department of Primary Care Health Sciences, University of Oxford, Oxford, UK

Sara Paparini, Chrysanthi Papoutsi, Trish Greenhalgh & Sara Shaw

Wellcome Centre for Cultures & Environments of Health, University of Exeter, Exeter, UK

Judith Green

School of Health Sciences, University of East Anglia, Norwich, UK

Jamie Murdoch

Public Health, Environments and Society, London School of Hygiene & Tropical Medicine, London, UK

Mark Petticrew

Institute for Culture and Society, Western Sydney University, Penrith, Australia

Benjamin Hanckel


Contributions

JG, MP, SP, JM, TG, CP and SS drafted the initial paper; all authors contributed to the drafting of the final version, and read and approved the final manuscript.

Corresponding author

Correspondence to Sara Paparini.

Ethics declarations

Ethics approval and consent to participate

Consent for publication

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article

Cite this article

Paparini, S., Green, J., Papoutsi, C. et al. Case study research for better evaluations of complex interventions: rationale and challenges. BMC Med 18, 301 (2020). https://doi.org/10.1186/s12916-020-01777-6


Received: 03 July 2020

Accepted: 07 September 2020

Published: 10 November 2020

DOI: https://doi.org/10.1186/s12916-020-01777-6


Keywords

  • Qualitative
  • Case studies
  • Mixed-method
  • Public health
  • Health services research
  • Interventions

BMC Medicine

ISSN: 1741-7015
