
Implementation research: what it is and how to do it

  • David H Peters, professor 1,
  • Taghreed Adam, scientist 2,
  • Olakunle Alonge, assistant scientist 1,
  • Irene Akua Agyepong, specialist public health 3,
  • Nhan Tran, manager 4
  • 1 Johns Hopkins University Bloomberg School of Public Health, Department of International Health, 615 N Wolfe St, Baltimore, MD 21205, USA
  • 2 Alliance for Health Policy and Systems Research, World Health Organization, CH-1211 Geneva 27, Switzerland
  • 3 University of Ghana School of Public Health/Ghana Health Service, Accra, Ghana
  • 4 Alliance for Health Policy and Systems Research, Implementation Research Platform, World Health Organization, CH-1211 Geneva 27, Switzerland
  • Correspondence to: D H Peters dpeters{at}jhsph.edu
  • Accepted 8 October 2013

Implementation research is a growing but not well understood field of health research that can contribute to more effective public health and clinical policies and programmes. This article provides a broad definition of implementation research and outlines key principles for how to do it

The field of implementation research is growing, but it is not well understood despite the need for better research to inform decisions about health policies, programmes, and practices. This article focuses on the context and factors affecting implementation, the key audiences for the research, implementation outcome variables that describe various aspects of how implementation occurs, and the study of implementation strategies that support the delivery of health services, programmes, and policies. We provide a framework for using the research question as the basis for selecting among the wide range of qualitative, quantitative, and mixed methods that can be applied in implementation research, along with brief descriptions of methods specifically suitable for implementation research. Expanding the use of well designed implementation research should contribute to more effective public health and clinical policies and programmes.

Defining implementation research

Implementation research attempts to solve a wide range of implementation problems; it has its origins in several disciplines and research traditions (supplementary table A). Although progress has been made in conceptualising implementation research over the past decade, 1 considerable confusion persists about its terminology and scope. 2 3 4 The word “implement” comes from the Latin “implere,” meaning to fulfil or to carry into effect. 5 This provides a basis for a broad definition of implementation research that can be used across research traditions and has meaning for practitioners, policy makers, and the interested public: “Implementation research is the scientific inquiry into questions concerning implementation—the act of carrying an intention into effect, which in health research can be policies, programmes, or individual practices (collectively called interventions).”

Implementation research can consider any aspect of implementation, including the factors affecting implementation, the processes of implementation, and the results of implementation, including how to introduce potential solutions into a health system or how to promote their large scale use and sustainability. The intent is to understand what, why, and how interventions work in “real world” settings and to test approaches to improve them.

Principles of implementation research

Implementation research seeks to understand and work within real world conditions, rather than trying to control for these conditions or to remove their influence as causal effects. This implies working with populations that will be affected by an intervention, rather than selecting beneficiaries who may not represent the target population of an intervention (such as studying healthy volunteers or excluding patients who have comorbidities).

Context plays a central role in implementation research. Context can include the social, cultural, economic, political, legal, and physical environment, as well as the institutional setting, comprising various stakeholders and their interactions, and the demographic and epidemiological conditions. The structure of the health systems (for example, the roles played by governments, non-governmental organisations, other private providers, and citizens) is particularly important for implementation research on health.

Implementation research is especially concerned with the users of the research and not purely the production of knowledge. These users may include managers and teams using quality improvement strategies, executive decision makers seeking advice for specific decisions, policy makers who need to be informed about particular programmes, practitioners who need to be convinced to use interventions that are based on evidence, people who are influenced to change their behaviour to have a healthier life, or communities who are conducting the research and taking action through the research to improve their conditions (supplementary table A). One important implication is that often these actors should be intimately involved in the identification, design, and conduct phases of research and not just be targets for dissemination of study results.

Implementation outcome variables

Implementation outcome variables describe the intentional actions to deliver services. 6 These implementation outcome variables—acceptability, adoption, appropriateness, feasibility, fidelity, implementation cost, coverage, and sustainability—can all serve as indicators of the success of implementation (table 1). Implementation research uses these variables to assess how well implementation has occurred, or to provide insights about how implementation contributes to health status and other important health outcomes.

Table 1 Implementation outcome variables


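When tabulating these outcomes across study sites, the eight variables can be handled as a simple structured record. A minimal sketch in Python; the 1-5 rating scale and the example site data are hypothetical, not from the article:

```python
# The eight implementation outcome variables named in the article.
# The 1-5 rating scale and the example site data are hypothetical.
OUTCOME_VARIABLES = [
    "acceptability", "adoption", "appropriateness", "feasibility",
    "fidelity", "implementation cost", "coverage", "sustainability",
]

def summarise_site(ratings):
    """Check that a site rated every outcome variable, then average the ratings."""
    missing = [v for v in OUTCOME_VARIABLES if v not in ratings]
    if missing:
        raise ValueError(f"unrated outcome variables: {missing}")
    return sum(ratings.values()) / len(ratings)

site_a = {v: 4 for v in OUTCOME_VARIABLES}  # uniform hypothetical ratings
print(summarise_site(site_a))  # 4.0
```

Forcing every variable to be rated mirrors the article's point that these outcomes jointly, not singly, indicate implementation success.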
Implementation strategies

Curran and colleagues defined an “implementation intervention” as a method to “enhance the adoption of a ‘clinical’ intervention,” such as the use of job aids, provider education, or audit procedures. 7 The concept can be broadened to any type of strategy that is designed to support a clinical or population and public health intervention (for example, outreach clinics and supervision checklists are implementation strategies used to improve the coverage and quality of immunisation).

A review of ways to improve health service delivery in low and middle income countries identified a wide range of successful implementation strategies (supplementary table B). 8 Even in the most resource constrained environments, measuring change, informing stakeholders, and using information to guide decision making were found to be critical to successful implementation.

Implementation influencing variables

Other factors that influence implementation may need to be considered in implementation research. Sabatier summarised a set of such factors that influence policy implementation (clarity of objectives, causal theory, implementing personnel, support of interest groups, and managerial authority and resources). 9

The large array of contextual factors that influence implementation, interact with each other, and change over time highlights the fact that implementation often occurs as part of complex adaptive systems. 10 Some implementation strategies are particularly suitable for working in complex systems. These include strategies to provide feedback to key stakeholders and to encourage learning and adaptation by implementing agencies and beneficiary groups. Such strategies have implications for research, as the study methods need to be sufficiently flexible to account for changes or adaptations in what is actually being implemented. 8 11 Research designs that depend on a single and fixed intervention, such as a typical randomised controlled trial, are not appropriate for studying phenomena that change, especially when they change in unpredictable and variable ways.

Another implication of studying complex systems is that the research may need to use multiple methods and different sources of information to understand an implementation problem. Because implementation activities and effects are not usually static or linear processes, research designs often need to be able to observe and analyse these sometimes iterative and changing elements at several points in time and to consider unintended consequences.

Implementation research questions

As in other types of health systems research, the research question is the king in implementation research. Implementation research takes a pragmatic approach, placing the research question (or implementation problem) as the starting point to inquiry; this then dictates the research methods and assumptions to be used. Implementation research questions can cover a wide variety of topics and are frequently organised around theories of change or the type of research objective (examples are in supplementary table C). 12 13

Implementation research can overlap with other types of research used in medicine and public health, and the distinctions are not always clear cut. A range of implementation research exists, based on the centrality of implementation in the research question, the degree to which the research takes place in a real world setting with routine populations, and the role of implementation strategies and implementation variables in the research (figure).

Spectrum of implementation research 33


A more detailed description of the research question can help researchers and practitioners to determine the type of research methods that should be used. In table 2, we break down the research question first by its objective: to explore, describe, influence, explain, or predict. This is followed by a typical implementation research question based on each objective. Finally, we describe a set of research methods for each type of research question.

Table 2 Type of implementation research objective, implementation question, and research methods
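The objective-to-methods breakdown in table 2 can be expressed as a simple lookup when planning a study. In this sketch the method lists are our own illustrative condensation, not the full contents of the table:

```python
# Illustrative condensation of table 2: research objective -> candidate
# method families. The lists are our shorthand, not the table's full content.
METHODS_BY_OBJECTIVE = {
    "explore": ["in-depth interviews", "focus groups", "case studies"],
    "describe": ["cross sectional surveys", "observational studies"],
    "influence": ["pragmatic trials", "quasi-experimental designs"],
    "explain": ["mixed methods", "process evaluation"],
    "predict": ["modelling", "simulation"],
}

def suggest_methods(objective):
    """Return candidate method families for a stated research objective."""
    key = objective.strip().lower()
    if key not in METHODS_BY_OBJECTIVE:
        raise ValueError(f"unknown objective: {objective!r}")
    return METHODS_BY_OBJECTIVE[key]

print(suggest_methods("influence"))  # ['pragmatic trials', 'quasi-experimental designs']
```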

Much of evidence based medicine is concerned with the objective of influence, or whether an intervention produces an expected outcome, which can be broken down further by the level of certainty in the conclusions drawn from the study. The nature of the inquiry (for example, the amount of risk and considerations of ethics, costs, and timeliness), and the interests of different audiences, should determine the level of uncertainty. 8 14 Research questions concerning programmatic decisions about the process of an implementation strategy may justify a lower level of certainty for the manager and policy maker, using research methods that would support an adequacy or plausibility inference. 14 Where a high risk of harm exists and sufficient time and resources are available, a probability study design might be more appropriate, in which the result in an area where the intervention is implemented is compared with that in areas without the intervention, with a low probability of error (for example, P<0.05). These differences in the level of confidence affect the study design in terms of sample size and the need for concurrent or randomised comparison groups. 8 14
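To make the sample size trade-off concrete, a conventional two-proportion calculation shows how the required level of certainty drives the number of observations needed; the immunisation coverage figures below are hypothetical, not from the article:

```python
# Per-group sample size for detecting a difference between two proportions
# with a two-sided z-test (normal approximation). Illustrative only; the
# coverage figures below are hypothetical, not from the article.
from math import ceil, sqrt
from statistics import NormalDist

def sample_size_two_proportions(p1, p2, alpha=0.05, power=0.80):
    z = NormalDist().inv_cdf
    z_alpha = z(1 - alpha / 2)        # critical value for two-sided alpha
    z_beta = z(power)                 # quantile corresponding to the power
    pbar = (p1 + p2) / 2
    num = (z_alpha * sqrt(2 * pbar * (1 - pbar))
           + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(num / (p1 - p2) ** 2)

# Hypothetical: detect a rise in immunisation coverage from 60% to 75%.
print(sample_size_two_proportions(0.60, 0.75))        # 152 per group
print(sample_size_two_proportions(0.60, 0.75, 0.01))  # larger n at alpha=0.01
```

Tightening the acceptable error from P<0.05 to P<0.01 inflates the required sample, which is exactly the cost a plausibility or adequacy inference avoids.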

Implementation specific research methods

A wide range of qualitative and quantitative research methods can be used in implementation research (table 2). The box gives a set of basic questions to guide the design or reporting of implementation research that can be used across methods. More in-depth criteria have also been proposed to assess the external validity or generalisability of findings. 15 Some research methods have been developed specifically to deal with implementation research questions or are particularly suitable to implementation research, as identified below.

Key questions to assess research designs or reports on implementation research 33

Does the research clearly aim to answer a question concerning implementation?

Does the research clearly identify the primary audiences for the research and how they would use the research?

Is there a clear description of what is being implemented (for example, details of the practice, programme, or policy)?

Does the research involve an implementation strategy? If so, is it described and examined in its fullness?

Is the research conducted in a “real world” setting? If so, is the context and sample population described in sufficient detail?

Does the research appropriately consider implementation outcome variables?

Does the research appropriately consider context and other factors that influence implementation?

Does the research appropriately consider changes over time and the level of complexity of the system, including unintended consequences?
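The questions in the box can double as a screening checklist when reviewing designs or reports; a minimal sketch, with the wording condensed from the box:

```python
# The box's key questions, condensed into a screening checklist.
CHECKLIST = [
    "aims at a question concerning implementation",
    "identifies primary audiences and their use of the research",
    "describes what is being implemented",
    "describes and examines any implementation strategy in full",
    "real world setting, with context and sample described",
    "considers implementation outcome variables",
    "considers context and other influencing factors",
    "considers change over time, complexity, and unintended consequences",
]

def screen(report):
    """Return the checklist items a design or report fails.

    `report` maps checklist items to True/False judgments by the reviewer."""
    return [item for item in CHECKLIST if not report.get(item, False)]

print(screen({item: True for item in CHECKLIST}))  # [] -> nothing failed
```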

Pragmatic trials

Pragmatic trials, or practical trials, are randomised controlled trials in which the main research question focuses on the effectiveness of an intervention in a normal practice setting with the full range of study participants. 16 This may include pragmatic trials of new healthcare delivery strategies, such as integrated chronic care clinics or nurse run community clinics. This contrasts with typical randomised controlled trials, which look at the efficacy of an intervention in an “ideal” or controlled setting, with highly selected patients and standardised clinical outcomes, usually of a short term nature.

Effectiveness-implementation hybrid trials

Effectiveness-implementation hybrid designs are intended to assess the effectiveness of both an intervention and an implementation strategy. 7 These studies include components of an effectiveness design (for example, randomised allocation to intervention and comparison arms) but add the testing of an implementation strategy, which may also be randomised. This might include testing the effectiveness of a package of delivery and postnatal care in under-served areas, as well as testing several strategies for providing the care. Whereas pragmatic trials try to fix the intervention under study, effectiveness-implementation hybrids also intervene in and/or observe the implementation process as it actually occurs. This can be done by assessing implementation outcome variables.

Quality improvement studies

Quality improvement studies typically involve a set of structured and cyclical processes, often called the plan-do-study-act cycle, and apply scientific methods on a continuous basis to formulate a plan, implement the plan, and analyse and interpret the results, followed by an iteration of what to do next. 17 18 The focus might be on a clinical process, such as how to reduce hospital acquired infections in the intensive care unit, or on management processes, such as how to reduce waiting times in the emergency room. Guidelines exist on how to design and report such research—the Standards for Quality Improvement Reporting Excellence (SQUIRE). 17
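The plan-do-study-act cycle can be sketched as an iterative loop in which each cycle's findings feed the next plan; the plan/do/study/act functions below are placeholders for study-specific procedures, and the waiting time example is hypothetical:

```python
# Sketch of the plan-do-study-act (PDSA) cycle as an iterative loop.
# The plan/do/study/act callables are placeholders for study-specific steps.
def run_pdsa(baseline, plan, do, study, act, cycles=3):
    """Run `cycles` PDSA iterations, threading the current state through."""
    state = baseline
    for _ in range(cycles):
        proposal = plan(state)        # Plan: formulate a change to test
        result = do(proposal)         # Do: implement it on a small scale
        findings = study(result)      # Study: analyse and interpret results
        state = act(state, findings)  # Act: decide what to carry forward
    return state

# Hypothetical example: iteratively chipping away at a waiting time (minutes).
final = run_pdsa(
    baseline=60,
    plan=lambda t: t - 5,             # propose a 5 minute reduction
    do=lambda target: target + 1,     # observed time overshoots the target
    study=lambda observed: observed,  # here, "analysis" just reads the value
    act=lambda state, obs: min(state, obs),  # keep the better level
)
print(final)  # 48 after three cycles (60 -> 56 -> 52 -> 48)
```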

Speroff and O’Connor describe a range of plan-do-study-act research designs, noting that they have in common the assessment of responses measured repeatedly and regularly over time, either in a single case or with comparison groups. 18 Balanced scorecards integrate performance measures across a range of domains and feed into regular decision making. 19 20 Standardised guidance for using good quality health information systems and health facility surveys has been developed and often provides the sources of information for these quasi-experimental designs. 21 22 23
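The simplest analysis of such repeated and regular measurements is a before-and-after comparison of an indicator's mean level; a minimal sketch with hypothetical monthly data:

```python
# Simplest repeated-measures comparison: mean level of an indicator before
# versus after a change. The monthly data below are hypothetical.
def level_change(series, change_point):
    """Difference in mean level after vs before `change_point`."""
    before, after = series[:change_point], series[change_point:]
    mean = lambda xs: sum(xs) / len(xs)
    return mean(after) - mean(before)

# Emergency room waiting times (minutes); intervention begins at month 6.
waits = [62, 60, 61, 63, 59, 61, 52, 50, 49, 51, 48, 50]
print(level_change(waits, 6))  # -11.0 (mean fell from 61.0 to 50.0)
```

A full quasi-experimental analysis would also model underlying trends (for example, segmented regression of an interrupted time series) rather than comparing means alone.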

Participatory action research

Participatory action research refers to a range of research methods that emphasise participation and action (that is, implementation), using methods that involve iterative processes of reflection and action, “carried out with and by local people rather than on them.” 24 In participatory action research, a distinguishing feature is that the power and control over the process rests with the participants themselves. Although most participatory action methods involve qualitative methods, quantitative and mixed methods techniques are increasingly being used, such as for participatory rural appraisal or participatory statistics. 25 26

Mixed methods

Mixed methods research uses both qualitative and quantitative methods of data collection and analysis in the same study. Although not designed specifically for implementation research, mixed methods are particularly suitable because they provide a practical way to understand multiple perspectives, different types of causal pathways, and multiple types of outcomes—all common features of implementation research problems.

Many different schemes exist for describing different types of mixed methods research, on the basis of the emphasis of the study, the sampling schemes for the different components, the timing and sequencing of the qualitative and quantitative methods, and the level of mixing between the qualitative and quantitative methods. 27 28 Broad guidance on the design and conduct of mixed methods designs is available. 29 30 31 A scheme for good reporting of mixed methods studies involves describing the justification for using a mixed methods approach to the research question; describing the design in terms of the purpose, priority, and sequence of methods; describing each method in terms of sampling, data collection, and analysis; describing where the integration has occurred, how it has occurred, and who has participated in it; describing any limitation of one method associated with the presence of the other method; and describing any insights gained from mixing or integrating methods. 32

Implementation research aims to cover a wide set of research questions, implementation outcome variables, factors affecting implementation, and implementation strategies. This paper has identified a range of qualitative, quantitative, and mixed methods that can be used according to the specific research question, as well as several research designs that are particularly suited to implementation research. Further details of these concepts can be found in a new guide developed by the Alliance for Health Policy and Systems Research. 33

Summary points

Implementation research has its origins in many disciplines and is usefully defined as scientific inquiry into questions concerning implementation—the act of fulfilling or carrying out an intention

In health research, these intentions can be policies, programmes, or individual practices (collectively called interventions)

Implementation research seeks to understand and work in “real world” or usual practice settings, paying particular attention to the audience that will use the research, the context in which implementation occurs, and the factors that influence implementation

A wide variety of qualitative, quantitative, and mixed methods techniques can be used in implementation research, which are best selected on the basis of the research objective and specific questions related to what, why, and how interventions work

Implementation research may examine strategies that are specifically designed to improve the carrying out of health interventions or assess variables that are defined as implementation outcomes

Implementation outcomes include acceptability, adoption, appropriateness, feasibility, fidelity, implementation cost, coverage, and sustainability

Cite this as: BMJ 2013;347:f6753

Contributors: All authors contributed to the conception and design, analysis and interpretation, drafting the article, or revising it critically for important intellectual content, and all gave final approval of the version to be published. NT had the original idea for the article, which was discussed by the authors (except OA) as well as George Pariyo, Jim Sherry, and Dena Javadi at a meeting at the World Health Organization (WHO). DHP and OA did the literature reviews, and DHP wrote the original outline and the draft manuscript, tables, and boxes. OA prepared the original figure. All authors reviewed the draft article and made substantial revisions to the manuscript. DHP is the guarantor.

Funding: Funding was provided by the governments of Norway and Sweden and the UK Department for International Development (DFID) in support of the WHO Implementation Research Platform, which financed a meeting of authors and salary support for NT. DHP is supported by the Future Health Systems research programme consortium, funded by DFID for the benefit of developing countries (grant number H050474). The funders played no role in the design, conduct, or reporting of the research.

Competing interests: All authors have completed the ICMJE uniform disclosure form at www.icmje.org/coi_disclosure.pdf and declare: support for the submitted work as described above; NT and TA are employees of the Alliance for Health Policy and Systems Research at WHO, which is supporting their salaries to work on implementation research; no financial relationships with any organisations that might have an interest in the submitted work in the previous three years; no other relationships or activities that could appear to have influenced the submitted work.

Provenance and peer review: Invited by journal; commissioned by WHO; externally peer reviewed.

  • 1. Brownson RC, Colditz GA, Proctor EK, eds. Dissemination and implementation research in health: translating science to practice. Oxford University Press, 2012.
  • 2. Ciliska D, Robinson P, Armour T, Ellis P, Brouwers M, Gauld M, et al. Diffusion and dissemination of evidence-based dietary strategies for the prevention of cancer. Nutr J 2005;4(1):13.
  • 3. Remme JHF, Adam T, Becerra-Posada F, D’Arcangues C, Devlin M, Gardner C, et al. Defining research to improve health systems. PLoS Med 2010;7:e1001000.
  • 4. McKibbon KA, Lokker C, Mathew D. Implementation research. 2012. http://whatiskt.wikispaces.com/Implementation+Research.
  • 5. The compact edition of the Oxford English dictionary. Oxford University Press, 1971.
  • 6. Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, et al. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health 2010;38:65-76.
  • 7. Curran GM, Bauer M, Mittman B, Pyne JM, Stetler C. Effectiveness-implementation hybrid designs: combining elements of clinical effectiveness and implementation research to enhance public health impact. Med Care 2012;50:217-26.
  • 8. Peters DH, El-Saharty S, Siadat B, Janovsky K, Vujicic M, eds. Improving health services in developing countries: from evidence to action. World Bank, 2009.
  • 9. Sabatier PA. Top-down and bottom-up approaches to implementation research. J Public Policy 1986;6(1):21-48.
  • 10. Paina L, Peters DH. Understanding pathways for scaling up health services through the lens of complex adaptive systems. Health Policy Plan 2012;27:365-73.
  • 11. Gilson L, ed. Health policy and systems research: a methodology reader. World Health Organization, 2012.
  • 12. Tabak RG, Khoong EC, Chambers DA, Brownson RC. Bridging research and practice: models for dissemination and implementation research. Am J Prev Med 2012;43:337-50.
  • 13. Improved Clinical Effectiveness through Behavioural Research Group (ICEBeRG). Designing theoretically-informed implementation interventions. Implement Sci 2006;1:4.
  • 14. Habicht JP, Victora CG, Vaughan JP. Evaluation designs for adequacy, plausibility, and probability of public health programme performance and impact. Int J Epidemiol 1999;28:10-8.
  • 15. Green LW, Glasgow RE. Evaluating the relevance, generalization, and applicability of research. Eval Health Prof 2006;29:126-53.
  • 16. Zwarenstein M, Treweek S, Gagnier JJ, Altman DG, Tunis S, Haynes B, et al, for the CONSORT and Pragmatic Trials in Healthcare (Practihc) Groups. Improving the reporting of pragmatic trials: an extension of the CONSORT statement. BMJ 2008;337:a2390.
  • 17. Davidoff F, Batalden P, Stevens D, Ogrinc G, Mooney SE, for the SQUIRE Development Group. Publication guidelines for quality improvement in health care: evolution of the SQUIRE project. Qual Saf Health Care 2008;17(suppl I):i3-9.
  • 18. Speroff T, O’Connor GT. Study designs for PDSA quality improvement research. Q Manage Health Care 2004;13(1):17-32.
  • 19. Peters DH, Noor AA, Singh LP, Kakar FK, Hansen PM, Burnham G. A balanced scorecard for health services in Afghanistan. Bull World Health Organ 2007;85:146-51.
  • 20. Edward A, Kumar B, Kakar F, Salehi AS, Burnham G, Peters DH. Configuring balanced scorecards for measuring health systems performance: evidence from five years’ evaluation in Afghanistan. PLoS Med 2011;7:e1001066.
  • 21. Health Facility Assessment Technical Working Group. Profiles of health facility assessment methods. MEASURE Evaluation, USAID, 2008.
  • 22. Hotchkiss D, Diana M, Foreit K. How can routine health information systems improve health systems functioning in low-resource settings? Assessing the evidence base. MEASURE Evaluation, USAID, 2012.
  • 23. Lindelow M, Wagstaff A. Assessment of health facility performance: an introduction to data and measurement issues. In: Amin S, Das J, Goldstein M, eds. Are you being served? New tools for measuring service delivery. World Bank, 2008:19-66.
  • 24. Cornwall A, Jewkes R. What is participatory research? Soc Sci Med 1995;41:1667-76.
  • 25. Mergler D. Worker participation in occupational health research: theory and practice. Int J Health Serv 1987;17:151.
  • 26. Chambers R. Revolutions in development inquiry. Earthscan, 2008.
  • 27. Creswell JW, Plano Clark VL. Designing and conducting mixed methods research. Sage Publications, 2011.
  • 28. Tashakkori A, Teddlie C. Mixed methodology: combining qualitative and quantitative approaches. Sage Publications, 2003.
  • 29. Leech NL, Onwuegbuzie AJ. Guidelines for conducting and reporting mixed research in the field of counseling and beyond. Journal of Counseling and Development 2010;88:61-9.
  • 30. Creswell JW. Mixed methods procedures. In: Research design: qualitative, quantitative and mixed methods approaches. 3rd ed. Sage Publications, 2009.
  • 31. Creswell JW, Klassen AC, Plano Clark VL, Clegg Smith K. Best practices for mixed methods research in the health sciences. National Institutes of Health, Office of Behavioral and Social Sciences Research, 2011.
  • 32. O’Cathain A, Murphy E, Nicholl J. The quality of mixed methods studies in health services research. J Health Serv Res Policy 2008;13:92-8.
  • 33. Peters DH, Tran N, Adam T, Ghaffar A. Implementation research in health: a practical guide. Alliance for Health Policy and Systems Research, World Health Organization, 2013.
  • Rogers EM. Diffusion of innovations. 5th ed. Free Press, 2003.
  • Carroll C, Patterson M, Wood S, Booth A, Rick J, Balain S. A conceptual framework for implementation fidelity. Implement Sci 2007;2:40.
  • Victora CG, Schellenberg JA, Huicho L, Amaral J, El Arifeen S, Pariyo G, et al. Context matters: interpreting impact findings in child survival evaluations. Health Policy Plan 2005;20(suppl 1):i18-31.


  • Open access
  • Published: 29 June 2021

Pragmatic approaches to analyzing qualitative data for implementation science: an introduction

  • Shoba Ramanadhan (ORCID: orcid.org/0000-0003-0650-9433) 1,
  • Anna C. Revette 2,
  • Rebekka M. Lee 1 &
  • Emma L. Aveling 3

Implementation Science Communications volume 2, Article number: 70 (2021)


Qualitative methods are critical for implementation science as they generate opportunities to examine complexity and include a diversity of perspectives. However, it can be a challenge to identify the approach that will provide the best fit for achieving a given set of practice-driven research needs. After all, implementation scientists must find a balance between speed and rigor, reliance on existing frameworks and new discoveries, and inclusion of insider and outsider perspectives. This paper offers guidance on taking a pragmatic approach to analysis, which entails strategically combining and borrowing from established qualitative approaches to meet a study’s needs, typically with guidance from an existing framework and with explicit research and practice change goals.

Section 1 offers a series of practical questions to guide the development of a pragmatic analytic approach. These include examining the balance of inductive and deductive procedures, the extent to which insider or outsider perspectives are privileged, study requirements related to data and products that support scientific advancement and practice change, and strategic resource allocation. This is followed by an introduction to three approaches commonly considered for implementation science projects: grounded theory, framework analysis, and interpretive phenomenological analysis, highlighting core analytic procedures that may be borrowed for a pragmatic approach. Section 2 addresses opportunities to ensure and communicate rigor of pragmatic analytic approaches. Section 3 provides an illustrative example from the team’s work, highlighting how a pragmatic analytic approach was designed and executed and the diversity of research and practice products generated.
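Charting, the core procedure of framework analysis mentioned above, organises coded excerpts into a case-by-code matrix for comparison across cases. A minimal sketch, in which the codebook (borrowing determinant-style names of the kind found in IS frameworks) and the excerpts are entirely hypothetical:

```python
# Sketch of framework-analysis "charting": coded excerpts organised into a
# case-by-code matrix. The codebook and the excerpts are hypothetical.
CODEBOOK = ["inner setting", "outer setting", "intervention characteristics"]

def chart(coded_excerpts):
    """Build {case: {code: [excerpts]}} from (case, code, excerpt) triples."""
    matrix = {}
    for case, code, excerpt in coded_excerpts:
        if code not in CODEBOOK:
            raise ValueError(f"code not in codebook: {code!r}")
        row = matrix.setdefault(case, {c: [] for c in CODEBOOK})
        row[code].append(excerpt)
    return matrix

matrix = chart([
    ("clinic A", "inner setting", "staff turnover slowed the roll-out"),
    ("clinic A", "outer setting", "new reimbursement rules helped"),
    ("clinic B", "inner setting", "leadership championed the change"),
])
print(sorted(matrix))  # ['clinic A', 'clinic B']
```

Keeping empty cells in the matrix makes gaps in the data visible, which is one reason charting supports the rapid, framework-guided analysis the authors describe.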

As qualitative inquiry gains prominence in implementation science, it is critical to take advantage of qualitative methods’ diversity and flexibility. This paper furthers the conversation regarding how to strategically mix and match components of established qualitative approaches to meet the analytic needs of implementation science projects, thereby supporting high-impact research and improved opportunities to create practice change.


Contributions to the literature

Qualitative methods are increasingly being used in implementation science, yet many researchers are new to these methods or unaware of the flexibility afforded by applied qualitative research.

Implementation scientists can benefit from guidance on creating a pragmatic approach to analysis, which includes the strategic combining and borrowing from established approaches to meet a given study’s needs, typically with guidance from an implementation science framework and explicit research and practice change goals.

Through practical questions and examples, we provide guidance for using pragmatic analytic approaches to meet the needs and constraints of implementation science projects while maintaining and communicating the work’s rigor.

Implementation science (IS) is truly pragmatic at its core, answering questions about how existing evidence can be best translated into practice to accelerate impact on population health and health equity. Qualitative methods are critical to support this endeavor as they support the examination of the dynamic context and systems into which evidence-based interventions (EBIs) are integrated — addressing the “hows and whys” of implementation [ 1 ]. Numerous IS frameworks highlight the complexity of the systems in which implementation efforts occur and the uncertainty regarding how various determinants interact to produce multi-level outcomes [ 2 ]. With that lens, it is unsurprising that diverse qualitative methodologies are receiving increasing attention in IS as they allow for an in-depth understanding of complex processes and interactions [ 1 , 3 , 4 ]. Given the wide variety of possible analytic approaches and techniques, an important question is which analytic approach best fits a given set of practice-driven research needs. Thoughtful design is needed to align research questions and objectives, the nature of the subject matter, the overall approach, the methods (specific tools and techniques used to achieve research goals, including data collection procedures), and the analytic strategies (including procedures used for exploring and interpreting data) [ 5 , 6 ]. Achieving this kind of alignment is often described as “fit,” “methodological integrity,” or “internal coherence” [ 3 , 7 , 8 ]. Tailoring research designs to the unique constellation of these considerations in a given study may also require creative adaptation or innovation of analytic procedures [ 7 ]. Yet, for IS researchers newer to qualitative approaches, a lack of understanding of the range of relevant options may limit their ability to effectively connect qualitative approaches and research goals.

For IS studies, several factors further complicate the selection of analytic approaches. First, there is a tension between the speed with which IS must move to be relevant and the need to conduct rigorous research. Second, though qualitative research is often associated with attempts to generate new theories, qualitative IS studies’ goals may also include elaborating conceptual definitions, creating classifications or typologies, and examining mechanisms and associations [ 9 ]. Given the wealth of existing IS frameworks and models, covering determinants, processes, and outcomes [ 10 ], IS studies often focus on extending or applying existing frameworks. Third, as an applied field, IS work usually entails integrating different kinds of “insider” and “outsider” expertise to support implementation or practice change [ 11 ]. Fourth, diverse traditions have contributed to the new field of IS, including agriculture, operations research, public health, medicine, anthropology, sociology, and more [ 12 ]. The diversity of disciplines among IS researchers can bring a wealth of complementary perspectives but may also pose challenges in communicating about research processes.

Pragmatic approaches to qualitative analysis are likely valuable for IS researchers yet have not received enough attention in the IS literature to support researchers in using them confidently. By pragmatic approaches, we mean strategic combining and borrowing from established qualitative approaches to meet the needs of a given IS study, often with guidance from an IS framework and with clear research and practice change goals. Pragmatic approaches are not new, but they receive less attention in qualitative research overall and are not always clearly explicated in the literature [ 9 ]. Part of the challenge in using pragmatic approaches is the lack of guidance on how to mix and match components of established approaches in a coherent, credible manner.

Our motivation in offering this guidance reflects our experiences as researchers, collaborators, and teachers connecting qualitative methods and IS research questions. The author team includes two behavioral scientists who conduct stakeholder-engaged implementation science and regularly utilize qualitative approaches (SR and RL). The team also includes a sociologist and a social psychologist who were trained in qualitative methods and have rich expertise with health services and implementation research (AR and EA). Through conducting qualitative IS studies and supporting students and colleagues new to qualitative approaches, we noticed a regularly occurring set of concerns and queries. Many questions seem to stem from a sense that there is a singular, “right” way to conduct qualitative projects. Such concerns are often amplified by fear that deviation from rigid adherence to established sets of procedures may jeopardize the (perceived or actual) rigor of the work. While the appeal of recipe-like means of ensuring rigor is understandable, fixation on compliance with “established” approaches overlooks the fact that versions of recognizable, named approaches (e.g., grounded theory) often use different procedures [ 7 ]. As Braun and Clarke suggest, this “hallowed quest” for a singular, ideal approach leads many researchers astray and risks limiting appropriate and necessary adaptations and innovations in methods [ 13 ]. IS researchers seeking to broaden the range of approaches they can apply should take comfort that there is “no single right way to do qualitative data analysis […]. Much depends on the purpose of the research, and it is important that the proposed method of analysis is carefully considered in planning the research, and is integrated from the start with other parts of the research, rather than being an afterthought.” [ 14 ]. 
At the same time, given the wealth of traditions represented in the IS community, it can be difficult for researchers to effectively ensure and convey the quality and rigor of their work. This paper aims to serve as a resource for IS researchers seeking innovative and accessible approaches to qualitative research. We present suggestions for developing and communicating approaches to analysis that are the right “fit” for complex IS research projects and demonstrate rigor and quality.

Accordingly, section 1 offers guidance on identifying an analytic approach that aligns with study goals and allows for practical constraints. We describe three approaches commonly considered for IS projects: grounded theory, framework analysis, and interpretive phenomenological analysis, highlighting core elements that researchers can borrow to create a tailored, pragmatic approach. Section 2 addresses opportunities to ensure and communicate the rigor of pragmatic analytic approaches. Section 3 provides an illustrative example from the team’s work, describing the design and execution of a pragmatic analytic approach and the diversity of research and practice products generated.

Section 1: ensuring fit between research goals, practical constraints, and analytic approaches

Decision-making about all aspects of research design, including analysis, entails judgment about “fit.” Researchers need not identify a single analytic approach and attempt to force its strict application, regardless of fit. Indeed, the flexible, study-specific combination of design elements is a hallmark of applied qualitative research practice [ 9 ]. Relevant considerations for fit include the inquiry’s purpose and nature of the subject matter; the diversity of intended audiences for findings; the criteria used to judge the quality and practical value of the results; and the research context (including characteristics of the setting, participants, and investigators). Other important considerations relate to constraints of available resources (e.g., funding, time, and staff) and access to relevant participants [ 3 ]. We contend that in the applied IS setting, finding an appropriate fit often includes borrowing procedures from different approaches to create a pragmatic, hybrid approach. A pragmatic approach also addresses the IS-specific tensions outlined above, i.e., a need to conduct research that is time-bounded, engages with theories/frameworks/models, supports application in practice, and speaks to a diversity of colleagues. To promote goals of achieving fit and internal coherence in light of IS-specific requirements, we offer the considerations above and additional guiding questions for selecting analytic procedures to create a pragmatic approach, as summarized in Fig. 1 .

Figure 1. Developing a pragmatic qualitative data analysis approach for IS: key considerations for selection of analytic procedures

Key questions include the following:

What is the appropriate balance of inductive and deductive analytic procedures given the research goals?

A deductive process emphasizes themes and explanations derived from previously established concepts, pre-existing theories, or the relevant literature [ 9 ]. For example, an analysis that leans heavily on a deductive process might use the core components of the Exploration, Preparation, Implementation, Sustainment (EPIS) framework [ 15 ] to inform the coding structure and analysis. This process would support efforts to bound the investigation’s scope or expand an existing framework or model [ 16 ]. On the other hand, rather than trying to fit data with pre-existing concepts or theory, an inductive process generates interpretation and understanding that is primarily grounded in and driven by the data [ 9 ].

A balance of deductive and inductive processes might use an IS framework as a starting point for the deductive portion and then emphasize inductive processes to garner additional insight into topics not anticipated by the team or framework. For example, a selected IS framework may not attend sufficiently to the ways in which implementation context drives inequities [ 17 ]; if the dataset includes valuable information on this topic, including inductive processes would allow a fuller exploration of such patterns.

To what extent will the analysis emphasize the perspectives of participants vs. researchers?

An important decision relates to where the research team wishes to ground the analysis on the continuum between insider (emic) and outsider (etic) perspectives. The appropriate balance of insider/outsider orientation will reflect the overall research design and questions. Specific decisions about how to execute the desired balance through the analysis include, for example, the types of codes used and the value placed on participant reflections. As described below in section 2, value is often placed on incorporating participants’ feedback on the developing analysis, sometimes called “member checks” or “member reflections” [ 8 ].

An insider (emic) orientation represents findings in the ways that participants experience them, and insider knowledge is valued and privileged [ 9 ]. As an example, MacFarlane and colleagues used Normalization Process Theory and participatory approaches to identify appropriate implementation strategies to support the integration of evidence-based cross-cultural communication in European primary care settings. The participatory nature of the project offered the opportunity to gain “insider” insight rather than imposing and prioritizing the academic researchers’ “outsider” perspective. The insider (emic) orientation was operationalized in the analytic approach by using stakeholder co-analysis, which engages a wider set of stakeholders in the iterative processes of thematically analyzing the data [ 18 ]. By contrast, an outsider (etic) orientation represents the setting and participants in terms that the researcher or external audiences bring to the study and emphasizes the outsider’s perspective [ 9 ]. For instance, Van deGriend and colleagues conducted an analysis of influences on scaling-up group prenatal care. They used outsider (etic) codes that drew on researchers’ concepts and the literature to complement the insider (emic) codes that reflected participants’ concepts and views [ 19 ]. Balancing insider and outsider orientations is useful for pragmatic, qualitative IS studies, as it increases the potential for the study to highlight practice- and community-based expertise, build the literature, and ultimately support the integration of evidence into practice.

How can the analytic plan be designed to yield the outputs and products needed to support the integration of evidence into research and practice?

The research team can maximize efficiency and impact by intentionally connecting the analytic plan and the kind of products needed to meet scientific and practice goals (e.g., journal articles versus policy briefs). The ultimate use of the research outputs can also impact decisions around the breadth versus depth of the analysis. For example, in a recent implementation evaluation for community-clinical partnerships delivering EBIs in underserved communities, members of this author team (SR and RL) analyzed data to explore how partnership networks impacted implementation outcomes. At the same time, given the broader goal of supporting the establishment of health policies to support partnered EBI delivery, the team was also charged (by the state Department of Public Health) with capturing stories that would resonate with legislators regarding the need for broad, sustained investments [ 20 ]. We created a unique code to identify these stories during analysis and easily incorporate them into products for health department leaders. Given the practice-focused orientation, qualitative IS studies often support products for practitioners, e.g., “playbooks” to guide the process of implementing an intervention or novel care process [ 1 ].

How can analysis resources be used strategically in time-sensitive projects or where there is limited staff or resource availability?

IS research is often conducted by teams, and strategic analytic decisions can promote rigor while capitalizing on the potential for teamwork to speed up analysis. Deterding and Waters’ strategy of flexible coding, for example, offers such benefits [ 21 ]. Through an initial, framework-driven analytic step, large chunks of text can be quickly indexed deductively into predefined categories, such as the five Consolidated Framework for Implementation Research domains of intervention characteristics, outer setting, inner setting, characteristics of individuals, and process [ 22 ]. This is a more straightforward coding task appropriate for research assistants who have been trained in qualitative research and understand the IS framework. Then, during the second analytic coding step, more in-depth coding by research team members with more experience can ensure a deeper exploration of existing and new themes. This two-step process can also enable team members to lead different parts of an IS project with different goals, purposes, or audiences. Other innovations in team-based analyses are becoming increasingly common in IS, such as rapid ethnographic approaches [ 23 ].
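The two-step structure of flexible coding can be sketched as a simple data model. This is a minimal, hypothetical illustration, not software used in any study cited here: the segment IDs, excerpt text, and analytic codes are invented, and in practice both steps are carried out by trained analysts rather than by code.

```python
# Sketch of flexible coding's two steps as data structures.
# Step 1: broad deductive indexing into framework domains (here, CFIR).
# Step 2: finer analytic codes nested within the indexed excerpts.
from collections import defaultdict

CFIR_DOMAINS = {
    "intervention characteristics", "outer setting", "inner setting",
    "characteristics of individuals", "process",
}

def index_excerpt(index, excerpt_id, text, domain):
    """Step 1: index a large chunk of text into a predefined framework domain."""
    if domain not in CFIR_DOMAINS:
        raise ValueError(f"unknown CFIR domain: {domain}")
    index[domain].append({"id": excerpt_id, "text": text, "analytic_codes": []})

def apply_analytic_code(index, domain, excerpt_id, code):
    """Step 2: attach an interpretive, analyst-generated code within a domain."""
    for excerpt in index[domain]:
        if excerpt["id"] == excerpt_id:
            excerpt["analytic_codes"].append(code)

index = defaultdict(list)
index_excerpt(index, "int03-p2",
              "Leadership never mentioned the program in staff meetings...",
              "inner setting")
apply_analytic_code(index, "inner setting", "int03-p2", "leadership disengagement")
print(index["inner setting"][0]["analytic_codes"])  # ['leadership disengagement']
```

Keeping the broad index separate from the nested analytic codes is what lets less experienced team members complete step 1 quickly while experienced analysts revisit the same indexed excerpts in step 2.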

Building blocks for pragmatic analysis: examples from pattern-based analytic approaches

We offer illustrative examples of established analytic approaches in the following, highlighting their utility for IS and procedures that a pragmatic approach might usefully borrow and combine. These examples are not exhaustive; instead, they represent selected, pattern-based analytic approaches commonly used in IS. We aim to offer helpful anchor points that encompass the breadth and flexibility to apply to a wide range of IS projects [ 24 ] while also reflecting and speaking to a diversity of home disciplines, including sociology, applied policy, and psychology.

Grounded theory

Grounded theory is one of the most recognizable and influential approaches to qualitative analysis, although many variations have emerged since its introduction. Sociologists developed the approach, and the history and underlying philosophy are richly detailed elsewhere [ 25 , 26 ]. The central goal of this approach is to generate a theoretical explanation grounded in close inspection of the data and without a preconceived starting point. In many instances, the emphasis of grounded theory on a purely inductive orientation may be at odds with the focus in IS on the use of existing theories and frameworks, as highlighted by the QUALRIS group [ 4 ]. Additionally, a “full” grounded theory study, aligned with all its methodological assumptions and prescriptions (e.g., for sampling), is very demanding and time-consuming and may not be appropriate when timely turnaround in the service of research or practice change is required. For these reasons, a full grounded theory approach is rarely seen in the IS literature. Instead, IS researchers who use this approach are likely to use a modified version, sometimes described as “grounded theory lite” [ 6 ].

Core features and procedures characteristic of grounded theory that can be incorporated into a pragmatic approach include inductive coding techniques [ 27 ]. Open, inductive coding allows the researcher to “open up the inquiry” by examining the data to see what concepts best fit the data, without a preconceived explanation or framework [ 28 , 29 , 30 ]. Concepts and categories derived from open coding prompt the researcher to consider aspects of the research topic that were overlooked or unanticipated [ 31 ]. The intermediate stages of coding in grounded theory, referred to as axial or focused coding, build on the open coding and generate a more refined set of key categories and identify relationships between these categories [ 32 ]. Another useful procedure from grounded theory is the constant comparison method, in which data are collected, categorized, and compared to previously collected data. This continuing, iterative process prompts continuous engagement with the analysis process and reshapes and redefines ideas, which is useful for most qualitative studies [ 25 , 29 , 33 ]. Grounded theory also allows for community expertise and broader outsider perspectives to complement one another for a more comprehensive understanding of practices [ 34 ].

An illustration of the utility of grounded theory procedures comes from a study that explored how implementing organizations can influence local context to support the scale-up of mental health interventions in middle-income countries [ 35 ]. Using a multiple case study design, the study team used an analytic approach based on grounded theory to analyze data from 159 semi-structured interviews across five case sites. They utilized line-by-line open coding, constant comparison, and exploration of connections between themes in the process of developing an overarching theoretical framework. To increase rigor, they employed triangulation by data source and type and member reflections. Their team-based plan included multiple coders who negotiated conflicts and refined the thematic framework jointly. The output of the analysis was a model of processes by which entrepreneurial organizations could marshal and create resources to support the delivery of mental health interventions in limited-resource settings. By taking a divergent perspective (grounded in social entrepreneurship, in this case), the study output provided a basis for further inquiry into the design and scale-up of mental health interventions in middle-income countries.

Framework analysis

Framework analysis comes from the policy sphere and tends to have a practical orientation; this applied nature typically includes a more structured and deductive approach. The history, philosophical assumptions, and core processes are richly described by Ritchie and Spencer [ 36 ]. Framework analysis entails several features common to many qualitative analytic approaches, including defining concepts, creating typologies, and identifying patterns and relationships, but does so in a more predefined and structured way [ 37 , 38 ]. For example, the research team can create codes based on a framework selected in advance and can also include open-ended inquiry to capture additional insights. This analytic approach is well-suited to multi-disciplinary teams whose members have varying levels of experience with qualitative research [ 37 ]. It may require fewer staff resources and less time than some other approaches.

The framework analysis process includes five key steps [ 36 , 39 , 40 ]:

1. Familiarization: team members immerse themselves in the data, e.g., reading transcripts, taking notes, and listening to audio.

2. Identifying a coding framework: the research team develops a coding scheme, typically through an iterative process driven primarily by deductive coding (e.g., based on the IS framework).

3. Indexing: the team applies the coding structure to the entire data set.

4. Charting: the team rearranges the coded data and compares patterns between and within cases.

5. Mapping and interpretation: the team examines the range and nature of relationships across and between codes.

The team can use tables and diagrams to systematically synthesize and display the data based on predetermined concepts, frameworks, or areas of interest. While more structured than other approaches, framework analysis still presents a flexible design that combines well with other analytic approaches to achieve study objectives [ 37 ]. The case example given in section 3 offers a detailed application of a modified framework analytic approach.
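The charting step lends itself to a simple sketch: coded excerpts are rearranged into a case-by-code matrix so patterns can be compared within and between cases. The cases, codes, and summaries below are invented for illustration under the assumption of one summary cell per case-code pair; real charts are typically richer and built by analysts, not scripts.

```python
# Illustrative sketch of framework analysis "charting":
# rearrange indexed/coded data into a case-by-code matrix.
coded_data = [
    {"case": "Clinic A", "code": "leadership support", "summary": "strong champion"},
    {"case": "Clinic A", "code": "staffing",           "summary": "high turnover"},
    {"case": "Clinic B", "code": "leadership support", "summary": "no named champion"},
]

# Build the chart: rows are cases, columns are codes.
chart = {}
for item in coded_data:
    chart.setdefault(item["case"], {})[item["code"]] = item["summary"]

# Within-code comparison across cases, the raw material for
# step 5 (mapping and interpretation).
for case, row in sorted(chart.items()):
    print(case, "->", row.get("leadership support", "(not coded)"))
```

Reading down a column supports between-case comparison for one code; reading across a row supports within-case synthesis, which is exactly the back-and-forth the charting step is meant to enable.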

Interpretive phenomenological analysis (IPA)

Broadly, the purpose of a phenomenological inquiry is to understand the experiences and perceptions of individuals related to an occurrence of interest [ 41 , 42 ]. For example, a phenomenological inquiry might focus on implementers’ experiences with remote training to support implementing a new EBI, aiming to explore their views, how those changed over time, and why implementers reacted the way they did. Drawing on this tradition, IPA focuses specifically on particular individuals (or cases), understanding both the experience of individuals and the sense they are making of those experiences. With roots in psychology, this approach prioritizes the perspective of the participant, who is understood to be part of a broader system of interest; additional details about the philosophical underpinnings are available elsewhere [ 41 ]. Research questions are open and broad, taking an inductive, exploratory perspective. Samples are typically small and somewhat homogeneous as the emphasis is placed on an in-depth exploration of a small set of cases to identify patterns of interest [ 43 ]. Despite the smaller sample size, the deep, detailed analysis requires thoughtful and time-intensive engagement with the data. The resulting outputs can be useful to develop theories that attend to a particular EBI or IS-related process or to refine existing frameworks and models [ 44 ].

A useful example comes from a study that sought to understand resistance to using evidence-based guidelines from the perspective of physicians focused on providing clinical care [ 45 ]. The analysis drew on data collected from interviews of 11 physicians selected for their expertise and diversity across a set of sociodemographic characteristics. In the first phase of the analysis, the team analyzed the full-length interviews and identified key themes and the relationships between them. Particular attention was paid to implicit and explicit meanings, repeated ideas or phrases, and metaphor choices. Two authors conducted the analyses separately and then compared them to reach a consensus. In the second phase of the analysis, the team considered the group of 11 interviews as a set. Using an inductive perspective, the team identified superordinate (or high-level) themes that addressed the full dataset. The final phase of the analysis was to identify a single superordinate theme that would serve as the core description of clinical practice. The team engaged other colleagues from diverse backgrounds to support reflection and refinement of the analysis. The analysis yielded a theoretical model that focused on a core concept (clinical practice as engagement), broken out into five constituent parts addressing how clinicians experience their practice, separate from following external guidelines.

Section 2: ensuring and communicating rigor of a pragmatic analysis

Building on the discussion of pragmatic combination of approaches for a given study, we turn now to the question of ensuring and communicating rigor so that consumers of the scientific products will feel confident assessing, interpreting, and engaging with the findings [ 46 ]. This is of particular importance for IS given that the field tends to emphasize quantitative methods and there may be perceptions that qualitative research (and particularly research that must be completed more quickly) is less rigorous. To address those field-specific concerns and ensure pragmatic approaches are understood and valued, IS researchers must ensure and communicate the rigor of their approach. Given journal constraints, authors may consider using supplementary files to offer rich details to describe the study context and details of coding and analysis procedures (see for example, Aveling et al. [ 47 ]). We build on the work of Mays and Pope [ 38 ], Tracy [ 8 ], and others [ 48 , 49 , 50 , 51 , 52 ] to offer a shortlist of considerations for IS researchers to ensure pragmatic analysis is conducted with rigor and its quality and credibility are communicated (Table 1 ). We also recommend these articles as valuable resources for further reading.

Reporting checklists can help researchers ensure the details of the pragmatic analytic approach are communicated effectively, and inclusion of such a checklist is often required by journals for manuscript submission. Popular choices include the Standards for Reporting Qualitative Research (SRQR) and Consolidated Criteria for Reporting Qualitative Research (COREQ) checklists. These were developed based on reviews of other checklists and are intended to capture a breadth of information to increase transparency, rather than being driven by a philosophical underpinning regarding how to design rigorous qualitative research [ 53 , 54 ]. For that reason, researchers should use these checklists with a critical lens as they do not alone demonstrate rigor. Instead, they can be thought of as a flexible guide and support, without focusing solely on technical components at the expense of the broader qualitative expertise that drives the research effort [ 55 ].

Section 3: case example of a modified framework analysis approach

To illustrate the ideas presented above, we offer a recent example of work conducted by two authors (AR and SR) and colleagues [ 56 ]. The broad motivation for the study was to increase the use of EBIs in community-based organizations (CBOs) and faith-based organizations (FBOs) working with underserved communities. Our past work and the literature highlighted challenges in matching practitioner capacity (i.e., knowledge, motivation, skills, and resources) with the skillset required to use EBIs successfully [ 57 , 58 ]. The study utilized a participatory implementation science perspective, which offered a unique opportunity to integrate insider and outsider perspectives and increase the likelihood that solutions developed would reflect the realities of practice. The work was conducted in partnership with a Community Advisory Board and attempted to balance research and action [ 59 , 60 ].

The qualitative portion of the project had two primary goals. The research goal was to identify improvements to the design and delivery of capacity-building interventions for CBOs and FBOs working with underserved populations. The practice-related goal was to identify local training needs and refine an existing EBI capacity-building curriculum. We drew on the EPIS Framework [ 15 ] to support our exploration of multi-level factors that drive EBI implementation in social service settings. We conducted four focus group discussions with intended capacity-building recipients ( n = 27) and key informant interviews with community leaders ( n = 15). Given (1) the applied nature of the research and practice goals, (2) our reliance on an existing IS framework, (3) limited staff resources, and (4) a need to analyze data rapidly to support intervention refinement, we chose a modified framework analysis approach. Modifications included incorporating aspects of grounded theory, including open coding, to increase the emphasis on inductive perspectives. The team also modified the charting procedures, replacing tabular summaries with narrative summaries of coded data.

Analysis was conducted by three doctoral-level researchers with complementary training (IS, sociology, and nursing). We started by familiarizing ourselves with the data — the three researchers read a subset of the transcripts, with purposeful overlap in reading assignments to facilitate discussion. Then, we created the coding framework and indexed the data. We went back and forth between indexing and charting, starting with deductive codes based on the EPIS framework, and then using a more inductive open coding strategy to identify emergent codes that fell outside the EPIS framework, e.g., the importance of investing in resources that remain in the community. The new coding framework, with both inductive and deductive codes, was applied to all interview transcripts. Each transcript was independently coded by two of the three investigators, followed by coding comparison to address discrepancies. We used NVivo 12 software [ 61 ], which enabled the exploration and reorganization of data to examine patterns within specific codes and across the data set. We utilized narrative summaries to organize our findings. Finally, we revisited the relevant data to identify broad themes of interest. This step was collaborative and iterative, with each team member taking the lead on a subset of codes and themes that aligned with their expertise, and the interpretations were shared with the other research investigators and discussed. This “divide-and-conquer” tactic was similar to the Deterding and Waters example of flexible coding [ 21 ]. We used triangulation to explore perceptions by different groups of participants (e.g., leaders vs. program implementers and individuals representing CBOs vs. FBOs). This type of triangulation is sometimes referred to as “triangulation of data” and stands in contrast to triangulation between different methods [ 62 ].
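The double-coding comparison described above can be sketched as a small script that flags segments where two coders' code sets diverge, producing a discussion list for the consensus step. The coder labels, segment IDs, and codes here are invented examples, not data from the study.

```python
# Hedged sketch of a double-coding discrepancy check:
# surface segments where two independent coders disagree, for joint review.
coder_a = {"seg1": {"EPIS:outer context", "community investment"},
           "seg2": {"EPIS:inner context"}}
coder_b = {"seg1": {"EPIS:outer context"},
           "seg2": {"EPIS:inner context"}}

# Symmetric difference (^) yields codes applied by one coder but not the other.
discrepancies = {
    seg: coder_a[seg] ^ coder_b[seg]
    for seg in coder_a
    if coder_a[seg] != coder_b.get(seg, set())
}
print(discrepancies)  # {'seg1': {'community investment'}}
```

In a workflow like the one described, each flagged segment would be revisited by the coder pair to resolve whether the divergent code (here, the emergent "community investment" code) should be applied.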

Our analytic plan was informed by the participatory design of the larger project. At multiple points in the analytic process, we presented interpretations to the advisory board and then refined interpretations and subsequent steps of the analysis accordingly. This was critical because our use of an IS framework likely imposed an outsider’s perspective on the use of EBIs in practice and we wanted to ensure the interpretations reflected insider perspectives on the realities of practice. The incorporation of practice-based expertise in our analytic process also reflected the participatory nature of the research project. We note that advisory board members did not wish to analyze the data in-depth and instead preferred this manner of engagement.

To meet our research goals, we produced scientific publications that expanded the literature on capacity-building strategies to promote evidence-based prevention in CBOs and FBOs addressing health equity. The modified framework analysis approach allowed us to build on and extend the EPIS framework by allowing for framework-driven deductive coding and open, inductive coding. As an example, the EPIS framework highlights relationships between patient/client characteristics (within the “outer context” domain) and EBI fit (within the “innovation” domain). We added an emergent code to capture the wide range of resources CBO- and FBO-based practitioners needed to improve the fit between available EBIs and community needs. This included attention to the limitations of available EBIs to address the multi-level barriers to good health experienced by underserved communities. Participants highlighted the importance of solutions to these gaps coming not from external resources (such as those highlighted within the “bridging factors” domain of the framework), but instead from resources built and maintained within the community. Per the journal’s requirements, we presented the SRQR checklist to explain how we ensured a rigorous analysis.

To achieve practice goals, we drew on the rich dataset to refine the capacity-building intervention, from recruitment to the training components and ongoing supports. For example, we were able to create more compelling arguments for organizational leaders to send staff to the training and support the use of EBIs in their organizations, use language during trainings that better resonated with trainees, and include local examples related to barriers and facilitators to EBI use. We also revised programmatic offerings to include co-teaching by community members and created shorter, implementation-focused training opportunities. The balance of framework-driven, deductive processes, and open, inductive processes allowed us to capture patterns in anticipated and unanticipated content areas. This balance also allowed us to develop research briefs that provide high-level summaries that could be useful to other practitioners considering how best to invest limited professional development resources.

Conclusions

We encourage IS researchers to explore the diversity and flexibility of qualitative analytic approaches and combine them pragmatically to best meet their needs. We recognize that some approaches to analysis are tied to particular methodological orientations and others are not, but a pragmatic approach can offer the opportunity to combine analytic strategies and procedures. To do this successfully, it is essential for the research team to ensure fit, preserve quality and rigor, and provide transparent explanations connecting the analytic approach and findings so that others can assess and build on the research. We believe pragmatic approaches offer an important opportunity to make strategic analytic decisions, such as identifying an appropriate balance of insider and outsider perspectives, to extend current IS frameworks and models. Given the urgency to increase the utilization and utility of EBIs in practice settings, we see a natural fit with the pragmatist prompt to judge our research efforts based on whether or not the knowledge obtained serves our purposes [ 63 ]. In that spirit, the use of pragmatic approaches can support high-quality, efficient, practice-focused research, which can broaden the scope and ultimate impact of IS research.

Availability of data and materials

Not applicable

References

Hamilton AB, Finley EP. Qualitative methods in implementation research: an introduction. Psychiatry Res. 2019;280:112516. https://doi.org/10.1016/j.psychres.2019.112516 .

Tabak RG, Chambers D, Hook M, Brownson RC. The conceptual basis for dissemination and implementation research: lessons from existing models and frameworks. In: Brownson RC, Colditz GA, Proctor EK, editors. Dissemination and implementation research in health: translating science to practice. New York: Oxford University Press; 2018. p. 73–88.

Patton MQ. Qualitative research & evaluation methods: integrating theory and practice. Sage Publications; 2014.

QualRIS (Qualitative Research in Implementation Science). Qualitative methods in implementation science. Division of Cancer Control and Population Sciences, National Cancer Institute; 2019.

Creswell JW, Poth CN. Qualitative inquiry and research design: choosing among five approaches: Sage publications; 2016.

Braun V, Clarke V. Successful qualitative research: a practical guide for beginners. Sage Publications; 2013.

Levitt HM, Motulsky SL, Wertz FJ, Morrow SL, Ponterotto JG. Recommendations for designing and reviewing qualitative research in psychology: promoting methodological integrity. Qual Psychol. 2017;4(1):2–22. https://doi.org/10.1037/qup0000082 .

Tracy SJ. Qualitative quality: eight “big-tent” criteria for excellent qualitative research. Qualitative inquiry. 2010;16(10):837–51. https://doi.org/10.1177/1077800410383121 .

Green J, Thorogood N. Qualitative methods for health research. 4th ed. Thousand Oaks: SAGE; 2018.

Nilsen P. Making sense of implementation theories, models and frameworks. Implement Sci. 2015;10(1):53. https://doi.org/10.1186/s13012-015-0242-0 .

Aveling E-L, Zegeye DT, Silverman M. Obstacles to implementation of an intervention to improve surgical services in an Ethiopian hospital: a qualitative study of an international health partnership project. BMC health services research. 2016;16(1):393. https://doi.org/10.1186/s12913-016-1639-4 .

Dearing JW, Kee KF, Peng T. Historical roots of dissemination and implementation science. In: Brownson RC, Colditz GA, Proctor E, editors. Dissemination and implementation research in health: translating science to practice. 2nd ed. New York: Oxford University Press; 2018. p. 47–61.

Braun V, Clarke V. Can I use TA? Should I use TA? Should I not use TA? Comparing reflexive thematic analysis and other pattern-based qualitative analytic approaches. Counsel Psychother Res. 2021;21(1):37–47. https://doi.org/10.1002/capr.12360 .

Punch KF, Oancea A. Introduction to research methods in education: Sage; 2014.

Aarons GA, Hurlburt M, Horwitz SM. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Adm Policy Mental Health Mental Health Serv Res. 2011;38(1):4–23. https://doi.org/10.1007/s10488-010-0327-7 .

Miles MB, Huberman AM. Qualitative data analysis: an expanded sourcebook: Sage; 1994.

Baumann AA, Cabassa LJ. Reframing implementation science to address inequities in healthcare delivery. BMC Health Serv Res. 2020;20(1):190. https://doi.org/10.1186/s12913-020-4975-3 .

MacFarlane A, O’Donnell C, Mair F, O’Reilly-de Brún M, de Brún T, Spiegel W, et al. REsearch into implementation STrategies to support patients of different ORigins and language background in a variety of European primary care settings (RESTORE): study protocol. Implementation Science. 2012;7(1):111. https://doi.org/10.1186/1748-5908-7-111 .

Van De Griend KM, Billings DL, Frongillo EA, Messias DKH, Crockett AH, Covington-Kolb S. Core strategies, social processes, and contextual influences of early phases of implementation and statewide scale-up of group prenatal care in South Carolina. Eval Program Plann. 2020;79:101760. https://doi.org/10.1016/j.evalprogplan.2019.101760 .

Ramanadhan S, Daly J, Lee RM, Kruse G, Deutsch C. Network-based delivery and sustainment of evidence-based prevention in community-clinical partnerships addressing health equity: a qualitative exploration. Front Public Health. 2020;8:213. https://doi.org/10.3389/fpubh.2020.00213 .

Deterding NM, Waters MC. Flexible coding of in-depth interviews: a twenty-first-century approach. Soc Methods Res. 2018:0049124118799377.

Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implementation Science. 2009;4:50. https://doi.org/10.1186/1748-5908-4-50 .

Palinkas LA, Zatzick D. Rapid assessment procedure informed clinical ethnography (rapice) in pragmatic clinical trials of mental health services implementation: methods and applied case study. Administration and Policy in Mental Health and Mental Health Services Research. 2019;46(2):255–70. https://doi.org/10.1007/s10488-018-0909-3 .

Pistrang N, Barker C. Varieties of qualitative research: a pragmatic approach to selecting methods. 2012.

Glaser B, Strauss A. The discovery of grounded theory: strategies for qualitative research. Chicago: Aldine Publishing Company; 1967.

Birks M, Mills J. Grounded theory: a practical guide: Sage; 2015.

Varpio L, Martimianakis MA, Mylopoulos M. Qualitative research methodologies: embracing methodological borrowing, shifting and importing. Res Med Educ. 2015;18:245.

Strauss AL. Qualitative analysis for social scientists. Cambridge: Cambridge University Press; 1987. https://doi.org/10.1017/CBO9780511557842 .

Creswell JW. Qualitative inquiry and research design: choosing among five approaches. 3rd ed. Thousand Oaks: Sage Publications; 2012.

Burkholder GJ, Cox KA, Crawford LM, Hitchcock JH. Research design and methods: an applied guide for the scholar-practitioner: SAGE Publications, Incorporated; 2019.

Schutt RK. Investigating the social world: the process and practice of research. Thousand Oaks: Sage Publications; 2018.

Chun Tie Y, Birks M, Francis K. Grounded theory research: a design framework for novice researchers. SAGE Open Med. 2019;7:2050312118822927.

Curry L, Nunez-Smith M. Mixed methods in health sciences research: a practical primer: Sage Publications; 2014.

Hoare KJ, Buetow S, Mills J, Francis K. Using an emic and etic ethnographic technique in a grounded theory study of information use by practice nurses in New Zealand. Journal of Research in Nursing. 2013;18(8):720–31. https://doi.org/10.1177/1744987111434190 .

Kidd SA, Madan A, Rallabandi S, Cole DC, Muskat E, Raja S, et al. A multiple case study of mental health interventions in middle income countries: considering the science of delivery. PloS one. 2016;11(3):e0152083. https://doi.org/10.1371/journal.pone.0152083 .

Ritchie J, Spencer L. Qualitative data analysis for applied policy research. In: Huberman AM, Miles MB, editors. The qualitative researcher’s companion. Thousand Oaks: SAGE; 2002. p. 305–30.

Mays N, Pope C. Assessing quality in qualitative research. BMJ. 2000;320(7226):50–2. https://doi.org/10.1136/bmj.320.7226.50 .

Gale NK, Heath G, Cameron E, Rashid S, Redwood S. Using the framework method for the analysis of qualitative data in multi-disciplinary health research. BMC medical research methodology. 2013;13(1):117. https://doi.org/10.1186/1471-2288-13-117 .

Bonello M, Meehan B. Transparency and coherence in a doctoral study case analysis: reflecting on the use of NVivo within a “framework” approach. Qual Rep. 2019;24(3):483–99.

Eatough V, Smith JA. Interpretative phenomenological analysis. In: Willig C, Stainton-Rogers W, editors. The Sage handbook of qualitative research in psychology. Thousand Oaks: SAGE; 2008. p. 193–211.

McWilliam CL, Kothari A, Ward-Griffin C, Forbes D, Leipert B, Collaboration SWCCACHC. Evolving the theory and praxis of knowledge translation through social interaction: a social phenomenological study. Implement Sci. 2009;4(1):26. https://doi.org/10.1186/1748-5908-4-26 .

Smith JA, Shinebourne P. Interpretative phenomenological analysis: American Psychological Association; 2012.

Kislov R, Pope C, Martin GP, Wilson PM. Harnessing the power of theorising in implementation science. Implementation Science. 2019;14(1):103. https://doi.org/10.1186/s13012-019-0957-4 .

Saraga M, Boudreau D, Fuks A. Engagement and practical wisdom in clinical practice: a phenomenological study. Medicine, Health Care and Philosophy. 2019;22(1):41–52. https://doi.org/10.1007/s11019-018-9838-x .

Lincoln YS, Guba EG. Naturalistic inquiry. Beverly Hill: Sage; 1985.

Aveling E-L, Stone J, Sundt T, Wright C, Gino F, Singer S. Factors influencing team behaviors in surgery: a qualitative study to inform teamwork interventions. The Annals of thoracic surgery. 2018;106(1):115–20. https://doi.org/10.1016/j.athoracsur.2017.12.045 .

Waring J, Jones L. Maintaining the link between methodology and method in ethnographic health research. BMJ quality & safety. 2016;25(7):556–7. https://doi.org/10.1136/bmjqs-2016-005325 .

Ritchie J, Lewis J, Nicholls CM, Ormston R. Qualitative research practice: a guide for social science students and researchers. Sage Publications; 2013.

Patton MQ. Enhancing the quality and credibility of qualitative analysis. Health Serv Res. 1999;34(5 Pt 2):1189–208.

Barry CA, Britten N, Barber N, Bradley C, Stevenson F. Using reflexivity to optimize teamwork in qualitative research. Qual Health Res. 1999;9(1):26–44. https://doi.org/10.1177/104973299129121677 .

Booth A, Carroll C, Ilott I, Low LL, Cooper K. Desperately seeking dissonance: identifying the disconfirming case in qualitative evidence synthesis. Qual Health Res. 2013;23(1):126–41. https://doi.org/10.1177/1049732312466295 .

Tong A, Sainsbury P, Craig J. Consolidated criteria for reporting qualitative research (COREQ): a 32-item checklist for interviews and focus groups. Int J Qual Health Care. 2007;19(6):349–57. https://doi.org/10.1093/intqhc/mzm042 .

O’Brien BC, Harris IB, Beckman TJ, Reed DA, Cook DA. Standards for reporting qualitative research: a synthesis of recommendations. Acad Med. 2014;89(9):1245–51. https://doi.org/10.1097/ACM.0000000000000388 .

Barbour RS. Checklists for improving rigour in qualitative research: a case of the tail wagging the dog? BMJ. 2001;322(7294):1115–7. https://doi.org/10.1136/bmj.322.7294.1115 .

Ramanadhan S, Galbraith-Gyan K, Revette A, Foti A, James CR, Martinez-Dominguez VL, et al. Key considerations for designing capacity-building interventions to support evidence-based programming in underserved communities: a qualitative exploration. Translat Behav Med. 2021;11(2):452–61. https://doi.org/10.1093/tbm/ibz177 .

Ramanadhan S, Aronstein D, Martinez-Dominguez VL, Xuan Z, Viswanath K. Designing capacity-building supports to promote evidence-based programs in community-based organizations working with underserved populations. Progress in Community Health Partnerships. 2020;14(2):149–60. https://doi.org/10.1353/cpr.2020.0027 .

Leeman J, Calancie L, Hartman MA, Escoffery CT, Herrmann AK, Tague LE, et al. What strategies are used to build practitioners' capacity to implement community-based interventions and are they effective?: a systematic review. Implementation Science. 2015;10(1):80. https://doi.org/10.1186/s13012-015-0272-7 .

Ramanadhan S, Davis MM, Armstrong RA, Baquero B, Ko LK, Leng JC, et al. Participatory implementation science to increase the impact of evidence-based cancer prevention and control. Cancer Causes Control. 2018;29(3):363–9. https://doi.org/10.1007/s10552-018-1008-1 .

Minkler M, Salvatore AL, Chang C. Participatory approaches for study design and analysis in dissemination and implementation research. In: Brownson RC, Colditz GA, Proctor EK, editors. Dissemination and Implementation Research in Health. 2nd ed. New York: Oxford; 2018. p. 175–90.

QSR International Pty Ltd. NVivo qualitative data analysis software; Version 12. Melbourne, Australia. 2018.

Flick U. Triangulation in qualitative research. In: Flick U, von Kardorff E, Steinke I, editors. A companion to qualitative research. 2004. p. 178–83.

Rorty RM. Philosophy and social hope: Penguin UK; 1999.

Acknowledgements

We wish to thank Priscilla Gazarian, RN, PhD, for insightful feedback on a draft of the manuscript.

This work was conducted with support from the National Cancer Institute (P50 CA244433) and from Harvard Catalyst/National Center for Advancing Translational Sciences (UL1 TR002541). The content is solely the responsibility of the authors and does not necessarily represent the official views of Harvard Catalyst, Harvard University, or the National Institutes of Health.

Author information

Authors and Affiliations

Department of Social and Behavioral Sciences, Harvard T.H. Chan School of Public Health, Boston, MA, 02115, USA

Shoba Ramanadhan & Rebekka M. Lee

Division of Population Sciences, Dana-Farber Cancer Institute, 450 Brookline Ave, Boston, MA, 02215, USA

Anna C. Revette

Department of Health Policy and Management, Harvard T.H. Chan School of Public Health, Boston, MA, 02115, USA

Emma L. Aveling

Contributions

SR conceptualized the manuscript. SR, AR, RL, and AE co-wrote and edited the manuscript. The authors read and approved the final version.

Corresponding author

Correspondence to Shoba Ramanadhan.

Ethics declarations

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Ramanadhan, S., Revette, A.C., Lee, R.M. et al. Pragmatic approaches to analyzing qualitative data for implementation science: an introduction. Implement Sci Commun 2 , 70 (2021). https://doi.org/10.1186/s43058-021-00174-1

Received : 14 December 2020

Accepted : 11 June 2021

Published : 29 June 2021

DOI : https://doi.org/10.1186/s43058-021-00174-1

  • Implementation science
  • Qualitative
  • Practice-based

Implementation Science Communications

ISSN: 2662-2211

Stakeholder Engagement in Adoption, Implementation, and Sustainment of an Evidence-Based Intervention to Increase Mammography Adherence Among Low-Income Women

  • Open access
  • Published: 22 March 2021
  • Volume 37 , pages 1486–1495, ( 2022 )

  • Jennifer Holcomb   ORCID: orcid.org/0000-0001-7226-7642 1 ,
  • Gayla M. Ferguson 1 ,
  • Jiali Sun 1 ,
  • Gretchen H. Walton 1 &
  • Linda Highfield 1 , 2 , 3  

Multi-level organizational stakeholder engagement plays an important role across the research process in clinical settings. Stakeholders provide organization-specific adaptations of evidence-based interventions to ensure effective adoption, implementation, and sustainability. Stakeholder engagement strategies involve building mutual trust, providing clear communication, and seeking feedback. Using constructs from the Consolidated Framework for Implementation Research and the International Association for Public Participation spectrum, a conceptual framework was created to guide stakeholder engagement in an evidence-based intervention to increase mammography appointment adherence among underserved and low-income women. A document review was used to explore the alignment of the conceptual framework with intervention activities and stakeholder engagement strategies. The results indicate alignment with the conceptual framework constructs and demonstrate a real-world application of stakeholder engagement in a mammography evidence-based intervention. The conceptual framework and stakeholder engagement strategies can be applied across a range of community-based cancer programs and interventions, organizations, and clinical settings.

With renewed interest in patient-centered care, stakeholder engagement (SE) is in the spotlight [1, 2, 3, 4]. SE is a bidirectional partnership between researchers and patients, clinical and community partners, and other healthcare stakeholders to achieve a desired outcome [1, 3, 5]. Stakeholders can be engaged in a range of research activities, including planning, proposal development, data collection, data analysis, and dissemination of results [3, 6]. Previous research has shown that SE with clinical stakeholders who are not clinicians is infrequent [1], and that SE is more common in early research stages, such as topic planning, than in the implementation and dissemination stages [1]. Practical trainings and tools for advancing SE across the various research stages have also been limited to date [1]. Further examination is therefore needed to define the extent of SE across the research process.

The International Association for Public Participation (IAP2) spectrum is a framework for SE across multiple research stages [5, 7, 8]. The IAP2 framework outlines five participation phases hypothesized to influence SE and decision-making: Inform, Consult, Involve, Collaborate, and Empower [5, 8]. Implementation science is then helpful to identify specific mechanisms for successfully engaging multiple stakeholders within each phase. The Consolidated Framework for Implementation Research (CFIR) is a comprehensive conceptual framework composed of 39 constructs associated with effective intervention implementation, identified from implementation science theories and empirical studies [9, 10]. The constructs are organized into five major domains: Intervention Characteristics, Inner Setting, Outer Setting, Characteristics of Individuals involved in implementation, and the Implementation Process [9]. CFIR provides a pragmatic structure for engaging multiple stakeholders to promote effective program planning and implementation of evidence-based interventions (EBIs) [9, 10, 11].

Although there is an increasing focus on stakeholder-engaged studies, more research is needed to better understand the role of SE in breast cancer research [12]. Previous research to improve breast cancer screening suggests the need to implement patient-centered tools to relay technical and process knowledge to women seeking a mammogram [13, 14]. To ensure the effectiveness of these tools and stakeholder capacity to use them, stakeholders need to be engaged in the implementation and dissemination of mammography screening interventions [13, 14]. A research-to-practice gap exists in understanding how to effectively engage stakeholders in EBIs to improve mammography screening adherence, and few studies have examined how to effectively adapt and scale mammography screening EBIs within specific settings [15, 16]. SE is key to the dissemination and implementation of breast cancer EBIs across all research stages [17]. Multi-level SE can provide site-specific considerations for effective adaptation of EBIs into practice [17]. The Patient-Centered Outcomes Research Institute (PCORI) published guidelines for patient and stakeholder engagement in breast cancer research, organized around the themes Authenticity, Real-World Perspective, Mutual Trust, Plain Language, Equitable Partnerships, Relationship Building, Community Engagement, and Feedback [12]. Real-world application of these SE guidelines is needed to develop effective, appropriate programs and robust research results that address disparities in breast cancer screening and diagnosis [12].

Peace of Mind Program

Breast cancer is the second leading cause of cancer death in US women, with the highest mortality rates seen among Latina and African American women [18, 19]. While mammogram screening and early diagnosis reduce breast cancer morbidity and mortality, racial and economic disparities in mammography screening access and adherence persist, leading to delayed diagnosis and worsened prognosis [20, 21]. The goal of the Peace of Mind Program (PMP) was to bridge the research-to-practice gap through the dissemination and implementation of a multi-level EBI to increase mammography appointment adherence in underserved women [16, 22]. The program was a telephone-based reminder call intervention that assessed a woman’s readiness to attend a mammogram appointment and provided structured counseling for both cognitive and system barriers (e.g., paperwork and cost). A full description of PMP has been reported elsewhere [16, 22]. During EBI development, interviews and focus groups examining barriers and facilitators of mammography screening were conducted with African American women whom clinical partners identified as having missed a mammography appointment within the last 6 months [23]. Similar patient stakeholders reviewed and provided feedback on scripts for a previously adapted PMP program for underserved women. With clinical and organizational level stakeholders, intervention mapping (IM) was used to supplement the current structure and to develop an implementation intervention to facilitate program adoption, implementation, and maintenance in federally qualified health centers (FQHCs) and charity clinics [15, 16]. The IM process incorporates theory- and evidence-based health promotion planning and engages multi-level stakeholders throughout the planning and implementation process [24].
IM methods have been used to guide the design and implementation of EBIs, but the application of IM in the PMP was an innovative approach to the expansion and adaptation of a mammography screening intervention in FQHCs and charity clinics [16]. A planning group was created to guide program intervention planning and implementation [16]. The planning group included researchers as well as administrative staff and certified community health workers (CHWs) from The Breast Health Collaborative of Texas (BHCTexas), a nonprofit organization focused on ending breast cancer disparities with experience working on mammography screening programs in FQHCs and charity clinics. The planning group identified three levels of stakeholders at FQHCs and charity clinics across the BHCTexas membership network: Decision-Makers, Program Champions, and Patient Navigators. Decision-Makers made critical decisions related to participation in PMP and the implementation changes needed in their clinic, and each identified a Program Champion to be the key contact person for PMP in their clinic. The Program Champion’s primary responsibility was to help facilitate implementation at the clinic, promote PMP, and support new adopters. Patient Navigators, all of whom were CHWs, worked directly with patients to schedule mammogram appointments, provide reminder calls, and obtain patient consent for inclusion in the intervention.

Utilizing SE principles and CFIR constructs, we developed a conceptual framework to reflect SE in PMP. We compared our conceptual framework with the stakeholders’ experiences and a review of program documents. We describe the stakeholders targeted and SE strategies across program activities. We then aimed to identify SE barriers and facilitators with a focus on the readiness of the stakeholders to promote adoption, implementation, and sustainment of PMP.

PMP was implemented in the Greater Houston area across 25 clinical delivery locations, including three mobile mammography providers and sixteen FQHCs and safety-net clinics. A grouped stepped-wedge design was used, with three groups of clinics in each wedge moving from baseline to intervention at pre-set time periods [22]. Each group was made up of clinic Decision-Makers, Program Champions, and Patient Navigators from each clinic. PMP implementation began with an adoption meeting to recruit each clinic for PMP, followed by a site assessment meeting at each clinical location to determine current clinic mammogram program processes and capacity for PMP implementation. Each group of clinics participated in nine stakeholder committee meetings and trainings on EBI methods and research ethics, and received ongoing support from the BHCTexas CHWs and researchers. The stakeholder committee meetings focused on the review of program adaptation and implementation materials; the use of program scripts; stakeholder assessment of implementation readiness; stakeholder discussion of program adaptation recommendations; and implementation problem solving.
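The grouped stepped-wedge rollout described above can be sketched as a simple schedule generator. This is an illustrative sketch only: the function name, group sizes, and number of periods are hypothetical and not taken from the PMP protocol, which pre-set its own crossover periods.

```python
# Illustrative sketch of a grouped stepped-wedge schedule: every group
# starts in baseline and crosses over to intervention at a staggered,
# pre-set period, so all groups end in intervention.
# Group counts and period counts here are hypothetical, not from PMP.

def stepped_wedge_schedule(n_groups: int, n_periods: int) -> list[list[str]]:
    """Return a groups x periods grid of 'baseline'/'intervention' states.

    Group g (0-indexed) crosses over at period g + 1.
    """
    if n_periods < n_groups + 1:
        raise ValueError("need at least n_groups + 1 periods for a full wedge")
    return [
        ["intervention" if period > g else "baseline" for period in range(n_periods)]
        for g in range(n_groups)
    ]

schedule = stepped_wedge_schedule(n_groups=3, n_periods=4)
for g, row in enumerate(schedule, start=1):
    print(f"Group {g}: {row}")
```

The staggered crossover is what lets each group contribute both baseline and intervention observations, which is the design's main appeal for phased clinic rollouts.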

Conceptual Framework for Stakeholder Engagement

As shown in Fig. 1, the five phases outlined in the IAP2 framework, combined with constructs from CFIR, were used to guide SE in PMP [8, 9]. The Inform phase is an opportunity for researchers to teach stakeholders about the intervention and encourage recruitment. Providing information on the intervention source, evidence strength and quality, and relative advantage of the intervention helps improve stakeholders’ knowledge of the intervention, while addressing its complexity, design quality and packaging, and cost helps stakeholders make an informed decision about participation. The researchers sought feedback from stakeholders regarding implementation during the Consult phase. The constructs of adaptability, trialability, patient needs and resources, cosmopolitanism (i.e., the organization’s network with other organizations), implementation climate, culture, and structural characteristics were used to determine the methods of implementation. The stakeholders provided valuable information regarding external policy and incentives, peer pressure, networks, and communication styles; these constructs revealed the organizational factors that would facilitate or hinder successful implementation. Once stakeholders had provided their feedback on the barriers to and facilitators of implementation, work continued in the Involve phase to ensure that the plans and actions to leverage facilitators and overcome barriers were effective and accurately reflected the information provided by the stakeholders. Implementation climate, readiness, and adaptability were explored for each participating clinical site, as were the clinic staff’s individual stage of change, self-efficacy, identification with the organization, and knowledge and beliefs about the intervention. Throughout the Collaborate phase, researchers sought feedback from stakeholders on implementation struggles and successes and provided positive reinforcement as well as suggestions for improvement. In the Empower phase, researchers aimed to equip stakeholders with the skills and resources for intervention sustainment. The ultimate goal of PMP was for stakeholders to continue the intervention activities and to use the tools to implement additional organizational changes. In this phase, stakeholders gained confidence through attention to self-efficacy and individual stage of change and through executing, reflecting, and evaluating.

Figure 1

Combined conceptual framework for stakeholder engagement in the Peace of Mind Program (PMP), adapted from the International Association for Public Participation (IAP2) framework and constructs from the Consolidated Framework for Implementation Research (CFIR) [5, 8, 9]. CFIR domains indicated in each IAP2 phase: Intervention Characteristics (1); Inner Setting (2); Outer Setting (3); Characteristics of Individuals involved in intervention implementation (4); and Implementation Process (5). The graphic shows the IAP2 phases in a continuing circular flow, with the corresponding CFIR constructs and domains indicated for each phase
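The phase-to-construct mapping in Fig. 1 can be represented as a simple lookup table. The sketch below is hypothetical: the IAP2 phase names and CFIR construct names come from the text, but the abbreviated assignments shown are examples rather than the full 25-construct mapping used in the study.

```python
# A minimal, illustrative model of the combined framework: each IAP2
# phase is mapped to a short, example subset of CFIR constructs drawn
# from the narrative above (not the study's complete mapping).

IAP2_TO_CFIR = {
    "Inform": ["intervention source", "evidence strength and quality",
               "relative advantage", "complexity",
               "design quality and packaging", "cost"],
    "Consult": ["adaptability", "trialability", "patient needs and resources",
                "cosmopolitanism", "implementation climate", "culture",
                "structural characteristics"],
    "Involve": ["implementation climate", "readiness for implementation",
                "individual stage of change", "self-efficacy",
                "knowledge and beliefs about the intervention"],
    "Collaborate": ["engaging", "reflecting and evaluating"],
    "Empower": ["self-efficacy", "individual stage of change",
                "executing", "reflecting and evaluating"],
}

def constructs_for_phase(phase: str) -> list[str]:
    """Look up the CFIR constructs attended to in a given IAP2 phase."""
    return IAP2_TO_CFIR.get(phase, [])

print(constructs_for_phase("Inform"))
```

Encoding the framework this way makes the phase/construct correspondence explicit and easy to audit against the figure.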

Research on Stakeholder Engagement

A qualitative research design was used to examine the conceptual framework against PMP stakeholder experiences and program documents. Documents reviewed included published program articles, program records, and adaptation and implementation materials. The program records included the stakeholder enrollment letter; the PMP introduction webinar; stakeholder committee meeting agendas, minutes, and sign-in forms for each meeting; PowerPoint presentations reflecting the content of each meeting; and PowerPoint presentations from stakeholder trainings on EBI methods and research ethics. The program materials included a Stakeholder Committee Handbook, an Implementation Guide, and the Clinic Handbook included within the Stakeholder Handbook. The Stakeholder Handbook was provided to stakeholders through an online application; however, it was determined that a printed version of the materials should also be provided as a back-up and for those who prefer to use “flip-able” notebook pages. The Stakeholder Committee Handbook provided a PMP overview and described the committee onboarding process. The Implementation Guide contained instructions for clinical sites to implement PMP, templates for required staffing and scheduling appointments at clinics, and site assessment and implementation readiness checklists; it provided an overview of program delivery, administration, and evaluation. The Clinic Handbook gave each clinical implementation site a digestible introduction to the program components and timeline. It contained job descriptions for the clinic staff involved in the program, an overview of the communication techniques used, barrier scripts for patient interactions, cumulative instructional PowerPoint slide sets, and approved informed consent forms.

A thematic content analysis was used to code and analyze SE document data [25, 26]. Coding began with an a priori set of codes derived from the conceptual framework, which was grounded in the SE and implementation science literature, mapped to program activities. Open coding was then used to capture any codes not identified from these two sources. Coding included words and phrases used by stakeholders and tasks outlined in the documents. Two researchers initially identified 73 codes reflecting SE strategies within each PMP activity and IAP2 phase. The final 40 codes were considered in terms of their relationships to one another and were interpreted against the corresponding 25 CFIR constructs within the five IAP2 phases of the conceptual framework. The complete code list was also interpreted in terms of facilitators of and barriers to SE under the corresponding 25 CFIR constructs. Quality checks on the identified codes were conducted by two additional researchers who did not participate in the initial coding; they reviewed codes from each of the five IAP2 phases and double-checked their relevance and corresponding CFIR construct. Any discrepancies were resolved through iterative document review and discussion until consensus was reached.
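The double-coding and consensus step described above can be sketched in miniature. The excerpt IDs, construct labels, and data below are invented for illustration; the study resolved discrepancies through document review and discussion, not through any particular script.

```python
# Hypothetical sketch of a double-coding quality check: two coders
# independently assign a CFIR construct to each document excerpt,
# simple percent agreement is computed, and disagreements are flagged
# for consensus discussion. All data here are invented.

coder_a = {
    "excerpt_01": "Relative Advantage",
    "excerpt_02": "Leadership Engagement",
    "excerpt_03": "Self-Efficacy",
    "excerpt_04": "External Policy and Incentives",
}
coder_b = {
    "excerpt_01": "Relative Advantage",
    "excerpt_02": "Culture",
    "excerpt_03": "Self-Efficacy",
    "excerpt_04": "External Policy and Incentives",
}

agreements = sum(1 for k in coder_a if coder_a[k] == coder_b.get(k))
percent_agreement = agreements / len(coder_a)

# Excerpts where the coders disagree go to iterative review and discussion
disagreements = sorted(k for k in coder_a if coder_a[k] != coder_b.get(k))

print(f"Percent agreement: {percent_agreement:.0%}")  # prints "Percent agreement: 75%"
print("To resolve by consensus:", disagreements)
```

In practice a chance-corrected statistic such as Cohen's kappa would be preferable to raw percent agreement, but the flag-and-discuss loop is the same.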

Stakeholder Engagement Strategies

Documents were mapped to identify SE strategies in each program activity corresponding with an IAP2 phase in the conceptual framework (Table 1). Building mutual trust, providing clear communication, and seeking feedback through SE early in program adoption were important for effective program implementation. The engagement strategies in the Inform phase focused on gaining buy-in from clinic decision-makers based on the Intervention Characteristics. Clinic Decision-Makers were engaged by email and participated in an initial adoption meeting and webinar with the researchers. The researchers provided information on the effectiveness of PMP in increasing mammogram appointment adherence, the advantage of PMP compared with current practices and other mammography programs, the cost of PMP implementation, and the implementation plan and materials. Once a clinic agreed to participate, the researchers facilitated relationships with mobile mammography providers, if needed, so clinics could implement PMP. SE encouraged clinics to adopt the program, cultivated awareness, and promoted buy-in from the clinic Decision-Makers for continued participation in PMP. In the Consult phase, feedback on PMP implementation was sought in the clinical site assessment meeting, stakeholder committee meetings, and mammography program drives based on the PMP Intervention Characteristics, the Inner Setting of the clinic, and the Individual Characteristics of the stakeholders participating in PMP. The climate of the clinic, readiness to participate in PMP, and adaptability of PMP to baseline practices were assessed by the clinic Decision-Makers and Program Champions through a site assessment checklist. In the Involve phase, feedback about PMP in stakeholder committee meetings was used for PMP implementation planning and preparation.
During the stakeholder committee meetings, mammography program drives for each clinic were reviewed and analyzed to provide baseline data and processes prior to training and implementation. The Inner Setting and Outer Setting characteristics in terms of the clinic’s relationships with other clinics providing competing services, state and local policies guiding the clinic, and the needs of the populations served were examined to adapt and refine PMP materials. These materials were then used during the trainings to ensure continuity and fidelity of implementation. After the trainings were conducted, an implementation readiness checklist was completed by each participating clinic to assist in initial implementation.

PMP implementation began in the Collaborate phase and continued through the Empower phase. In the Collaborate phase, the adaptability of PMP was continually explored by identifying implementation barriers and facilitating solutions. Stakeholders' knowledge and beliefs, self-efficacy, individual stage of change surrounding PMP implementation, and perceptions of their clinic site were examined through ongoing support from the researchers and BHCTexas. In the Empower phase, stakeholders executed PMP implementation on their own. They were empowered to assess their own self-efficacy and individual stage of change, to evaluate and maintain intervention activities, and to address any implementation barriers.

Barriers and Facilitators in Stakeholder Engagement

Several barriers to establishing or maintaining SE were identified, alongside facilitators that enabled or promoted SE, in each IAP2 phase. First, gaining initial buy-in from clinic leadership (Decision-Makers) in the Inform phase at each clinical site was important for recruiting clinic staff (Program Champions and Patient Navigators) and for attendance at stakeholder committee meetings. In the few clinics where leadership was not convinced of PMP's effectiveness and advantage, it proved more challenging to gain the staff buy-in needed for valuable participation in the stakeholder committee meetings. In clinics with leadership buy-in, staff engagement created a climate more conducive to PMP success. Clinic staff aligned with the climate of the clinic and the goals of PMP, which led to readiness to participate in the program. Leadership set both the tone and the priorities for clinic staff and could therefore ease or compound the issues staff faced, including competing demands and limited meeting time. Second, in the Consult phase, the biggest barrier to implementation readiness was the lack of dedicated mammography program staff and office space at each clinical site. The lack of dedicated program staff in some clinics made it challenging to engage dedicated stakeholder committee members representing each program role. Successfully contacting clinic staff was contingent on the researchers having current contact information and maintaining an accurate contact registry. Next, in the Involve phase, once all of the appropriate program staff from each clinic had been identified for stakeholder meetings, Inner Setting characteristics, rather than Intervention Characteristics or Outer Setting characteristics, proved to be barriers to SE.
Structural characteristics, such as competing departments and multiple levels of staff hierarchy, posed a challenge to the program goal of free and open communication during the stakeholder committee meetings. The culture of each clinic, set first by clinic leadership and then perceived by clinic staff, determined the scope and depth of communication and problem solving. How frankly clinic staff would speak about implementation practices or concerns depended on the clinic's culture and learning climate. This was particularly true for clinics whose leadership was active in the stakeholder process. Clinic staff needed to feel they had the autonomy to discuss the intervention and the happenings of the clinic honestly, without reprimand or negative consequences from leadership. The researchers observed that clinic staff were hesitant to share struggles or information that could paint their organizations in a negative light. Although staff were hesitant to describe potentially unflattering details about their program implementation to one another during stakeholder committee meetings, they did build relationships and trust with the researchers over time and shared their struggles and concerns. Online communication and engagement appeared to provide the most flexibility for clinic staff and other stakeholders. Individual conference calls through UberConference software allowed each clinical site stakeholder to feel more comfortable sharing successes and problems with program implementation at their site. While the software allowed flexible scheduling, finding a consistent time when multiple individuals were available to meet was still a challenge.
Adding scheduling to the competing demands of a clinic, many of which superseded the stakeholder committee meeting, made it difficult to find a consistent block of time with all committee members or clinics present for the same meeting. Clinic staff performed multiple jobs, and the demands on their time and attention left them distracted and, at times, disinterested.

Barriers related to program uptake and staff turnover were identified across IAP2 phases. Maintaining accurate contact information proved difficult given high staff turnover at participating clinical sites. In the Consult and Involve phases, staff turnover and competing demands also made consistent stakeholder committee meeting attendance difficult. To identify and address issues with differing program uptake and staff turnover, adapted program materials, trainings, and ongoing support were offered in various modalities. The program materials and trainings were adapted, packaged, printed, and distributed to clinic staff. All methods were adapted, with stakeholder collaboration and feedback, so implementers could use the most effective delivery methods. Including BHCTexas CHWs in training, and the opportunity to re-train staff as needed, also added to stakeholders' self-efficacy and built staff capacity to implement the program. In the Collaborate phase, clinical staff counseling and peer support were benefits of the stakeholder process. The CHWs collaborated with and empowered clinical staff to implement the program. The CHWs were embedded on site in the clinics, which allowed relationship and trust building to occur. Because the CHWs were familiar with PMP practices and had completed their certification education and training, clinical staff had individuals they could communicate with, model, and seek assistance from during trainings and implementation. Lastly, in the Empower phase, the researchers and BHCTexas worked closely with clinic staff to transition program ownership and maintenance. Embedded in the clinics at the beginning of implementation, the CHWs modeled the reminder phone call process and, over time, integrated clinic staff into leading the process. The CHWs provided reinforcement through reflection and evaluation to improve clinic staff self-efficacy.
The goal was to have clinic staff take ownership in executing and maintaining the program before wrap-up. After program wrap-up, materials and ongoing support were still readily available as the program was integrated into clinic operations.

A conceptual framework was developed from the IAP2 methodology and implementation science structures to guide SE activities in PMP. The conceptual framework proved logical, realistic, and sustainable across participating clinical sites. The actual experiences shared by stakeholders at the clinical sites uniformly aligned with the theoretical constructs in PMP. While the SE approach was guided by the conceptual framework, the strategies used also reflect a real-world application of the PCORI guidelines for SE in breast cancer research [12]. To our knowledge, this is the first publication of a real-world application of an SE approach and relevant guidelines in an EBI to increase mammography adherence among underserved women. Mutual trust was developed first by partnering with BHCTexas, which had existing relationships with the identified PMP clinical stakeholders through its clinic membership network. These partnerships extended engagement with stakeholders across the clinic setting: Decision-Makers, Program Champions, and Patient Navigators at FQHCs and charity clinics [1]. Through the existing partnership with BHCTexas, multiple communication modalities were used to explain PMP objectives and gain buy-in from PMP clinical stakeholders. Initial emails were sent to Decision-Makers inviting clinics to participate, a webinar was developed to explain PMP, and adoption meetings were scheduled to obtain agreement to participate. As part of the agreement, a Memorandum of Understanding (MOU) was developed as a practical tool outlining the roles and expectations of the clinical stakeholders, BHCTexas, and the researchers throughout implementation. The clinical stakeholders provided feedback on PMP, first in the clinical site assessment meeting and then throughout the stakeholder committee meetings. The clinical site assessment gave stakeholders an opportunity to assess PMP "fit" with current clinic processes in real time, during normal clinic workflow.
In addition, practical tools (e.g., the Clinic Handbook) were developed by the researchers and BHCTexas with feedback from clinical stakeholders. Stakeholders were able to adapt the program materials to synchronize with the clinics' current processes and reflect the populations served while ensuring fidelity to the EBI components. Practical tools have been developed in previous research [13, 14], but this paper addresses an ongoing gap in how best to engage stakeholders in adapting practical tools and providing training for effective dissemination of these tools [1]. Stakeholders were given paper and digital copies of the adapted program materials in the trainings for Patient Navigators. The enhanced training for Patient Navigators included continuing education unit (CEU) credits assigned through the researchers' academic institution. Stakeholders were also able to give feedback on barriers and challenges to PMP implementation throughout stakeholder committee meeting discussions and subsequent onsite clinic support from BHCTexas CHWs. The clinical sites appreciated the ongoing support through PMP implementation, which further adapted the program to their specific clinic needs and patient populations while providing sustainable practical tools and intervention activities after program completion.

Stakeholders were engaged in the later research stages of implementation and dissemination, rather than only in problem identification and intervention planning [1]. Future research might focus on engaging multiple clinical stakeholders throughout the research process to develop practical tools and trainings. Researchers can build on the SE conceptual framework and strategies described in this paper to support intervention adoption, implementation effectiveness, and sustainment of intervention activities. While patients were involved in developing the EBI, patients seeking mammogram screening at the participating FQHCs and charity clinics were not involved in PMP implementation. Future research might examine whether and how to include patients in SE across IAP2 phases to better integrate patients throughout the research process. Research in a clinical setting presented challenges and limitations. First, due to a shift in our internal record-keeping systems, data from all of the stakeholder committee meetings conducted were not available for the document review. While this is a limitation of the study, it does not diminish the results, as the process was replicated for each group of clinics during the project and interactions with the clinics and clinic staff were consistent between groups. Second, midway through the stakeholder committee meeting process, the researchers shifted from collective stakeholder committee meetings to one-on-one meetings with each clinic, because it became apparent that clinic staff viewed staff from other participating clinics as competitors. The adjustment was necessary for the researchers to engage all clinical staff in a meaningful way during the stakeholder process.

PMP is an innovative adaptation of a mammography screening EBI with multi-level SE. SE strategy successes and challenges were identified in adoption, implementation, and sustainment. The overall effectiveness of SE could be replicated when coupled with similar EBIs. Using a theoretical base allowed the research-practice gap to be bridged effectively. The conceptual framework can be readily adapted to mammography screening initiatives or modified to address other community-based cancer screening programs.

Data Availability

Not applicable

Concannon TW, Grant S, Welch V, Petkovic J, Selby J, Crowe S, Synnot A, Greer-Smith R, Mayo-Wilson E, Tambor E, Tugwell P, for the Multi Stakeholder Engagement (MuSE) Consortium (2019) Practical guidance for involving stakeholders in health research. J Gen Intern Med 34:458–463. https://doi.org/10.1007/s11606-018-4738-6

Cottrell E, Whitlock E, Kato E, Uhl S, Belinson S, Chang C, Hoomans T, Meltzer D, Noorani H, Robinson K, Schoelles K (2014) Defining the benefits of stakeholder engagement in systematic reviews. Agency for Healthcare Research and Quality. https://www.ncbi.nlm.nih.gov/sites/books/NBK196180/


Forsythe LP, Ellis LE, Edmundson L, Sabharwal R, Rein A, Konopka K, Frank L (2016) Patient and stakeholder engagement in the PCORI pilot projects: description and lessons learned. J Gen Intern Med 31(1):13–21. https://doi.org/10.1007/s11606-015-3450-z


Whitlock EP, Lopez SA, Chang S, Helfand M, Eder M, Floyd N (2010) AHRQ series paper 3: identifying, selecting, and refining topics for comparative effectiveness systematic reviews: AHRQ and the effective health-care program. J Clin Epidemiol 63(5):491–501. https://doi.org/10.1016/j.jclinepi.2009.03.008

Akwanalo C, Njuguna B, Mercer T, Patakia SD, Mwangi A, Dick J, Dickhaus J, Andesia J, Bloomfield GS, Valente T (2019) Strategies for effective stakeholder engagement in strengthening referral networks for management of hypertension across health systems in Kenya. Glob Heart 14(2):173–179. https://doi.org/10.1016/j.gheart.2019.06.003

de Wit M, Kirwan JR, Tugwell P, Beaton D, Boers M, Brooks P, Collins S, Conaghan PG, D’Agostino MA, Hofstetter C, Hughes R (2017) Successful stepwise development of patient research partnership: 14 years’ experience of actions and consequences in Outcome Measures in Rheumatology (OMERACT). Patient Res 10(2):141–152. https://doi.org/10.1007/s40271-016-0198-4


Bammer G (2019) Key issues in co-creation with stakeholders when research problems are complex. Evid Policy 15(3):423–435. https://doi.org/10.1332/174426419X15532579188099

International Association for Public Participation (IAP2). (2018). IAP2 Spectrum of Public Participation. https://cdn.ymaws.com/www.iap2.org/resource/resmgr/pillars/Spectrum_8.5x11_Print.pdf

Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC (2009) Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci 4(1):1–15. https://doi.org/10.1186/1748-5908-4-50

Norris JM, White DE, Nowell L, Mrklas K, Stelfox HT (2017) How do stakeholders from multiple hierarchical levels of a large provincial health system define engagement? A qualitative study. Implement Sci 12(1):98. https://doi.org/10.1186/s13012-017-0625-5


Keith RE, Crosson JC, O’Malley AS, Cromp D, Taylor EF (2017) Using the Consolidated Framework for Implementation Research (CFIR) to produce actionable findings: a rapid-cycle evaluation approach to improving implementation. Implement Sci 12(1):15. https://doi.org/10.1186/s13012-017-0550-7

Greene SM, Brandzel S, Wernli KJ (2018) From principles to practice: real-world patient and stakeholder engagement in breast cancer research. Perm J 22:17–232. https://doi.org/10.7812/TPP/17-232

Gunn CM, Maschke A, Paasche-Orlow MK, Kressin NR, Schonberg MA, Battaglia TA (2020) Engaging women with limited health literacy in mammography decision-making: perspectives of patients and primary care providers. J Gen Intern Med 1–8. https://doi.org/10.1007/s11606-020-06213-2

Pasternack I, Saalasti-Koskinen U, Mäkelä M (2011) Decision aid for women considering breast cancer screening. Int J Technol Assess Health Care 27(4):357–362. https://doi.org/10.1017/S026646231100050X

Hartman MA, Stronks K, Highfield L, Cremer SW, Verhoeff AP, Nierkens V (2015) Disseminating evidence-based interventions to new populations: a systematic approach to consider the need for adaptation. Implement Sci 10(1). https://doi.org/10.1186/1748-5908-10-S1-A49

Highfield L, Valerio MA, Fernandez ME, Bartholomew Eldredge LK (2018) Development of an implementation intervention using intervention mapping to increase mammography among low income women. Front Public Health 6:300. https://doi.org/10.3389/fpubh.2018.00300

Rositch AF, Unger-Saldaña K, DeBoer RJ, Ng’ang’a A, Weiner BJ (2020) The role of dissemination and implementation science in global breast cancer control programs: frameworks, methods, and examples. Cancer . 126:2394–2404. https://doi.org/10.1002/cncr.32877

Bambhroliya AB, Burau KD, Sexton K (2012) Spatial analysis of county-level breast cancer mortality in Texas. J Environ Public Health 2012:1–8. https://doi.org/10.1155/2012/959343

DeSantis CE, Ma J, Goding Sauer A, Newman LA, Jemal A (2017) Breast cancer statistics, 2017, racial disparity in mortality by state. CA Cancer J Clin 67(6):439–448. https://doi.org/10.3322/caac.21412

Ahmed AT, Welch BT, Brinjikji W, Farah WH, Henrichsen TL, Murad MH, Knudsen JM (2017) Racial disparities in screening mammography in the United States: a systematic review and meta-analysis. J Am Coll Radiol 14(2):157–165. https://doi.org/10.1016/j.jacr.2016.07.034

Miller BC, Bowers JM, Payne JB, Moyer A (2019) Barriers to mammography screening among racial and ethnic minority women. Soc Sci Med 239:112494. https://doi.org/10.1016/j.socscimed.2019.112494

Highfield L, Rajan SS, Valerio MA, Walton G, Fernandez ME, Bartholomew LK (2015) A non-randomized controlled stepped wedge trial to evaluate the effectiveness of a multi-level mammography intervention in improving appointment adherence in underserved women. Implement Sci 10(1):1–8. https://doi.org/10.1186/s13012-015-0334-x

Highfield L, Bartholomew LK, Hartman MA, Ford MM, Balihe P (2014) Grounding evidence-based approaches to cancer prevention in the community: a case study of mammography barriers in underserved African American women. Health Promot Pract 15(6):904–914. https://doi.org/10.1177/2F1524839914534685

Fernandez ME, Ruiter RAC, Markham CM, Kok G (2019) Intervention mapping: theory- and evidence-based health promotion program planning: perspective and examples. Front Public Health 7:209. https://doi.org/10.3389/fpubh.2019.00209

Coffey A (2014) Analyzing documents. In: Flick U (ed) The SAGE handbook of qualitative data analysis . Sage Publications, Thousand Oaks, CA, pp 367–379. https://doi.org/10.4135/9781446282243


Gross JMS (2018) Document analysis. In: Frey B (ed) The SAGE encyclopedia of educational research, measurement, and evaluation . Sage Publications, Thousand Oaks, CA, pp 545–548. https://doi.org/10.4135/9781506326139


Acknowledgements

The authors would like to acknowledge the Breast Health Collaborative of Texas (BHCTexas) and participating clinical partners for their ongoing collaboration in the intervention and the Agency for Healthcare Research and Quality for their support.

Funding

Funding for this study is provided by the Agency for Healthcare Research and Quality (Grant number: 1R18HS023255-01).

Author information

Authors and Affiliations

Department of Management, Policy and Community Health, The University of Texas Health Science Center at Houston (UTHealth) School of Public Health, Houston, TX, USA

Jennifer Holcomb, Gayla M. Ferguson, Jiali Sun, Gretchen H. Walton & Linda Highfield

Department of Epidemiology, Human Genetics and Environmental Sciences, The University of Texas Health Science Center at Houston (UTHealth) School of Public Health, Houston, TX, USA

Linda Highfield

Department of Internal Medicine, The University of Texas Health Science Center at Houston (UTHealth) John P and Katherine G McGovern Medical School, Houston, TX, USA


Contributions

All authors contributed to the study conception and design. Data collection and analysis were performed by Jennifer Holcomb, Gayla M. Ferguson, and Jiali Sun. Jennifer Holcomb led the manuscript development, and all authors contributed to the manuscript text. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Jennifer Holcomb .

Ethics declarations

Conflict of interest.

The authors declare no competing interests.

Additional information

Publisher’s note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Holcomb, J., Ferguson, G.M., Sun, J. et al. Stakeholder Engagement in Adoption, Implementation, and Sustainment of an Evidence-Based Intervention to Increase Mammography Adherence Among Low-Income Women. J Canc Educ 37 , 1486–1495 (2022). https://doi.org/10.1007/s13187-021-01988-2


Accepted : 01 March 2021

Published : 22 March 2021

Issue Date : October 2022

DOI : https://doi.org/10.1007/s13187-021-01988-2


  • Stakeholder engagement
  • Consolidated Framework for Implementation Research
  • Mammography
  • Evidence-based intervention
  • Community-based participatory research


AI-assisted writing is quietly booming in academic journals. Here’s why that’s OK


Julian Koplin, Lecturer in Bioethics, Monash University & Honorary Fellow, Melbourne Law School, Monash University

Disclosure statement

Julian Koplin does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

Monash University provides funding as a founding partner of The Conversation AU.


If you search Google Scholar for the phrase “ as an AI language model ”, you’ll find plenty of AI research literature and also some rather suspicious results. For example, one paper on agricultural technology says:

As an AI language model, I don’t have direct access to current research articles or studies. However, I can provide you with an overview of some recent trends and advancements …

Obvious gaffes like this aren’t the only signs that researchers are increasingly turning to generative AI tools when writing up their research. A recent study examined the frequency of certain words in academic writing (such as “commendable”, “meticulously” and “intricate”), and found they became far more common after the launch of ChatGPT – so much so that 1% of all journal articles published in 2023 may have contained AI-generated text.
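The word-frequency approach in that study can be illustrated with a toy sketch. The corpora and the marker-word list below are invented for illustration and are far smaller than anything a real analysis would use; the point is only the shape of the comparison.

```python
import re

# A few of the marker words reported to have surged after ChatGPT's launch
MARKERS = {"commendable", "meticulously", "intricate"}

def marker_rate(texts):
    """Occurrences of marker words per 1,000 tokens across a corpus."""
    tokens = [w for t in texts for w in re.findall(r"[a-z]+", t.lower())]
    return 1000 * sum(1 for w in tokens if w in MARKERS) / len(tokens)

# Tiny invented corpora standing in for pre- and post-2023 abstracts
pre_2023 = [
    "The results were robust across all sites.",
    "We observed a modest but consistent effect.",
]
post_2023 = [
    "The commendable results were meticulously validated.",
    "This intricate analysis was meticulously conducted.",
]

print(f"pre-2023 rate:  {marker_rate(pre_2023):.1f} per 1,000 tokens")
print(f"post-2023 rate: {marker_rate(post_2023):.1f} per 1,000 tokens")
```

A jump in rates like this across millions of abstracts is the kind of signal the study relies on; it cannot identify any individual AI-written paper.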

(Why do AI models overuse these words? There is speculation it’s because they are more common in English as spoken in Nigeria, where key elements of model training often occur.)

The aforementioned study also looks at preliminary data from 2024, which indicates that AI writing assistance is only becoming more common. Is this a crisis for modern scholarship, or a boon for academic productivity?

Who should take credit for AI writing?

Many people are worried by the use of AI in academic papers. Indeed, the practice has been described as “ contaminating ” scholarly literature.

Some argue that using AI output amounts to plagiarism. If your ideas are copy-pasted from ChatGPT, it is questionable whether you really deserve credit for them.

But there are important differences between “plagiarising” text authored by humans and text authored by AI. Those who plagiarise humans’ work receive credit for ideas that ought to have gone to the original author.

By contrast, it is debatable whether AI systems like ChatGPT can have ideas, let alone deserve credit for them. An AI tool is more like your phone’s autocomplete function than a human researcher.

The question of bias

Another worry is that AI outputs might be biased in ways that could seep into the scholarly record. Infamously, older language models tended to portray people who are female, black and/or gay in distinctly unflattering ways, compared with people who are male, white and/or straight.

This kind of bias is less pronounced in the current version of ChatGPT.

However, other studies have found a different kind of bias in ChatGPT and other large language models : a tendency to reflect a left-liberal political ideology.

Any such bias could subtly distort scholarly writing produced using these tools.

The hallucination problem

The most serious worry relates to a well-known limitation of generative AI systems: that they often make serious mistakes.

For example, when I asked ChatGPT-4 to generate an ASCII image of a mushroom, it provided me with an ASCII drawing (not reproduced here) as output.

It then confidently told me I could use this image of a “mushroom” for my own purposes.

These kinds of overconfident mistakes have been referred to as “ AI hallucinations ” and “ AI bullshit ”. While it is easy to spot that the above ASCII image looks nothing like a mushroom (and quite a bit like a snail), it may be much harder to identify any mistakes ChatGPT makes when surveying scientific literature or describing the state of a philosophical debate.

Unlike (most) humans, AI systems are fundamentally unconcerned with the truth of what they say. If used carelessly, their hallucinations could corrupt the scholarly record.

Should AI-produced text be banned?

One response to the rise of text generators has been to ban them outright. For example, Science – one of the world’s most influential academic journals – disallows any use of AI-generated text .

I see two problems with this approach.

The first problem is a practical one: current tools for detecting AI-generated text are highly unreliable. This includes the detector created by ChatGPT’s own developers, which was taken offline after it was found to have only a 26% accuracy rate (and a 9% false positive rate ). Humans also make mistakes when assessing whether something was written by AI.

It is also possible to circumvent AI text detectors. Online communities are actively exploring how to prompt ChatGPT in ways that allow the user to evade detection. Human users can also superficially rewrite AI outputs, effectively scrubbing away the traces of AI (like its overuse of the words “commendable”, “meticulously” and “intricate”).

The second problem is that banning generative AI outright prevents us from realising these technologies’ benefits. Used well, generative AI can boost academic productivity by streamlining the writing process. In this way, it could help further human knowledge. Ideally, we should try to reap these benefits while avoiding the problems.

The problem is poor quality control, not AI

The most serious problem with AI is the risk of introducing unnoticed errors, leading to sloppy scholarship. Instead of banning AI, we should try to ensure that mistaken, implausible or biased claims cannot make it onto the academic record.

After all, humans can also produce writing with serious errors, and mechanisms such as peer review often fail to prevent its publication.

We need to get better at ensuring academic papers are free from serious mistakes, regardless of whether these mistakes are caused by careless use of AI or sloppy human scholarship. Not only is this more achievable than policing AI usage, it will improve the standards of academic research as a whole.

This would be (as ChatGPT might say) a commendable and meticulously intricate solution.



Opinion The problem with diversity statements — and what to do about them

DEI statements have too often led to self-censorship and ideological policing.


As the United States reckoned with racial inequality during and after the 2020 Black Lives Matter protests, many saw Diversity, Equity and Inclusion (DEI) programs as a way to address the issues in higher education. As part of the trend, many schools began requiring candidates for teaching positions to submit DEI statements. In these statements, potential hires explain how they would advance diversity, equity and inclusion in their teaching and research activities. One 2021 study found that about one-third of job postings at elite universities required them.

Now, however, some in academia are starting to express second thoughts about this practice. In April, Harvard Law School professor Randall L. Kennedy urged abolition of DEI statements, arguing that they amount to “compulsion” and “ideological litmus tests.” Not long after Mr. Kennedy’s article appeared, the Massachusetts Institute of Technology became the first top university to voluntarily end their use. The decision came after extensive consultations among all six of the school’s academic deans. MIT’s president, Sally Kornbluth, explained: “We can build an inclusive environment in many ways, but compelled statements impinge on freedom of expression, and they don’t work.”


In doing away with DEI statements, MIT was not abandoning the goals of greater diversity, equity and inclusion, which remain not only valid but also vital. DEI programs can have an important place. They should not be abolished or undermined — as red states such as Florida and Texas have done, by forbidding the use of state funds for DEI in public universities. Reshaping universities via such a heavy-handed use of state power could set a dangerous precedent for academic freedom more generally.

And yet as a specific policy, DEI statements advance their declared objectives at too high a cost. In fact, they stoke what Mr. Kennedy, a self-described “scholar on the left,” who formerly served as a law clerk for Justice Thurgood Marshall, called “intense and growing resentment” among academics. Not surprisingly, 90 percent of self-described conservative faculty view the statements as political litmus tests, but so do more than 50 percent of moderates and even one-quarter of liberals, according to a survey by the Foundation for Individual Rights and Expression, a nonpartisan watchdog group specializing in campus free speech issues.

Because the criteria for acceptable DEI statements are often vague, jobseekers must do the work of anticipating the ideological and political preferences of university administrators and faculty, who are disproportionately left-leaning. The MIT Communication Lab, for instance, explained that a diversity statement is an “opportunity to show that you care about the inclusion of many forms of identity in academia and in your field, including but not limited to gender, race/ethnicity, age, nationality, sexual orientation, religion, and ability status” and noted that “it may be appropriate to acknowledge aspects of your own marginalized identity and/or your own privilege.” Harvard University’s Bok Center for Teaching and Learning included a list of guiding questions, including: “Do you seek to identify and mitigate how inequitable and colonial social systems are reinforced in the academy by attending to and adjusting the power dynamics in your courses?”

Yet jobseekers who disagree with the ideological premises of such inquiries have an overwhelming incentive to suppress their true beliefs, or pretend to have the “right” ones, lest they be eliminated from consideration. It’s a dilemma, especially given the high stakes: As the University of California at Davis’s vice chancellor for DEI explained, “In these searches, it is the candidate’s diversity statement that is considered first; only those who submit persuasive and inspiring statements can advance for complete consideration.” In one faculty search at University of California at Berkeley, around 75 percent of applicants were screened out of consideration — irrespective of criteria such as teaching ability and research skills. Small wonder that many applicants engage in what Daniel Sargent, a history professor at UC Berkeley, calls “performative dishonesty.”

The last thing academia — or the country — needs is another incentive for people to be insincere or dishonest. The very purpose of the university is to encourage a free exchange of ideas, to seek the truth wherever it may lead, and to elevate intellectual curiosity and openness among both faculty and students. Whatever their original intent, the use of DEI statements has too often resulted in self-censorship and ideological policing. Fundamentally reconsidering them could actually strengthen DEI, by placing it on a more sustainable basis — intellectually and politically. MIT is one of the first to tackle the issue; here’s hoping it won’t be the last.

The Post’s View | About the Editorial Board

Editorials represent the views of The Post as an institution, as determined through discussion among members of the Editorial Board , based in the Opinions section and separate from the newsroom.

Members of the Editorial Board: Opinion Editor David Shipley , Deputy Opinion Editor Charles Lane and Deputy Opinion Editor Stephen Stromberg , as well as writers Mary Duenwald, Shadi Hamid , David E. Hoffman , James Hohmann , Heather Long , Mili Mitra , Eduardo Porter , Keith B. Richburg and Molly Roberts .


African Journal of Pharmaceutical Research and Development, Vol. 16 No. 1 (2024), Articles

Open Access. This work is licensed under a Creative Commons Attribution 4.0 International License.

Unraveling the impact of electromagnetic radiation on human health: a comprehensive review

Abraham Ehinomhen Ubhenin, Joseph Isabona, Fatimah Anura, Ramatu Iya Idris

This comprehensive review examines the potential health effects of prolonged exposure to electromagnetic radiation, specifically from cell phones and base station transmitters. The study incorporates experimental and epidemiological research from reputable databases up to 2024 to assess the biological impacts of radiofrequency (RF) radiation on various organs and health outcomes. The mechanisms of electromagnetic radiation-induced damage include DNA damage, disruptions in the blood-brain barrier, oxidative stress, and effects on cognitive function and sleep. Potential risks, such as brain tumors, cancers, fertility issues, and neurological effects, are discussed. Safety measures by organizations like ICNIRP and WHO, public awareness campaigns, and natural remedies to reduce exposure are also addressed. Acknowledging limitations, the review calls for continued research and collaboration among stakeholders to consider new developments. Challenges related to base station transmitter exposure, including public concerns and regulatory compliance, are identified. In conclusion, the review emphasizes responsible decision-making and ongoing research to ensure the safe deployment of wireless technologies while protecting public well-being from electromagnetic radiation exposure. Implementation of safety guidelines is crucial for mitigating potential risks.


  • Systematic review
  • Open access
  • Published: 28 October 2021

Alignment in implementation of evidence-based interventions: a scoping review

  • Robert Lundmark   ORCID: orcid.org/0000-0001-9484-6047 1 ,
  • Henna Hasson 2 , 3 ,
  • Anne Richter 2 , 3 ,
  • Ermine Khachatryan 3 ,
  • Amanda Åkesson 3 &
  • Leif Eriksson 3  

Implementation Science, volume 16, Article number: 93 (2021)


Alignment (i.e., the process of creating fit between elements of the inner and outer context of an organization or system) in conjunction with implementation of an evidence-based intervention (EBI) has been identified as important for implementation outcomes. However, research evidence has so far not been systematically summarized. The aim of this scoping review is therefore to create an overview of how the concept of alignment has been applied in the EBI implementation literature to provide a starting point for future implementation efforts in health care.

We searched for peer-reviewed English language articles in four databases (MEDLINE, Cinahl, Embase, and Web of Science) published between 2003 and 2019. Extracted data were analyzed to address the study aims. A qualitative content analysis was carried out for items with more extensive information. The review was reported according to the preferred reporting items for systematic reviews and meta-analyses extension for scoping review (PRISMA-ScR) guidelines.

The database searches yielded 3629 publications, of which 235 were considered potentially relevant based on the predetermined eligibility criteria, and retrieved in full text. In this review, the results of 53 studies are presented. Different definitions and conceptualizations of alignment were found, which in general could be categorized as structural, as well as social, types of alignments. Whereas the majority of studies viewed alignment as important to understand the implementation process, only a few studies actually assessed alignment. Outcomes of alignment were focused on either EBI implementation, EBI sustainment, or healthcare procedures. Different actors were identified as important for creating alignment and five overall strategies were found for achieving alignment.

Conclusions

Although investigating alignment has not been the primary focus of studies on EBI implementation, it has still been identified as an important factor for implementation success. Based on the findings from this review, future research should incorporate alignment and put a stronger emphasis on testing the effectiveness of alignment in relation to implementation outcomes.

Peer Review reports

Contributions to the literature

Although alignment is frequently suggested as important for successful implementation, it has rarely been the centerpiece of studies. Our study systematically collected evidence related to alignment from implementation studies in different health care settings. This is the first initiative to summarize existing research on alignment in conjunction with implementation of EBIs.

Results from this study highlight the research gaps related to alignment in the context of implementation of EBIs. Based on the gathered evidence, suggestions for theoretical development and future research are provided.

Over the last years, the concept of alignment has frequently been included in implementation studies as an explanation of why implementation of an evidence-based intervention (EBI) succeeded or failed [ 1 , 2 ]. Alignment can be understood as the process of creating a fit between elements of the inner and outer context of an organization or system [ 3 ]. The purpose of this inter-linking process is to have goals, strategies, systems, culture, needs, leadership, etc. pulling in the same direction, and thereby optimize the chances of reaching desired outcomes [ 1 ]. In the context of implementing and sustaining EBIs, alignment can also involve creating a fit between the EBI itself and elements of the inner and outer context of an organization or system [ 4 ].

The need for considering alignment seems especially important when implementing EBIs in complex and pluralistic health care organizations, which are characterized by multiple objectives and diffuse power. Due to the complexity of these organizations, implementation efforts are often especially challenging. Assuring that elements of the organizations’ inner and outer context are aligned with each other, and with the EBI, is therefore critical for successful implementation [ 4 , 5 ]. For example, an EBI may include new work practices, and for these to become realized, they need to be aligned with current practices. In turn, both old and new practices need to be aligned with organizational objectives, to increase the chances of implementation success.

However, although alignment has repeatedly been depicted as important, it has seldom been the centerpiece of implementation studies [ 1 , 4 ]. Guidance on how to consider alignment during implementation of EBIs is hence sparse. The lack of guidance concerns both the alignment of the EBI with elements of the inner and outer context of a health care organization, as well as the alignment of inner and outer context elements with each other in conjunction with an EBI. Additionally, authors commonly refer to different isolated aspects of alignment depending on what is being studied (e.g., alignment of an EBI with a specific practice or policy) [ 1 ], or have placed emphasis on a specific form of alignment (e.g., inter-organizational alignment) to foster the implementation of EBIs [ 6 ]. This is also evident in frameworks commonly used to guide implementation of EBIs. For example, in the consolidated framework for implementation research (CFIR) [ 7 ], creating fit between the EBI and elements of the inner context is described as important; however, the process of creating fit is not described in depth.

From a conceptual perspective, the implementation literature has to a limited extent incorporated knowledge of alignment from other disciplines. Alignment is a central theme in many business research fields, such as management, organizational behavior, manufacturing, operations, marketing, information systems, human resources, and business strategy [ 3 , 8 ]. Here, the main focus has been on two dimensions of alignment: structural and social [ 3 , 8 ]. The structural dimension of alignment strives to enable the different components of a system to pull towards a common objective. This is done, for example, by ensuring that no conflicts exist among goals, plans, workflows, procedures, or incentives within the organizational structure. The social dimension of alignment comprises stakeholders’ shared understanding of, commitment to, and acting toward common objectives. Social alignment thus refers to the alignment of cognitive, emotional, and behavioral aspects among the different actors in the organization [ 8 ]. These two dimensions are often seen as complementary. Consequently, achieving alignment among strategies, structures, and planning systems (i.e., structural alignment) is a vital prerequisite for working effectively toward a common goal. At the same time, it is also necessary to develop a shared understanding of, and commitment to, strategies and goals (i.e., social alignment) in order to achieve those goals [ 3 ].

Although the business research literature is informative for understanding the concept of alignment and gives insights into the mechanisms and components of an alignment process, it seldom involves descriptions of an alignment process when implementing EBIs in a health care context. Considering alignment during implementation of an EBI in health care organizations or systems hence involves addressing the complexity of this setting. It also involves moving beyond the alignment of elements of the inner and outer context of an organization or system, by also taking into account the fit of the EBI with these elements. The aim of this scoping review is therefore to create an overview of how the concept of alignment has been applied in the EBI implementation literature, to provide a starting point for future implementation efforts in health care.

A scoping review is conducted to get an overview of a broad topic and map the existing literature so that it can serve as a foundation for future research needs [ 9 ]. This scoping review was guided by the methodology suggested by Arksey and O’Malley [ 10 ] and the additional clarifications by Levac et al. [ 11 ]. The following five steps were performed: (1) identify the research question; (2) identify relevant studies; (3) study selection; (4) chart the data; and (5) collate, summarize, and report results. The PRISMA-ScR checklist [ 12 ] was used to guide reporting (Additional file  1 ).

Step 1: identify the research question

Considering implementation of EBIs in a healthcare context, the following research questions guided the review:

How is alignment defined and conceptualized?

How has alignment been assessed?

What structural and social elements are/should be aligned?

What are the outcomes of alignment?

How is/can alignment (be) achieved?

Step 2: identify relevant studies

In collaboration with the university library at Karolinska Institutet, Sweden, a search strategy based on the research questions was developed. In an iterative search process, search terms were developed by using initially identified articles that met the inclusion criteria. When reviewing the search results, we ensured that these initially identified articles were included. The search process lasted from the beginning of February until the end of March 2019. Searches were performed in four electronic databases (MEDLINE (OVID), Cinahl (Ebsco), Embase, and Web of Science (Clarivate)). As an example, the search strategy used in Web of Science is presented in Table  1 . Search strategies for all databases are available in Additional file  2 . In addition, references in full-text articles were scanned for potential additional articles to include.

The search strategy aimed to identify peer-reviewed full-text articles in English published between 2003 and 2019. Eligible articles consisted of empirical research, including case studies, study protocols, methodological papers, and conceptual/debate papers published in peer-reviewed journals. Included studies reported on alignment as a facilitation strategy and/or when alignment was identified to affect implementation or change. Studies included were both descriptive studies (e.g., study protocols) and reports of results from implementation of EBIs in different types of healthcare settings (e.g., primary care, hospital care, social service, and community healthcare). Our search strategy was deliberately broad in terms of setting, study design, and type of EBI. Given the lack of gathered guidance, we wanted to encompass the potential multitude of ways that the concept of alignment has been used in the literature on implementation of EBIs.

Step 3: study selection

Articles from the search process were imported to Rayyan [ 13 ]. Next, all abstracts were screened separately by two reviewers (RL and research assistant 1). At weekly meetings, conflicts detected in Rayyan were discussed between the two reviewers and, if necessary, with a third reviewer (AR or HH). If disagreement remained after discussions, the article was included for full-text screening. Throughout the review process, we exercised the recommended approach for retaining high inter-reviewer reliability when reviewing topics that may involve difficult judgements [ 14 ]. In this case, the aim was primarily to make sure that selected studies included a conceptual use of alignment (i.e., one describing the inter-linkage of an EBI with elements of the inner and outer context of a health care organization, or of elements of the inner and outer context with each other as a consequence of implementing an EBI). Uses of the term alignment to describe other phenomena (e.g., that results are in alignment with previous findings, or alignment between salary and performance) were excluded.

Included articles were divided among three reviewers (EK, AÅ, and research assistant 2) and two reviewers assessed each article in full text separately. When there was disagreement between the two reviewers, a third reviewer read the article and the article was discussed by all the authors at a weekly meeting as a learning opportunity and for reaching a consensus decision. Almost one-third of the articles (70 out of 235) were read by three reviewers and discussed by all authors.
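A dual-review procedure like the one described above is often summarized with an agreement statistic such as Cohen's kappa. The study does not report one; the sketch below, with invented include/exclude decisions, shows how it could be computed:

```python
def cohens_kappa(a, b):
    """Cohen's kappa for two raters labelling the same items."""
    assert len(a) == len(b) and len(a) > 0
    n = len(a)
    # Observed proportion of items on which the raters agree
    observed = sum(x == y for x, y in zip(a, b)) / n
    labels = set(a) | set(b)
    # Agreement expected by chance if the raters were independent
    expected = sum((a.count(l) / n) * (b.count(l) / n) for l in labels)
    return (observed - expected) / (1 - expected)

# Invented screening decisions for six hypothetical abstracts
reviewer_1 = ["include", "exclude", "exclude", "include", "exclude", "exclude"]
reviewer_2 = ["include", "exclude", "include", "include", "exclude", "exclude"]
print(round(cohens_kappa(reviewer_1, reviewer_2), 2))  # 0.67
```

Kappa corrects raw percentage agreement for the agreement two independent raters would reach by chance, which is why review teams prefer it over a simple match rate.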

Step 4: chart the data

A data-charting guide was developed in Excel by all the authors. As a first step, nine articles were selected for full-text reading, and the guide was tested independently by five of the authors (RL, AR, EK, AÅ, and LE). Two authors (EK and AÅ) read and extracted data from all nine articles, while the others (RL, AR, and LE) reviewed four articles each. Thus, at least three reviewers reviewed each of the nine articles. Extracted data were compared and discussed among reviewers, resulting in some modifications to the data-charting guide. The final items for data charting are presented in Table  2 and their definitions in Additional file  3 .

Out of the 235 full-text articles that were read, 53 remained for data extraction and synthesis. The 53 full-text articles were divided between two authors (EK and AÅ), who independently read the articles and extracted data. To begin with, and as a quality control, both authors (EK and AÅ) read 12 articles together with a third person (RL or LE), followed by a discussion. Besides minor disagreements that generated adjustments in the data-charting guide, there was generally good consensus between the reviewers.

Step 5: collate, summarize, and report results

All data were stored and handled in Excel. A synthesis of the literature was provided by summarizing items and reporting them in text, tables, and figures. For descriptive information, such as study design, we used information provided in the article. Items with more extensive information underwent an inductive qualitative content analysis inspired by Elo and Kyngas [ 15 ]. This process involved reading each extract and assigning it a code. Thereafter, for each item, codes were grouped, based on commonality, into categories at different levels. The summary and synthesis of data were handled by four authors (RL, EK, AÅ, and LE), and all other authors were consulted at regular meetings to discuss the analysis and ensure agreement about results and synthesis.
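The code-then-group step of an inductive content analysis can be pictured as a mapping from low-level codes to higher-level categories. A minimal sketch, in which all codes, categories, and extracts are invented for illustration and not taken from the study:

```python
from collections import defaultdict

# Hypothetical code -> category assignments agreed on by the reviewers
CODE_TO_CATEGORY = {
    "shared goals": "structural alignment",
    "workflow fit": "structural alignment",
    "staff commitment": "social alignment",
    "shared understanding": "social alignment",
}

def collate(extracts):
    """Group (extract, code) pairs by category for reporting in tables."""
    grouped = defaultdict(list)
    for extract, code in extracts:
        grouped[CODE_TO_CATEGORY[code]].append((extract, code))
    return dict(grouped)

# Invented coded extracts
extracts = [
    ("EBI integrated with existing routines", "workflow fit"),
    ("clinicians endorsed the programme aims", "staff commitment"),
]
for category, items in sorted(collate(extracts).items()):
    print(category, len(items))
```

In practice the grouping is a judgement call made in discussion rather than a fixed lookup table, but the data structure — codes rolled up into categories at different levels — is the same.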

The searches generated 3629 potentially relevant articles. After removal of duplicates, 2076 articles remained and underwent screening of abstracts. The screening resulted in 235 articles included for full text assessment. Finally, after the full text assessment, 53 articles were included in this review. The screening process and reasons for exclusion are presented in a PRISMA flow diagram [ 16 ] (Fig.  1 ).
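The stage counts above imply the exclusion totals a PRISMA flow diagram reports. A quick consistency check (stage names are my own paraphrase of the text):

```python
def exclusions(flow):
    """Articles dropped between consecutive screening stages."""
    return [(frm, to, n_frm - n_to)
            for (frm, n_frm), (to, n_to) in zip(flow, flow[1:])]

# Stage counts as reported in the review
flow = [
    ("records retrieved", 3629),
    ("after duplicate removal", 2076),
    ("retained after abstract screening", 235),
    ("included after full-text assessment", 53),
]

for frm, to, dropped in exclusions(flow):
    print(f"{frm} -> {to}: {dropped} excluded")
```

The differences (1553 duplicates, 1841 abstract-stage exclusions, 182 full-text exclusions) match the narrative: 2076 abstracts screened, 235 full texts read, 53 included.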

Fig. 1 PRISMA flow diagram

Study and EBI characteristics

The articles included in this review are studies with different designs (Table  3 ). The majority of the articles presented results for a performed EBI ( n = 50), whereas three studies planned to evaluate an EBI (study protocols). The most common study designs were case study and cross-sectional study. A majority of the studies (62.3%) used a single data collection method, where the most common method was interviews, followed by surveys, document reviews, and observations. When multiple data collection methods were used (37.7%), the most common combination was interviews together with surveys, followed by interviews together with document reviews. Four studies used other combinations of data collection methods.

The majority of the studies were carried out in North America ( n = 28, 52.8%), followed by Europe ( n = 11, 20.8%), Africa ( n = 7, 13.2%), Oceania ( n = 6, 11.3%), and Asia ( n = 1, 1.9%). The included studies described the implementation of various types of EBIs, mostly in a hospital care setting (Table  3 ). A majority of the articles focused on the implementation of strategies/practices, and within that group, about one-third ( n = 11) were different types of e-health initiatives [ 17 , 19 , 20 , 21 , 22 , 23 , 24 , 25 , 26 , 27 , 28 ]. Almost half of the EBIs tried to improve health outcomes in an organization, whereas the remaining EBIs targeted health outcomes at population level, health system development, and reorganization of care services. A detailed list of the study characteristics of the 53 included articles is available in Additional file  4 .

All the included articles referred to alignment as an important factor to be considered during implementation of an EBI and/or as an explanation of findings (i.e., either as a lack of alignment, or as an important part of reaching results). In most of the studies, alignment referred to elements within an organization ( n = 33, 62.3%), between organizations ( n = 15, 28.3%), or at health system level ( n = 5, 9.4%) (Additional file  5 , column 5). However, a clear definition of alignment was seldom provided. Of the 53 included articles, 12 provided a definition of alignment [ 6 , 19 , 22 , 26 , 29 , 30 , 31 , 32 , 33 , 34 , 35 , 36 ]. Six of the definitions were the authors’ own (i.e., no reference was given); the other six provided a reference for their definition [ 19 , 22 , 29 , 33 , 35 , 36 ]. Most of these definitions either focused on a specific aspect of alignment (e.g., service charter with goals) or used a general definition of the concept (e.g., interdependency of all human, organization, and technology elements). Furthermore, of the 12 articles providing a definition of alignment, only one clearly expressed (on a general level) that the EBI should be aligned with elements of an organization [ 32 ] (see also Additional file  6 ).

Beyond these 12 definitions, Hilligoss et al. [ 1 ] provided a more extensive description of alignment of EBI, in which they distinguished between structural and social alignment. Structural alignment concerned surface-level structures and processes (e.g., integrating the EBI with existing routines) and adjusting existing practices to align with new routines. Social alignment concerned human elements, such as cognitive and sociocultural aspects of stakeholders (e.g., congruence among the perceptions of different actors).

Eleven articles used an existing implementation framework ( n = 5) and/or organizational change theories ( n = 2), or developed a framework ( n = 4), to facilitate the conceptualization of alignment. The consolidated framework for implementation research (CFIR) [ 7 ] was used in two articles [ 37 , 38 ], and the integrated-promoting action on research implementation in health services (i-PARIHS) framework [ 39 ] in another [ 40 ]. One article [ 21 ] built on the organizational theory of implementation effectiveness [ 41 ], another article [ 6 ] expanded on the exploration, preparation, implementation, sustainment (EPIS) framework [ 42 ], and yet another article [ 37 ] on the national implementation research network (NIRN) frameworks [ 43 ]. Additionally, in two articles [ 30 , 44 ] organizational theories (i.e., relational development system theory and goal-setting theory) were used to explicate mechanisms for enabling alignment. In one article, a conceptual model for healthcare organizational transformation, with alignment as a central component, was developed [ 33 ]. Likewise, a change model with alignment as a key factor was developed based on implicit motivational theories to clarify how alignment facilitates implementation of EBI [ 1 ]. Two articles [ 28 , 45 ] developed evaluation models that included alignment as a central component.

A total of 8 out of 53 articles assessed alignment [ 6 , 19 , 26 , 30 , 35 , 44 , 46 , 47 ], whereas the 45 remaining articles identified alignment as an important factor when analyzing, presenting, and discussing results, but without directly assessing alignment (Additional file  5 , column 2). Among the eight articles assessing alignment, five [ 6 , 19 , 26 , 30 , 35 ] had a definition of alignment (Additional file  6 ), while the remaining three [ 44 , 46 , 47 ] lacked a definition. Alignment data were collected by surveys [ 6 , 44 , 46 , 47 ], interviews [ 6 , 26 , 30 , 35 ], observations [ 19 , 26 ], and reviews of documents [ 26 , 35 ]. Three studies used multiple data collection methods [ 6 , 26 , 35 ], whereas the other five studies used a single method. For example, Walston and Chou [ 44 ] used a single method—an employee survey—to evaluate alignment at 10 hospitals. The survey measured alignment as a function of goal commitment, goal clarity, goal acceptance, goal specificity, staff participation, and available skills and knowledge, controlling for hospital size. Another study using a single method was Zaff et al. [ 6 ], where qualitative data were collected from several community levels and alignment across levels was assessed using cross-case analysis. Nabyonga-Orem et al. [ 35 ] used predetermined parameters for alignment, reviewed strategic planning processes to assess their impact on realizing alignment, and conducted interviews at different health system levels to obtain views on efforts to ensure alignment. Iveroth et al. [ 26 ] was another study using multiple methods—key questions were asked of respondents about their experience and understanding of information technology, strategies used, and information technology alignment. Interviews were supplemented with observations and document reviews to add richness.

All included studies provided information on what should be aligned. We found three types of structural dimensions: visions and goals; systems and processes (e.g., workflows and operations); and resources and competing tasks (e.g., priorities and concurrent programs). We also found three types of social dimensions: behaviors (e.g., leadership and staff actions); thoughts and emotions (e.g., values and understandings); and interpersonal aspects (e.g., culture/climate and relationships). In all cases, a high degree of alignment of these structural and social dimensions with the implementation object (or with each other) was suggested as important for implementation outcomes. Of the 53 included articles, 25 focused only on aspects of structural alignment and eight only on aspects of social alignment; the remaining 20 included both structural and social alignment (Table 4, see also Additional file 5, column 3). In contrast to the main focus of the 12 definitions of alignment presented above, only 8 of the 53 articles focused solely on alignment of inner and outer contextual elements of an organization or system with each other [31, 36, 37, 47, 48, 49, 50, 51]. For example, these eight articles highlighted aligning leadership across organizational levels [47], or goals and cultures across organizations [37], as important facilitators during implementation of the EBI. The remaining 45 articles focused mainly on alignment between the EBI and social and/or structural elements of the organization or system.

In the included articles, we identified different outcomes of alignment along a chain-of-effect continuum [72]. These outcomes fall into three categories: EBI implementation, EBI sustainment, and healthcare performance (Table 5 and Additional file 5, column 4). Most common were descriptions of alignment as a vital facilitator of EBI implementation, or of failed EBI implementation attributed to stakeholders not considering alignment. In turn, alignment, or the lack of it, during implementation of an EBI also affects the sustainment of the EBI and the healthcare performance of the organization. Some articles reported several outcomes and are therefore mentioned in more than one category.

Different actors and strategies were identified as central to achieving alignment. Actors involved in creating and/or sustaining alignment included leaders, healthcare providers, change agents, administrative staff, community actors, policymakers, patients, and others (see Table 6 and Additional file 5, column 6 for an overview). Most articles mentioned actors from more than one group and emphasized the importance of a collaborative process for creating and/or sustaining alignment. The most commonly mentioned actors were leaders and healthcare providers, highlighted in more than half of the articles as having a crucial role in creating and/or sustaining alignment.

Besides the actors that drive alignment (see Table 6), five categories of strategies were identified as important for achieving alignment: design and prepare, contextualize, communicate, motivate, and evaluate (Table 7 and Additional file 5, column 7). The most commonly identified categories were design and prepare and contextualize, both of which involve different types of adaptation of the EBI and/or the context to ensure alignment.

In this scoping review, we identified 53 articles that touched upon the concept of alignment in relation to EBI implementation in health care. The studies represented a wide range of settings and types of EBIs, which may indicate that alignment is important as a concept independent of the context or implementation object. In all included studies, alignment between an EBI and elements of the inner or outer context of the organization or system (e.g., goals or behaviors), and/or between contextual elements themselves, was considered important for outcomes. Yet alignment was seldom clearly defined or empirically measured. Instead, most studies retrospectively considered alignment, or the lack of it, as an important factor that could help explain outcomes. Thus, although the results of the included studies indicate that alignment is important for EBI implementation and its outcomes, solid data to support this are lacking. We propose that future studies in implementation science could benefit from including a hypothesis about, and a direct evaluation of, alignment. Rigorous study designs and proper measurement of alignment, for example by integrating questions about stakeholders' perceptions of alignment and examining the consistency of behaviors, may clarify the effects of alignment on implementation outcomes.

Depending on what is to be aligned (e.g., the EBI with elements of the inner or outer context, or elements of the inner or outer context with each other), we also encourage future research to examine the relative importance of different actors and strategies. Likewise, future studies should strive to clarify what outcomes to expect depending on the form of alignment, as the studies included in this review did not provide sufficient information in this regard. Additionally, only eight of the studies focused on alignment of elements of the inner and outer context with each other, across levels, functions, and/or organizational boundaries. Given the complexity and multi-level nature of the implementation process, this suggests a need for more research on mechanisms and effects beyond aligning the EBI with specific elements of the inner and/or outer context.

In most of the included studies, the explicit focus was on structural alignment: aligning an EBI with the organization's processes, vision, and goals, or aligning these structural elements with each other (e.g., processes with goals). Social alignment (i.e., alignment of behaviors, thoughts and emotions, and cultural and social aspects) was studied somewhat less explicitly. At the same time, most studies emphasized the actions of different stakeholders (e.g., leaders, healthcare providers, change agents, and administrative staff) as important for creating and sustaining alignment. The importance of this shared process of including relevant stakeholders can thereby be viewed as an indication that social alignment is needed to achieve structural alignment, and that the two forms of alignment are complementary. In other words, structural alignment (e.g., aligning the EBI with current practices or available resources) may be necessary to make new ways of acting possible. However, for change to occur, the EBI must also be aligned with stakeholders' perceptions (e.g., the EBI must be in line with their values and culture), and stakeholders' perceptions must be aligned with each other. Because structural and social aspects of alignment go hand in hand, we suggest that future research focus more clearly on their complementarity, for example by developing and evaluating strategies that target the inter-linking of structural and social alignment elements (e.g., between goals and behaviors). The literature on organizational climate and culture [73], which often discusses the inter-relatedness of structural and social aspects, may be of particular interest here.

We identified several strategies considered important for achieving alignment, related to the design and preparation, contextualization, communication, motivation, and evaluation of the EBI. Similar strategies have been identified as important elements of EBI implementation in general [74]. Thus, strategies to achieve alignment should perhaps not be understood as one more thing to do, but rather as a complement that can be integrated with already established implementation strategies. For example, when communicating EBI implementation goals, one could explain how successful implementation contributes to reaching the overall goals and visions of the organization. Considering alignment may also ensure a smoother implementation, as it may clarify the need for appropriate adaptations to the EBI, or to the structures and processes of the organization or communities where the implementation is taking place.

Some of the studies included in this review build on an implementation framework or a change theory [1, 6, 21, 30, 33, 37, 38, 40, 44]. Among these, CFIR [7], i-PARIHS [39], and EPIS [42] are commonly used within implementation science and considered important guides for EBI implementation. Alignment or fit is briefly mentioned in all three frameworks. In CFIR, creating fit is discussed in relation to the innovation and the inner-context (implementation climate) elements, e.g., the importance of aligning the characteristics of an innovation with the norms and values of individuals in the inner context [7]. The i-PARIHS framework highlights the degree of fit of the innovation with existing practice and values [39]; further, the facilitator, the active ingredient of implementation in i-PARIHS, is responsible for overseeing alignment between the innovation and the other elements (context and recipients). In the EPIS framework, fit is mentioned in relation to the innovation and the context elements [42]. EPIS also includes a component called bridging factors, which recognizes the connections and relationships between the inner and outer context and the implementation process; although alignment or fit is not explicitly mentioned in this component, we interpret alignment as central to achieving fit of the EBI here. In our review, we have identified elements that should be aligned (structural and social), actors involved in alignment (e.g., leaders, healthcare providers, and change agents), and strategies to achieve alignment (e.g., design, contextualize, communicate, and evaluate). Altogether, our findings suggest that alignment is complex and concerns several components outlined in these implementation frameworks. Thus, one potentially important next step could be to map the connections between the elements identified in this review onto the corresponding components of these implementation frameworks. For example, when designing and preparing for implementation of an EBI, stakeholders could consider and assess which structural and social elements of an organization or system the EBI needs to be aligned with to become successful. Stakeholders could also assess whether the current alignment of elements in the organization and/or system needs to be revisited given the changes introduced by the EBI.

Thus, to further integrate, and put to concrete use, the findings of this scoping review, we suggest a three-step process to guide the understanding, creation, and assessment of alignment in conjunction with implementing EBIs.

First, attention should be paid to the alignment of the EBI with the structural and social dimensions of the inner and outer context of an organization and/or system. We could not find any clear definition of EBI alignment in the scoped literature. However, in line with Hilligoss et al. [1], we suggest that a practice perspective may be a useful basis for considering alignment during implementation of EBIs. A practice perspective focuses on explaining how an organization and/or system moves from one state to another, viewing actions as consequences of organizational and social structures [75]. Because EBI alignment involves alignment in the context of change, moving beyond the traditional, present-state-focused view of organizational alignment, we propose a definition centered on the alignment of the EBI itself. Thus, EBI alignment primarily involves creating a fit between an EBI and the structural (e.g., visions, goals, systems, processes, resources, and competing tasks) as well as social (e.g., behaviors, thoughts, emotions, and interpersonal) elements of the inner and outer context of an organization or system.

Second, implementation of an EBI not only requires alignment of the EBI with structural and social elements of the organization and/or system, but may also involve re-aligning these elements with each other (i.e., organizational alignment) to facilitate and sustain the introduced change. Implementation of an EBI can cause a ripple effect (i.e., a series of events in a system resulting in the emergence of new structures of interaction and new shared meanings) [76]. Activities to re-align structural and social elements of the inner and outer context may therefore be needed as a consequence of implementing an EBI.

Third, considering alignment across different organizational and/or system levels and functions can be assisted by taking both a vertical, top-down, perspective (e.g., alignment between the organization's main objectives and departments' objectives) and a horizontal, sideways, perspective (e.g., between different priorities within a department) [6]. Thus, (re-)alignment of structural and social elements should not be viewed as a process between two isolated elements, but rather as one potentially involving all affected levels and functions, structurally and socially.

Limitations

In this review, we only considered peer-reviewed articles explicitly using the term "alignment." Terms with similar meanings, such as "collaboration," "coordination," or "consistency," were therefore excluded. This may have excluded literature that could contribute to understanding the interrelatedness of different variables. However, although these terms overlap, they do not have exactly the same meaning and denote somewhat different mechanisms for creating fit between variables. From our perspective, a wider scope would also have risked making this review too extensive, less comprehensible, and, most importantly, less theoretically substantiated. Throughout, we used a rigorous process involving several reviewers in exclusion decisions, ensuring that the set criteria were followed so as to capture the conceptual use of alignment in the implementation literature.

The included studies evaluated a wide variety of EBIs in many different settings with limited commonalities in content (e.g., national-level implementation of a health promotion intervention in Africa and an improvement effort in American hospitals). We believe that this review reflects the contextual and substantive breadth of implementation science and, as such, contributes to the understanding of how alignment can be conceptualized across settings and types of interventions. By categorizing the literature by type and level of intervention (Additional file 4), we have tried to help readers find relevant literature for a specific form of intervention or setting.

Although seldom the centerpiece of implementation studies, alignment is proposed to play an important role in the outcomes of implementing EBIs. In this scoping review, we identify the current knowledge on how alignment is conceptualized in the implementation field, how it has been measured, and which elements should be aligned with the implementation object and/or with each other. We also examine its relation to outcomes, as well as the actors and activities involved in achieving alignment. Based on these findings, we recommend that the concept of alignment be given a more prominent role in the design and evaluation of healthcare EBIs.

Availability of data and materials

All data generated or analyzed during this study are included in this published article and its additional files.

Abbreviations

Community based operations

CFIR: The consolidated framework for implementation research

i-PARIHS: The integrated-promoting action on research implementation in health services framework

EBI: Evidence-based intervention

EPIS: The exploration, preparation, implementation, sustainment framework

NIRN: The national implementation research network frameworks

RCT: Randomized controlled trial

PRISMA-ScR: The preferred reporting items for systematic reviews and meta-analyses extension for scoping reviews guidelines

Hilligoss B, Song PH, McAlearney AS. Aligning for accountable care: Strategic practices for change in accountable care organizations. Health Care Manag Rev. 2017;42(3):192–202. https://doi.org/10.1097/HMR.0000000000000110 .

Bokhour BG, Fix GM, Mueller NM, Barker AM, Lavela SL, Hill JN, et al. How can healthcare organizations implement patient-centered care? Examining a large-scale cultural transformation. BMC Health Serv Res. 2018;18(1):168.

Kathuria R, Joshi Maheshkumar P, Porth SJ. Organizational alignment and performance: past, present and future. Manag Decis. 2007;45(3):503–17.

Ford RC, Sivo SA, Fottler MD, Dickson D, Bradley K, Johnson L. Aligning internal organizational factors with a service excellence mission: an exploratory investigation in health care. Health Care Manag Rev. 2006;31(4):259–69.

von Thiele Schwarz U, Hasson H. Alignment for Achieving a Healthy Organization. In: Bauer G., Jenny G. eds. Salutogenic organizations and change. Dordrecht: Springer; 2013. https://doi.org/10.1007/978-94-007-6470-5_7 .

Lyon AR, Whitaker K, Locke J, Cook CR, King KM, Duong M, et al. The impact of inter-organizational alignment (IOA) on implementation outcomes: evaluating unique and shared organizational influences in education sector mental health. Implement Sci. 2018;13(1):24. https://doi.org/10.1186/s13012-018-0721-1 .

Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4(1):50.

Parisi C. The impact of organisational alignment on the effectiveness of firms’ sustainability strategic performance measurement systems: an empirical analysis. J Manag Gov. 2013;17(1):71–97.

Peterson J, Pearce PF, Ferguson LA, Langford CA. Understanding scoping reviews: Definition, purpose, and process. J Am Assoc Nurse Pract. 2017;29(1):12–6. https://doi.org/10.1002/2327-6924.12380 .

Arksey H, O'Malley L. Scoping studies: Towards a methodological framework. Int J Soc Res Methodol. 2005;8(1):19–32.

Levac D, Colquhoun H, O'Brien KK. Scoping studies: advancing the methodology. Implement Sci. 2010;5:69.

Tricco AC, Lillie E, Zarin W, O'Brien KK, Colquhoun H, Levac D, et al. PRISMA extension for scoping reviews (PRISMA-ScR): checklist and explanation. Ann Intern Med. 2018;169(7):467–73. https://doi.org/10.7326/M18-0850 .

Ouzzani M, Hammady H, Fedorowicz Z, Elmagarmid A. Rayyan-a web and mobile app for systematic reviews. Syst Rev. 2016;5(1):210. https://doi.org/10.1186/s13643-016-0384-4 .

Liberati A, Altman DG, Tetzlaff J, Mulrow C, Gøtzsche PC, Ioannidis JPA, et al. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: explanation and elaboration. J Clin Epidemiol. 2009;62(10):e1–e34. https://doi.org/10.1016/j.jclinepi.2009.06.006 .

Elo S, Kyngas H. The qualitative content analysis process. J Adv Nurs. 2008;62(1):107–15. https://doi.org/10.1111/j.1365-2648.2007.04569.x .

Moher D, Liberati A, Tetzlaff J, Altman DG, The PRISMA Group. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. PLoS Med. 2009;6(7):e1000097. https://doi.org/10.1371/journal.pmed.1000097 .

Greenhalgh T, Morris L, Wyatt JC, Thomas G, Gunning K. Introducing a nationally shared electronic patient record: case study comparison of Scotland, England, Wales and Northern Ireland. Int J Med Inform. 2013;82(5):e125–38. https://doi.org/10.1016/j.ijmedinf.2013.01.002 .

Wade VA, Taylor AD, Kidd MR, Carati C. Transitioning a home telehealth project into a sustainable, large-scale service: a qualitative study. BMC Health Serv Res. 2016;16:183.

Gebre-Mariam M. Governance lessons from an interorganizational health information system implementation in Ethiopia. Electron J Inf Syst Dev Ctries. 2018;84:e12045.

Reszel J, Dunn SI, Sprague AE, Graham ID, Grimshaw JM, Peterson WE, et al. Use of a maternal newborn audit and feedback system in Ontario: a collective case study. BMJ Qual Saf. 2019;28(8):635–44. https://doi.org/10.1136/bmjqs-2018-008354 .

Shaw RJ, Kaufman MA, Bosworth HB, Weiner BJ, Zullig LL, Lee SY, et al. Organizational factors associated with readiness to implement and translate a primary care based telemedicine behavioral program to improve blood pressure control: the HTN-IMPROVE study. Implement Sci. 2013;8:106.

Yusof MM. A case study evaluation of a Critical Care Information System adoption using the socio-technical and fit approach. Int J Med Inform. 2015;84(7):486–99. https://doi.org/10.1016/j.ijmedinf.2015.03.001 .

Sorensen AV, Harrison MI, Kane HL, Roussel AE, Halpern MT, Bernard SL. From research to practice: factors affecting implementation of prospective targeted injury-detection systems. BMJ Qual Saf. 2011;20(6):527–33. https://doi.org/10.1136/bmjqs.2010.045039 .

Piscotty RJ, Tzeng HM. Exploring the clinical information system implementation readiness activities to support nursing in hospital settings. Comput Inform Nurs. 2011;29(11):648–56.

Nazi KM. The personal health record paradox: health care professionals’ perspectives and the information ecology of personal health record systems in organizational and clinical settings. J Med Internet Res. 2013;15(4):e70. https://doi.org/10.2196/jmir.2443 .

Iveroth E, Fryk P, Rapp B. Information technology strategy and alignment issues in health care organizations. Health Care Manag Rev. 2013;38(3):188–200. https://doi.org/10.1097/HMR.0b013e31826119d7 .

Nicks SE, Weaver NL, Recktenwald A, Jupka KA, Elkana M, Tompkins R. Translating an Evidence-Based Injury Prevention Program for Implementation in a Home Visitation Setting. Health Promot Pract. 2016;17(4):578–85. https://doi.org/10.1177/1524839915622196 .

Postema TR, Peeters JM, Friele RD. Key factors influencing the implementation success of a home telecare application. Int J Med Inform. 2012;81(6):415–23. https://doi.org/10.1016/j.ijmedinf.2011.12.003 .

Schmit C, d’Hoore W, Lejeune C, Vas A. Predictors of successful organizational change: The alignment of goals, logics of action and leaders’ roles to initiate clinical pathways. Int J Care Pathw. 2011;15(1):4–14.

Zaff JF, Jones EP, Aasland K, Donlan AE, Lin ES, Prescott JE, et al. Alignment of perceived needs across levels of a community. J Appl Dev Psychol. 2015;40:8–16.

Thomassen JP, Ahaus K, Van de Walle S. Developing and implementing a service charter for an integrated regional stroke service: an exploratory case study. BMC Health Serv Res. 2014;14:141.

Abejirinde IO, Ingabire CM, van Vugt M, Mutesa L, van den Borne B, Busari JO. Qualitative analysis of the health system effects of a community-based malaria elimination program in Rwanda. Res Rep Trop Med. 2018;9:63–75. https://doi.org/10.2147/RRTM.S158131 .

Lukas CV, Holmes SK, Cohen AB, Restuccia J, Cramer IE, Shwartz M, et al. Transformational change in health care systems: an organizational model. Health Care Manag Rev. 2007;32(4):309–20. https://doi.org/10.1097/01.HMR.0000296785.29718.5d .

Kertesz SG, Austin EL, Holmes SK, Pollio DE, Schumacher JE, White B, et al. Making housing first happen: organizational leadership in VA's expansion of permanent supportive housing. J Gen Intern Med. 2014;29(Suppl 4):835–44.

Nabyonga-Orem J, Nabukalu BJ, Andemichael G, Khosi-Mthetwa R, Shaame A, Myeni S, et al. Moving towards universal health coverage: the need for a strengthened planning process. Int J Health Plann Manag. 2018;33(4):1093–109. https://doi.org/10.1002/hpm.2585 .

Carroll V, Reeve CA, Humphreys JS, Wakerman J, Carter M. Re-orienting a remote acute care model towards a primary health care approach: key enablers. Rural Remote Health. 2015;15(3):2942.

Selick A, Durbin J, Casson I, Lee J, Lunsky Y. Barriers and facilitators to improving health care for adults with intellectual and developmental disabilities: what do staff tell us? Health Promot Chronic Dis Prev Can Res Policy Pract. 2018;38(10):349–57.

Teeters LA, Heerman WJ, Schlundt D, Harris D, Barkin SL. Community readiness assessment for obesity research: pilot implementation of the Healthier Families programme. Health Res Policy Syst. 2018;16(1):2. https://doi.org/10.1186/s12961-017-0262-0 .

Harvey G, Kitson A. PARIHS revisited: from heuristic to integrated framework for the successful implementation of knowledge into practice. Implement Sci. 2016;11(1):33.

Bayly M, Forbes D, Blake C, Peacock S, Morgan D. Developing and implementation of dementia-related integrated knowledge translation. Online J Rural Nurs Health Care. 2018;18(2):29–64. https://doi.org/10.14574/ojrnhc.v18i2.509 .

Weiner BJ, Lewis MA, Linnan LA. Using organization theory to understand the determinants of effective implementation of worksite health promotion programs. Health Educ Res. 2008;24(2):292–305. https://doi.org/10.1093/her/cyn019 .

Moullin JC, Dickson KS, Stadnick NA, Rabin B, Aarons GA. Systematic review of the Exploration, Preparation, Implementation, Sustainment (EPIS) framework. Implement Sci. 2019;14(1):1. https://doi.org/10.1186/s13012-018-0842-6 .

National Implementation Research Network (NIRN). National implementation research network's active implementation hub: University of North Carolina Chapel Hill's FPG Child Development Institute; 2020. Available from: http://implementation.fpg.unc.edu/ . Accessed 26 Apr 2021.

Walston SL, Chou AF. Healthcare restructuring and hierarchical alignment: why do staff and managers perceive change outcomes differently? Med Care. 2006;44(9):879–89. https://doi.org/10.1097/01.mlr.0000220692.39762.bf .

Reedy AM, Luna RG, Olivas GS, Sujeer A. Local public health performance measurement: implementation strategies and lessons learned from aligning program evaluation indicators with the 10 essential public health services. J Public Health Manag Pract. 2005;11(4):317–25. https://doi.org/10.1097/00124784-200507000-00010 .

Wood SJ. Cascading strategy in a large health system: Bridging gaps in hospital alignment through implementation. Health Serv Manag Res. 2019;32(3):113–23. https://doi.org/10.1177/0951484818805371 .

O'Reilly CA, Caldwell DF, Chatman JA, Lapiz M, Self W. How leadership matters: The effects of leaders’ alignment on strategy implementation. Leadersh Q. 2010;21(1):104–13.

Nelson WA, Taylor E, Walsh T. Building an ethical organizational culture. Health Care Manag (Frederick). 2014;33(2):158–64. https://doi.org/10.1097/HCM.0000000000000008 .

Buzza CD, Williams MB, Vander Weg MW, Christensen AJ, Kaboli PJ, Reisinger HS. Part II, provider perspectives: should patients be activated to request evidence-based medicine? A qualitative study of the VA project to implement diuretics (VAPID). Implement Sci. 2010;5:24.

Adsul P, Wray R, Gautam K, Jupka K, Weaver N, Wilson K. Becoming a health literate organization: Formative research results from healthcare organizations providing care for undeserved communities. Health Serv Manag Res. 2017;30(4):188–96. https://doi.org/10.1177/0951484817727130 .

Wright J, Dugdale B, Hammond I, Jarman B, Neary M, Newton D, et al. Learning from death: a hospital mortality reduction programme. J R Soc Med. 2006;99(6):303–8. https://doi.org/10.1258/jrsm.99.6.303 .

Rycroft-Malone J, Burton CR, Wilkinson J, Harvey G, McCormack B, Baker R, et al. Collective action for implementation: a realist evaluation of organisational collaboration in healthcare. Implement Sci. 2016;11:17.

Harrison MI, Paez K, Carman KL, Stephens J, Smeeding L, Devers KJ, et al. Effects of organizational context on Lean implementation in five hospital systems. Health Care Manag Rev. 2016;41(2):127–44. https://doi.org/10.1097/HMR.0000000000000049 .

Fleiszer AR, Semenic SE, Ritchie JA, Richer MC, Denis JL. Nursing unit leaders’ influence on the long-term sustainability of evidence-based practice improvements. J Nurs Manag. 2016;24(3):309–18. https://doi.org/10.1111/jonm.12320 .

Vos L, Duckers ML, Wagner C, van Merode GG. Applying the quality improvement collaborative method to process redesign: a multiple case study. Implement Sci. 2010;5:19.

Egeland KM, Skar AS, Endsjo M, Laukvik EH, Baekkelund H, Babaii A, et al. Testing the leadership and organizational change for implementation (LOCI) intervention in Norwegian mental health clinics: a stepped-wedge cluster randomized design study protocol. Implement Sci. 2019;14(1):28. https://doi.org/10.1186/s13012-019-0873-7 .

de Savigny D, Webster J, Agyepong IA, Mwita A, Bart-Plange C, Baffoe-Wilmot A, et al. Introducing vouchers for malaria prevention in Ghana and Tanzania: context and adoption of innovation in health systems. Health Policy Plan. 2012;27(Suppl 4):iv32–43.

Rahm AK, Boggs JM, Martin C, Price DW, Beck A, Backer TE, et al. Facilitators and barriers to implementing Screening, Brief Intervention, and Referral to Treatment (SBIRT) in primary care in integrated health care settings. Subst Abus. 2015;36(3):281–8. https://doi.org/10.1080/08897077.2014.951140 .

Kawonga M, Blaauw D, Fonn S. Aligning vertical interventions to health systems: a case study of the HIV monitoring and evaluation system in South Africa. Health Res Policy Syst. 2012;10:2.

Schneider H, English R, Tabana H, Padayachee T, Orgill M. Whole-system change: case study of factors facilitating early implementation of a primary health care reform in a South African province. BMC Health Serv Res. 2014;14:609.

McIntyre EM, Baker CN, Overstreet S, New Orleans Trauma-Informed Schools Learning Collaborative. Evaluating foundational professional development training for trauma-informed approaches in schools. Psychol Serv. 2019;16(1):95–102. https://doi.org/10.1037/ser0000312 .

Healey J, Conlon CM, Malama K, Hobson R, Kaharuza F, Kekitiinwa A, et al. Sustainability and Scale of the Saving Mothers, Giving Life Approach in Uganda and Zambia. Glob Health Sci Pract. 2019;7(Suppl 1):S188–206. https://doi.org/10.9745/GHSP-D-18-00265 .

Laws R, Hesketh KD, Ball K, Cooper C, Vrljic K, Campbell KJ. Translating an early childhood obesity prevention program for local community implementation: a case study of the Melbourne InFANT Program. BMC Public Health. 2016;16:748.

Sarkies MN, White J, Morris ME, Taylor NF, Williams C, O'Brien L, et al. Implementation of evidence-based weekend service recommendations for allied health managers: a cluster randomised controlled trial protocol. Implement Sci. 2018;13(1):60. https://doi.org/10.1186/s13012-018-0752-7 .

Pucher KK, Candel MJ, Krumeich A, Boot NM, De Vries NK. Effectiveness of a systematic approach to promote intersectoral collaboration in comprehensive school health promotion-a multiple-case study using quantitative and qualitative data. BMC Public Health. 2015;15:613.

Stumbo SP, Ford JH 2nd, Green CA. Factors influencing the long-term sustainment of quality improvements made in addiction treatment facilities: a qualitative study. Addict Sci Clin Pract. 2017;12(1):26. https://doi.org/10.1186/s13722-017-0093-x .

Glisson C, Williams NJ, Hemmelgarn A, Proctor E, Green P. Aligning organizational priorities with ARC to improve youth mental health service outcomes. J Consult Clin Psychol. 2016;84(8):713–25. https://doi.org/10.1037/ccp0000107 .

Margolis PA, DeWalt DA, Simon JE, Horowitz S, Scoville R, Kahn N, et al. Designing a large-scale multilevel improvement initiative: the improving performance in practice program. J Contin Educ Heal Prof. 2010;30(3):187–96. https://doi.org/10.1002/chp.20080 .

Turner S, Ramsay A, Perry C, Boaden R, McKevitt C, Morris S, et al. Lessons for major system change: centralization of stroke services in two metropolitan areas of England. J Health Serv Res Policy. 2016;21(3):156–65. https://doi.org/10.1177/1355819615626189 .

Freeman T, Baum F, Labonte R, Javanparast S, Lawless A. Primary health care reform, dilemmatic space and risk of burnout among health workers. Health (London). 2018;22(3):277–97.

Kegeles SM, Rebchook G, Tebbetts S, Arnold E, Team T. Facilitators and barriers to effective scale-up of an evidence-based multilevel HIV prevention intervention. Implement Sci. 2015;10:50.

von Thiele SU, Lundmark R, Hasson H. The Dynamic Integrated Evaluation Model (DIEM): achieving sustainability in organizational intervention through a participatory evaluation approach. Stress Health. 2016;32(4):285–93.

Schneider B, Ehrhart MG, Macey WH. Organizational climate and culture. Annu Rev Psychol. 2013;64(1):361–88.

Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, et al. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci. 2015;10(1):21.

Feldman MS, Orlikowski WJ. Theorizing practice and practicing theory. Organ Sci. 2011;22(5):1240–53.

Jagosh J, Bush PL, Salsberg J, Macaulay AC, Greenhalgh T, Wong G, et al. A realist evaluation of community-based participatory research: partnership synergy, trust building and related ripple effects. BMC Public Health. 2015;15(1):725.

Download references

Acknowledgements

We would like to thank librarians GunBrit Knutssøn and Sabina Gillsund at the Karolinska Institutet University Library (KIB) for helping with the development of search strategies and literature searches. We would also like to thank Anton Tollin and Josefine Larsson for their work in the initial parts of the scoping review process.

We received no funding for this scoping review. Information on the funding of the included articles is reported in each article or on the journal's website. Funding was not subject to any analysis in this scoping review, as we consider it of little relevance given our research questions. Open Access funding provided by Umeå University.

Author information

Authors and affiliations

Department of Psychology, Umeå University, SE 901 87, Umeå, Sweden

Robert Lundmark

Procome research group, Department of Learning, Informatics, Management and Ethics, Medical Management Centre, Karolinska Institutet, SE 171 77, Stockholm, Sweden

Henna Hasson & Anne Richter

Unit for implementation and evaluation, Center for Epidemiology and Community Medicine, Stockholm County Council, SE 171 29, Stockholm, Sweden

Henna Hasson, Anne Richter, Ermine Khachatryan, Amanda Åkesson & Leif Eriksson


Contributions

RL, AR, and HH developed the study conception and design, identified the research questions and designed the search strategy. RL, AR, EK, AÅ, and LE screened search results and reviewed articles against inclusion criteria. EK and AÅ extracted data and assessed articles with support from RL, AR and LE. RL, ER, AÅ, and LE analyzed the findings and synthesized the results. RL and LE drafted the manuscript. All authors critically revised the manuscript and approved the final version.

Corresponding author

Correspondence to Robert Lundmark.

Ethics declarations

Ethics approval and consent to participate

Not applicable

Consent for publication

Competing interests

All authors declare that they have no competing interests.

Additional information

Publisher's note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1.

PRISMA-ScR checklist.

Additional file 2: Table A1–A4.

Search strategies.

Additional file 3: Table A5.

Definitions of items for data charting.

Additional file 4: Table A6.

Study characteristics.

Additional file 5: Table A7.

Alignment characteristics.

Additional file 6: Table A8.

Definitions of alignment.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article

Cite this article

Lundmark, R., Hasson, H., Richter, A. et al. Alignment in implementation of evidence-based interventions: a scoping review. Implementation Sci 16, 93 (2021). https://doi.org/10.1186/s13012-021-01160-w


Received: 04 May 2021

Accepted: 02 October 2021

Published: 28 October 2021

DOI: https://doi.org/10.1186/s13012-021-01160-w



Understanding the implementation of evidence-informed policies and practices from a policy perspective: a critical interpretive synthesis

Heather L. Bullock

1 Department of Health Research Methods, Evidence and Impact, McMaster University, 1280 Main Street West, Hamilton, Ontario L8S 4L6 Canada

John N. Lavis

2 McMaster Health Forum, Hamilton, Canada

Michael G. Wilson

Gillian Mulvale

3 DeGroote School of Business, McMaster University, Burlington, Canada

Ashleigh Miatello

Associated data

Not applicable

Background

The fields of implementation science and knowledge translation have evolved somewhat independently from the field of policy implementation research, despite calls for better integration. As a result, implementation theory and empirical work often neither reflect the implementation experience through a policy lens nor benefit from the scholarship of all three fields. This means policymakers, researchers, and practitioners may find it challenging to draw on theory that adequately reflects their implementation efforts.

Methods

We developed an integrated theoretical framework of the implementation process from a policy perspective by combining findings from these fields using the critical interpretive synthesis method. We began with the compass question: How is policy currently described in implementation theory and processes, and what aspects of policy are important for implementation success? We then searched 12 databases as well as gray literature and supplemented these documents with other sources to fill conceptual gaps. Using a grounded and interpretive approach to analysis, we built the framework constructs, drawing largely from the theoretical literature, and then tested and refined the framework using the empirical literature.

Results

A total of 11,434 documents were retrieved and assessed for eligibility, and 35 additional documents were identified through other sources. Eighty-six unique documents were ultimately included in the analysis. Our findings indicate that policy is described as (1) the context, (2) a focusing lens, (3) the innovation itself, (4) a lever of influence, (5) an enabler/facilitator or barrier, or (6) an outcome. Policy actors were also identified as important participants in or leaders of implementation. Our analysis led to the development of a two-part conceptual framework, including process and determinant components.

Conclusions

This framework begins to bridge the divide between disciplines and provides a new perspective about implementation processes at the systems level. It offers researchers, policymakers, and implementers a new way of thinking about implementation that better integrates policy considerations and can be used for planning or evaluating implementation efforts.

Contributions to the literature

  • This study unpacks the implementation of evidence-informed policies and practices through the systematic development of new theory drawing from three distinct fields of scholarship (policy implementation, implementation science, and knowledge translation), thereby answering a call from implementation researchers for more integration.
  • The conceptual framework views implementation from the “outer context” and includes (1) a model describing the process of implementation and (2) a framework that identifies the policy-related determinants of implementation success.
  • This conceptual framework provides researchers, policymakers, and implementers with a new way of thinking about implementation and can be used for planning or evaluating implementation efforts.

Implementation has captured the attention of public policy scholars for well over 50 years [1], yet it remains relatively under-studied compared to other stages of policy-making. The reasons for this are many and include challenges in isolating implementation from other parts of the policy process and a lack of agreement about conceptual underpinnings [2]. This in turn makes it difficult to identify relevant explanatory variables, and analysts must often resort to a "long list of variables that are potentially useful" [2]. Even once decisions regarding these challenges have been made, the complex, multi-level, and multi-faceted nature of implementation makes it difficult to design and conduct high-quality empirical research that can offer useful generalizations to those interested in improving the process of implementation and thus achieving better policy results [2].

Research on implementation has also independently come into sharp focus through the related fields of knowledge translation and implementation science. Conceptual work on implementation from these fields has increased at a seemingly exponential rate to the point where there is a great deal of focus on sorting and classifying the many frameworks, models, and theories and providing guidance toward their use [3–6]. The empirical literature is also rapidly increasing, with over 6200 systematic reviews on consumer-targeted, provider-targeted, and organization-targeted implementation strategies in the health field alone (based on a search of www.healthsystemsevidence.org).

Despite the large number of models, theories, and frameworks being generated in the knowledge translation and implementation science fields, the role of policy in the implementation process appears to be under-theorized. When policy is included in conceptual work, it is often identified as a contextual variable [7, 8] rather than being central to the implementation concept itself. It is also often presented as a broad category of "policy" rather than as a variable that is specific, and therefore measurable, in empirical work. This lack of conceptual clarity and empirical work about policy and other policy-related structural constructs has been noted by several researchers. For example, a systematic review of measures assessing the constructs affecting implementation of health innovations makes specific reference to the "relatively few" measures available to assess structural constructs, which the authors define as "political norms, policies and relative resources/socio-economic status" [9]. As a result, the field of public policy appears to have, on the one hand, too many policy-related implementation variables, while on the other hand, the fields of knowledge translation and implementation science appear to have too few.

In recent years, some researchers have recognized these silos in scholarship and have called for more implementation research that integrates public policy, implementation science, and knowledge translation perspectives [10]. For example, Johansson concludes that implementation problems could be better understood through the inclusion of research in public administration, with more focus on issues such as resource allocation, priorities, ethical considerations, and the distribution of power between actors and organizational boundaries [11].

In addition to these challenges, much of the seminal scholarship on implementation from both the public policy and the knowledge translation and implementation literatures comes from the USA [12–15]. This has resulted in a concentration of theoretical and empirical work that reflects the governance, financial, and delivery arrangements particular to the USA [16, 17] and that may not always readily apply in other contexts. These differences are particularly marked when it comes to the policy domain of health, given the differences of the US system compared to most others [18]. One notable exception is the European contributions to the "second generation" of policy scholarship on implementation, which adopted the perspective of those at the "coal face" of policy implementation [19].

In response to these challenges, the objective of our study was to develop an integrated theoretical framework of the implementation process from a policy perspective by combining findings from the public policy, implementation science, and knowledge translation fields. By integrating knowledge from these fields using a critical interpretive synthesis approach, we specifically examine how policy considerations are described in implementation theories, frameworks, and processes from existing published and gray literature. Our goal was to generate a theoretical framework to foster an improved understanding of the policy contributions to implementation that can be used in future studies to generate testable hypotheses about large-scale system implementation efforts.

Study design

Given the broad goal of this study, the question of interest, and the scope of potentially applicable literature from discrete fields that could inform this work, we selected a critical interpretive synthesis (CIS) approach. Drawing from the techniques of meta-ethnography combined with traditional systematic review processes, CIS employs an inductive and interpretive technique to critically inspect the literature and develop a new conceptualization of the phenomenon of interest. Unlike traditional systematic reviews that often focus on questions of effectiveness, CIS is helpful in generating mid-range theories with strong explanatory power [20, 21]. This is suitable for our goal of developing a conceptual framework that better integrates findings from diverse fields, and it affords the opportunity to critically inspect both individual studies and the literature from each field as a whole in terms of the nature of the assumptions underlying each field and what has influenced their proposed solutions [22]. The method begins with a compass question, which evolves throughout the course of the review [22, 23]. Our compass question was as follows: How is policy currently described in implementation theory and processes, and what aspects of policy are important for implementation success?

Review scope

Our review cast a very broad net in terms of implementation processes and theories. While our main focus was on large-scale implementation efforts in health, behavioral health, and human services areas that are not specific to a particular condition, we also drew from other large-scale implementation theories and empirical work, such as from the field of environmental science, that might yield important insights toward a more integrated framework of implementation. We drew from two key sources of literature: (1) existing frameworks, models, and theories (from public policy, implementation science, and knowledge translation) and (2) empirical studies that report on specific implementation processes.

Given our interest in implementation processes from a policy perspective, we limited our review to implementation frameworks, models, theories, and empirical reports that describe implementation efforts at a community or systems level (e.g., city, province/state, or country), where policy considerations are most likely to be an important factor. Implementation of a single evidence-based practice (unless at a large scale) or implementation in a single organization was excluded, as was research that focused on behavior change at the individual level.

Electronic search strategy

Using the compass question, and in consultation with a librarian, we constructed a table of Boolean-linked key words and then tested several search strategies (Table 1). The search was then conducted in October 2020 for the time period of January 2000–September 2020 using the following 12 databases: ASSIA, CINAHL (via EBSCO), EMBASE (via Ovid), ERIC, Health Star (via Ovid), MEDLINE (via Ovid), PAIS Index, PolSci, PsychINFO, Social Sciences Abstracts, Social Services Abstracts, and Web of Science. The dates for the policy databases (PolSci and Social Sciences Abstracts) were extended back to 1973 to ensure key conceptual articles would be retrieved, such as the seminal work by Sabatier and Mazmanian in 1980 [14]. A gray literature search was also conducted using Health Systems Evidence (which indexes policy documents related to health system arrangements and implementation strategies, as well as systematic reviews). Similar search strings were used across all databases, with minor adjustments to ensure searches were optimized. We prioritized sensitivity (comprehensiveness) over specificity (precision) in our search strategy.

Search terms

Article selection

We excluded articles based on their titles and abstracts if they did not fit within the study scope or if they were not conceptual or empirical works. We created additional inclusion criteria based on the following questions: (1) Is there a moderate (or greater) chance that the article will shed light on the role of policy in an implementation process or on the outcomes of the process? (2) Does the article describe implementation efforts at a community or systems level? And (3) does the article identify actors at the government, organizational, or practice level, such as policy entrepreneurs, who may be central to policy implementation efforts? Any articles that did not meet at least one of these criteria were excluded.

Complementary to the formal search and in keeping with the inductive strategies that are part of the CIS process, we also conducted hand searches of the reference lists of relevant publications and searched the authors’ personal files to identify further articles and theoretically sampled additional articles to fill conceptual gaps as the analysis proceeded.

After completing the searches, an Endnote database was created to store and manage results. Once duplicates were removed, a random selection of two percent of the articles was independently screened by two reviewers (H.B. and A.M.) who were blinded to each other’s ratings and used the same inclusion criteria. The reviewers classified each title and abstract as “include”, “exclude”, or “uncertain”. Inter-rater agreement was determined using the kappa statistic. This process was undertaken to improve the methodological rigor by enhancing trustworthiness and stimulating reflexivity, not to establish a quantitative assessment per se [24]. Any discrepancies were then discussed between reviewers until consensus was reached. Next, one reviewer assessed the remaining titles and abstracts. Articles classified as “include” or “uncertain” were kept for full text review.
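The inter-rater agreement step above can be sketched as follows. This is an illustrative computation of Cohen's kappa (our own sketch, not the authors' code), which corrects observed agreement for the agreement expected by chance; the reviewer labels shown are hypothetical:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(rater_a)
    assert n == len(rater_b) and n > 0
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of the two raters' marginal proportions,
    # summed over all categories used by either rater.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(
        (counts_a[c] / n) * (counts_b[c] / n)
        for c in set(rater_a) | set(rater_b)
    )
    return (observed - expected) / (1 - expected)

# Hypothetical screening decisions ("include"/"exclude"/"uncertain"):
a = ["include", "exclude", "exclude", "include", "uncertain", "exclude"]
b = ["include", "exclude", "include", "include", "uncertain", "exclude"]
print(round(cohens_kappa(a, b), 2))  # 0.74
```

By convention, values above roughly 0.6 are read as substantial agreement, which is how the authors interpret their reported kappa of 0.72.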

The full text of the remaining articles was then assessed by one reviewer. Articles were excluded at this stage if they did not provide detailed insight into the compass question. Articles were also sorted according to whether they were a conceptual contribution (i.e., presented a model, theory, framework or theoretical concept on implementation) or an empirical contribution (i.e., used qualitative, quantitative, review or other research methods to present new findings, or an analysis of implementation).

Data analysis and synthesis

Our data analysis proceeded in four stages. First, while screening and assessing the articles for inclusion, we noted some general observations of how policy was incorporated in the literature from each field of interest (policy/public administration, implementation, and knowledge translation). Second, we classified articles according to how policy was portrayed in implementation theory and processes. Third, we constructed a data extraction template for conceptual and empirical studies that included (1) descriptive categories (the author(s), the name of the model, theory, or framework (if provided), year of publication, author location, focus of the article, and whether a graphic or visual aid was included), (2) content from the article that addressed the compass question regarding how policy is portrayed and what aspects are important for success, and (3) interpretive categories, including “synthetic constructs” developed by the review team from the article and additional notes on how the article contributed to the development of the conceptual model. Additionally, the data extraction form for the conceptual articles included a classification of the type of framework according to Nilsen’s taxonomy of implementation models, theories, and frameworks [3].

In the fourth and final stage, we initially focused on the conceptual literature and used it as a base from which to build our integrated conceptual model. We developed the synthetic constructs by reviewing the content from each article that addressed the compass question and interpreting the underlying evidence using a constant comparative method to ensure that the emerging synthetic constructs were grounded in the data, similar to a grounded theory approach [25]. These synthetic constructs were then used to begin to build the conceptual model and an accompanying graphic representation of it. We then critiqued the emerging constructs to identify gaps in the evidence and in the constructs themselves.

Using this emerging model, we purposively sampled additional conceptual literature to fill the gaps that we identified and to ensure we incorporated as many relevant concepts as possible. We did this by consulting reviews of existing models, theories, and frameworks [2–6] to identify additional relevant concepts not captured by our search strategy, and by hand searching the reference sections of some seminal conceptual papers [7, 26]. Once saturation of the conceptual literature was reached, we purposively sampled a subset of the empirical literature and used this subset to “test” the model and to add detail to the theoretical constructs gleaned from the empirical reports. We used a similar data extraction template, with the exceptions of removing the descriptive category of model or theory name and the interpretive classification using the Nilsen taxonomy [3], and adding the descriptive category of “methodology”. If our model did not capture findings from the empirical studies, we revised it and re-tested. This process continued until saturation was reached and additional empirical studies yielded no further insights into our model.

The methods reported here are based on a protocol developed prior to initiating the study. The protocol and a note about the four ways that the reported methods differed from the protocol are available upon request.

Search results and article selection

Our database search retrieved 16,107 documents, of which 11,434 were unique once duplicates were removed. The review of titles and abstracts was completed independently by two reviewers on a random sample (n = 171) of the documents. The kappa score was 0.72, indicating substantial agreement. Figure 1 provides a flow diagram outlining the search strategy. Applying these criteria to the remaining titles and abstracts resulted in 1208 documents included for full-text review. The full-text review excluded an additional 940 documents, leaving 268 potentially relevant documents (excluded documents and the rationale for exclusion are available upon request). Of these, 23 conceptual documents, 243 empirical documents, and two documents that included both conceptual and empirical elements were included for the data extraction and analysis phase. We sampled and extracted data on all of the conceptual articles. For the empirical articles, we chose a maximum variation sampling approach based on the subject matter and article topic, with an initial sample of 10% of the articles. We also noted that nine of the articles related to a large, multi-year national implementation study [27–35]. Because this was the largest and most comprehensive account of the role of policy in large-scale implementation efforts identified through our search, we included these as a sub-group for data extraction. This approach led to data extraction for 34 empirical articles.
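As a quick arithmetic check (ours, not part of the article), the selection counts reported in this paragraph are internally consistent:

```python
# Reconciling the document counts in the text: 1208 full-text reviews minus
# 940 exclusions leaves 268 potentially relevant documents, which split into
# 23 conceptual, 243 empirical, and 2 mixed (conceptual + empirical) documents.
full_text_reviewed = 1208
excluded_at_full_text = 940
conceptual, empirical, mixed = 23, 243, 2

remaining = full_text_reviewed - excluded_at_full_text
assert remaining == 268
assert conceptual + empirical + mixed == remaining
print(remaining)  # 268
```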

[Figure 1: Literature search and study selection flow diagram]

In addition to these two approaches, we sampled articles that filled conceptual gaps as our model developed. This process resulted in the retrieval of an additional 26 conceptual articles and 3 empirical articles. In total, 86 unique documents were included, with two of these documents used in both the conceptual and empirical data extraction (Tables 2 and 3). While our process was inclusive of English-language publications from any country, the majority of studies were conducted by US researchers (n = 57), with the others coming mainly from other Western countries: the UK (n = 8), the Netherlands (n = 7), Australia (n = 5), Canada (n = 2), Sweden (n = 2), Germany (n = 2), and Europe, China, and the OECD (n = 1 each). Articles covered a range of topics including health and health care, public health, mental health and addictions, children and youth, social care, justice, and climate change, among others. The conceptual documents included all of the categories of theories, models, and frameworks identified by Nilsen [3], with the determinants framework type being most common. The empirical articles employed a wide array of methods that fall into the broad categories of qualitative, quantitative, and mixed methods.

Overview of included conceptual literature

a Also included in empirical literature

Overview of Included Empirical Literature

a Also included in conceptual literature

b Nine articles described individually in subsequent rows

General observations

Through this process, we noted several general observations regarding the characteristics of existing literature. In terms of the scholarly disciplines, most of the implementation science literature focused on the organizational or service provider levels with an emphasis on changing practice, often by introducing an evidence-informed policy or practice (EIPP). The knowledge translation literature included policymakers as a target audience for research evidence, but the focus was on the agenda setting or policy formulation stages of the policy cycle, as opposed to the implementation of an EIPP. Here, the scholarship focused on strategies to increase the use of evidence in policy decision-making. The public policy literature included theory describing “top-down”, “bottom-up”, and integrated approaches to implementing an EIPP. The object of implementation in this area was the policy itself, rather than a specific program or practice. There was often no clear articulation of independent and dependent policy-related implementation variables across any field, although many articles did partially address this.

How policy is described in implementation theory and processes

Our coding based on the compass question resulted in the following characterization of how policy is described in implementation theory and processes. Policy is described as:

  • The context in which implementation occurs (i.e., a policy is only briefly cited as the reason for implementation)
  • A focusing lens, signaling to systems what the priorities should be (i.e., policy statements or attention by policymakers are referred to as a signal about what is important to prioritize)
  • The innovation itself, i.e., the implementation object (the “thing” being implemented is a policy, such as new legislative policy on tobacco cessation)
  • A lever of influence in the implementation process (i.e., policy is identified as at least one of the factors influencing the implementation process)
  • An enabler/facilitator of, or barrier to, implementation (a moderating variable) (i.e., while policy is identified as being external to the implementation effort, it is later found to be a barrier or facilitator)
  • An outcome (i.e., the success of the implementation process is at least partially defined and measured by a change in policy)

In addition, policy actors are identified as important participants in, or leaders of, implementation.

Theoretical framework

Our approach to developing the theoretical framework was twofold. The findings from our analysis suggested constructs that addressed both the process of implementation and the factors underpinning the success or failure of implementation. We therefore first developed a process model [3] that describes the steps in the process of translating EIPPs into effectively embedded system changes. Next, we developed a determinants framework, which specifies the types of policy determinants (independent variables) that affect implementation outcomes (dependent variables). This two-part theoretical framework achieves two goals: (1) the process model is most useful in describing the process of implementation from a policy perspective and (2) the determinants framework is most useful for understanding and explaining policy-related influences on implementation outcomes.

Part 1—process model

Figure 2 depicts this novel process model, focusing on one policy or system level. What follows is a narrative description of the model.

[Figure 2: Process model of implementation from a policy perspective depicting the process at one policy level]

Policy is shaped as it moves through systems. The process through which policy travels from one level to another is known as policy transfer [36, 45, 46]. Each policy level is nested in a context that includes existing ideas (values, evidence, etc.), interests (interest groups, civil society, etc.), institutions (existing rules and institutional structures), and external factors (e.g., a natural disaster or a change in economic conditions) that affect the interpretation of the policy package [107, 108]. This context affects how a problem is defined, whether it has the attention of decision makers, and whether it is up for active decision-making. This aligns with the “problem definition” and “agenda setting” stages of the policy cycle, but is also described as part of the “exploration phase” in implementation science [12, 109]. Once a decision has been reached that something should be done to address a given issue, attention shifts to the “policy development” stage of the policy cycle, which aligns with the “adoption decision and preparation” stage of implementation. It is during the policy development/adoption decision and preparation stage that the policy package is developed.

Policy package

A policy package usually includes a mix of policy levers or instruments, including legal and regulatory instruments, economic instruments, voluntary instruments, or information and education instruments [ 58 , 110 ]. The policy package can also include some implementation guidance such as a description of the overall implementation strategy architecture, the major streams of activity, timing of events and milestones, and roles and responsibilities.

The level of ambiguity of the policy package, in terms of its goals and the means of attaining them, and the amount of conflict among actors with respect to the package are important for characterizing the implementation process and explaining its outcomes. According to Matland [ 64 ], considering ambiguity and conflict together yields four types of implementation processes: (1) administrative implementation occurs when there is low policy ambiguity and low policy conflict (e.g., eradication of smallpox); (2) political implementation occurs when there is low ambiguity but high conflict (e.g., public transit); (3) experimental implementation occurs when there is high ambiguity but low conflict (e.g., Head Start programs for young children); and (4) symbolic implementation occurs when both ambiguity and conflict are high, so that policies have only a referential goal and actors hold differing perspectives on how to translate the abstract goal into instrumental actions (e.g., establishing youth employment agencies).
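Purely for illustration, Matland's four-cell typology can be expressed as a small lookup. The cell labels come from Matland [ 64 ] as summarized above; the encoding as Python is our own sketch, not part of the framework:

```python
# Matland's ambiguity-conflict typology [64], encoded as a lookup table.
# Keys are (ambiguity, conflict) levels; values are the implementation types.
MATLAND_TYPOLOGY = {
    ("low", "low"): "administrative implementation",    # e.g., smallpox eradication
    ("low", "high"): "political implementation",        # e.g., public transit
    ("high", "low"): "experimental implementation",     # e.g., Head Start programs
    ("high", "high"): "symbolic implementation",        # e.g., youth employment agencies
}

def classify_implementation(ambiguity: str, conflict: str) -> str:
    """Return the expected implementation type for a policy package,
    given its levels of ambiguity and conflict ('low' or 'high')."""
    return MATLAND_TYPOLOGY[(ambiguity, conflict)]

print(classify_implementation("low", "low"))  # -> administrative implementation
```

Such a coding could be used, for example, to tag case studies by implementation type before comparing their outcomes.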

Implementation process

The policy implementation process can start at any level, move in any direction, and can “skip” levels. Power also shifts as implementation proceeds through levels [ 29 , 56 ]. The level with the most implementation activity tends to have the most power. This is true not only for different levels of governance, but also as implementation cascades across organizations, through “street-level bureaucrats” [ 13 ], and on to the end-user or target population (the “recipient”) of the implementation process. Policy decisions at one level become context for other levels. Implementation activities at one level can exert either direct or indirect effects on another level. The context surrounding each level (prevailing ideas, interests, institutions, and external events) influences the acceptability and ultimate success of implementation. Finally, the overall implementation approach may need to shift over time in response to a constantly evolving context. For example, one study found it necessary to change the implementation approach for a road safety program in response to changes in policy authority [ 81 ].

The process of implementation is undertaken in order to lead to outcomes, which can be separated and measured at different levels. Proctor et al. [ 69 ] identify three separate outcomes: (1) implementation outcomes, (2) service outcomes, and (3) recipient-related outcomes. Along with these outcomes, our model includes policy- and systems-level outcomes. These can be evaluated according to policy outputs (e.g., enforcement variables, change of perspective of street-level staff), policy outcomes (e.g., unemployment levels, life expectancy of the population), or indices of policy system change (e.g., administrative re-organization, privatization) [ 56 ]. While the measures and levels will vary depending on the size, scale, and focus of implementation, there is broad agreement that outcomes should be clearly defined a priori and precisely measured. Evaluation findings regarding outputs and outcomes can dynamically feed back into the implementation process as it unfolds, creating feedback loops and making the process highly dynamic and multi-directional.

Part 2—determinants framework

Figure 3 presents an overview of our determinants framework and the relationships among the determinants. Our findings point to three sets of policy-related factors that affect the process, outputs, and outcomes of implementation: (1) policy instruments and strategies, (2) determinants of implementation, and (3) policy actors, including their characteristics, relationships, and context. Collectively, these feed into the process of implementation, which proceeds in an iterative fashion through the stages of exploration, installation/preparation, initial implementation, and full implementation/sustainment [ 12 , 109 ]. The types of policy influences vary according to the stage of implementation [ 12 ]. The process of implementation leads to a variety of outputs and outcomes as described above.


Fig. 3 Determinants framework of implementation from a policy perspective

Policy instruments and strategies

Policy instruments and strategies are the most common set of factors mentioned in the literature, and we found evidence for each of the instrument types described here, although with varying levels of detail. Policy instruments can be applied to implementation in differing ways, often with two or three levers used concurrently to implement a single initiative or strategy [ 90 ]. To classify these strategies in a meaningful way, we drew on and adapted elements of a mutually exclusive and collectively exhaustive framework that identifies key features of health and social systems [ 107 ] and homed in on strategies that are particularly important for implementation (Table 4). These include strategies focused on the governance arrangements, financial arrangements, service delivery arrangements, and implementation-related supports in systems. We then divided these strategies according to the intended “target” of implementation. Common targets of implementation from a policy perspective include the whole system, organizations, the workforce or service providers, consumers, and the innovation itself (the EIPP to be implemented). We note, however, that because policy-related variables have not been treated with the same specificity as other types of implementation variables, the most common strategies do not reflect the full array of strategies that could be employed.

Table 4 Policy-related strategies and examples of those strategies for implementation according to type of target
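For illustration only, the two-way classification above (system arrangements by implementation targets) could be represented as a minimal data model. The category and target labels come from the text; the class itself, and any example strategy built with it, is our own sketch rather than part of the published framework:

```python
from dataclasses import dataclass

# Category labels from the text: the system arrangement a strategy modifies [107]
# and the intended target of implementation. The data model is illustrative only.
ARRANGEMENTS = {"governance", "financial", "service delivery", "implementation supports"}
TARGETS = {"system", "organization", "workforce", "consumer", "innovation"}

@dataclass(frozen=True)
class PolicyStrategy:
    description: str
    arrangement: str  # which system arrangement the strategy draws on
    target: str       # intended target of implementation

    def __post_init__(self):
        # Enforce the mutually exclusive, collectively exhaustive categories.
        if self.arrangement not in ARRANGEMENTS:
            raise ValueError(f"unknown arrangement: {self.arrangement}")
        if self.target not in TARGETS:
            raise ValueError(f"unknown target: {self.target}")
```

Coding each identified strategy this way would make the classification grid of Table 4 directly queryable, e.g., listing all financial-arrangement strategies aimed at organizations.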

Determinants

Our framework identifies eight categories of determinants (see “Determinants” box and elsewhere in Fig. 3). Each of these categories represents a suite of factors that are hypothesized to independently affect implementation outcomes. These determinants are described briefly below and in more detail in Table 5.

Table 5 Determinants of implementation from a policy perspective and the factors that characterize the determinants

I—Characteristics of the evidence-informed policy or practice (EIPP). The success or failure of a particular policy package cannot be evaluated based on its intrinsic characteristics alone [ 56 ]. Instead, it is important to examine whether the policy selected is an appropriate “fit” with the problem [ 91 ], well-justified [ 78 ], and aligned with existing context [ 12 , 88 ].

II—Policy formulation process. This is the shape given to a policy by the initial formation processes [ 45 ]. It includes who in government is responsible for formulating the policy, their legitimacy, the extent to which there is opportunity to provide feedback, how much feedback is given, and the responsiveness in terms of adjustments made [ 45 ].

III—Vertical public administration and thickness of hierarchy. Vertical public administration is the term used to identify the layers in the policy transfer process. It refers to separate governments exercising their authority with relative autonomy [ 45 ]. Policies generated outside of a socio-political level may be more or less acceptable to that level. Within a given layer, a particular policy area may require the mobilization of any number of institutions, departments, or agencies, and these agencies must act in a coordinated, interdependent fashion, termed “thickness of the hierarchy” [ 55 ].

IV—Networks/inter-organizational relationships. This refers to the existence and nature of the relationships between parallel organizations that must collaborate in order to achieve effective implementation and that do not have a hierarchical relationship [ 45 ].

V—Implementing agency responses. The factors affecting the responses of implementing agencies can be divided into issues related to the overall characteristics of the agencies and the behavior of front-line or street-level staff [ 13 , 56 ].

VI—Attributes and responses of those affected by the EIPP. Attributes include the diversity of target group behavior and the target group as a percentage of the population [ 14 ]. Responses include things like impacts on workforce stability [ 12 ].

VII—Timing/sequencing. As implementation is a process that unfolds over time, it does not always align with the cycles to which it is subject and the time constraints inherent therein [ 86 , 87 ]. Additionally, the external context in which implementation occurs is ever changing and “quintessentially unstable”, and success hinges on the ability to perceive those changes and take the necessary actions to adjust along the way [ 68 ]. In Fig. 3, timing/sequencing is placed outside of the determinants box to reflect its importance across all of the other elements.

VIII—External environment or policy context. Much of the literature identified factors outside the policy area of focus that may influence implementation (Fig. 3, outside the hatched line). Many authors referred to this generally as the “political and social climate”, as unmodifiable or macro “context”, or as “socio-economic conditions” [ 9 , 14 , 38 , 40 , 52 , 70 , 75 , 80 ]. We organized this determinant using (1) the 3I+E framework [ 113 ] and (2) a taxonomy of health and social system arrangements [ 114 ].

In general, these categories of determinants should be viewed as interactive and not completely discrete [ 56 ] and the inter-relationship among the determinants is key [ 45 ].
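The eight determinant categories above can be restated, for illustration, as a coding scheme that a researcher might use to tag observations in a case study. The category labels follow the text; the enumeration itself is our own sketch, not an instrument published with the study:

```python
from enum import Enum

# The eight determinant categories from the framework (labels from the text).
# This enumeration is an illustrative coding scheme only.
class Determinant(Enum):
    EIPP_CHARACTERISTICS = 1      # fit with the problem, justification, alignment
    POLICY_FORMULATION = 2        # who formulates, legitimacy, feedback loops
    VERTICAL_ADMINISTRATION = 3   # layers of government, thickness of hierarchy
    NETWORKS = 4                  # non-hierarchical inter-organizational relations
    IMPLEMENTING_AGENCY = 5       # agency characteristics, street-level staff
    TARGET_RESPONSES = 6          # attributes/responses of those affected by EIPP
    TIMING_SEQUENCING = 7         # cross-cutting: applies across all determinants
    EXTERNAL_ENVIRONMENT = 8      # political/social climate, macro context

# Example: tag a field note with the determinants it touches.
note_tags = {Determinant.NETWORKS, Determinant.TIMING_SEQUENCING}
print(sorted(d.name for d in note_tags))
```

Because the categories are interactive rather than fully discrete [ 56 ], a single observation would often carry several tags, as in the example.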

Policy actors

Our analysis revealed a wide range of policy actors who are important for implementation. To create a category of variables that is analytically useful across contexts, we first divided the types of policy actors into the broad categories of political actors, bureaucratic actors, special interests, and experts [ 115 ]. To provide more specificity, we further divided these into a non-exhaustive list of actor sub-types that were frequently mentioned in the literature and included examples of the types of roles they tend to assume in implementation (Table 6). While many of the sub-types are commonly identified in other phases of the policy cycle, some receive particular attention in the implementation literature. These include two types of special interests: (1) implementing agencies, organizations, or programs that are responsible for implementing the EIPP (e.g., hospitals, schools), and (2) street-level bureaucrats who, because of the relatively high degree of discretion in their jobs, including discretion over the dispensation of public benefits or sanctions to citizens, can be critical to realizing any large-scale implementation effort. There are also three expert sub-types that are particularly visible during implementation: (1) field or practice leaders, who can be influential in supporting practice change among professionals; (2) innovation developers/disseminators, who have developed the EIPP to be implemented and who may contribute or adapt tools and other types of support to encourage successful implementation; and (3) intermediaries/technical assistance providers, which are organizations, programs, or individuals that work “in between” policymakers, funders, and front-line implementers to facilitate effective implementation, drawing on expertise in implementation.

Table 6 Types of policy actors identified in implementation

There are also three categories of actor-related variables that are important: (1) actor characteristics, (2) actor relationships, and (3) the context in which the actors are embedded (Fig. 4). First, the characteristics of the policy actors (at either the individual or organizational level), such as their knowledge of the implementation context, their legitimacy, power, and control, and their leadership in the context of the implementation effort, are often cited as being critical to success in large-scale implementation initiatives. Second, the relationships policy actors have with other actors, such as the level of shared values and beliefs or the coordination and alignment of actors and their activities, can be predictive of successful implementation. Finally, the context of the actors, such as the sustainment of political will and commitment and the stability of the actors themselves, can predict the long-term success of implementation.


Fig. 4 Characteristics, relationships, and the context of policy actors important for implementation

Our study represents one of the first comprehensive attempts to answer the call of scholars to integrate the fields of implementation science, knowledge translation, and policy implementation in an effort to build a more comprehensive and accurate understanding of implementation. By integrating conceptual and empirical works from all three fields, the resultant two-part theoretical framework provides additional clarity regarding the process of implementation viewed from a policy perspective and identifies a number of policy-related determinants that can be tested empirically in the future.

A key strength of our study was the methodological approach we took to theory building.

First was the comprehensiveness of the search strategy, which aimed to identify scholarship from more than one academic discipline and across a wide range of topics beyond health. The literature identified through the search process revealed some interesting parallels and unique differences between the fields, and made clear the extent to which they have lacked integration up to this point. Perhaps not surprisingly, public health appeared to be the most fertile ground for integration, likely because of its focus on population-level concerns that require system-wide implementation of EIPPs and a diverse implementation ecosystem. The search strategy was part of the mixed methods approach of the CIS, which blended the rigor of an explicit and replicable systematic search methodology with the inductive, iterative, and purposive sampling techniques of qualitative review methods to build mid-range theory. The result is a theoretical framework that is clearly linked to the literature, which should instill confidence in the academic community regarding its grounding. Critical interpretive synthesis is a relatively new approach but is growing in popularity for these reasons.

Despite the merits of our approach, we did identify some challenges. First, we believe the literature from public policy may be underrepresented for several reasons: (1) the search terms did not retrieve as much from that field (there may be terms used more commonly in the field that would have increased the yield), (2) the disciplinary approach to scholarship in public policy often means that articles were less explicit about methods, so more were excluded as not being “high yield”, and (3) more of that scholarship is captured through other media (e.g., books), and while some of these were included, our approach was not as sensitive to retrieving these types of documents. We also did not include all of the empirical articles for data extraction, so we may have missed a key theme or framework component. While we believe this is unlikely because we continued to sample until saturation was reached, it is still possible that something was missed. Finally, there were few documents from low- and middle-income countries included in the final sample. Specific efforts to include relevant documents from LMICs in the future may enrich and refine the model.

As a result of this research, policymakers and practitioners looking to use a conceptual model to guide their implementation activities have two additional options that they can be confident are drawn from existing theory and empirical works. Large-scale implementation endeavors, or those that have started small and are looking to scale up, should at least be mindful of the critical roles of policy during the process and of the policy-related factors that may be important for success. Those planning implementation activities can consider the elements presented in the framework as factors that may require consideration and adjustment prior to implementing something new. Our work supports thinking beyond the program or practice levels and unpacks policy considerations that may influence, or affect the effectiveness of, a program or practice. Furthermore, the inclusion of policy-related outputs and outcomes in our framework offers policymakers and practitioners additional indicators of success that they can measure and report.

Like any new theoretical contribution, our framework would benefit from further refinement and testing by the research community. Future research could adopt the process model to guide a policy-intensive implementation effort and test it to determine its usefulness in such efforts. Researchers could also select particular framework elements and unpack them further for additional precision and clarity, drawing from multiple fields of scholarship. Our framework also offers some much-needed policy variables that have been lacking in the implementation science and knowledge translation fields, which could be incorporated as part of a suite of variables in implementation research.

Our study represents an early effort at integrating the fields of public policy, implementation science, and knowledge translation. We have learned that there is indeed a great deal that each of these fields can learn from the others to advance our understanding of policy- and systems-level implementation efforts, and we hope that these efforts are followed by more interdisciplinary research in order to truly bridge this divide.

Acknowledgements

Authors’ contributions.

HLB was responsible for conceiving of the focus and design of the study (with support from JNL) and for completing all data collection, analysis, and interpretation. JNL also contributed to the analysis during ongoing iterative cycles of interpretation and synthesis that led to the development of the final theoretical model. AM independently assessed a sub-sample of the documents for eligibility and worked with HLB to refine the inclusion criteria. HLB drafted the manuscript, and JNL, MGW, and GM provided comments and suggestions that were incorporated into revisions. All authors approved the final version of the manuscript.

This study was partially supported through a doctoral scholarship from the P.E. Trudeau Foundation.

Availability of data and materials

Ethics approval and consent to participate, consent for publication, competing interests.

The authors declare they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

  • Search Menu
  • Sign in through your institution
  • Advance Articles
  • Editor's Choice
  • Supplements
  • Spotlight Issues
  • Image Gallery
  • ESC Journals App
  • ESC Content Collections
  • Author Guidelines
  • Submission Site
  • Why publish with CVR?
  • Open Access Options
  • Read & Publish
  • Author Resources
  • Self-Archiving Policy
  • About Cardiovascular Research
  • About the European Society of Cardiology
  • ESC Publications
  • Journal Career Network
  • Editorial Board
  • ESC membership
  • Advertising and Corporate Services
  • Developing Countries Initiative
  • Dispatch Dates
  • Terms and Conditions
  • Journals on Oxford Academic
  • Books on Oxford Academic

Article Contents

Inclisiran administration potently and durably lowers ldl-c over an extended-term follow-up: the orion-8 trial.

ORCID logo

  • Article contents
  • Figures & tables
  • Supplementary Data

R Scott Wright, Frederick J Raal, Wolfgang Koenig, Ulf Landmesser, Lawrence A Leiter, Sheikh Vikarunnessa, Anastasia Lesogor, Pierre Maheux, Zsolt Talloczy, Xiao Zang, Gregory G Schwartz, Kausik K Ray, Inclisiran administration potently and durably lowers LDL-C over an extended-term follow-up: the ORION-8 trial, Cardiovascular Research , 2024;, cvae109, https://doi.org/10.1093/cvr/cvae109

  • Permissions Icon Permissions

Data describing the long-term efficacy, safety, and tolerability of inclisiran are limited. This was explored in ORION-8, an open-label extension study of preceding Phase 2 and Phase 3 placebo-controlled and open-label extension trials.

Adults with ASCVD, ASCVD risk equivalent, or HeFH received open-label inclisiran every 180 days (after completion of the parent trial) until Day 990, followed by an end-of-study (EOS) visit at Day 1080 or ≥90 days after last dose. Study endpoints included proportion of patients achieving pre-specified LDL-C goals (ASCVD: <1.8 mmol/L [<70 mg/dL]; ASCVD risk equivalent: <2.6 mmol/L [<100 mg/dL]), percentage and absolute changes in LDL-C at EOS, and safety of inclisiran.

Of 3274 patients included in the analysis, 2446 (74.7%) were followed until EOS. Mean age was 64.9±9.9 years, 82.7% (n=2709) had ASCVD, and mean baseline LDL-C was 2.9±1.2 mmol/L. Mean cumulative exposure to inclisiran (including parent trials) was 3.7 years; maximum exposure was 6.8 years. With inclisiran, 78.4% (95% CI: 76.8, 80.0) of patients achieved pre-specified LDL-C goals and mean percentage LDL-C reduction was −49.4% (95% CI: −50.4, −48.3). No attenuation of LDL-C lowering over time was observed. Treatment-emergent adverse events at the injection site (all mild or moderate) occurred in 5.9% of inclisiran-treated patients. Inclisiran-associated anti-drug antibodies were infrequent (5.5%) and had no impact on the efficacy or safety of inclisiran. No new safety signals were identified.

In the largest and longest follow-up to date, inclisiran demonstrated sustained and substantial LDL-C lowering with a favourable long-term safety and tolerability profile.

ClinicalTrials.gov identifier: NCT03814187

Graphical Abstract

  • atherosclerosis
  • ldl cholesterol lipoproteins

Supplementary data

Email alerts, more on this topic, related articles in pubmed, citing articles via.

  • Recommend to Your Librarian
  • Journals Career Network

Affiliations

  • Online ISSN 1755-3245
  • Print ISSN 0008-6363
  • Copyright © 2024 European Society of Cardiology
  • About Oxford Academic
  • Publish journals with us
  • University press partners
  • What we publish
  • New features  
  • Open access
  • Institutional account management
  • Rights and permissions
  • Get help with access
  • Accessibility
  • Advertising
  • Media enquiries
  • Oxford University Press
  • Oxford Languages
  • University of Oxford

Oxford University Press is a department of the University of Oxford. It furthers the University's objective of excellence in research, scholarship, and education by publishing worldwide

  • Copyright © 2024 Oxford University Press
  • Cookie settings
  • Cookie policy
  • Privacy policy
  • Legal notice

This Feature Is Available To Subscribers Only

Sign In or Create an Account

This PDF is available to Subscribers Only

For full access to this pdf, sign in to an existing account, or purchase an annual subscription.

IMAGES

  1. PPT

    scholarly articles on implementation research

  2. (PDF) Effective implementation of research into practice: An overview

    scholarly articles on implementation research

  3. ⚡ Example of academic article. Reference examples: journal articles

    scholarly articles on implementation research

  4. Anatomy of a Scholarly Article

    scholarly articles on implementation research

  5. Reading a Scholarly Article

    scholarly articles on implementation research

  6. (PDF) Research Articles in Simplified HTML: A Web-first format for HTML

    scholarly articles on implementation research

VIDEO

  1. Implementation

  2. Scholarly Vs. Popular Sources

  3. How to use implementation hybrid designs

  4. Implementation Science for Innovation and Discovery Book Club

  5. Implementation Science for Innovation and Discovery Book Club

  6. Revised Research

COMMENTS

  1. Enhancing the Impact of Implementation Strategies in Healthcare: A Research Agenda

    In this article, we briefly review progress in implementation science, and suggest five priorities for enhancing the impact of implementation strategies. Specifically, we suggest the need to: (1) enhance methods for designing and tailoring implementation strategies; (2) specify and test mechanisms of change; (3) conduct more effectiveness ...

  2. The Implementation Research Logic Model: a method for planning

    Background Numerous models, frameworks, and theories exist for specific aspects of implementation research, including for determinants, strategies, and outcomes. However, implementation research projects often fail to provide a coherent rationale or justification for how these aspects are selected and tested in relation to one another. Despite this need to better specify the conceptual ...

  3. Designs and methods for implementation research: Advancing the mission

    Results: Examples of specific research designs and methods illustrate their use in implementation science. We propose that the CTSA program takes advantage of the momentum of the field's capacity building in three ways: 1) integrate state-of-the-science implementation methods and designs into its existing body of research; 2) position itself at the forefront of advancing the science of ...

  4. Implementation research: what it is and how to do it

    Implementation research is a growing but not well understood field of health research that can contribute to more effective public health and clinical policies and programmes. This article provides a broad definition of implementation research and outlines key principles for how to do it. The field of implementation research is growing, but it ...

  5. Implementation Research: An Efficient and Effective Tool to Accelerate

    Success in the implementation of evidence-based interventions (EBIs) in different settings has had variable success. Implementation research offers the approach needed to understand the variability of health outcomes from implementation strategies in different settings and why interventions were successful in some countries and failed in others.

  6. The updated Consolidated Framework for Implementation Research based on

    The Consolidated Framework for Implementation Research (CFIR) is one of the most commonly used determinant frameworks to assess these contextual factors; however, it has been over 10 years since publication and there is a need for updates. ... Article Google Scholar Waltz TJ, Powell BJ, Fernández ME, Abadie B, Damschroder LJ. Choosing ...

  7. Measuring implementation outcomes: An updated systematic review of

    The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This work was supported by the National Institute of Mental Health (NIMH) "Advancing implementation science through measure development and evaluation" (1R01MH106510), awarded to Dr. Cara Lewis as principal ...

  8. Implementation Research and Practice: Sage Journals

    Implementation Research and Practice is an international, peer-reviewed, open access journal publishing interdisciplinary research that advances the implementation of effective approaches to assess, prevent, and treat mental health, substance use, … | View full journal description. This journal is a member of the Committee on Publication ...

  9. Making sense of implementation theories, models and frameworks

    Implementation science has progressed towards increased use of theoretical approaches to provide better understanding and explanation of how and why implementation succeeds or fails. The aim of this article is to propose a taxonomy that distinguishes between different categories of theories, models and frameworks in implementation science, to facilitate appropriate selection and application of ...

  10. Ten recommendations for using implementation frameworks in research and

    Background Recent reviews of the use and application of implementation frameworks in implementation efforts highlight the limited use of frameworks, despite the value in doing so. As such, this article aims to provide recommendations to enhance the application of implementation frameworks, for implementation researchers, intermediaries, and practitioners. Discussion Ideally, an implementation ...

  11. Full article: Continuous improvement implementation models: a

    A sample of ultimately 27 implementation models is collected from the practitioner and academic literature. The models are assessed on quality and completeness using a research framework comprising organizational dimensions, phases in time, readiness factors, activities, and sustainability factors, leading to 415 coded observations.

  12. Pragmatic approaches to analyzing qualitative data for implementation

    Qualitative methods are critical for implementation science as they generate opportunities to examine complexity and include a diversity of perspectives. However, it can be a challenge to identify the approach that will provide the best fit for achieving a given set of practice-driven research needs. After all, implementation scientists must find a balance between speed and rigor, reliance on ...

  13. Experimental and quasi-experimental designs in implementation research

    Quasi-experimental designs can be used to answer implementation science questions in the absence of randomization. •. The choice of study designs in implementation science requires balancing scientific, pragmatic, and ethical issues. Implementation science is focused on maximizing the adoption, appropriate use, and sustainability of effective ...

  14. Strategy implementation: A review and an introductory framework

    Abstract. Effective strategy implementation is a critical component of organizational success and a potential source of competitive advantage. However, despite many calls for increased attention, research on the subject remains a disparate constellation of recommendations, case studies, and empirical work that provides insight but lacks a ...

  15. Research and Scholarly Methods: Implementation Science Studies

    This section explores each of the key ingredients and provides guidance on including these across the three stages of IS research. Further, to illustrate how these ingredients are applied at each stage of research (Figures 2 and and3), 3), we will refer to a case example describing the implementation of a clinical decision support (CDS) tool to improve guideline-concordant prescribing for ...

  16. Implementation science: What is it and why should I care?

    The relatively new field of implementation science has developed to enhance the uptake of evidence-based practices and thereby increase their public health impact. Implementation science shares many characteristics, and the rigorous approach, of clinical research. However, it is distinct in that it attends to factors in addition to the ...

  17. Stakeholder Engagement in Adoption, Implementation, and ...

    Multi-level organizational stakeholder engagement plays an important role across the research process in a clinical setting. Stakeholders provide organizational specific adaptions in evidence-based interventions to ensure effective adoption, implementation, and sustainability. Stakeholder engagement strategies involve building mutual trust, providing clear communication, and seeking feedback ...

  20. Promises and pitfalls in implementation science from the perspective of

    Research on dissemination and implementation has a long, rich history. We are grateful to be a part of that history as some of the first US researchers to build implementation science careers as the field was formalizing. Our backgrounds are in psychology, public health, social work, education, and medicine with foundations in intervention science, clinical science, community psychology ...

  21. The effectiveness of research implementation strategies for promoting

    Background It is widely acknowledged that health policy and management decisions rarely reflect research evidence. Therefore, it is important to determine how to improve evidence-informed decision-making. The primary aim of this systematic review was to evaluate the effectiveness of research implementation strategies for promoting evidence-informed policy and management decisions in healthcare ...

  24. Quantitative Approaches for the Evaluation of Implementation Research

    Evaluation of the factor structure of implementation research measures adapted for a novel context and multiple professional roles. Stetler CB, Legro MW, Wallace CM, Bowman C, Guihan M, Hagedorn H,… Smith JL (2006). The role of formative evaluation in implementation research and the QUERI experience.

  28. Alignment in implementation of evidence-based interventions: a scoping

    Background Alignment (i.e., the process of creating fit between elements of the inner and outer context of an organization or system) in conjunction with implementation of an evidence-based intervention (EBI) has been identified as important for implementation outcomes. However, research evidence has so far not been systematically summarized. The aim of this scoping review is therefore to ...

  29. Understanding the implementation of evidence-informed policies and

    Integrated Promoting Action on Research Implementation in Health Services (I-PARIHS) Determinants framework: Hendriks et al. 2013: Netherlands: Public health (childhood obesity) ... A qualitative study of academic leaders in implementation science. Global Health. 2012;8:11. doi: 10.1186/1744-8603-8-11.
