• Open access
  • Published: 22 March 2018

How do NHS organisations plan research capacity development? Strategies, strengths, and opportunities for improvement

  • Melanie Gee   ORCID: orcid.org/0000-0001-9149-4314 1 &
  • Jo Cooke 2  

BMC Health Services Research, volume 18, Article number: 198 (2018)


Research that is integral into a ‘learning healthcare system’ can promote cost effective services and knowledge creation. As such, research is defined as a ‘core function’ in UK health service organisations, and is often planned through research and development (R&D) strategies that aim to promote research activity and research capacity development (RCD).

The discussion focuses on the content of ten R&D strategies for healthcare organisations in England and Scotland, with respect to RCD. These organisations were engaged with a research interest network called ACORN (Addressing Organisational Capacity to do Research Network), which included two Scottish Health Boards, four community and mental health trusts, two provincial district hospitals, and two teaching hospitals.

We undertook a thematic documentary analysis of the R&D strategies which identified 11 ‘core activities’ of RCD. The potential for building research capacity in these ‘core activities’ was established by reviewing them through the lens of a RCD framework.

Core activities aimed to ‘hard wire’ RCD into health organisations. They demonstrated a complex interplay between developing a strong internal organisational infrastructure, and supporting individual career planning and skills development, in turn enabled by organisational processes. They also included activities to build stronger inter-organisational relationships and networks. Practitioner, manager and patient involvement was a cross-cutting theme. Plans included the potential to demonstrate progress through monitoring activity across all RCD principles. Strategies were primarily aimed at research production rather than research use. Developing ‘actionable dissemination’ was poorly addressed in the strategies, and represents an area for improvement.

We describe strengths of RCD planning activities, and opportunities for improvement. We explore how national policy and research funders can influence health systems’ engagement in research.


Research Capacity Development in Healthcare Systems

There is broad consensus that healthcare systems should integrate research in order to promote health, wealth and knowledge creation [ 1 ]. Faden et al. [ 2 ] suggest that it is both ethical and moral to support ‘learning healthcare systems’ that integrate research and healthcare practice through continuously studied, tested and improved services. Many authors support a ‘whole systems’ approach in order to strengthen a research culture, and increase research capacity within organisations and the workforce [ 3 , 4 , 5 ].

Current national policy in the UK calls for ‘NHS [National Health Service] and patient participation in research to improve outcomes and promote economic growth’ [ 6 ], recognising the need to develop cost-effective services and to build links with industry to benefit the economy. The NHS Constitution [ 7 ] refers to ‘the promotion, conduct and use of research to improve the current and future health and care of the population’. There is therefore recognition across NHS trusts that research should become embedded into the organisation: it should be part of ‘core business’.

Healthcare systems need to invest in, and plan, strategies and activities to develop research capacity. Research Capacity Development (RCD) has been defined as ‘a funded, dynamic intervention operationalized through a range of foci and levels to augment ability to carry out research or achieve objectives in the field of research over the long-term, with aspects of social change as an ultimate outcome’ [ 8 ]. The current NHS R&D policy ‘Best Research for Best Health’ [ 9 ] offers trusts some opportunities for shaping and resourcing these strategies. This policy established the National Institute for Health Research (NIHR), with an ambition to extend research ‘reach’ into each NHS organisation. It aims to do this, firstly, by funding a Clinical Research Network (CRN) whose function is to support research delivery in the NHS: this resource supports clinician time and patient recruitment in each organisation, with additional funds linked to activity and efficiency of recruitment. Secondly, by developing an NIHR Faculty, which aims to identify, recognise and resource research-active staff, and which offers funding opportunities for clinical academic careers through a series of ‘stepped’ fellowships. Thirdly, by commissioning research programmes that clinical academics can compete for, with the aim of providing benefit to patients. And finally, by creating systems embedded in the NHS to manage and support research and its outputs, including research ethics and governance systems.

As Condell and Begley [ 8 ] observe, some authors use RCD interchangeably with RCB (Research Capacity Building). In this paper we use the term RCD, as the term ‘development’ conveys activities around expanding and upgrading pre-existing capabilities in the organisation, rather than starting from scratch [ 10 ].

Incorporating RCD into organisational planning: ACORN

A research interest network called ACORN (Addressing Capacity in Organisations to do Research Network) was developed as part of the capacity programme within the Collaboration for Leadership in Applied Health Research and Care Yorkshire and Humber (CLAHRC YH), in the North of England. CLAHRCs are National Institute for Health Research (NIHR)-funded collaborations with a key objective to build health services research capacity. ACORN was developed to build a community of practice (CoP), an approach that has become increasingly influential within management research and practice for driving capacity development in organisations [ 11 ]. A CoP has been described as ‘a vehicle for collective learning in a field of shared human endeavour, enhanced by mutual concerns, passions and regular group interactions’ [ 12 ]. An aim agreed at the outset of the ACORN community was to share the research and development (R&D) strategies within the group, which enabled a cross-documentary review of those strategies. This paper is an outcome of that review, which aimed to identify common themes and explore joint learning for the ACORN group and others interested in building RCD at an organisational level.

At the time of this review of R&D strategies, ACORN comprised ten NHS organisations: two Scottish Health Boards and eight trusts from the north of England, the latter including four community and mental health trusts, two provincial district hospitals, and two teaching hospitals. All ACORN members had R&D strategies published before the group was developed, some covering the whole organisation, and others focussed on nursing and Allied Health Professionals (AHPs) (see Table  1 ). All the strategy documents articulated overarching strategy aims (e.g. “to increase the volume and quality of applied research that leads to improvements in patient/client health and well-being and service delivery”; “to offer training opportunities for the public, service users and carers who have expressed an interest in active involvement in research”; “to maximise the use of research to support cost efficiencies”). They all incorporated implementation planning elements, to differing levels of detail: all described RCD activities being planned or undertaken in each organisation, and some additionally described milestones, measurements and key performance indicators associated with those activities. Our aim was to categorise the RCD activities, and to identify ‘core activities’ by determining which were the most frequently described across this diverse range of organisations.

We also wanted to see how these activities compared with the evidence of what works in RCD. We did this by mapping them against the six principles of an adapted version of Cooke’s evidence-based framework [ 13 ]. The Cooke framework was developed by blending knowledge from analysis of the literature, R&D policy documents, and the experience of one Research and Development Support Unit in the UK. It has since been developed further with a particular focus on healthcare organisations [ 14 , 15 ]. The ACORN CoP has agreed to use it to review current activity and plan further work; this documentary analysis is the first step in that process.

The adapted version of the Cooke framework that we used is shown in Fig.  1 . It contains the following principles, or ways of doing RCD: promoting actionable dissemination (DISS); developing research ‘close to practice’ (CTP); developing a support infrastructure (INF); supporting linkages and collaborations (LINKS); developing research skills and confidence in the health services workforce (SKILLS); and planning sustainability (SUS). Our definitions for each principle are presented in Table  2 .

Adapted Cooke framework for RCD. (Adapted from Cooke’s evidence-based framework [ 13 ])

The utility of the Cooke framework for RCD analysis and evaluation has been demonstrated [ 16 , 17 , 18 ], and the framework has been widely used to support RCD in a range of contexts [ 19 , 20 ]. Using this methodology, we were able to capture core activities that may be transferable to other healthcare contexts, and by looking through the lens of the adapted framework we were able to determine the theoretical underpinning of how such activities can contribute to the capacity-building endeavour. We therefore suggest that our findings may be a useful starting point for discussion for others planning RCD in health services.

Documentary analysis of the organisation R&D strategies

We performed a documentary analysis of the ten ACORN members’ R&D strategies, using NVivo 10 Software [ 21 ]. We only coded parts of the strategies describing ongoing or planned activities associated with RCD; some parts of the strategies, such as context-setting and information about wider organisational developments, were not coded. We used open coding to label the RCD activities described in the documents. Coding was carried out by one reviewer (MG) but areas of doubt or ambiguity were discussed with the second reviewer (JC) who also double-coded a sample from one strategy as part of the code-checking process. Where it was not immediately obvious how to code any section of text, node-linked memos were created to capture the decision-making, and node definitions were developed and refined as appropriate. Having coded all the strategies, we interrogated NVivo to check for coding consistency between the strategies, and refined and consolidated the activity nodes (e.g. where different labels were applied to essentially the same sort of activity). The final coding tree for the activity codes is provided in Additional File  1 .

From our coding in NVivo, we identified those RCD activities that were evident in at least seven of the ten strategy documents. These ‘core activities’ are listed on the left-hand side of Table  3 . We then viewed the data pertaining to each core activity through the lens of the adapted Cooke framework to ascertain which RCD principles were being used to address research capacity. This was enabled by our shared understanding of the scope of each principle, and by the collaborative production of the principle definitions presented in Table  2 . The mapping between core activities and RCD principles is shown in Table 3 .
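The selection rule for ‘core activities’ (those coded in at least seven of the ten strategies) can be sketched in a few lines of code. This is a hypothetical illustration with invented activity labels, not the NVivo workflow we actually used:

```python
# Illustrative sketch of the core-activity selection rule described above:
# an RCD activity counts as 'core' if it was coded in at least 7 of the
# 10 strategy documents. The activity labels below are hypothetical;
# the actual coding was done in NVivo with the study's own code tree.
from collections import Counter

def core_activities(coded_docs, threshold=7):
    """Return the set of activity codes appearing in >= threshold documents."""
    # Count each code at most once per document, then keep frequent codes.
    counts = Counter(code for doc in coded_docs for code in set(doc))
    return {code for code, n in counts.items() if n >= threshold}

# Ten documents, each represented by the set of activity codes applied to it.
coded_docs = (
    [{"research collaborations", "PPI", "governance support"}] * 7
    + [{"PPI"}] * 3
)

print(sorted(core_activities(coded_docs)))
```

The threshold is a parameter, so the same sketch can be used to explore how sensitive the set of core activities is to the cut-off chosen.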

Evidence for core RCD activities in the R&D strategies

We now provide a description of the core RCD activities we identified, indicating in parentheses which of the RCD principles apply.

Developing and sustaining research collaborations

Most strategies described activities relating to external research collaborations and developing research proposals and competitive bids, both of which relate to sustainability planning through increased externally-funded research activity (SUS).

Investment in nurturing external collaborations and networks (LINKS) was seen to increase the likelihood of gaining external funding. Links with academics working within universities (including the provision of joint academic posts), with the NIHR CRN, and with funded collaborations such as the CLAHRC, were strongly supported. These links also provided opportunities for research training and development (see under ‘Research education and learning’ below) (SKILLS). One strategy sought evidence of sustained clinical-academic dialogue through joint working groups and collaborative agreements. The NHS organisations recognised their unique position in increasing access to patient groups within research partnerships (see ‘Patient and public involvement and engagement in research’ below) (CTP).

Some strategies also described links with commercial partners, in particular with regard to priority setting and targets (see under ‘Developing research priorities’ and ‘Setting targets and monitoring performance’ below).

Research ‘business’ plans were often described, aimed at capturing funds by becoming a recruitment site for high-quality nationally and commercially funded projects (called portfolio projects by the NIHR). Partnership with the NIHR CRN was seen as important in this regard, as this network funds time for clinical researchers and research nurses to recruit patients to portfolio studies, so as not to impinge on clinical budgets. Additionally, the CRN can contribute to research infrastructure by funding support for governance and ethics applications (LINKS and INF). This highlights how national policy around research delivery has influenced local strategy and reach into NHS organisations.

Research funded through competitive tendering was seen as a way of engaging with high-quality research and increasing the reputation of the trust. The influence and impact of conducting this type of research on clinical care was also seen as important (CTP). The NIHR as a funding body was prioritised by trusts, as such grants attract additional ‘between grant’ funds to NHS organisations (SUS) for practitioners with the ambition and ability to do research. This funding, called Research Capability Funding (RCF), is aimed at increasing research capacity in the NHS. Such grant income was therefore seen as doubly effective, funding activity and strengthening capacity, and was thus included as an element of the business case for R&D.

Developing research priorities

Most strategies highlighted the need to align research activity with wider organisational strategic objectives, business planning, quality strategies or audit activities (INF). Many emphasised the role of service users, carers and the public in priority setting, to ensure that research was relevant and of benefit to patients (CTP). Mechanisms for Patient and Public Involvement (PPI) included links with patient and carer research groups. One trust aimed to develop a more systematic approach to eliciting patient views by asking about research priorities in the patient experience surveys routinely undertaken for quality assurance in care provision. Another trust planned to run a development day/workshop to identify and prioritise research questions from service user, carer, and professional perspectives.

Research coproduction encompassing priority setting with wider partners was planned. This included coproduction with academic partners, private and voluntary sector organisations, and research networks (LINKS), and was seen as a way of attracting further income (SUS). One strategy stated that ‘a focused approach, building research expertise and experience in a few key areas, is more likely to be successful in establishing local programmes of research that are nationally competitive than a “scattergun approach”’.

Academic dissemination

Research dissemination featured in most strategies. Some explicitly referred to creating and implementing dissemination strategies (DISS). This predominantly focussed on academic dissemination in peer-reviewed publications in order to enhance individual and organisational research profiles. Many strategies planned activities to develop staff members’ academic writing skills (SKILLS) and to engender a culture of support, with an expectation to publish as an outcome. Publications were often cited as a key performance indicator for the impact of R&D strategies, along with the number of oral and poster conference presentations.

Fewer strategies prioritised local research dissemination (i.e. within the organisation and to affected patient groups). However, examples of internal publications were an e-journal/newsletter, and an R&D magazine. One trust planned to maintain an internal Publication Register to support appropriate dissemination, and another planned to ask researchers to write research summaries for internal dissemination. Internal research conferences and events to ‘showcase’ and celebrate activity were planned by several trusts. These, again, may be a mechanism for raising the organisation’s research profile within and outside the organisation.

Evidence-based practice and knowledge transfer

All the strategies referred to a desire to translate the trust’s research into local practice, with a resultant impact on patients and service transformation (CTP). There was a recognition that this was more likely to happen if the research had maximum local relevance: as described under ‘Developing research priorities’ above, several strategies described ongoing and planned work with key partners to ensure this.

Several strategies identified the need to build infrastructure to support evidence-based practice and knowledge transfer (INF). Mechanisms included internal dissemination of research findings, with key staff identified to follow them up for service improvements (local research dissemination is discussed above under ‘Academic dissemination’); involving general managers in knowledge mobilisation programmes; and using community of practice groups.

‘Hard-wiring’ research into the organisation

This activity was strongly associated with making research ‘core business’. All strategies included mechanisms to ensure research would become ‘hard-wired’ into the organisation, i.e. embedded in its day-to-day activity. These mechanisms fell into two broad categories: firstly, linking research planning to wider organisational and business planning processes; and secondly, providing an explicit expectation that individual members of staff would engage in research activities.

All strategies described links between RCD and organisational business planning (INF). Some described alignment with the overall strategic direction of the organisation, and in one, the research strategy objectives were explicitly mapped to corporate priority strategic objectives. For example, the research objective ‘To develop collaborative working with patients/public in research’ was mapped to the corporate objective ‘To work in partnership with service users, communities and stakeholders to deliver service solutions, particularly around integrated care and care closer to home’. At a local level, some strategies referred to linkages between the organisational research aims and smaller clinical units within them (CTP).

R&D strategies often highlighted internal policy links that offered opportunities for alignment and synergy, including intellectual property (IP) policy, staff training, and models for PPI (INF). Embedding patients’ views in order to identify priorities was built into the systems of several trusts (see ‘Patient and public involvement and engagement in research’ below).

A commonly cited philosophy was that research should be everyone’s business, from the top to the bottom of the organisation. For example some strategies planned to report research performance as a regular item on the trust Board agenda (INF). Many referred to plans for integrating an expectation of research activity into working practice through job descriptions for existing staff and in advertised posts, in job plans, and professional development pathways (INF, CTP). Providing protected time for research activities, either through job planning or ‘release’ of staff, was also recognised as important for developing research skills (SKILLS), in order to enable clinical staff members to ‘learn by doing’. Engaging clinical managers in the research agenda was also thought to be beneficial in order to support the workforce in performing research activities, and to build this into appraisal and performance management processes (SKILLS and INF).

Proactive and timely communication of research opportunities

Rapidly identifying, assessing, and communicating relevant research opportunities to the right people was seen as important, requiring organisational infrastructure to support it (INF). Some trusts already used, or planned to use, dedicated research support staff to fulfil this role; others worked in clinical-academic partnerships to do so. Trust websites, intranet sites and research share points were referred to as mechanisms for disseminating opportunities in the future. One trust planned to send monthly reports directly to clinical staff at clinical speciality level.

Patient and public involvement and engagement in research

Most strategies supported PPI in research, motivated by improving the quality of care provided by the NHS through research (CTP). One strategy stated that ‘Involving patients and public in research can lead to more appropriate people centred care, improved health outcomes and sustainable solutions.’ Ongoing dialogue with patients was seen as an important function for NHS partners in academic partnerships (LINKS). Many trusts had identified funding through NIHR bodies to support PPI (LINKS).

Some trusts had developed a directory of patient groups with specific conditions willing to support R&D functions, whilst others developed groups specifically to undertake research governance functions (INF), sometimes within wider research collaborations (LINKS). These groups received training and support (SKILLS) from NHS R&D departments (INF) to enable engagement in research development. PPI is a requirement for many funding bodies, so many strategies aimed to budget for PPI in projects; one trust had a target that PPI should constitute between 10 and 20% of the total costs of funding applications. Many trusts also aimed to involve patients in the dissemination of research findings through conferences, newsletters and films/videos.

Some planned PPI activity included increasing the general public’s awareness that research is part of core NHS business, for instance by including statements to that effect in clinic appointment letters, or by creating patient ambassador roles (INF). Increased public awareness of projects within trusts would, it was hoped, support patient recruitment, increase the quality of care, and strengthen the R&D business case (funds follow recruitment numbers). One trust planned to survey patients and members of the public involved in research about their experiences and how they might be involved in the future, in order to increase recruitment and participation.

Many trusts aimed to evaluate the impact of PPI. For example one trust intended to survey people involved in PPI activity in order to ‘ensure appropriate levels of recognition and reward for involvement are maintained’. Another intended to identify the number of projects that included PPI within them, and to describe this activity in detail.

Research governance support

The Department of Health has clear regulatory and legal requirements for the conduct of research in the NHS [ 22 ]. Existing or planned research governance and support offices were an important constituent of the strategies (INF), whether provided through trust-based research support offices or by partner organisations (LINKS). Such offices are able to navigate the regulatory system, reducing potential barriers to research participation and enabling research to progress, thus increasing research capacity.

Research education and learning

Most strategies included the planned provision of research skills training and development for research-active clinical staff (SKILLS), and in some cases for other clinical staff and managers. Two strategies planned for training arrangements to be reviewed by research management groups (INF). Supporting skills around change management and innovation was recognised as a way of enabling learning through change and sharing that learning with others (CTP).

Planned training activities included workshops on generic research methods such as study design, accessing and appraising evidence, ethics, research governance, and writing for publication. Such training would be provided in-house (INF) or through academic networks such as the CLAHRC YH (LINKS). One strategy referred to commissioning bespoke training, and two planned to identify funding opportunities to support training.

A key theme was support for new and emerging researchers. This included mechanisms to identify clinicians with research ambitions, and to maintain a database of staff with research potential (INF). One larger trust offered an intensive ‘Research Boot Camp’ focussing on grant applications and publications for early-career clinical researchers, with associated mentoring and peer support opportunities. Mentorship was included in several strategies.

Setting targets and monitoring performance

Several strategies described audit activities to gather ‘baseline’ research activity data against which targets for improvement could be set, thus developing a culture of continuous improvement. Strategies included provision for monitoring and reporting progress against these targets (INF). Most would monitor research activity performance by the number of studies undertaken, levels of staff and PPI engagement in studies (CTP), and recruitment rates into studies. Targets for recruitment rates on portfolio studies were set externally by the NIHR CRN, and linked to income for the trust (SUS). The number of grant applications, and of grant-funded or commercially sponsored research studies, was often cited (SUS). Monitoring of research outputs (peer-reviewed publications and conference presentations) was common (DISS).

Targets for research income were evident and most strategies included a level of desirable growth in income (SUS). Links with industry were evident in target setting (LINKS). One strategy referred to shared performance targets with commercial partners, and another aimed to assess which partnerships were better at achieving commercial income (SUS).

Most strategies set objectives around staff engagement in research activities. These included an increase in ‘research ready’ staff or research activity in named staff groups, and some specified a desirable number of people at stages within a clinical academic pathway. Planned mechanisms for achieving this included developing research skills in staff (SKILLS) and links to academic networks and universities (LINKS).

Plans to provide regular performance reports to the Board of Directors were also included in a number of strategies, reflecting high level ‘hard-wiring’ into the organisation (INF). One strategy described a structure of distributed leadership ‘across the organisation from the Trust Board, through care groups into directorate and into clinical teams’ (CTP). It was thought that these leaders could enable enthusiasm for research, and spread innovation from research active groups to other parts of the organisation (DISS).

Internal investment: allocating resources to promote research capacity

The strategies highlighted the existing and planned use of organisational and financial resources to support RCD (SUS). The financial resource for this purpose came from ‘invest to save’ strategic use of trust finance, and from research income. The latter included grants, commercial collaborations, NIHR portfolio study recruitment funds, NIHR ‘between grant’ research capacity funds, and charitable monies linked to the organisation. One activity strongly aligned with supporting a research culture was executive-level support to identify additional resources, agreed through ‘matched’ funding into research collaborations such as the CLAHRC YH. ‘Matched funding’ is a process by which members of the NHS workforce work with research teams to undertake research and implementation projects that align with trust objectives: the NHS organisation agrees to provide practitioner and manager time to the project, while the academics’ time is externally funded and freely available to the trust. This reciprocal arrangement increases access to practical and methodological support.

How resources were utilised varied amongst the strategies. Many described a funding distribution model, agreed at a senior level to recompense research activity and incentivise clinical engagement in research priorities (CTP), as well as focusing on likely return on this investment (SUS).

Internal investment was planned to fund: research support services (e.g. research governance functions, PPI, and portfolio study recruitment activities at English sites) (INF); other training and support activities, typically involving funding academics to provide training and to work with practitioners and managers on preparing grant applications (SKILLS); protected research time in practitioners’ job plans to ‘legitimise’ research alongside practice commitments (often in recognition of previous research activity) (INF); joint clinical/academic posts with academic partners (LINKS); and seed-corn priority-setting events and funds to develop research ideas with academics (LINKS).

Conclusions

Activities and RCD principles: making research ‘core business’ in health organisations

We have described a range of activities identified through thematic analysis of ten NHS organisations’ R&D strategies in two countries in the UK. Whilst the data arises from planning documents in health services in high income countries, the activities address a range of principles (see Table 2 ) developed from a framework shaped by international evidence [ 13 ], indicating potential for building research capacity elsewhere. This paper offers some concrete examples of how the principles can be articulated in strategy documents. Targeted at organisational level, the activities described aim to make research ‘core business’ in a full range of healthcare organisations. We propose that these would be good candidate ideas for other health organisations planning RCD strategies.

The activities demonstrate a complex interplay between planning at an organisational level to develop a strong internal infrastructure, and undertakings that support individual career planning. They also aim to build stronger inter-organisational relationships and networks within health systems. The approach planned by many of the ACORN organisations is multi-layered and multifaceted, which has been shown to be effective elsewhere [ 8 , 13 , 23 ], and affirms observations made by others about the complexity of effective RCD organisational ‘interventions’ [ 14 , 16 , 24 , 25 ].

Strengths of activities for RCD

A number of activities cover the majority of RCD principles as presented in Table 2 . These include: setting targets and monitoring performance; investing in internal resource for capacity development; developing and sustaining research collaborations; and developing research priorities, along with PPI. These activities balance inward looking and outward looking approaches, reinforcing the need for collaboration and networking recognised by others. Partnership development has been found to be an important determining factor for successful RCD across different organisations [ 24 , 25 , 26 ].

The only activity that addresses all RCD principles is setting targets and monitoring performance. This is an interesting observation: it is well recognised that measuring RCD is challenging. For example, Vasquez et al. [ 27 ] state that there is ‘limited consensus on and precedence for systematic evaluations of HRCS [health research capacity strengthening] initiatives, making it difficult to establish clear benchmarks for success’, yet it appears that the majority of these ACORN organisations have an ambition to do this. Some of this monitoring uses traditional measures, for example grant income and peer-reviewed publications, whilst other ideas are more innovative, tracking practitioner, manager and service user engagement in research projects and planning. Levine et al. [ 25 ] advocate the concurrent use of different success criteria in order to capture process and enhance cross-organisational comparison. Such monitoring will bring shared learning across the ACORN group as this community of practice progresses.

From Table 2 it can be seen that the RCD principles are not equally represented by the core activities. The principle evident across the most activities is infrastructure development. This is unsurprising, as this principle is very tangible at an organisational level and easy to operationalise. Infrastructure development could also be considered a first step in the R&D development of a health provider organisation. Levine et al. [ 25 ] have noted that infrastructure development is associated with how established research activity is within an organisation, with novice ‘seed’ organisations having little or no research infrastructure and ‘fertilizer’ organisations having a well-developed research infrastructure. Healthcare provider organisations are more likely to be the former, and planning to establish and maintain an infrastructure is an important element of planning RCD at an organisational level. The strategy documents we examined articulated important activities which can help achieve this.

The ‘close to practice’ principle is also evident within many activities, demonstrating an aspiration to involve practitioners, managers and patients throughout the research cycle. Many organisations realise this particular ‘offer’ by contributing to academic-practice partnerships, particularly in relation to PPI, and aim to monitor the impact of this activity through linked grant capture. However, there was less measurement of the clinical impact of research in services, although this was often a stated ambition of the strategies.

Much of the ‘hard wiring’ activity aims to make research ‘everybody’s business’, for example within job descriptions, through mentoring, and by integrating research aims into job plans, appraisal and performance management. There are also plans to reduce barriers to research engagement by establishing protected time for practitioners to do research and providing support for researchers to navigate the complex system of ethics and governance approvals. This approach is important, as experiential ‘learning by doing’ is associated with clinical academic career development [ 12 , 28 , 29 ].

Authors have stressed the importance of providing funds for developing and sustaining research capacity [ 8 , 30 ]. This is reinforced by our findings. Financial and business planning was woven throughout the strategy documents, underpinned by financial incentives for recruitment, strategic judgements about return on investment, and assessment of partners likely to achieve commercial income or further grant capture. The national funding body (NIHR) had a great deal of influence in shaping the NHS business case. This included activity funded through recruitment incentives, the use of ‘match funding’ in research partnerships such as the CLAHRC, and capacity funds associated with ‘between grants’ activity (RCF). It is noticeable that developing links with industry was evident across the strategies, which contradicts reports elsewhere in regard to RCD in NHS organisations [ 14 ], and could reflect more recent national NIHR policy and guidance. The aim of such business planning is to use the resulting funds to expand capacity building activity and support clinical academic careers. Whether this ambition is realised is yet to be established, however.

Under-represented principles: opportunities for improving impact

Actionable dissemination is the least well represented principle (see Table 2 ). These findings are consistent with those of a qualitative study exploring barriers, motivators and critical success factors in establishing a strong research culture within AHPs in Australia [ 4 ]. Many of the trusts planned to use traditional benchmarks of peer-reviewed publications as a measure of research capacity, rather than outputs that could have an impact on clinical practice, and local dissemination was limited to newsletters and local conferences rather than ‘actionable’ outputs. However, the use of such ‘boundary objects’ has been found to be a useful way of bridging the research-practice gap [ 11 ], and can demonstrate direct benefits to the clinical and quality objectives of an NHS organisation.

Summary: how to make research ‘core business’ in health organisations

Many ACORN organisations aim to ‘hard wire’ RCB by strategically planning a range of RCD activities and linking research planning to wider organisational and business planning processes. The strategic plans were dominated by developing a strong infrastructure, and activities that enable research ‘close to practice’ were also apparent. Actionable dissemination was less evident in these plans: the model was primarily one of research production rather than research use and application. The strategic plans covered a broad range of target setting and performance monitoring, offering potential for measuring the impact of these plans on RCD, and for learning from one another in the ACORN group. The impact of research on clinical services and direct benefit for patients was missing, and warrants further investigation.

Our findings have demonstrated how research funders can influence health systems and capacity building by providing incentives for research activity, and by supporting creative matched funding arrangements, where practitioner and health managers’ time is matched with grant funds to cement health systems’ engagement in such collaborations. ‘Between grant’ funding also supports organisational planning to sustain and strengthen capacity.

This paper offers some examples of how a number of health organisations are planning to build and sustain research capacity and measure progress, and offers some ideas for further examination and debate. Our findings are based on written plans, which are recognised as an important step in organisational change [ 31 ]. The implementation and impact of such plans are not described here, but would be worthy of investigation. Nevertheless, our findings offer some interesting insights into how national policy and research funders can influence health systems’ plans to engage in research.

Abbreviations

(L)CRN: (Local) Clinical Research Network
ACORN: Addressing Capacity in Organisations to do Research Network
AHP: Allied Health Professional
CBU: Clinical Business Unit
CLAHRC: Collaboration for Leadership in Applied Health Research and Care
Close to Practice (from the Cooke framework)
Actionable Dissemination (from the Cooke framework)
Infrastructure (from the Cooke framework)
Linkages and Collaborations (from the Cooke framework)
NHS: National Health Service
NIHR: National Institute for Health Research
PPI: Patient and Public Involvement
R&D: Research and Development
RCB: Research Capacity Building
RCD: Research Capacity Development
RCF: Research Capability Funding
RDS: Research Design Service
REF: Research Excellence Framework
Skills Development (from the Cooke framework)
Sustainability (from the Cooke framework)

Cooke J, Booth A, Nancarrow S, Wilkinson A. Re: Cap - Identifying the evidence base for research capacity development in health and social care. Sheffield: University of Sheffield, Trent Research and Development Support Unit; 2006.


Faden RR, Kass NE, Goodman SN, Pronovost P, Tunis S, Beauchamp TL. An ethics framework for a learning health care system: a departure from traditional research ethics and clinical ethics. Hast Cent Rep. 2013;43(s1):S16–27.


Farmer E, Weston K. A conceptual model for capacity building in Australian primary health care research. Aust Fam Physician. 2002;31(12):1139.


Golenko X, Pager S, Holden L. A thematic analysis of the role of the organisation in building allied health research capacity: a senior managers’ perspective. BMC Health Serv Res. 2012;12:276.

Whitworth A, Haining S, Stringer H. Enhancing research capacity across healthcare and higher education sectors: development and evaluation of an integrated model. BMC Health Serv Res. 2012;12:287.

NHS England. Putting patients first: The NHS England business plan for 2013/14–2015/16. London: NHS England; 2013.

Department of Health. The NHS Constitution for England. London: Department of Health; 2015.

Condell SL, Begley C. Capacity building: A concept analysis of the term applied to research. Int J Nurs Pract. 2007;13(5):268–75.


Department of Health. Best Research for Best Health: a new national health research strategy. 2006.

Kislov R, Waterman H, Harvey G, Boaden R. Rethinking capacity building for knowledge mobilisation: developing multilevel capabilities in healthcare organisations. Implement Sci. 2014;9:166.

Kislov R, Harvey G, Walshe K. Collaborations for Leadership in Applied Health Research and Care: lessons from the theory of communities of practice. Implement Sci. 2011;6(1):1.

Gullick JG, West SH. Building research capacity and productivity among advanced practice nurses: an evaluation of the Community of Practice model. J Adv Nurs. 2016;72(3):605–19.

Cooke J. A framework to evaluate research capacity building in health care. BMC Fam Pract. 2005;6:44.


Sarre G, Cooke J. Developing indicators for measuring Research Capacity Development in primary care organizations: a consensus approach using a nominal group technique. Health Soc Care Community. 2009;17(3):244–53.

Taliercio V, Logan JR, Kalpathy-Cramer J, Oteroa P. Developing a survey to assess factors that contribute to physician involvement in clinical research. Stud Health Technol Inform. 2013;192:107–11.


Suter E, Lait J, Macdonald L, Wener P, Law R, Khalili H, McCarthy PL. Strategic approach to building research capacity in inter-professional education and collaboration. Healthc Q. 2011;14(2):54–60.

Cooke J, Nancarrow S, Dyas J, Williams M. An evaluation of the ‘Designated Research Team’ approach to building research capacity in primary care. BMC Fam Pract. 2008;9(1):1.

Payne S, Seymour J, Grande G, Froggatt K, Molassiotis A, Lloyd-Williams M, Foster C, Addington-Hall J, Rolls E, Todd C. An evaluation of research capacity building from the Cancer Experiences Collaborative. BMJ Support Palliat Care. 2012;2(3):280–5.

Njie-Carr V, Kalengé S, Kelley J, Wilson A, Muliira JK, Nabirye RC, Glass N, Bollinger R, Alamo-Talisuna S, Chang LW. Research Capacity–Building Program for Clinicians and Staff at a Community-Based HIV Clinic in Uganda: A Pre/Post Evaluation. J Assoc Nurses AIDS Care. 2012;23(5):431–41.

Hogg W, Donskov M, Russell G, Pottie K, Liddy C, Johnston S, Chambers L. Riding the wave of primary care research: development of a primary health care research centre. Can Fam Physician. 2009;55(10):e35–40.

QSR International Pty Ltd. NVivo qualitative data analysis Software. Version 10. 2012.

Department of Health. Research Governance Framework for Health and Social Care. London: Department of Health; 2005.

Holden L, Pager S, Golenko X, Ware RS. Validation of the research capacity and culture (RCC) tool: measuring RCC at individual, team and organisation levels. Aust J Prim Health. 2012;18:62–7.

Segrott J, McIvor M, Green B. Challenges and strategies in developing nursing research capacity: a review of the literature. Int J Nurs Stud. 2006;43(5):637–51.

Levine R, Russ-Eft D, Burling A, Stephens J, Downey J. Evaluating health services research capacity building programs: Implications for health services and human resource development. Eval Program Plann. 2013;37:1–11.

Macfarlane F, Shaw S, Greenhalgh T, Carter YH. General practices as emergent research organizations: a qualitative study into organizational development. Fam Pract. 2005;22(3):298–304.

Vasquez EE, Hirsch JS, Giang LM, Parker RG. Rethinking health research capacity strengthening. Global Public Health. 2013;8(sup1):S104–24.

Lansang MA, Dennis R. Building capacity in health research in the developing world. Bull World Health Organ. 2004;82(10):764–70.

Cooke J, Bray K, Sriram V. Mapping research capacity in the CLAHRC community: supporting non-medical professionals. Report for the National CLAHRC Directors Forum. 2016.

Hulcombe J, Sturgess J, Souvlis T, Fitzgerald C. An approach to building research capacity for health practitioners in a public health environment: an organisational perspective. Aust Health Rev. 2014;38(3):252–8.

Al-Haddad S, Kotnour T. Integrating the organizational change literature: a model for successful change. J Organ Change Manage. 2015;28(2):234–62.


Acknowledgements

Not applicable.

Funding

This research was funded and supported by the National Institute for Health Research Collaboration for Leadership in Applied Health Research and Care Yorkshire and Humber (NIHR CLAHRC YH). www.clahrc-yh.nihr.ac.uk . The views and opinions expressed are those of the authors, and not necessarily those of the NHS, the NIHR or the Department of Health.

Availability of data and materials

The data in the form of the original R&D strategies that were analysed to inform this article are available from Jo Cooke, NIHR CLAHRC Yorkshire and Humber, Sheffield Teaching Hospitals NHS Foundation Trust, Room D33, D Floor, Royal Hallamshire Hospital, Sheffield, S10 2JF, United Kingdom.

Author information

Authors and Affiliations

Faculty of Health and Wellbeing, Sheffield Hallam University, Montgomery House, 32 Collegiate Crescent, Collegiate Campus S10 2BP, Sheffield, UK

Melanie Gee

NIHR CLAHRC Yorkshire and Humber, Sheffield Teaching Hospitals NHS Foundation Trust, Room D33, D Floor, Royal Hallamshire Hospital S10 2JF, Sheffield, UK

Jo Cooke

Contributions

JC conceived the documentary analysis which informed this debate paper and both authors made substantial contributions to its design. MG performed the initial coding for the documentary analysis. Both authors refined and consolidated the coding and interpreted the data using the RCD framework. Both authors made substantial contributions to the manuscript and both authors read and approved the final manuscript.

Corresponding author

Correspondence to Melanie Gee .

Ethics declarations

Competing interests

JC is the coordinator to the ACORN community of practice whose R&D strategies were analysed. MG has no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Additional file

Additional file 1: Final coding tree for the activity codes. (DOCX 19 kb)

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License ( http://creativecommons.org/licenses/by/4.0/ ), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated.


Cite this article.

Gee, M., Cooke, J. How do NHS organisations plan research capacity development? Strategies, strengths, and opportunities for improvement. BMC Health Serv Res 18 , 198 (2018). https://doi.org/10.1186/s12913-018-2992-2


Received: 12 August 2016

Accepted: 14 March 2018

Published: 22 March 2018

DOI: https://doi.org/10.1186/s12913-018-2992-2


Keywords

  • Research capacity development
  • Organisational infrastructure
  • Research funding
  • Health systems research

BMC Health Services Research

ISSN: 1472-6963


Research engagement and research capacity building: a priority for healthcare organisations in the UK

Journal of Health Organization and Management

ISSN : 1477-7266

Article publication date: 27 March 2023

Issue publication date: 17 May 2023

Purpose

To research the involvement of healthcare staff in the UK and to identify practical organisational and policy solutions to improve and boost the capacity of the existing workforce to conduct research.

Design/methodology/approach

A mixed-methods study, three work packages of which are presented here: secondary analysis of levels of staff research activity, funding, academic outputs and workforce among healthcare organisations in the United Kingdom; 39 Research and Development lead and funder interviews; an online survey of 11 healthcare organisations across the UK, with 1,016 responses from healthcare staff included for analysis; and 51 interviews of healthcare staff in different roles from six UK healthcare organisations.

Findings

Interest in research involvement is strong and widespread but hampered by a lack of systematic organisational support, despite national policies and strategies to increase staff engagement in research. While useful, these external strategies have had limited success without such organisational support. Healthcare organisations should embed research within organisational and human resources policies and increase the visibility of research through strategic organisational goals and governance processes. A systems-based approach is needed.

Research limitations/implications

The research gathered data from a limited number of NHS trusts, but these were purposively sampled to provide a range of acute and community health service organisations in different areas. The data were therefore more detailed and nuanced, owing to the more in-depth approach.

Practical implications

The findings are relevant for developing policies and practice within healthcare organisations to support research engagement. The findings also set out key policy and strategic recommendations that will support greater research engagement.

Social implications

Increased research activity and engagement in healthcare providers improves healthcare outcomes for patients.

Originality/value

This is a large-scale (UK-wide) study involving a broad range of healthcare staff, with good engagement of nurses, midwives and Allied Healthcare Professionals, a level of engagement not previously achieved. This allowed valuable analysis of under-researched groups and comparisons by professional group. The findings highlight the need for tailored action to embed research reporting, skills, professional development and infrastructure into organisational policies, strategies and systems, along with broader system-wide development.

Keywords

  • Health services research
  • Capacity building
  • Health professionals
  • Research engagement

Acknowledgements

The authors would like to thank CRUK and the project steering group for their assistance in the development of the survey – especially Jess Newberry le Vay. The authors also thank all those healthcare staff members who completed and returned the surveys or participated in the interviews.

Funding : This work was supported by Cancer Research UK (C41319/A28600). Views expressed are those of the researchers and not necessarily those of the funder.

Peckham, S. , Zhang, W. , Eida, T. , Hashem, F. and Kendall, S. (2023), "Research engagement and research capacity building: a priority for healthcare organisations in the UK", Journal of Health Organization and Management , Vol. 37 No. 3, pp. 343-359. https://doi.org/10.1108/JHOM-12-2021-0436

Emerald Publishing Limited

Copyright © 2023, Emerald Publishing Limited



Open Access

Health in Action

The Health in Action section provides a place where groups or individuals who are not represented regularly in a medical journal have a forum to describe the important issues from their perspective. Authors might include patient advocacy groups, healthcare workers, or non-governmental organizations.


Evaluating Health Research Capacity Building: An Evidence-Based Tool

* To whom correspondence should be addressed. [email protected]

  • Imelda Bates,
  • Alex Yaw Osei Akoto,
  • Daniel Ansong,
  • Patrick Karikari,
  • George Bedu-Addo,
  • Julia Critchley,
  • Tsiri Agbenyega,
  • Anthony Nsiah-Asare


Published: July 18, 2006

  • https://doi.org/10.1371/journal.pmed.0030299


Citation: Bates I, Akoto AYO, Ansong D, Karikari P, Bedu-Addo G, Critchley J, et al. (2006) Evaluating Health Research Capacity Building: An Evidence-Based Tool. PLoS Med 3(8): e299. https://doi.org/10.1371/journal.pmed.0030299

Copyright: © 2006 Bates et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Funding: The capacity-building programme in Ghana is funded by Komfo Anokye Teaching Hospital and the School of Medical Sciences, Kwame Nkrumah University of Science and Technology, Kumasi, Ghana, and the Malaria Knowledge Programme and Effective Health Care Alliance Programme (funded by the Department for International Development, United Kingdom) at the Liverpool School of Tropical Medicine, Liverpool, United Kingdom. The Department for International Development, United Kingdom, had no role in the study design; the collection, analysis, and interpretation of data; the writing of this paper; or the decision to submit this paper for publication; and accepts no responsibility for the information or views expressed.

Competing interests: The authors declare that no competing interests exist.

Abbreviations: EBM, evidence-based medicine; KATH, Komfo Anokye Teaching Hospital; LSTM, Liverpool School of Tropical Medicine; QA, quality assurance; WHO, World Health Organization

An increasingly important goal of governments and external agencies in developing countries is the need for “capacity building” in health research. Although a poorly defined and understood concept, capacity building would essentially enable de novo health research programmes to be facilitated and existing programmes to be strengthened (see [ 1 ] and page 14 in [ 2 ]). For health research, the goal of building capacity is thus to improve the ability to conduct research, to use results effectively, and to promote demand for research (see page 14 in [ 2 ]). Prioritising the need for the international community to make a “quantum leap in capacity building”, as suggested in 1998 by the Director General of the World Health Organization (WHO), would improve health and reduce poverty in developing countries [ 3 ].

To achieve this goal, there is an urgent need for an evidence-based tool for determining whether the required infrastructure is present in a given setting, as well as for underpinning the design and evaluation of capacity-building programmes in health research. Here, we describe the development and use of such a tool through analysis of published models and effective capacity-building principles, together with structured reflection and action (see page 9 in [ 2 ]) by stakeholders at the Komfo Anokye Teaching Hospital (KATH) in Kumasi, Ghana.

Challenges Faced in Building and Supporting Research Capacity at KATH

KATH benefits from a new management team that is committed to developing the hospital and medical school into a regional centre of excellence for research, teaching, and clinical care. Although local clinicians had previously been involved in multinational research projects, these projects had largely been generated by external agencies. Local staff lacked experience in the conception and design of projects, and the hospital lacked local role models and tutors for generating de novo research. Consultant posts at KATH remained vacant because senior clinical trainees had difficulty in completing the prerequisite research component of their exit examinations for the West African Colleges. Tellingly, when asked why they had not completed their specialist exams, the most common reason given by health professionals in KATH was apprehension about beginning their own research programmes.

KATH management needed a tool that they could use to ensure that all necessary resources were in place to support local research. Unfortunately, the literature specifically describing the building of health research capacity is scarce and tends to emphasise microlevel activities, such as the choice of research trainees (e.g., Nchinda TC [ 3 ]), without considering how these activities can be integrated into the wider research system. Moreover, much of the available information on building research capacity is based on retrospective reports of external consultants, and the perspective of implementing capacity building in a developing country is almost never represented [ 4 ]. Our aim therefore was to develop an evidence-based tool that could be used to guide the design, implementation, and evaluation of capacity building in health research programmes.

Developing an Appropriate Evaluation Tool

We used a three-stage approach: (1) searching the literature for existing tools and models; (2) analysing best-practice examples to guide the overall framework; and (3) adapting the framework into an operational tool that met the specific needs of KATH. By using translational research principles to analyse our findings, we systematically extracted and extrapolated stakeholders' evidential and experiential stories (see page 9 in [ 2 ]), and used this information to inform the overall design of our programme. Figure 1 outlines the stages of development and testing of the tool.

Figure 1. Stages of development and testing of the tool. https://doi.org/10.1371/journal.pmed.0030299.g001

Literature search

We searched the following electronic databases: MEDLINE, Ingenta, and Science Direct, using keywords such as “capacity building”, “capacity development”, “developing countries”, and “Africa”. We retrieved the full text of any relevant papers, including articles cited in the reference lists of these papers. Because there is limited information about health research capacity building in peer-reviewed literature, we also consulted books, Web sites of organisations working on health and research capacity building (e.g., Web sites of WHO, United Nations agencies, the European Community, and the International Development Research Centre), and references provided by colleagues. This evidence was used to derive a definition of health research capacity building, to identify existing capacity-building models, and to synthesise best-practice examples to derive key principles. Dataset S1 gives a detailed overview of our literature review on capacity building.

We found that many different definitions have been applied to capacity building according to the particular level—“micro”, “meso”, or “macro” (focused on in [ 5 ])—but that one of the most widely used definitions is “an ability of individuals, organisations or systems to perform appropriate functions effectively, efficiently and sustainably” [ 6 ].

By combining the definition for generic “capacity building” with published evidence and our practical experiences of developing a planning and evaluation tool, we have defined building capacity for health research as “an ability of individuals, organisations, or systems to perform and utilise health research effectively, efficiently, and sustainably.”

Using published best-practice examples to design the evaluation programme

No tools exist that are specific for evaluating health research capacity-building programmes. However, the literature review was helpful for identifying ineffective capacity-building strategies, such as “bolting-on” capacity building to research projects initiated by a specific donor in developing countries [ 7 ]. It was also useful for highlighting generic principles underlying successful capacity building.

We grouped the generic principles that consistently emerged from the literature as best practices into themes that emphasised the importance of three concepts. The first theme was a “phased approach”; this requires the sequential involvement of all stakeholders in assessing capacity gaps, developing strategies to fill these gaps, and evaluating outcomes [ 6 ]. The second theme was “strengthening of existing processes”; this is an iterative and flexible process that focuses on enhancing local ability to solve problems, define and achieve development needs, and then incrementally incorporate expanding circles of individuals, institutions, and systems [ 8 ]. The third theme was “partnerships”; for effective or sustained capacity building, the various partners involved must have similar concepts [ 5 ] and share responsibilities and obligations, with local partners taking ownership and leadership [ 6 , 9 ]. Thus, the role of external expertise is to facilitate the development of local skills through learning by experience, rather than acting as a “donor” who retains control of funds and expertise over a poorer “beneficiary” partner.

Developing and adapting the evaluation tool

An illuminating finding of the literature search was that there was no model that had been specifically designed with health research capacity building in mind. Indeed, the most useful model was one that had been developed for institutionalising quality assurance (QA) [ 10 ] because it focused on defining, measuring, and improving quality. This mirrors the processes required for capacity building in health research: defining the institutional systems needed to support research, enumerating existing and missing resources, and improving research support by addressing identified gaps. The QA institutionalisation framework represented a synthesis of over ten years' experience in developing countries, and was derived from a combination of organisational development and QA literature. The framework described organisations as passing through four phases when they implement innovation: awareness, experiential, expansion, and consolidation ( Table 1 ).


https://doi.org/10.1371/journal.pmed.0030299.t001

In the course of adapting our framework into a tool that was relevant to KATH, we were also influenced by a published framework for dissemination and implementation of evidence-based medicine (EBM) [ 11 ]. This prompted us to change the name of the “experiential” phase to “implementation”, as this was more appropriate to a research programme.

To meet the specific needs of KATH, local research stakeholders participated in adapting the tool. These stakeholders comprised ten KATH health professionals (nine clinicians and one physiotherapist), and senior hospital managers—including the chief executive, medical director, and heads of departments. Individual and group discussions took place during a workshop for the health professionals. Stakeholders considered each phase in the framework ( Table 2 ), and suggested characteristics, activities, and indicators of progress in building research capacity that met the needs of KATH and that could be feasibly measured or shown. The stakeholders' suggestions were incorporated into the framework to create an operational tool that could be used to identify gaps in the research infrastructural support at KATH ( Table 2 ). This ensured that a holistic approach was taken to developing the research capacity in the hospital, rather than a fragmented, unfocused approach.


https://doi.org/10.1371/journal.pmed.0030299.t002

Using the Evaluation Tool at KATH

Identifying strengths and weaknesses in the research capacity

In the year following development of the tool, the ten health professionals undertook a research project as part of a work-based course to prepare them for the research component of their professional exams. By comparing their actual research experiences at KATH with the components itemised in the evaluation tool, they were able to identify aspects that were well supported by the institution and aspects where support was lacking or could only be provided by external facilitators. The comparison was achieved through group discussions and analysis of individual reflective statements about their research experiences, using a standard qualitative research approach known as “grounded theory”. Individual statements were scrutinised, and themes relating to research infrastructural support were extracted. Cycles of scrutinising, extracting data, and allocating it to themes were repeated until no new themes emerged [ 12 ].
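The iterative coding cycle described above — scrutinising statements, extracting themes, and repeating until no new themes emerge — can be sketched programmatically. This is only an illustration of the saturation logic, not the authors' actual analysis: the `extract_themes` function below is a hypothetical keyword-based stand-in for the human analyst, and all statements, keywords, and theme names are invented.

```python
def code_until_saturation(statements, extract_themes):
    """Repeat cycles of scrutinising statements and extracting themes
    until a full pass yields no new themes (thematic saturation)."""
    themes = set()
    while True:
        new_found = False
        for statement in statements:
            for theme in extract_themes(statement, themes):
                if theme not in themes:
                    themes.add(theme)
                    new_found = True
        if not new_found:  # a complete pass added nothing new
            return themes

# Hypothetical stand-in for the analyst: map keywords to themes.
KEYWORDS = {"committee": "peer support", "workshop": "peer support",
            "internet": "IT access", "search": "literature searching"}

def extract_themes(statement, _known_themes):
    return {t for k, t in KEYWORDS.items() if k in statement.lower()}

statements = ["The review committee helped refine my proposal",
              "I struggled to search the literature without internet"]
print(sorted(code_until_saturation(statements, extract_themes)))
# → ['IT access', 'literature searching', 'peer support']
```

In practice the "extraction" step is interpretive human work; the loop simply captures the stopping rule cited from Glaser and Strauss [ 12 ].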

A comparison of the themes that emerged from this process with the capacity-building evaluation tool identified strengths and weaknesses in the research infrastructural support. Strengths included the peer-support mechanisms within KATH, which occurred predominantly in three contexts (peer group committees to review research proposals, small group work within course workshops, and cross-departmental research meetings). Peer support to promote work-based learning is an evidence-based educational approach [ 13 ], so the peer-support mechanisms in KATH corresponded to components of the evaluation tool. Weaknesses included gaps in knowledge concerning research resources available on the Internet, particularly systematic searching of the published literature.

Prioritising and implementing actions for addressing gaps in the research capacity

A nominal group technique [ 14 ] was used to achieve consensus among researchers about aspects of research support that were lacking in KATH and to agree on which of these should be prioritised. For this technique, researchers used their experiences of doing research and the evaluation tool to write their own observations on areas of research infrastructure that were lacking at KATH. These were categorised into themes by the whole group and ranked according to their importance for supporting research. Gaps that were identified as priorities included provision of local statistical expertise, lack of researcher skills in critical literature reviews, and inadequate Internet access. These gaps were presented by the researchers to senior managers in KATH as a list of recommendations, and the managers incorporated activities to address these recommendations in their annual plans and budgets in 2004/2005 and 2005/2006. Progress was reviewed with the managers and the researchers during the six-month course workshops ( Table 3 ).
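The ranking step of the nominal group technique amounts to a simple aggregation: each researcher independently ranks the candidate gaps, and the group priority order is derived by combining the individual rankings. A minimal sketch of one common scheme (summed ranks, lower total = higher priority) follows; the gap names and the three voters are hypothetical, and real NGT implementations vary in how they score and break ties.

```python
from collections import defaultdict

def aggregate_rankings(rankings):
    """Combine individual rank lists (position 1 = most important)
    into a group priority order by summing ranks; lower totals win."""
    totals = defaultdict(int)
    for ranking in rankings:
        for position, item in enumerate(ranking, start=1):
            totals[item] += position
    return sorted(totals, key=lambda item: totals[item])

# Hypothetical example: three researchers each rank three identified gaps.
rankings = [
    ["statistics support", "internet access", "literature skills"],
    ["internet access", "statistics support", "literature skills"],
    ["statistics support", "literature skills", "internet access"],
]
print(aggregate_rankings(rankings))
# → ['statistics support', 'internet access', 'literature skills']
```

The value of the technique lies less in the arithmetic than in forcing each participant to commit to a ranking before group discussion, which limits domination by vocal members.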


https://doi.org/10.1371/journal.pmed.0030299.t003

What Was Achieved by Using the Tool?

Progress in strengthening the research infrastructure in KATH has been achieved both for individuals and for the institution. For individuals, a course to teach research skills has been established in partnership with the Liverpool School of Tropical Medicine (LSTM). Local facilitators have been trained to run the course and funding has been secured so that within three years the course will be wholly the responsibility of KATH staff, with LSTM providing external quality reviews. At an institutional level, an Internet suite has been refurbished and equipped for use by researchers, research support meetings are now a regular monthly event, and KATH has trained its own clinical biostatistician to support its researchers. Within 18 months of the original recommendations, KATH management and researchers have achieved many of the indicators of progress listed in the evaluation tool, and have developed plans to achieve the remaining indicators within the next two years. Naturally, progress on some indicators, particularly those relating to using research results to improve the quality of clinical care and encouraging whole departments to be more proactive about research, will be slow and could take several years to achieve.

What have we learned?

The evaluation tool has enabled researchers and hospital managers to work together to achieve a common goal of improving the research capacity in KATH. They have monitored their progress against predetermined standards and have identified and filled gaps in research infrastructure.

The evaluation tool should be flexible enough to incorporate changes in the local environment and the needs of KATH, and consequently we plan to re-evaluate and amend it within five years. Because changing the research culture of an institution is a complex process, some important components that should have been included in the tool might have been overlooked. For example, dialogue between scientists and nonscientists, as well as non-health-sector workers, is important for developing and sustaining health research capacity [ 3 ]. Such interactions are not represented in our tool, which has focused instead on building institutional capacity.

The success of the process by which this tool was developed and tested confirms the importance of the generic principles underlying effective capacity building that we extracted from the literature. We used a phased approach to engage stakeholders in identifying strengths and weaknesses, and then to develop, implement, and monitor action plans to address these gaps [ 6 ]. Part of this process involved identifying and strengthening existing processes and building up local resources, rather than developing new parallel systems [ 15 ]. This strengthening process included formalising the peer-support meetings that researchers had found so helpful, and expanding the existing Internet facilities. The process is a good example of a genuine partnership for problem solving that is built on trust, common interest, long-term commitment, and shared responsibilities and obligations [ 16 , 17 ]. Although funding for the process was initially shared between KATH and LSTM, KATH has maintained ownership and leadership, and is now totally funding the capacity-building process. Each partner had clearly delineated roles, and mechanisms and timescales for transfer of skills from LSTM to KATH staff were agreed on early in the process.

Two important factors in this project's success were the motivation of the researchers and the strong leadership and commitment of KATH managers. Participation of all stakeholders in the design of evaluation indicators is recognised to promote motivation and commitment (see Chapter 7 [ 18 ]). The rate of progress is likely to slow over the next few years, as the institutional shift towards research begins to involve individuals who might not share the high motivation of the managers and researchers, but the tool nevertheless provides a means for maintaining focus on achieving some of the more difficult indicators.

How transferable are these lessons and the tool?

The generic principles of effective capacity building—a phased approach, strengthening of existing systems, and partnerships for problem solving—were derived from contexts that were not health sector–specific, and yet they have been applied successfully here. However, the evaluation tool was developed for the context of health research at KATH, and its value and transferability in other contexts would need to be assessed.

Although the framework from which the tool was derived incorporated all the elements of a research process, such as problem identification, priority setting, and research use (see page 16 in [ 2 ]), the specific components used to produce the operational tool would need to be adapted to suit the specific needs of other institutions. Monitoring and evaluation is the most difficult and neglected component of capacity-building programmes because they can take over 20 years to achieve their objectives [ 8 ], and some outcomes, such as organisational culture, are difficult to measure [ 19 ]. Different users of evaluations will have different priorities, and the use of an evaluation tool helps to promote agreement on the purpose of the evaluation and the indicators [ 20 ]. The major advantages of our tool are that it enables an institution in a developing country to set its own priorities, to have control over local capacity building [ 21 ], and to evaluate progress in building capacity from its own perspective rather than from that of an external agency.

Supporting Information

Dataset S1. Further details of the literature review on capacity building.

https://doi.org/10.1371/journal.pmed.0030299.sd001

(52 KB DOC).

  • 1. Wojtas O (2004 June 25) New head will extend learning body's reach. The Times Higher Education. 11.
  • 2. Suwanwela C, Neufeld V (2001) Health research for development: Realities and challenges. Ottawa: International Development Research Centre. Available: http://www.idrc.ca/books/ev-27424-201-1-DO_TOPIC.html . Accessed 5 June 2006.
  • 5. European Centre for Development Policy Management (2000) Modernising international cooperation: Lessons and opportunities, ACP-EU partnership. Maastricht (Netherlands): European Centre for Development Policy Management. Issues paper 3.
  • 6. Milen A (2001) What do we know about capacity building? An overview of existing knowledge and good practice. Geneva: World Health Organization. Available: http://www.unescobkk.org/fileadmin/user_upload/aims/capacity_building.pdf . Accessed 7 June 2006.
  • 7. Parliamentary Office of Science and Technology [UK] (2004) Scientific capacity in developing countries. London: Parliamentary Office of Science and Technology. Available: http://www.parliament.uk/documents/upload/POSTpn216.pdf . Accessed 5 June 2006.
  • 8. United Nations Development Program (1998) Capacity assessment and development, in a systems and strategic management context. New York: United Nations Development Program. Available: http://www.capacity.undp.org/indexAction.cfm?module=Library&action=GetFile&DocumentID=5072 . Accessed 5 June 2006.
  • 9. Development Assistance Committee (1996) Shaping the 21st century: The contribution of development cooperation. Paris: Organisation for Economic Co-operation and Development. Available: http://www.oecd.org/dataoecd/23/35/2508761.pdf . Accessed 5 June 2006.
  • 12. Glaser BG, Strauss AL (1967) The discovery of grounded theory. Chicago: Aldine.
  • 16. European Centre for Development Policy Management (2000) Modernising international cooperation: Lessons and opportunities, ACP-EU partnership. Maastricht (Netherlands): European Centre for Development Policy Management. Case study 6.
  • 17. Fowler A (2000) Questioning partnership: The reality of aid and NGO relations. IDS Bulletin 31.
  • 18. Horton D, Alexaki A, Bennett-Lartey S, Brice KN, Campilan D (2003) Evaluating capacity development: Experiences from research and development organizations around the world. Ottawa: International Development Research Centre. Available: http://www.idrc.ca/en/ev-31556-201-1-DO_TOPIC.html . Accessed 12 June 2006.
  • 19. Land A (2000) Implementing institutional and capacity development: conceptual and operational issues. Maastricht (Netherlands): European Centre for Development Policy Management. ECDPM Discussion paper 14.
  • 20. World Health Organization (1995) WHO global programme on AIDS: Consultation of strengthening NGO HIV/AIDS umbrella initiatives. Geneva: World Health Organization.
  • 21. Chataway J, Smith J, Wield J (2005) An Africa-Canada-UK exploration: Building science and technology capacity with African partners. Ottawa: International Development Research Centre. Available: http://www.idrc.ca/roks/ev-70877-201-1-DO_TOPIC.html . Accessed 5 June 2006.
  • Open access
  • Published: 18 August 2016

Understanding collaboration in a multi-national research capacity-building partnership: a qualitative study

  • Dinansha Varshney 1 ,
  • Salla Atkins 2 ,
  • Arindam Das 3 &
  • Vishal Diwan 1 , 2 , 4  

Health Research Policy and Systems volume  14 , Article number:  64 ( 2016 ) Cite this article


Research capacity building and its impact on policy and international research partnerships is increasingly seen as important. High-income and low- and middle-income countries frequently engage in research collaborations. These can have a positive impact on research capacity building, provided such partnerships are long-term collaborations with a unified aim, but they also face challenges. What are these challenges, which often result in a short-term, non-viable collaboration? Does such collaboration result in capacity building? What is required to make a collaboration sustainable? This study aimed to answer these and other research questions by examining the international collaboration in one multi-country research capacity building project, ARCADE RSDH (Asian Regional Capacity Development for Research on Social Determinants of Health).

A qualitative study was conducted that focused on the reasons for the collaboration, collaboration patterns involved, processes of exchanging information, barriers faced and perceived growth in research capacity. In-depth interviews were conducted with the principal investigators (n = 12), research assistants (n = 2) and a scientific coordinator (n = 1) of the collaborating institutes. Data were analysed using thematic framework analysis.

The initial contact between institutes came through previous collaborations. The collaboration was affected by the organisational structure of the partner institutes, political influences and the collaboration design. Communication was usually conducted online, and was hampered by time and language differences and unreliable infrastructure. Limited funding resulted in restricted engagement by some partners.

This study explored work in a large, North-South collaboration project focusing on building research capacity in partner institutes. The project helped strengthen research capacity, though differences in organisation types, existing research capacity, culture, time, and language acted as obstacles to its success. Managing these differences requires preplanned strategies to develop functional communication channels among the partners, maintain transparency, and share the rewards and benefits at all stages of collaboration.

Peer Review reports

Health research capacity can contribute to the overall health system development of any country, particularly in low- and middle-income countries (LMICs), where research capacity is weaker [ 1 , 2 ]. The reasons for low research output from LMICs include shortages of local qualified researchers, limited funding, poor infrastructure, and lack of expertise in academic writing [ 3 – 7 ]. Building and sustaining research capacity within developing countries is a complex [ 7 , 8 ], but essential and effective means of accelerating research contributions to health and development [ 9 – 12 ]. However, researchers have noted that 90% of global research investment addresses the needs of only 10% of the world's population [ 13 ]. Most research output also comes from high-income countries [ 14 ].

Facilitating collaboration between developed and developing counterparts [ 15 ] could result in higher research outputs from LMICs [ 16 , 17 ]. To this end, organisations such as the Council on Health Research for Development (COHRED) in 2003 were created at the global level to work directly with governments in LMICs to promote national and international collaborations [ 18 ]. Research collaboration in general has grown in importance for scientists, research organisations and policymakers [ 19 – 21 ]. However, in spite of many initiatives, generally originating from high-income countries, collaborations are often criticised for failing to strengthen, incorporate, and involve low-income partners in priority setting and publications, and research collaboration does not always result in increased outputs [ 15 ]. An international, cross-disciplinary project faces many challenges, such as communication and coordination problems, misunderstandings, and mismatched expectations. Participants in such projects come from different fields of work, work towards a unified goal, and are usually dependent on each other [ 16 ]. The responsibility for making such collaboration successful therefore falls on the lead researchers, who lead others at partner universities and institutes. Managing multiple stakeholders with different resources and expectations is also a challenge [ 13 ]. It is suggested that successful research collaborations need exploration and identification of areas of intervention, effective dissemination strategies, uptake of results and, most importantly, the commitment of the partner countries [ 9 , 10 ]. Barriers to successful collaboration include, amongst others, aims that are not shared, unequal distribution of power, lack of trust, ineffective membership structures and poor leadership [ 22 ].

Despite these challenges, international collaborations are often presented as a panacea for particularly complex issues and problems that exist within the fields of policy and politics in a wide range of international contexts [ 23 ]. Many benefits and requirements of such interventions have been documented, but the actual process of collaboration has been studied infrequently. Given the challenges in trans-disciplinary and international collaborative research, and the possible barriers to success, it is important to learn from existing collaborations in order to give recommendations for countering challenges. Therefore, in one multi-country research collaboration, ARCADE RSDH (Asian Regional Capacity Development for Research on Social Determinants of Health), we aimed to understand (1) what challenges often result in a short-term, non-viable collaboration; (2) whether such collaboration results in capacity building; and (3) what is required to make collaborations sustainable.

ARCADE RSDH Project

ARCADE RSDH ( www.arcade-project.org ) was developed in response to inequities in health in the Asian context, which coincide with weak local health research capacity, especially in research on the social determinants of health (SDH) [ 24 ]. The project originated as an adaptation of its sister project, ARCADE HSSR, and was funded by the European Union. The purpose of this collaboration was to add new research training capacity by training a new generation of researchers, with a focus on postgraduate, doctoral and postdoctoral training in LMICs in Asia, and on the promotion of research on the SDH [ 24 ]. Under this collaboration, many courses were developed and delivered across institutions, and institutional capacity in grants management and communications was built [ 25 , 26 ]. The project used innovative technologies to produce world-class online learning modules, making courses available to researchers in LMICs who might not otherwise have had access to such material. The list of ARCADE partners is shown in Table  1 .

The ARCADE Consortia

ARCADE RSDH operated as a network of 12 universities with expertise in research on the social determinants of health or related areas. Karolinska Institutet (KI), based in Sweden, coordinated the project activities. At each partner institution, the Principal Investigator (PI) was usually a senior staff member, supported by junior staff (postdoctoral fellows, researchers or PhD students) in running the project.

Activities were developed collaboratively within the consortium. The consortium had two regional training centres (hubs; TJMC and SJNANHS, at the time of this study), each training doctoral and postdoctoral students from their own countries, from the other Asian partners and from the European partners, and also open to international student applications. The European partners and their networks supported the two regional training centres through staff visits and exchanges. The two regional training centres supported each other through exchange of course materials, experience, skills and staff, joint training programmes for their students, and joint applications for research and research training grants in cooperation with European partners.

Study setting

We conducted a qualitative interview study online from R D Gardi Medical College, Ujjain, India, with the 12 partner universities, from India, China, Oman, Vietnam, the United Kingdom, Sweden, Finland and South Africa, between March and June 2014. The study was conducted in the middle phase of the project.

Data collection

Email invitations stating the purpose and objectives of the study were sent to all the PIs and project staff of the partner organisations for in-depth interviews. In total, 16 participants from 12 institutions (12 PIs, two research assistants, one project manager and one scientific coordinator) participated in the study. Half of the interviewees were key people involved in the formulation and execution of the project; the rest played a less central role. Individual interviews, at a time and place convenient to the participants, were conducted in English through Skype in March and April 2014. The interviews followed topic guides designed for the following categories of participants: the coordinating organisation, partners with large funding (including the hubs), and partners with little funding. The topic guide addressed (1) the reasons for collaboration, (2) how information was exchanged and the challenges faced while carrying out the project's activities, and (3) whether capacity growth had occurred in the developing country over the course of the collaboration (Additional file 1 ). Each interview lasted approximately 40 to 45 minutes. The interviews were audio-recorded; the first author then transcribed them, and the transcripts were cross-checked by the last author (VD). In order to protect anonymity, each transcript was marked with a unique case identification number and all names were removed.

Data analysis

The data were analysed using thematic framework analysis. The transcripts were read and reread, and initial codes were identified by the research team. Each code was described briefly. From these codes, a list of categories was developed and tabulated using Microsoft Excel 2010. This framework was then reapplied to the transcripts, charting the data into Excel and modifying categories where necessary. The final framework, consisting of 52 codes and 16 subcategories, was analysed further to develop seven categories and three themes. To increase reliability, the complete thematic framework was read independently by the last author and updated, and the analyses were discussed repeatedly among the authors [ 27 – 29 ].
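The charting step of framework analysis — applying a fixed code list to every transcript and tabulating the results — essentially builds a transcript-by-category matrix, here done in Excel. A minimal sketch of that tabulation follows; the framework, the keyword-matching shortcut, and the transcript snippets are all hypothetical (real charting is done by a human coder reading each passage, not by string matching).

```python
# Hypothetical framework: categories mapping to their member codes.
FRAMEWORK = {
    "communication": ["email", "skype", "meeting"],
    "funding": ["budget", "grant"],
}

def chart(transcripts, framework):
    """Chart each transcript against the framework: count occurrences
    of each category's codes, yielding a transcript x category matrix
    analogous to the Excel charts used in framework analysis."""
    matrix = {}
    for tid, text in transcripts.items():
        lower = text.lower()
        matrix[tid] = {cat: sum(lower.count(code) for code in codes)
                       for cat, codes in framework.items()}
    return matrix

transcripts = {"P01": "We used email and Skype; the grant was small.",
               "P02": "Annual meeting discussions shaped the budget."}
print(chart(transcripts, FRAMEWORK))
# → {'P01': {'communication': 2, 'funding': 1},
#    'P02': {'communication': 1, 'funding': 1}}
```

The matrix view makes it easy to see which categories are saturated across transcripts and which are thin, which is what guides the subsequent merging of codes into categories and themes.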

Three themes emerged from the analysis, namely (1) collaboration process: perception, phases and patterns; (2) communication and outputs hampered by Internet infrastructure and consortium size; and (3) outcomes of the collaboration: what was actually achieved (Table  2 ).

Theme 1: Collaboration process: perception, phases and pattern

The collaboration process theme emerged from the following categories: (1) perception about the project – unequal participation depending on available funding; (2) collaboration process – importance of the network; and (3) collaboration pattern – challenges between Asian partners.

Category 1: Perception about the project – unequal participation depending on available funding

A clear understanding about the project aim and goal is important in an international collaborative project. It seemed that ARCADE RSDH partners shared an understanding of the project. Many interviewees saw ARCADE RSDH as a great way of extending collaboration and adding research capacity to their own institutes.

“ Our institute needs to work with other international institutes, to enhance our network ” – Interviewee, small partner organisation.
“ The peculiarity of this project is that the big partners have a bigger involvement, stronger roles and they tend to participate more. The involvement of smaller partners seems to be a little less involved, but there are some exceptions in our case ” – Interviewee, coordinating organisation.

Participants regarded the project as a way to improve both the collaboration capacity at the organisations as well as the educational capacity of the organisations.

Category 2: The collaboration process – importance of networks

The consortium was developed based on previous working experience. Initially, partners were selected based on previous working relationships with KI and on their knowledge of, and interest in, the subject area.

“ We knew that they were excellent from previous experience and we wanted to bring the collaboration to the new project to continue working with them ” – Interviewee, coordinating organisation.

Partner organisations were not selected just because they were known to the coordinator; they also had to have the expertise and resources to implement the study. For example, one of the small partners had no previous relationship with the coordinating institute, but joined the collaboration to extend its international network.

“ No, this is the first time we are working with KI, we don’t have any previous collaboration with any of the institute. Our institute is very much aligned to the aim of the project and we need to enhance our network ” – Interviewee, small partner.

Shared interest in the project's objectives was a motivation to join, though motivations differed between 'smaller' and 'larger' partners. Partners with less funding were motivated by access to resources and the opportunity to communicate with international experts, which would ultimately enhance their institute's research capacity. 'Larger' partners, on the other hand, wished to share their resources and expertise with the developing institutes to help them gain technical capacity. This mutual interest aided the collaboration:

“ KI approached us to give training to the less resourced research organisation and we have resources to help them, so we joined the project ” – Interviewee, big partner.
“ By this kind of collaboration, we could increase educational resource and communicate with international experts ” – Interviewee, small partner organisation.

Category 3: Collaboration pattern – challenges between Asian partners

Organisational culture and structure influence any collaboration to a great extent. In this project, universities were the preferred collaborators because of their flexible working style, though not all partners were universities. According to one interviewee, bureaucratic structures and political issues often made collaboration difficult with some government-run organisations because of their rigid working structures.

“ The role of politicians is very much higher so it is much more difficult to work with government institute especially in India. However, I am working with other institution in other countries, their political involvement is not there and it’s easy to work with them in comparison to India ” – Interviewee, coordinating institute.

Four patterns of collaboration were seen among the ARCADE RSDH partners: (1) between organisations in European countries; (2) between organisations in European and Asian countries; (3) between organisations in Asian countries; and (4) between organisations within the same country.

Active collaboration was seen between the European partners. One of the European partners is the coordinating organisation, which received help from another European organisation in developing online courses.

Collaboration between Asian and European partners was seen as good, though the smaller Asian partners seemed less involved than the 'larger' Asian partners:

“ We have this kind of activity of mentorship. We have expert faculties from Chinese university and IDS [United Kingdom] ” – Interviewee, small partner organisation.

However, the links between the Asian partners were weak: only one or two Asian institutes collaborated with each other, as opposed to collaborating with European institutes. Reasons mentioned for this included little prior interaction and limited experience of working together. Differences in the goals and administrative structures of the partners (medical colleges, management and research institutes, universities) also influenced the collaboration negatively.

“ We came to know our partner after the consortium was formed; most of the institutes are universities different from us so we don’t have much scope to learn from them ” – Interviewee, small partner.

Participants observed that 'smaller' partners collaborated with the hub institutes and other 'larger' partner organisations, but that there was little or no interaction among the 'smaller' partners themselves. Participants attributed this to a lack of resources at 'smaller' partners, limited funding, and differences in the institutes' educational systems, languages and cultures.

“ I think, as project needs more communication, many times we couldn’t get some difference in culture and language and we have different educational systems which poses challenges. Yes language is a big problem ” – Interviewee, small partner.

A lack of common goals among partner institutes may hinder international collaboration, and to some extent this was the case in ARCADE RSDH: according to one of the small-partner PIs, the programme intended to build the research capacity of the institutes by training PhD students rather than staff. As these institutes had no PhD students, they were not active participants.

Theme 2: Communication and outputs hampered by Internet infrastructure and consortium size

This theme emerged from the following categories: (1) email facilitating communication in a large consortium, and (2) the size of the consortium and diverse partners as a challenge to communication and activity.

Category 1: Email facilitating communication in a large consortium

Technology plays an important role in building international collaboration, helping to bridge cultural diversity and differences in a collaboration such as ARCADE RSDH. Expectations of the collaboration can differ considerably, so good communication channels between the partners are important, and these are usually facilitated by technology. The most common and convenient method for communicating daily activities was email. Tools such as Skype and GoToMeeting ( www.gotomeeting.com ) were used to arrange online meetings. While reports gave partners an idea of ongoing activities, most interaction and sharing of work occurred at annual meetings:

“ This is a huge project so many people have joined and working in the same year, the best way to get more response is by email ” – Interviewee, small partner, Asia.

As the coordinating institute preferred to work via email, all partners found that the response from the coordinator was good.

“ We get loads of emails they are good at providing emails. They immediately reply. We come to know about all the activities ” – Interviewee, small partner, Asia.

Partners communicated in the consortium to different degrees – responses to the coordinator varied from partner to partner. Similarly, the hub institutes reported getting mixed responses from ‘smaller’ partners, as some replied promptly and others were more reluctant:

“ The main challenge is people don’t respond to emails so that’s the problem ” – Interviewee, hub institute, Asia.

Category 2: Size of the consortium and diverse partners as a challenge to communication and activity

The size of the consortium and the diversity of its partners were among the challenges in ARCADE RSDH. With 13 partners, one of the main challenges was keeping track of all ongoing activities:

“ Since it is a big workload for the coordinator to keep track of all the activities of the partners, sometimes some of the activities go under the radar ” – Interviewee, coordinating institute.

A lack of in-depth knowledge of ongoing activities was also seen as a barrier to activity by small partners; partners who were unaware of project activities lost interest in the project and became passive:

“ We need effective communication we have so many partners. For example, I don’t know the recent development of this project and activities done ” – Interviewee, small partner organisation.

Some staff contributed to the project part-time and were seen as overburdened; in some cases a limited budget meant they could devote only a few hours to the project. According to one interviewee, part-time involvement also affected project outputs, since a project’s success requires one’s undivided attention. An inability to motivate the partners was also cited as a shortcoming.

“ If we want ARCADE to happen really the interest should be solely focused on ARCADE activities. Motivation shall be as such that one really pays attention to this. Now the communication between the organisations is also must, the annual meeting is also a problem as not all are able to participate because of the time difference or the difficulty to reach the place. Having a conference on Skype is also difficult ” – Interviewee, big partner organisation.

The basic structure of the consortium, in which the hub institutes communicated with smaller partners and the coordinating institute, was seen as one of the challenges in communication. This structure sometimes resulted in uneven flows of information, as exchange occurred mostly between the coordinating institute and larger partners, and far less with small partners. Smaller partners were largely dependent on the hub institute for communication:

“ Another problem is the structure of ARCADE is based on hub working/communicating with small institutes in some way that has not been working and this is creating a problem ” – Interviewee, coordinating institute.

According to the participants, new ways should have been explored to engage all the partners in the project. To increase their involvement, partners should have been told about the benefits of engaging in grant-writing and grant-management workshops, which would later help them build their research capacity. The diversity of partners meant that there were several communication challenges to address, including a lack of communication infrastructure and time differences.

Lack of infrastructure as barrier to collaboration

Within the large consortium, a lack of infrastructure, including Internet facilities at ‘small’ partner organisations, affected the planned activities of the project. Even where Internet access was available, the bandwidth was often insufficient, yet sufficient bandwidth was one of the most important prerequisites for conducting online courses and other online activities in a project like ARCADE RSDH. This resulted in a weak collaboration, especially among the small partners and between small and big partners.

“ If you want to give good video or audio, bandwidth shall be good, but the institute is not able to give its entire bandwidth to this education ” – Interviewee, coordinating institute.

As might be expected, no technical issues were reported by the European partners.

Time difference

As many of the partners were based in different countries, time differences also hampered the collaboration and increased dependency on email. Arranging online meetings with all the partners was difficult, as any chosen time was inconvenient for one partner or another.

“ Setting up of some meeting, which would engage one person from Canada and other partners based in China and Vietnam, bringing up everyone in the same meeting is challenging ” – Interviewee, coordinating institute.

The small partners, who were less active, did not consider timing a major issue in the collaboration.

“ Until now, we have not arranged any such meetings so I don’t see any challenge, it’s just a management issue ” – Interviewee, small institute.

Theme 3: Outcomes of the collaboration: what was actually achieved?

This theme emerged from the following categories: (1) limited effects on research training capacity and (2) infrastructure development as an additional outcome.

Category 1: Limited effects on research training capacity

Participants felt that building research capacity in any institute is a gradual process that takes time, though some enhancement of research capacity was certainly observed.

“ Increase in research carrying capacity, specifically in SDH [social determinants of health] is not that significant, if judged by the no of proposals submitted for funding. But it will soon become better. I cannot score it high ” – Interviewee, small partner organisation.

Hub institutes felt that their capacity had increased, with mentoring activities developed and networks extended:

“ There are courses which are not developed in our institute but are developed by other partner organisations, so our students get the benefit of those courses. Other than that it’s an internal learning too, how to work in collaboration with other organisations ” – Interviewee, big partner organisation, European country.

According to participants, mapping the impact of research capacity-building activities is essential in an international collaborative project, and certain indicators needed to be developed for this purpose. ARCADE RSDH had some process indicators to assess ongoing activities, but these were not sufficient to map the gain in the research capacity of the partner institutes.

“ This is something we haven’t really captured with indicators as to how our project is progressing and this is very much qualitative and depends on the level of satisfaction, etc .” – Interviewee, coordinating organisation.

Category 2: Infrastructure development as an additional outcome

Some participants reported that they had developed infrastructure they had previously lacked, and had become increasingly aware of grant management and the implementation of online activities such as meetings and courses. An increase in human resource capacity was also observed in partner organisations. Dedicated platforms for online learning, such as Moodle ( www.moodle.org ), were adopted and developed by the hub institutes to conduct online courses. Many blended courses were conducted across the partner institutes, with good student enrolment.

“ Staff has increased so small partner organisation’s capacity has also grown. I think last time, when online activities were conducted, they were not that good as of now. Things are better technically [Internet connectivity]” – Interviewee, hub institute.

Discussion

This study explored the process of collaboration in an international collaborative project, ARCADE RSDH, which focused on building the research capacity of young postgraduate, doctoral and postdoctoral researchers. The project activities were carried out in a collaborative environment, with different collaborative partners and a unified goal [30]. The project partners were assigned different roles, ranging from providing resources to being end-users of materials developed within the project. The study highlighted how a collaboration begins and develops, and the challenges and solutions in communicating and achieving the set goals.

Collaborative projects are often seen as challenging, as the project managers who work together come from different organisations and backgrounds. Diversity in cultural and national backgrounds, skill sets and expertise, and managerial roles results in different expectations, which may be difficult to meet [30].

ARCADE RSDH had 12 partners with diverse cultural and organisational backgrounds, brought together mainly through previous working relationships. Human, linguistic or social ties often form as a result of historical interactions and support present-day collaborations [14, 18]. The partners involved in ARCADE knew each other through previous work relations, though not all were familiar with each other, and these working relationships were not necessarily long term.

Other studies have suggested that spatial proximity encourages collaboration [14]. ARCADE RSDH did not seem to be influenced by spatial proximity as much as by previous working ties. However, the geographic spread created challenges for collaboration, including time zones, organisational cultures and distance to meetings, despite active use of online resources. To counter the effects of distance, email was used as the main mode of communication. However, email inboxes are usually overloaded, and messages are often missed by project staff [14, 18], so email may not be the most effective mode of communication [14]. Project-related communication was enhanced by quarterly and annual meetings, but not all partners attended these. To overcome these communication challenges, a robust communication strategy should be planned at the beginning of the project, encouraging face-to-face interaction among all project partners and taking time and cultural differences into account [22, 29].

One of the barriers found in this project, with regard to both communication and achieving outcomes, was understanding each other’s infrastructures and administrative organisations. In this collaboration, some partners who initially committed to the project later became passive. This could have been because incentives did not align with the demand for achievements, recognition and rewards [22, 31]. A lack of clear understanding of other partners’ organisational structures and working styles was another reason for this passivity. Successful collaborations require an understanding of each other’s working environment from the planning stage of the partnership, giving clear insight into each other’s working methods. This alignment can be achieved by openly discussing and processing the differences at the beginning of the project, enhancing a focus-driven collaboration [32].

Paradoxically, given that collaborations aim to enhance research capacity, existing research capacity in an institute also attracted collaboration with international partners in our study. Research indicates that scientific research requires a ‘baseline’ level of scientific infrastructure for collaboration to be effective in capacity building [14]. Institutions below that critical level may therefore fail to attract further capacity building, an issue that deserves attention and further research.

One reason for “smaller” partners becoming passive could be the funding and management structure, which operated through hubs [33]. In another collaboration, African partners could not contact each other directly and were obliged to direct their communication through United Kingdom-based researchers [6, 34, 35]. This kind of structure can decrease communication among the partners and thus result in passivity. Attention should also be paid to project funding structures: ‘small’ partners with limited resources cannot fully commit resources, including staff, to projects. Paying attention to ‘smaller’ partners’ needs when planning activities helps to enhance their research capacity and ensures an equitable approach to capacity building [36].

Long and sustained collaboration is key to a project’s positive outcome, and sustained, systematic effort is particularly needed in collaborations aiming to build research capacity [8]. Short-term collaborations often lead to the loss of existing achievements, especially in LMICs, with capacities left unused and researchers migrating in search of other job opportunities [13]. Research projects receive funding for a short period, which allows a project to be created, but sustaining it when timelines are strict and further financing is limited is challenging [37]. ARCADE RSDH was a 4-year project funded by the European Union Seventh Framework Programme, and its activities came to an end after 4 years, though relationships with project partners had been developed. Our participants felt that collaboration in ARCADE RSDH had indeed helped to develop the research capacity of partner organisations in various ways, but that the activities initiated should have been carried further, as research capacity is built through many years of systematic effort rather than in a short time frame.

Limitations

This study explored international collaboration using a cross-sectional, qualitative design. As the study was conducted at a particular point in time, during the middle phase of the project, it cannot represent the working of the collaboration over the entire project. Following the implementation from beginning to end might have yielded more substantial results. This study did not assess the process indicators and outcomes of activities planned during the project; future evaluation of such indicators would give more insight into research collaboration in such a large consortium.

Conclusions

This study explored work in a large North–South collaborative project focused on building research capacity in partner institutes. Collaborative partners emphasised the need to set clear targets, establish communication strategies and align research interests at the start of a collaboration. Managing these challenges requires pre-planned strategies to develop proper communication channels among the partners, maintain transparency, and share rewards and benefits at all stages of the collaboration. North–South collaborations should also ensure that funding is equitable, so that all partners can participate actively. Collaborations such as ARCADE RSDH have the potential to build research capacity in all partners, but such activity needs sustained and systematic effort.

Abbreviations

ARCADE RSDH, Asian Regional Capacity Development for Research on Social Determinants of Health; LMIC, low- and middle-income countries

References

1. Department for International Development. DFID Research: Supporting research to strengthen health systems in low and middle-income countries. 2013. https://www.gov.uk/government/news/Dfid-research-supporting-research-to-strengthen-health-systems-in-low-and-middle-income-countries . Accessed 1 Jan 2015.

2. Mony PK, Kurpad A, Vaz M. Capacity building in collaborative research is essential. Br Med J. 2005;331:843–4.

3. Uthman OA, Wiysonge CS, Ota MO, et al. Increasing the value of health research in the WHO African Region beyond 2015 – reflecting on the past, celebrating the present and building the future: a bibliometric analysis. BMJ Open. 2015;5:e006340. doi:10.1136/bmjopen-2014-006340.

4. Njuguna F, Itegi F. Research in institutions of higher education in Africa: challenges and prospects. Eur Sci J. 2013;1:352–61.

5. Abdul G, Carel IJ, Fabio Z. Changing mindsets: research capacity strengthening in low- and middle-income countries. Geneva: COHRED, Global Forum for Health Research and UNICEF/UNDP/World Bank/WHO Special Programme for Research and Training in Tropical Diseases (TDR); 2008.

6. World Health Organization. Working together for health. World Health Report 2006. Geneva: WHO; 2006.

7. Amde WK, Sanders D, Lehmann U. Building capacity to develop an African teaching platform on health workforce development: a collaborative initiative of universities from four sub-Saharan countries. Hum Resour Health. 2014;12:31. doi:10.1186/1478-4491-12-31.

8. Trostle J, Simon J. Building applied health research capacity in less-developed countries: problems encountered by the ADDR project. Soc Sci Med. 1992;35(11):1379–87.

9. Nuyen Y. No development without research: a challenge for research capacity strengthening. Global Forum for Health Research. 2005. http://www.sdh-net.eu/data/uploads/publications-library/no-development-without-research.pdf . Accessed 31 Dec 2015.

10. Loukanova S, Prytherch H, Blank A, Duysburgh E, Williams J. Nesting doctoral students in collaborative North–South partnerships for health systems research. Glob Health Action. 2014;1(February):1–14.

11. Mayhew SH, Doherty J, Pitayarangsarit S. Developing health systems research capacities through north–south partnership: an evaluation of collaboration with South Africa and Thailand. Health Res Policy Syst. 2008;6:8.

12. Olaleye DO, Odaibo GN, Carney P, Agbaji O, Sagay AS, Muktar H, et al. Enhancement of health research capacity in Nigeria through north–south and in-country partnerships. Acad Med. 2014;89(8 Suppl):S93–7.

13. Commission on Health Research for Development. Health research. Oxford; 1990. http://www.nap.edu/openbook.php?record_id=5513&page=39 . Accessed 1 Mar 2015.

14. Wagner CS, Brahmakulama I, et al. Science and technology collaboration: building capacity in developing countries? Prepared for the World Bank. Santa Monica: RAND; 2001.

15. Barbara C, Parpart JL. Academic-community collaboration, gender research, and development: pitfalls and possibilities. Dev Pract. 2006;16:1.

16. Olenko X, Pager S, Holden L. A thematic analysis of the role of the organisation in building allied health research capacity: a senior managers’ perspective. BMC Health Serv Res. 2012;12:276.

17. Cole D, Boyd A, Aslanyan G, Bates I. Indicators for tracking programmes to strengthen health research capacity in lower and middle income countries: a qualitative synthesis. Health Res Policy Syst. 2014;12:13.

18. A call for action to strengthen health research capacity in low and middle income countries. IAMP InterAcademy Medical Panel. http://www.iamp-online.org/sites/iamp-online.org/files/IAMP%20Call%20for%20Action%20on%20RSC%20.pdf . Accessed 1 Jan 2015.

19. Asian Regional Capacity Development for Research on Social Determinants of Health. www.arcade-project.org . Accessed 1 Mar 2015.

20. Fawcett S, Schultz J, Watson-Thompson J. Building multisectoral partnerships for population health and health equity. Prev Chronic Dis. 2010;7(6):A118.

21. Augustino D, Arena M, Azzone G, Dal Molin M, Masella C. Developing a performance measurement system for public research centers. Int J Busin Sci Appl Manage. 2012;7(1):627–38.

22. Barrett A, Crossley M, Dachi H. International partnerships, collaboration and capacity building in educational research: the EdQual experience. Comp Educ. 2011;47(1):25–43.

23. Armistead C, Pettigrew P, Aves S. Exploring leadership in multi-sectoral partnerships. Leadership. 2007;3(2):211–30.

24. Farnman R, Diwan V, Atkins S. Successes and challenges of north–south partnerships – key lessons from the African/Asian Research Capacity Development projects. Glob Health Action. 2016.

25. Atkins S, Marsden S, Diwan V, Zwarenstein M. North–South collaboration and capacity development in global health research in low- and middle-income countries. The ARCADE projects. Glob Health Action. 2016.

26. Atkins S, Varshney D, Meragia E, Zwarenstein M, Diwan V. ‘Research clinics’: online journal clubs between south and north for student mentoring. Glob Health Action. 2016.

27. Strauss A, Corbin J. Basics of qualitative research. Newbury Park: Sage; 1990.

28. The Framework Approach to Qualitative Data Analysis. https://www.surrey.ac.uk/sociology/research/researchcentres/caqdas/files/Session%201%20Introduction%20to%20Framework.pdf . Accessed 1 Nov 2015.

29. Gale NK, et al. Using the framework method for the analysis of qualitative data in multi-disciplinary health research. BMC Med Res Methodol. 2013;13:11.

30. Vom Brocke J, Lippe S. Managing collaborative research projects: a synthesis of project management literature and directive for future research. Int J Proj Manag. 2015;33:1022–39.

31. Laura D, et al. Promoting sustainable research partnerships: a mixed-method evaluation of a United Kingdom–Africa capacity strengthening award scheme. Health Res Policy Syst. 2015;13:81. doi:10.1186/s12961-015-0071-2.

32. Roper L. Achieving successful academic-practitioner research collaborations. Dev Pract. 2002;12(3-4):338–45.

33. Pratt B, Loff B. Health research systems: promoting health equity or economic competitiveness? Bull World Health Organ. 2012;90:55–62. doi:10.2471/BLT.11.092007.

34. Colin A, et al. Exploring leadership in multi-sectoral partnerships. The Centre for Organisational Effectiveness, Bournemouth University. UK: GHA; 2012. p. 5.

35. Costello A, Zumla A. Moving to research partnerships in developing countries. BMJ. 2000;321(7264):827–9.

36. He Z-L, Geng X-S. Research collaboration and research output: a longitudinal study of 65 biomedical scientists in a New Zealand university. Res Policy. 2009;38(2):306–17.

37. Niclas A, et al. The challenge of managing boundary-spanning research activities: experience from the Swedish context. Res Policy. 2009;38:1136–49.


Acknowledgements

We acknowledge all our respondents for their participation in this study.

Funding

The research received funding from the European Union’s Seventh Framework Programme (FP7/2007-2013) under grant agreement number 281930, ARCADE RSDH. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Availability of data and materials

Due to ethical and legal restrictions, all inquiries should be made to The Chairman, Ethics Committee, RD Gardi Medical College, Agar Road, Ujjain, India 456006 (Emails: [email protected], [email protected]), giving all details of the publication. Upon verification of genuineness of the inquiry, the data will be made available. For reference, please quote ethical permission no. 414 dated 20/5/2014.

Authors’ contributions

DV, SA, AD and VD conceived and designed the study; DV collected and analyzed the data and wrote the first draft. SA and VD contributed in further analysis and refining of the results and provided intellectual content in the writing of the manuscript. All authors approved the final version.

Competing interests

The authors declare that they have no competing interests.

Ethics approval and consent to participate

The study was approved by ethics committee of RD Gardi Medical College, Ujjain (no 414 dated 20/5/2014). Informed consent was obtained from all the participants for the interview and recording and the information collected from all the respondents was kept confidential and anonymous.

Author information

Authors and affiliations

Department of Public Health and Environment, RD Gardi Medical College, Ujjain, India

Dinansha Varshney & Vishal Diwan

Department of Public Health Sciences, Karolinska Institutet, Stockholm, Sweden

Salla Atkins & Vishal Diwan

Institute of Health Management Research University, Jaipur, India

Arindam Das

International Centre for Health Research, RD Gardi Medical College, Ujjain, India

Vishal Diwan


Corresponding author

Correspondence to Vishal Diwan .

Additional file

Additional file 1.

Topic Guide – Understanding collaboration in a multi-national research capacity-building partnership. (DOCX 18 kb)

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License ( http://creativecommons.org/licenses/by/4.0/ ), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated.


About this article

Cite this article.

Varshney, D., Atkins, S., Das, A. et al. Understanding collaboration in a multi-national research capacity-building partnership: a qualitative study. Health Res Policy Sys 14 , 64 (2016). https://doi.org/10.1186/s12961-016-0132-1


Received : 30 November 2015

Accepted : 12 July 2016

Published : 18 August 2016

DOI : https://doi.org/10.1186/s12961-016-0132-1


Keywords

  • International collaboration
  • Research capacity building
  • Social determinants of health

Health Research Policy and Systems

ISSN: 1478-4505


Measuring research capacity development in healthcare workers: a systematic review

BMJ Open, Volume 11, Issue 7

  • http://orcid.org/0000-0002-8765-7384 Davide Bilardi 1 , 2 ,
  • http://orcid.org/0000-0001-8818-8148 Elizabeth Rapa 3 ,
  • http://orcid.org/0000-0001-7628-8408 Sarah Bernays 4 , 5 ,
  • http://orcid.org/0000-0003-2273-5975 Trudie Lang 1
  • 1 Nuffield Department of Medicine , University of Oxford Centre for Tropical Medicine and Global Health , Oxford , UK
  • 2 Fondazione Penta Onlus , Padova , Italy
  • 3 Department of Psychiatry , University of Oxford , Oxford , UK
  • 4 School of Public Health , University of Sydney–Sydney Medical School Nepean , Sydney , New South Wales , Australia
  • 5 Public Health and Policy , London School of Hygiene & Tropical Medicine , London , UK
  • Correspondence to Dr Davide Bilardi; davide.bilardi@gtc.ox.ac.uk

Objectives A key barrier in supporting health research capacity development (HRCD) is the lack of empirical measurement of competencies to assess skills and identify gaps in research activities. An effective tool to measure HRCD in healthcare workers would help inform teams to undertake more locally led research. The objective of this systematic review is to identify tools measuring healthcare workers’ individual capacities to conduct research.

Design Systematic review and narrative synthesis, using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) checklist for reporting systematic reviews and narrative synthesis, and the Critical Appraisals Skills Programme (CASP) checklist for qualitative studies.

Data sources 11 databases were searched from inception to 16 January 2020. The first 10 pages of Google Scholar results were also screened.

Eligibility criteria We included papers describing the use of tools to measure/assess HRCD at an individual level among healthcare workers involved in research. Qualitative, mixed and quantitative methods were all eligible. The search was limited to English-language publications.

Data extraction and synthesis Two authors independently screened and reviewed studies using Covidence software, and performed quality assessments using an extraction log validated against the CASP qualitative checklist. Content analysis was used to construct a narrative synthesis.

Results The titles and abstracts for 7474 unique records were screened and the full texts of 178 references were reviewed. 16 papers were selected: 7 quantitative studies; 1 qualitative study; 5 mixed methods studies; and 3 studies describing the creation of a tool. Tools with different levels of accuracy in measuring HRCD in healthcare workers at the individual level were described. The Research Capacity and Culture tool and the ‘Research Spider’ tool were the most frequently reported. Other tools designed for ad hoc interventions with good generalisability potential were identified. Three papers described health research core competency frameworks. All tools measured HRCD in healthcare workers at an individual level, with the majority adding a measurement at the team/organisational level, or data about perceived barriers and motivators for conducting health research.

Conclusions Capacity building is commonly identified with pre/postintervention evaluations without using a specific tool. This shows the need for a clear distinction between measuring the outcomes of training activities in a team/organisation, and effective actions promoting HRCD. This review highlights the lack of globally applicable comprehensive tools to provide comparable, standardised and consistent measurements of research competencies.

PROSPERO registration number CRD42019122310.

  • organisational development
  • organisation of health services
  • medical education & training
  • public health

Data availability statement

Data are available upon reasonable request. All data relevant to the study are included in the article. The complete data set generated by the systematic review and included in the extraction log is available upon request.

This is an open access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited, appropriate credit is given, any changes made indicated, and the use is non-commercial. See:  http://creativecommons.org/licenses/by-nc/4.0/ .

https://doi.org/10.1136/bmjopen-2020-046796


Strengths and limitations of this study

  • Thoroughly conducted systematic review collecting data from all major existing databases and grey literature.
  • Topic not previously addressed in other reviews searching for tools to measure health research capacity building at the individual level.
  • Brief overview of the identified tools to measure health research capacity building at the individual level, highlighting their strengths and weaknesses.
  • Complex identification of relevant studies due to the lack of clarity on a common definition and terminology for health research capacity development.
  • None of the studies used the standard reporting procedures for qualitative or quantitative research.

Introduction

In 2004, the Global Forum for Health Research highlighted the challenge for low and middle-income countries of building the capacity to perform effective, locally led health research that addresses the major health problems affecting their own populations. 1–3 Twenty years later, low and middle-income countries still carry 90% of the global disease burden, yet only 10% of global funding for health research is devoted to addressing these persistent health challenges. 4 Health research capacity development (HRCD) for healthcare workers has been recognised as a critical element in overcoming global health challenges, especially in low and middle-income countries. 5 For too long, HRCD in low and middle-income countries has been delivered through training programmes that enable local teams to participate in externally sponsored trials, creating a false appearance of growth and generating dependence on foreign support. 6 7

The process of progressive empowerment is usually referred to as capacity development. 8 This term has been used in multiple areas and applied in different sectors to develop new or existing competencies, skills and strategies at a macro or individual level. 9 In the field of health, research capacity development should support healthcare workers in generating local evidence-based results to inform policy and improve population health. The three health-related Millennium Development Goals, and more recently the targets ‘B’ and ‘C’ of the Sustainable Development Goals, all support the adoption of new strategies to strengthen the capacity of healthcare workers in all countries in performing their job and engaging in research. 10–12 One of the critical barriers in supporting HRCD is the lack of empirical measurement of competencies in relation to the performance of research activities. Existing frameworks and tools have been developed for a particular purpose in a particular context. 13 14 Others have identified barriers that healthcare workers encounter in engaging in research or have monitored and evaluated targeted training activities. 15 This systematic review aims to identify tools to measure individual healthcare workers’ capacities to conduct research.

Methods

The Preferred Reporting Items for Systematic Reviews and Meta-Analyses checklist 16 for reporting systematic reviews and narrative syntheses, and the Critical Appraisal Skills Programme (CASP) checklist 17 for the critical appraisal of qualitative studies, were used to design this systematic review and to refine the extraction log according to recognised guidelines.

Inclusion and exclusion criteria

The aim of the systematic review was to identify existing tools that measure healthcare workers' individual capacities to conduct research. The inclusion and exclusion criteria were defined in advance and documented using an adapted version of a SPIDER table ( table 1 ). The primary population of interest was all health-related professionals or healthcare workers involved in research activities. Studies of healthcare workers delivering health services in which research was not the focus were excluded, as was occupational health research. Studies about volunteers, defined as people offering their services to support health activities without specific training as health professionals, were also excluded. Initially, only healthcare workers in low and middle-income countries were included, but this limitation was removed in order to identify any tool measuring HRCD in any setting. The Phenomenon of Interest was defined as assessing HRCD, or identifying tools, frameworks and templates designed to assess HRCD. A comprehensive range of terms, including synonyms for 'assess', 'tool' and 'development', was used. Studies that mentioned components which could be considered to assess, measure or 'give evidence to' research capacity development, but did not present them in any capacity development context, were excluded. In addition, since the concept of capacity development is applied in many settings, studies in areas unrelated to health, such as 'air pollution', 'financial capacity' or 'tobacco', were also excluded. The study design criteria were broad, encompassing qualitative, quantitative and mixed methods papers. The further eligibility criteria included in the SPIDER table relate to the quality of the study (Evaluation) and the Research type.


Table 1 SPIDER diagram—inclusion and exclusion criteria

Information sources and search strategy

Eleven databases were searched from inception to 16 January 2020: Ovid MEDLINE; Ovid Embase; Ovid PsycINFO; Ovid Global Health; EBSCO CINAHL; ProQuest Applied Social Sciences Index & Abstracts (ASSIA); ProQuest Sociological Abstracts; ProQuest Dissertations & Theses Global; Scopus; Web of Science Core Collection; and the WHO Global Index Medicus Regional Libraries. The first 10 pages of results from Google Scholar were also screened. The search strategies used free text terms and combinations of the relevant thesaurus terms, limited to English language publications only, to combine terms for capacity building, measuring and health research. The ‘NOT’ command was used to exclude papers about students, postgraduate students, tobacco, air pollution and a variety of other concepts to minimise the number of irrelevant results (see box 1 for a full set of search strategies).

Search strategy

Database: Ovid MEDLINE (Epub Ahead of Print, In-Process & Other Non-Indexed Citations, Ovid MEDLINE Daily and Ovid MEDLINE) 1946 to present.

1. Capacity Building/ (1965)
2. (capacit* adj2 build*).ti,ab. (5789)
3. (capacit* adj2 develop*).ti,ab. (3591)
4. (capacit* adj2 strengthen*).ti,ab. (924)
5. (competenc* adj2 improv*).ti,ab. (1460)
6. ((professional* adj2 develop*) and (competenc* or capacit*)).ti,ab. (1747)
7. 1 or 2 or 3 or 4 or 5 or 6 (13649)
8. Mentoring/ (820)
9. mentor*.ti,ab. (13369)
10. (assess* or measur* or evaluat* or analys* or tool* or equip*).ti,ab. (9653076)
11. “giv* evidence”.ti,ab. (3814)
12. framework*.ti,ab. (231138)
13. 8 or 9 or 10 or 11 or 12 (9763562)
14. Research/ (196782)
15. clinical.ti,ab. (3158817)
16. (health* and research*).ti,ab. (337604)
17. 14 or 15 or 16 (3588891)
18. 7 and 13 and 17 (3433)
20. limit 19 to English language (3346)
21. (student* or graduate or graduates or postgraduate* or “post graduate*” or volunteer* or communit* or tobacco or “climate change” or “air pollution” or occupational or “financial capacity” or informatics or “IT system” or “information system” or transport or “cultural competenc*” or disabili* or trauma).ti,ab. (1828113)
22. 20 not 21 (1673)

Google Scholar—screen the first 10 pages of results

Sorted by relevance:

(“capacit* build*”|“build* capacit*”|“capacit* develop*”|“develop* capacit*”|“capacit* strengthen*”|“strengthen* capacit*”|“professional* develop*”|“completenc* improv*”|“improv* competenc*”)(“health* research*”|clinical) https://scholar.google.co.uk/scholar?q= (%22capacit*+build*%22%7C%22build*+capacit*%22%7C%22capacit*+develop*%22%7C%22develop*+capacit*%22%7C%22capacit*+strengthen*%22%7C%22strengthen*+capacit*%22%7C%22professional*+develop*%22%7C%22completenc*+improv*%22%7C%22improv*+competenc*%22)(%22health*+research*%22%7Cclinical)&hl=en&as_sdt=0,5
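The 'NOT' step of the strategy amounts to a keyword exclusion filter over titles and abstracts. As a rough sketch of the idea (a handful of the exclusion terms from line 21 of the strategy; real Ovid matching operates on indexed title/abstract fields with `*` truncation, approximated here by a word-prefix regex):

```python
import re

# A few of the exclusion terms from the Ovid strategy; Ovid's '*'
# truncation is approximated with a word-boundary prefix match.
EXCLUDE_TERMS = ["student", "tobacco", "air pollution", "occupational"]

def keep_record(title_abstract: str) -> bool:
    """Return True if the record survives the 'NOT' exclusion step."""
    text = title_abstract.lower()
    return not any(re.search(r"\b" + re.escape(term), text)
                   for term in EXCLUDE_TERMS)
```

For example, a record titled "Capacity building in tobacco control research" would be dropped at this step, while one on research capacity building in clinical nurses would be retained.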

Study selection

Two researchers, DB and ER, independently screened and reviewed studies using the Covidence systematic review software. 18 In case of disagreement, DB and ER discussed the abstracts in question. After consensus on inclusion was reached, the full texts of all included studies were rechecked for inclusion by DB and confirmed by ER.

Study analysis procedure

Data from the selected papers were extracted, and quality assessments performed, using an extraction log created and validated against the CASP checklist 17 for the critical appraisal of qualitative studies. The macro areas of interest in the log were general information on the paper, such as author and title, its main focus and its study design. The source of funding, conflicts of interest and ethics approval were also recorded. A separate section of the extraction log recorded the characteristics of the tool used or described in each selected paper ( figure 1 ). The extraction log also included specific sections on the study design, the methodology and the main findings of each paper. A dedicated section collected data on the quality of each study, analysing selection biases through a critical appraisal derived from the CASP checklist. Where a definition of capacity development was given, it was recorded. Some of these sections are not shown in figure 1 , which focuses on the description of the identified tool. A content analysis approach was used to develop the narrative presented in the Discussion section.


Figure 1 Extraction log.

Patient and public involvement

Patients and/or the public were not involved in the design, conduct, reporting or dissemination plans of this research.

Database search and results screening

In December 2018, the first round of the search was performed in 11 databases and in Google Scholar using the search strategy described in box 1 . A total of 13 264 records were found. After 6905 duplicates were removed, 6359 unique records remained for title and abstract screening ( table 2 ), which was performed throughout 2019. In January 2020, an additional search for papers published or indexed in 2019 was performed using the same search strategy; this returned 15 775 records, of which 1118 remained after removal of duplicates. These were added to the 6359 papers identified in the first search. In total, 7474 unique papers were included in title and abstract screening (three further duplicate records were removed in the Covidence software).
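The record counts above fit together arithmetically; a minimal check (all figures taken from the text):

```python
# Record counts from the two search rounds, as reported in the text.
first_round_found = 13_264                       # December 2018 search
first_round_unique = first_round_found - 6_905   # duplicates removed
assert first_round_unique == 6_359

second_round_unique = 1_118   # deduplicated 2019 records, January 2020 rerun

# Three further duplicates were flagged when the sets were merged in Covidence.
total_screened = first_round_unique + second_round_unique - 3
assert total_screened == 7_474
```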

Search results

The 7474 unique studies identified were uploaded to the Covidence systematic review software. Two researchers, DB and ER, independently screened the studies, including or excluding them according to the criteria in the SPIDER table ( table 1 ). A total of 7280 studies were considered irrelevant. The full-text papers for the remaining 178 references were reviewed. Reasons for exclusion were identified by streamlining the SPIDER table criteria into three main criteria: wrong setting, irrelevant study design and wrong focus of the study; a reason was assigned to each excluded paper. All 178 studies described some form of activity to measure the competencies related to performing health research. Thirty were excluded because they were literature reviews on a different aspect of health research, or because they described a general perspective on health capacity development without offering any specific measurement or reference to research. A further 42 studies were excluded because of the wrong setting, since competencies were measured at the level of research institutions or within a specific network. An additional 90 studies were excluded because the study design did not match the inclusion criteria: 38 described the use of a measurement tool tailored to the context (eg, a specific profession, intervention or setting) rather than the individual level; 34 did not mention a specific tool to measure HRCD; and the remaining 18 reported the use of an evaluation tool that was an ad hoc pre/postintervention questionnaire with little potential for application beyond the context described in the paper. A total of 162 studies were therefore excluded, leaving 16 studies for this review ( figure 2 ).
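A short tally confirms the full-text exclusion figures (numbers taken from the paragraph above; the grouping labels are paraphrases for readability):

```python
full_texts_reviewed = 178

# Full-text exclusions by streamlined reason, as reported in the text.
excluded = {
    "reviews / no specific measurement": 30,
    "wrong setting (institution or network level)": 42,
    "tool tailored to context, not individual level": 38,
    "no specific tool mentioned": 34,
    "ad hoc pre/postintervention questionnaire": 18,
}
assert sum(excluded.values()) == 162

included_studies = full_texts_reviewed - sum(excluded.values())
assert included_studies == 16
```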

Figure 2 Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) screening diagram.

Analysis of the findings across the selected papers

A total of 16 studies met the inclusion criteria set for this systematic review. 19–34 The 16 articles were analysed using the extraction log created and validated against the CASP qualitative checklist.

The results are summarised in table 2 . None of the papers were published before 2006 and only nine were published after 2014. 20 21 23–26 31 33 34 The majority (n=13) applied a tool in high-income settings. 19 20 22–24 26–32 34 Seven papers described the use of tools in Australia, 20 22 24 26 28 29 34 three in low and middle-income countries (one in Ghana, Kenya, Malawi and Sri Lanka, 25 one in the Pacific Islands 21 and one in the Philippines 33 ), one in Europe (Norway), 19 one in the USA, 32 and one measured HRCD in a group linked to a specific intervention located in multiple areas of the world. Three papers described the creation of a tool without applying it to any specific context; 23 30 31 all three were designed by research groups in high-income countries (one in the USA and two in the UK).

The selected studies applied quantitative, qualitative or mixed methods analyses. The most common approach (n=7) was to generate quantitative data using an HRCD tool. 20 24 26–28 33 34 One-third of the studies (n=5) used a mixed methods approach 19 21 22 25 29 ; quantitative tools were combined with semistructured interviews, or in some cases qualitative questions were added to the questionnaire. The three studies describing the creation of a tool were not analysed under this methodological category.

Of the 16 selected studies, three used the term ‘capacity development’, 23 25 30 and two included a definition of the concept. 25 30 Seven papers used ‘capacity building’, 20–22 24 26 28 31 of which four also included a definition. 20 22 24 28 In two papers, the capacity building definition was associated with the definition of ‘research culture’. 20 22 Two additional papers used alternative generic terms like ‘research capacity’ 33 or ‘research self-efficacy’. 32 Four papers did not refer to any specific term and therefore no definition was given. 19 27 29 34

Five of the 16 selected papers openly declared no conflict of interests. 22–24 28 31 Eight stated the source of funding used to carry out the activities described. 19 21 23 27–31 The number of participants in the studies varied from 28 enrolled participants for a qualitative study 21 to 3500 users of an online measurement tool. 27

Analysis of the tools from the selected papers

The tools described or used in the 16 selected papers varied in nature, length and applicability. In general, even where there were similarities, each paper described a different perspective on the use of a tool. Four papers applied a questionnaire-type tool to assess research competencies and skills. 19 21 25 33 These questionnaires ranged in length from 19 questions 21 to 59 questions, 19 with the addition of open-ended qualitative questions in two studies 19 21 and a structured interview in another. 25

Three studies 22 24 34 used, with a range of adaptations, the Research Capacity and Culture tool, and one study 20 revised it into a Research Capacity and Context tool, referencing the Research Capacity and Culture tool as its primary source. Another recurring instrument was the 'Research Spider' tool. 28 29 Again, the original tool had been adapted to the context, and in one case 29 it was used as a base for qualitative research on HRCD. Two further papers described tools designed ad hoc to measure the impact of an intervention (CareerTrac 27 and Cross-Sectional Electronic Survey 26 ). These two papers were not excluded under the pre/postintervention criterion because the action was broader, at programme level, and the tool used to measure HRCD was the main focus of the paper. Another paper described a tool for a specific category of healthcare workers (Nursing Research Self-Efficacy Scale—NURSES). 32 Three papers 23 30 31 focused on the creation of a new tool and described the process of identifying the set of competencies required to run health research. Two of them defined the outcome as a 'core competency framework'; 23 31 the third defined the outcome of the analysis as a 'set of indicators'. 30

In terms of the target population, the identified tools aimed to measure HRCD across a range of healthcare professions. One-third of the papers (n=5) focused on measuring HRCD in allied health professionals (AHPs). 20 22 24 26 34 Nurses were the main focus in two other studies, 19 32 and four studies applied a tool to a range of health professions (from laboratory scientists to data managers). 21 25 28 29 Two other papers focused on groups linked to a specific intervention. 27 33 All 16 papers included, alongside healthcare workers, representatives of technical professions in health such as managers, directors, faculty members and consumer organisation representatives. The three papers describing the creation of a new tool suggest that their tools would be applicable to all research roles. 23 30 31

As per the inclusion criteria, the main level of measurement of the tools was the individual level. Seven papers measured HRCD at the individual level only. 19 23 28 29 31–33 Three papers supplemented the individual-level measurement with information on perceived barriers to performing health research 21 26 29 ; of these, two also explored what motivates healthcare workers to become involved in health research. 26 29 The five studies that used the Research Capacity and Culture tool and its variants measured HRCD at the individual, team and organisational levels. 20 22 24 25 34 One paper described the creation of a tool designed for use at the organisational level that nevertheless embedded a measurement of HRCD at the individual level. 30

The most common way a selected tool was validated was by referencing the main paper that described the tool and its validation process (n=6). 23 28 29 32–34 This was the case for some of the ad hoc questionnaires, 23 33 34 for the 'Research Spider' tool 28 29 and for the NURSES tool. 32 Papers that described an original process, or used modified versions of an original tool, validated the tool through a contextual validation process described in the paper. 21 22 24 25 31 These processes included consultation of a panel of experts 22 24 31 or an iterative validation exercise. 21 25 One paper stated that the tool used was validated without referencing the process or the tool. 20

Overall, only two papers 23 31 focused specifically on tools to measure HRCD at a wider level, without linking the measurement to a specific group or geographical area, as was done in the majority of papers. 19 24 25 28 29 33 In four cases, the tools described were adapted to identify determinants of, or barriers to, HRCD in a defined setting 20 30 34 or to promote HRCD in relation to a specific disease or research topic. 21 In other cases, the papers focused on a tool aiming to assess the impact of specific interventions or programmes on HRCD. 26 27

Summary of evidence

This systematic review aimed to identify tools that measure healthcare workers' individual capacities to conduct research; the 16 included articles 19–34 demonstrated that tools to measure HRCD in healthcare workers are available, even if they are limited in number. In most cases, the identified tools did not originate from the need to measure and foster HRCD as a necessary strategy to promote research capacity. There is, therefore, a need to design more comprehensive tools that are globally applicable and able to provide comparable, standardised and consistent measurements of research competencies.

The importance of measuring HRCD has only been recognised recently. 15 As the date of publication of the identified papers shows, the appreciation of the contribution that health research can offer in capacity development at a personal level only began in the first decade of this new millennium. Almost half of the selected papers (n=7) refer to studies whose data have been collected after 2014. 20 21 24 26 31 33 34 Of note is the high number of new publications which were retrieved from the academic databases (1118 papers) when the search strategy was rerun in 2020.

Questionnaires were the most commonly used method for assessing research skills and competencies. Almost two-thirds of the papers (n=10) 19 20 22 24 26 28 29 32–34 measured different research skills at a personal level using a 5-point Likert scale (n=6) 19 26 28 29 32 33 or a 10-point scale (n=4). 20 22 24 34 This choice highlights the need for a validated quantitative tool, based on a set of competency-related questions, that can bring standardisation, comparability and consistency across different roles and contexts. However, the extensive use of mixed methods, combining quantitative questionnaires with qualitative instruments, reflects the fact that HRCD depends on a complex series of components that need to be identified both qualitatively and quantitatively.

By not limiting the selection of articles to tools used in low and middle-income countries, this review has revealed that most of the identified tools were used in high-income settings. It is important to note that excluding pre/postintervention assessments significantly reduced the inclusion of studies performed in low and middle-income countries. This finding highlights that although health systems in low and middle-income countries may benefit from providing evidence for HRCD, 5 they are rarely the focus of the HRCD literature. Most measurements of HRCD in lower income settings appear, in fact, to be narrowly linked to assessing the effectiveness of training offered for a specific study, or limited to a particular disease. Even when the perspective is broader than a particular study, it is mostly limited to the evaluation and sustainability of training programmes, and not linked to a plan of career progression and research competency acquisition. More attention should therefore be given to creating tools able to measure, support and promote long-lasting research capabilities from the perspective of professional growth for healthcare workers.

Three essential findings of this systematic review support a change in the perception of HRCD and the tools needed to measure it. First, many of the excluded papers (42 out of 162 excluded papers from the last round of analysis) focused exclusively on the institutional level of measuring research capacity. This is mostly because training interventions are designed to prepare a team to run a study and rarely to promote individual HRCD. 1 35 36 In some cases, the measurement via a tool is also an exercise to demonstrate the investment in training activities for reporting purposes. 37 38 It is therefore important to start promoting a more effective research culture which is independent of specific diseases or roles. This progression could be achieved by championing systems which measure the changes in research capacities at a team and personal level using a globally applicable tool. Most of the tools excluded were evaluation tools designed for, or used in, a specific setting and thus not suitable for a comparable, standardised and consistent analysis of long-term research competency acquisition strategies.

Second, the papers that focused on measuring HRCD at the individual level confirmed that research is seen as an opportunity to learn cross-cutting skills needed in healthcare. A defined set of standardised competencies required to conduct research could be used to measure an individual's, a team's and an organisation's abilities. This was the focus of two papers 23 31 that identified a framework of core competencies. Most of the tools (n=7) were designed to be applied to a wide variety of health professions. 21 23 25 28–31 HRCD can be accessed at different entry points depending on the specific job title, but the set of skills acquired is common and shared among the research team. 1 The approach to assessing these inter-related competencies should therefore be global, not role or disease based. 39 Measurement at the individual level is essential to promote a consistent and coherent career progression for each person and role. 40 However, the overall capability to run research programmes should be measured at the team level, where all roles and competencies complement each other and skills are made visible and measurable as a whole against an overall competency framework. The individual and institutional/team levels are therefore two aspects of HRCD that grow together, supported by a common comparable, standardised and consistent tool.

Third, the lack of a standard definition of HRCD can lead to post-training evaluations being categorised as HRCD activities. Although pre/post-training evaluations are important, it would be helpful to define what a 'structured action' to promote HRCD is. As previously mentioned, the term 'capacity development' is not universally used; many synonyms exist, such as 'research capacity' and 'capacity strengthening', creating the possibility of different interpretations. Furthermore, inconsistent terminology was found in describing very similar activities in support of HRCD (eg, workshop, training, course). Steinert et al 41 suggest that there should be a standard definition in the context of educational capacity development. Such a definition, alongside a common taxonomy to describe health professions, would support the identification of HRCD as a defined process with specific characteristics, rather than as a general effort at research training.

The most common tool identified in this review was the Research Capacity and Culture tool. 20 22 24 34 It consists of 52 questions examining participants' self-reported success or skill in a range of areas related to research capacity or culture across three domains: the organisation (18 questions), the team (19 questions) and the individual (15 questions). The tool also includes questions on perceived barriers and motivators for undertaking research. Respondents are asked to rate a series of statements relevant to these three domains on a scale of 1–10, with 1 the lowest and 10 the highest possible skill or success level. It represents a good example of a comprehensive tool. As the review findings confirm, a potential limitation is that it has been applied mainly in an Australian context and almost exclusively to measure HRCD in AHPs, 22 24 34 so its generalisability remains to be confirmed. Nevertheless, the Research Capacity and Culture tool is a strong example of how a tool refined around a context and a specific health profession can be an incentive to measure HRCD.
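The tool's structure, as described above, can be pictured as three banks of 1–10 items. The sketch below is illustrative only: the dictionary layout, the `summarise` helper and the use of medians are this sketch's own assumptions, not the published instrument; only the domain names, item counts and rating scale come from the text.

```python
from statistics import median

# Domains and item counts of the Research Capacity and Culture tool,
# as described in the text; each item is rated on a 1-10 scale.
RCC_DOMAINS = {"organisation": 18, "team": 19, "individual": 15}

def summarise(responses: dict) -> dict:
    """Median 1-10 rating per domain for one respondent (illustrative).

    `responses` maps a domain name to that respondent's list of ratings.
    """
    for domain, ratings in responses.items():
        if len(ratings) != RCC_DOMAINS[domain]:
            raise ValueError(f"expected {RCC_DOMAINS[domain]} items for {domain}")
        if not all(1 <= r <= 10 for r in ratings):
            raise ValueError("ratings are on a 1-10 scale")
    return {d: median(r) for d, r in responses.items()}

# Total items across the three domains matches the 52 questions described.
assert sum(RCC_DOMAINS.values()) == 52
```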

Another tool highlighted by this review was the 'Research Spider'. 28 29 42 This tool collects information on individual research experience and interest in research skill development in 10 core areas, including 'writing a research protocol', 'using quantitative research methods', 'publishing research', 'finding relevant literature' and 'applying for research funding'. In each area, the level of experience is measured on a 5-point Likert scale, from 1 (no experience) to 5 (high experience). The primary aim of the 'Research Spider' is to be a flexible tool. This flexibility is confirmed by the two studies 28 29 that used it: one 28 as the main measurement, the other 29 as a quantitative base for qualitative semistructured interviews. The advantage of this tool is that it provides a visual overview of personal research competencies. However, although the limited number of measurement areas (n=10) makes it a good initial evaluation instrument, it does not specify the subskills within each area.
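A 'Research Spider' profile is essentially a mapping from the ten core areas to a 1–5 experience rating. In the sketch below, the five areas are those quoted in the text (the full instrument covers ten), the example ratings are invented, and the gap-finding helper is this sketch's own addition, not part of the tool.

```python
# Five of the ten 'Research Spider' core areas quoted in the text;
# experience is self-rated from 1 (no experience) to 5 (high experience).
profile = {
    "writing a research protocol": 2,
    "using quantitative research methods": 4,
    "publishing research": 1,
    "finding relevant literature": 5,
    "applying for research funding": 1,
}

def development_priorities(profile: dict, threshold: int = 2) -> list:
    """Illustrative helper: areas rated at or below `threshold`,
    i.e. candidates for targeted skills development."""
    return sorted(area for area, score in profile.items() if score <= threshold)
```

Plotted on a radar chart, such a profile gives the visual overview of personal research competencies that the text describes.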

Particular mention should be made of the two papers that described the creation of a comprehensive research core competency framework. 23 31 Although no specific tool is described, and the competency scores are visualised using a spider diagram, these studies present the most accurate overview of the skills required to run health-related research programmes. As mentioned above, a tool applying a scoring system to the list of competencies identified by these frameworks has the potential to be widely applicable and reliable. This wide applicability, and the absence of explicit biases in measuring improvement in research skills, can foster a more robust approach to research in health. Measuring HRCD independently of specific interventions would maximise the benefit of research at every level: at a personal level, it would clarify a potential career progression path and highlight possible gaps; at the team level, it would support a multidisciplinary approach to health challenges; and at an institutional level, it would make the know-how generated by the international scientific community accessible to a broader group of local health workers. Overall, health practice at a global scale would benefit from the incentive to get involved in research that derives from measuring its impact on improving competencies. Positive outcomes of measuring HRCD could thus raise the priority given to the universal transferability and applicability of research methodology and results in the design of health research projects.

Limitations of the systematic review

Methodological limitations are recognised for this systematic review. First, the lack of clarity on a common definition and terminology for HRCD complicated the search strategy. A lengthy iterative process was necessary when developing the database search strategies to try to include all the possible variants used to define 'tool', 'capacities' and 'development'; despite this effort, some studies may have been missed. Second, none of the included studies referenced a standard reporting procedure, despite the availability of standards for reporting qualitative and quantitative research 43–45 as well as mixed methods research. 46 Third, while this review has attempted to be as comprehensive as possible, some sources might not have been detected, given the challenge of finding all the relevant grey literature and the restriction to English language sources. Finally, it was not possible to analyse the psychometric properties of each identified tool due to inconsistent reporting. Other limitations typical of reviews may also apply.

Conclusions

Sixteen studies using or describing tools to measure HRCD were identified and analysed in this systematic review. 19–34 Identifying capacity development with pre/postintervention evaluations, or evaluating capacity development generically without using a tool, was common. A clear distinction is needed between simply measuring the outcomes of training activities for healthcare workers and taking effective action to promote their HRCD.

The most frequently described tools were the Research Capacity and Culture tool 20 22 24 34 and the ‘Research Spider’ tool. 28 29 A variety of other tools, mostly questionnaire based, were identified; in most cases they may be applicable more broadly than the specific context described in the paper. Two frameworks systematising research core competencies were identified, 23 31 and tools derived from these frameworks could have significant potential. The applicability of each tool depends on the context and on the level of accuracy needed. Such tools could be routinely incorporated into standard personal development reviews to consistently support capacity development in research studies and organisations.

Future directions for HRCD include the design of a standardised, comparable and consistent tool that measures individual HRCD independently of training evaluation and instead supports a long-term strategy for acquiring research competencies. In addition, harmonising the definitions and terminology used to identify HRCD actions and processes would facilitate the standardisation and comparability of HRCD strategies.

Ethics statements

Patient consent for publication.

Not required.

Acknowledgments

The authors sincerely thank Elinor Harriss, Librarian of the Bodleian Health Care Libraries, for her immense and competent support; Rebekah Burrow, who advised on the steps and tools needed to perform the present systematic review; and Filippo Bianchi, the first DPhil colleague, who provided foundational knowledge on systematic reviews.


Contributors DB and ER designed and conducted the systematic review. DB wrote the draft of the systematic review and revised it according to the commentaries of ER, SB and TL. DB provided the final version of the manuscript. ER critically reviewed the manuscript and substantially contributed to the final version of the manuscript. SB critically reviewed both the design of the systematic review and the manuscript, and was involved in the development of meaningful inclusion criteria. TL critically reviewed the design of the study, made important suggestions for improvement, critically reviewed the manuscript and substantially contributed to the final version of the manuscript. All authors approved the final version of the manuscript.

Funding The authors have not declared a specific grant for this research from any funding agency in the public, commercial or not-for-profit sectors.

Competing interests None declared.

Provenance and peer review Not commissioned; externally peer reviewed.


Health research, development and innovation capacity building, enhancement and sustainability

  • Perspective
  • Open access
  • Published: 03 August 2023
  • Volume 3, article number 18 (2023)


  • Marlon E. Cerf 1 , 2  


Research, development and innovation (RDI) encompasses undertaking research to contribute new knowledge, developing policies, and generating products and services. Building health RDI capacity should be informed by the developmental gap, the required resources and the expected impact. Low- and middle-income countries often face barriers to reaching their RDI potential. To address some of these challenges, a framework is presented for building, enhancing and sustaining health RDI capacity across the researcher, department and faculty, institution and government dimensions, unpacked across the construct, expand, team, gear and leverage phases. Existing and new health RDI capacity requires building, enhancing and sustaining (constructing) before RDI expertise and portfolios are improved, refined and grown (expanding). Collaborative RDI networks and robust partnerships should then be established (teaming) and researchers nurtured, with resources optimized to secure investments for embarking on new activities (gearing). Harnessing the collective RDI collaborations and partnerships leads to greater global competitiveness and sustainability (leveraging). Capacity building, enhancement and sustainability in health RDI addresses health challenges and contributes to improved health, economic and societal outcomes.


1 Introduction

Global health and development programs, like the Sustainable Development Goals (SDGs), have specific indicators and targets to establish road maps and budgets to mobilize resources in a partner coordination matrix, which could be applied to health research capacity strengthening [ 1 ]. Building, enhancing and sustaining health research capacity, viz. in medicine, biosciences, public (and global) health and allied disciplines, is critical for growing and gearing (i.e. preparing and equipping) the next generation of researchers and clinicians for the discovery and refinement of health solutions. Although many low- and middle-income countries (LMIC) aim to invest ~ 2% of gross domestic product (GDP) in research and development (R&D) to build a knowledge economy, they struggle to meet this target given the allocation of resources to other priorities, such as addressing infrastructural and developmental lags. Funds are also prioritized for health services before filtering to health research. Research, development and innovation (RDI) encompasses undertaking research (R) to develop (D) policies and new products and services (I). Health RDI addresses health challenges by generating new knowledge, developing research to progress to the next phase, and producing products or services through innovation. Health RDI capacity is often constrained in LMIC by insufficient RDI practitioners, including leaders and mentors; limited capacity to conduct RDI; inadequate infrastructure and support; insufficient funding and investment; fragmented and underdeveloped RDI systems; misalignment of RDI activities with country needs and priorities; being funder led; incompatible collaborations; and ineffective and unequal partnerships [ 2 , 3 , 4 , 5 , 6 , 7 ].

Health research capacity strengthening (HRCS) refers to enhancing the capacity, at the researcher and institution levels, to conduct, manage, disseminate and apply research, while enabling national and sub-national research systems to effectively support research and the linkages between research, practice [ 8 ] and innovation. Generic research capacity strengthening (RCS) frameworks emphasize the individual (researcher), institutional, environmental (e.g. government) [ 9 ] and societal [ 10 ] levels; have varying dimensions aligned to national, institutional and program strategies, resources, leadership, infrastructure, skills and culture; and reflect an emergent, systemic and long-term process that requires ownership, and the right to contribute to and benefit from RCS [ 8 ]. To support RCS programs in LMIC, conceptual tools exist to guide decision making on intervention design, to assess implementation challenges (e.g. cost, time and level of control), to identify which interventions may be most impactful, and to support the evaluation of multiple RCS interventions [ 11 ]. These tools mainly serve to guide funders and implementing parties active in LMIC.

This article presents a conceptual framework, particularly relevant to the LMIC context, spanning the researcher, department and faculty, institution and government dimensions, which are integrated, interdependent and synergistic. A recent health policy and systems research (HPSR) framework, which can also be applied to research capacity more generally, highlighted the importance of RCS at the institution and network levels [ 12 ] (i.e., improving institutional processes for research governance and teaching quality assurance, or building network-level relationships), which receive less attention than the researcher level [ 13 ]. The framework presented here focuses more specifically on health and biomedical research (not HPSR) and incorporates RDI, not only research. It can complement, and be integrated and aligned with, HRCS principles and other frameworks for advancing, evaluating and monitoring research capacity building, enhancement and sustainability [ 7 , 8 , 10 , 11 , 12 ] to guide RDI practitioners. The framework is based on a selective literature review and informed by experience, offering perspectives on building, enhancing and sustaining health research capacity, particularly in a LMIC context.

2 Health RDI framework for building, enhancing and sustaining capacity

Figure  1 presents a conceptual framework that outlines the phases for building, enhancing and sustaining health RDI capacity, which requires constant monitoring, evaluation, input, refinement and maintenance. The key elements of the framework are depicted in Fig.  1 and the key transitions are listed in Table 1 . The phases are to (i) construct: build, enhance and sustain RDI capacity; (ii) expand: improve, refine and grow RDI expertise and portfolios; (iii) team: establish collaborative RDI networks and robust partnerships; (iv) gear: nurture researchers and optimize resources to secure investments to pursue new RDI activities; and (v) leverage: harness the collective RDI collaborations and partnerships for greater global competitiveness and sustainability.

Fig. 1 Research, development and innovation capacity framework

The construct, expand, team, gear and leverage phases each have specific actions to build, enhance and sustain research capacity across the researcher, department and faculty, institution, and government dimensions. For constructing, research capacity is strengthened at the researcher, department and faculty, and institution levels, in alignment with national (government) research and development priorities (e.g. health and economic policy development) and in support of global initiatives (e.g. the SDGs), while planning pathways for innovation. For expanding, building RDI capacity at the researcher, department and faculty, and institution levels identifies and focuses on relevant and impactful RDI activities that can be improved and refined to grow expertise and portfolios. For teaming, identifying and conducting RDI of national priority and global relevance strengthens alignment with governments and global institutions, which opens opportunities for collaborations and partnerships that create networks. This translates into greater critical mass and synergies to conduct RDI, and into funding options to support future initiatives. Reciprocal, complementary teams add value to collaborations and partnerships and leverage synergies. For gearing, the aim is to establish greater independence of researchers, departments and faculties, and institutions, i.e. financial independence, which entails greater funding inflows to subsidize research partially or fully. This can be achieved through successful grantsmanship with various funders, and also presents an opportunity to attract new collaborators and partners as new RDI avenues emerge. For leveraging, steady, high-quality RDI outputs over time enhance reputations and track records, augmenting the development of skilled researchers, establishing globally competitive departments and faculties, and realizing excellent institutions.
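The phase-by-dimension structure described above can be read as a matrix, which lends itself to a simple lookup-table sketch. The phase and dimension names below follow the article; the example actions are paraphrased summaries for illustration only (the full set is in Table 1):

```python
# Illustrative encoding of the RDI capacity framework as a phases x dimensions
# matrix. Phase and dimension names follow the article; the example actions
# are paraphrased, and only a few cells are populated for illustration.
PHASES = ["construct", "expand", "team", "gear", "leverage"]
DIMENSIONS = ["researcher", "department and faculty", "institution", "government"]

ACTIONS = {
    ("construct", "researcher"): "identify RDI gaps and capacity needs",
    ("expand", "researcher"): "undergo training; grow expertise and portfolio",
    ("team", "institution"): "establish collaborative networks and partnerships",
    ("gear", "institution"): "secure funding inflows for greater independence",
    ("leverage", "government"): "harness collective RDI for global competitiveness",
}

def actions_for_phase(phase):
    """All recorded actions in a given phase, keyed by dimension."""
    return {dim: act for (ph, dim), act in ACTIONS.items() if ph == phase}
```

Such an encoding could support the "constant monitoring, evaluation, input, refinement and maintenance" the framework calls for, e.g. by tracking which cells of the matrix a given program addresses.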

Building, enhancing and sustaining RDI is complex, as it depends on many moving parts of the RDI ecosystem, and progress may not be synchronized or easily phased. There are several challenges, particularly in LMIC that are under-resourced, with few researchers, limited infrastructure and inadequate funding and support. Other barriers that can derail progress are shifts in RDI focus, ineffective collaborations and incompatible partnerships. Further, researchers may not have the knowledge or skills that align with the research priorities of their departments, faculties or institutions, i.e. there may be a lack of critical mass to address research priorities. Departments and faculties may focus largely on teaching and learning, with limited capacity, resources and time allocated or allowed for RDI activities. Some LMIC institutions may not have sufficient funds to support RDI, which adds pressure on researchers, departments and faculties to constantly seek and secure grants for sustainability, which in turn may diminish the quality and quantity of outputs. Some LMIC governments may also be overburdened with service delivery, and reallocate RDI funds to services, strategic projects and implementation programs.

3 Application of the health RDI framework

Examples of how the framework may be applied in practice are presented below by unpacking the phases (construct, expand, team, gear and leverage) across the dimensions (researcher, department and faculty, institution, and government).

3.1 Construct: build, enhance and sustain RDI capacity

The constructing phase focuses on building existing and new health RDI capacity at the researcher (individual), department and faculty, institution and government levels (Fig.  1 ); this capacity is enhanced and sustained by continuous monitoring, evaluation, input, refinement and maintenance. Researchers are broadly defined as research practitioners involved in RDI activities, e.g. academic researchers, policy developers and makers, and innovators. Research encompasses RDI where relevant.

3.1.1 Researcher capacity

Researchers should identify the RDI gaps and their needs for building, enhancing and sustaining capacity (Table 1 ). Enablers of HRCS are to provide quality and relevant training for researchers, to recognize and empower research leadership (in departments, faculties and institutions), to build collaborations in support of career progression (that is equitable and transparent), and to provide tailored supervision, management and mentorship [ 10 ]. Researchers are equipped to plan and conduct research; to develop and apply methods to address health challenges; to establish and develop cohesive internal collaborations that can be leveraged externally at the provincial (state or sub-national), national, regional, continental and global levels; and to advise, guide, support, supervise, manage and mentor early career researchers, particularly at national and regional levels [ 14 ]. General skills and training are important to develop holistic researchers who can apply project management tools, design excellent proposals, write robust grant applications, manage time and people, and lead, train, motivate, mentor and coach researchers and teams. Early career researchers are the foundation of successful research institutions [ 15 ] and of continuity and excellence, and therefore need to be trained and mentored to become established scientists who drive their own health RDI activities in alignment with institutional and national (government) priorities. Effective and efficient health RDI capacity requires research and support personnel optimally embedded in the pipeline [ 16 ] to enable translation and discovery [ 15 ] (i.e. policies, products and services). At the researcher level, capacity development often intensifies during postgraduate training, at the masters, doctoral and postdoctoral career stages. However, researchers can also gain experience horizontally, e.g. training in policy development and practice, technology transfer or research administration, or in specific techniques such as specialized and advanced training in gene editing, for transfer and application at their host institutions. Placement in industry is important, particularly for innovation (e.g. biotechnology R&D) and broader health skilling (e.g. pharmaceutical production and quality control). Horizontal integration across departments and faculties, combined with a good understanding of their vertical lines (their RDI discipline and specialty), will set individual researchers apart and give them a competitive advantage. The immersion of researchers in the RDI pipeline (conducting RDI) and across delivery platforms (supporting RDI) enhances process understanding, fosters creativity, and guides towards holistic systems thinking. To foster national RDI systems that translate into enhancement and discovery, adequate support is required for researchers operating across the pipeline [ 15 ].

3.1.2 Department and faculty research capacity

Departments and faculties should support researchers in alignment with their RDI strategies, priorities and goals (Table 1 ). With critical mass in a department, growth can be fostered to build expertise in specific diseases, e.g. a program in obesity can feed into diabetes, cardiovascular disease and specific types of cancer. Department resources should be assigned to grow the aspects of RDI that will yield a competitive advantage, with oversight from faculty leadership and management. For example, access to unique clinical samples (e.g. specific ethnic samples and rare disease samples), indigenous extracts, drug-resistant patients (e.g. resistance to TB and other anti-microbial medications), and multimorbid patients (e.g. HIV/AIDS and TB patients afflicted with a non-communicable disease), often treated with polypharmacy, will elevate a department’s research profile: such departments hold valuable assets and the requisite knowledge and expertise for conducting specialized, relevant and responsive research. This amounts to research currency that attracts collaborators, partners and investors. Good research outputs and track records augment researcher, department and faculty reputations. Growing capacity and stature attracts collaborators who seek access to valuable samples, knowledge and expertise, and who bring additional knowledge and expertise to grow and/or pursue different RDI directions. This translates into greater economies of RDI scale that elevate the team’s global competitiveness.

Multidisciplinary (additive) research harnesses the knowledge and expertise, within specific boundaries, in collaborative RDI teams (from different fields who investigate a similar but broad research topic); interdisciplinary (interactive) RDI analyzes, synthesizes and harmonizes the disciplinary interlinkages in a coordinated and coherent manner (researchers inform and compare perspectives by knowledge transfer across disciplines); whereas transdisciplinary (holistic) RDI transcends and integrates health, social, natural and other sciences (to blend diverse perspectives to understand complex research questions and challenges). To foster and develop the values and skills in collaborating multi-, inter- and trans-disciplinary research teams, specific competencies are often required [ 17 , 18 ]. Complementary competencies should be harnessed to enrich the collaborating research team for greater productivity through driving knowledge generation, policy development and innovative health solutions.

3.1.3 Institutional research capacity

Institutional research capacity depends on researchers, research support personnel and an enabling environment (i.e. supportive processes, systems and culture) [ 12 ] to build, enhance and sustain research capacity by harnessing the collective economies of RDI scale within the institution. Institutions should therefore coordinate RDI activities by providing an enabling environment (systems, processes and support) (Table 1 ). Designing RDI capacity strengthening programs in health at the institution level requires clear goal setting, defining the capacity required to achieve the goal, determining baseline capacity and identifying the gaps towards the desired capacity, delivering implementation and action plans to address the gaps, and learning by adapting and refining the plan and indicators [ 7 ]. Building institutional capacity is required to establish and sustain RDI that is aligned with a shared vision [ 19 ] and mutual goals, and focused on delivering improved health outcomes. To enable RDI, institutions should provide adequate infrastructure (e.g. equipped laboratories and clinics), dedicated time and support [ 14 ], concomitant with sufficient personnel; and there should be freedom to conduct policy- and innovation-focused research. There should also be a shift towards requiring clinical and translational implementation as the output, or alternatively newer, cheaper and better treatments or improved processes for health service delivery. To maintain health RDI capacity, institutions should devise strategies and policies to grow, develop, retain and attract researchers by continuously improving and refining training, support and mentoring, and by providing access to resources [ 20 , 21 , 22 ]; i.e. to keep the environment enabling and geared to respond to health challenges. Agile, mobile teams that share responsibility and accountability, with a vested interest in making progress and reaching milestones, are required. As virtual teaming applications are more widely adopted, collaborations and partnerships become easier to manage to advance RDI.

RDI capacity should be integrated to meet institutional demands. For a research institution that focuses on select diseases in response to national health priorities, internal cohesion should be assured for better outputs and impact. Focusing on infectious diseases such as HIV/AIDS and TB that burden citizens will lead to research outputs, e.g. publications and graduated students. But ultimately the impact should be better health outcomes for citizens afflicted by diseases of high national prevalence, i.e. institutional research should align and converge to reduce morbidity and mortality from the most prevalent diseases. With a compromised immunologic and metabolic state, and a greater likelihood of living longer, HIV/AIDS patients are susceptible to opportunistic pathogens, e.g. TB, but may also later develop non-communicable diseases such as cardiovascular disease and diabetes as they age, or due to adverse reactions to multiple types and/or more frequent treatments. This requires a shift in focus, as future research should be responsive to health needs. It also reflects a major health and economic burden: the complex and costly treatment of chronic diseases, multimorbidity and polypharmacy afflicting an increasing number of patients over a longer duration as they age. Hence departments, e.g. epidemiological, clinical and basic research, should mutually inform and reinforce each other, supported by an enabling research environment (faculty and institution) that aims to realize translational research, i.e. policies that introduce or improve practices, and innovation that leads to revenue-generating products and services. This configuration lends itself to impact through focused relevance and responsiveness. However, silos and fragmentation often need to be overcome, as non-collaborative mindsets may exist and research domains may be fiercely guarded.

Some institutions face infrastructural RDI capacity challenges. Both RDI and general operations capacity are necessary to support researchers. Institutions need a critical mass of researchers; adequate research support personnel (e.g. grant, financial, human resources and operations management); board and advisory committees (for institutional governance, review and strategy); and research, data and innovation management systems [ 14 ] to deliver RDI outputs. RDI capacity and capabilities are highly variable inter- and intra-country, particularly in LMIC, but need to be adequate to maintain fidelity to research protocols [ 14 ], which includes harmonization of protocols for reproducibility of research at different sites intra- and inter-country. Other institutional challenges are budgetary constraints and limited or no access to timely information and data [ 14 ]. Strengthening financial systems is integral for high-performing institutions to better manage RDI income. Funders should offer training and share insights on financial management [ 23 ], incorporating experiences conveyed to researchers. Following international ethics, grant funding and management standards is important to ensure that relevant topics are adequately addressed and that funds are not incorrectly allocated or misused, but used cost-effectively to gain the best possible value from RDI outputs and impact.

3.1.4 Governmental health research agendas

National, state (provincial or sub-national), municipality (city) and district (suburb) government departments should align their RDI strategies and priorities and identify the challenges to be addressed (Table 1 ). Governments should set priorities, plan and coordinate research, support governance, enable regulation, facilitate knowledge translation and dissemination [ 24 , 25 , 26 ] and provide resources for the academic, policy and innovation health research tracks. Without country-level planning, action and guiding documents and policies in LMIC, health RDI is more likely to be influenced by international funders’ agendas than by the priorities of LMIC [ 27 ]. LMIC researchers should inform and lead their RDI agendas, and foreign investors should be cognizant of the tacit knowledge held by LMIC researchers. Governments need to ensure that their health RDI priorities are addressed and protected from external influences with potentially conflicting agendas. National health strategies should therefore be lucid, integrated, protected, and distinctly aligned with national priorities. Some deviation from global health strategies and standards should be allowed to favor the development of LMIC RDI capacity specific to their own health priorities. In LMIC, particularly in Africa, adding value to health RDI requires evidence-based actions adopted by national and provincial governments to bring health to the forefront of development agendas [ 14 ]. Further, value is added by defining, financing and monitoring lucid national plans for future health research [ 14 ], from which outputs and impact emanate.

3.2 Expand: improve, refine and grow research expertise and portfolios

In the expanding phase, the focus shifts to improving, refining and growing RDI expertise and portfolios (Fig.  1 ).

3.2.1 Researcher expansion

Researchers should undergo relevant training and skilling, and identify projects to grow their RDI expertise and portfolios (Table 1 ). Investment in early career researchers and developing leaders, concomitant with building and providing technical, managerial and administrative support, will encourage and motivate researchers towards greater productivity [ 8 ], generating RDI outputs and contributing to their development as researchers and principal investigators in preparation for leading teams. LMIC researchers are best placed to identify and tackle the health challenges that afflict their countries and to generate high quality, relevant evidence for decision makers [ 28 ]. At the researcher level, research projects should be refined in line with market information and research developments. From a health perspective, market information refers to disease burdens that are high, emerging or dissipating and that therefore inform resource allocation. Research developments, in a health market sense, refer to the expiry of patents, or to lost efficacy and drug side effects, which could initiate research to discover and introduce new agents to the market. Researchers need to take ownership of their expansion pathways to enhance their RDI outputs and advance their career trajectories as they grow their research expertise and enterprise.

3.2.2 Department and faculty expansion

Departments and faculties should provide an enabling RDI environment, prioritize areas and grow their teams (Table 1 ). Individual researchers form teams that feed into departments, managed by faculties, that operate across institutions. Each team member is tasked to grow their research expertise and portfolio to contribute and align to robust department and faculty research expertise. For example, with an obesity epidemic, which presents a market (or target group) in countries with a high obesity prevalence, departmental teams can define their contribution along the RDI pipeline: team members can be assigned to health promotion and disease prevention, or patient enrollment; others to conducting the research; others to data analyses and interpretation; others to research dissemination, e.g. articles and theses that follow the conventional academic research route, or policy development and innovation outputs; and others can program or project manage the entire RDI process (i.e. provide governance (compliance, monitoring and evaluation) and scientific oversight). This reflects the collaborative nature of RDI in departments within faculties and institutions; team members should also be conversant in all aspects of the pipeline, with expertise in one or more aspects for greater capacity and coverage. Innovation creation should be embedded from the inception, through the duration, to the completion of projects. Essentially, better health outcomes, not articles or theses, should be the main deliverable for RDI outputs to have greater impact. Innovation can be initiated by identifying and pursuing leads from a study. For example, a subset of participants may be more resistant to the benefits of physical activity, healthy nutrition and/or treatments. Their genetic and molecular signatures may reveal insights into factors in signaling pathways modulated by specific lifestyle interventions and/or treatments; explain unusual variability; identify factors present in early, stable and advanced disease states; or identify novel factors explaining variability in responsiveness. Precision medicine can then be pursued to contribute to healthier patients, as greater customization per target group enables better overall health outcomes.

3.2.3 Institutional expansion

Aligning the right mix of researchers across departments and faculties to address major RDI challenges is an institutional function for realizing expansion, and can be advanced by adopting multi-, inter- and trans-disciplinary approaches (Table 1 ). An institution's overarching RDI strategy should focus on identifying, conducting and supporting activities that align with the national burden of disease (i.e. address the diseases that cause the most morbidity and mortality in the country and can be further investigated at sub-national levels) while remaining globally relevant and aligned; e.g. diabetes, cardiovascular disease and cancer are global health challenges.

3.2.4 Government expansion

Enabling factors for building, enhancing and sustaining RDI capacity are shared visions and goals, collaborations, partnerships, training (including building skills and providing education), institutional support and leadership, monitoring and evaluation [ 29 ], and government buy-in to protect and advance the health interests of society [ 30 , 31 ]. Government departments refine, build and integrate prioritized RDI activities that will have a positive social, educational, economic and health impact (e.g. policy development for best practice, and products and services derived through innovation) (Table 1 ). LMIC-led research capacity building, when coordinated at government level, will be better aligned to address the relevant health challenges, with enhanced ownership and opportunities for building skills [ 32 ] at the researcher, department and faculty, and institution levels. With scarce resources in some LMICs, it is important to focus on where the greatest health impact can be realized [ 33 ], despite the emerging diseases, outbreaks, multi-morbidities, polypharmacy and drug resistance that complicate disease treatment and management. Plague may be important in a few countries, but globally it will not be a priority. It cannot be ignored, though; at the very least, there should be sufficient capacity to deal with domestic threats and with outbreaks that spread globally. The Covid-19 pandemic disrupted the global health landscape and highlighted the need for better preparedness for diseases that spread rapidly and globally. Further, education, the economy, employment and society were adversely impacted by the pandemic. The post-pandemic recovery is challenging, and some countries will lag even further as they navigate disease burdens that the pandemic has shifted and exacerbated.
Not all diseases can be investigated; diseases therefore require prioritization, with skills and funding allocated, and critical mass built, where reducing the disease burden will have the most impact. For example, in countries with the highest non-communicable disease morbidity and mortality, focusing research, policy development and implementation, and innovation on reducing that burden will translate into better population health outcomes, with healthier people contributing to greater economic activity. Governments need to selectively expand key RDI areas and partner with high-performing institutions that can deliver, conducting RDI activities that address the prioritized health challenges for better health outcomes for their citizens.
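The prioritization logic described above can be sketched in a few lines of code. This is an illustrative sketch only: the disease names, burden figures and proportional-allocation rule are invented for the example, not drawn from real burden-of-disease data or any specific government process.

```python
# Hypothetical sketch: rank diseases by burden (e.g. a DALY-like figure)
# and split a fixed RDI budget proportionally among the top priorities.
# All names and numbers below are illustrative.

def prioritize(burden_by_disease, budget, top_n=3):
    """Rank diseases by burden; allocate the budget proportionally
    among the top_n highest-burden diseases."""
    ranked = sorted(burden_by_disease.items(), key=lambda kv: kv[1], reverse=True)
    top = ranked[:top_n]
    total_burden = sum(b for _, b in top)
    return {disease: budget * b / total_burden for disease, b in top}

allocation = prioritize(
    {"diabetes": 900, "cardiovascular": 1200, "cancer": 800, "plague": 5},
    budget=10_000_000,
)
# Cardiovascular disease, with the highest burden, receives the largest
# share; plague falls outside the top three and receives no allocation,
# mirroring the point that it remains a domestic-capacity concern rather
# than a funding priority.
```

In practice a government would weigh many more criteria (feasibility, existing capacity, global alignment), but the sketch captures the core idea of concentrating scarce resources where the burden, and hence the potential impact, is greatest.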

3.3 Team: form collaborative research networks and robust partnerships

In the teaming phase, collaborative research networks and robust partnerships are established (Fig.  1 ).

3.3.1 Researcher teaming

Researchers should be immersed in multi-, inter- and trans-disciplinary teams to realize synergies from their collective and complementary RDI activities (Table 1 ). Within a team, each researcher should strive to learn from, and reciprocate that learning with, other team members. Teams work in departments, across faculties, to elevate institutional RDI outputs by harnessing their diversity and synergies.

3.3.2 Department and faculty teaming

Building expert teams in and across departments and faculties will foster multi-, inter- and trans-disciplinary RDI (Table 1 ). External drivers shape researchers' perspectives on the socioeconomic and political context and need to be understood to determine RDI priorities before establishing collaborations [ 34 ]. Research collaborations build capacity by pooling resources and sharing knowledge, and need to function effectively to achieve greater RDI quality and impact [ 35 ]. For effective collaboration, reciprocity is key to enhancing RDI capacity and exchanging knowledge and skills [ 34 ]. If diabetes is being tackled, data are required: epidemiological data (e.g. prevalence, incidence, rate of progression and subtypes) and clinical data on treatments, coupled with anthropometric data. If treatment is not optimally efficacious, basic science studies could be designed to investigate drug interactions, gene-editing experiments conducted, or loss/gain-of-function studies undertaken to reveal targets, informed by sequenced patient samples; precision medicine applications will then provide accurate and holistic detail. Therefore, overarching department and faculty research themes, with teams focusing on specific RDI activities that contribute to each theme, will build department and faculty capacity and expertise, address more complex challenges and elevate institutions' global stature towards greater excellence.

3.3.3 Institutional teaming

RDI capacity building thrives on supportive collaboration, mentorship, and training that develops management, financial and communication skills [ 36 ], equipping support teams to provide an enabling environment. RDI administrators, managers and leaders should collaborate with researchers to guide them through processes and systems, and to inform them of funding and collaboration opportunities (Table 1 ). Institutions can form consortia with other compatible and complementary institutions. Integrated partnerships between the university and health sectors accelerate RDI activity, capacity and readiness [ 34 ], and can be extended by partnerships with industry to drive RDI, supported by the philanthropy sector. A strong consortium that covers the key components of the RDI pipeline, and that can meet academic and innovation criteria, will be competitively placed to deliver outputs and attract further funding, collaborations and partnerships on a pathway towards sustainability. Reciprocal, complementary and value-added institutional partnerships enable synergies to be leveraged.

3.3.4 Government teaming

Task teams should be established to liaise, monitor, evaluate and convene RDI committees that identify researchers and institutions able to address prioritized government challenges (Table 1 ). Committees constituted by diverse and knowledgeable members representing the key stakeholders in health and related sectors should be led by governments to set, monitor and evaluate priorities, reach consensus on the relevance and responsiveness of the RDI being conducted, champion RDI, and mobilize resources to respond rapidly to existing and emerging health challenges.

3.4 Gear: nurture researchers, departments and faculties, and institutions

In the gearing phase, researchers are nurtured (i.e. prepared and equipped) and resources are optimized (at department, faculty and institutional levels) to secure investments to embark on new activities (Fig.  1 ).

3.4.1 Researcher gearing

Researchers are prepared, equipped and nurtured, in teams, to seek and secure funding (e.g. seed funding) and generate new projects (Table 1 ). In the research team matrix, there are typically new, emerging, established and/or prominent researchers. A critical mass of established independent researchers can develop sufficient new and emerging researchers to meet department, faculty and institution RDI demands. Sustainable independence refers to researchers who secure funding to conduct RDI that is fully subsidized externally, and that may cover a reasonable infrastructural overhead of 10–15%. Full sustainable independence is achieved when all RDI, operations and salaries are financed externally, e.g. from grant income, providing training and/or revenue from products and services. This is realized by researchers making discoveries that are patented to yield income-generating products and services, with revenue reinvested to sustain current RDI and feed into future discoveries. Venture capital, philanthropy and crowdfunding are start-up funding sources. More conventionally, multiple grants, and multiplier funding (where secured awards are leveraged to attract co-investment from existing funders and additional investment from new funders), awarded to researchers within a department from various funders, can be used to start up. Successful grantsmanship from diverse and multiple sources to fund RDI activities often attracts new collaborations and partnerships.
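The distinction between partial and full sustainable independence described above can be expressed as a simple coverage ratio. The figures and the 15% overhead rate below are illustrative assumptions, not prescriptions.

```python
# Illustrative sketch of "sustainable independence": a researcher or
# group is fully sustainable when external income covers all direct RDI
# costs, salaries and an infrastructural overhead (assumed here to be
# 15% of direct costs, within the 10-15% range mentioned in the text).
# All monetary figures are invented.

def sustainability_ratio(direct_costs, salaries, external_income,
                         overhead_rate=0.15):
    """Fraction of total costs (direct + overhead + salaries)
    covered by external income; >= 1.0 means fully sustainable."""
    total_costs = direct_costs * (1 + overhead_rate) + salaries
    return external_income / total_costs

ratio = sustainability_ratio(direct_costs=800_000, salaries=400_000,
                             external_income=1_500_000)
fully_sustainable = ratio >= 1.0
```

A ratio below 1.0 would indicate partial independence, with the shortfall met internally; tracking the ratio over time gives a simple gearing indicator for departments and institutions.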

3.4.2 Department and faculty gearing

Faculties should convene interdepartmental planning and strategy meetings to inform RDI directions, priorities and funding opportunities (Table 1 ). This will encourage researchers from different departments to collaborate on a shared faculty vision, e.g. securing a five-year grant that funds the full RDI costs for early-career researchers to work across the faculty on a major multimorbidity (health challenge), or building critical and scarce skills that can be applied across departments (e.g. biostatistics and bioinformatics). This is also an opportunity to monitor, evaluate, provide input on, refine and maintain RDI activities at the department and faculty levels, which can be conveyed to the institution for oversight.

3.4.3 Institutional gearing

Institution managers and leaders should monitor and evaluate RDI to identify opportunities for growth and to invest and disinvest in activities based on outputs, relevance and responsiveness (Table 1 ). With institutional oversight, leaders play a role in identifying RDI opportunities when networking with national and international stakeholders and guiding researchers to form new collaborations, particularly by increasing their footprint in new countries. Further, some funding should be allocated for researchers to pursue high risk research, within a realistic time frame, that could yield high rewards e.g. a new income generating health product (e.g. a diagnostic, device or treatment).

3.4.4 Government gearing

RDI activities with the potential to inform policy or generate revenue through innovation should be prioritized (Table 1 ); these may see more rapid uptake by governments [ 32 ], especially when government stakeholders are involved and informed of initiatives from inception, which may garner buy-in and further support. Funders should collaborate with institutions to improve job security for researchers by setting fair employment contracts [ 36 ] and should incentivize researchers to secure further grants.

For collaborations (researcher, department and faculty) and partnerships (institution and government), there are two broad concepts. The complementary principle is where each researcher, department and faculty, institution or government contributes expertise and resources that are complementary, not duplicated but possibly supplementary, to conduct and support RDI activities that are scalable and deliver mutually beneficial outputs that improve outcomes. The currency principle refers to a collaborator's or partner's value as a contributor: what they offer, and what they can gain (reciprocity). For medical insurers, potential gains include better data on their patients for more accurate cost forecasting, and the opportunity to treat diseases earlier, even before onset (e.g. incentivizing obese and/or insulin-resistant people to adopt healthier lifestyles to prevent diabetes), resulting in savings. For companies, the gains are an opportunity to demonstrate that they value their employees, and insight (collective and anonymized) into employees' health and personal challenges. RDI teams in turn benefit from receiving funding and/or support (e.g. data analysis or project management) and contribute to the well-being of insurers' patients and companies' employees. Ultimately, people benefit from better health and quality of life.

3.5 Leverage: harness collective RDI collaborations and partnerships for greater global competitiveness and sustainability

In the leveraging phase (Fig.  1 ), harnessing the collective RDI of skilled researchers who constitute excellent departments and faculties that constitute highly reputable institutions enables greater global competitiveness and sustainability. This drives and sustains RDI excellence.

3.5.1 Researcher leveraging

Researchers should harness collaborative networks to secure funding from diverse and multiple sources, grow networks and consistently deliver high quality and relevant RDI outputs with impact (Table 1 ) to maintain and improve excellence. Researchers require personal, frequent and flexible support from committed supervisors and mentors to guide them to produce quality, relevant and timely outputs [ 28 ] and to become more established by securing funding to sustain their RDI activities. Sustainability is achieved with financial independence and autonomy in decision making [ 7 ] which will contribute to global competitiveness.

3.5.2 Department and faculty leveraging

With overarching departmental and faculty research themes, building collective RDI expertise and leveraging networks and consortia to secure larger funding awards will translate into more quality outputs (Table 1 ). Foundational excellence at the unit (i.e. individual researcher) level, with competitive and marketable skills, will grow successful departments and faculties that frequently deliver quality research and innovation outputs, such as articles, graduates, products (new diagnostics, devices and treatments) and services (training, editing, scientific writing and mentoring), further elevating their institutions' stature. This charts the path towards creating a sustainable RDI enterprise.

3.5.3 Institutional leveraging

Existing and new institutional partnerships are leveraged to harness RDI economies of scale for the further elevation of institutional stature, competitive advantage and excellence (Table 1 ). Institutions should align and selectively partner with compatible, high-performing institutions and, with shared interests and goals, identify and harness their collective RDI expertise, capabilities, funds and other resources to further elevate their global competitiveness. This should grow in a phased, timely and responsible manner, applying resources where the most value and impact can be derived through RDI activities. Institution leaders can demonstrate that they value their researchers by endorsing RDI, by ensuring that systems and processes are enabling and adaptive to advance RDI, and by prioritizing and (re)aligning RDI with institutional goals [ 37 ]. Institutional sustainability can be realized by securing research grants, frequently publishing quality and relevant articles, developing policies (for implementation and best practice), generating patents for products (innovation) [ 10 ] and providing services.

3.5.4 Government leveraging

Policies are introduced, refined or revised that lead to improved practices in government departments, realizing greater effectiveness and efficiency (e.g. in systems, processes and costs), and revenue generated from innovation is reinvested in RDI, with specific funding ring-fenced for addressing government priorities (Table 1 ). Governments should mobilize funds, informed by the RDI priorities that address their health challenges, and, with philanthropy partners, incentivize the private sector (industry) to contribute to developing, marketing and sustaining innovation [ 38 ]. Public-private-philanthropy partnerships should be cohesive to address potential mismatches between national and global RDI priorities [ 38 , 39 ]. Further, governments should coordinate and support the co-creation of a global RDI roadmap by analyzing the capacities in countries and collectively charting, and costing, the steps for each country to enhance capacity [ 38 ]. A global RDI roadmap should encompass the entire RDI life cycle, from the laboratory through implementation science to innovation, and be monitored and evaluated [ 38 ]. It is imperative to prioritize and integrate RDI findings into practice [ 37 ] and to realize products and services for positive societal impact.

4 Limitations

The conceptual health RDI framework presents a guide for building, enhancing and sustaining capacity, but it was informed by selective literature and an LMIC perspective, and was limited by the paucity of relevant studies (e.g. some studies focus on building capacity to address specific diseases). Definitions of research, development and innovation may differ, and often only one of these aspects is the focus of a study, despite the need for greater integration. The framework can be reinforced and adapted by adopting principles and best practices from countries with advanced RDI track records. For some countries, the challenges of conducting and supporting RDI vary, owing to resourcing constraints and conflicting priorities; building, enhancing and sustaining RDI may therefore only be realistic at the researcher and/or department and faculty levels, and progress may be gradual.

Building, enhancing and sustaining capacity are separate, albeit interlinked, phases for advancing RDI; researchers, departments and faculties, and institutions may struggle to shift from building to enhancing RDI. In addition, sustaining RDI is challenging in a competitive global arena, with changing priorities, under-resourcing and competition for researchers. Future research can focus on the inputs required to build, enhance and sustain RDI (such as quality researchers), as each presents a direction for further development. To test the framework, indicators could be assigned that are specific to each researcher, department and faculty, institution and government, informed by the activities required to progress from building to enhancing to sustaining RDI.
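The indicator idea suggested above could be prototyped as a small phase classifier. The metric names and thresholds below are entirely hypothetical, chosen only to show how per-level indicators might map progression from building through enhancing to sustaining.

```python
# Hypothetical sketch: infer an entity's RDI phase from indicator values.
# An entity starts in the "building" phase and advances only while every
# threshold for the next phase is met. Metrics and thresholds are invented.

PHASES = ("building", "enhancing", "sustaining")

def phase(indicators, thresholds):
    """Return the highest phase whose thresholds are all satisfied."""
    current = "building"
    for p in PHASES[1:]:
        if all(indicators.get(metric, 0) >= value
               for metric, value in thresholds[p].items()):
            current = p
        else:
            break
    return current

# Example thresholds for a department (illustrative only).
dept_thresholds = {
    "enhancing": {"grants": 2, "publications": 10},
    "sustaining": {"grants": 5, "publications": 30,
                   "external_income_ratio": 1.0},
}
result = phase({"grants": 3, "publications": 15,
                "external_income_ratio": 0.4}, dept_thresholds)
```

Separate threshold tables per dimension (researcher, department and faculty, institution, government) would allow the same classifier to be reused across levels, with monitoring and evaluation updating the indicator values over time.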

5 Conclusion

Constructing, expanding, teaming, gearing and leveraging RDI are phases that advance global competitiveness and stature, and lead towards greater sustainability. Building, enhancing and sustaining research capacity requires constant monitoring, evaluation, input, refinement and maintenance. Capacity building, enhancement and sustainability across the researcher, department and faculty, institution, and government dimensions, applied through RDI to address health challenges, contribute to improving health, economic and societal outcomes.

Kilmarx PH, Maitin T, Adam T, Akuffo H, Aslanyan G, Cheetham M, Corrêa-Oliveira R, Kay S, Khelef N, Kunaratnam Y, et al. A mechanism for reviewing investments in health research capacity strengthening in low- and middle-income countries. Ann Glob Health. 2020;86:92. https://doi.org/10.5334/aogh.2941 .


Woodward A, Sheahan K, Martineau T, Sondorp E. Health systems research in fragile and conflict affected states: a qualitative study of associated challenges. Health Res Policy Syst. 2017;15:44. https://doi.org/10.1186/s12961-017-0204-x .

Bowsher G, Papamichail A, El Achi N, Ekzayez A, Roberts B, Sullivan R, Patel P. A narrative review of health research capacity strengthening in low and middle-income countries: lessons for conflict-affected areas. Global Health. 2019;15:23. https://doi.org/10.1186/s12992-019-0465-y .

Mwendera CA, De Jager C, Longwe H, Phiri K, Hongoro C, Mutero CM. Facilitating factors and barriers to malaria research utilization for policy development in Malawi. Malar J. 2016;15:512. https://doi.org/10.1186/s12936-016-1547-4 .

Laabes EP, Desai R, Zawedde SM, Glew RH. How much longer will Africa have to depend on western nations for support of its capacity-building efforts for biomedical research? Trop Med Int Health. 2011;16:258–62. https://doi.org/10.1111/j.1365-3156.2010.02709.x .

Franzen SRP, Chandler C, Lang T. Health research capacity development in low and middle income countries: reality or rhetoric? A systematic meta-narrative review of the qualitative literature. BMJ Open. 2017;7:e012332. https://doi.org/10.1136/bmjopen-2016-012332 .

Bates I, Boyd A, Smith H, Cole DC. A practical and systematic approach to organisational capacity strengthening for research in the health sector in Africa. Health Res Policy Syst. 2014;12:11. https://doi.org/10.1186/1478-4505-12-11 .

ESSENCE on Health Research and CCR. Effective research capacity strengthening: a quick guide for funders. 2023.

Lê G, Mirzoev T, Orgill M, Erasmus E, Lehmann U, Okeyo S, Goudge J, Maluka S, Uzochukwu B, Aikins M, et al. A new methodology for assessing health policy and systems research and analysis capacity in African universities. Health Res Policy Syst. 2014;12:59. https://doi.org/10.1186/1478-4505-12-59 .

Khisa A, Gitau E, Pulford J, Bates I. A framework and indicators to improve research capacity strengthening evaluation practice. Liverpool; 2019.

Pulford J, Crossman S, Abomo P, Amegee Quach J, Begg S, Ding Y, El Hajj T, Bates I. Guidance and conceptual tools to inform the design, selection and evaluation of research capacity strengthening interventions. BMJ Glob Health. 2021;6:e005153. https://doi.org/10.1136/bmjgh-2021-005153 .

Mirzoev T, Topp SM, Afifi RA, Fadlallah R, Obi FA, Gilson L. Conceptual framework for systemic capacity strengthening for health policy and systems research. BMJ Glob Health. 2022;7:e009764. https://doi.org/10.1136/bmjgh-2022-009764 .

Bennett S, Agyepong IA, Sheikh K, Hanson K, Ssengooba F, Gilson L. Building the field of health policy and systems research: an agenda for action. PLoS Med. 2011;8:e1001081. https://doi.org/10.1371/journal.pmed.1001081 .

Cottler LB, Zunt J, Weiss B, Kamal AK, Vaddiparti K. Building global capacity for brain and nervous system disorders research. Nature. 2015;527:S207–13. https://doi.org/10.1038/nature16037 .

Morel T, Maher D, Nyirenda T, Olesen OF. Strengthening health research capacity in sub-Saharan Africa: mapping the 2012–2017 landscape of externally funded international postgraduate training at institutions in the region. Global Health. 2018;14:77. https://doi.org/10.1186/s12992-018-0395-0 .

Fitchett JR, Li JF, Atun R. Innovative financing for late-stage global health research and development: the global health investment fund. Int Health. 2016;8:3–4. https://doi.org/10.1093/inthealth/ihv067 .

Estapé ES, Rodríguez-Orengo JF. Development of multidisciplinary academic programs for clinical research education. J Allied Health. 2005;34:55–70.


Straus SE, Brouwers M, Johnson D, Lavis JN, Légaré F, Majumdar SR, McKibbon KA, Sales AE, Stacey D, Klein G, et al. Core competencies in the science and practice of knowledge translation: description of a Canadian strategic training initiative. Implement Sci. 2011;6:127. https://doi.org/10.1186/1748-5908-6-127 .

Toma J. Building Organizational Capacity: Strategic Management in Higher Education. Baltimore: Johns Hopkins University Press; 2010.


Jones N, Bailey M, Lyytikäinen M. Research capacity strengthening in Africa: trends, gaps and opportunities. Overseas Development Institute; 2007.

Kariuki T, Phillips R, Njenga S, Olesen OF, Klatser PR, Porro R, Lock S, Cabral MH, Gliber M, Hanne D. Research and capacity building for control of neglected tropical diseases: the need for a different approach. PLoS Negl Trop Dis. 2011;5:e1020. https://doi.org/10.1371/journal.pntd.0001020 .

Minja H, Nsanzabana C, Maure C, Hoffmann A, Rumisha S, Ogundahunsi O, Zicker F, Tanner M, Launois P. Impact of health research capacity strengthening in low- and middle-income countries: the case of WHO/TDR programmes. PLoS Negl Trop Dis. 2011;5:e1351. https://doi.org/10.1371/journal.pntd.0001351 .

Dean L, Njelesani J, Smith H, Bates I. Promoting sustainable research partnerships: a mixed-method evaluation of a United Kingdom-Africa capacity strengthening award scheme. Health Res Policy Syst. 2015;13:81. https://doi.org/10.1186/s12961-015-0071-2 .

WHO. The WHO strategy on research for health. Geneva: World Health Organization; 2012.

WHO. National health research systems: report of an international workshop. Geneva: World Health Organization; 2002.

Ijsselmuiden C, Marais DL, Becerra-Posada F, Ghannem H. Africa’s neglected area of human resources for health research—the way forward. S Afr Med J. 2012;102:228–33.

Uthman OA, Wiysonge CS, Ota MO, Nicol M, Hussey GD, Ndumbe PM, Mayosi BM. Increasing the value of health research in the WHO African Region beyond 2015–reflecting on the past, celebrating the present and building the future: a bibliometric analysis. BMJ Open. 2015;5:e006340. https://doi.org/10.1136/bmjopen-2014-006340 .

ESSENCE on Health Research, TDR for research on diseases of poverty. Seven principles for strengthening research capacity in low- and middle-income countries: simple ideas in a complex world. 2014.

Slade SC, Philip K, Morris ME. Frameworks for embedding a research culture in allied health practice: a rapid review. Health Res Policy Syst. 2018;16:29. https://doi.org/10.1186/s12961-018-0304-2 .

Jotterand F, Spellecy R, Shaker R. The rights (and responsibilities) of the public to advance health through research. Archives of Public Health. 2021;79:198. https://doi.org/10.1186/s13690-021-00726-w .

Tang N, Eisenberg JM, Meyer GS. The roles of government in improving health care quality and safety. Jt Comm J Qual Saf. 2004;30:47–55. https://doi.org/10.1016/S1549-3741(04)30006-7 .

Kasprowicz VO, Chopera D, Waddilove KD, Brockman MA, Gilmour J, Hunter E, Kilembe W, Karita E, Gaseitsiwe S, Sanders EJ, et al. African-led health research and capacity building—is it working? BMC Public Health. 2020;20:1104. https://doi.org/10.1186/s12889-020-08875-3 .

Cerf ME. Sustainable development goal integration, interdependence, and implementation: the environment–economic–health nexus and universal health coverage. Global Chall. 2019;3:1900021. https://doi.org/10.1002/gch2.201900021 .

Whitworth A, Haining S, Stringer H. Enhancing research capacity across healthcare and higher education sectors: development and evaluation of an integrated model. BMC Health Serv Res. 2012;12:287. https://doi.org/10.1186/1472-6963-12-287 .

Adelle C, Elema N, Chakauya E, Benson D. Evaluating ‘homegrown’ research networks in Africa. S Afr J Sci. 2018. https://doi.org/10.1715/sajs.2018/20170070 .

Chapman N, Thomas EE, Tan JTM, Inglis SC, Wu JHY, Climie RE, Picone DS, Blekkenhorst LC, Wise SG, Mirabito Colafella KM, et al. A roadmap of strategies to support cardiovascular researchers: from policy to practice. Nat Rev Cardiol. 2022;19:765–77. https://doi.org/10.1038/s41569-022-00700-1 .

Matus J, Walker A, Mickan S. Research capacity building frameworks for allied health professionals—a systematic review. BMC Health Serv Res. 2018;18:716. https://doi.org/10.1186/s12913-018-3518-7 .

Yamey G, Batson A, Kilmarx PH, Yotebieng M. Funding innovation in neglected diseases. BMJ. 2018;360:k1182. https://doi.org/10.1136/bmj.k1182 .

Viergever RF. The mismatch between the health research and development (R&D) that is needed and the R&D that is undertaken: an overview of the problem, the causes, and solutions. Glob Health Action. 2013;6:22450. https://doi.org/10.3402/gha.v6i0.22450 .


Author information

Authors and affiliations

Grants, Innovation and Product Development, South African Medical Research Council, PO Box 19070, Tygerberg, Cape Town, 7505, South Africa

Marlon E. Cerf

Biomedical Research and Innovation Platform, South African Medical Research Council, PO Box 19070, Tygerberg, Cape Town, 7505, South Africa


Contributions

MC wrote the main manuscript text, prepared the figure and table and reviewed the manuscript.

Corresponding author

Correspondence to Marlon E. Cerf .

Ethics declarations

Competing interests.

The authors declare no competing interests.

Additional information

Publisher's note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Cerf, M.E. Health research, development and innovation capacity building, enhancement and sustainability. Discov Soc Sci Health 3 , 18 (2023). https://doi.org/10.1007/s44155-023-00051-3


Received : 24 April 2023

Accepted : 24 July 2023

Published : 03 August 2023

DOI : https://doi.org/10.1007/s44155-023-00051-3


Keywords

  • Collaboration
  • Low- and middle-income countries
  • Partnership

Research engagement and research capacity building: a priority for healthcare organisations in the UK

Affiliations.

  • 1 Centre for Health Services Studies, University of Kent, Canterbury, UK.
  • 2 Health Services Research and Policy, LSHTM, London, UK.
  • PMID: 36951686
  • DOI: 10.1108/JHOM-12-2021-0436

Purpose: To examine the research involvement of healthcare staff in the UK and identify practical organisational and policy solutions to improve and boost the capacity of the existing workforce to conduct research.

Design/methodology/approach: A mixed-method study presenting three work packages here: secondary analysis of levels of staff research activity, funding, academic outputs and workforce among healthcare organisations in the United Kingdom; 39 Research and Development lead and funder interviews; an online survey of 11 healthcare organisations across the UK, with 1,016 responses from healthcare staff included for analysis; and 51 interviews of healthcare staff in different roles from six UK healthcare organisations.

Findings: Interest in research involvement is strong and widespread but hampered by a lack of systematic organisational support despite national policies and strategies to increase staff engagement in research. While useful, these external strategies have limited universal success due to lack of organisational support. Healthcare organisations should embed research within organisational and human resources policies and increase the visibility of research through strategic organisational goals and governance processes. A systems-based approach is needed.

Research limitations/implications: The research gathered data from a limited number of NHS trusts, but these were purposively sampled to provide a range of acute and community health service organisations in different areas. The data were therefore more detailed and nuanced due to the more in-depth approach.

Practical implications: The findings are relevant for developing policies and practice within healthcare organisations to support research engagement. The findings also set out key policy and strategic recommendations that will support greater research engagement.

Social implications: Increased research activity and engagement in healthcare providers improves healthcare outcomes for patients.

Originality/value: This is a large-scale (UK-wide) study involving a broad range of healthcare staff, with good engagement of nurses, midwives and Allied Healthcare Professionals, a level of engagement not previously achieved. This allowed valuable analysis of under-researched groups and comparisons between professional groups. The findings highlight the need for tailored action to embed research reporting, skills, professional development and infrastructure into organisational policies, strategies and systems, along with broader system-wide development.

Keywords: Capacity building; Health professionals; Health services research; Research engagement.

© Emerald Publishing Limited.

National Institute of General Medical Sciences

Division for Research Capacity Building (DRCB)

DRCB administers the Institutional Development Award (IDeA) program that supports research, research workforce development, and research infrastructure improvement in states where levels of NIH funding have historically been low. The division also oversees the Native American Research Centers for Health (NARCH) program, the Support for Research Excellence (SuRE) program, and the Science Education Partnership Awards (SEPA) program.

IDeA is a congressionally mandated program that builds research capacity in states with low levels of NIH funding. It supports competitive basic, clinical, and translational research, research workforce development, and infrastructure improvements. The program aims to strengthen institutions' ability to support biomedical research, enhance the competitiveness of investigators in securing research funding, and catalyze clinical and translational research that addresses the needs of IDeA state populations.

The NARCH program funds federally recognized American Indian/Alaska Native (AI/AN) tribes and organizations for health research, research career enhancement, and research infrastructure enhancement activities.

The SuRE program supports research capacity building at eligible higher education institutions through funding investigator-initiated research in the biomedical, clinical, behavioral, and social sciences that falls in the mission areas of the NIH.

The SEPA program supports educational activities that encourage pre-college students (pre-kindergarten to grade 12) from diverse backgrounds to pursue further studies in science, technology, engineering, and mathematics (STEM).

For DRCB staff contact information and their biographical sketches, please see the Division Staff Contacts page.


How to Leverage Action Research to Develop Context-specific Capacity Building for Civil Society Organizations

In recent decades, increased attention has been given to the hierarchical nature and intrinsic power dynamics of CSO capacity building programs. In a global context, international donors tend to design and implement capacity building programs, which then prioritize donors’ objectives and employ Western concepts in the Global South. This research note aims to reframe capacity building around inclusive and equal partnerships centered on civil society leaders who participate in designing and delivering capacity building programs. We propose action research as a process for co-creating contextually appropriate models that enable local ownership for capacity building and thus equip civil society to improve the lives of people in communities. We apply this approach to the Liberian case to develop a process to engage local civil society organizations in developing participatory capacity building programs that address place-based needs in non-Western contexts.

1 Introduction

This research note introduces action research to nonprofit studies. We contribute to the current debates on power and equity in capacity building by proposing how the systemic and adaptive processes of inquiry associated with action research can be harnessed to develop locally embedded capacity development programs that are contextually relevant and responsive to the capacity needs of local civil society organizations (CSOs). We use the country case of Liberia to demonstrate how action research centers the voices and experiences of local CSO leaders as “experts” and “co-researchers” in identifying the capacity building realities of their own organizations, and how this local knowledge can be used to address the needs of CSOs, giving voice to those who are bypassed in the development process. This research note encourages nonprofit scholars to embrace action research as a collaborative and participatory approach to inquiry and action, and as a relevant approach for investigating and investing in capacity development. The research note also describes the action research approach and process used in our study, as a worked example of how to implement this methodology. [1]

We apply this methodological approach to the Liberian case, which exemplifies the capacity building challenges of the civil society sector in many developing countries. The initial phase of our action research process reveals that capacity building programs for CSO leaders in Liberia are designed and delivered mostly by international donors (although some local programs exist), yet these opportunities are few and far between, and remain difficult to access due to cost barriers and selection processes for participation. Our preliminary analysis shows that local CSO leaders are interested in more stable and institutionalized capacity building programs, which take the form of what in the Western context is known as nonprofit management education (NME). Through this case study, the research note demonstrates how action research – the methodology we propose here – is particularly suited to address the shortcomings scholars identify in traditional approaches to capacity building.

Our study is of interest to both nonprofit management educators and practitioners. It aims to increase awareness of the value of linking capacity building and educational practices to local communities’ practical needs. Both in the United States and globally, scholars critique the intrinsic power dynamics of capacity building. Traditional models of capacity building center on practices developed by and for a sector that is overwhelmingly white and Western ( GEO 2021 ), and thus negate, if not cancel, the knowledge and traditions of local communities, nationally and internationally ( Kacou, Ika, and Munro 2022 ). These critiques call for reframing capacity building in both theory and practice ( Nishimura et al. 2020 ), calls that in the context of NME are mirrored in the demand for a more explicitly “critical pedagogy” ( Feit and Sandberg 2022 ). We contribute to these debates by advocating for a participatory approach to engage local CSO leaders in co-creating capacity building programs that address the actual needs of local CSOs in non-Western contexts, and utilize action research to do so.

This research note is organized as follows. First, we discuss the roles of civil society in facilitating democracy and development, and we outline existing capacity gaps that prevent civil society from effectively carrying out these roles. Next, we develop a typology of capacity building. We then highlight capacity building challenges, including how prioritization of donors’ objectives undermines the ability of grassroots CSOs to foster development and democratization. Further, the extant literature on capacity building in non-Western contexts, and NME as one specific capacity building strategy, shows that the disconnect between Western practices and local needs limits these programs. To address these challenges, the methodological section introduces action research as an approach to incorporate multiple stakeholders in the development of capacity building programs. We apply the approach to the Liberian case, as extreme poverty, lack of formal capacity building and NME, and donor-controlled capacity building programs characterize Liberian civil society.

2 Civil Society and Sustainable Development

In Africa, CSOs fulfill important roles in advancing democratization and poverty alleviation, combining advocacy and service provision ( Lewis 2014 ). In terms of advocacy and democracy promotion, CSOs contribute to fighting dictatorships, advocating for peace, demanding accountability in governance, and participating in relief and rehabilitation activities ( Teshome-Bahiru 2009 ; Yeshanew 2012 ). At the same time, CSOs also engage in service delivery and other development programs that reflect the needs of local communities ( Krawczyk 2018 ). They perform these service provision roles more effectively because of stronger grassroots linkages ( Banks and Hulme 2012 ). The roles CSOs play in different African countries vary based on the political and institutional constraints inherent in each country. In Ethiopia, for example, CSOs are more focused on service delivery due to government restrictions placed on rights-advocacy CSOs ( Yeshanew 2012 ).

Yet, evidence indicates that low credibility and legitimacy, limited resources, and lack of organizational capacity constrain the important roles CSOs play in facilitating democracy and development in Africa ( Chaplowe and Engo-Tjega 2007 ; Hayman 2016 ). Chaplowe and Engo-Tjega (2007) , for example, find that insufficient human and organizational capacity is, in the African context, “a major constraint on CSO performance, impacting strategic planning, in-house and external training, monitoring and evaluation, and research and dissemination” (p. 262). Likewise, Ekirapa et al. (2012) found that most of the 952 CSOs surveyed in Nairobi, Kenya, lacked the capacity to effectively deliver services that would have a demonstrable impact. These constraints affect the performance of CSOs and undermine their effectiveness as agents of development.

International donors increasingly invest in the capacity of civil society in Africa because of these documented gaps, and scholars and development agencies link capacity building to achieving development goals. Civil society is identified as one of the key areas of capacity deficiency ( Hope 2011 ), and donors invest in the hope that a better-equipped civil society will contribute to sustainable development outcomes ( Bryan et al. 2016 ; Walker 2016 ). In the next section, we provide a typology of capacity building, and outline various capacity building strategies for civil society in Africa, including NME.

2.1 Conceptualizing Civil Society Capacity Building Strategies

Capacity building is defined as “the competency of individuals, public sector institutions, private sector entities, civil society organisations and local communities to engage in activities in a sustainable manner for positive development impacts such as poverty reduction, improvements in governance quality or meeting the MDGs” ( Hope 2011 , p. 60). Scholars differentiate between three capacity building levels: individual, organizational, and institutional ( Hope 2011 ; Kacou, Ika, and Munro 2022 ). At the individual level, the emphasis is on technical and analytical abilities, skills, competencies, and knowledge. At the organizational level, capacity building is accomplished by strengthening processes and policies that address groups, teams, or units. Institutional capacity refers to the ability of individuals, organizations, communities, states, and societies to address collective problems and create long-term benefits for citizens ( Kacou, Ika, and Munro 2022 , p. 222). These three levels are deeply intertwined: organizational capacity builds on the capacity of individuals, and institutional capacity supports (or the lack of it hinders) the capacity of the other two levels ( Balboa 2014 ; Brinkerhoff 2010 ; Kacou, Ika, and Munro 2022 ). Scholars thus emphasize capacity building’s systemic nature with a set of capacity targets (resources; skills and knowledge; organization; politics and power; incentives) that can be distinguished relative to each of these three levels, while also overlapping ( Brinkerhoff 2010 ; Brinkerhoff and Morgan 2010 ).

Balboa (2014) proposes a typology of capacity building that is relevant to this research note, as it addresses CSOs that are active at the intersection of local, regional, and global spheres. She distinguishes between three overlapping capacity categories: (1) political (politics as contestation of ideas), (2) technical (ability to access information and work toward the mission), and (3) administrative (internal managerial skills). Additionally, by mapping these three categories against three spheres of influence (local, national, and global), Balboa (2014) highlights “bridging capacity” as a fourth category, which is the capacity to negotiate the tensions that emerge from operating across the three spheres. Balboa’s typology helps differentiate between the different approaches to capacity development at the individual and organizational level.

Institutionalized, credit-based programs ( top right quadrant ): These programs are typically university-based and align with what the literature identifies as nonprofit management education. These curricula typically emphasize what Balboa (2014) categorizes as political and administrative capacity, and their higher-level standardization favors capacities that apply across a range of fields rather than technical, field-specific capacity. These programs distinguish between outside function (e.g. developing resources and marketing in relation to external stakeholders), boundary spanning (functions that bridge internal and external management), and inside function (internal management skills) ( Mirabella and Wish 2000 ). A good example of an institutionalized, credit-based program is the Centre on African Philanthropy & Social Impact at the University of the Witwatersrand (Johannesburg, South Africa), which emphasizes nonprofit management while highlighting pan-African philanthropic traditions.

Institutionalized, non-credit-based programs ( bottom right quadrant ): These programs are implemented via university-based outreach (not university academic programs), or by organizations such as public agencies. Such programs are more than one-off trainings, but less than full degree programs, typically emphasizing technical capacity and, to a lesser degree, administrative capacity (see Balboa 2014 ). An example is the Liberia Institute of Public Administration (LIPA), a quasi-public agency that provides capacity building for the public, private, and civil society sectors in topic areas such as public procurement and financial management.

Ad hoc, non-credit-based programs ( bottom left quadrant ): These capacity building programs are one-off trainings that international funding agencies typically offer as part of grant programs or through local umbrella CSOs. They are the most common programs in the development context. They focus on administrative capacity, as they aim to strengthen grantees’ internal managerial practices, and on political capacity through advocacy training (see Balboa 2014 ). The European Union Agents for Citizen-Driven Transformation Programme ( https://www.justice-security.ng/agents-citizen-driven-transformation-act-august-2019-january-2020 ) is a good example as it strengthens the capacity of EU direct and indirect non-profit grantees to improve their institutional mechanisms and programmatic competence in selected states across Nigeria.

Ad hoc, credit-based programs ( top left quadrant ): These programs are one-off trainings that can be offered by both domestic and international providers. They are not part of ongoing funding programs, but participants receive certification for completing the program (a certificate of completion rather than a degree). These programs can be either field specific, emphasizing technical capacity, or more broadly conceived around administrative capacity. Good examples include the Cloneshouse Nigeria Result-based M&E Training ( https://www.cloneshouse.com/ ) and the Tom Associates Nigeria ( https://www.tomassociatesng.com/ ) training across a range of administrative and technical capacity areas.
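Purely as an illustration (not part of Balboa's published framework), the typology above reduces to two axes: whether a program is institutionalized or ad hoc, and whether it is credit-based or not. A minimal sketch, with labels and example names taken from the four quadrant descriptions above:

```python
from dataclasses import dataclass

# Illustrative model of the two-axis typology of capacity building programs.
# The field names and quadrant labels are our own shorthand for the four
# program types described in the text.

@dataclass
class Program:
    name: str
    institutionalized: bool  # stable, provider-embedded vs. one-off training
    credit_based: bool       # carries academic credit/certification vs. none

def quadrant(p: Program) -> str:
    """Map a program onto one of the four quadrants of Figure 1."""
    if p.institutionalized and p.credit_based:
        return "institutionalized, credit-based (e.g. university NME degrees)"
    if p.institutionalized:
        return "institutionalized, non-credit-based (e.g. LIPA-style outreach)"
    if p.credit_based:
        return "ad hoc, credit-based (e.g. certified one-off trainings)"
    return "ad hoc, non-credit-based (e.g. donor grant-program trainings)"

nme = Program("University NME degree", institutionalized=True, credit_based=True)
print(quadrant(nme))  # → institutionalized, credit-based (e.g. university NME degrees)
```

The sketch makes the text's later point concrete: most programs in the development context fall in the bottom-left quadrant (both flags false), while the NME programs discussed below occupy the top right.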

Figure 1: A conceptual map of capacity building in a development context.

While capacity building’s evolution can be traced back to the 1950s–1960s, and the need to build the capacity of newly independent countries ( Kacou, Ika, and Munro 2022 ), NME (top right quadrant in Figure 1 ) as one strategy for building the capacity of civil society is a more recent development, particularly in the international context. NME has received increased attention from scholars over the past two decades because of a growing educational field and the devolution of social services ( Weber 2022 ). At the same time, the global associational revolution ( Salamon 1994 ) and the role of nongovernmental organizations in supporting the SDGs have increased interest in NME as a capacity building strategy beyond the Western world, to strengthen organizations and improve the impact of foreign aid ( Kacou, Ika, and Munro 2022 ). Yet, research on capacity building and NME in an international context operates in silos, and most research on NME focuses on credit-based, institutionalized, university-based offerings, failing to capture the complexity of a field that also includes ad hoc and non-credit-based offerings.

While non-credit-based education programs are often perceived as lower quality and less effective ( Gartner 2021 ; Lee 2002 ), non-credit-based offerings are particularly important in the international context, and especially in Africa, where credit-based, university NME is the exception rather than the norm. These non-credit-based programs, with a strong emphasis on practice, dominate capacity building because of the relatively limited offering of university-based curricula on nonprofit management ( Mirabella et al. 2007 ).

Despite the overall emphasis on capacity building and the variety of strategies adopted, most capacity building programs for CSOs fail to produce their intended benefits ( Ika and Donnelly 2017 , 2019 ). In the next section, we discuss the reasons for this lack of impact, and introduce our process for surmounting these challenges.

2.2 Capacity Building Challenges in Africa

Over the past several decades, donors have made significant investments in building civil society capacity in Africa, in the hopes that the sector can contribute to sustainable development outcomes. Yet, these attempts at capacity building have largely failed, and in extreme cases took the form of transforming, and even undermining, local knowledge in favor of Western rationality ( Kacou, Ika, and Munro 2022 , p. 221). For example, donor-driven capacity building programs typically employ a supply-led model, or “intentional development” ( Bebbington 2004 ), which is a one-way relationship with aid being channeled to programs that have specific goals set by donors. This contrasts with a demand-led model where CSOs have the autonomy to develop and implement programs best-suited to solve community challenges ( Edwards, Hulme, and Wallace 1999 ). The proliferation of a supply-led approach leads to: CSOs aligning their goals with donor interests; mission drift on the part of CSOs as they respond to donor priorities versus local needs; upwards accountability to funders as CSOs focus on the objectives of those who control resources; and lack of effectiveness as CSOs function merely as “subcontractors” that deliver programs for aid agencies ( AbouAssi 2013 ; Ebrahim 2016 ), which stymies the growth of locally embedded CSOs able to contribute to systemic social and political change ( Ebrahim 2016 ; Krawczyk 2018 , 2021 ).

Reliance on Western practices means capacity building programs in Africa reflect some of the common critiques associated with such models. For example, CSOs have become increasingly more commercialized and professionalized, due to neo-liberal discourses and increased competition for funding ( Eikenberry 2009 ; Mirabella and Nguyen 2019 ), which results in a shift of emphasis from democratic values towards business-like approaches ( Mirabella and Nguyen 2019 ). Indeed, Western managerial culture ( Jordan Smith 2003 ; Roberts, Jones, and Fröhling 2005 ) and a disconnect between the capacity of INGOs and local grassroots organizations ( Appe and Schnable 2019 ; Eade 2007 ) characterize current capacity building practices. To address these issues, scholars suggest a “counter-discourse” that includes more space for participation and collective problem-solving ( Mirabella and Nguyen 2019 ; Eikenberry 2009 ).

Scholars highlight capacity building deficiencies, pointing to the need for endogenous rather than donor-driven capacity building programs, while simultaneously questioning the imposition of best practices and emphasizing fit with local contexts ( Kacou, Ika, and Munro 2022 ). At a more profound level, scholars have questioned capacity building at its roots, its effectiveness, and its impact on local knowledge and ideas, both internationally and in the United States (e.g. EchoHawk 2019 ; Kacou, Ika, and Munro 2022 ; Littles 2022 ). This more radical critique thus highlights how capacity building is not a neutral concept, but rather a top-down approach emphasizing the expertise of the “builder” and undermining local, endogenous knowledge ( Eade 2007 ; Kacou, Ika, and Munro 2022 ; Littles 2022 ).

Reflecting broader critiques of capacity building, scholarship increasingly highlights the need to rethink the role of higher education in the African context. Indeed, higher education in Africa historically aimed to train African civil servants serving colonial interests ( Abrokwaa 2017 , p. 201). The colonial roots of African institutions of higher education suggest these institutions are rooted in Western practices with curricula that reflect foreign knowledge and values. Furthermore, Western education continues to promote and reinforce neocolonialism in Africa ( Mawere and Awuah-Nyamekye 2015 ; Shizha and Makuvaza 2017 ). Likewise, scholars caution against applying Western-based NME models to the Global South as they are not locally owned ( Mirabella et al. 2007 ; Mirabella, Hvenmark, and Larsson 2015 ). Indeed, with few exceptions, NME in Africa, as a credit-based capacity building program, is based on white American and Eurocentric values that downplay the histories of people of color ( Feit and Sandberg 2022 ). A “critical pedagogy” approach to NME in higher education should thus aim to facilitate participation from all stakeholders (teachers, students, community members, marginalized groups), allowing them to engage in reflection and discourse that acknowledges how societal problems are the products of historical, social, and political contexts ( Eikenberry 2009 ; Mirabella and Nguyen 2019 ).

Overall, scholarship encourages more critical approaches toward capacity building programs in general ( Kacou, Ika, and Munro 2022 ) and institutionalized, university-based education in particular ( Mirabella et al. 2015 , 2019 ). Indeed, university-based programs do not always reflect the specific needs of practitioners ( Appe 2015 ). We show that nowhere is the disconnect more pronounced than in the international arena, where capacity building programs, including NME, mirror donor approaches prioritizing doing for and doing to rather than doing with. Given the challenges discussed in this section, and following the recommendations of Hope (2011) , this research note draws on action research and proposes it as a methodological approach that facilitates bottom-up, participatory strategies to develop capacity-building programs that engage local CSO leaders in shaping need-driven curricula.

3 Action Research for Capacity Building

Nonprofit studies is interdisciplinary in nature ( Ma and Konrath 2018 ). Yet, as Kim and Raggo (2022) argue, there is a need for greater diversity in approaches, research designs, and methodologies, especially those that are inclusive and participatory, engaging local people and community organizations to give them voice and agency in development efforts. Using the field’s interdisciplinary roots as a source of methodological and conceptual innovation, we draw on action research as an approach to facilitate capacity building for civil society, and to overcome the challenges associated with capacity building discussed in the previous section. Action research has a rich tradition of being applied to different disciplines and contexts to give voice to non-dominant stakeholders and to address social issues in diverse social-cultural environments ( Mertler 2019 ).

Action research is an action-oriented, systemic approach and reflective practice that enables people to find effective solutions to problems in their everyday lives ( Stringer and Aragón 2020 ). Action research bridges research and practice and provides a model for enacting local, action-oriented approaches to inquiry, applying small scale theorizing to problems in specific situations ( Denzin and Lincoln 2018 ). It addresses the recent critiques of capacity building and facilitates tenets of critical pedagogy in that it emphasizes social justice, endogenous efforts, local knowledge, and participation of all stakeholders. It is less about generalization because “people” are not being studied; rather, the focus centers on social issues and problems that impact people. Ideally, action research engages participants as co-researchers and facilitates an equitable exchange of knowledge that empowers participants and builds a body of knowledge so as to enhance communities of practice by facilitating new processes and actions. Yet, as several scholars have noted, this collaborative, iterative process is inherently complex ( Herr and Anderson 2005 ; Hyma and Sen 2022 ; Tuck 2009 ).

Three iterative and heuristic processes, Look-Think-Act , characterize action research ( Stringer and Aragón 2020 ). These three stages aim to assist research participants in maintaining focus during the participative inquiry process. The cycle is deliberately iterative: it may change direction in major or minor ways based on understandings that emerge along the way. Looking is the initial phase in the action research process. It allows researchers to step back from their default approach to search for novel ways of functioning that enable them to resolve the problematic issues that inhibit their ability to accomplish community, organizational, and professional goals. This stage of investigation does not seek solutions to issues or problems. The focus here is to determine the inquiry’s direction, identify participants, and generate and gather data.

In the Thinking phase, research participants interpret the problematic issues identified during the Looking phase. Reflection and analysis expose the concepts and everyday theories that practitioner experts use to describe or explain their lived experiences and actions. The research task is to assist participants in revealing their "theories in use" and reformulating them into constructions that are improved, matured, expanded, and elaborated (Argyris and Schön 1996). These new ways of interpreting situations are then used to help participants shape actions and behaviors in ways that improve practice and empower all stakeholders.

In the third phase, Acting, research participants use the data collected during the Looking and Thinking phases to develop actions that facilitate desired change. Effective solutions are identified as action plans are developed and implemented into new operations that improve professional practice (Mertler 2019). At the same time, however, scholars emphasize that Acting should not be confined to the final stage of a project but should be interwoven through all research stages: for instance, data collection through focus groups may aim not only to collect data but also to help participants recognize "prior disempowering encounters, to collaboratively theorize the dysfunction, and to imagine solutions and reparation" (Tuck 2009, p. 53).

In the next section, the country case study of Liberia serves as an example of how the Looking and Thinking phases of action research were used to examine local level CSOs’ perceptions of capacity building efforts and to identify preferred actions and strategies for mitigating the shortcomings of existing capacity building efforts. The action research methodology allowed the team to engage staff and administrators from local CSOs as “experts,” giving them voice and agency in defining capacity development efforts.

4 Country Case Study: Liberia

We use Liberia as a country case study to outline how we adopted an action research approach that engaged CSO leaders to advocate for a bottom-up rather than top-down capacity-building strategy. We first discuss why Liberia was selected as a case and then show how we used our approach to engage with local CSO leaders.

Liberia is a good case study for two main reasons. First, Liberian civil society exhibits the endemic challenges associated with civil society in developing countries discussed in the preceding sections: the sector relies almost exclusively on international donors for financial resources, leading to mission drift and upwards accountability (Krawczyk 2018). CSOs often depend on a single donor for intermittent, project-based funds. Liberian CSOs with funding from more than one donor tend to receive it sequentially: one project finishes, and the next is funded by a different donor. McKeown and Mulbah (2007) argue that this funding pattern implies organizations are "finding money wherever they can" without being strategic, and suggests that organizations cannot access funding twice from the same donor, perhaps due to a lack of capacity and/or an inability to produce quality outputs. This challenge, coupled with the sector's low capacity, results in a supply-side relationship between Liberian civil society and donors, in which Liberian CSOs simply implement projects funded by international donors (Krawczyk 2018).

Furthermore, and this is the second criterion for selecting Liberia as a case, Liberia's small civil society relies heavily on international donors for capacity building programming that utilizes Western-based models and promotes donor priorities. Liberian civil society is resource-poor and suffers from a severe lack of human, financial, technological, and infrastructural capacity. From a human resource standpoint, most Liberian CSOs operate without permanent staff (Krawczyk 2021). Project-based donor funding allows organizations to hire staff only for the duration of funded projects, and staff salaries may be paid irregularly or not at all during periods without grants. Efforts to increase the capacity of the third sector are implemented mostly via international donor programs, which are Western-centric and reflect donor priorities (McKeown and Mulbah 2007).

4.1 Methodology Applied

We outline how an action research approach engages CSO leaders as co-researchers in developing capacity building programs. Involving CSO leaders acknowledges that the people who actively engage on the ground have deep experience and understanding of their own situation, can and should be seen as "experts," and should be directly involved in addressing the challenges that affect their day-to-day lives and/or the lives of the people they work for or with. In this section, we describe how we applied action research to the Liberian case. We align our discussion with the three iterative, heuristic processes associated with action research: Look, Think, and Act. Because data collection and analysis are already embedded in the action research process (Look: gathering data; Think: reflecting on and analyzing data) (Stringer and Aragón 2020), we embed our discussion of the methodology used to collect and analyze data into the Look-Think-Act phases of action research.

4.1.1 Look

The Looking phase aims to challenge and question default approaches. We partnered with local organizations to counter the implicit biases we would bring to the process, as we recognize that we are embedded in our own racial and national identities, disciplinary fields, and institutions. The individual racial and national identities of the authors are: Black African (Nigerian) male, Caucasian American female, African American female, and Caucasian European (German) male. In our specific case, we had, given our disciplinary backgrounds, research interests, and institutional homes, an affinity for NME as a specific approach to capacity building. Partnering with the African Methodist Episcopal University (AMEU, Monrovia, Liberia) and one Liberian civil society organization, Hope Alliance Liberia (HAL, Johnsonville, Montserrado, Liberia), offered critical insights in terms of authentically recognizing local capacities and needs. AMEU supervised communication with and mobilization of focus group participants, while HAL connected the project with grassroots organizations, communicating with and mobilizing CSO participants for the focus groups. All research partners participated in the iterative research process. Further, our approach involved civil society leaders (beyond our official partners) in focus groups, and thus as "co-researchers," to identify experiences and needs around capacity building from the perspective of end users. The Looking phase thus questioned traditional capacity building strategies in the development context, identified problematic issues, and directed our inquiry.

We used two recognized methods to generate data for the Look phase of our action research project: desk review and focus groups (Stringer and Aragón 2020). First, through a two-pronged desk review, we verified the insights from the literature in the Liberian context. We analyzed international donor projects focused on CSO capacity building on the Liberia Project Dashboard (LPD) ( https://liberiaprojects.org/ ). After reviewing 940 projects listed on the LPD, we identified 17 projects directly focused on CSO capacity building. Additionally, a review of capacity building projects by other providers (e.g. government-funded programs, institutions of higher education, and CSOs) led to two major additions: the Liberia Institute of Public Administration (LIPA) and the New African Research and Development Agency (NARDA). [2] This two-pronged desk review of capacity-building programs confirmed the insight gathered from the extant literature that most capacity building programs for Liberian civil society are designed and implemented by international donors and reflect donor priorities (e.g. civic education and elections, gender and youth, and media and peacebuilding).

Second, through focus groups, we identified the capacity building needs of Liberian CSOs and engaged in a bi-directional exchange with CSO leaders over capacity building needs for sustainable development. We conducted three in-person focus groups with local CSO leaders from three different counties in Liberia (Montserrado, Nimba, and Bong) to ensure representation of both urban and rural CSOs. The focus groups took place at AMEU in June 2022. We selected focus group participants through a purposive sample based on the expertise of the research facilitators and co-researchers, drawing on the publicly available Liberia Revenue Authority (LRA) 2017-18 CSO Registration List of registered CSOs in Monrovia, Liberia. We sent recruitment letters to 45 CSOs selected from this list (fifteen from each county); 22 CSO leaders participated, representing 15 different CSOs. Focus group participants worked in CSOs engaged in multiple sectors, including education, agriculture, transparency and accountability, women's empowerment, health, and drug prevention.

4.1.2 Think

Capacity building needs: Participants discussed capacity building needs at two levels. At a basic level, capacity building needs related to managerial expertise (administrative capacity). CSO leaders identified areas such as resource development, grant writing, leadership, governance, volunteer management and mobilization, financial management, and HR as core components of an effective capacity building program in Liberia. At a second level, participants identified technical needs related to the specific sector they worked in, e.g. the need for agricultural training (technical capacity).

Capacity building availability: Participants reported capacity building programs in leadership, network building, advocacy and community mobilization, data collection and assessment, and financial management. Capacity building was delivered mostly by international donors such as the United Nations and GIZ (Deutsche Gesellschaft für Internationale Zusammenarbeit) and INGOs like UNICEF, with limited offerings by Liberian NGOs and through online (international) delivery methods.

Capacity building access: Participants, particularly those from CSOs located in the more rural counties, identified obstacles in accessing global networks (paradoxically including international donors' capacity building programs) and regional/government networks.

Institutionalizing capacity building within higher education: Participants favored institutionalized NME as a capacity building strategy to systematize the various disconnected capacity-building offerings under one umbrella and to offer greater access to capacity building.

Participants identified three obstacles to access. First, the costs associated with capacity building workshops and trainings make attendance prohibitive.

Second, lack of information on available trainings makes them difficult to identify.

Third, and relatedly, even if members of civil society can identify them and afford to participate, capacity building trainings are often “closed” to broader civil society and instead, only select CSOs, pre-determined by development organizations, are invited to participate.

In sum, four insights emerged from the Think phase. Core components of effective capacity building include both broad managerial areas and issue/field specific technical assistance.

While capacity building programs are primarily offered by international providers, some evidence exists of a genuinely local capacity building field.

The major challenge related to capacity building appears to be more accessibility than availability.

Institutionalizing capacity building with Liberian higher education institutions emerged as a strategy to increase access to capacity building.

4.1.3 Act

This phase identifies actions that improve professional practice, knowledge, and effectiveness. However, as discussed in the methods section, scholars suggest that action should not be limited to the final phase but should be interwoven through all phases (Tuck 2009). The focus groups served to encourage participants to reflect on the unequal power relationships characterizing donor-funded capacity building programs. Several participants, particularly from more rural areas, noted the exclusive nature of most capacity building programs, where access appeared to be tied to existing networks centering on Monrovia, with international development agencies "inviting" specific organizations to participate in capacity building programs.

While focus groups helped CSO leaders to explicitly reflect on these exclusionary practices, they also revealed internalized perspectives around “best practice” in capacity building. CSO leaders discussed how certain fields and sectors, in particular civil society and social work, were new in Liberia. They linked this novelty to a “real need for capacity building and support.” Participants described this need as a need to “catch up,” explicitly looking for outside models rather than local approaches.

Informed by the Look and Think phases (the initial phases are not yet complete, as we will conduct additional focus groups), and in collaboration with our partners at AMEU, we aim to develop recommendations for how best to meet the capacity building needs of Liberian CSOs and to identify steps to align future capacity building with local needs. We will also co-develop a nonprofit management education capacity building curriculum with AMEU and CSO representatives (co-researchers recruited from focus group participants), designed to meet the needs expressed by Liberian CSO leaders during our study.

5 Discussion

This research note proposes action research as a methodology to develop capacity building programs, including NME, that are contextually appropriate. This approach is well positioned to address the growing critiques of traditional capacity building and NME. In both theory and practice, action research engages local communities in the iterative process that is the essence of this research approach. In so doing, CSO leaders, in this case, are recognized as experts and become co-researchers who co-develop and shape capacity building programs to reflect their specific needs. This is a crucial point, as the growing critique of traditional capacity building programs emphasizes power asymmetries and how these programs negatively impact local knowledge. We use the example of Liberia for illustrative purposes to highlight the benefits of action research. In this section, we therefore discuss some surprising findings that emerged from the Liberia case and that illustrate the benefits of our approach.

The desk review of capacity building programs available in Liberia and the CSO leaders' perspectives that emerged during the focus groups confirm some of the critiques in the literature but are also surprising in other ways. The findings suggest a disconnect between available capacity building programs and the desires of CSO leaders, confirming previous findings (e.g. Appe 2015) and critiques (Kacou, Ika, and Munro 2022). In the eyes of CSO leaders, the disconnect primarily concerns technical capacity and access to administrative capacity building. Most donor-funded capacity building trainings emphasize what Balboa (2014) categorizes as administrative capacity, in efforts to strengthen the internal managerial capacity of organizations, and political capacity in the form of advocacy training. However, access to these programs is limited and was identified as a major challenge. By contrast, CSO leaders identified administrative and technical capacity as core capacity building components.

What is surprising is that CSO leaders suggest a preference for capacity building programs prioritizing what Mirabella and Wish (2000) referred to as the "inside function" in NME. The more recent critical pedagogy in the field of NME has criticized the emphasis on the managerial functions of nonprofits (Mirabella and Nguyen 2019; for a similar critique of capacity building in an international context see Jordan Smith 2003; Roberts, Jones, and Fröhling 2005), viewing it as part of the broader and worrisome commercialism trend (Eikenberry 2009), with nonprofits becoming more business-like in both rhetoric and practice (Dart 2004). Paradoxically, our desk review of current capacity building trainings in Liberia shows that they emphasize network building, advocacy, and community organizing – areas that have been found lacking in current NME curricula (Mirabella et al. 2015, 2019). This paradox raises two important questions. First, what do Western INGOs know about capacity building that Western NME higher education institutions fail to consider? [3] And, second, what explains CSO leaders' preference for administrative and technical capacity building?

One possible set of answers directly emerges from our action research approach, specifically from our effort to embed action in all phases of the research process (see Tuck 2009 ). Focus group participants reflected on how certain fields are nascent in Liberia. This discussion led to a group perception that the newness of these fields in the country meant CSO leaders had to think about how to “catch up.” This perception of having to catch up, coupled with a preference for administrative capacity building (inside function), may suggest an internalized prioritization of outside models over local knowledge in what is a constant search for legitimacy in relation to international funders. [4]

At the same time, international donors' emphasis on political capacity – in stark contrast to NME curricula – may reflect contextual factors and assumptions rather than strategic insight. The discourse in scholarship and practice over the dual role of CSOs in both advocacy and service provision has traditionally been more pronounced in the international arena than in Western contexts, to the point that organizations expose themselves to criticism when conceiving of service delivery as an end goal rather than as part of a strategy for advocacy (Lewis 2014, pp. 166–167). Development agencies and donors have therefore consciously embedded democracy promotion and political capacity building in more traditional service delivery programs (e.g. BMZ 2013; Herrold 2020), frequently as a strategy to avoid the criticism intrinsic to explicit "democracy promotion" efforts (Carothers and Brechenmacher 2014). This emphasis on advocacy, even in the context of more traditional service delivery funding, may explain the attention of Western INGOs to advocacy.

The intertwining of what Balboa (2014) refers to as multiple spheres of influence (local, national, and global) then requires a difficult balancing act from CSOs, as they must navigate the pulls and pushes of the various levels to work effectively across them. Balboa (2014) defined bridging capacity as the ability to be embedded in local networks to effectively provide services while at the same time successfully connecting with, engaging with, and taking advantage of national and/or global networks and resources. While CSO leaders mentioned that they cooperated successfully with governmental agencies while working with local communities, CSOs in the more rural counties struggled to access global networks (paradoxically including international donors' capacity building programs) and regional/government networks.

6 Conclusions

The participatory nature of our action research process engaged local CSO leaders and gave them the agency to guide subsequent direct action steps, in which these leaders participate as co-researchers. The Liberian case demonstrates the importance of endogenous rather than donor-driven capacity building programming, and thus aligns with the "critical pedagogy" approach to NME in higher education (Eikenberry 2009; Mirabella and Nguyen 2019). The research note thus illustrates how an action research methodology mitigates some of the critiques of traditional capacity building programs. It promotes a place-based capacity building program employing a demand-led model in which CSOs recommend priorities and design programs that best meet local needs.

As next steps, we aim to strengthen our findings with additional focus groups. Research with our co-researchers will explore preferences for delivery formats (in person, distance learning, hybrid) to ensure inclusive programming. As a needs assessment for the Chair in African Philanthropy at the University of the Witwatersrand (Johannesburg, South Africa), home to one of the few nonprofit education programs deeply rooted in the local philanthropic culture, stated: "The approach [to establish an academic program on philanthropy] has to be home grown, not as knee-jerk reaction to external dynamics, but as a well-considered and grounded body of understanding that stands on its own feet, so to speak" (Fowler 2017, p. 1).

Acknowledgments

The authors wish to thank the following Liberian civil society organizations for participating in the focus groups and sharing their perspectives, stories, and experiences: Save Life Liberia Inc., Hope Alliance Liberia, Hope Alliance Academy, Oasis for Hope-Liberia, Libpedie, Action to Restore Community Health, More Than Me Academy, EQUIP Leadership Liberia, Accountability Lab Liberia, Center for Transparency and Accountability in Liberia, Special Emergency Activities to Restore Children's Hope, Public Health Initiative Liberia, Center for Media Studies and Peacebuilding, Liberia Research and Development Network, and Girls Education Liberia. We would also like to thank the editors and the anonymous reviewers for their constructive comments on earlier drafts. Lastly, we thank Lucky Chambers Umezulike for his assistance in the final drafting of the manuscript.

AbouAssi, K. 2013. "Hands in the Pockets of Mercurial Donors: NGO Response to Shifting Funding Priorities." Nonprofit and Voluntary Sector Quarterly 42 (3): 584–602. https://doi.org/10.1177/0899764012439629.

Abrokwaa, C. 2017. "Colonialism and the Development of Higher Education." In Re-thinking Postcolonial Education in Sub-Saharan Africa in the 21st Century, edited by E. Shizha, and N. Makuvaza, 201–20. Rotterdam: Sense Publishers. https://doi.org/10.1007/978-94-6300-962-1_12.

Appe, S. 2015. "Is NGO Education Matching up to the Demand?" Journal of Nonprofit Education and Leadership 5 (4): 244–60. https://doi.org/10.18666/JNEL-2015-V5-I4-7029.

Appe, S., and A. Schnable. 2019. "Don't Reinvent the Wheel: Possibilities for and Limits to Building Capacity of Grassroots International NGOs." Third World Quarterly 40 (10): 1832–49. https://doi.org/10.1080/01436597.2019.1636226.

Argyris, C., and D. Schön. 1996. Organizational Learning II: Theory, Method, and Practice. Reading, MA: Addison-Wesley.

Balboa, C. 2014. "How Successful Transnational Non-governmental Organizations Set Themselves up for Failure on the Ground." World Development 54: 273–87. https://doi.org/10.1016/j.worlddev.2013.09.001.

Banks, N., and D. Hulme. 2012. "The Role of NGOs and Civil Society in Development and Poverty Reduction." Brooks World Poverty Institute Working Paper (171). https://doi.org/10.2139/ssrn.2072157.

Bebbington, A. 2004. "NGOs and Uneven Development: Geographies of Development Intervention." Progress in Human Geography 28 (6): 725–45. https://doi.org/10.1191/0309132504ph516oa.

BMZ. 2013. Mitmachen, Mitwirken und Mitgestalten. Strategien zur Zusammenarbeit mit der Zivilgesellschaft in der deutschen Entwicklungspolitik. Berlin: BMZ.

Brinkerhoff, D. 2010. "Developing Capacity in Fragile States." Public Administration and Development 30 (1): 66–78. https://doi.org/10.1002/pad.545.

Brinkerhoff, D., and P. Morgan. 2010. "Capacity and Capacity Development: Coping with Complexity." Public Administration and Development 30 (1): 2–10. https://doi.org/10.1002/pad.559.

Bryan, E., Q. Bernier, M. Espinal, and C. Ringler. 2016. Integrating Gender into Climate Change Adaptation Programs: A Research and Capacity Needs Assessment for Sub-Saharan Africa. CCAFS Working Paper.

Carothers, T., and S. Brechenmacher. 2014. Closing Space: Democracy and Human Rights Support under Fire. Washington: CEIP.

Chaplowe, S., and R. Engo-Tjega. 2007. "Civil Society Organizations and Evaluation: Lessons from Africa." Evaluation 13 (2): 257–74. https://doi.org/10.1177/1356389007075227.

Dart, R. 2004. "Being "Business-like" in a Nonprofit Organization: A Grounded and Inductive Typology." Nonprofit and Voluntary Sector Quarterly 33 (2): 290–310. https://doi.org/10.1177/0899764004263522.

Denzin, N., and Y. Lincoln. 2018. The SAGE Handbook of Qualitative Research, 5th ed. Los Angeles, CA: Sage.

Eade, D. 2007. "Capacity Building: Who Builds Whose Capacity?" Development in Practice 17 (4–5): 630–9. https://doi.org/10.1080/09614520701469807.

Ebrahim, A. 2016. "The Many Faces of Nonprofit Accountability." In The Jossey-Bass Handbook of Nonprofit Leadership and Management, edited by D. Renz, and R. Herman, 102–23. Wiley. https://doi.org/10.1002/9781119176558.ch4.

EchoHawk, S. 2019. "Unpacking Capacity Building." Nonprofit Quarterly 28. https://nonprofitquarterly.org/unpacking-capacity-building/.

Edwards, M., D. Hulme, and T. Wallace. 1999. "NGOs in a Global Future: Marrying Local Delivery to Worldwide Leverage." Public Administration and Development 19 (2): 117–36. https://doi.org/10.1002/(sici)1099-162x(199905)19:2<117::aid-pad70>3.0.co;2-s.

Eikenberry, A. 2009. "Refusing the Market: A Democratic Discourse for Voluntary and Nonprofit Organizations." Nonprofit and Voluntary Sector Quarterly 38 (4): 582–96. https://doi.org/10.1177/0899764009333686.

Ekirapa, A., G. Mgomella, and C. Kyobutungi. 2012. "Civil Society Organizations: Capacity to Address the Needs of the Urban Poor in Nairobi." Journal of Public Health Policy 33 (4): 404–22. https://doi.org/10.1057/jphp.2012.33.

Feit, M., and B. Sandberg. 2022. "The Dissonance of "Doing Good": Fostering Critical Pedagogy to Challenge the Selective Tradition of Nonprofit Management Education." Public Integrity 24 (4–5): 486–503. https://doi.org/10.1080/10999922.2022.2034341.

Fowler, A. 2017. Profile of the Chair in African Philanthropy at the Wits Business School. A Platform for Practical Progress. Johannesburg: The University of the Witwatersrand.

Gartner, M. 2021. "Non-credit Nonprofit Management Education: Beyond Mapping and towards Critical Qualitative Inquiry." Canadian Journal of Nonprofit and Social Economy Research 12 (1): 58–68. https://doi.org/10.29173/cjnser.2021v12n1a364.

GEO. 2021. Reimagining Capacity Building: Navigating Culture, Systems & Power. Available at https://www.geofunders.org/resources/reimaginingcapacity-building-navigating-culture-systems-power-1340.

Hayman, R. 2016. "Unpacking Civil Society Sustainability: Looking Back, Broader, Deeper, Forward." Development in Practice 26 (5): 670–80. https://doi.org/10.1080/09614524.2016.1191439.

Herr, K., and G. Anderson. 2005. The Action Research Dissertation: A Guide for Students and Faculty. Los Angeles, CA: SAGE. https://doi.org/10.4135/9781452226644.

Herrold, C. 2020. Delta Democracy: Pathways to Incremental Civic Revolution in Egypt and beyond. New York, NY: Oxford University Press. https://doi.org/10.1093/oso/9780190093235.001.0001.

Hope, K. 2011. "Investing in Capacity Development: Towards an Implementation Framework." Policy Studies 32 (1): 59–72. https://doi.org/10.1080/01442872.2010.529273.

Hyma, R., and L. Sen. 2022. "Inquiry as Practice: Building Relationships through Listening in Participatory Action Peace Research." Peace Review 34 (3): 343–51. https://doi.org/10.1080/10402659.2022.2092396.

Ika, L., and J. Donnelly. 2017. "Success Conditions for International Development Capacity Building Projects." International Journal of Project Management 35 (1): 44–63. https://doi.org/10.1016/j.ijproman.2016.10.005.

Ika, L., and J. Donnelly. 2019. "Under what Circumstances Does Capacity Building Work?" In Capacity Building in Developing and Emerging Countries, edited by E. Chrysostome, 43–90. Springer. https://doi.org/10.1007/978-3-030-16740-0_3.

Jordan Smith, D. 2003. "Patronage, Per Diems and the "Workshop Mentality": The Practice of Family Planning Programs in Southeastern Nigeria." World Development 31 (4): 703–15. https://doi.org/10.1016/s0305-750x(03)00006-8.

Kacou, K., L. Ika, and L. Munro. 2022. "Fifty Years of Capacity Building: Taking Stock and Moving Research Forward." Public Administration and Development 42 (4): 215–32. https://doi.org/10.1002/pad.1993.

Kim, M., and P. Raggo. 2022. "Taking Stock on How We Research the Third Sector: Diversity, Pluralism, and Openness." Voluntas 33 (6): 1107–13. https://doi.org/10.1007/s11266-022-00548-6.

Krawczyk, K. 2018. "The Relationship between Liberian CSOs and International Donor Funding: Boon or Bane?" Voluntas 29 (2): 296–309. https://doi.org/10.1007/s11266-017-9922-5.

Krawczyk, K. 2021. "Strengthening Democracy and Increasing Political Participation in Liberia: Does Civil Society Density Matter?" Journal of Civil Society 17 (2): 136–54. https://doi.org/10.1080/17448689.2021.1923905.

Lee, M. 2002. "Noncredit Certificates in Nonprofit Management: An Exploratory Study." Public Administration and Management 7 (3): 188–210.

Lewis, D. 2014. Non-Governmental Organizations, Management, and Development, 3rd ed. New York, NY: Routledge. https://doi.org/10.4324/9780203591185.

Littles, M. 2022. "Should We Cancel Capacity Building?" Nonprofit Quarterly 13. https://nonprofitquarterly.org/should-we-cancel-capacity-building/.

Ma, J., and S. Konrath. 2018. "A Century of Nonprofit Studies: Scaling the Knowledge of the Field." Voluntas 29 (6): 1139–58. https://doi.org/10.1007/s11266-018-00057-5.

McKeown, M., and E. Mulbah. 2007. Civil Society in Liberia: Towards a Strategic Framework for Support. Available at https://www.sfcg.org/wp-content/uploads/2014/08/LBR_EV_Apr07_Civil-Society-in-Liberia-Towards-a-Strategic-Framework-for-Support.pdf.

Mawere, M., and S. Awuah-Nyamekye, eds. 2015. Between Rhetoric and Reality: The State and Use of Indigenous Knowledge in Post-colonial Africa. Bamenda, North West Region, Cameroon: African Books Collective. https://doi.org/10.2307/j.ctvh9vwc4.

Mertler, C. 2019. The Wiley Handbook of Action Research in Education. Hoboken, NJ: Wiley. https://doi.org/10.1002/9781119399490.

Mirabella, R., G. Gemelli, M. Malcolm, and G. Berger. 2007. "Nonprofit and Philanthropic Studies: International Overview of the Field in Africa, Canada, Latin America, Asia, the Pacific, and Europe." Nonprofit and Voluntary Sector Quarterly 36 (4): 110S–35S. https://doi.org/10.1177/0899764007305052.

Mirabella, R., J. Hvenmark, and O. Larsson. 2015. "Civil Society Education: International Perspectives." Journal of Nonprofit Education and Leadership 5 (4): 213–8. https://doi.org/10.18666/jnel-2015-v5-i4-7027.

Mirabella, R., J. Hvenmark, and O. Larsson. 2019. "Civil Society Education: National Perspectives." Journal of Nonprofit Education and Leadership 9 (1): 2–5.

Mirabella, R., and K. Nguyen. 2019. "Educating Nonprofit Students as Agents of Social Transformation: Critical Public Administration as a Way Forward." Administrative Theory and Praxis 41 (4): 388–404. https://doi.org/10.1080/10841806.2019.1643616.

Mirabella, R., and N. Wish. 2000. "The "Best Place" Debate: A Comparison of Graduate Education Programs for Nonprofit Managers." Public Administration Review 60 (3): 219–29. https://doi.org/10.1111/0033-3352.00082.

Nishimura, A., R. Sampath, L. Vu, A. Mahar Sheikh, and A. Valenzuela. 2020. "Transformational Capacity Building." Stanford Social Innovation Review 18 (4): 30–3.

Roberts, S., J. Jones, and O. Fröhling. 2005. "NGOs and the Globalization of Managerialism: A Research Framework." World Development 33 (11): 1845–64. https://doi.org/10.1016/j.worlddev.2005.07.004.

Salamon, L. 1994. "The Rise of the Nonprofit Sector." Foreign Affairs 73 (4): 109–22. https://doi.org/10.2307/20046747.

Shizha, E., and N. Makuvaza, eds. 2017. Re-thinking Postcolonial Education in Sub-Saharan Africa in the 21st Century. Rotterdam: Sense Publishers. https://doi.org/10.1007/978-94-6300-962-1.

Stringer, E., and A. Aragón. 2020. Action Research, 5th ed. Los Angeles, CA: SAGE.

Teshome-Bahiru, W. 2009. "Civil Society and Democratization in Africa: The Role of the Civil Society in the 2005 Election in Ethiopia." International Journal of Social Sciences 4 (2): 80–95.

Tuck, E. 2009. "Re-visioning Action: Participatory Action Research and Indigenous Theories of Change." The Urban Review 41 (1): 47–65. https://doi.org/10.1007/s11256-008-0094-x.

Walker, J. 2016. "Achieving Health SDG 3 in Africa through NGO Capacity Building: Insights from the Gates Foundation Investment in Partnership in Advocacy for Child and Family Health (PACFaH) Project." African Journal of Reproductive Health 20 (3): 55–61. https://doi.org/10.29063/ajrh2016/v20i3.10.

Weber, P. 2022. “Institutionalization Interrupted: The Evolution of the Field of Philanthropic and Nonprofit Studies.” In Preparing Leaders of Nonprofit Organizations , edited by W. Brown, and M. Hale, 3–24. New York, NY: Routledge. 10.4324/9781003294061-2 Search in Google Scholar

Yeshanew, S. 2012. “CSO Law in Ethiopia: Considering its Constraints and Consequences.” Journal of Civil Society 8 (4): 369–84. https://doi.org/10.1080/17448689.2012.744233 . Search in Google Scholar

© 2023 the author(s), published by De Gruyter, Berlin/Boston

This work is licensed under the Creative Commons Attribution 4.0 International License.

Key Practices that Inform the Nature of Healthy Leadership [The Questions]

Addressing the questions that inform sustainable practice, capacity building, and generativity.


Healthy Leadership Blog Series By Dr. Mark McCaslin

In this issue of the Healthy Leadership blog, we introduce the key potentiating questions that ground the practices of healthy leadership. These questions inform sustainable practice, capacity building, and generativity.

Judge a man by his questions rather than by his answers.

Sustainable practice.

The first two Questions and the accompanying practices are foundational for creating and maintaining healthy leadership. Healthy leadership comprises five integrated practices: Deep Understanding, Critical Reflection, Maturity, Empowerment, and Generativity.

Each practice holds a question that grounds the leader/potentiator in preparation for building sustainable systems and connects the leader/potentiator in anticipation of building the capacity for generativity. Deep Understanding and Critical Reflection build a sustainable foundation for the potentiator. Let's examine each question below.


1. Am I ready to learn?

When we respond affirmatively to this question, we open ourselves to understanding another’s actions and reactions to any given event, problem, or opportunity. Correspondingly, the practice of Deep Understanding is a way to answer this question in a healthy, generative fashion.

As a potentiating practice, Deep Understanding embraces a conscious movement away from prejudgment of potential toward a deeper awareness of the possibilities held by another and self. Deeply rooted in empathy, it is not a directive or controlling stance but a purposeful probe into the meaning of the experience shared with another. It supports the actualization of human potential without a need to define, confine, or refine it. As a practice, it empowers creativity, curiosity, and wonder. It compassionately and intelligently opens us up to learn.

In future issues, we will discuss the heavy propositions. We will also confront the specter of desperate neutrality and its effects on human potential.


2. Am I ready to become critically and creatively self-aware?

When responding affirmatively to this question, we put our learning and curiosity to work in the world. The way of wonder opened in Deep Understanding gives way to wisdom through the practice of Critical Reflection, the purposeful act we take to deeply connect with where we are as a learner.

Through Critical Reflection, we become more deeply aware of our purpose and place and the impact of our interactions on others and our environment. What separates Critical Reflection from other types of learning or reflection is its deep probing into our individually held assumptions concerning how we interact with others. It is a very personal practice aimed at revealing a deeper self-awareness.

In future issues, we will discuss the importance of ethical individuality. It is not a matter of what a person may know that is important—it is what they believe. Pragmatically speaking, practicing Critical Reflection begins by exploring and constructing our unique philosophy of life.

Capacity Building


The foundational practices of Deep Understanding and Critical Reflection yield a mature, sustainable, empowering presence as we engage them. They instill a depth of understanding that yields emotional agility, greater awareness of our environment, and insight into our impact as leaders, teachers, parents, and community builders.

The following two Questions ask us to take our practice directly into the community of practice. While the first two practices are foundational for healthy leadership, the next two, Maturity and Empowerment, address how we put those skills to work as leaders, as they concern the act of leading. They are purposed at building the capacity of the community of practice through healthy leadership.


3.  Am I ready to lead?

This question calls for a mature response. By practicing Maturity, we recognize and appreciate the creative efforts of another and the good person in another, even when that other is shrouded in the fog of self-doubt, self-deception, self-destruction, and self-reproach.

We know that beneath those exteriors is always a better explanation and deeper meaning for a person’s poor and/or unhealthy behavior than what readily appears on the surface. By practicing Maturity, the leader/potentiator realizes the emotional agility necessary to lead a community of potential toward its greatest potential.

As we practice Maturity, we continuously apply our foundational practices, Deep Understanding and Critical Reflection, and we prepare to build the capacity of the community of practice through Empowerment. Maturity thus generates Awareness, Insight, and Discernment (AID).


4. Am I ready to embrace a potentiating consciousness?

A growing sense of self and self-responsibility is a product of our practicing Deep Understanding, Critical Reflection, and Maturity. We practice Empowerment because it promotes balance and inspires a permeating sense of calm for the leader/potentiator. It builds the capacity for right action while instilling an inner resilience for when things go wrong.

Maturity is a gateway practice that promotes the potentiating actions of Empowerment. The harvest waiting for those practicing Empowerment is a greater reach into the gifts of potential within ourselves and those we lead.

Practicing Empowerment is also a way to cultivate and sustain the authentic self. Through this practice, our thoughts, words, and deeds align. It is a way toward our highest self based on our willingness to build potentiating relationships. Empowerment is fundamentally a relationship-building skill centered upon self-exploration and emotional intelligence. This practice focuses on building effective relationships.

Generativity


To participate fully within a community of potential while elevating healthy leadership is to achieve a healthy balance. The quintessential practice of Generativity concerns empowering creativity and innovation. From this perspective, I hope we might illuminate real solutions for today and tomorrow.

5. Am I ready to explore the farther reaches of healthy leadership?

Answering affirmatively, we connect the potentiating circle. Just as graffiti begets graffiti, potential begets potential, and we are drawn toward the light of our greatest potential. Generativity leads to healthy, sustainable leadership. Becoming a healthy leader requires integral support from the practices of Deep Understanding, Critical Reflection, Maturity, and Empowerment. Generativity fuses these practices, as they work best in concert. The practice of Generativity concerns itself with thriving. Generativity is a purposeful interdependent activity that catalyzes the principled response and responsibility that yields healthy leadership.

In future issues of this Blog, we will take a deeper dive into each of these potentiating practices. Healthy Leadership begins as a powerful form of constructive and transformative self-leadership. This discipline, achieved through practice, yields the sustainable foundations others are seeking and, at the same time, models good and healthy practices for those being led.

Are you ready to learn?

[Download Dr. McCaslin's Leadership Practices Document]


ABOUT THE AUTHOR

Dr. Mark McCaslin

Dr. Mark McCaslin is an academic leader with a rich history of teaching, educational programming, and administration. His personal and professional interests flow around the development of philosophies, principles, and practices dedicated to the full actualization of human potential. The focus of his research has centered upon healthy organizational leadership and educational approaches that foster a more holistic approach towards the actualization of that potential. At the apex of his current teaching, writing, and research is the emergence of healthy leadership and the potentiating arts.

Closing the Caribbean Data Gap: Addressing Poverty and Inequality


Caribbean-wide, there is an urgent need to measure poverty to understand exactly who is affected and how. Only by doing so can the region effectively implement corrective policies. 

Many Caribbean countries struggle with weak statistical capacity and low data usage. According to the World Bank's Statistical Performance Indicator (SPI), which measures statistical capacity at the country level, the region ranks lowest in statistical performance compared to other regions, aligning more closely with low-income countries. With some exceptions, like Jamaica and the Dominican Republic, the most recent poverty estimates are between 6 and 8 years old. Furthermore, in more than a handful of countries, national poverty estimates are available only for the 2000s. This data deficit hinders both the effective tracking of the Sustainable Development Goals related to poverty and inequality (SDG 1: eradicating poverty in all its forms, and SDG 10: reducing inequality within and among countries) and the design of targeted poverty alleviation programs and policies.

In the past, even when data to measure poverty was available, it was not used sufficiently for purposes other than national poverty monitoring. A recent World Bank effort harmonized data from living conditions and household budget surveys in the four Caribbean countries with existing microdata (Grenada, Jamaica, Saint Lucia, and Suriname) to produce internationally comparable poverty and equity estimates and to allow their inclusion in global monitoring of SDG 1 and SDG 10. These estimates are now available on the World Bank’s Poverty and Inequality Platform.

Harmonized welfare aggregates are essential for providing a consistent and comparable welfare measure across different populations or countries, allowing for global monitoring and benchmarking. This is especially valuable in regions like the Caribbean, where countries may have diverse economic contexts but share similar development objectives. Harmonized welfare aggregates facilitate cross-country analyses, inform policy decisions, and contribute to more effective regional cooperation and development strategies.
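As an illustrative sketch only (not from this announcement, and with hypothetical sample values), the two headline measures that harmonized welfare aggregates feed into can be computed directly from a per-capita welfare vector: the poverty headcount ratio for SDG 1 and the Gini coefficient for SDG 10.

```python
# Illustrative sketch only -- functions and sample values are hypothetical.
# They show the two headline measures harmonized welfare aggregates support:
# the poverty headcount ratio (SDG 1) and the Gini coefficient (SDG 10).

def headcount_ratio(welfare, poverty_line):
    """Share of people whose welfare aggregate falls below the poverty line."""
    return sum(1 for y in welfare if y < poverty_line) / len(welfare)

def gini(welfare):
    """Gini coefficient via the sorted-weights formula:
    G = 2 * sum(i * y_i) / (n * sum(y)) - (n + 1) / n, with y sorted, i = 1..n.
    """
    y = sorted(welfare)
    n = len(y)
    weighted = sum(i * yi for i, yi in enumerate(y, start=1))
    return 2 * weighted / (n * sum(y)) - (n + 1) / n

# Hypothetical per-capita daily welfare values and a hypothetical poverty line.
sample = [120, 80, 300, 55, 95, 410, 60, 150]
print(headcount_ratio(sample, 100))   # -> 0.5
print(round(gini(sample), 3))         # -> 0.386
```

The point of harmonization is that, once welfare aggregates are constructed consistently across surveys, measures like these become comparable across countries rather than artifacts of each survey's design.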

Join us for another Ask WB Caribbean on April 30, where we will present the global estimates of poverty and inequality in the Caribbean and discuss committing to regular and comprehensive data collection on poverty and key socio-economic indicators, investing in the capacity of national statistical offices and policy analysis units, and promoting data transparency and accessibility. 

The World Bank

Gail is the World Bank Operations Manager for the Caribbean countries and Head of the World Bank’s Office in Jamaica. In this role, she works closely with government counterparts, the Country Director, and technical teams to support countries to address their development challenges.

Prior to joining the Latin America and Caribbean region, Gail was the Practice Manager for the Health, Nutrition and Population (HNP) portfolio for the South Asia region. In this role, she guided the development and quality of Bank-supported HNP analytical products and financing operations in support of the Bank’s goals to end poverty and boost shared prosperity. Gail holds a Master’s Degree in Public Administration focusing on health policy and finance from New York University (NYU).

Jacobus Joost De Hoop

Jacobus de Hoop is a senior economist in the World Bank’s Poverty and Equity Global Practice. At the World Bank, Jacob currently leads the Gender Innovation Lab for Latin America and the Caribbean and supports the organization’s program in Barbados, Belize, and Suriname.

Before joining the World Bank, Jacob worked as a humanitarian policy research manager at UNICEF Office of Research – Innocenti, was a member of the Transfer Project, worked as a researcher at the International Labour Organization (ILO), and was affiliated with the Paris School of Economics as a Marie Curie Post-Doctoral Fellow. He holds a PhD in economics from the Tinbergen Institute and VU University in Amsterdam. 

Trinidad Saavedra

Trinidad is an Economist in the Poverty and Equity Global Practice at the World Bank. Currently, she leads the poverty and equity work programs of Chile and Grenada. As part of her work in the Caribbean, she also leads the harmonization of consumption in Caribbean countries and supports the implementation of the World Bank Data for Decision-Making Project in Grenada. Her research interests include topics related to poverty reduction, inclusive growth, the distributional impact of fiscal and social policies, gender, migration, and household finances. 

Previously, she supported the gender work program in Mexico and the poverty and equity work programs of Western Balkans countries and Armenia, where she contributed to several analytical and statistical capacity-building activities. Before joining the Bank, she worked at the Central Bank of Chile in the Financial Research Unit, where she co-led projects focused on assessing households' indebtedness and multiple imputation of missing data. She holds a Master's degree in Finance and Econometrics from Queen Mary University of London and a Master's degree in Economics and a Bachelor's degree in Industrial Engineering from the University of Chile.

Anna Luisa Paffhausen

Anna Luisa is an Economist in the Latin America and Caribbean Unit of the Poverty & Equity Global Practice at the World Bank. She currently works in the Caribbean Poverty Team on poverty, inequality and statistical capacity development, co-leading the OECS (Organisation of Eastern Caribbean States) Data for Decision Making Project and a range of analytical products. She is the poverty economist for Saint Lucia and Guyana. 

Previously Anna Luisa worked on Brazil, Sri Lanka and in the Europe and Central Asia Region for the Poverty & Equity Global Practice and with the World Bank’s Development Research Group on micro- and small firm development. Before joining the World Bank, she was a Young Professional at the Deutsche Gesellschaft für Internationale Zusammenarbeit (GIZ) where she worked on financial sector development in Mozambique. Anna Luisa holds a PhD in Economics from the University of Passau in Germany.

Rosita Sobhie

Rosita’s main research areas are poverty and inequality, macro-economic modeling and forecasting, development issues for small open economies, and changing attitudes toward gender and ethnic roles in society.

As a member of the National Poverty Commission of Suriname her main contribution is to provide technical support to the team regarding accurate poverty line measurement and targeting programs.



GSO –YSMO Capacity Building Program on Energy efficiency in Aden


A week-long training program on providing energy efficiency card issuance services concluded today in Aden, the temporary capital of the Republic of Yemen.

The program was organized by the GCC Standardization Organization (GSO) in cooperation with the Yemen Standardization, Metrology and Quality Control Organization (YSMO), within the framework of implementing the agreement to provide energy efficiency card issuance services signed between GSO and YSMO on March 4th, 2024.

During the week, a number of programs and workshops were held, notably: a workshop on the essentials of energy efficiency programs for air conditioners; an awareness training program on the requirements for notification of conformity assessment bodies; the preparation and development of instructions and forms for notifying conformity assessment bodies; the preparation of regulation and supplementary standards for testing; and training on the requirements of the Gulf standard for energy efficiency of air conditioners.



BMC Fam Pract

A framework to evaluate research capacity building in health care

1 Primary Care and Social Care Lead, Trent Research and Development Unit, formerly, Trent Focus Group, ICOSS Building, The University of Sheffield, 219 Portobello, Sheffield S1 4DP, UK

This is an Open Access article distributed under the terms of the Creative Commons Attribution License ( http://creativecommons.org/licenses/by/2.0 ), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Building research capacity in health services has been recognised internationally as important in order to produce a sound evidence base for decision-making in policy and practice. Activities to increase research capacity for, within, and by practice include initiatives to support individuals and teams, organisations and networks. Little has been discussed or concluded about how to measure the effectiveness of research capacity building (RCB).

This article attempts to develop the debate on measuring RCB. It highlights that traditional outcomes, such as publications in peer-reviewed journals and successful grant applications, may be important to measure, but they may not capture all relevant aspects of progress, especially amongst novice researchers. They do not capture factors that contribute to developing an environment to support capacity development, the usefulness or 'social impact' of research, or professional outcomes.

The paper suggests a framework for planning change and measuring progress, based on six principles of RCB, which have been generated through the analysis of the literature, policy documents, empirical studies, and the experience of one Research and Development Support Unit in the UK. These principles are that RCB should: develop skills and confidence, support linkages and partnerships, ensure the research is 'close to practice', develop appropriate dissemination, invest in infrastructure, and build elements of sustainability and continuity. It is suggested that each principle operates at individual, team, organisation and supra-organisational levels. Some criteria for measuring progress are also given.

This paper highlights the need to identify ways of measuring RCB. It points out the limitations of current measurements that exist in the literature, and proposes a framework for measuring progress, which may form the basis of comparison of RCB activities. In this way it could contribute to establishing the effectiveness of these interventions, and establishing a knowledge base to inform the science of RCB.

The need to develop a sound scientific research base to inform service planning and decision-making in health services is strongly supported in the literature [1] and policy [2]. However, the level of research activity and the ability to carry out research is limited in some areas of practice, resulting in a low evidence base in these areas. Primary care, for example, has been identified as having a poor capacity for undertaking research [3-5], and certain professional groups, for example nursing and allied health professionals, lack research experience and skills [5-7]. Much of the literature, and the limited research on research capacity building (RCB), has therefore focused on this area of practice and these professional groups. Policy initiatives to build research capacity include support in developing research for practice, where research is conducted by academics to inform practice decision-making; research within or through practice, where research is conducted in collaboration between academics and practice; and research by practice, where ideas are initiated and research is conducted by practitioners [3, 8].

Interventions to increase research capacity for, within, and by practice incorporate initiatives to support individuals and teams, organisations and networks. Examples include fellowships, training schemes and bursaries, and the development of support infrastructures such as research practice networks [9-13]. In the UK, the National Coordinating Centre for Research Capacity Development has supported links between universities and practice by funding a number of Research and Development Support Units (RDSUs) [14], which are based within universities but whose purpose is to support new and established researchers based in the National Health Service (NHS). However, both policy advisers and researchers have highlighted a lack of evaluative frameworks to measure progress and build an understanding of what works [15, 16].

This paper argues for the need to establish a framework for planning and measuring progress, and to initiate a debate about what outcomes are appropriate for RCB, rather than relying simply on what is easy to measure. The suggested framework has been generated through analysis of the literature, using policy documents, position statements, a limited number of empirical studies evaluating RCB, and the experience of one large RDSU based in the UK.

The Department of Health within the UK has adopted the definition of RCB as 'a process of individual and institutional development which leads to higher levels of skills and greater ability to perform useful research' (p. 1321) [17].

Albert and Mickan cited the National Information Services in Australia [18], who define it as

'an approach to the development of sustainable skills, organizational structures, resources and commitment to health improvement in health and other sectors to multiply health gains many times over.'

RCB can therefore be seen as a means to an end, the end being 'useful' research that informs practice and leads to health gain, or an end in itself, emphasising developments in skills and structures enabling research to take place.

A framework for measuring capacity building should therefore be inclusive of both process and outcome measures [19], to capture changes in both the 'ends' and the 'means'; it should measure the ultimate goals, but also the steps and mechanisms to achieve them. The notion of measuring RCB by both process and outcome measures is supported within the research networks literature [12, 20], and in capacity building in health more generally [19, 21]. Some argue we should acknowledge 'process as outcome', particularly if capacity building is seen as an end in itself [21]. In this context, process measures are 'surrogate' [12] or 'proxy' outcome measures [16]. Carter et al. [16] urge caution in using 'proxy' measures in the context of RCB, as there is currently little evidence to link process with outcome. They do not argue against collecting process data, but stress that evaluation work should examine the relationship of process to outcome. The proposed framework discussed in this paper suggests areas to consider for both process and outcome measurement.

The most commonly accepted outcomes for RCB cited in the literature include traditional measures of high-quality research: publications, conference presentations, successful grant applications, and qualifications obtained. Many evaluations of RCB have used these as outcomes [9, 10, 22, 23]. Some argue that publications in peer-reviewed journals are a tall order given the low research skills base in some areas of health care practice [5], and argue for an appropriate time frame to evaluate progress. Process measures in this context could measure progress more sensitively and quickly.

However, using traditional outcomes may not be the whole story in terms of measuring impact. Position statements suggest that the ultimate goal of research capacity building is one of health improvement [17, 18, 24]. In order for capacity building initiatives to address these issues, outcomes should also explore the direct impact on services and clients: what Smith [25] defines as the social impact of research.

The primary care literature strongly emphasises that capacity building should enhance practitioners' ability to build their research skills, supporting the development of research 'by' and 'with' practice [3, 26], and suggests there is 'added value' in such close links to practice. A framework to measure RCB should explore and try to unpack this 'added value', both in terms of professional outcomes [10], which include increasing professional enthusiasm, and in supporting the application of critical thinking and the use of evidence in practice. Whilst doing research alongside practice is not the only way these skills and attitudes can be developed, it does seem to be an important impact of RCB that should be examined.

The notion of developing RCB close to practice does not necessarily mean that it is small scale just because it is close to the coal face. Obviously, in order for individuals and teams to build up a track record of experience, their initial projects may justifiably be small scale; but as individuals progress, they may gain the experience needed to conduct large-scale studies, still based on practice problems, working in partnership with others. Similarly, networks can support large-scale studies as their capacity and infrastructure develop to accommodate them.

The framework

The framework is represented in Figure 1. It has two dimensions:

Figure 1. Research Capacity Building: A Framework for Evaluation.

• Four structural levels of development activity. These include the individual, team, organisational, and network or supra-organisational support levels (networks and support units). These are represented by the concentric circles within the diagram.

• Six principles of capacity building. These are discussed in more detail below, but include: building skills and confidence, developing linkages and partnerships, ensuring the research is 'close to practice', developing appropriate dissemination, investing in infrastructure, and building elements of sustainability and continuity. Each principle is represented by an arrow within the diagram, which indicates activities and processes that contribute towards capacity building. The arrows cut across the structural levels, suggesting that activities and interventions may occur within and across structural levels. The arrowheads point in both directions, suggesting that principles applied to one structural level could have an impact on other levels.
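As a minimal sketch (not part of the original paper; the principle and level labels are taken from the framework, but the grid structure and example measure are illustrative assumptions), the two dimensions can be held as a simple data structure when planning an evaluation:

```python
# Illustrative sketch only: the framework's two dimensions -- six principles
# crossed with four structural levels -- held as a grid into which planned
# measures of progress can be recorded. Labels follow the framework; the
# grid itself and the example entry are hypothetical.

PRINCIPLES = [
    "skills and confidence",
    "linkages and partnerships",
    "close to practice",
    "appropriate dissemination",
    "infrastructure",
    "sustainability and continuity",
]
LEVELS = ["individual", "team", "organisation", "supra-organisation"]

# One cell per principle/level pair; each cell collects the measures chosen
# for that combination.
evaluation_grid = {(p, lvl): [] for p in PRINCIPLES for lvl in LEVELS}

# Hypothetical example entry: a process measure at the individual level.
evaluation_grid[("skills and confidence", "individual")].append(
    "number of practitioners completing research training"
)

print(len(evaluation_grid))   # -> 24 cells (6 principles x 4 levels)
```

A grid like this makes the paper's point concrete: every principle is expected to operate at every structural level, so an evaluation plan has 24 cells to consider, even if some are deliberately left empty.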

The framework acknowledges that capacity building is conducted within a policy context. Whilst this paper focuses on measurement at different structural levels, it should be acknowledged that progress and impact on RCB can be greatly nurtured or restricted by the prevailing policy. Policy decisions will influence opportunities for developing researchers, can facilitate collaborations in research, support research careers, fund research directed by practice priorities, and can influence the sustainability and the very existence of supportive infrastructures such as research networks.

The paper will explain the rationale for the dimensions of the framework, and then suggest some examples of measurement criteria for each principle, at different structural levels, to evaluate RCB. It is hoped that as the framework is applied, further criteria will be developed and used, taking into account time constraints, resources, and the purpose of such evaluations.

Structural levels at which capacity building takes place

The literature strongly supports that RCB should take place at both an individual and an organisational level [8, 15, 27, 28]. For example, the conceptual model for RCB in primary care put forward by Farmer & Weston [15] focuses particularly on individual General Practitioners (GPs) and primary care practitioners, who may progress from non-participation, through participation, to becoming academic leaders in research. Their model also acknowledges the context and organisational infrastructure needed to support RCB: reducing barriers and accommodating diversity by providing mentorship, collaborations and networking, and adopting a whole-systems approach based on local need and existing levels of capacity. Others have acknowledged that capacity development can be focussed at a team level [11, 29]. Jowett et al [30] found that GPs were more likely to be research active if they were part of a practice where others were involved with research. Guidance from a number of national bodies highlights the need for multiprofessional and inter-professional involvement in conducting useful research for practice [3, 4, 6, 31], which implies an appropriate mix of skills and practice experience within research teams [32]. Additionally, the organisational literature has identified the importance of teams in the production of knowledge [18, 33, 34].

Developing structures between and beyond health organisations, including research networks, also seems important for capacity building [12, 24, 34]. The Department of Health in the UK [14] categorizes this supra-organisational support infrastructure as including centres of academic activity, Research & Development Support Units, and research networks.

As interventions for RCB are targeted at different levels, the framework for measuring their effectiveness mirrors this. However, these levels should not be measured in isolation: one level can have an impact on capacity development at another, potentially with synergistic or detrimental effects.

The six principles of research capacity building

Evaluation involves assessing the success of an intervention against a set of indicators or criteria [ 35 , 36 ], which Meyrick and Sinkler [ 37 ] suggest should be based on underlying principles in relation to the initiative. For this reason the framework includes six principles of capacity building. The rationale for each principle is given below, along with a description of some suggested criteria for each principle. The criteria presented are not an exhaustive list. As the framework is developed and used in practice, a body of criteria will be developed and built on further.

Principle 1. Research capacity is built by developing appropriate skills, and confidence, through training and creating opportunities to apply skills

The need to develop research skills in practitioners is well established [3, 4, 6], and can be supported through training [14, 26], and through mentorship and supervision [15, 24, 28]. There is some empirical evidence that research skill development increases research activity [23, 38] and enhances positive attitudes towards conducting and collaborating in research [39]. Other studies cite lack of training and research skills as a barrier to doing research [30, 31]. The need to apply and use research skills in practice is highlighted in order to build confidence [40] and to consolidate learning.

Some needs assessment studies highlight that research skills development should adopt 'outreach' and flexible learning packages, and should acknowledge the skills, backgrounds and epistemologies of the professional groups concerned [7, 15, 39, 41, 42]. These include doctors, nurses, a range of allied health professionals, and social workers. Developing an appropriate mix of professionals to support health services research means that training should be inclusive and appropriate to them, adopting a range of methodologies and examples to support appropriate learning and experience [15, 31, 41]. How learning and teaching is undertaken, and whether the content of support programmes reflects the backgrounds, tasks and skills of participants, should therefore be measured. For example, the research methods teaching offered by networks and support units should reflect the range and balance of skills needed for health services research, including both qualitative and quantitative methods.

Skills development should also be set in the context of career development, and further opportunities to apply skills in practice should be examined. Policy and position statements [14, 26] support the concept of career progression, or a 'careers escalator', which also enables the sustainability of skills. Opportunities to apply research skills through applications for funding are also important [9, 10, 22, 43, 44].

At team and network level, Fenton et al [34] suggest that capacity can be increased through building intellectual capacity (sharing knowledge), which enhances the ability to do research. Whilst there is no formal measure for this, an audit of the transfer of knowledge would appear to be beneficial. For example, teams may share expertise within a project to build skills in novice researchers [45], which can be tracked, and appropriate divisions of workload, such as reading research literature and sharing it with the rest of the team or network, could be noted.

The notion of stepping outside a safety zone may also indicate increased confidence and ability to do research. At an individual level this may be illustrated by the practitioner-researcher taking on more of a management role, supervising others, tackling new methodologies or approaches, or working with other groups of health and research professionals on research projects. This is consistent with the model of RCB suggested by Farmer and Weston [15], which supports progress from participation through to academic leadership.

Some examples of criteria for measuring skills and confidence levels are given in Table 1.

Building skills and confidence

Principle 2. Research capacity building should support research 'close to practice' in order for it to be useful

The underlying philosophy for developing research capacity in health is that it should generate research that is useful for practice. The North American Primary Care Research Group [24] defined the 'ultimate goal' of research capacity development as the generation and application of new knowledge to improve the health of individuals and families (p679). There is strong support for the view that 'useful' research is that conducted 'close' to practice, for two reasons. Firstly, it generates research knowledge that is relevant to service user and practice concerns. Many argue that the most relevant and useful research questions are those generated by, or in consultation with, practitioners and services [3, 11, 24], policy makers [46] and service users [47, 48]. This level of 'immediate' usefulness [49] may also mean that messages are more likely to be taken up in practice [50]. Empirical evidence suggests that practitioners and policy makers are more likely to engage in research if they see its relevance to their own decision making [31, 39, 46]. The notion of building research that is 'close to practice' does not necessarily mean that it is small scale, but that it is highly relevant to practice or policy concerns; a large network of practitioners could facilitate large-scale, experimental projects, for example. However, certain methodologies are favoured by practice because of their potential for immediate impact [47], and this framework acknowledges such approaches and their relevance. These include action research projects and participatory inquiry [31, 42]. An example where this more participatory approach has been developed in capacity building is the WeLReN (West London Research Network) cycle [51]. Here, research projects are developed in cycles of action, reflection, and dissemination, and use of findings is integral to the process. This network reports high levels of practitioner involvement.

Secondly, building research capacity 'close to practice' is useful because of the critical thinking skills it engenders, which can also be applied to practice decision making [28] and which support quality improvement approaches in organisations [8]. Practitioners in a local bursary scheme, for example, said they were more able to take an evidence-based approach into their everyday practice [9].

Developing a 'research culture' within organisations suggests a closeness to practice that impacts on the ability of teams and individuals to do research. Lester et al [23] touched on measuring this idea through a questionnaire exploring aspects of a supportive culture within primary care academic departments, including opportunities to discuss career progression, supervision, formal appraisal, mentorship, and junior support groups. This may be a fruitful idea to expand further into a tool for the wider health care environment.

Some examples of criteria for measuring the close-to-practice principle are given in Table 2.

Close to practice

Principle 3. Linkages, partnerships and collaborations enhance research capacity building

The notion of building partnerships and collaborations is integral to capacity building [19, 24]. It is the mechanism by which research skills and practice knowledge are exchanged, developed and enhanced [12], and by which research activity is conducted to address complex health problems [4]. Linkages between the practice world and that of academia may also enhance research use and impact [46].

The linkages that enhance RCB can exist between:

• Universities and practice [4, 14, 43]

• Novice and experienced researchers [22, 24, 51]

• Different professional groups [2, 4, 20, 34]

• Different health and care provider sectors [4, 31, 47, 52]

• Service users, practitioners and researchers [47, 48]

• Researchers and policy makers [46]

• Different countries [28, 52]

• Health and industry [53, 54]

It is suggested that it is through networking and building partnerships that intellectual capital (knowledge) and social capital (relationships) are built, enhancing the ability to do research [12, 31, 34]. In particular, the build-up of trust between different groups and individuals can enhance information and knowledge exchange [12]. This may benefit not only the development of appropriate research ideas, but also the whole research process, including the impact of research findings.

The notion of building links with industry is increasingly evident in UK policy [54], and may affect economic outcomes for health organisations and society as a whole [55, 56].

Some examples of criteria for measuring linkages and collaborations are given in Table 3.

Linkages, collaborations and partnerships.

Principle 4. Research capacity building should ensure appropriate dissemination to maximize impact

A widely accepted measure of the impact of RCB is the dissemination of research through peer-reviewed publications and conference presentations to academic and practice communities [5, 12, 26, 57]. However, this principle extends beyond such traditional dissemination. The litmus test that ultimately determines the success of capacity building is its impact on practice and on the health of patients and communities [24]; that is, the social impact of research [25]. Smith [25] argues that dissemination strategies should include a range of methods that are 'fit for purpose'. This includes traditional dissemination, but also other methods: for example, instruments and programmes of care implementation, protocols, lay publications, and publicity through factsheets, the media and the Internet.

Dissemination and tracking of the use of products and technologies arising from RCB should also be considered, as these relate to the economic outcomes of capacity building [55]. In the UK, the notion of building health trusts as innovative organisations that can benefit economically by building intellectual property highlights this as an area for potential measurement [56].

Some examples of criteria for measuring appropriate dissemination are given in Table 4.

Appropriate dissemination and impact

Principle 5. Research capacity building should include elements of continuity and sustainability

Definitions of capacity building suggest that it should contain elements of sustainability, alluding to the maintenance and continuity of newly acquired skills and structures to undertake research [18, 19]. However, the literature does not explore this concept well [19], perhaps partly because of the problems of measuring capacity building: it is difficult to know how well an initiative is progressing, and how well progress is consolidated, if there are no benchmarks or outcomes against which to demonstrate this.

Crisp et al [19] suggest that capacity can be sustained by applying skills to practice. This gives us some insight into where we might look for measures of sustainability. These could include enabling opportunities to extend skills and experience, which may link into the concept of a career escalator. Sustainability also involves utilizing the capacity that has already been built: for example, engaging those who gained skills in earlier RCB initiatives to help more novice researchers once they have become 'experts', and finding an appropriate position for the person with expertise within the organisation. It could also be measured by the number of funding opportunities for the continued application of skills to research practice.

Some examples of criteria for measuring sustainability and continuity are given in Table 5.

Continuity and sustainability

Principle 6. Appropriate infrastructures enhance research capacity building

Infrastructure includes the structures and processes that enable the smooth and effective running of research projects. For example, project management skills are essential to move projects forward, and as such should be measured in relation to capacity building. Similarly, projects should be suitably supervised, with academic and management support. To make research work 'legitimate', it may be beneficial to include research in some job descriptions, not only to reinforce research as a core skill and activity, but also so that it is reviewed in annual appraisals, which can be a tool for research capacity evaluation. Information flow about calls for funding, fellowships and conferences is also important. Hurst [42] found that information flow varied between trusts, and that managers were more aware of research information than practitioners.

Protected time and backfill arrangements, together with funding to support them, are an important enabler of capacity building [9, 15, 24, 58]. Such arrangements may reduce barriers to participation and allow skills and enthusiasm to develop [15]. Infrastructure to direct new practitioners to research support has also been highlighted [14]. This is particularly true in the light of the new research governance and research ethics framework in the UK [59]. The reality of implementing systems to deal with the complexities of the research governance regulations has proved problematic, particularly in primary care, where the relative lack of research management expertise and infrastructure has resulted in what are perceived as disproportionately bureaucratic systems. Recent discussion in the literature has focused on the detrimental impact of both ethical review and NHS approval systems, and there is evidence of serious delays in getting research projects started [60]. Administrative and support staff to help researchers through this process are important in enabling research to take place [61].

Some examples of criteria for measuring infrastructure are given in Table 6.

Infrastructure

This paper suggests a framework which sets out a tentative structure by which to start measuring the impact of capacity building interventions, and invites debate around the application of this framework to plan and measure progress. It highlights that interventions can focus on individuals, teams, organisations, and support infrastructures such as RDSUs and research networks. However, capacity building may only take place once change has occurred at more than one level: for example, the culture of the organisation in which teams and individuals work may influence their abilities and opportunities to do research. It is also possible that the interplay between levels affects outcomes at other levels. In measuring progress, it should be possible to develop a greater understanding of the relationship between different levels. The framework proposed in this paper may be the first step to doing this.

The notion of building capacity at any structural level is dependent on funding and support opportunities, which are influenced by policy and funding bodies. The ability to build capacity across the principles developed in the framework will also depend on R&D strategy and policy decisions. For example, if policy fluctuates in its emphasis on building capacity 'by', 'for' or 'with' practice, the ability to build capacity close to practice will be affected.

In terms of developing a science of RCB, there is a need to capture further process and outcome data to understand what helps develop 'useful' and 'useable' research. The paper suggests principles from which a number of indicators could be developed. The list is not exhaustive, and it is hoped that through debate and application of the framework further indicators will emerge.

An important first step in building the science of RCB should be debate about identifying appropriate outcomes. This paper supports the use of traditional outcome measures, including publications in peer-reviewed journals and conference presentations: these assure quality and engage critical review and debate. However, the paper also suggests that we might move beyond these outcomes to capture the social impact of research, and supports developing outcomes that measure how research has affected the quality of services and the lives of patients and communities. This includes adopting and shaping the types of methodologies that capacity building interventions support: incorporating patient-centred outcomes in research designs, highlighting issues such as the cost effectiveness of interventions, exploring the economic impact of research in terms of both product outputs and health gain, and developing action-oriented and user-involvement methodologies that describe and demonstrate impact. It may also mean tracking the types of linkages and collaborations built through RCB, as linkages that are close to practice, including those with policy makers and practitioners, may enhance research use and therefore 'usefulness'. If we are to measure progress through impact and change in practice, an appropriate time frame would have to be established alongside these measures.

This paper argues that 'professional outcomes' should also be measured, to recognize how the critical thinking developed through research impacts on clinical practice more generally.

Finally, the proposed framework provides the basis by which we can build a body of evidence to link process to the outcomes of capacity building. By gathering process data and linking it to appropriate outcomes, we can more clearly unpack the 'black box' of process, and investigate which processes link to desired outcomes. It is through adopting such a framework, and testing out these measurements, that we can systematically build a body of knowledge that will inform the science and the art of capacity building in health care.

• There is currently little evidence on how to plan and measure progress in research capacity building (RCB), or agreement on its ultimate outcomes.

• Traditional outcomes, such as publications in peer-reviewed journals and successful grant applications, may be easy and important to measure, but do not necessarily address the usefulness of research, professional outcomes, the impact of research activity on practice, or health gain.

• The paper suggests a framework providing a tentative structure by which the impact of RCB could be measured, shaped around six principles of research capacity building, each of which can be applied at four structural levels.

• The framework could be the basis on which RCB interventions are planned and progress measured. It could act as a basis for comparison across interventions, and could contribute to establishing a knowledge base on what is effective in RCB in healthcare.

Competing interests

The author(s) declare that they have no competing interests.

Pre-publication history

The pre-publication history for this paper can be accessed here:

http://www.biomedcentral.com/1471-2296/6/44/prepub

Acknowledgements

My warm thanks go to my colleagues in the primary care group of the Trent RDSU for reading and commenting on earlier drafts of this paper, and for their continued support in practice.

  • Muir Gray JA. Evidence-based Healthcare: How to make health policy and management decisions. Edinburgh: Churchill Livingstone; 1997.
  • Department of Health. Research and Development for a First Class Service. Leeds: DoH; 2000.
  • Mant D. National working party on R&D in primary care. Final Report. London: NHSE South and West; 1997.
  • Department of Health. Strategic review of the NHS R&D Levy (The Clarke Report). Central Research Department, Department of Health; 1999. p. 11.
  • Campbell SM, Roland M, Bentley E, Dowell J, Hassall K, Pooley J, Price H. Research capacity in UK primary care. British Journal of General Practice. 1999;49:967–970.
  • Department of Health. Towards a strategy for nursing research and development. London: Department of Health; 2000.
  • Ross F, Vernon S, Smith E. Mapping research in primary care nursing: Current activity and future priorities. NT Research. 2002;7:46–59.
  • Marks L, Godfrey M. Developing Research Capacity within the NHS: A summary of the evidence. Leeds: Nuffield Portfolio Programme Report; 2000.
  • Lee M, Saunders K. Oak trees from acorns? An evaluation of local bursaries in primary care. Primary Health Care Research and Development. 2004;5:93–95. doi: 10.1191/1463423604pc197xx.
  • Bateman H, Walter F, Elliott J. What happens next? Evaluation of a scheme to support primary care practitioners with a fledgling interest in research. Family Practice. 2004;21:83–86. doi: 10.1093/fampra/cmh118.
  • Smith LFP. Research general practices: what, who and why? British Journal of General Practice. 1997;47:83–86.
  • Griffiths F, Wild A, Harvey J, Fenton E. The productivity of primary care research networks. British Journal of General Practice. 2000;50:913–915.
  • Fenton F, Harvey J, Griffiths F, Wild A, Sturt J. Reflections from organization science of primary health care networks. Family Practice. 2001;18:540–544. doi: 10.1093/fampra/18.5.540.
  • Department of Health. Research Capacity Development Strategy. London: Department of Health; 2004.
  • Farmer E, Weston K. A conceptual model for capacity building in Australian primary health care research. Australian Family Physician. 2002;31:1139–1142.
  • Carter YH, Shaw S, Sibbald B. Primary care research networks: an evolving model meriting national evaluation. British Journal of General Practice. 2000;50:859–860.
  • Trostle J. Research capacity building and international health: Definitions, evaluations and strategies for success. Social Science and Medicine. 1992;35:1321–1324. doi: 10.1016/0277-9536(92)90035-O.
  • Albert E, Mickan S. Closing the gap and widening the scope: New directions for research capacity building in primary health care. Australian Family Physician. 2002;31:1038–1041.
  • Crisp BR, Swerissen H, Duckett SJ. Four approaches to capacity building in health: consequences for measurement and accountability. Health Promotion International. 2000;15:99–107. doi: 10.1093/heapro/15.2.99.
  • Ryan, Wyke S. The evaluation of primary care research networks in Scotland. British Journal of General Practice. 2001:154–155.
  • Gillies P. Effectiveness of alliances and partnerships for health promotion. Health Promotion International. 1998;13:99–120. doi: 10.1093/heapro/13.2.99.
  • Pitkethly M, Sullivan F. Four years of TayRen, a primary care research and development network. Primary Care Research and Development. 2003;4:279–283. doi: 10.1191/1463423603pc167oa.
  • Lester H, Carter YH, Dassu D, Hobbs F. Survey of research activity, training needs, departmental support, and career intentions of junior academic general practitioners. British Journal of General Practice. 1998;48:1322–1326.
  • North American Primary Care Research Group. What does it mean to build research capacity? Family Medicine. 2002;34:678–684.
  • Smith R. Measuring the social impact of research. BMJ. 2001;323:528. doi: 10.1136/bmj.323.7312.528.
  • Sarre G. Capacity and activity in research project (CARP): supporting R&D in primary care trusts. 2002.
  • Del Mar C, Askew D. Building family/general practice research capacity. Annals of Family Medicine. 2004;2:535–540.
  • Carter YH, Shaw S, Macfarlane F. Primary care research team assessment (PCRTA): development and evaluation. Occasional Paper (Royal College of General Practitioners). 2002;81:1–72.
  • Jowett S, Macleod J, Wilson S, Hobbs F. Research in primary care: extent of involvement and perceived determinants among practitioners from one English region. British Journal of General Practice. 2000;50:387–389.
  • Cooke J, Owen J, Wilson A. Research and development at the health and social care interface in primary care: a scoping exercise in one National Health Service region. Health and Social Care in the Community. 2002;10:435–444. doi: 10.1046/j.1365-2524.2002.00395.x.
  • Raghunath AS, Innes A. The case of multidisciplinary research in primary care. Primary Care Research and Development. 2004;5:265–273.
  • Reagans R, Zuckerman EW. Networks, diversity and productivity: The social capital of corporate R&D teams. Organization Science. 2001;12:502–517. doi: 10.1287/orsc.12.4.502.10637.
  • Ovretveit J. Evaluating Health Interventions. Buckingham: Open University Press; 1998.
  • Meyrick J, Sinkler P. An Evaluation Resource for Healthy Living Centres. London: Health Education Authority; 1999.
  • Hakansson A, Henriksson K, Isacsson A. Research methods courses for GPs: ten years' experience in southern Sweden. British Journal of General Practice. 2000;50:811–812.
  • Bacigalupo B, Cooke J, Hawley M. Research activity, interest and skills in a health and social care setting: a snapshot of a primary care trust in Northern England. Health and Social Care in the Community.
  • Kernick D. Evaluating primary care research networks - exposing a wider agenda. British Journal of General Practice. 2001;51:63.
  • Owen J, Cooke J. Developing research capacity and collaboration in primary care and social care: is there enough common ground? Qualitative Social Work. 2004;3:398–410. doi: 10.1177/1473325004048022.
  • Hurst. Building a research conscious workforce. Journal of Health Organization and Management. 2003;17:373–384.
  • Gillibrand WP, Burton C, Watkins GG. Clinical networks for nursing research. International Nursing Review. 2002;49:188–193. doi: 10.1046/j.1466-7657.2002.00124.x.
  • Campbell J, Longo D. Building research capacity in family medicine: Evaluation of the Grant Generating Project. Journal of Family Practice. 2002;51:593.
  • Cooke J, Nancarrow S, Hammersley V, Farndon L, Vernon W. The "Designated Research Team" approach to building research capacity in primary care. Primary Health Care Research and Development.
  • Innvaer S, Vist G, Trommald M, Oxman A. Health policy-makers' perceptions of their use of evidence: a systematic review. Journal of Health Services Research and Policy. 2002;7:239–244. doi: 10.1258/135581902320432778.
  • NHS Service Delivery Organisation. NHS Service Delivery and Organisation National R&D Programme: National listening exercise. London: NHS SDO; 2000.
  • Hanley J, Bradburn S, Gorin M, Barnes M, Evans C, Goodare HB. Involving consumers in research and development in the NHS: briefing notes for researchers. Winchester: Consumers in NHS Research Support Unit; 2000.
  • Frenk J. Balancing relevance and excellence: organisational responses to link research with decision making. Social Science and Medicine. 1992;35:1397–1404. doi: 10.1016/0277-9536(92)90043-P.
  • National Audit Office. An international review on governments' research procurement strategies. A paper in support of Getting the Evidence: Using Research in Policy Making. London: The Stationery Office; 2003.
  • Thomas P, While A. Increasing research capacity and changing the culture of primary care towards reflective inquiring practice: the experience of the West London Research Network (WeLReN). Journal of Interprofessional Care. 2001;15:133–139. doi: 10.1080/13561820120039865.
  • Rowlands G, Crilly T, Ashworth M, Mager J, Johns C, Hilton S. Linking research and development in primary care: primary care trusts, primary care research networks and primary care academics. Primary Care Research and Development. 2004;5:255–263. doi: 10.1191/1463423604pc201oa.
  • Davies S. R&D for the NHS - Delivering the research agenda. London: National Coordinating Centre for Research Capacity Development; 2005.
  • Department of Health. Best Research for Best Health: A New National Health Research Strategy. The NHS contribution to health research in England: A consultation. London: Department of Health; 2005.
  • Buxton M, Hanney S, Jones T. Estimating the economic value to societies of the impact of health research: a critical review. Bulletin of the World Health Organization. 2004;82:733–739.
  • Department of Health. The NHS as an innovative organisation: A framework and guidance on the management of intellectual property in the NHS. London: Department of Health; 2002.
  • Sarre G. Trent Focus: Supporting research and development in primary care organisations: report of the capacity and activity in research project (CARP). 2003.
  • Department of Health. Research Governance Framework for Health and Social Care. London: Department of Health; 2001.
  • Hill J, Foster N, Hughes R, Hay E. Meeting the challenges of research governance. Rheumatology. 2005;44:571–572. doi: 10.1093/rheumatology/keh579.
  • Shaw S. Developing research management and governance capacity in primary care organizations: transferable learning from a qualitative evaluation of UK pilot sites. Family Practice. 2004;21:92–98. doi: 10.1093/fampra/cmh120.

IMAGES

  1. The research capacity building pyramid inputs and impacts.

    research and capacity building organisations

  2. Institutional Capacity Building Framework

    research and capacity building organisations

  3. Capacity Building

    research and capacity building organisations

  4. Partnership model for research capacity building used within the North

    research and capacity building organisations

  5. Capacity Building

    research and capacity building organisations

  6. Research capacity focus

    research and capacity building organisations

VIDEO

  1. University of Bristol Business School Research Bite

  2. Research Proposal Development

  3. INNOVATION IN RESEARCH: WHAT IS IT?

  4. ASFI Reproducibility Course 2024

  5. Communicating Research to a Wider Audience

  6. Soft Skills for Researchers

COMMENTS

  1. How do NHS organisations plan research capacity development? Strategies

    The potential for building research capacity in these 'core activities' was established by reviewing them through the lens of a RCD framework. Core activities aimed to 'hard wire' RCD into health organisations. They demonstrated a complex interplay between developing a strong internal organisational infrastructure, and supporting ...

  2. Measuring the outcome and impact of research capacity strengthening

    Number of joint activities with other research organizations 10 Develop research networks within and between institutions 28 ... Pager S, Holden L: A thematic analysis of the role of the organisation in building allied health research capacity: a senior managers' perspective. BMC Health Serv Res. 2012; 12:276. 10.1186/1472-6963-12-276 ...

  3. Research engagement and research capacity building: a priority for

    Design/methodology/approach. A mixed-method study presenting three work packages here: secondary analysis of levels of staff research activity, funding, academic outputs and workforce among healthcare organisations in the United Kingdom; 39 Research and Development lead and funder interviews; an online survey of 11 healthcare organisations across the UK, with 1,016 responses from healthcare ...

  4. Capacity building for implementation research: a methodology for

    Capacity building for implementation research: a methodology for advancing health research and practice ... Departments/Agencies etc.), non-governmental organisations in health, communities, and any such institution that may be undertaking relevant and applicable health intervention in real-life contexts. This is a carefully planned activity ...

  5. Organizational capacity building: Addressing a research and practice

    The purpose of this article is to address the gap between evaluation research, and the practice of capacity building with nonprofits. This study describes a 5-year capacity building initiative with grassroots organizations including a longitudinal evaluation of the implementation and outcomes achieved. Formative processes yielded many lessons ...

  6. A framework for managing health research capacity strengthening

    Moreover, management capacity is vital to the sustenance of science systems. 19 Directors of capacity building consortia are often established scientists who are not necessarily trained managers, 20 and management of capacity-strengthening consortia differs from the management of organisations or even research consortia.

  7. Evaluating Health Research Capacity Building: An Evidence-Based Tool

    By combining the definition for generic "capacity building" with published evidence and our practical experiences of developing a planning and evaluation tool, we have defined building capacity for health research as "an ability of individuals, organisations, or systems to perform and utilise health research effectively, efficiently, and ...

  8. PDF Effective research capacity strengthening

    research programmes they support. Effective Research Capacity Strengthening: A Quick Guide for Funders is an accessible guide to the latest evidence and best practice in this field. Having elicited advice and expertise from funders around the world, the authors offer a concise definition of research capacity strengthening, and identify the

  9. Understanding collaboration in a multi-national research capacity

    Research capacity building and its impact on policy and international research partnership is increasingly seen as important. High income and low- and middle-income countries frequently engage in research collaborations. These can have a positive impact on research capacity building, provided such partnerships are long-term collaborations with a unified aim, but they can also have challenges.

  10. Measuring research capacity development in healthcare workers: a

    Objectives A key barrier in supporting health research capacity development (HRCD) is the lack of empirical measurement of competencies to assess skills and identify gaps in research activities. An effective tool to measure HRCD in healthcare workers would help inform teams to undertake more locally led research. The objective of this systematic review is to identify tools measuring healthcare ...

  11. Research capacity building—obligations for global health partners

    Global health continues to gain pace as a discipline, as is evident from the amount of funding available for challenges relevant to low-income and middle-income countries (LMICs)1,2 and the growth of journals in this field. This growth has been driven in no small part by the targets and indicators of the Millennium Development Goals. Successes towards achieving these goals, however, have often ...

  12. Health research, development and innovation capacity building

    Research, development and innovation (RDI) encompasses undertaking research to contribute to new knowledge, developing policies, and generating products and services. Building health RDI capacity should be informed by the developmental gap, required resources and the impact. Low- and middle-income countries often face barriers to reaching their RDI potential. To address some of the RDI ...

  13. Research engagement and research capacity building: a priority for

    Purpose: To research involvement of healthcare staff in the UK and identify practical organisational and policy solutions to improve and boost capacity of the existing workforce to conduct research. Design/methodology/approach: A mixed-method study presenting three work packages here: secondary analysis of levels of staff research activity, funding, academic outputs and workforce among ...

  14. HowtoLeverageActionResearchtoDevelop Context-specific Capacity Building

    At the organizational level, capacity building is accomplished by strengthening processes and policies that address groups, teams, or units. Institutional capacity refers to the ability of individuals, ... Action Research for Capacity Building 53. coursecredits, focuseson the perceived legitimacy of completing acourse/program.

  15. Division for Research Capacity Building (DRCB)

    IDeA is a congressionally mandated program that builds research capacity in states with low levels of NIH funding. It supports competitive basic, clinical, and translational research, research workforce development, and infrastructure improvements. The program aims to strengthen institutions' ability to support biomedical research, enhance the ...

  16. Research capacity building frameworks for allied health professionals

    This systematic review developed a succinct and integrated framework for allied health research capacity building. This framework may be used to inform and guide the design and evaluation of research capacity building strategies targeting individuals, teams, organisations and systems. This framework provides structure in terms of specific ...

  17. How to Leverage Action Research to Develop Context-specific Capacity

    1 Introduction. This research note introduces action research to nonprofit studies. We contribute to the current debates on power and equity in capacity building by proposing how the systemic and adaptive processes of inquiry associated with action research can be harnessed to develop locally embedded capacity development programs that are contextually relevant and responsive to the capacity ...

  18. FEEDCities research project: building capacity and eliminating trans

    building national laboratory capacity to test TFA in food products; ensuring adequate monitoring and enforcement mechanisms are in place to ensure that there is no industrially produced TFA; fostering multisectoral collaboration and partnerships between government agencies, civil society organizations, academic institutions, private sector ...

  19. Key Practices that Inform the Nature of Healthy Leadership:Questions

    Capacity Building The foundational practices of Deep Understanding and Critical Reflection yield a mature, sustainable, empowering presence as we engage them. They instill a depth of understanding that yields us emotional agility, greater awareness of our environment, and insight into our impact as leaders, teachers, parents and community builders.

  20. Closing the Caribbean Data Gap: Addressing Poverty and Inequality

    Many Caribbean countries struggle with weak statistical capacity and low data usage. According to the World Bank's Statistical Performance Indicator (SPI), which measures statistical capacity at the country level, the region ranks lowest in statistical performance compared to other regions, aligning more closely with low-income countries.

  21. MGH adding nearly 100 beds, citing 'capacity crisis'

    Health MGH adding nearly 100 beds when research facility opens, citing 'capacity crisis' The approved addition of 94 inpatient beds will allow beds to stay at MGH instead of relocated to the ...

  22. GSO -YSMO Capacity Building Program on Energy efficiency in Aden

    A training week program related to providing energy efficiency card issuance services concluded today in the Republic of Yemen in the temporary capital, Aden. The program was organized by the GCC Standardization Organization (GSO) in cooperation with the Yemen Standardization, Metrology and Quality Control Organization (YSMO), within the framework of implementing the agreement to provide …

  23. A framework to evaluate research capacity building in health care

    Building research capacity in health services has been recognised internationally as important in order to produce a sound evidence base for decision-making in policy and practice. Activities to increase research capacity for, within, and by practice include initiatives to support individuals and teams, organisations and networks.