BMJ Open, Volume 9, Issue 8

Guidance on how to develop complex interventions to improve health and healthcare

  • Alicia O'Cathain (ORCID: 0000-0003-4033-506X),1
  • Liz Croot (ORCID: 0000-0002-3666-6264),1
  • Edward Duncan (ORCID: 0000-0002-3400-905X),2
  • Nikki Rousseau,2
  • Katie Sworn,1
  • Katrina M Turner (ORCID: 0000-0002-6375-2918),3
  • Lucy Yardley,3,4
  • Pat Hoddinott (ORCID: 0000-0002-4372-9681),2
  • 1 Medical Care Research Unit, School of Health and Related Research, University of Sheffield, Sheffield, UK
  • 2 Nursing, Midwifery and Allied Health Professional Research Unit, University of Stirling, Stirling, UK
  • 3 School of Social and Community Medicine, University of Bristol, Bristol, UK
  • 4 Psychology, University of Southampton, Southampton, UK
  • Correspondence to Professor Alicia O'Cathain; a.ocathain@sheffield.ac.uk

Objective To provide researchers with guidance on actions to take during intervention development.

Summary of key points Based on a consensus exercise informed by reviews and qualitative interviews, we present key principles and actions for consideration when developing interventions to improve health. These include seeing intervention development as a dynamic iterative process, involving stakeholders, reviewing published research evidence, drawing on existing theories, articulating programme theory, undertaking primary data collection, understanding context, paying attention to future implementation in the real world and designing and refining an intervention using iterative cycles of development with stakeholder input throughout.

Conclusion Researchers should consider each action by addressing its relevance to a specific intervention in a specific context, both at the start and throughout the development process.

Keywords: intervention development

This is an open access article distributed in accordance with the Creative Commons Attribution 4.0 International (CC BY 4.0) licence, which permits others to copy, redistribute, remix, transform and build upon this work for any purpose, provided the original work is properly cited, a link to the licence is given, and an indication of whether changes were made. See:  https://creativecommons.org/licenses/by/4.0/ .

https://doi.org/10.1136/bmjopen-2019-029954


Introduction

There is increasing demand for new interventions as policymakers and clinicians grapple with complex challenges, such as integration of health and social care, risk associated with lifestyle behaviours, multimorbidity and the use of e-health technology. Complex interventions are often required to address these challenges. Complex interventions can have a number of interacting components, require new behaviours by those delivering or receiving the intervention or have a variety of outcomes. 1 An example is a multicomponent intervention to help people stand more at work, including a height adjustable workstation, posters and coaching sessions. 2 Careful development of complex interventions is necessary so that new interventions have a better chance of being effective when evaluated and being adopted widely in the real world. Researchers, the public, patients, industry, charities, care providers including clinicians and policymakers can all be involved in the development of new interventions to improve health, and all have an interest in how best to do this.

The UK Medical Research Council (MRC) published influential guidance on developing and evaluating complex interventions, presenting a framework of four phases: development, feasibility/piloting, evaluation and implementation. 1 The development phase is what happens between the idea for an intervention and formal pilot testing in the next phase. 3 This phase was only briefly outlined in the original MRC guidance and requires extension to offer more help to researchers wanting to develop complex interventions. Bleijenberg and colleagues brought together learning from a range of guides and published approaches to intervention development to enrich the MRC framework. 4 There are also multiple sources of guidance on intervention development, embodied in books and journal articles about different approaches to intervention development (for example 5 ) and in overviews of the different approaches. 6 These approaches may offer conflicting advice, so it is timely to gain consensus on key aspects of intervention development to help researchers focus on this endeavour. Here, we present guidance on intervention development based on a consensus study, which we describe below. We present this guidance as an accessible communication article on how to do intervention development, aimed at readers who are developers, including those new to the endeavour. We do not present it as a ‘research article’ with methods and findings, in order to maximise its use as guidance. Lengthy detail and a long list of references are not provided so that the guidance stays focused and user friendly. In addition, the key actions of intervention development are summarised in a single table so that funding panel members and developers can use it as a quick reference point for issues to consider when developing health interventions.

How this guidance was developed

This guidance is based on a study funded by the MRC and the National Institute for Health Research in the UK, with triangulation of evidence from three sources. First, we undertook a review of published approaches to intervention development that offer developers guidance on specific ways to develop interventions 6 and a review of primary research reporting intervention development. The next two phases involved developers and wider stakeholders. Developers were people who had written articles or books detailing different approaches to developing interventions and people who had developed interventions. Wider stakeholders were people involved in the wider intervention development endeavour: directors of research funding panels, editors of journals that had published intervention development studies, people who had been public and patient involvement members of studies involving intervention development and people working in health service implementation. We carried out qualitative interviews 7 and then conducted a consensus exercise consisting of two simultaneous and identical e-Delphi studies, distributed to intervention developers and wider stakeholders respectively, followed by a consensus workshop. We generated items for the e-Delphi studies based on our earlier reviews and analysis of interview data and asked participants to rate 85 items on a five-point scale from ‘very important’ to ‘not important’, using the question ‘when developing complex interventions to improve health, how important is it to…’. The distribution of answers to each item is displayed in Appendix 1, and e-Delphi participants are described in Appendix 2. In addition to these research methods, we convened an international expert panel with members from the UK, USA and Europe early in the project to guide the research. Members of this expert panel participated in the e-Delphi studies and consensus workshop alongside other participants.
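To make the consensus step concrete, the sketch below shows one way that ratings on a five-point importance scale could be tallied for a single e-Delphi item. It is an illustration only: the item wording, ratings and 70% agreement threshold are hypothetical and are not taken from the study's actual analysis.

```python
# Illustrative sketch only: summarising e-Delphi ratings on a five-point
# importance scale (1 = not important ... 5 = very important). The item text,
# ratings and 70% threshold are hypothetical, not the study's data or rules.
from collections import Counter

def summarise_item(item: str, ratings: list[int], threshold: float = 0.70) -> dict:
    """Return the rating distribution and whether consensus on importance was reached."""
    counts = Counter(ratings)
    prop_important = sum(counts[r] for r in (4, 5)) / len(ratings)
    return {
        "item": item,
        "n": len(ratings),
        "distribution": dict(sorted(counts.items())),
        "proportion_rating_4_or_5": round(prop_important, 2),
        "consensus_important": prop_important >= threshold,
    }

if __name__ == "__main__":
    print(summarise_item(
        "involve stakeholders throughout the development process",
        ratings=[5, 5, 4, 4, 5, 3, 4, 5, 2, 5],
    ))
```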

Framework for intervention development

We base this guidance on expert opinion because there is a gap in the research evidence about which actions are needed in intervention development to produce successful health interventions. Systematic reviews have been undertaken to determine whether following a specific published approach, or undertaking a specific action, results in effective interventions. Unfortunately, this evidence base is sparse in the field of health, largely due to the difficulty of empirically addressing this question. 8 9 Evidence tends to focus on the use of existing theory within intervention development—for example, the theory of Diffusion of Innovation or theories of behaviour change—and a review of reviews shows that interventions developed using existing theory are not more effective than those developed without it. 10 The authors of this latter review highlight problems with the evidence base rather than dismiss the possibility that existing theory could help produce successful interventions.

Key principles and actions of intervention development are summarised below. More detailed guidance for the principles and actions is available at https://www.sheffield.ac.uk/scharr/sections/hsr/mcru/indexstudy .

Key principles of intervention development

Key principles of intervention development are that it is dynamic, iterative, creative, open to change and forward looking to future evaluation and implementation. Developers are likely to move backwards and forwards dynamically between overlapping actions within intervention development, such as reviewing evidence, drawing on existing theory and working with stakeholders. There will also be iterative cycles of developing a version of the intervention: getting feedback from stakeholders to identify problems, implementing potential solutions, assessing their acceptability and starting the cycle again until assessment of later iterations of the intervention produces few changes. These cycles will involve using quantitative and qualitative research methods to measure processes and intermediate outcomes, and assess the acceptability, feasibility, desirability and potential unintended harms of the intervention.

Developers may start the intervention development with strong beliefs about the need for the intervention, its content or format or how it should be delivered. They may also believe that it is possible to develop an intervention with a good chance of being effective or that it can only do good not harm. Being open to alternative possibilities throughout the development process may lead to abandoning the endeavour or taking steps back as well as forward. The rationale for being open to change is that this may reduce the possibility of developing an intervention that fails during future evaluation or is never implemented in practice. Developers may also benefit from looking forward to how the intervention will be evaluated so they can make plans for this and identify learning and key uncertainties to be addressed in future evaluation.

Key actions of intervention development

Key actions for developers to consider are summarised in table 1 and explored in more detail throughout the rest of the paper. It may not be possible or desirable for developers to address all these actions during their development process, and indeed some may not be relevant to every problem or context. The recommendation made here is that developers ‘consider the relevance and importance of these actions to their situation both at the start of, and throughout, the development process’.

Table 1 Framework of actions for intervention development

These key actions are set out in table 1 in what appears to be a sequence. In practice, however, they are addressed dynamically: undertaken in parallel, revisited regularly as the intervention evolves, and interacting with each other as learning from one action influences plans for others. These actions are explored in more detail below and presented in a logic model for intervention development ( figure 1 ). A logic model is a diagram of how an intervention is proposed to work, showing the mechanisms by which an intervention influences the proposed outcomes. 11 The short-term and long-term effects of successful intervention development were informed by the qualitative interviews with developers and wider stakeholders. 7

Figure 1 Logic model for intervention development.

Plan the development process

Understand the problem

Developers usually start with a problem they want to solve. They may also have some initial ideas about the content, format or delivery of the proposed intervention. Knowledge about the problem and the possibilities for an intervention may be based on personal experience of the problem (for patients, carers or members of the public), professional experience (for practitioners, policymakers and researchers), published research or theory, or discussions with stakeholders. These early ideas about the intervention may be refined, and indeed challenged, throughout the intervention development process. For example, understanding the problem, the priorities for addressing it and the aspects that are amenable to change is part of the development process, and different solutions may emerge as understanding increases. In addition, developers may find that it is not necessary to develop a new intervention because effective or cost-effective ones already exist. It may not be worth developing a new intervention if the potential cost is likely to outweigh the potential benefits, if its limited reach could increase health inequalities, or if the current context is not conducive to using it. Health economists may contribute to this debate.

Identify resources—time and funding

Once a decision has been made that a new intervention is necessary, and has the potential to be worthwhile, developers can consider the resources available to them. Spending too little time developing an intervention may result in a flawed intervention that is later found not to be effective or cost-effective, or that is not implemented in practice, resulting in research waste. Alternatively, spending too much time on development could also waste resources by leaving developers with an outdated intervention that is no longer acceptable or feasible to deliver because the context has changed, or that is no longer a priority. A highly complex problem with a history of failed interventions is likely to warrant more time for careful development.

Some funding bodies fund standalone intervention development studies or fund this endeavour as part of a programme of development, piloting and evaluation of an intervention. While pursuing such funding may be desirable to ensure sufficient resource, in practice some developers may not be able to access this funding and may have to fund different parts of the development process from separate pots of money over a number of years.

Applying for funding requires writing a protocol for a study. Funders need detail about the proposed intervention and the development process to make a funding decision. It may feel difficult to specify the intervention and the detail of its development before starting because these will depend on learning occurring throughout the development process. Developers can address this by describing in detail their best guess of the intervention and their planned development process, recognising that both are likely to change in practice. Even if funding is not sought, it may be a good idea to produce a protocol detailing the processes to be undertaken to develop the intervention so that sufficient resources can be identified.

Decide which approach to intervention development to take

A key decision for teams is whether to be guided by one of the many published approaches to intervention development or undertake a more pragmatic self-selected set of actions. A published approach is a guide to the process and methods of intervention development set out in a book, website or journal article. The rationale for using a published approach is that it sets out systematic processes that other developers have found useful. Some published approaches and approaches that developers have used in practice are listed in table 2 . 6 No research has shown that one of these approaches is better than another or that their use always leads to the development of successful interventions. In practice, developers may select a specific published approach because of the purpose of their intervention development, for example, aiming to change behaviour might lead to the use of the Behaviour Change Wheel or Intervention Mapping, in conjunction with the Person Based Approach. Alternatively, selection may depend on developers’ beliefs or values, for example, partnership approaches such as coproduction may be selected because developers believe that users will find the resultant interventions more acceptable and feasible, or they may value inclusive work practices in their own right. Although developers may follow a published approach closely, experts recommend that developers apply these approaches flexibly to fit their specific context. Many of these approaches share the same actions 4 6 and simply place more emphasis on one or a subset of actions. Researchers sometimes combine the use of different approaches in practice to gain the strengths of two approaches, as in the ‘Combination’ category of table 2 .

Table 2 Different approaches to intervention development

Involve stakeholders throughout the development process

Many groups of people are likely to have a stake in the proposed intervention: patients or the public may be the target of the intervention or be expected to use it; practitioners may deliver it in a range of settings, for example, hospitals, primary care, community care, social care, schools, communities and voluntary/third sector organisations; and users, policymakers or taxpayers may pay for it. The rationale for involving relevant stakeholders from the start, and indeed working closely with them throughout, is that they can help to identify priorities, understand the problem and find solutions that may make a difference to future implementation in the real world.

There are many ways of working with stakeholders, and different ways may be relevant for different stakeholders at different times during the development process. Consultation may sometimes be appropriate, where a one-off meeting with a set of stakeholders helps developers to understand the context of the problem or the context in which the intervention would operate. Alternatively, the intervention may be designed closely with stakeholders using a coproduction process, where stakeholders and developers generate ideas about potential interventions and make decisions together throughout the development process about its content, format, style and delivery. 12 This could involve a series of workshops and meetings to build relationships over time to facilitate understanding of the problem and generation of ideas for the new intervention. Coproduction rather than consultation is likely to be important when buy-in is needed from a set of stakeholders to facilitate the feasibility of, acceptability of and engagement with the intervention, or when the health problem or context is particularly complex. Coproduction involves stakeholders in this decision-making, whereas with consultation, decisions are made by the research team. Stakeholders’ views may also be obtained through qualitative interviews, surveys and stakeholder workshops, with methods tailored to the needs of each stakeholder. Innovative activities can be used to help engage stakeholders; for example, creative sessions facilitated by a design specialist might involve imagining what versions of the new intervention would look like if designed by various well-known global manufacturers, or creating a patient persona to help people think through the experience of receiving an intervention. As well as participating in developing the intervention, stakeholders can help to shape the intervention development process itself. Members of the public, patients and service users are key stakeholders, and experts recommend planning to integrate their involvement into the intervention development process from the start.

Bring together a team and establish decision-making processes

Developers may choose to work within any size of team. Small teams can reach out to stakeholders at different points in the development process. Alternatively, large teams may include all the necessary expertise. Experts recommend including: experts in the problem to be addressed by the intervention; individuals with a strong track record in developing complex interventions; a behaviour change scientist when the intervention aims to change behaviour and people who are skilled at maximising engagement of stakeholders. Other possible team members include experts in evaluation methods and economics. Within a coproduction approach to development, key stakeholders participate as equal partners with researchers. Large teams can generate ideas and ensure all the relevant skills are available but may also increase the risk of conflicting views and difficulties when making decisions about the final intervention. There is no consensus on the size of team to have, but experts think it is important to agree a process for making decisions. In particular, experts recommend that team members understand their roles, rights and responsibilities; document the reasons for decisions made and are prepared to test different options where there are team disagreements.

Review published research evidence

Reviewing published research evidence before starting to develop an intervention can help to define the health problem and its determinants, understand the context in which the problem exists, clarify who the intervention should be aimed at, identify whether effective or cost-effective interventions already exist for the target population/setting/problem, identify facilitators and barriers to delivering interventions in this context and identify key uncertainties that need to be addressed using primary data collection. Continuing to review evidence throughout the process can help to address uncertainties that arise, for example, if a new substantive intervention component is proposed then the research evidence about it can be explored. Evidence can change quickly, and keeping up with it by reviewing literature can alert developers to new relevant interventions that have been found to be effective or cost-effective. Developers may be tempted to look for evidence that supports existing ideas and plans, but should also look for, and take into account, evidence that the proposed intervention may not work in the way intended. Undertaking systematic reviews is not always necessary because there may be recent relevant reviews available, nor is it always possible in the context of tight resources available to the development team. However, undertaking some review is important for ensuring that there are no existing interventions that would make the one under development redundant.

Draw on existing theories

Some developers call their approaches to intervention development ‘theory based’ when they draw on psychological, sociological, organisational or implementation theories, or frameworks of theories, to inform their intervention. 6 The rationale for drawing on existing theories is that they can help to identify what is important, relevant and feasible to inform the intended goals of the intervention 13 and inform the content and delivery of any intervention. It may be relevant to draw on more than one existing theory. Experts recommend considering which theories are relevant at the start of the development process. However, the use of theories may need to be kept under scrutiny since in practice some developers have found that their selected theory proved difficult to apply during the development process.

Articulate programme theory

A programme theory describes how a specific intervention is expected to lead to its effects and under what conditions. 14 It shows the causal pathways between the content of the intervention, intermediate outcomes and long-term goals and how these interact with contextual factors. Articulating programme theory at the start of the development process can help to communicate to funding agencies and stakeholders how the intervention will work. Existing theories may inform this programme theory. Logic models can be drawn to communicate different parts of the programme theory such as the causes of a problem, or the mechanisms by which an intervention will achieve outcomes, to both team members and external stakeholders. Figure 1 is an example of a logic model. The programme theory and logic models are not static. They should be tested and refined throughout the development process using primary and secondary data collection and stakeholder input. Indeed, they are advocated for use in process evaluations alongside outcome evaluations in the recent MRC Guidance on process evaluation. 15
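As a rough illustration of how a programme theory can be made explicit and kept under revision, the snippet below captures a hypothetical logic model as structured data that a team could version and refine. All component, mechanism and outcome names are invented placeholders rather than content from the guidance.

```python
# Minimal sketch: representing parts of a programme theory as structured data
# so it can be shared, versioned and refined alongside the intervention.
# Every component, mechanism and outcome below is an invented placeholder.
logic_model = {
    "problem": "low uptake of an existing service",
    "components": [
        {
            "name": "tailored reminder messages",
            "mechanisms": ["prompts action at a relevant moment"],
            "intermediate_outcomes": ["more appointments attended"],
        },
        {
            "name": "staff training session",
            "mechanisms": ["increases staff confidence in delivery"],
            "intermediate_outcomes": ["intervention delivered as intended"],
        },
    ],
    "long_term_outcomes": ["improved health outcome of interest"],
    "contextual_factors": ["staffing levels", "local policy environment"],
}

# Each refinement cycle could update the structure and record why it changed.
logic_model["revision_notes"] = ["v2: mechanism reworded after stakeholder workshop"]
print(logic_model["components"][0]["mechanisms"])
```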

Undertake primary data collection

Primary data collection, usually involving mixed methods, can be used for a range of purposes throughout the intervention development process. Reviewing the evidence base may identify key uncertainties that primary data collection can then address. Non-participant observation can be used to understand the setting in which the intervention will be used. Qualitative interviews with the target population or patient group can identify what matters most to people, their lived experience or why people behave as they do. ‘Verbal protocol’, which involves users of an intervention talking aloud about it as they use it, 16 can be undertaken to understand the usability of early versions of the intervention. Pretest and post-test measures may be taken of intermediate outcomes to begin early testing of some aspects of the programme theory, an activity that will continue into the feasibility and evaluation phases of the MRC framework and may lead to changes to the programme theory. Surveys, discrete choice experiments or qualitative interviews can be used to assess the acceptability, values and priorities of those delivering and receiving the intervention.
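For the pretest and post-test element, the following sketch shows the kind of simple paired comparison of an intermediate outcome a team might run during early testing. The scores are invented, the paired t-test is only one possible choice, and a real analysis plan would be agreed with a statistician.

```python
# Illustrative sketch only: a pretest/post-test comparison of a hypothetical
# intermediate outcome (e.g. a confidence score) collected from the same
# participants during early testing. Scores are invented for illustration.
from statistics import mean
from scipy import stats  # assumes scipy is installed

pre =  [3.1, 2.8, 3.5, 3.0, 2.9, 3.2, 3.4, 2.7]
post = [3.6, 3.0, 3.9, 3.4, 3.1, 3.8, 3.5, 3.2]

# Paired t-test because the same participants are measured before and after.
t_stat, p_value = stats.ttest_rel(post, pre)
print(f"mean change = {mean(post) - mean(pre):.2f}, t = {t_stat:.2f}, p = {p_value:.3f}")
```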

Understand the context

Recent guidance on context in population health intervention research identifies a breadth of features including those relating to population and individuals; physical location or geographical setting; social, economic, cultural and political influences and factors affecting implementation, for example, organisation, funding and policy. 17 An important context is the specific setting in which the intervention will be used, for example, within a busy emergency department or within people’s homes. The rationale for understanding this context, and developing interventions which can operate within it, is to avoid developing interventions that fail during later evaluation because too few people deliver or use them. Context also includes the wider complex health and social care, societal or political systems within which any intervention will operate. 18 Different approaches can be taken to understand context, including reviews of evidence, stakeholder engagement and primary data collection. A challenge of understanding context is that it may change rapidly over the course of the development process.

Pay attention to future implementation of the intervention in the real world

The end goal of developers, and of those who fund development, is real-world implementation rather than simply the development of an intervention that is shown to be effective or cost-effective in a future evaluation. 7 Many interventions do not lead to change in policy or practice, and it is important that effective interventions inform policy and are eventually used in the real world to improve health and care. To achieve this goal, developers may pay attention early in the development process to factors that might affect use of the intervention, its ‘scale up’ for use nationally or internationally, and its sustainability. For example, considering the cost of the intervention at an early stage, including as stakeholders the official bodies or policymakers that would endorse or accredit the intervention, or addressing the challenges of training practitioners to deliver the intervention, may help its future implementation. Implementation-based approaches to intervention development are listed in table 2 . Some other approaches listed in this table, such as Normalisation Process Theory, also emphasise implementation in the real world.

Design and refine the intervention

The term ‘design’ is sometimes used interchangeably with the term ‘development’. However, it is useful to see design as a specific creative part of the development process where ideas are generated, and decisions are made about the intervention components and how it will be delivered, by whom and where. Design starts with generation of ideas about the content, format, style and delivery of the proposed intervention. The process of design may use creative ways of generating ideas, for example, using games or physically making rough prototypes. Some teams include experts in design or use designers external to the team when undertaking this action. The rationale for a wide-ranging and creative design process is to identify innovative and workable ideas that may not otherwise have been considered.

After generating ideas, a mock-up or prototype of the intervention, or of a key component, may be created to allow stakeholders to offer views on it. Once an early version or prototype of the intervention is available, it can be refined (sometimes called optimised) using a series of rapid iterations, where each iteration includes an assessment of how acceptable, feasible and engaging the intervention is, leading to cycles of refinements. The programme theory and logic models are important at this point, and developers may test whether some of their proposed mechanisms of action are affecting intermediate outcomes if statistical power allows. The rationale for spending time on multiple iterations is that problems can be identified and solutions found prior to any expensive future feasibility or evaluation phase. Some experts take a quantitative approach to optimisation of an intervention, specifically the Multiphase Optimization Strategy in table 2 , but not all experts agree that this is necessary.
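The refinement cycle described above can be thought of as a loop that stops once stakeholder feedback suggests only minor changes. The sketch below expresses that logic; the helper functions are hypothetical stubs and the thresholds are arbitrary illustrations, not recommendations from the guidance.

```python
# Sketch of an iterate-until-few-changes refinement loop. The helper functions
# are hypothetical stubs standing in for real stakeholder feedback and design
# work; the iteration cap and "few changes" threshold are arbitrary examples.
from typing import List, Tuple

def collect_stakeholder_feedback(version: str) -> List[str]:
    """Stub: in practice, qualitative and quantitative feedback on acceptability,
    feasibility, engagement and possible unintended harms."""
    return []  # placeholder

def apply_refinements(version: str, refinements: List[str]) -> str:
    """Stub: produce the next version of the intervention."""
    return version + "+"

def refine_intervention(initial_version: str,
                        max_iterations: int = 10,
                        few_changes_threshold: int = 2) -> Tuple[str, int]:
    version = initial_version
    for iteration in range(1, max_iterations + 1):
        refinements = collect_stakeholder_feedback(version)
        if len(refinements) <= few_changes_threshold:
            return version, iteration  # few changes suggested: stop intensive refinement
        version = apply_refinements(version, refinements)
    return version, max_iterations     # practical limit set by time and funding

print(refine_intervention("prototype v1"))
```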

End the development phase

Seeing this endeavour as a discrete ‘intervention development phase’ that comes to an end may feel artificial. In practice, there is overlap between some actions taken in the development phase and the feasibility phase of the MRC framework, 1 such as consideration of acceptability and some measurement of change in intermediate outcomes. Developers may return to the intervention development phase if findings from the feasibility phase identify significant problems with the intervention. In many ways, development never stops because developers will continue to learn about the intervention, and refine it, during the later pilot/feasibility, evaluation and implementation phases. The intention may be that some types of intervention continuously evolve during evaluation and implementation, which may reduce the amount of time spent on the development phase. However, developers need to decide when to stop that first intensive development phase, either by abandoning the intervention because pursuing it is likely to be futile or by moving on to the next phase of feasibility/pilot testing or full evaluation. They also face the challenge of convincing potential funders of an evaluation that enough development has occurred to risk spending resources on a pilot or evaluation. The decision to end the development phase may be partly informed by practicalities, such as the amount of time and money available, and partly by the concept of data saturation (used in qualitative research): the intensive process stops when few refinements are suggested by those delivering or using the intervention during its period of refinement, or when these and other stakeholders indicate that the intervention feels appropriate to them.

At the end of the development process, policymakers, developers or service providers external to the original team may want to implement or evaluate the intervention. Describing the intervention, using one of the relevant reporting guidelines such as the Template for Intervention Description and Replication Checklist 19 and producing a manual or document that describes the training as well as content of the intervention can facilitate this. This information can be made available on a website, and, for some digital interventions, the intervention itself can be made available. It is helpful to publish the intervention development process because it allows others to make links in the future between intervention development processes and the subsequent success of interventions and learn about intervention development endeavours. Publishing failed attempts to develop an intervention, as well as those that produce an intervention, may help to reduce research waste. Reporting multiple, iterative and interacting processes in these articles is challenging, particularly in the context of limited word count for some journals. It may be necessary to publish more than one paper to describe the development if multiple lessons have been learnt for future development studies.

Conclusions

This guidance on intervention development presents a set of principles and actions for future developers to consider throughout the development process. There is insufficient research evidence to recommend that a particular published approach or set of actions is essential to produce a successful intervention. Some aspects of the guidance may not be relevant to some interventions or contexts, and not all developers are fortunate enough to have a large amount of resource available to them, so a flexible approach to using the guidance is required. The best way to use the guidance is to consider each action by addressing its relevance to a specific intervention in a specific context, both at the start and throughout the development process.


Acknowledgments

This guidance is based on secondary and primary research. Many thanks to participants in the e-Delphis, consensus conference and qualitative interviews, to members of our Expert Panel and to people who attended workshops discussing this guidance. The researchers leading the update of the MRC guidance on developing and evaluating interventions, due to be published later this year, also offered insightful comments on our guidance to facilitate fit between the two sets of guidance.

References (author details only; full citations not preserved)

  • Macintyre S, et al
  • Edwardson CL, Biddle SJH, et al
  • Hoddinott P
  • Bleijenberg N, de Man-van Ginkel JM, Trappenburg JCA, et al
  • Bartholomew Eldredge LK, Parcel GS, Kok G, et al
  • O’Cathain A, Sworn K, et al
  • Turner KM, Rousseau N, Croot L, et al
  • Harris R, et al
  • Dalgetty R, Miller CB, Dombrowski SU
  • W K Kellogg Foundation
  • Davidoff F, Dixon-Woods M, Leviton L, et al
  • Barker M, et al
  • Fonteyn ME, Kuipers B
  • Di Ruggiero E, Frohlich K, et al
  • Hawkins J, et al
  • Hoffmann TC, Glasziou PP, Boutron I, et al

Contributors AOC and PH led the development of the guidance, wrote the first draft of the article and the full guidance document which it describes, and integrated contributions from the author group into subsequent drafts. All authors contributed to the design and content of the guidance and subsequent drafts of the paper (AOC, PH, LY, LC, NR, KMT, ED, KS). The guidance is based on reviews and primary research. AOC led the review of different approaches to intervention development working with KS. LC led the review of primary research working with KS. PH led the qualitative interview study working with NR, KMT and ED. ED led the consensus exercise working with NR. AOC acts as guarantor.

Funding MRC-NIHR Methodology Research Panel (MR/N015339/1). Funders had no influence on the guidance presented here. The authors were fully independent of the funders.

Competing interests None declared.

Patient consent for publication Not required.

Provenance and peer review Not commissioned; externally peer reviewed.


NIHR publishes new framework on complex interventions to improve health


Published: 01 October 2021

The NIHR and the Medical Research Council (MRC) have launched a new complex intervention research framework.

The new framework provides an updated definition of complex interventions, highlighting the dynamic relationship between the intervention and its context.

Complex interventions are widely used in the health service, in public health practice, and in areas of social policy that have important health consequences, such as education, transport, and housing. 

Interest in complex interventions has increased rapidly in recent years. Given the pace and extent of methodological development, there was a need to update the core guidance and address some of the remaining weaknesses and gaps. 

Using the framework’s core elements

There are four main phases of research: intervention development or identification (e.g. from policy or practice), feasibility, evaluation, and implementation.

At each phase, the guidance suggests that six core elements should be considered: 

  • how does the intervention interact with its context?
  • what is the underpinning programme theory?
  • how can diverse stakeholder perspectives be included in the research? 
  • what are the key uncertainties?
  • how can the intervention be refined?
  • do the effects of the intervention justify its cost?  

These core elements can be used to decide whether the research should proceed to the next phase, return to a previous phase, repeat a phase, or stop.

The journey of a research project through the phases of complex intervention research is illustrated by the NIHR-funded study below: Football Fans In Training (FFIT).

A randomised controlled trial set in professional football clubs established the effectiveness and cost-effectiveness of the Football Fans in Training (FFIT) programme. FFIT aimed to help men lose at least 5-10% of their weight and keep it off over the long term. The programme was developed to appeal to Scottish football fans and to help them improve their eating and activity habits.

Researchers found that participation in FFIT led to significant sustained weight loss and improvements in diet and physical activity. As well as losing weight, participants benefited from reduced waist size, less body fat and lower blood pressure, all of which are associated with a lower risk of heart disease, diabetes and stroke. The study team considered all six core elements of complex intervention research during each of the four phases of the research.

Implementation was considered from the outset: the study team engaged with key stakeholders in the development phase to explore how the intervention could be implemented in practice if proven to be effective.

A cost-effectiveness analysis demonstrated that FFIT was inexpensive to deliver, making it appealing to decision makers for local and national health provision. The positive and sustainable results have made the programme appealing to nations with similar public health priorities, such as reducing obesity and heart disease and improving mental health.

Professor Hywel Williams, NIHR Scientific and Coordinating Centre Programmes Contracts Advisor, said: “This updated framework is a landmark piece of guidance for researchers working on such interventions. The updated guidance will help researchers to develop testable and reproducible interventions that will ultimately benefit NHS patients. The guidance also represents a terrific collaborative effort between the NIHR and MRC that I would like to see more of.”

Professor Nick Wareham, Chair of MRC’s Population Health Sciences Group, said: “Previous versions of the guidance on the development and evaluation of complex interventions have been extremely influential and are widely used in the field. We are delighted that the successful partnership between MRC and NIHR has enabled the guidance to be updated and extended. It is particularly important to see how the new framework brings in thinking about the interplay between an intervention and the context in which it is applied.”

Dr Kathryn Skivington, Research Fellow, MRC/CSO Social and Public Health Sciences Unit and lead author of the framework, said: “The new and exciting developments for complex intervention research are of practical relevance and I feel sure they will stimulate constructive debate, leading to further progress in this area.”

Read the full paper, published in the British Medical Journal

Find out more in the NIHR Journals Library



A new framework for developing and evaluating complex interventions: update of Medical Research Council guidance

Affiliations

  • 1 MRC/CSO Social and Public Health Sciences Unit, Institute of Health and Wellbeing, University of Glasgow, Glasgow, UK.
  • 2 Medical Research Council Lifecourse Epidemiology Unit, University of Southampton, Southampton, UK.
  • 3 Medical Research Council ConDuCT-II Hub for Trials Methodology Research and Bristol Biomedical Research Centre, Bristol, UK.
  • 4 Health Economics and Health Technology Assessment Unit, Institute of Health and Wellbeing, University of Glasgow, Glasgow, UK.
  • 5 Public Health Scotland, Glasgow, UK.
  • 6 Manchester Centre for Health Psychology, University of Manchester, Manchester, UK.
  • 7 London School of Hygiene and Tropical Medicine, London, UK.
  • 8 Faculty of Health and Medicine, Lancaster University, Lancaster, UK.
  • 9 Medical Research Council Epidemiology Unit, University of Cambridge, Cambridge, UK.
  • PMID: 34593508
  • PMCID: PMC8482308
  • DOI: 10.1136/bmj.n2061

The UK Medical Research Council’s widely used guidance for developing and evaluating complex interventions has been replaced by a new framework, commissioned jointly by the Medical Research Council and the National Institute for Health Research, which takes account of recent developments in theory and methods and the need to maximise the efficiency, use, and impact of research.


Conflict of interest statement

Competing interests: All authors have completed the ICMJE uniform disclosure form at http://www.icmje.org/coi_disclosure.pdf and declare: support from the NIHR, MRC, and the funders listed above for the submitted work; KS has project grant funding from the Scottish Government Chief Scientist Office; SAS is a former member of the NIHR Health Technology Assessment Clinical Evaluation and Trials Programme Panel (November 2016 - November 2020) and member of the Chief Scientist Office Health HIPS Committee (since 2018) and NIHR Policy Research Programme (since November 2019), and has project grant funding from the Economic and Social Research Council, MRC, and NIHR; LMo is a former member of the MRC-NIHR Methodology Research Programme Panel (2015-19) and MRC Population Health Sciences Group (2015-20); JB is a member of the NIHR Public Health Research Funding Committee (since May 2019), and a core member (since 2016) and vice chairperson (since 2018) of a public health advisory committee of the National Institute for Health and Care Excellence; JMB is a former member of the NIHR Clinical Trials Unit Standing Advisory Committee (2015-19); DPF is a former member of the NIHR Public Health Research programme research funding board (2015-2019), the MRC-NIHR Methodology Research Programme panel member (2014-2018), and is a panel member of the Research Excellence Framework 2021, subpanel 2 (public health, health services, and primary care; November 2020 - February 2022), and has grant funding from the European Commission, NIHR, MRC, Natural Environment Research Council, Prevent Breast Cancer, Breast Cancer Now, Greater Sport, Manchester University NHS Foundation Trust, Christie Hospital NHS Trust, and BXS GP; EM is a member of the NIHR Public Health Research funding board; MP has grant funding from the MRC, UK Prevention Research Partnership, and NIHR; JR-M is programme director and chairperson of the NIHR’s Health Services Delivery Research Programme (since 2014) and member of the NIHR Strategy Board (since 2014); MW received a salary as director of the NIHR PHR Programme (2014-20), has grant funding from NIHR, and is a former member of the MRC’s Population Health Sciences Strategic Committee (July 2014 to June 2020). There are no other relationships or activities that could appear to have influenced the submitted work.

Figure: Framework for developing and evaluating complex interventions. Context=any feature of the circumstances in…




Research article | Open access | Published: 23 January 2018

Using the Medical Research Council framework for development and evaluation of complex interventions in a low resource setting to develop a theory-based treatment support intervention delivered via SMS text message to improve blood pressure control

  • Kirsten Bobrow   ORCID: orcid.org/0000-0002-2452-2482 1 , 2 , 7 ,
  • Andrew Farmer 3 ,
  • Nomazizi Cishe 4 ,
  • Ntobeko Nwagi 4 ,
  • Mosedi Namane 5 ,
  • Thomas P. Brennan 6 ,
  • David Springer 6 ,
  • Lionel Tarassenko 6 &
  • Naomi Levitt 1 , 2  

BMC Health Services Research, volume 18, Article number: 33 (2018)


Several frameworks now exist to guide intervention development but there remains only limited evidence of their application to health interventions based around use of mobile phones or devices, particularly in a low-resource setting. We aimed to describe our experience of using the Medical Research Council (MRC) Framework on complex interventions to develop and evaluate an adherence support intervention for high blood pressure delivered by SMS text message. We further aimed to describe the developed intervention in line with reporting guidelines for a structured and systematic description.

We used a non-sequential and flexible approach guided by the 2008 MRC Framework for the development and evaluation of complex interventions.

We reviewed published literature and established a multi-disciplinary expert group to guide the development process. We selected health psychology theory and behaviour change techniques that have been shown to be important in adherence and persistence with chronic medications. Semi-structured interviews and focus groups with various stakeholders identified ways in which treatment adherence could be supported and also identified key features of well-regarded messages: polite tone, credible information, contextualised, and endorsed by an identifiable member of primary care facility staff. Direct and indirect user testing enabled us to refine the intervention, including refining the use of language and testing interactive components.
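As an illustration of how the message features identified in this formative work (polite tone, credible information, contextualised, endorsed by a named member of clinic staff) might be operationalised, the sketch below composes a single-segment SMS from a template. The template text, names and length check are invented examples, not the trial's actual messages.

```python
# Illustrative sketch only: composing an SMS reminder that reflects the message
# features identified above. The template wording, recipient, clinic and staff
# names are invented placeholders, and the 160-character limit is a simple
# single-segment check rather than a full SMS encoding rule.
MAX_SMS_CHARS = 160

TEMPLATE = ("Dear {name}, please remember to collect your blood pressure "
            "medicine at {clinic} this week. Taking it every day protects "
            "your heart. - Sister {staff_name}")

def compose_sms(name: str, clinic: str, staff_name: str) -> str:
    message = TEMPLATE.format(name=name, clinic=clinic, staff_name=staff_name)
    if len(message) > MAX_SMS_CHARS:
        raise ValueError("message exceeds a single SMS segment")
    return message

print(compose_sms("Thandi", "Main Road Clinic", "Nomsa"))
```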

Conclusions

Our experience shows that using a formal intervention development process is feasible in a low-resource, multilingual setting. The process enabled us to pre-test assumptions about the intervention and the evaluation process, allowing improvement of both. Describing how a multi-component intervention was developed, including standardised descriptions of content intended to support behaviour change, will enable comparison with other similar interventions and support the development of new interventions. Even in low-resource settings, funders and policy-makers should provide researchers with time and resources for intervention development work and encourage evaluation of the entire design and testing process.

Trial registration

The trial of the intervention is registered with South African National Clinical Trials Register number (SANCTR DOH-27-1212-386; 28/12/2012); Pan Africa Trial Register (PACTR201411000724141; 14/12/2013); ClinicalTrials.gov ( NCT02019823 ; 24/12/2013).


Background

Raised blood pressure is an important and common modifiable risk factor for cardiovascular and related diseases, including stroke and chronic kidney disease [ 1 ]. Although evidence exists that lowering blood pressure substantially reduces this risk, strategies to achieve sustained blood pressure control are complex. These include modifying a range of behaviours related to health, including attending clinic appointments, taking medication regularly and persisting with treatment [ 2 , 3 , 4 , 5 ].

Mobile communications technology has the potential to support behaviour change and treatment adherence in real time by facilitating remote, interactive, timely access to relevant information, providing context-specific support and prompts to action [ 6 ].

Systematic reviews of health behaviour change interventions delivered by mobile phones or devices (m-health) have shown small beneficial effects for some conditions in some settings but results are not consistent [ 7 , 8 ]. Some though not all trials have shown modest effects on treatment adherence and disease outcomes for m-health interventions among adults living with HIV [ 9 , 10 ]. Similar results have been found in trials of m-health interventions to support behaviour change for people with high blood pressure, diabetes, and heart disease [ 11 , 12 ].

Behavioural interventions, including those delivered using m-health technologies are often not systematically developed, specified, or reported [ 13 ]. The potential to accumulate evidence of effectiveness and to identify the “active components” in successful m-health interventions depends in part on replication of successful interventions across settings and in part on refining interventions (adding or subtracting elements) using evidence of behaviour change [ 14 ]. Adequate descriptions of the theory of the intervention and specific intervention components are needed to extend the evidence base in the field and to facilitate evidence synthesis [ 15 ].

Several frameworks are now available to guide intervention development but there is limited evidence of their application to the development of m-health interventions, particularly in resource-constrained settings [ 16 , 17 ]. The Medical Research Council (MRC) Framework for the development of complex interventions (initially published in 2000 and updated in 2008) has been used successfully across disciplines, which suggests that its flexible, non-linear approach may be usefully applied to the iterative design processes used in the development of new technology-based systems [ 15 , 18 , 19 , 20 ].

The aim of this paper is to describe our experience of using the 2008 MRC framework to develop and test a theory-based behaviour change intervention, delivered by mobile phone text message, to support adherence to treatment for high blood pressure; to reflect on the benefits and challenges of applying this framework in a resource-constrained setting; and to describe the final intervention in line with reporting guidelines for a structured and systematic description [ 13 ].

We used a non-sequential, flexible approach guided by the 2008 MRC Framework for the development and evaluation of complex interventions (see Fig.  1 ) [ 18 ]. Table  1 shows the stages of the 2008 MRC framework alongside the activities we undertook in the development process. Implicit in this development process is the identification of contextual factors that can affect outcomes [ 21 ].

Process of intervention development adapted from Smith et al. [ 20 ]

Developing a complex intervention

Identifying the evidence base

We searched PubMed, Cochrane reviews, and Google for systematic reviews and published original studies written in English from 2000 onwards. Search terms included “mobile health”, “text message”, “adherence”, “high blood pressure”, and “hypertension”. We revisited the literature and narrowed the focus of our reviews as development of the intervention progressed. We set up automatic alerts to monitor the relevant literature for updates.

There is some evidence that clinical outcomes for the treatment of chronic conditions can be modestly improved by targeting adherence behaviour [ 4 , 22 ], with a number of trials in hypertension [ 5 , 23 , 24 , 25 ]. The most effective strategies for improving adherence were complex, including combinations of more instructions and health education for patients, disease- and treatment-specific adherence counselling, automated and in-person telephone follow-up, and reminders (for pills, appointments, and prescription refills) [ 22 ]. In addition, some strategies, such as case management and pharmacy-based education, can be costly. These approaches may not be practical in a low-resource setting.

Some studies report that mobile phone messaging interventions may provide benefit in supporting the self-management of long-term illnesses [ 7 , 26 ] and have the potential to support lifestyle change, including smoking cessation [ 7 , 27 ]. However, randomised trials of the effectiveness of mobile phone messaging in the management of hypertension are few, include additional components (such as telemonitoring), often focus on high-risk groups such as stroke survivors and renal transplant recipients, and are based in high-resource settings [ 23 , 28 , 29 , 30 , 31 ].

Identifying appropriate theory

We set up an expert multi-disciplinary group comprising two specialist general practitioners, two specialist physicians, three biomedical engineers, a health systems researcher, and an epidemiologist. As a team we met formally to agree upon the research problem and the underlying principles guiding the intervention development process. Thereafter we worked in smaller groups to develop the intervention. We maintained written logs of the iterative steps of the intervention development and remained in regular contact with the full group via email and teleconference. When the group met formally we reported on technical progress, resource allocation, implementation issues, and new evidence from the literature or the field.

We used semi-structured interviews and focus groups with three stakeholder groups: (1) Patients with high blood pressure and other chronic diseases ( n  = 35), (2) primary care health professionals (general practitioners, professional nurses, staff nurses, pharmacists, allied health professionals, reception staff) ( n  = 12), (3) health care system service providers and subcontractors (provincial health systems managers ( n  = 5), third party providers of off-site pre-packaged repeat prescription services ( n  = 3)).

South Africa is a middle-income country with high levels of income inequality and a quadruple burden of disease (HIV/AIDS, maternal and child health, non-communicable diseases, and violence) [ 32 , 33 ]. Health care is provided for most South Africans (over 80%) by publicly funded, state-run facilities, the foundation of which is primary care. Medical doctors and nurses (some of whom have the right to prescribe medications) staff these facilities and provide diagnostic and monitoring services; treatment, including all medications, is free for patients attending primary care (user fees for primary care were abolished in 1997) [ 34 ]. National guidelines for the treatment of high blood pressure exist and are regularly updated [ 35 ]. There is an essential drugs list, and the medicines for high blood pressure available in primary care include thiazide and other diuretics, calcium channel blockers, ACE inhibitors, and beta-blockers. Patients may be prescribed other antihypertensive agents, such as angiotensin receptor blockers (ARBs), by specialists. Statins and aspirin are also available [ 36 ].

With the stakeholders described above we explored the problem of high blood pressure and poor control in busy, resource-constrained, publicly funded primary care facilities. A range of problems were identified as barriers to providing optimal care and as potential targets for intervention. These included the organisation of care (failure of systems for referral between primary and secondary care and for medication access), service provision (failure of clinicians to adhere to management guidelines), and patient-level factors such as sub-optimal self-management and treatment adherence. From discussion with the various stakeholder groups it emerged that patient-level factors resulting in failure to attend clinic appointments and to collect and take medication regularly were both a major concern and a feasible, acceptable target for an intervention to improve blood pressure control. The underlying hypothesis was that facilitating communication between patients and the health care system might lead to changes in treatment adherence behaviour and improve health outcomes.

Use of mobile devices for intervention delivery

We framed mobile phones as contextual tools that could deliver support messages when and where needed, i.e. at times and places outside of a health care visit (an ecological momentary intervention) [ 6 ]. We focused on using widely available existing communication protocols (for example, short message service or SMS text messages) that are backward-compatible (even the most basic device can send and receive text messages), and on adapting participants’ existing technical skills to health-specific behaviours rather than requiring the acquisition of new technical skills (for example, by giving participants smartphones and asking them to use an app-based intervention).

Behaviour change theory

We explored a range of social cognition models and selected the I-CHANGE model, which integrates multiple elements (awareness, motivation, and action) that have been shown to be important in adherence and persistence with chronic medications [ 22 , 37 ].

Behaviour change techniques

We used behaviour change theory to identify areas of belief or behaviour that might contribute to problems in collecting or taking medication. We then developed and refined the message content and mapped the messages to a common taxonomy of evidence-based behaviour change techniques [ 38 ].

Modelling (phase I)

We tested assumptions about the clarity, perceived usefulness and importance of individual text messages with stakeholder groups. To decide on the most appropriate tone and style of content delivery we tested individual SMS text messages using three different communication styles (directive, narrative vignette, or request). Stakeholders were asked when and how frequently adherence support messages should be sent. Messages that were unclear or ambiguous were modified; messages that were perceived by both patients and providers as not being useful or important were discarded. Patients’ thoughts and comments were also used to generate new content for new messages which were then again mapped to the taxonomy of behaviour change techniques and added to the message library. In addition, we engaged with the two local Community Advisory Boards (made up of community members and clinic patients who act as elected liaisons between the health facility and the community) who provided additional guidance and feedback on the intervention components and other study materials.

Using evidence from the literature alongside the findings from the semi-structured interviews with clinic and pharmacy staff at four representative primary care facilities in Cape Town, the group agreed that, in order to change adherence-related behaviours, the intervention would need to:

Remind patients about up-coming scheduled clinic appointments

Provide relevant health-related information

Help participants plan and organise various treatment adherence behaviours including medication collection and taking, diet, and exercise

Support positive adherence-related behaviours

Help navigate the health care system (e.g. what to do if the patient ran out of medications)

Table  2 gives examples of the SMS texts that were developed, mapped to the taxonomy of behaviour change techniques (along with definitions and message timing) [ 38 ].
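
Although Table 2 itself is not reproduced here, the mapping it describes can be pictured as a small structured message library. The following Python sketch is an illustration only: the field names, message identifiers, timings, and the pairing of specific BCT labels with specific messages are assumptions for demonstration, not the actual StAR message library (the first example message text is quoted from the user testing described below; the second is invented).

```python
# Illustrative sketch only: field names, identifiers, timings, and BCT labels are
# assumptions for demonstration, not the actual StAR message library.
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class AdherenceMessage:
    message_id: str
    text: Dict[str, str]        # message text per language code, e.g. "en", "af", "xh"
    bct_labels: List[str]       # behaviour change techniques from the BCT taxonomy v1 [38]
    timing: str                 # e.g. "weekly" or "48h_before_appointment"
    style: str = "polite_request"


MESSAGE_LIBRARY = [
    AdherenceMessage(
        message_id="ADH-001",
        text={"en": "Please keep taking all your medicine even if you feel well."},
        bct_labels=["Information about health consequences", "Prompts/cues"],
        timing="weekly",
    ),
    AdherenceMessage(
        message_id="REM-001",
        text={"en": "Pls remember your clinic visit on {date}."},  # invented example reminder
        bct_labels=["Prompts/cues", "Action planning"],
        timing="48h_before_appointment",
    ),
]


def messages_using(bct_label: str, library=MESSAGE_LIBRARY) -> List[AdherenceMessage]:
    """Return all messages in the library tagged with a given behaviour change technique."""
    return [m for m in library if bct_label in m.bct_labels]
```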

Patients and providers thought that all people with high blood pressure could benefit from the intervention. Stakeholder groups disliked the idea of targeting the intervention at particular patient groups, for example those with poor blood pressure control or those who attend the clinic only infrequently. Providers and health system managers cited concerns over the logistics of identifying such groups, while patients raised concerns about perceptions of favouritism unrelated to illness severity. Patient groups expressed the opinion that “everyone with high blood pressure” should be offered the intervention and that people who did not want it should be allowed to opt out.

Health care providers, particularly front-line staff, preferred individual texts presented in a directive style, for example, “You must take your medicine even if you feel well”. Reasons for this included the need to convey to patients the importance of the information being presented. In contrast, all of the patient groups strongly preferred messages styled as polite requests, for example, “Please keep taking all your medicine even if you feel well”. Both groups were ambivalent about the use of narrative vignettes (for example, “Busi in Langa: I bring my empties to clinic, then they can see I eat my pills right”). Contextual aspects of the messages were also important (information specific to the clinic), as was the perceived authority of the message: messages signed off by a named provider were valued more highly by participants, who felt they would be more likely to respond to such a message. Participants also reported that this was more important than using their name at the start of a message. Providers reported that it would be acceptable for senior staff at the facility to be named (i.e. to sign off) in an SMS text as long as the messages were in line with Department of Health guidelines.

Individual SMS text messages are typically limited to 160 characters including spaces. We found that using short, simple words was more acceptable to stakeholders than “textese” (a form of text-based slang using non-standard spelling and grammar). We minimised the use of contractions, using only “pls” for “please”, “thnks” for “thank you”, and an abbreviation for the clinic name. All stakeholder groups reported on the value of having messages available in a variety of local languages, though they acknowledged that most people text in English (in part because of ambiguities in meaning that can arise from informal word shortening).
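
Because translated text can run longer than its English source, a simple automated length check across language variants can catch over-long messages before user testing. The sketch below builds on the illustrative message library above and assumes plain character counting; in practice, characters outside the basic GSM alphabet reduce the effective limit, so this is a simplification rather than the check used in the trial.

```python
SMS_CHAR_LIMIT = 160  # simplified: assumes every character counts once (basic GSM set)


def overlong_variants(message) -> list:
    """Return (language, length) pairs for any translations exceeding the SMS limit."""
    return [(lang, len(text))
            for lang, text in message.text.items()
            if len(text) > SMS_CHAR_LIMIT]


# Example: flag any message in the illustrative library with an over-long translation.
for msg in MESSAGE_LIBRARY:
    for lang, length in overlong_variants(msg):
        print(f"{msg.message_id} [{lang}]: {length} characters (limit {SMS_CHAR_LIMIT})")
```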

Participants valued the idea of being able to choose the time at which a message was sent so that it would not interfere with other commitments, e.g. work or religious activities. All stakeholders valued the idea of a follow-up text message in the event that a participant missed a scheduled appointment. On the basis of these discussions we decided to send follow-up messages to all participants, thanking those who had attended on time and encouraging those who had not attended to rebook their appointment. Providers and participants also raised concerns about the length of time between reminder messages and appointment dates, so that people could adjust their schedules or contact the clinic to change their appointment. As a result, the messages were sent 48 h before and after a scheduled appointment.
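
The 48-hour rule can be expressed as a small scheduling calculation. The sketch below is a minimal illustration in Python, assuming the appointment date and time are already known; it is not the trial's actual message delivery system.

```python
from datetime import datetime, timedelta

REMINDER_OFFSET = timedelta(hours=48)


def appointment_message_times(appointment: datetime):
    """Return send times for the pre-appointment reminder and the post-appointment
    follow-up (thank-you or rebooking prompt), 48 h either side of the visit."""
    return appointment - REMINDER_OFFSET, appointment + REMINDER_OFFSET


# Example: a clinic visit at 09:00 on 3 March 2014
reminder_at, follow_up_at = appointment_message_times(datetime(2014, 3, 3, 9, 0))
```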

Concerns about the potential costs of the intervention to the user were raised by all stakeholder groups. Specific concerns were raised about how to deliver an interactive intervention at little or no cost to the end-user. Solutions used in other settings, such as providing small amounts of credit to end-users to engage with an interactive system, were rejected by health systems managers and sub-contractors because of concerns that the intervention would be too costly to deliver sustainably at scale. Because of the telecommunications market in South Africa at the time, it was not possible to use free-to-user short codes for interactive SMS text messages.

Final interventions

The final interventions consisted of an adherence support intervention delivered by a weekly information-only (unidirectional) or interactive (bidirectional) SMS text message sent at a time and in a language of the user’s choice. Messages were endorsed (signed off by a named provider) and contained content that was credible to both providers and patients and addressed a broad range of barriers to treatment adherence common in the local context. Reminder prompt text messages were sent 48 h before a scheduled clinic appointment (for a follow-up visit or to collect medication), with a follow-up message 48 h after the appointment either to thank participants for attending or to encourage them to rebook in the event of a missed appointment. To make the system interactive, we used free-to-user “Please Call Me” messages (also known as Call Me Back codes, a service available on all local networks that allows a user to prompt someone else to call them) or missed calls, which triggered automated responses enabling users to cancel or change their appointment and to change the time and language of the SMS text messages.
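
One way to picture the interactive (bidirectional) arm is as a dispatcher that maps an incoming free-to-user signal onto an automated action and reply. The sketch below is an assumption-laden illustration of that idea only: the request types, handler logic, and reply wording are invented for clarity, and the system actually built for the trial is described in the published protocol [ 48 ].

```python
# Illustrative only: request types and reply wording are assumptions, not the
# trial's actual interactive system (see the published protocol [48]).
SUPPORTED_REQUESTS = {
    "cancel_appointment",
    "change_appointment",
    "change_message_time",
    "change_message_language",
}

AUTOMATED_REPLIES = {
    "cancel_appointment": "Your appointment has been cancelled. Pls call the clinic to rebook.",
    "change_appointment": "Thank you. The clinic will SMS you a new appointment date.",
    "change_message_time": "Thank you. We will update the time your messages are sent.",
    "change_message_language": "Thank you. We will update the language of your messages.",
}


def handle_incoming(participant_id: str, request_type: str) -> str:
    """Map a free-to-user signal (Please Call Me or missed call) to an automated SMS reply."""
    if request_type not in SUPPORTED_REQUESTS:
        return "Sorry, we did not understand your request. Pls call the clinic."
    return AUTOMATED_REPLIES[request_type]
```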

By designing an intervention that was perceived in user testing to be sent frequently enough to keep users engaged (but not so often as to annoy them), that contained content that was useful and could be trusted, and that was phrased in polite and respectful language, we felt the intervention would increase awareness and support motivation and action to improve adherence to treatment for high blood pressure.

Modelling process and outcomes

We used a causal model to link theoretically relevant behavioural determinants to specific adherence-related behaviours. We linked these to health impacts and outcomes along a hypothesised causal pathway [ 7 , 8 ]. We used validated measures to assess important variables along the causal pathway (see Fig.  2 ).

Hypothesised causal pathways and measures for evaluation for SMS text Adherence suppoRt (StAR) trial

Assessing feasibility and piloting methods

Availability and use of mobile phones among adults with chronic diseases attending primary care services in South Africa

To test assumptions about access to and use of mobile phones we conducted a cross-sectional survey among adult patients attending any one of five community health centres in the Western Cape Metro Health Districts for treatment of hypertension and other chronic diseases. These outpatient facilities provide comprehensive primary care services for people living in the surrounding areas.

At primary health care level, the service is based on prevention by educating people about the benefits of a healthy lifestyle. Every clinic has a staff member who has the skills to diagnose and manage chronic conditions, from young to elderly patients. Patients can see the same nurse for repeat visits if they come regularly on the hypertension, diabetes or asthma clinic day. Counselling, compliance, and health education are also part of usual care. The service is led by clinical nurse practitioners and supported by doctors.

Arrangements are made by the clinic to minimise patient travel (especially by the elderly) by prescribing supplies of drugs that last one to three months. Staff often facilitate the initiation of clubs and special support groups for people with chronic diseases. In this way, a patient can get more information on special care and health education pertaining to their condition.

These primary-level services are supported and strengthened by other levels of care, including acute and specialised referral hospitals. If complications arise, patients will be referred to the next level of care [ 39 , 40 ].

The interviewer-administered questionnaire asked about socio-demographic factors, contact with the clinic, chronic diseases and treatment burdens and about access to and the use of mobile phones. We sampled consecutive consenting adults from outpatient services over a period of 6 weeks. A total of 127 willing and eligible adult patients completed the survey (see Table  3 ). Mean age (SD) was 53.3 (14) years, 73% were women, and two-thirds had at least some high-school education. Ninety percent of participants reported having regular (daily) access to a mobile phone and 80% reported that their phone was with them most or all of the time. Most participants did not share their phones (76%); women reported sharing more frequently than men (24% versus 13%). Most participants (76%) reported having registered their phone numbers in their name and that they had had the same mobile phone number for two or more years (63%).

Seventy percent of participants reported feeling very confident about using their phone to receive SMS text messages, while fewer (55%) were as confident about sending SMS text messages. Most (70%) felt very confident about sending a “Please Call Me”, a free service provided by South African telecommunications providers across all local networks. Fewer than 10% of participants reported knowing how to use unstructured supplementary service data (USSD) communication protocol services such as mobile phone banking.

When asked about preferred ways for the clinic to be in touch, more women (74%) than men (47%) preferred SMS text messages to phone calls or other methods such as home visits. The majority of participants reported that they would find reminders to attend upcoming clinic appointments (92%), collect medications (94%), and take medications (87%) helpful.

Testing procedures

To optimise the intervention and test the technical systems responsible for message delivery we service tested the full intervention package with 19 patients recruited from patient-stakeholder focus groups. We tested the messages in the three languages most commonly used in Cape Town (English, Afrikaans, isiXhosa). Participants were contacted on a weekly basis by a researcher (experienced in qualitative methods) for a semi-structured interview on their experience of the intervention and the SMS text message delivery system. Suggestions were discussed with the intervention development team and changes were made where necessary.

Estimating recruitment and retention

In consultation with the local department of health we identified primary care health centres with high patient loads that might be suitable for a clinical trial to test the intervention. We visited each site to confirm the numbers of patients with high blood pressure using clinic registers, to map out patient flow through the clinic so that we could operationalise trial procedures, and to identify potential challenges and barriers to implementation. One of the requirements for approval from the local department of health for research in public facilities is that normal activities are not interrupted. We therefore selected a health centre with a high caseload of patients with “chronic diseases of lifestyle” where we could recruit, screen, and enrol trial participants without interrupting the usual flow of patients through the clinic services.

We estimated we would be able to screen and recruit between 45 and 120 participants per week based on the functioning of the clinic and the experience of other local researchers [ 41 ]. We tested our capacity to recruit and screen participants using a clinical service offering blood pressure measurement to all patients attending the clinical service prior to the start of the trial. We tested trial registration and enrolment procedures including the receipt of an initial SMS text message at the time of enrolment. We collected detailed contact information on participants as well as the details of two next of kin (or similar) to maximise our chances of remaining in contact with participants for the duration of the trial. We monitored recruitment and retention on an ongoing basis.
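
These weekly rates, taken together with the eventual recruitment target described under “Determining sample size” below, imply a recruitment window that can be projected with simple arithmetic. The sketch below is illustrative only; the actual estimate also drew on clinic functioning and the experience of other local researchers, as noted above.

```python
from math import ceil


def recruitment_weeks(target: int, weekly_rate: int) -> int:
    """Whole weeks of recruitment needed to reach a target at a given weekly rate."""
    return ceil(target / weekly_rate)


# With the eventual target of 1215 participants, 45 per week implies about
# 27 weeks of recruitment and 120 per week about 11 weeks.
print(recruitment_weeks(1215, 45), recruitment_weeks(1215, 120))
```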

Determining sample size

Adequately powered trials of the effects of m-health interventions on important clinical outcomes are required to develop the evidence base for these approaches [ 7 , 8 ]. As blood pressure is strongly and directly related to mortality, we selected change in mean systolic blood pressure at 12 months from baseline as our primary (clinical) outcome. We selected medication adherence (behavioural) as a secondary outcome and decided not to report these as co-primary outcomes. As there were no previous trials of the effect of adherence support delivered by SMS text on blood pressure measures, we used data from published trials of behavioural interventions delivered by other methods to estimate the sample size [ 42 ]. A decrease in systolic blood pressure of 5 mmHg is associated with a clinically important reduction in the relative risk of stroke and coronary heart disease events [ 43 ]. Based on a study population similar to that expected for the trial, we used the standard deviation (SD) of systolic blood pressure (22.0 mmHg) to calculate the required sample size [ 44 ]. We proposed a target sample size of 1215 participants, allowing for 20% loss to follow-up (at least 405 in each group), to detect an absolute mean difference in SBP of 5 mmHg (SD 22) at 12 months from baseline with 90% power and a 0.05 (two-sided) level of significance. We used an intention-to-treat (ITT) approach for all analyses.
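
The per-group figure follows from the standard normal-approximation formula for comparing two means. The minimal Python sketch below reproduces that calculation with the stated parameters (5 mmHg difference, SD 22 mmHg, 90% power, two-sided alpha of 0.05); it yields roughly 407 participants per group, in the same range as the published target of at least 405 per group, with small differences reflecting rounding and software defaults. The total of 1215 across the three trial arms additionally allows for the anticipated loss to follow-up, as described above.

```python
from math import ceil
from statistics import NormalDist


def n_per_group(delta: float, sd: float, alpha: float = 0.05, power: float = 0.90) -> int:
    """Per-group sample size for a two-sample comparison of means (normal approximation)."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)   # about 1.96 for a two-sided 5% test
    z_beta = z.inv_cdf(power)            # about 1.28 for 90% power
    return ceil(2 * (z_alpha + z_beta) ** 2 * (sd / delta) ** 2)


# 5 mmHg difference in systolic blood pressure, SD 22 mmHg
print(n_per_group(delta=5, sd=22))  # about 407 per group before other adjustments
```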

Evaluating a complex intervention

Assessing effectiveness

We decided that the most appropriate design to evaluate the effectiveness of adherence support via SMS text message would be a large, single-blind (concealed outcome assessment), individually randomised controlled trial. As the effect on clinical outcomes of an informational versus an interactive system of SMS text messages was unclear from the published literature, we decided to include two interventions: information-only SMS texts and interactive SMS texts [ 8 , 45 ]. To assess the effects of the behavioural intervention beyond the mere receipt of an SMS text message from the clinic, we decided the control group would receive simple, infrequent text messages (fewer than one per month) related to the importance of ongoing trial participation [ 46 ]. Details of the intervention can be found in the TIDieR checklist (Table 4 ).
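
As a concrete illustration of the resulting three-arm design (information-only SMS, interactive SMS, and control), the sketch below shows 1:1:1 allocation using randomly permuted blocks. The block size and allocation method shown are assumptions for demonstration only; the trial's actual randomisation procedure is specified in the published protocol [ 48 ].

```python
import random

ARMS = ["information_only_sms", "interactive_sms", "control"]


def permuted_block_allocation(n_participants: int, block_size: int = 6, seed: int = None):
    """Allocate participants 1:1:1 across the three arms using randomly permuted blocks.

    Illustrative only: block size and method are assumptions, not the StAR trial's
    actual procedure (see the published protocol [48]).
    """
    assert block_size % len(ARMS) == 0, "block size must be a multiple of the number of arms"
    rng = random.Random(seed)
    allocations = []
    while len(allocations) < n_participants:
        block = ARMS * (block_size // len(ARMS))
        rng.shuffle(block)
        allocations.extend(block)
    return allocations[:n_participants]


# Example: allocation list for the first 12 enrolled participants
print(permuted_block_allocation(12, seed=1))
```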

The trial is registered with the South African National Clinical Trials Register (SANCTR DOH-27-1212-386; 28/12/2012), the Pan African Trial Register (PACTR201411000724141; 14/12/2013), and ClinicalTrials.gov (NCT02019823; 24/12/2013).

Understanding change processes

Implementation fidelity assessment

SMS text messages were sent using an automated system independent of trial and clinical staff. Participants were informed that not everyone would be receiving exactly the same messages and were asked not to share the SMS text messages with others. Intervention fidelity was ensured by confirming receipt of at least an initial “Welcome” SMS text message for all enrolled trial participants prior to randomisation. Message delivery reports were monitored throughout the trial to check that the intervention was being delivered as planned. In addition, we set up a system of sentinel phones (using the five most common entry-level handsets in South Africa) registered and allocated to receive messages in the same way as trial participants.
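
Monitoring the delivery reports lends itself to a simple automated check. The sketch below is a minimal illustration, assuming each report records a message identifier, participant identifier, and delivery status; the field names and the alert threshold are assumptions rather than values used in the trial.

```python
from collections import Counter

DELIVERY_ALERT_THRESHOLD = 0.95  # assumed threshold for flagging a delivery shortfall


def delivery_summary(reports):
    """Summarise SMS delivery reports and flag a shortfall in delivered messages.

    `reports` is an iterable of dicts such as
    {"message_id": "ADH-001", "participant_id": "P0001", "status": "delivered"}.
    """
    counts = Counter(r["status"] for r in reports)
    total = sum(counts.values())
    delivered = counts.get("delivered", 0)
    rate = delivered / total if total else 0.0
    return {"total": total, "delivered": delivered,
            "delivery_rate": rate, "alert": rate < DELIVERY_ALERT_THRESHOLD}
```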

The trial interventions were delivered separately from the health care workers providing usual clinical care for participants. For each anticipated study visit (enrolment, 6-month follow-up, 12-month follow-up) standardised protocols were used. Structured logs were used to record detailed information for any interactions between trial staff and participants outside of expected study visits.

Contextual factors

In the final stages of the trial we conducted an independent process evaluation to explore the implementation of the intervention, contextual factors, and potential mechanisms of action. We employed a qualitative design using focus groups and in-depth interviews. The findings from the evaluation have been reported separately [ 47 ].

Cost-effectiveness analyses

We collected information on the costs of developing and delivering the intervention. The findings from this analysis will be reported separately.

Implementation and beyond

Dissemination

The potential to accumulate evidence of effectiveness and to identify the “active components” in successful m-health interventions depends in part on replication of successful interventions across settings and refining interventions (adding or subtracting elements) using evidence of behaviour change. To facilitate use and adaptation of our intervention we have used recommendations for reporting intervention development to ensure we have described the intervention and its delivery in sufficient detail [ 13 ]. We have registered the trial and published the trial protocol [ 48 ], and we will publish the trial results (using CONSORT reporting guidelines) in an open access journal. We have also published the findings from the process evaluation [in press]. We have reported findings to participants, health care workers, policy makers, and funders.

Surveillance, monitoring, and long-term follow up

We obtained permission to collect routine health data (dispensing and adherence data) from trial participants for a period of 6 months after the trial ended to explore for persistence of effects of the intervention (if any).

Main findings

Using the MRC Framework was feasible in a low-resource multi-lingual setting. The adoption of the framework enabled us to develop a theory- and evidence-based intervention; to specify a proposed causal pathway to modify adherence behaviour and clinical outcomes; to test and refine the intervention delivery system; to design a randomised evaluation of the intervention; and to test and evaluate proposed study procedures.

What is already known on this topic

Mobile devices are a promising approach for delivering health interventions [ 7 , 8 ]. Replication of study findings is hampered by the lack of adequate description of specific intervention components and their theoretical basis [ 13 , 14 , 15 ]. A number of frameworks have been proposed in the health and technology fields to aid the design of technology-based interventions [ 16 , 49 , 50 , 51 ]. The 2008 MRC Framework has been used to design and evaluate interventions across disciplines, which suggests that its flexible, comprehensive, non-linear, iterative approach may be applicable to the design and evaluation of m-health interventions [ 15 , 17 , 18 , 19 , 20 ].

What this study adds

This paper shows how the framework can be operationalised for an m-health intervention by explicitly mapping the development and testing activities to the stages of the 2008 MRC framework. We have also included detailed descriptions of the various aspects of the intervention and its delivery, reported in line with TIDieR guidelines, which will enable comparison with other m-health interventions and support the development of new interventions. Lastly, we have demonstrated that it is feasible and beneficial to use this approach in a multi-lingual, low-resource setting.

Limitations of this study/framework

Sufficient time and resources need to be available to apply the Framework and to benefit from the iterative development process and from testing of study-related procedures. For example, it took us several months longer than anticipated to complete the intervention development and testing, in part because, in resource-constrained settings such as the public health facilities in South Africa, it can be challenging for frontline service staff to find time to engage in intervention design activities (interviews, discussions, message library review).

Whilst the intervention development work was carried out at several sites, the clinical trial was conducted at a single site. In future, we will engage in both development and testing across sites to tease out factors that are common to, and unique to, specific mHealth interventions.

Lastly, attention also needs to be given to field testing of recruitment and retention strategies as there are many instances where trials of mHealth interventions in similar settings are inconclusive because of poor recruitment and high rates of loss to follow-up [ 52 , 53 ].

The MRC Framework can be successfully applied to develop and evaluate m-health interventions in a multi-lingual, resource-constrained setting. Detailed descriptions of the development process, the intervention, and its delivery may advance the evidence base for m-health interventions, enabling comparison, adaptation, and development of interventions.

Abbreviations

MRC: Medical Research Council

StAR: SMS text Adherence suppoRt trial

Lim SS, Vos T, Flaxman AD, Danaei G, Shibuya K, Adair-Rohani H, et al. A comparative risk assessment of burden of disease and injury attributable to 67 risk factors and risk factor clusters in 21 regions, 1990-2010: a systematic analysis for the global burden of disease study 2010. Lancet. 2012;380(9859):2224–60.

Lewington S, Clarke R, Qizilbash N, Peto R, Collins R, Prospective Studies Collaboration. Age-specific relevance of usual blood pressure to vascular mortality: a meta-analysis of individual data for one million adults in 61 prospective studies. Lancet. 2002;360(9349):1903–13.

Burnier M. Medication adherence and persistence as the cornerstone of effective antihypertensive therapy. Am J Hypertens. 2006;19(11):1190–6.

Viswanathan M, Golin CE, Jones CD, Ashok M, Blalock SJ, Wines RCM, et al. Interventions to improve adherence to self-administered medications for chronic diseases in the United States: a systematic review. Ann Intern Med. 2012;157(11):785–95.

Gwadry-Sridhar FH, Manias E, Lal L, Salas M, Hughes DA, Ratzki-Leewing A, et al. Impact of interventions on medication adherence and blood pressure control in patients with essential hypertension: a systematic review by the ISPOR medication adherence and persistence special interest group. Value Health. 2013;16(5):863–71.

Heron KE, Smyth JM. Ecological momentary interventions: incorporating mobile technology into psychosocial and health behaviour treatments. Br J Health Psychol. 2010;15(Pt 1):1–39.

Free C, Phillips G, Galli L, Watson L, Felix L, Edwards P, et al. The effectiveness of mobile-health technology-based health behaviour change or disease management interventions for health care consumers: a systematic review. PLoS Med. 2013;10(1):e1001362.

Beratarrechea A, Lee AG, Willner JM, Jahangir E, Ciapponi A, Rubinstein A. The impact of mobile health interventions on chronic disease outcomes in developing countries: a systematic review. Telemed J E Health. 2014;20(1):75–82.

Lester RT, Ritvo P, Mills EJ, Kariri A, Karanja S, Chung MH, et al. Effects of a mobile phone short message service on antiretroviral treatment adherence in Kenya (WelTel Kenya1): a randomised trial. Lancet. 2010;376(9755):1838–45.

Mbuagbaw L, Thabane L, Ongolo-Zogo P, Lester RT, Mills EJ, Smieja M, et al. The Cameroon Mobile Phone SMS (CAMPS) Trial: A Randomized Trial of Text Messaging versus Usual Care for Adherence to Antiretroviral Therapy. PLoS ONE. 2012;7(12):e46909. https://doi.org/10.1371/journal.pone.0046909

Yasmin F, Banu B, Zakir SM, Sauerborn R, Ali L, Souares A. Positive influence of short message service and voice call interventions on adherence and health outcomes in case of chronic disease care: a systematic review. BMC Med Inform Decis Mak. 2016;16(1):46.

Adler AJ, Martin N, Mariani J, Tajer CD, Owolabi OO, Free C, et al. Mobile phone text messaging to improve medication adherence in secondary prevention of cardiovascular disease. In: The Cochrane Collaboration, editor. Cochrane database of systematic reviews. Chichester: Wiley; 2017. Available from: http://doi.wiley.com/10.1002/14651858.CD011851.pub2 . Cited 19 Jun 2017.

Hoffmann TC, Glasziou PP, Boutron I, Milne R, Perera R, Moher D, et al. Better reporting of interventions: template for intervention description and replication (TIDieR) checklist and guide. BMJ. 2014;348:g1687.

Michie S, Brown J, Geraghty AWA, Miller S, Yardley L, Gardner B, et al. Development of StopAdvisor. Transl Behav Med. 2012;2(3):263–75.

Lakshman R, Griffin S, Hardeman W, Schiff A, Kinmonth AL, Ong KK. Using the Medical Research Council framework for the development and evaluation of complex interventions in a theory-based infant feeding intervention to prevent childhood obesity: the baby milk intervention and trial. J Obes. 2014;2014:646504.

Nhavoto JA, Grönlund Å, Chaquilla WP. SMSaúde: design, development, and implementation of a remote/mobile patient management system to improve retention in care for HIV/AIDS and tuberculosis patients. JMIR MHealth UHealth. 2015;3(1):e26.

Modi D, Gopalan R, Shah S, Venkatraman S, Desai G, Desai S, et al. Development and formative evaluation of an innovative mHealth intervention for improving coverage of community-based maternal, newborn and child health services in rural areas of India. Glob Health Action. 2015;8:26769.

Craig P, Dieppe P, Macintyre S, Michie S, Nazareth I, Petticrew M. Developing and evaluating complex interventions: the new Medical Research Council guidance. BMJ. 2008;337:a1655. https://doi.org/10.1136/bmj.a1655 .

Paul G, Smith SM, Whitford D, O’Kelly F, O’Dowd T. Development of a complex intervention to test the effectiveness of peer support in type 2 diabetes. BMC Health Serv Res. 2007;7(1):136.

Smith SM, Murchie P, Devereux G, Johnston M, Lee AJ, Macleod U, et al. Developing a complex intervention to reduce time to presentation with symptoms of lung cancer. Br J Gen Pract. 2012;62(602):e605–15.

Moore GF, Audrey S, Barker M, Bond L, Bonell C, Hardeman W, et al. Process evaluation of complex interventions: Medical Research Council guidance. BMJ. 2015;350:h1258.

Nieuwlaat R, Wilczynski N, Navarro T, Hobson N, Jeffery R, Keepanasseril A, et al. Interventions for enhancing medication adherence. Cochrane Database Syst Rev. 2014;11:CD000011.

Márquez Contreras E, de la Figuera von Wichmann M, Gil Guillén V, Ylla-Catalá A, Figueras M, Balaña M, et al. Effectiveness of an intervention to provide information to patients with hypertension as short text messages and reminders sent to their mobile phone (HTA-Alert). Atencion Primaria. 2004;34(8):399–405.

Márquez Contreras E, Vegazo García O, Martel Claros N, Gil Guillén V, de la Figuera von Wichmann M, Casado Martínez JJ, et al. Efficacy of telephone and mail intervention in patient compliance with antihypertensive drugs in hypertension. ETECUM-HTA study. Blood Press. 2005;14(3):151–8.

Morikawa N, Yamasue K, Tochikubo O, Mizushima S. Effect of salt reduction intervention program using an electronic salt sensor and cellular phone on blood pressure among hypertensive workers. Clin Exp Hypertens. 2011;33(4):216–22.

de Jongh T, Gurol-Urganci I, Vodopivec-Jamsek V, Car J, Atun R. Mobile phone messaging for facilitating self-management of long-term illnesses. Cochrane Database Syst Rev. 2012;12:CD007459.

Buhi ER, Trudnak TE, Martinasek MP, Oberne AB, Fuhrmann HJ, McDermott RJ. Mobile phone-based behavioural interventions for health: a systematic review. Health Educ J. 2012; https://doi.org/10.1177/0017896912452071 .

Carrasco MP, Salvador CH, Sagredo PG, Márquez-Montes J, González de Mingo MA, Fragua JA, et al. Impact of patient-general practitioner short-messages-based interaction on the control of hypertension in a follow-up service for low-to-medium risk hypertensive patients: a randomized controlled trial. IEEE Trans Inf Technol Biomed. 2008;12(6):780–91.

Blasco A, Carmona M, Fernández-Lozano I, Salvador CH, Pascual M, Sagredo PG, et al. Evaluation of a telemedicine service for the secondary prevention of coronary artery disease. J Cardiopulm Rehabil Prev. 2012;32(1):25–31.

Logan AG, Irvine MJ, McIsaac WJ, Tisler A, Rossos PG, Easty A, et al. Effect of home blood pressure telemonitoring with self-care support on uncontrolled systolic hypertension in diabetics. Hypertension. 2012;60(1):51–7.

McKinstry B, Hanley J, Wild S, Pagliari C, Paterson M, Lewis S, et al. Telemonitoring based service redesign for the management of uncontrolled hypertension: multicentre randomised controlled trial. BMJ. 2013;346:f3030.

Mayosi BM, Lawn JE, van Niekerk A, Bradshaw D, Abdool Karim SS, Coovadia HM. Health in South Africa: changes and challenges since 2009. Lancet. 2012;380(9858):2029–43.

Ataguba JE-O, Day C, McIntyre D. Explaining the role of the social determinants of health on health inequality in South Africa. Glob Health Action. 2015;8:28665. https://doi.org/10.3402/gha.v8.28865 .

Wilkinson D, Gouws E, Sach M, Karim SS. Effect of removing user fees on attendance for curative and preventive primary health care services in rural South Africa. Bull World Health Organ. 2001;79(7):665–71.

Seedat Y, Rayner B, Veriava Y. South African hypertension practice guideline 2014. Cardiovasc J Afr. 2014;25(6):288–94.

Essential Drugs Programme (EDP) [Internet]. [cited 2017 Jun 21]. Available from: http://www.health.gov.za/index.php/essential-drugs-programme-edp

de Josselin de Jong S, Candel M, Segaar D, Cremers H-P, de Vries H. Efficacy of a web-based computer-tailored smoking prevention intervention for Dutch adolescents: randomized controlled trial. J Med Internet Res. 2014;16(3):e82. https://doi.org/10.2196/jmir.2469 .

Michie S, Richardson M, Johnston M, Abraham C, Francis J, Hardeman W, et al. The behavior change technique taxonomy (v1) of 93 hierarchically clustered techniques: building an international consensus for the reporting of behavior change interventions. Ann Behav Med. 2013;46(1):81–95.

Healthcare 2030: A Future Health Service for the Western Cape [Internet]. Western Cape Government. [cited 2017 Jun 21]. Available from: https://www.westerncape.gov.za/news/healthcare-2030-future-health-service-western-cape

Chronic Care [Internet]. Western Cape Government. [cited 2017 Jun 21]. Available from: https://www.westerncape.gov.za/service/chronic-care

Stewart S, Carrington MJ, Pretorius S, Ogah OS, Blauwet L, Antras-Ferry J, et al. Elevated risk factors but low burden of heart disease in urban African primary care patients: a fundamental role for primary prevention. Int J Cardiol. 2012;158(2):205–10.

Schroeder K, Fahey T, Ebrahim S. Interventions for improving adherence to treatment in patients with high blood pressure in ambulatory settings. Cochrane Database Syst Rev. 2004. http://onlinelibrary.wiley.com/doi/10.1002/14651858.CD004804/abstract .

Collins R, Peto R, MacMahon S, Hebert P, Fiebach NH, Eberlein KA, et al. Blood pressure, stroke, and coronary heart disease. Part 2, short-term reductions in blood pressure: overview of randomised drug trials in their epidemiological context. Lancet. 1990;335(8693):827–38.

Tibazarwa K, Ntyintyane L, Sliwa K, Gerntholtz T, Carrington M, Wilkinson D, et al. A time bomb of cardiovascular risk factors in South Africa: results from the heart of Soweto study ‘heart awareness days’. Int J Cardiol. 2009;132(2):233–9.

Finitsis DJ, Pellowski JA, Johnson BT. Text message intervention designs to promote adherence to antiretroviral therapy (ART): a meta-analysis of randomized controlled trials. PLoS One. 2014;9(2):e88166.

Free C, Knight R, Robertson S, Whittaker R, Edwards P, Zhou W, et al. Smoking cessation support delivered via mobile phone text messaging (txt2stop): a single-blind, randomised trial. Lancet. 2011;378(9785):49–55.

Leon N, Surender R, Bobrow K, Muller J, Farmer A. Improving treatment adherence for blood pressure lowering via mobile phone SMS-messages in South Africa: a qualitative evaluation of the SMS-text Adherence SuppoRt (StAR) trial. BMC Fam Pract. 2015;16(1):80.

Bobrow K, Brennan T, Springer D, Levitt NS, Rayner B, Namane M, et al. Efficacy of a text messaging (SMS) based intervention for adults with hypertension: protocol for the StAR (SMS Text-message Adherence suppoRt trial) randomised controlled trial. BMC Public Health. 2014;14(1):28.

Glasgow RE, Vogt TM, Boles SM. Evaluating the public health impact of health promotion interventions: the RE-AIM framework. Am J Public Health. 1999;89(9):1322–7.

Spoth R, Rohrbach LA, Greenberg M, Leaf P, Brown CH, Fagan A, et al. Addressing Core challenges for the next generation of type 2 translation research and systems: the translation science to population impact (TSci impact) framework. Prev Sci. 2013;14(4):319–51.

Crosby R, Noar SM. What is a planning model? An introduction to PRECEDE-PROCEED. J Public Health Dent. 2011;71(Suppl 1):S7–15.

Rubinstein A, Miranda JJ, Beratarrechea A, Diez-Canseco F, Kanter R, Gutierrez L, et al. Effectiveness of an mHealth intervention to improve the cardiometabolic profile of people with prehypertension in low-resource urban settings in Latin America: a randomised controlled trial. Lancet Diabetes Endocrinol. 2016;4(1):52–63.

Lau YK, Cassidy T, Hacking D, Brittain K, Haricharan HJ, Heap M. Antenatal health promotion via short message service at a midwife obstetrics unit in South Africa: a mixed methods study. BMC Pregnancy Childbirth. 2014;14:284.

Acknowledgements

We are grateful to the patients, health care workers, pharmacists, and clinic administrative staff for their assistance. We are grateful to the Department of Health of the Western Cape for their support and access to facilities. We are grateful to Professor Krisela Steyn and Professor Brian Rayner for their insight and support. We are also grateful for the administrative support of the Chronic Diseases Initiative for Africa secretariat. We are especially grateful to Sr Carmen Delport and Ms Liezel Fisher.

This research project is supported by the Wellcome Trust and the Engineering and Physical Sciences Research Council. AF is a Senior NIHR Investigator, and AF and LT are supported by funding from the NIHR Oxford Biomedical Research Centre. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Availability of data and materials

The materials used (the Medical Research Council (MRC) Framework on complex interventions) are publicly available. A TIDieR checklist describing the final intervention is included in the supplementary materials. Additional anonymised data from the focus groups and cross-sectional survey are available on request.

Author information

Authors and affiliations.

Chronic Disease Initiative for Africa, Cape Town, South Africa

Kirsten Bobrow & Naomi Levitt

Division of Diabetic Medicine and Endocrinology, Department of Medicine, University of Cape Town, Cape Town, South Africa

Primary Care Clinical Trials Unit, Nuffield Department of Primary Care Health Sciences, University of Oxford, Oxford, UK

Andrew Farmer

Women’s Health Research Unit, School of Public Health & Family Medicine, University of Cape Town, Cape Town, South Africa

Nomazizi Cishe & Ntobeko Nwagi

Western Cape Province Department of Health, Cape Town, South Africa

Mosedi Namane

Institute of Biomedical Engineering, Department of Engineering Science, University of Oxford, Oxford, UK

Thomas P. Brennan, David Springer & Lionel Tarassenko

Nuffield Department of Primary Care Health Sciences, Radcliffe Observatory Quarter, University of Oxford, Oxford, OX2 6GG, UK

Kirsten Bobrow

Contributions

KB, AF, TB, NL, BR, MN, LT conceived the study. KB and AF designed and coordinated the study and wrote the protocol. The protocol was refined with contributions from TB, DS, NL, BR, KS, MN, LT, who also contributed to study coordination. NN, NC contributed to the data collection and coding, analysis and edited the manuscript. TB, DS and LT developed and implemented the technical system for sending the SMS text-messages and contributed to the manuscript. LT is the grant holder for the program that supported this work. KB and AF (as joint first authors and equal contributors) wrote the first draft of the manuscript, which was critically revised for important intellectual content by all authors. All authors read and approved the final manuscript. KB and AF are the guarantors of the manuscript, and affirm that the manuscript is an honest, accurate, and transparent account of the research being reported; and that no important aspects of the study have been omitted.

Corresponding author

Correspondence to Kirsten Bobrow .

Ethics declarations

Ethics approval and consent to participate.

The study was approved by the Human Research Ethics Committee of the University of Cape Town (HREC UCT 418/211, 017/2014), the Oxford Tropical Research Ethics Committee (OXTREC 03–12, 13–14), and the Metro District Health Services, Western Cape (RP 141/2011). Trial conduct was overseen by a trial steering committee. All participants provided written informed consent. All the requirements of the Helsinki Declaration of 2008 were fulfilled.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License ( http://creativecommons.org/licenses/by/4.0/ ), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated.

About this article

Cite this article.

Bobrow, K., Farmer, A., Cishe, N. et al. Using the Medical Research Council framework for development and evaluation of complex interventions in a low resource setting to develop a theory-based treatment support intervention delivered via SMS text message to improve blood pressure control. BMC Health Serv Res 18 , 33 (2018). https://doi.org/10.1186/s12913-017-2808-9

Received : 21 September 2015

Accepted : 15 December 2017

Published : 23 January 2018

DOI : https://doi.org/10.1186/s12913-017-2808-9

  • Intervention development
  • MRC framework
  • Health care
  • Self-management
  • Behaviour modification

UK Medical Research Council intervention development framework

This study has updated the MRC's framework in the light of developments in complex intervention research since 2006, adopting a pluralist approach and encouraging the consideration and use of diverse research perspectives.

Kathryn Skivington 1, * , Lynsay Matthews 1 , Sharon Anne Simpson 1 , Peter Craig 1 , Janis Baird 2 , Jane M Blazeby 3 , Kathleen Anne Boyd 4 , Neil Craig 5 , David P French 6 , Emma McIntosh 4 , Mark Petticrew 7 , Jo Rycroft-Malone 8 , Martin White 9 , Laurence Moore 1, *

1 Medical Research Council/Chief Scientist Office Social and Public Health Sciences Unit, Institute of Health and Wellbeing, University of Glasgow, Glasgow, UK 2 Medical Research Council Lifecourse Epidemiology Unit, University of Southampton, Southampton, UK 3 Medical Research Council ConDuCT-II Hub for Trials Methodology Research and Bristol Biomedical Research Centre, University of Bristol, Bristol, UK 4 Health Economics and Health Technology Assessment Unit, Institute of Health and Wellbeing, University of Glasgow, Glasgow, UK 5 Public Health Scotland, Glasgow, UK 6 Manchester Centre for Health Psychology, University of Manchester, Manchester, UK 7 London School of Hygiene and Tropical Medicine, London, UK 8 Faculty of Health and Medicine, Lancaster University, Lancaster, UK 9 Medical Research Council Epidemiology Unit, University of Cambridge, Cambridge, UK * Corresponding author Emails: [email protected] and [email protected]

Process evaluation of complex interventions: Medical Research Council guidance

  • Graham F Moore , research fellow 1 ,
  • Suzanne Audrey , research fellow 2 ,
  • Mary Barker , associate professor of psychology 3 ,
  • Lyndal Bond , principal research officer 4 ,
  • Chris Bonell , professor of sociology and social policy 5 ,
  • Wendy Hardeman , senior research associate in behavioural science 6 ,
  • Laurence Moore , director 7 ,
  • Alicia O’Cathain , professor of health services research 8 ,
  • Tannaze Tinati , research fellow 3 ,
  • Daniel Wight , children, young people, families and health programme leader 7 ,
  • Janis Baird , associate professor of public health 3
  • 1 DECIPHer UKCRC Public Health Research Centre of Excellence, School of Social Sciences, Cardiff University, Cardiff, UK
  • 2 DECIPHer UKCRC Public Health Research Centre of Excellence, School of Social and Community Medicine, University of Bristol, Bristol, UK
  • 3 MRC Lifecourse Epidemiology Unit, University of Southampton, Southampton, UK
  • 4 Centre of Excellence in Intervention and Prevention Science, Melbourne, VIC Australia
  • 5 Department of Childhood, Families and Health, Institute of Education, University of London, London, UK
  • 6 Primary Care Unit, Department of Public Health and Primary Care, University of Cambridge, Cambridge, UK
  • 7 MRC/CSO Social and Public Health Sciences Unit, University of Glasgow, Glasgow, UK
  • 8 School of Health and Related Research, University of Sheffield, Sheffield, UK
  • Correspondence to: G F Moore MooreG{at}cardiff.ac.uk
  • Accepted 13 January 2015

Process evaluation is an essential part of designing and testing complex interventions. New MRC guidance provides a framework for conducting and reporting process evaluation studies

Attempts to tackle problems such as smoking and obesity increasingly use complex interventions. These are commonly defined as interventions that comprise multiple interacting components, although additional dimensions of complexity include the difficulty of their implementation and the number of organisational levels they target. 1 Randomised controlled trials are regarded as the gold standard for establishing the effectiveness of interventions, when randomisation is feasible. However, effect sizes do not provide policy makers with information on how an intervention might be replicated in their specific context, or whether trial outcomes will be reproduced. Earlier MRC guidance for evaluating complex interventions focused on randomised trials, making no mention of process evaluation. 2 Updated guidance recognised the value of process evaluation within trials, stating that it “can be used to assess fidelity and quality of implementation, clarify causal mechanisms and identify contextual factors associated with variation in outcomes.” 3 However, it did not provide guidance for carrying out process evaluation.

Summary points

MRC guidance for developing and evaluating complex interventions recognised the importance of process evaluation within trials but did not provide guidance for its conduct

This article presents a framework for process evaluation, building on the three themes for process evaluation described in 2008 MRC guidance (implementation, mechanisms, and context)

It argues for a systematic approach to designing and conducting process evaluations, drawing on clear descriptions of intervention theory and identification of key process questions

While each process evaluation will be different, the guidance facilitates planning and conducting a process evaluation

Developing guidance for process evaluation

In 2010, a workshop funded by the MRC Population Health Science Research Network discussed the need for guidance on process evaluation. 4 There was consensus that researchers, funders, and reviewers would benefit from guidance. A group of researchers with experience and expertise in evaluating complex interventions was assembled to produce the guidance. In line with the principles followed in developing earlier MRC guidance documents, draft guidance was produced drawing on literature reviews, process evaluation case studies, workshops, and discussions at conferences and seminars. It was then circulated to academic, policy, and practice stakeholders for comment. Around 30 stakeholders provided written comments on the draft structure, while others commented during conference workshops run throughout the development process. A full draft was recirculated for further review, before being revised and approved by key MRC funding panels.

Although the aim was to provide guidance on process evaluation of public health interventions, the guidance is highly relevant to complex intervention research in other domains, such as health services and education. The full guidance ( www.populationhealthsciences.org/Process-Evaluation-Guidance.html ) begins by setting out the need for process evaluation. It then presents a review of influential theories and frameworks which informed its development, before offering practical recommendations, and six detailed case studies. In this article, we provide an overview of the new framework and summarise our practical recommendations using one of the case studies as an example.

MRC process evaluation framework

The new framework builds on the process evaluation themes described in the 2008 MRC complex interventions guidance (fig 1). 3 Although the role of theory within evaluation is contested, 5 6 we concur with the position set out in the 2008 guidance, which argued that an understanding of the causal assumptions underpinning the intervention and use of evaluation to understand how interventions work in practice are vital in building an evidence base that informs policy and practice. 1 Causal assumptions may be drawn from social science theory, although complex interventions will often also be informed by other factors such as past experience or common sense. An intervention as simple as a health information leaflet, for example, may reflect an assumption that increased knowledge of health consequences will trigger behavioural change. Explicitly stating causal assumptions about how the intervention will work can allow external scrutiny of its plausibility and help evaluators decide which aspects of the intervention or its context to prioritise for investigation. Our framework also emphasises the relations between implementation, mechanisms, and context. For example, implementation of a new intervention will be affected by its existing context, but a new intervention may also in turn change aspects of the context in which it is delivered.

Fig 1 Key functions of process evaluation and relations among them (blue boxes are the key components of a process evaluation. Investigation of these components is shaped by a clear intervention description and informs interpretation of outcomes)


Implementation: what is implemented, and how?

An intervention may have limited effects either because of weaknesses in its design or because it is not properly implemented. 7 On the other hand, positive outcomes can sometimes be achieved even when an intervention was not delivered fully as intended. 8 Hence, to begin to enable conclusions about what works, process evaluation will usually aim to capture fidelity (whether the intervention was delivered as intended) and dose (the quantity of intervention implemented). Complex interventions usually undergo some tailoring when implemented in different contexts. Capturing what is delivered in practice, with close reference to the theory of the intervention, can enable evaluators to distinguish between adaptations to make the intervention fit different contexts and changes that undermine intervention fidelity. 9 10 Unresolved debates regarding adaptation of interventions, and what is meant by intervention fidelity, are discussed at length in the full guidance.

In addition to what was delivered, process evaluation can usefully investigate how the intervention was delivered. 11 12 This can provide policy makers and practitioners with vital information about how the intervention might be replicated, as well as generalisable knowledge on how to implement complex interventions. Issues considered may include training and support, communication and management structures, and how these structures interact with implementers’ attitudes and circumstances to shape the intervention.

Process evaluations also commonly investigate the “reach” of interventions (whether the intended audience comes into contact with the intervention, and how). 13 There is no consensus on how best to divide the study of implementation into key subcomponents (such as fidelity, dose, and reach), and it is currently not possible to adjudicate between the various frameworks that attempt to do this. These issues are discussed further in the full guidance document.

Mechanisms of impact: how does the delivered intervention produce change?

Exploring the mechanisms through which interventions bring about change is crucial to understanding both how the effects of the specific intervention occurred and how these effects might be replicated by similar future interventions. 14 Process evaluations may test hypothesised causal pathways using quantitative data as well as using qualitative methods to better understand complex pathways or to identify unexpected mechanisms. 15

Context: how does context affect implementation and outcomes?

Context includes anything external to the intervention that may act as a barrier or facilitator to its implementation, or its effects. As described above, implementation will often vary from one context to another. However, an intervention may have different effects in different contexts even if its implementation does not vary. 16 Complex interventions work by introducing mechanisms that are sufficiently suited to their context to produce change, 17 while causes of problems targeted by interventions may differ from one context to another. Understanding context is therefore critical in interpreting the findings of a specific evaluation and generalising beyond it. Even where an intervention itself is relatively simple, its interaction with its context may still be highly complex.

Functions of process evaluation at different stages of development, evaluation, and implementation

The focus of process evaluation will vary according to the stage at which it is conducted. The MRC framework recommends a feasibility and piloting phase after an intervention has been developed. 1 3 At this stage, process evaluation can have a vital role in understanding the feasibility of the intervention and optimising its design and evaluation. However, at the next stage, evaluating effectiveness, the emphasis of process evaluation shifts towards providing greater confidence in conclusions about effectiveness by assessing the quantity and quality of what was delivered, and assessing the generalisability of its effectiveness by understanding the role of context. Even when a process evaluation has been conducted at the feasibility stage, another will usually be needed alongside the full trial because new problems are likely to emerge when the intervention is tested in a larger more diverse sample.

Planning, designing, conducting, and reporting a process evaluation

Box 1 summarises the key recommendations of the new MRC guidance for process evaluation. Given the diversity of complex interventions, the aims and methods of process evaluations will vary, but there are common considerations when developing and planning any such evaluation. The recommendations are not intended to be prescriptive but to help researchers to make decisions. Throughout this section, we have illustrated our points using one of the six case studies included in the full guidance, the process evaluation of the Welsh national exercise referral scheme (NERS) 8 18 19 ; this scheme aimed to improve physical activity through primary care referral to exercise professionals in local authority leisure centres.

Box 1: Key recommendations for process evaluation

Carefully define the parameters of relationships with intervention developers or implementers

Balance the need for sufficiently good working relationships to allow close observation, against the need to remain credible as independent evaluators

Agree whether evaluators will take an active role in communicating findings as they emerge (and helping correct implementation challenges) or have a more passive role

Ensure that the research team has the correct expertise. This may require:

Expertise in qualitative and quantitative research methods

Appropriate interdisciplinary theoretical expertise

Decide the degree of separation or integration between process and outcome evaluation teams

Ensure effective oversight by a principal investigator who values all evaluation components

Develop good communication systems to minimise duplication and conflict between process and outcomes evaluations

Ensure that plans for integration of process and outcome data are agreed from the outset

Design and conduct

Clearly describe the intervention and clarify causal assumptions (in relation to how it will be implemented, and the mechanisms through which it will produce change, in a specific context)

Identify key uncertainties and systematically select the most important questions to address

Identify potential questions by considering the assumptions represented by the intervention

Agree scientific and policy priority questions by considering the evidence for intervention assumptions and consulting the evaluation team and policy or practice stakeholders

Identify previous process evaluations of similar interventions and consider whether it is appropriate to replicate aspects of them and build on their findings

Select a combination of methods appropriate to the research questions:

Use quantitative methods to measure key process variables and allow testing of pre-hypothesised mechanisms of impact and contextual moderators

Use qualitative methods to capture emerging changes in implementation, experiences of the intervention and unanticipated or complex causal pathways, and to generate new theory

Balance collection of data on key process variables from all sites or participants with detailed data from smaller, purposively selected samples

Consider data collection at multiple time points to capture changes to the intervention over time

Provide descriptive quantitative information on fidelity, dose, and reach

Consider more detailed modelling of variations between participants or sites in terms of factors such as fidelity or reach (eg, are there socioeconomic biases in who received the intervention? See the illustrative sketch after this box)

Integrate quantitative process data into outcomes datasets to examine whether effects differ by implementation or prespecified contextual moderators, and test hypothesised mediators

Collect and analyse qualitative data iteratively so that themes that emerge in early interviews can be explored in later ones

Ensure that quantitative and qualitative analyses build upon one another (eg, qualitative data used to explain quantitative findings or quantitative data used to test hypotheses generated by qualitative data)

Where possible, initially analyse and report process data before trial outcomes are known to avoid biased interpretation

Transparently report whether process data are being used to generate hypotheses (analysis blind to trial outcomes), or for post-hoc explanation (analysis after trial outcomes are known)

Identify existing reporting guidance specific to the methods adopted

Report the logic model or intervention theory and clarify how it was used to guide selection of research questions and methods

Disseminate findings to policy and practice stakeholders

If multiple journal articles are published from the same process evaluation ensure that each article makes clear its context within the evaluation as a whole:

Publish a full report comprising all evaluation components or a protocol paper describing the whole evaluation, to which reference should be made in all articles

Emphasise contributions to intervention theory or methods development to enhance interest to a readership beyond the specific intervention in question
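
To make the "modelling variations in reach" recommendation above more concrete, the sketch below fits a logistic regression of scheme uptake on an area deprivation measure. It is a minimal illustration with simulated data, not part of the MRC guidance; the dataset and variable names (entered_scheme, imd_quintile) are hypothetical.

```python
# Illustrative sketch only: checking for socioeconomic patterning in reach
# (who entered the scheme), using simulated data. Variable names
# (entered_scheme, imd_quintile, age, sex) are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 2000
df = pd.DataFrame({
    "imd_quintile": rng.integers(1, 6, n),   # 1 = most deprived area quintile
    "age": rng.normal(55, 12, n).round(),
    "sex": rng.choice(["female", "male"], n),
})
# Simulate lower uptake in more deprived quintiles, purely for illustration
df["entered_scheme"] = rng.binomial(1, 0.25 + 0.05 * (df["imd_quintile"] - 1))

# Logistic regression of reach on deprivation, adjusting for age and sex
model = smf.logit("entered_scheme ~ C(imd_quintile) + age + C(sex)", data=df).fit()
print(model.summary())
print(np.exp(model.params))  # odds ratios for entering the scheme
```

In a real evaluation, a model of this kind could be run on routinely collected monitoring data of the sort used for the NERS reach analysis described later.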

Planning a process evaluation

Working with intervention developers and implementers

High quality process evaluation requires good working relationships with all stakeholders involved in intervention development or implementation. These can be difficult to establish—for example, because these stakeholders have professional or personal interests in portraying the intervention positively, or see evaluation as threatening. However, without good relationships, close observation of the intervention can be challenging. Evaluators also need to ensure that they maintain sufficient independence to observe the work of stakeholders critically. The NERS process evaluation identified serious problems with the implementation of some intervention components. 19 Evaluators needed to be close enough to the intervention to record these problems and understand why they occurred, yet sufficiently independent to report them to intervention stakeholders honestly. Transparent reporting of relationships with policy and practice stakeholders, and being mindful of how these affect the evaluation, is crucial.

One key challenge in working with intervention stakeholders is whether to communicate emerging findings. That is, should evaluators act as passive observers who feed findings back at the end of an evaluation, or should they help to correct problems in implementation as and when they appear? 20 A more active role is appropriate at the feasibility testing stage. However, when evaluating effectiveness, researchers will ideally not engage in continuous quality improvement activities because these may compromise the external validity of the evaluation. Agreeing systems for communicating information to stakeholders at the outset of the study may help to avoid perceptions of undue interference, or of the evaluator having withheld important information.

Resources and staffing

When planning a process evaluation, evaluators need to ensure that there is sufficient expertise and experience to decide on, and achieve, its aims. A process evaluation team will often require expertise in quantitative and qualitative research methods. Process evaluations will often need to draw on expertise from a range of relevant disciplines including, for example, public health, primary care, epidemiology, sociology, and psychology. Sufficient resources are required to allow collection and analysis of large quantities of diverse data, bearing in mind that analysis of qualitative data is especially time consuming.

Relationships within evaluation teams

Process evaluation will typically form part of a study that includes evaluation of outcomes and possibly cost effectiveness. Some evaluators choose to separate process and outcome teams, while in other cases they are combined. Box 2 gives some pros and cons of each model. If the teams are separate, effective communication is necessary to prevent duplication or conflict; with combined teams, there is a need for transparency about how the combination might influence the conduct and interpretation of the evaluation. Effective integration of evaluation components is more likely when members of a team respect and value each other’s work, and when the overall study is overseen by a principal investigator who values integration. 21

Box 2: Separation or integration of process evaluation and outcome evaluation teams?

Arguments for separation

Separation may reduce potential biases in analysis of outcomes data arising from feedback on the perceived functioning of the intervention

In controlled trials, process evaluators cannot be blinded to treatment condition. Those collecting or analysing outcomes data ought to be blinded where possible

Analysing process data without knowledge of trial outcomes prevents fishing for explanations and biasing interpretations. Although it may not always be practical to delay outcomes analysis until process analyses are complete, if separate researchers are responsible for each part it may be possible to conduct the analyses concurrently without biasing the results

Process evaluation may produce data that would be hard for those with vested interests in the trial to analyse and report dispassionately

If implementers or participants have concerns about a trial, a degree of separation from the trial may make it easier for process evaluators to build rapport and understand their concerns

Arguments for integration

Process evaluators and outcomes evaluators will want to work together to ensure that data on implementation can be integrated into analysis of outcomes, or that data on emerging process issues can be integrated into trial data collections

Data on intermediate outcomes and causal processes identified by process evaluators may inform integration of new measures into outcomes data collections

If some relevant process measures are already being collected as part of the outcomes evaluation, it is important to avoid duplication of efforts and reduce measurement burden for participants

One component of data collection should not compromise another. For example, if collection of process data is causing a high measurement burden for participants, this may lead to lower response to outcomes assessments

Designing and conducting a process evaluation

Describing the intervention and clarifying causal assumptions

A clear description of the intended intervention, how it will be implemented, and how it is expected to work, will ideally have been developed before evaluation. In such cases, designing a process evaluation will begin by reviewing these descriptions to decide what requires investigation. Any ambiguity over what the intervention is, or how it is intended to work, should be resolved with the intervention developers before the design of the process evaluation is finalised. Evaluators of NERS had limited involvement in the development of the intervention, which was a Welsh government policy initiative. Hence, when evaluation began, some ambiguity remained over the content of the intervention and how it was intended to work. Evaluators worked with intervention developers to resolve this ambiguity, but as this took place after the evaluation had started, the time available to develop robust measures of some key activities was limited. 8

It is useful if interventions and their evaluations draw explicitly on existing theories so that these can be tested and refined. However, when an intervention’s development is driven by other factors, such as experience or common sense, it is important to be open about this and clear about what these assumptions are, rather than trying to force an established theoretical framework to fit the intervention. Evaluators should also avoid focusing narrowly on inappropriate theories from a single discipline. For example, psychological theory may be useful for interventions that work at the individual level but is less useful when intervening with organisations or at wider social levels. 22

Depicting the intervention in a logic model can help clarify causal assumptions. 23 Fig 2 gives an example for INCLUSIVE, a school based intervention that aimed to reduce bullying and improve student health by implementing “restorative practices” across the whole school. 24 The logic model was based on Markham and Aveyard’s theory of human functioning and school organisation, which suggests that health benefits would be mediated by whether students were connected to their school’s learning and community. 25 This led the authors to identify measures of commitment and belonging as intermediate outcomes. 26

Fig 2 Logic model for the INCLUSIVE intervention to reduce violence and aggression in schools 24
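
As a complement to a diagram, a logic model can also be written down as a simple data structure so that each hypothesised link is explicit and can be matched to a planned measure. The sketch below is illustrative only; the labels paraphrase the INCLUSIVE description above and are not the published logic model.

```python
# Illustrative sketch only: a logic model recorded as data, with labels
# paraphrased from the INCLUSIVE example in the text (not the published model).
logic_model = {
    "intervention": "whole school restorative practices",
    "mechanisms": [
        "students connected to the school's learning",
        "students connected to the school community",
    ],
    "intermediate_outcomes": ["commitment", "belonging"],
    "outcomes": ["reduced bullying", "improved student health"],
}

# Walk the hypothesised causal chain; each printed link is a candidate
# process or outcome measure for the evaluation.
for mechanism in logic_model["mechanisms"]:
    print(f"{logic_model['intervention']} -> {mechanism}")
for intermediate in logic_model["intermediate_outcomes"]:
    print(f"mechanisms -> {intermediate} (intermediate outcome measure)")
for outcome in logic_model["outcomes"]:
    print(f"intermediate outcomes -> {outcome}")
```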

Learning from previous process evaluations

When designing a process evaluation, it is important to be mindful that the results may later be included in systematic reviews. Process evaluation will provide the information on implementation and context that Waters and colleagues argue is essential if reviews are to assist decision makers. 27 It is therefore helpful if process evaluations of similar interventions build on one another’s findings, using comparable methods if possible, so that reviewers can make meaningful comparisons across studies.

Deciding core research questions

Process evaluations cannot expect to provide answers to all of the uncertainties of a complex intervention. 28 It is generally better to answer the most important questions well than to try to answer too many questions and do so unsatisfactorily. To identify core questions, evaluators may start by listing causal assumptions within the intervention manual or logic model and establishing which have the most limited evidence base. This can be done by reviewing the literature, consultation with policy and practice stakeholders, and discussions within the research team. Complex interventions are inherently unpredictable. Evaluators may therefore identify additional questions during the course of their evaluation. Hence, although clear focus from the outset is vital, process evaluations must be designed with sufficient flexibility and resources to allow important emerging questions to be addressed.

Selecting methods

Figure 3 lists some common data collection and analysis methods adopted by process evaluations, the merits of which should be considered carefully in relation to the research questions. Process evaluation of complex interventions usually requires a combination of quantitative and qualitative methods, but their relative importance may vary according to the status of the evidence base or stage of the evaluation process. At the feasibility and piloting stage, basic quantitative measures of implementation may be combined with in-depth qualitative data to provide detailed understandings of intervention functioning on a small scale.

Fig 3 Commonly used data collection and analysis methods for process evaluation

When evaluating effectiveness, collection of quantitative process measures to allow testing of hypothesised pathways or to measure contextual factors may be a priority. If directly relevant qualitative data are already available (for example, from an earlier feasibility study), evaluators may choose not to collect extensive qualitative process data while evaluating effectiveness. However, collecting additional qualitative data may still help in understanding issues arising from the movement from a small scale feasibility study to a larger scale evaluation involving greater diversity in implementers, settings, and participants.

Key methodological considerations include sampling and timing of data collection. Interviewing every implementer may not provide greater insights than interviewing a small well selected sample, and may lead to overwhelming volumes of data. Conducting observations in every site may be prohibitively expensive and unduly influence implementation. Conversely, there are dangers in collecting data from only a few sites in order to draw conclusions regarding the intervention as a whole. 28 Hence, when feasible, it is often useful to combine quantitative data on key process variables from all sites or participants with in-depth qualitative data from samples purposively selected along dimensions expected to influence the functioning of the intervention. Collecting data at multiple time points may be useful because interventions can suffer from teething problems which are rectified as the evaluation progresses.

Within the NERS process evaluation, quantitative measures included structured observations of audio recorded patient consultations. These were used to examine aspects of fidelity (such as consistency with motivational interviewing principles), and dose (such as the duration of consultations). Sociodemographic patterning in entry to the scheme (reach) was evaluated using routinely collected monitoring data. 8 Quantitative measures of hypothesised psychological mechanisms, including motivation for exercise and confidence, were collected as part of the trial. 18 Qualitative interviews were conducted with patients, exercise professionals, scheme coordinators, and health professionals. These focused on challenges in implementation across contexts and how NERS was perceived to work in practice. 8

Analysis of process data, and integration of process and outcome data

Analysis of quantitative process data will usually begin with descriptive statistics relating to questions such as fidelity, dose, and reach. Subsequently, integrating quantitative process measures into outcomes datasets can help to understand how, for example, implementation variability affected outcomes (on-treatment analyses) and test hypotheses arising from qualitative analyses. Some argue that initial analysis of process data should be conducted before the outcomes analysis to avoid biased interpretation of process data. 29 If this model is followed, process data may provide prospective insights into why evaluators might subsequently expect to see positive or negative overall effects and generate hypotheses about how variability in outcomes may emerge. 30
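
As a hedged illustration of this sequence (descriptive statistics on fidelity, dose, and reach, followed by integration of process measures with outcomes), the Python sketch below uses simulated data. All dataset and variable names (site, fidelity, dose, arm, outcome) are hypothetical, and the analysis is exploratory rather than a prescribed method.

```python
# Illustrative sketch only: describe implementation first, then integrate
# process measures with trial outcomes. Data are simulated; names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Site level process measures
sites = pd.DataFrame({
    "site": range(20),
    "fidelity": rng.uniform(0.4, 1.0, 20),  # proportion of core components delivered
    "dose": rng.integers(4, 13, 20),        # number of sessions delivered
})
print(sites[["fidelity", "dose"]].describe())  # descriptive statistics first

# Participant level outcomes (simulated trial data)
participants = pd.DataFrame({
    "site": rng.integers(0, 20, 800),
    "arm": rng.integers(0, 2, 800),         # 0 = control, 1 = intervention
})
site_fidelity = sites.set_index("site").loc[participants["site"], "fidelity"].to_numpy()
participants["outcome"] = rng.normal(0, 1, 800) + 0.3 * participants["arm"] * site_fidelity

# Integrate process measures into the outcomes dataset
merged = participants.merge(sites, on="site")

# Exploratory question: does the intervention effect vary with delivered fidelity?
model = smf.ols("outcome ~ arm * fidelity", data=merged).fit(
    cov_type="cluster", cov_kwds={"groups": merged["site"]}
)
print(model.summary())
```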

In the NERS process evaluation, implementation measures indicated that the intervention comprised a common core of health professional referrals to discounted, supervised, group based exercise. However, some activities, such as motivational interviewing and goal setting, were poorly delivered. 8 Nevertheless, qualitative data (analysed before trial outcomes were available) indicated that patient motivation was supported by other mechanisms, such as social support from other patients. 8 Subsequently, integration of quantitative measures of psychological change mechanisms with trial outcomes data indicated that significant improvement in physical activity was explained by change in motivation for exercise. 18 Hence, the integration of qualitative and quantitative process data with trial outcomes helped to clarify complex causal pathways.
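
The mediation finding described above can be illustrated with a simple product-of-coefficients check. The sketch below is a minimal example on simulated data, in the spirit of (but not reproducing) the NERS analysis; variable names are hypothetical, and a real analysis would use the trial's actual models and bootstrapped confidence intervals.

```python
# Illustrative sketch only: product-of-coefficients mediation check on
# simulated data (does change in motivation explain the effect on activity?).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 1000
arm = rng.integers(0, 2, n)                           # 0 = control, 1 = exercise referral
motivation_change = 0.5 * arm + rng.normal(0, 1, n)   # hypothesised mediator
physical_activity = 0.6 * motivation_change + 0.1 * arm + rng.normal(0, 1, n)
df = pd.DataFrame({"arm": arm,
                   "motivation_change": motivation_change,
                   "physical_activity": physical_activity})

# Path a: effect of the intervention on the mediator
a = smf.ols("motivation_change ~ arm", data=df).fit().params["arm"]
# Path b and direct effect: outcome on mediator, adjusting for the intervention
model_b = smf.ols("physical_activity ~ arm + motivation_change", data=df).fit()
b = model_b.params["motivation_change"]

print(f"indirect (mediated) effect a*b = {a * b:.2f}")
print(f"direct effect of intervention  = {model_b.params['arm']:.2f}")
```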

Reporting findings

Reporting guidelines for health research are available on the EQUATOR network website ( www.equator-network.org/home ), but producing guidelines specific to process evaluation is challenging because process evaluations vary so much. Key considerations include reporting relations between quantitative and qualitative components, and the relation of the process evaluation to other evaluation components, such as outcomes or economic evaluation. It is also useful to report assumptions about how the intervention works (ideally in a logic model), and how these informed the selection of research questions and methods. 31 Reporting in the peer reviewed literature will often require multiple articles. To maintain sight of the broader picture, all journal articles should refer to other articles published from the study or to a protocol paper or report that clarifies how the component publications relate to the overall evaluation. When process evaluation has been conducted to interpret trial outcomes, this purpose needs to be clear in the published papers, with process evaluation data explicitly linked to trial outcomes in the discussion. It is also important to report in lay formats for people who delivered the intervention or who will be making decisions about its future implementation.

Cite this as: BMJ 2015;350:h1258

Contributors: GM led the development of the guidance, wrote the first draft of the article, and the full guidance document which it describes, and integrated contributions from the author group into subsequent drafts. JB was the lead applicant for the funding to conduct the work and chaired the author group. All authors contributed to the design and content of the guidance and subsequent drafts of the paper. GM acts as guarantor.

Funding: The work was funded by the MRC Population Health Science Research Network (PHSRN45).

Competing interests: All authors have read and understood BMJ policy on declaration of interests and have no relevant interests to declare.

Provenance and peer review: Not commissioned; externally peer reviewed.

This is an Open Access article distributed in accordance with the terms of the Creative Commons Attribution (CC BY 4.0) license, which permits others to distribute, remix, adapt and build upon this work, for commercial use, provided the original work is properly cited. See: http://creativecommons.org/licenses/by/4.0/ .

References

1. Craig P, Dieppe P, Macintyre S, et al. Developing and evaluating complex interventions: the new Medical Research Council guidance. BMJ 2008;337:a1655.
2. Campbell M, Fitzpatrick R, Haines A, et al. Framework for design and evaluation of complex interventions to improve health. BMJ 2000;321:694-96.
3. Craig P, Dieppe P, Macintyre S, et al. Developing and evaluating complex interventions: new guidance. MRC, 2008.
4. Moore G, Audrey S, Barker M, et al. Process evaluation in complex public health intervention studies: the need for guidance. J Epidemiol Community Health 2014;68:101-02.
5. Fretheim A, Flottorp S, Oxman AD. It is a capital mistake to theorize before one has data: a response to Eccles’ criticism of the OFF theory of research utilization. J Clin Epidemiol 2005;58:119-20.
6. De Silva M, Breuer E, Lee L, et al. Theory of change: a theory-driven approach to enhance the Medical Research Council’s framework for complex interventions. Trials 2014;15:267.
7. Steckler A, Linnan L, eds. Process evaluation for public health interventions and research. Jossey-Bass, 2002.
8. Moore GF, Raisanen L, Moore L, et al. Mixed-method process evaluation of the Welsh National Exercise Referral Scheme. Health Education 2013;113:476-501.
9. Hawe P, Shiell A, Riley T. Complex interventions: how “out of control” can a randomised controlled trial be? BMJ 2004;328:1561-63.
10. Bumbarger B, Perkins D. After randomised trials: issues related to dissemination of evidence-based interventions. J Children Serv 2008;3:55-64.
11. Carroll C, Patterson M, Wood S, et al. A conceptual framework for implementation fidelity. Implement Sci 2007;2:40.
12. Montgomery P, Underhill K, Gardner F, et al. The Oxford Implementation Index: a new tool for incorporating implementation data into systematic reviews and meta-analyses. J Clin Epidemiol 2013;66:874-82.
13. Glasgow RE, Vogt TM, Boles SM. Evaluating the public health impact of health promotion interventions: the RE-AIM framework. Am J Public Health 1999;89:1322-27.
14. Grant A, Treweek S, Dreischulte T, et al. Process evaluations for cluster-randomised trials of complex interventions: a proposed framework for design and reporting. Trials 2013;14:15.
15. Bonell C, Fletcher A, Morton M, et al. Realist randomised controlled trials: a new approach to evaluating complex public health interventions. Soc Sci Med 2012;75:2299-306.
16. Shiell A, Hawe P, Gold L. Complex interventions or complex systems? Implications for health economic evaluation. BMJ 2008;336:1281-83.
17. Pawson R, Tilley N. Realistic evaluation. Sage, 1997.
18. Littlecott H, Moore G, Moore L, et al. Psychosocial mediators of change in physical activity in the Welsh national exercise referral scheme: secondary analysis of a randomised controlled trial. Int J Behav Nutr Phys Act 2014;11:109.
19. Moore GF, Moore L, Murphy S. Integration of motivational interviewing into practice in the national exercise referral scheme in Wales: a mixed methods study. Behav Cogn Psychother 2012;40:313-30.
20. Audrey S, Holliday J, Parry-Langdon N, et al. Meeting the challenges of implementing process evaluation within randomized controlled trials: the example of ASSIST (A Stop Smoking in Schools Trial). Health Educ Res 2006;21:366-77.
21. O’Cathain A, Murphy E, Nicholl J. Multidisciplinary, interdisciplinary, or dysfunctional? Team working in mixed-methods research. Qual Health Res 2008;18:1574-85.
22. Hawe P, Shiell A, Riley T. Theorising interventions as events in systems. Am J Community Psychol 2009;43:267-76.
23. WK Kellogg Foundation. Logic model development guide. WK Kellogg Foundation, 2004.
24. Bonell C, Fletcher A, Fitzgerald-Yau N, et al. Initiating change locally in bullying and aggression through the school environment (INCLUSIVE): pilot randomised controlled trial. Health Technol Assess (forthcoming).
25. Markham WA, Aveyard P. A new theory of health promoting schools based on human functioning, school organisation and pedagogic practice. Soc Sci Med 2003;56:1209-20.
26. Sawyer MG, Pfeiffer S, Spence SH, et al. School based prevention of depression: a randomised controlled study of the beyondblue schools research initiative. J Child Psychol Psychiatry 2010;51:199-209.
27. Waters E, Hall BJ, Armstrong R, et al. Essential components of public health evidence reviews: capturing intervention complexity, implementation, economics and equity. J Public Health 2011;33:462-65.
28. Munro A, Bloor M. Process evaluation: the new miracle ingredient in public health research? Qualitative Research 2010;10:699-713.
29. Oakley A, Strange V, Bonell C, et al. Health services research—process evaluation in randomised controlled trials of complex interventions. BMJ 2006;332:413-16.
30. Mermelstein R. Moving tobacco prevention outside the classroom. Lancet 2008;371:1556-57.
31. Armstrong R, Waters E, Moore L, et al. Improving the reporting of public health intervention research: advancing TREND and CONSORT. J Public Health 2008;30:103-09.



A new framework for developing and evaluating complex interventions: update of Medical Research Council guidance

Kathryn Skivington

1 MRC/CSO Social and Public Health Sciences Unit, Institute of Health and Wellbeing, University of Glasgow, Glasgow, UK

Lynsay Matthews

Sharon Anne Simpson, Peter Craig, Janis Baird

2 Medical Research Council Lifecourse Epidemiology Unit, University of Southampton, Southampton, UK

Jane M Blazeby

3 Medical Research Council ConDuCT-II Hub for Trials Methodology Research and Bristol Biomedical Research Centre, Bristol, UK

Kathleen Anne Boyd

4 Health Economics and Health Technology Assessment Unit, Institute of Health and Wellbeing, University of Glasgow, Glasgow, UK

5 Public Health Scotland, Glasgow, UK

David P French

6 Manchester Centre for Health Psychology, University of Manchester, Manchester, UK

Emma McIntosh

Mark Petticrew

7 London School of Hygiene and Tropical Medicine, London, UK

Jo Rycroft-Malone

8 Faculty of Health and Medicine, Lancaster University, Lancaster, UK

Martin White

9 Medical Research Council Epidemiology Unit, University of Cambridge, Cambridge, UK

Laurence Moore

The UK Medical Research Council’s widely used guidance for developing and evaluating complex interventions has been replaced by a new framework, commissioned jointly by the Medical Research Council and the National Institute for Health Research, which takes account of recent developments in theory and methods and the need to maximise the efficiency, use, and impact of research.

Complex interventions are commonly used in health and social care services, public health practice, and other areas of social and economic policy that have consequences for health. Such interventions are delivered and evaluated at different levels, from individual to societal levels. Examples include a new surgical procedure, the redesign of a healthcare programme, and a change in welfare policy. The UK Medical Research Council (MRC) published a framework for researchers and research funders on developing and evaluating complex interventions in 2000 and revised guidance in 2006. 1 2 3 Although these documents continue to be widely used and are now accompanied by a range of more detailed guidance on specific aspects of the research process, 4 5 6 7 8 several important conceptual, methodological and theoretical developments have taken place since 2006. These developments have been included in a new framework commissioned by the National Institute for Health Research (NIHR) and the MRC. 9 The framework aims to help researchers work with other stakeholders to identify the key questions about complex interventions, and to design and conduct research with a diversity of perspectives and appropriate choice of methods.

Summary points

  • Complex intervention research can take an efficacy, effectiveness, theory based, and/or systems perspective, the choice of which is based on what is known already and what further evidence would add most to knowledge
  • Complex intervention research goes beyond asking whether an intervention works in the sense of achieving its intended outcome—to asking a broader range of questions (eg, identifying what other impact it has, assessing its value relative to the resources required to deliver it, theorising how it works, taking account of how it interacts with the context in which it is implemented, how it contributes to system change, and how the evidence can be used to support real world decision making)
  • A trade-off exists between precise unbiased answers to narrow questions and more uncertain answers to broader, more complex questions; researchers should answer the questions that are most useful to decision makers rather than those that can be answered with greater certainty
  • Complex intervention research can be considered in terms of phases, although these phases are not necessarily sequential: development or identification of an intervention, assessment of feasibility of the intervention and evaluation design, evaluation of the intervention, and impactful implementation
  • At each phase of the research, a common set of core elements should be considered by asking the following questions:
  • How does the intervention interact with its context?
  • What is the underpinning programme theory?
  • How can diverse stakeholder perspectives be included in the research?
  • What are the key uncertainties?
  • How can the intervention be refined?
  • What are the comparative resource and outcome consequences of the intervention?
  • The answers to these questions should be used to decide whether the research should proceed to the next phase, return to a previous phase, repeat a phase, or stop

Development of the Framework for Developing and Evaluating Complex Interventions

The updated Framework for Developing and Evaluating Complex Interventions is the culmination of a process that included four stages:

  • A gap analysis to identify developments in the methods and practice since the previous framework was published
  • A full-day expert workshop in May 2018, with 36 participants, to discuss the topics identified in the gap analysis
  • An open consultation on a draft of the framework in April 2019, whereby we sought stakeholder opinion by advertising via social media, email lists and other networks for written feedback (52 detailed responses were received from stakeholders internationally)
  • A redraft using findings from the previous stages, followed by a final expert review.

We also sought stakeholder views at various interactive workshops throughout the development of the framework: at the annual meetings of the Society for Social Medicine and Population Health (2018), the UK Society for Behavioural Medicine (2017, 2018), and internationally at the International Congress of Behavioural Medicine (2018). The entire process was overseen by a scientific advisory group representing the range of relevant NIHR programmes and MRC population health investments. The framework was reviewed by the MRC-NIHR Methodology Research Programme Advisory Group and then approved by the MRC Population Health Sciences Group in March 2020 before undergoing further external peer and editorial review through the NIHR Journals Library peer review process. More detailed information and the methods used to develop this new framework are described elsewhere. 9 This article introduces the framework and summarises the main messages for producers and users of evidence.

What are complex interventions?

An intervention might be considered complex because of properties of the intervention itself, such as the number of components involved; the range of behaviours targeted; expertise and skills required by those delivering and receiving the intervention; the number of groups, settings, or levels targeted; or the permitted level of flexibility of the intervention or its components. For example, the Links Worker Programme was an intervention in primary care in Glasgow, Scotland, that aimed to link people with community resources to help them “live well” in their communities. It targeted individual, primary care (general practitioner (GP) surgery), and community levels. The intervention was flexible in that it could differ between primary care GP surgeries. In addition, the Link Workers did not support just one specific health or wellbeing issue: bereavement, substance use, employment, and learning difficulties were all included. 10 11 The complexity of this intervention had implications for many aspects of its evaluation, such as the choice of appropriate outcomes and processes to assess.

Flexibility in intervention delivery and adherence might be permitted to allow for variation in how, where, and by whom interventions are delivered and received. Standardisation of interventions could relate more to the underlying process and functions of the intervention than to the specific form of components delivered. 12 For example, in surgical trials, protocols can be designed with flexibility for intervention delivery. 13 Interventions require a theoretical deconstruction into components and then agreement about permissible and prohibited variation in the delivery of those components. This approach allows implementation of a complex intervention to vary across different contexts yet maintain the integrity of the core intervention components. Drawing on this approach in the ROMIO pilot trial, core components of minimally invasive oesophagectomy were agreed and subsequently monitored during main trial delivery using photography. 14

Complexity might also arise through interactions between the intervention and its context, by which we mean “any feature of the circumstances in which an intervention is conceived, developed, implemented and evaluated.” 6 15 16 17 Much of the criticism of, and many of the extensions to, the existing framework and guidance have focused on the need for greater attention to understanding how and under what circumstances interventions bring about change. 7 15 18 The importance of interactions between the intervention and its context emphasises the value of identifying mechanisms of change, where mechanisms are the causal links between intervention components and outcomes, and contextual factors, which determine and shape whether and how outcomes are generated. 19

Thus, attention is given not only to the design of the intervention itself but also to the conditions needed to realise its mechanisms of change and/or the resources required to support intervention reach and impact in real world implementation. For example, in a cluster randomised trial of ASSIST (a peer led, smoking prevention intervention), researchers found that the intervention worked particularly well in cohesive communities that were served by one secondary school where peer supporters were in regular contact with their peers—a key contextual factor consistent with diffusion of innovation theory, which underpinned the intervention design. 20 A process evaluation conducted alongside a trial of robot assisted surgery identified key contextual factors to support effective implementation of this procedure, including engaging staff at different levels and surgeons who would not be using robot assisted surgery, whole team training, and an operating theatre of suitable size. 21

With this framing, complex interventions can helpfully be considered as events in systems. 16 Thinking about systems helps us understand the interaction between an intervention and the context in which it is implemented in a dynamic way. 22 Systems can be thought of as complex and adaptive, 23 characterised by properties such as emergence, feedback, adaptation, and self-organisation ( table 1 ).

Table 1: Properties and examples of complex adaptive systems

  • Emergence: complex systems have emergent, often unanticipated, properties that are a feature of the system as a whole. Example: group based interventions that target young people at risk could be undermined by the emergence of new social relationships among the group that increase members’ exposure to risk behaviours, while reducing their contact with other young people less tolerant of risk taking.
  • Feedback: one change reinforces, promotes, balances, or diminishes another. Example: a smoking ban in public places reduces the visibility and convenience of smoking; fewer young people start smoking, further reducing its visibility, in a reinforcing loop.
  • Adaptation: the system changes its behaviour in response to an intervention. Example: retailers adapted to the ban on multi-buy discounts by discounting individual alcohol products, offering them at the same price individually as they would have been if part of a multi-buy offer.
  • Self-organisation: order arises from spontaneous local interaction rather than a preconceived plan or external control. Example: recognising that individual treatment did not address some social aspects of alcohol dependency, recovering drinkers self-organised to form Alcoholics Anonymous.

For complex intervention research to be most useful to decision makers, it should take into account the complexity that arises both from the intervention’s components and from its interaction with the context in which it is being implemented.
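
The feedback row of table 1 describes a reinforcing loop. Purely as an illustration of how such a loop behaves over time (not as part of the framework), the toy simulation below uses invented numbers: once uptake depends on the visibility of smoking, prevalence and visibility fall together after a ban.

```python
# Illustrative toy simulation of the reinforcing feedback loop in table 1.
# All numbers are invented; this is not a calibrated tobacco control model.
initial_prevalence = 0.25
prevalence = initial_prevalence
ban_effect = 0.6          # the ban immediately reduces the visibility of smoking
quit_rate = 0.08          # annual proportion of smokers who quit
baseline_uptake = 0.03    # annual uptake (share of population) at full visibility

print("year  visibility  uptake  prevalence")
for year in range(1, 11):
    visibility = ban_effect * prevalence / initial_prevalence  # visibility tracks prevalence
    uptake = baseline_uptake * visibility                      # fewer start when less visible
    prevalence = prevalence * (1 - quit_rate) + uptake
    print(f"{year:>4}  {visibility:10.2f}  {uptake:6.3f}  {prevalence:10.3f}")
```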

Research perspectives

The previous framework and guidance were based on a paradigm in which the salient question was to identify whether an intervention was effective. Complex intervention research driven primarily by this question could fail to deliver interventions that are implementable, cost effective, transferable, and scalable in real world conditions. To deliver solutions for real world practice, complex intervention research requires strong and early engagement with patients, practitioners, and policy makers, shifting the focus from the “binary question of effectiveness” 26 to whether and how the intervention will be acceptable, implementable, cost effective, scalable, and transferable across contexts. In line with a broader conception of complexity, the scope of complex intervention research needs to include the development, identification, and evaluation of whole system interventions and the assessment of how interventions contribute to system change. 22 27 The new framework therefore takes a pluralistic approach and identifies four perspectives that can be used to guide the design and conduct of complex intervention research: efficacy, effectiveness, theory based, and systems ( table 2 ).

Table 2: Research perspectives, illustrated with a vaccine study example

  • Efficacy: to what extent does the intervention produce the intended outcomes in experimental or ideal settings? Conducted under idealised conditions; maximises internal validity to provide a precise, unbiased estimate of efficacy. Vaccine example: seeks to measure the effect of the vaccine on immune system response and report its safety.
  • Effectiveness: to what extent does the intervention produce the intended outcomes in real world settings? The intervention is often compared against treatment as usual; results inform choices between an established and a novel approach to achieving the desired outcome. Vaccine example: seeks to determine whether the vaccination programme, implemented in a range of real world populations and settings, is effective in terms of what it set out to do (eg, prevent disease).
  • Theory based: what works in which circumstances and how? Aims to understand how change is brought about, including the interplay of mechanisms and context; can lead to refinement of theory. Vaccine example: asks why effectiveness varies across contexts, and what this variation indicates about the conditions for a successful vaccination programme; considerations that might be explored go beyond whether the vaccine works.
  • Systems: how do the system and intervention adapt to one another? Treats the intervention as a disruption to a complex system. Vaccine example: seeks to understand the dynamic interdependence of vaccination rollout, population risk of infection, and willingness to be vaccinated, as the vaccination programme proceeds.

Although each research perspective prompts different types of research question, they should be thought of as overlapping rather than mutually exclusive. For example, theory based and systems perspectives to evaluation can be used in conjunction, 33 while an effectiveness evaluation can draw on a theory based or systems perspective through an embedded process evaluation to explore how and under what circumstances outcomes are achieved. 34 35 36

Most complex health intervention research so far has taken an efficacy or effectiveness perspective and for some research questions these perspectives will continue to be the most appropriate. However, some questions equally relevant to the needs of decision makers cannot be answered by research restricted to an efficacy or effectiveness perspective. A wider range and combination of research perspectives and methods, which answer questions beyond efficacy and effectiveness, need to be used by researchers and supported by funders. Doing so will help to improve the extent to which key questions for decision makers can be answered by complex intervention research. Example questions include:

  • Will this effective intervention reproduce the effects found in the trial when implemented here?
  • Is the intervention cost effective?
  • What are the most important things we need to do that will collectively improve health outcomes?
  • In the absence of evidence from randomised trials and the infeasibility of conducting such a trial, what does the existing evidence suggest is the best option now and how can this be evaluated?
  • What wider changes will occur as a result of this intervention?
  • How are the intervention effects mediated by different settings and contexts?

Phases and core elements of complex intervention research

The framework divides complex intervention research into four phases: development or identification of the intervention, feasibility, evaluation, and implementation ( fig 1 ). A research programme might begin at any phase, depending on the key uncertainties about the intervention in question. Repeating phases is preferable to automatic progression if uncertainties remain unresolved. Each phase has a common set of core elements—considering context, developing and refining programme theory, engaging stakeholders, identifying key uncertainties, refining the intervention, and economic considerations. These elements should be considered early and continually revisited throughout the research process, and especially before moving between phases (for example, between feasibility testing and evaluation).


Framework for developing and evaluating complex interventions. Context=any feature of the circumstances in which an intervention is conceived, developed, evaluated, and implemented; programme theory=describes how an intervention is expected to lead to its effects and under what conditions—the programme theory should be tested and refined at all stages and used to guide the identification of uncertainties and research questions; stakeholders=those who are targeted by the intervention or policy, involved in its development or delivery, or more broadly those whose personal or professional interests are affected (that is, who have a stake in the topic)—this includes patients and members of the public as well as those linked in a professional capacity; uncertainties=identifying the key uncertainties that exist, given what is already known and what the programme theory, research team, and stakeholders identify as being most important to discover—these judgments inform the framing of research questions, which in turn govern the choice of research perspective; refinement=the process of fine tuning or making changes to the intervention once a preliminary version (prototype) has been developed; economic considerations=determining the comparative resource and outcome consequences of the interventions for those people and organisations affected

Core elements

Context

The effects of a complex intervention might often be highly dependent on context, such that an intervention that is effective in some settings could be ineffective or even harmful elsewhere. 6 As the examples in table 1 show, interventions can modify the contexts in which they are implemented, by eliciting responses from other agents, or by changing behavioural norms or exposure to risk, so that their effects will also vary over time. Context is both dynamic and multidimensional. Key dimensions include physical, spatial, organisational, social, cultural, political, or economic features of the healthcare, health system, or public health contexts in which interventions are implemented. For example, the evaluation of the Breastfeeding In Groups intervention found that the context of the different localities (eg, staff morale and suitable premises) influenced policy implementation and was an explanatory factor in why breastfeeding rates increased in some intervention localities and declined in others. 37

Programme theory

Programme theory describes how an intervention is expected to lead to its effects and under what conditions. It articulates the key components of the intervention and how they interact, the mechanisms of the intervention, the features of the context that are expected to influence those mechanisms, and how those mechanisms might influence the context. 38 Programme theory can be used to promote shared understanding of the intervention among diverse stakeholders, and to identify key uncertainties and research questions. Where an intervention (such as a policy) is developed by others, researchers still need to theorise the intervention before attempting to evaluate it. 39 Best practice is to develop programme theory at the beginning of the research project with involvement of diverse stakeholders, based on evidence and theory from relevant fields, and to refine it during successive phases. The EPOCH trial tested a large scale quality improvement programme aimed at improving 90 day survival rates for patients undergoing emergency abdominal surgery; it included a well articulated programme theory at the outset, which supported the tailoring of programme delivery to local contexts. 40 Developing and implementing the programme theory, and reflecting on it after the study, generated suggested improvements for future implementation of the quality improvement programme.

A refined programme theory is an important evaluation outcome and is the principal aim where a theory based perspective is taken. Improved programme theory will help inform transferability of interventions across settings and help produce evidence and understanding that is useful to decision makers. In addition to a full written articulation of the programme theory, a visual representation can be helpful, for example a logic model, 41 42 43 realist matrix, 44 or system map, 45 with the choice depending on which is most appropriate for the research perspective and research questions. Although useful, any single visual representation is unlikely to articulate the programme theory sufficiently on its own; the theory should always be articulated well within the text of publications, reports, and funding applications.
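
A programme theory is usually drawn as a diagram, but holding it in a simple structured form can also help a team keep the hypothesised causal chain explicit and easy to revisit as evidence accumulates. The sketch below is a minimal illustration of this idea, assuming Python with the networkx library; the nodes, the causal chain, and the contextual factor are all invented for illustration and are not drawn from the framework or the studies cited above.

```python
# Minimal sketch: a hypothetical logic model held as a directed graph.
# All node names and relationships are invented for illustration.
import networkx as nx

logic_model = nx.DiGraph()

# Inputs -> activities -> outputs -> mechanisms -> outcomes.
logic_model.add_edges_from([
    ("trained peer supporters", "weekly support sessions"),
    ("weekly support sessions", "sessions delivered per school"),
    ("sessions delivered per school", "change in smoking norms"),
    ("change in smoking norms", "reduced smoking uptake"),
])

# A contextual factor hypothesised to influence one mechanism.
logic_model.add_edge("school staff support", "weekly support sessions",
                     relation="context")

# The assumed causal chain should be readable left to right (no cycles).
assert nx.is_directed_acyclic_graph(logic_model)

for upstream, downstream, data in logic_model.edges(data=True):
    print(f"{upstream} --{data.get('relation', 'leads to')}--> {downstream}")
```

The same structure can later be exported to a drawing tool to produce the published logic model, so the diagram and the underlying theory stay in step.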

Stakeholders

Stakeholders include those individuals who are targeted by the intervention or policy, those involved in its development or delivery, or those whose personal or professional interests are affected (that is, all those who have a stake in the topic). Patients and the public are key stakeholders. Meaningful engagement with appropriate stakeholders at each phase of the research is needed to maximise the potential of developing or identifying an intervention that is likely to have positive impacts on health and to enhance prospects of achieving changes in policy or practice. For example, patient and public involvement 46 activities in the PARADES programme, which evaluated approaches to reduce harm and improve outcomes for people with bipolar disorder, were wide ranging and central to the project. 47 Involving service users with lived experience of bipolar disorder had many benefits: it not only enhanced the intervention but also improved the evaluation and dissemination methods. Service users involved in the study also had positive outcomes, including more settled employment and progression to further education. Broad thinking and consultation are needed to identify a diverse range of appropriate stakeholders.

The purpose of stakeholder engagement will differ depending on the context and phase of the research, but it is essential for prioritising research questions, co-developing programme theory, choosing the most useful research perspective, and overcoming practical obstacles to evaluation and implementation. Researchers should nevertheless be mindful of conflicts of interest among stakeholders and use transparent methods to record them. Research should not only elicit stakeholder priorities, but also consider why they are priorities. Careful consideration is needed of which stakeholders it is appropriate to engage, and of how to identify and involve them. 46 48

Key uncertainties

Many questions could be answered at each phase of the research process. The design and conduct of research need to engage pragmatically with the multiple uncertainties involved and offer a flexible and emergent approach to exploring them. 15 Therefore, researchers should spend time developing the programme theory, clearly identifying the remaining uncertainties, given what is already known and what the research team and stakeholders identify as being most important to determine. Judgments about the key uncertainties inform the framing of research questions, which in turn govern the choice of research perspective.

Efficacy trials of relatively uncomplicated interventions in tightly controlled conditions, where research questions are answered with great certainty, will always be important, but translation of the evidence into the diverse settings of everyday practice is often highly problematic. 27 For intervention research in healthcare and public health settings to take on more challenging evaluation questions, greater priority should be given to mixed methods, theory based, or systems evaluation that is sensitive to complexity and that emphasises implementation, context, and system fit. This approach could help improve understanding and identify important implications for decision makers, albeit with caveats, assumptions, and limitations. 22 The established tendency has been to prioritise strong research designs that answer some questions with certainty but are unsuited to resolving many important evaluation questions. A more inclusive, deliberative process would instead place greater value on equivocal findings that nevertheless inform important decisions where evidence is sparse.

Intervention refinement

Within each phase of complex intervention research and on transition from one phase to another, the intervention might need to be refined, on the basis of data collected or development of programme theory. 4 The feasibility and acceptability of interventions can be improved by engaging potential intervention users to inform refinements. For example, an online physical activity planner for people with diabetes mellitus was found to be difficult to use, resulting in the tool providing incorrect personalised advice. To improve usability and the advice given, several iterations of the planner were developed on the basis of interviews and observations. This iterative process led to the refined planner demonstrating greater feasibility and accuracy. 49

Refinements should be guided by the programme theory, with acceptable boundaries agreed and specified at the beginning of each research phase, and with transparent reporting of the rationale for change. Scope for refinement might also be limited by the policy or practice context. Refinement will be rare in the evaluation phase of efficacy and effectiveness research, where interventions will ideally not change or evolve within the course of the study. However, between the phases of research and within systems and theory based evaluation studies, refinement of interventions in response to accumulated data or as an adaptive and variable response to context and system change are likely to be desirable features of the intervention and a key focus of the research.

Economic considerations

Economic evaluation—the comparative analysis of alternative courses of action in terms of both costs (resource use) and consequences (outcomes, effects)—should be a core component of all phases of intervention research. Early engagement of economic expertise will help identify the scope of costs and benefits to assess in order to answer questions that matter most to decision makers. 50 Broad ranging approaches such as cost benefit analysis or cost consequence analysis, which seek to capture the full range of health and non-health costs and benefits across different sectors, 51 will often be more suitable for an economic evaluation of a complex intervention than narrower approaches such as cost effectiveness or cost utility analysis. For example, evaluation of the New Orleans Intervention Model for infants entering foster care in Glasgow included short and long term economic analysis from multiple perspectives (the UK’s health service and personal social services, public sector, and wider societal perspectives); and used a range of frameworks, including cost utility and cost consequence analysis, to capture changes in the intersectoral costs and outcomes associated with child maltreatment. 52 53 The use of multiple economic evaluation frameworks provides decision makers with a comprehensive, multi-perspective guide to the cost effectiveness of the New Orleans Intervention Model.
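
As a minimal sketch of the arithmetic behind two of these approaches, the snippet below computes an incremental cost effectiveness ratio and prints a simple cost consequence table. It assumes Python, and every number in it is invented for illustration; none of the figures relate to the New Orleans Intervention Model evaluation or any other cited study.

```python
# Minimal sketch of two common economic evaluation summaries, using invented
# numbers purely for illustration (not data from the cited evaluations).

# Incremental cost effectiveness ratio (ICER): extra cost per extra QALY.
cost_intervention, cost_control = 1250.0, 800.0   # mean cost per person (£)
qaly_intervention, qaly_control = 0.78, 0.74      # mean QALYs per person

delta_cost = cost_intervention - cost_control
delta_qaly = qaly_intervention - qaly_control
icer = delta_cost / delta_qaly
print(f"ICER: £{icer:,.0f} per QALY gained")

# A cost consequence analysis simply tabulates costs and a broad set of
# outcomes across sectors, leaving the trade-offs to the decision maker.
consequences = {
    "NHS cost per person (£)": (cost_intervention, cost_control),
    "Social care cost per person (£)": (300.0, 420.0),
    "QALYs per person": (qaly_intervention, qaly_control),
    "School absence days per child": (4.1, 5.3),
}
print(f"{'Outcome':35s}{'Intervention':>14s}{'Comparator':>12s}")
for name, (arm_a, arm_b) in consequences.items():
    print(f"{name:35s}{arm_a:14.2f}{arm_b:12.2f}")
```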

Developing or identifying a complex intervention

Development refers to the whole process of designing and planning an intervention, from initial conception through to feasibility, pilot, or evaluation study. Guidance on intervention development has recently been produced through the INDEX study. 4 Here, however, we highlight that complex intervention research does not always begin with a new or researcher led intervention. For example:

  • A key starting point might be an intervention that has been developed elsewhere and could be adapted to a new context. Adaptation of existing interventions could include adapting to a new population, to a new setting, 54 55 or to target other outcomes (eg, a smoking prevention intervention being adapted to tackle substance misuse and sexual health). 20 56 57 A well developed programme theory can help identify which features of the antecedent intervention(s) need to be adapted for different applications, and which key mechanisms should be retained even if delivered slightly differently. 54 58
  • Policy or practice led interventions are an important focus of evaluation research. Again, uncovering the implicit theoretical basis of an intervention and developing a programme theory is essential to identifying key uncertainties and working out how the intervention might be evaluated. This step is important, even if rollout has begun, because it supports the identification of mechanisms of change, important contextual factors, and relevant outcome measures. For example, researchers evaluating the UK soft drinks industry levy developed a bounded conceptual system map to articulate their understanding (drawing on stakeholder views and document review) of how the intervention was expected to work. This system map guided the evaluation design and helped identify data sources to support evaluation. 45 Another example is a recent analysis of the implicit theory of the NHS diabetes prevention programme, involving analysis of documentation by NHS England and four providers, showing that there was no explicit theoretical basis for the programme, and no logic model showing how the intervention was expected to work. This meant that the justification for the inclusion of intervention components was unclear. 59

Intervention identification and intervention development represent two distinct pathways of evidence generation, 60 but in both cases, the key considerations in this phase relate to the core elements described above.

Feasibility

A feasibility study should be designed to assess predefined progression criteria that relate to the evaluation design (eg, reducing uncertainty around recruitment, data collection, retention, outcomes, and analysis) or the intervention itself (eg, around optimal content and delivery, acceptability, adherence, likelihood of cost effectiveness, or capacity of providers to deliver the intervention). If the programme theory suggests that contextual or implementation factors might influence the acceptability, effectiveness, or cost effectiveness of the intervention, these questions should be considered.
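
One common way of making progression criteria operational is a traffic light system agreed with stakeholders before the feasibility study begins. The sketch below is a hypothetical illustration in Python; the criteria, thresholds, and observed values are all invented and are not prescribed by the framework.

```python
# Minimal sketch of checking prespecified progression criteria after a
# feasibility study, using a traffic light style decision. Thresholds and
# observed values are invented for illustration.

progression_criteria = {
    # criterion: (green threshold, amber threshold); values are proportions
    "recruitment rate": (0.80, 0.60),
    "retention at follow-up": (0.85, 0.70),
    "intervention adherence": (0.75, 0.50),
}

observed = {
    "recruitment rate": 0.72,
    "retention at follow-up": 0.88,
    "intervention adherence": 0.81,
}

def rate(criterion, value):
    green, amber = progression_criteria[criterion]
    if value >= green:
        return "green (proceed)"
    if value >= amber:
        return "amber (proceed with refinements)"
    return "red (do not proceed as planned)"

for criterion, value in observed.items():
    print(f"{criterion}: {value:.0%} -> {rate(criterion, value)}")
```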

Although feasibility testing was often overlooked or rushed in the past, its value is now widely accepted and its key terms and concepts are well defined. 61 62 Before initiating a feasibility study, researchers should consider conducting an evaluability assessment to determine whether and how an intervention can usefully be evaluated. Evaluability assessment involves collaboration with stakeholders to reach agreement on the expected outcomes of the intervention, the data that could be collected to assess processes and outcomes, and the options for designing the evaluation. 63 The end result is a recommendation on whether an evaluation is feasible, whether it can be carried out at a reasonable cost, and by which methods. 64

Economic modelling can be undertaken at the feasibility stage to assess the likelihood that the expected benefits of the intervention justify the costs (including the cost of further research), and to help decision makers decide whether proceeding to a full scale evaluation is worthwhile. 65 Depending on the results of the feasibility study, further work might be required to progressively refine the intervention before embarking on a full scale evaluation.
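
The sketch below illustrates, with invented parameters, one simple form that such early modelling can take: a probabilistic estimate of incremental net monetary benefit at a stated willingness to pay threshold. It assumes Python with NumPy and is an illustrative example rather than a method recommended by the framework.

```python
# Minimal sketch of early economic modelling: a simple probabilistic estimate
# of incremental net monetary benefit (NMB) at a given willingness-to-pay
# threshold. All distributions and parameters are invented for illustration.
import numpy as np

rng = np.random.default_rng(42)
n_sim = 10_000
threshold = 20_000  # willingness to pay per QALY (£)

# Uncertain incremental effect (QALYs) and incremental cost (£) per person.
delta_qaly = rng.normal(loc=0.04, scale=0.02, size=n_sim)
delta_cost = rng.normal(loc=450.0, scale=150.0, size=n_sim)

nmb = threshold * delta_qaly - delta_cost
prob_cost_effective = np.mean(nmb > 0)

print(f"Mean incremental NMB: £{nmb.mean():,.0f}")
print(f"Probability cost effective at £{threshold:,}/QALY: {prob_cost_effective:.0%}")
```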

Evaluation

The new framework defines evaluation as going beyond asking whether an intervention works (in the sense of achieving its intended outcome) to a broader range of questions: identifying what other impacts the intervention has, theorising how it works, taking account of how it interacts with the context in which it is implemented, understanding how it contributes to system change, and considering how the evidence can be used to support decision making in the real world. This implies a shift away from an exclusive focus on obtaining unbiased estimates of effectiveness 66 and towards prioritising the usefulness of information for decision making, both when selecting the optimal research perspective and when prioritising answerable research questions.

A crucial aspect of evaluation design is the choice of outcome measures or evidence of change. Evaluators should work with stakeholders to assess which outcomes are most important, and how to deal with multiple outcomes in the analysis with due consideration of statistical power and transparent reporting. A sharp distinction between one primary outcome and several secondary outcomes is not necessarily appropriate, particularly where the programme theory identifies impacts across a range of domains. Where needed to support the research questions, prespecified subgroup analyses should be carried out and reported. Even where such analyses are underpowered, they should be included in the protocol because they might be useful for subsequent meta-analyses, or for developing hypotheses for testing in further research. Outcome measures could capture changes to a system rather than changes in individuals. Examples include changes in relationships within an organisation, the introduction of policies, changes in social norms, or normalisation of practice. Such system level outcomes include how changing the dynamics of one part of a system alters behaviours in other parts, such as the potential for displacement of smoking into the home after a public smoking ban.
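
Where several prespecified outcomes are analysed, the approach to multiplicity should be transparent. As one illustration of a standard option, the sketch below applies a Holm-Bonferroni step down adjustment to a set of invented p values; it shows one possible approach rather than a recommendation from the framework.

```python
# Minimal sketch of one way to handle multiplicity across several prespecified
# outcomes: a Holm-Bonferroni step-down adjustment. The p values are invented.

def holm_adjust(p_values, alpha=0.05):
    """Return, for each outcome, whether it remains significant after a
    Holm-Bonferroni step-down adjustment at the given alpha."""
    order = sorted(range(len(p_values)), key=lambda i: p_values[i])
    significant = [False] * len(p_values)
    for rank, idx in enumerate(order):
        if p_values[idx] <= alpha / (len(p_values) - rank):
            significant[idx] = True
        else:
            break  # once one test fails, all larger p values also fail
    return significant

outcomes = {
    "primary clinical outcome": 0.012,
    "quality of life": 0.030,
    "service use": 0.200,
    "staff workload": 0.004,
}
flags = holm_adjust(list(outcomes.values()))
for (name, p), keep in zip(outcomes.items(), flags):
    verdict = "significant" if keep else "not significant"
    print(f"{name}: p={p:.3f} -> {verdict} after Holm adjustment")
```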

A helpful illustration of the use of system level outcomes is the evaluation of the Delaware Young Health Program—an initiative to improve the health and wellbeing of young people in Delaware, USA. The intervention aimed to change underlying system dynamics, structures, and conditions, so the evaluation identified systems oriented research questions and methods. Three systems science methods were used: group model building and viable systems model assessment to identify underlying patterns and structures; and social network analysis to evaluate change in relationships over time. 67
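
As a small hypothetical illustration of how change in relationships might be quantified, the sketch below compares the density of a collaboration network between organisations at baseline and follow-up, using Python with the networkx library. The organisations and ties are invented and are not taken from the Delaware Young Health Program evaluation.

```python
# Minimal sketch of using social network analysis to describe system level
# change: comparing a hypothetical collaboration network between organisations
# at baseline and follow-up. All nodes and edges are invented.
import networkx as nx

baseline = nx.Graph()
baseline.add_edges_from([
    ("youth service", "school nursing"),
    ("school nursing", "primary care"),
])

follow_up = nx.Graph()
follow_up.add_edges_from([
    ("youth service", "school nursing"),
    ("school nursing", "primary care"),
    ("youth service", "primary care"),
    ("youth service", "housing support"),
    ("primary care", "housing support"),
])

for label, network in [("baseline", baseline), ("follow-up", follow_up)]:
    print(f"{label}: {network.number_of_nodes()} organisations, "
          f"{network.number_of_edges()} collaborations, "
          f"density={nx.density(network):.2f}")
```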

Researchers have many study designs to choose from, and different designs are optimally suited to consider different research questions and different circumstances. 68 Extensions to standard designs of randomised controlled trials (including adaptive designs, SMART trials (sequential multiple assignment randomised trials), n-of-1 trials, and hybrid effectiveness-implementation designs) are important areas of methods development to improve the efficiency of complex intervention research. 69 70 71 72 Non-randomised designs and modelling approaches might work best if a randomised design is not practical, for example, in natural experiments or systems evaluations. 5 73 74 A purely quantitative approach, using an experimental design with no additional elements such as a process evaluation, is rarely adequate for complex intervention research, where qualitative and mixed methods designs might be necessary to answer questions beyond effectiveness. In many evaluations, the nature of the intervention, the programme theory, or the priorities of stakeholders could lead to a greater focus on improving theories about how to intervene. In this view, effect estimates are inherently context bound, so that average effects are not a useful guide to decision makers working in different contexts. Contextualised understandings of how an intervention induces change might be more useful, as well as details on the most important enablers and constraints on its delivery across a range of settings. 7
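
To make the analytical logic of one of these designs concrete, the sketch below summarises invented data from a hypothetical n-of-1 trial, in which a single person alternates between intervention and comparator periods and the within-person differences are summarised. It assumes Python with SciPy and is illustrative only.

```python
# Minimal sketch of the core analysis of a hypothetical n-of-1 trial:
# within-person paired differences across alternating treatment blocks.
# All measurements are invented for illustration.
from statistics import mean, stdev
from scipy.stats import t

# Outcome at the end of each paired block: (intervention period, comparator period).
paired_blocks = [(6.1, 7.4), (5.8, 7.0), (6.4, 6.9), (5.5, 7.2), (6.0, 6.8)]

differences = [i - c for i, c in paired_blocks]
mean_diff = mean(differences)
se_diff = stdev(differences) / len(differences) ** 0.5
t_crit = t.ppf(0.975, df=len(differences) - 1)  # two-sided 95% interval

print(f"Mean within-person difference: {mean_diff:.2f}")
print(f"95% CI: {mean_diff - t_crit * se_diff:.2f} to {mean_diff + t_crit * se_diff:.2f}")
```

In a real n-of-1 trial the order of periods would be randomised and possible carry-over effects considered, but the basic estimand remains a within-person treatment effect.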

Process evaluation can answer questions around fidelity and quality of implementation (eg, what is implemented and how?), mechanisms of change (eg, how does the delivered intervention produce change?), and context (eg, how does context affect implementation and outcomes?). 7 Process evaluation can help determine why an intervention fails unexpectedly or has unanticipated consequences, or why it works and how it can be optimised. Such findings can facilitate further development of the intervention programme theory. 75 In a theory based or systems evaluation, there is not necessarily such a clear distinction between process and outcome evaluation as there is in an effectiveness study. 76 These perspectives could prioritise theory building over evidence production and use case study or simulation methods to understand how outcomes or system behaviour are generated through intervention. 74 77

Implementation

Early consideration of implementation increases the likelihood of developing an intervention that can be widely adopted and maintained in real world settings. Implementation questions should be anticipated in the intervention programme theory, and considered throughout the phases of intervention development, feasibility testing, process evaluation, and outcome evaluation. Alongside implementation specific outcomes (such as reach or uptake of services), attention to the components of the implementation strategy and to the contextual factors that support or hinder the achievement of impacts is key. Some flexibility in intervention implementation might support transferability into different contexts (an important aspect of long term implementation 78 ), provided that the key functions of the programme are maintained and that the adaptations made are clearly understood. 8

In the ASSIST study, 20 a school based, peer led intervention for smoking prevention, researchers considered implementation at each phase. The intervention was developed to cause minimal disruption to school resources; the feasibility study resulted in refinements to improve acceptability and reach among male students; and in the evaluation (a cluster randomised controlled trial), the intervention was delivered as closely as possible to real world implementation. Drawing on the process evaluation, the materials prepared for rollout included an intervention manual that distinguished critical components from components that could be adapted or dropped, allowing flexible implementation while preserving delivery of the key mechanisms of change, together with a training manual for trainers and ongoing quality assurance built into longer term rollout.

In a natural experimental study, evaluation takes place during or after the implementation of the intervention in a real world context. Highly pragmatic effectiveness trials or specific hybrid effectiveness-implementation designs also combine effectiveness and implementation outcomes in one study, with the aim of reducing time for translation of research on effectiveness into routine practice. 72 79 80

Implementation questions should be included in economic considerations during the early stages of intervention and study development. How the results of economic analyses are reported and presented to decision makers can affect whether and how they act on the results. 81 A key consideration is how to deal with interventions across different sectors, where those paying for interventions and those receiving the benefits of them could differ, reducing the incentive to implement an intervention, even if shown to be beneficial and cost effective. Early engagement with appropriate stakeholders will help frame appropriate research questions and could anticipate any implementation challenges that might arise. 82

Conclusions

One of the motivations for developing this new framework was to answer calls for a change in research priorities, towards allocating greater effort and funding to research that can have the optimum impact on healthcare or population health outcomes. The framework challenges the view that unbiased estimates of effectiveness are the cardinal goal of evaluation. It asserts that improving theories and understanding how interventions contribute to change, including how they interact with their context and wider dynamic systems, is an equally important goal. For some complex intervention research problems, an efficacy or effectiveness perspective will be the optimal approach, and a randomised controlled trial will provide the best design to achieve an unbiased estimate. For others, alternative perspectives and designs might work better, or might be the only way to generate new knowledge to reduce decision maker uncertainty.

What is important for the future is that the scope of intervention research is not constrained by an unduly limited set of perspectives and approaches that might be less risky to commission and more likely to produce a clear and unbiased answer to a specific question. A bolder approach is needed—to include methods and perspectives where experience is still quite limited, but where we, supported by our workshop participants and respondents to our consultations, believe there is an urgent need to make progress. This endeavour will involve mainstreaming new methods that are not yet widely used, as well as undertaking methodological innovation and development. The deliberative and flexible approach that we encourage is intended to reduce research waste, 83 maximise usefulness for decision makers, and increase the efficiency with which complex intervention research generates knowledge that contributes to health improvement.

Monitoring the use of the framework and evaluating its acceptability and impact are important but have been lacking in the past. We encourage research funders and journal editors to support the diversity of research perspectives and methods advocated here and to seek evidence that the core elements are attended to in research design and conduct. We have developed a checklist to support the preparation of funding applications, research protocols, and journal publications. 9 This checklist offers one way to monitor the impact of the guidance on researchers, funders, and journal editors.

We recommend that the guidance is continually updated, and future updates continue to adopt a broad, pluralist perspective. Given its wider scope, and the range of detailed guidance that is now available on specific methods and topics, we believe that the framework is best seen as meta-guidance. Further editions should be published in a fluid, web based format, and more frequently updated to incorporate new material, further case studies, and additional links to other new resources.

Acknowledgments

We thank the experts who provided input at the workshop, those who responded to the consultation, and those who provided advice and review throughout the process. The many people involved are acknowledged in the full framework document. 9 Parts of this manuscript have been reproduced (some with edits and formatting changes), with permission, from that longer framework document.

Contributors: All authors made a substantial contribution to all stages of the development of the framework—they contributed to its development, drafting, and final approval. KS and LMa led the writing of the framework, and KS wrote the first draft of this paper. PC, SAS, and LMo provided critical insights to the development of the framework and contributed to writing both the framework and this paper. KS, LMa, SAS, PC, and LMo facilitated the expert workshop, KS and LMa developed the gap analysis and led the analysis of the consultation. KAB, NC, and EM contributed the economic components to the framework. The scientific advisory group (JB, JMB, DPF, MP, JR-M, and MW) provided feedback and edits on drafts of the framework, with particular attention to process evaluation (JB), clinical research (JMB), implementation (JR-M, DPF), systems perspective (MP), theory based perspective (JR-M), and population health (MW). LMo is senior author. KS and LMo are the guarantors of this work and accept the full responsibility for the finished article. The corresponding author attests that all listed authors meet authorship criteria and that no others meeting authorship criteria have been omitted.

Funding: The work was funded by the National Institute for Health Research (Department of Health and Social Care 73514) and Medical Research Council (MRC). Additional time on the study was funded by grants from the MRC for KS (MC_UU_12017/11, MC_UU_00022/3), LMa, SAS, and LMo (MC_UU_12017/14, MC_UU_00022/1); PC (MC_UU_12017/15, MC_UU_00022/2); and MW (MC_UU_12015/6 and MC_UU_00006/7). Additional time on the study was also funded by grants from the Chief Scientist Office of the Scottish Government Health Directorates for KS (SPHSU11 and SPHSU18); LMa, SAS, and LMo (SPHSU14 and SPHSU16); and PC (SPHSU13 and SPHSU15). KS and SAS were also supported by an MRC Strategic Award (MC_PC_13027). JMB received funding from the NIHR Biomedical Research Centre at University Hospitals Bristol NHS Foundation Trust and the University of Bristol and by the MRC ConDuCT-II Hub (Collaboration and innovation for Difficult and Complex randomised controlled Trials In Invasive procedures - MR/K025643/1). DF is funded in part by the NIHR Manchester Biomedical Research Centre (IS-BRC-1215-20007) and NIHR Applied Research Collaboration - Greater Manchester (NIHR200174). MP is funded in part as director of the NIHR’s Public Health Policy Research Unit. This project was overseen by a scientific advisory group that comprised representatives of NIHR research programmes, of the MRC/NIHR Methodology Research Programme Panel, of key MRC population health research investments, and authors of the 2006 guidance. A prospectively agreed protocol, outlining the workplan, was agreed with MRC and NIHR, and signed off by the scientific advisory group. The framework was reviewed and approved by the MRC/NIHR Methodology Research Programme Advisory Group and MRC Population Health Sciences Group and completed NIHR HTA Monograph editorial and peer review processes.

Competing interests: All authors have completed the ICMJE uniform disclosure form at http://www.icmje.org/coi_disclosure.pdf and declare: support from the NIHR, MRC, and the funders listed above for the submitted work; KS has project grant funding from the Scottish Government Chief Scientist Office; SAS is a former member of the NIHR Health Technology Assessment Clinical Evaluation and Trials Programme Panel (November 2016 - November 2020) and member of the Chief Scientist Office Health HIPS Committee (since 2018) and NIHR Policy Research Programme (since November 2019), and has project grant funding from the Economic and Social Research Council, MRC, and NIHR; LMo is a former member of the MRC-NIHR Methodology Research Programme Panel (2015-19) and MRC Population Health Sciences Group (2015-20); JB is a member of the NIHR Public Health Research Funding Committee (since May 2019), and a core member (since 2016) and vice chairperson (since 2018) of a public health advisory committee of the National Institute for Health and Care Excellence; JMB is a former member of the NIHR Clinical Trials Unit Standing Advisory Committee (2015-19); DPF is a former member of the NIHR Public Health Research programme research funding board (2015-2019), the MRC-NIHR Methodology Research Programme panel member (2014-2018), and is a panel member of the Research Excellence Framework 2021, subpanel 2 (public health, health services, and primary care; November 2020 - February 2022), and has grant funding from the European Commission, NIHR, MRC, Natural Environment Research Council, Prevent Breast Cancer, Breast Cancer Now, Greater Sport, Manchester University NHS Foundation Trust, Christie Hospital NHS Trust, and BXS GP; EM is a member of the NIHR Public Health Research funding board; MP has grant funding from the MRC, UK Prevention Research Partnership, and NIHR; JR-M is programme director and chairperson of the NIHR’s Health Services Delivery Research Programme (since 2014) and member of the NIHR Strategy Board (since 2014); MW received a salary as director of the NIHR PHR Programme (2014-20), has grant funding from NIHR, and is a former member of the MRC’s Population Health Sciences Strategic Committee (July 2014 to June 2020). There are no other relationships or activities that could appear to have influenced the submitted work.

Patient and public involvement: This project was methodological; views of patients and the public were included at the open consultation stage of the update. The open consultation, involving access to an initial draft, was promoted to our networks via email and digital channels, such as our unit Twitter account ( @theSPHSU ). We received five responses from people who identified as service users (rather than researchers or professionals in a relevant capacity). Their input included helpful feedback on the main complexity diagram, the different research perspectives, the challenge of moving interventions between different contexts and overall readability and accessibility of the document. Several respondents also highlighted useful signposts to include for readers. Various dissemination events are planned, but as this project is methodological we will not specifically disseminate to patients and the public beyond the planned dissemination activities.

Provenance and peer review: Not commissioned; externally peer reviewed.
