
Process evaluations in neurological rehabilitation: a mixed-evidence systematic review and recommendations for future research

    Patricia Masterson-Algar, Christopher R Burton, Jo Rycroft-Malone

To cite: Masterson-Algar P, Burton CR, Rycroft-Malone J. Process evaluations in neurological rehabilitation: a mixed-evidence systematic review and recommendations for future research. BMJ Open 2016;6:e013002. doi:10.1136/bmjopen-2016-013002

Prepublication history and additional material is available. To view please visit the journal (http://dx.doi.org/10.1136/bmjopen-2016-013002).

Received 10 June 2016. Revised 23 September 2016. Accepted 13 October 2016.

School of Healthcare Sciences, Bangor University, Bangor, UK

Correspondence to Patricia Masterson-Algar; [email protected]

ABSTRACT

Objective: To systematically review how process evaluations are currently designed, what methodologies are used and how they are developed alongside or within neurological rehabilitation trials.

Methods: This mixed-methods systematic review had two evidence streams: stream I, studies reporting process evaluations alongside neurorehabilitation trials research, and stream II, methodological guidance on process evaluation design and methodology. A search strategy was designed for each evidence stream. Data regarding process evaluation core concepts and design issues were extracted using a bespoke template. Evidence from both streams was analysed separately and then synthesised in a final overarching synthesis proposing a number of recommendations for future research.

Results: A total of 124 process evaluation studies, reporting on 106 interventions, were included in stream I evidence. 30 studies were included as stream II evidence. Synthesis 1 produced 9 themes, and synthesis 2 identified a total of 8 recommendations for process evaluation research. The overall synthesis resulted in 57 synthesis recommendations about process evaluation methodology grouped into 9 research areas, including the use of theory, the investigation of context, intervention staff characteristics and the delivery of the trial intervention.

Conclusions: There remains no consensus regarding process evaluation terminology within the neurological rehabilitation field. There is a need for process evaluations to address the nature and influence of context over time. Process evaluations should clearly describe what intervention staff bring to a trial, including skills and experience prior to joining the research. Process evaluations should monitor intervention staff's learning effects and the possible impact that these may have on trial outcomes.

INTRODUCTION

Although the number of randomised controlled trials (RCTs) of rehabilitation interventions has risen dramatically in recent years, rehabilitation research lags behind other sciences in providing conclusive evidence of its beneficial impacts. As a consequence, the development of innovative interventions and programmes is being slowed down.1 As a broad-based discipline, the impacts of rehabilitation interventions are difficult to evaluate, and therefore its multidisciplinary nature needs to be addressed when designing research studies.2 Addressing this challenge relies on working towards defining, in detail, rehabilitation treatments in terms of what their active ingredients are, what their individual impact is and what the impact of the intervention is as a whole.3 Furthermore, rehabilitation is context specific and often defined as the interaction between the individual and the environment.4 Thus, identifying contextual factors (physical, psychological, social, etc) and acknowledging that researchers bring their own personal values into situations is of great importance when thinking about the science of rehabilitation.

A number of models to assist with the development of complex interventions and improve their quality have been published.

Strengths and limitations of this study

- This paper presents the first systematic review that applies a two-stream mixed-evidence synthesis to investigate process evaluations in neurological rehabilitation research.
- This review used a rigorous and broad search strategy including a wide range of sources to maximise the capture of relevant literature.
- There is a possibility that relevant studies were not identified due to the fact that the term 'process evaluation' was not considered a Medical Subject Heading in any of the databases that were searched.
- This review does not critically appraise the quality of included process evaluation studies.

Masterson-Algar P, et al. BMJ Open 2016;6:e013002. doi:10.1136/bmjopen-2016-013002

Open Access Research

BMJ Open: first published as 10.1136/bmjopen-2016-013002 on 8 November 2016. Downloaded from http://bmjopen.bmj.com/ on 20 August 2018 by guest. Protected by copyright.

The UK Medical Research Council (MRC)5 has proposed an approach to the evaluation of complex interventions which includes developing theory-based explanations of how interventions work. This framework has already been used in a number of neurological rehabilitation research projects.6–8 While it describes five phases for intervention development which the research process should follow, this framework does not provide details as to which research methods should be used. The MRC5 framework highlights that a process evaluation, including qualitative data gathering methods, can provide insights into why an intervention fails unexpectedly, has unanticipated consequences, or why a successful intervention works and how it can be optimised.9 In 2014, the MRC published the first guidance for carrying out process evaluations in health research.10 This guidance summarises why there is a need for process evaluations alongside current health research, and it then proposes a framework which is highly informed by the MRC guidance on complex interventions.5 It is widely accepted that process evaluations serve a very important role in health services research, not only when checking whether the trial intervention was performed as planned but also in providing detailed insight into the experiences of those exposed to the intervention.9 11 By evaluating processes, an intervention can be improved either during its application, or afterwards, at an implementation stage.12 Finally, trials which include a process evaluation will produce higher quality results that can help clarify the potential generalisability and optimisation of the proposed intervention in routine practice.13

However, to date, process evaluations alongside trials are scarce, and they are even rarer in multidisciplinary therapy research on neurological rehabilitation.

The aim of a process evaluation is to investigate how and why an intervention fails or succeeds at producing the desired outcomes.14 In recent years, there has been an increase in published research on theories and frameworks driving process evaluations for complex interventions. Steckler and Linnan11 proposed a framework for carrying out process evaluations which included a series of programme components that should be measured and evaluated: recruitment (how were participants attracted into the study), context (social, political and environmental factors that could have influenced implementation), reach (proportion of targeted patients that participated in the intervention), dose delivered (proportion of the intervention components that was provided), dose received (level of participants' engagement in the intervention), fidelity (to what extent was the intervention delivered as had been intended by the researchers) and implementation. In order to measure these components, process evaluations may make use of qualitative and quantitative methods.15 However, there is still very limited guidance to help researchers design process evaluations;16 as a consequence, researchers can find the prospect of carrying out a process evaluation daunting, and this could lead them to discard the idea of embedding one alongside their proposed trials, especially when looking at complex interventions.

To the best of our knowledge, this constitutes the first systematic review that applies a two-stream mixed-evidence synthesis to investigate the design of process evaluations in neurological rehabilitation research. The overarching question of this systematic review was: How are process evaluations currently designed, what methodologies are used and how are they developed alongside or within neurological rehabilitation research?

Specific research questions were:
- What methodologies and methods have been used to carry out process evaluations when undertaken alongside neurological rehabilitation research?
- What are the theoretical underpinnings (if any) of process evaluations alongside neurological rehabilitation research trials?
- How have the results from process evaluations alongside neurological rehabilitation research trials been used to understand and clarify trial results?
- What are the potential barriers and facilitators to carrying out process evaluations alongside neurological rehabilitation research?
- What terminology is currently being used in process evaluation research?
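As a concrete aside (ours, not the authors'), the Steckler and Linnan components described above can be sketched as a simple record type; the field names and comments paraphrase the definitions in the text, and the type itself is purely illustrative.

```python
# A minimal sketch (not part of the framework) of Steckler and Linnan's
# process evaluation components, to make the checklist concrete.
from dataclasses import dataclass

@dataclass
class ProcessEvaluationComponents:
    recruitment: str = ""        # how participants were attracted into the study
    context: str = ""            # social, political and environmental influences
    reach: float = 0.0           # proportion of targeted patients who took part
    dose_delivered: float = 0.0  # proportion of intervention components provided
    dose_received: float = 0.0   # level of participant engagement
    fidelity: str = ""           # extent the intervention was delivered as intended
    implementation: str = ""     # overall account of delivery in practice
```

A record like this could be filled in once per included study, which is broadly how the bespoke extraction template described in the Methods operates.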

METHODS

Design: mixed-evidence synthesis
The Preferred Reporting Items for Systematic Reviews and Meta-Analysis (PRISMA) was followed in this review. The University of York Centre for Research and Dissemination (York CRD) guidelines were followed at the time of conducting searches and extracting data.17 The Evidence for Policy and Practice Information and Co-ordinating Centre (EPPI-Centre) published methods for conducting systematic reviews18 were used to guide the synthesis of mixed-evidence findings.

Following the CRD's guidance for undertaking reviews in healthcare,17 an initial review of published literature was carried out in order to identify key terms and types of studies reporting on process evaluations. This review identified a variety of quantitative, qualitative and methodological literature. Thus, the decision was made to segregate studies into two different streams of evidence. Studies included in each evidence stream would be analysed and synthesised separately, and following from this, results from each evidence stream would be bridged together.19 20 The following streams of evidence were agreed. Details on each evidence stream and inclusion criteria are reported in table 1.

Search methods
Stream I searches
Relevant articles to be included in stream I were identified through conducting electronic database searches. All search details (date, search strategy, hits) were recorded


in a search log. All search results were managed using RefWorks. The following databases were searched: CINAHL, Web of Science, MEDLINE, PEDro, SSCI, PsycINFO, ClinicalTrials.gov, HTA Database, Cochrane Central, EPPI-Centre Database, ASSIA and DORIS. The data searches were carried out using search terms and keywords that were a mixture of Medical Subject Headings (MeSH) and non-MeSH terms. Experts in the field were contacted, and manual reference list checks (citation tracking), hand-searches of key journals, trials register websites, other specialist websites and internet search engines were carried out.

In consultation with an experienced librarian, and informed by three published systematic reviews on process evaluation,21–23 searches for stream I were carried out combining three different search arms: process evaluation, neurological conditions and rehabilitation (a detailed search strategy is shown in online supplementary table S1). Searches were restricted to studies published in English between 1977 and February 2015.
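Purely as an illustration of the three-arm structure (the actual strategy, with its MeSH and free-text terms, is in online supplementary table S1; the terms below are invented examples), combining arms with AND looks like this:

```python
# Hypothetical sketch of combining the review's three search arms.
# These example terms are NOT the review's real search strategy.
arm_process = '("process evaluation" OR fidelity OR "dose delivered")'
arm_condition = '(stroke OR "Parkinson disease" OR "multiple sclerosis")'
arm_rehabilitation = '(rehabilitation OR physiotherapy OR "occupational therapy")'

# Each arm is OR-ed internally and the arms are AND-ed together,
# so a record must match at least one term from every arm.
query = " AND ".join([arm_process, arm_condition, arm_rehabilitation])
```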

Stream II searches
Relevant literature was identified through the use of search engines, such as Google Scholar, and snowball sampling. Thorough manual reference checking of studies included in stream I was carried out. Specialist websites were searched, and experts were contacted in order to ensure that relevant studies were not left out. Searches were not limited to the UK; international guidelines and methodological resources were included when considered relevant to the review.

Screening of results
The screening of the results was carried out following PRISMA guidelines, which were adapted for each of the evidence streams. In order to reduce bias, a second researcher joined the screening process and reviewed each study independently. When necessary, agreement on inclusion was reached through discussion.
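The review resolved screening disagreements by discussion and reports no agreement statistic, but dual-reviewer agreement of this kind is often quantified with Cohen's kappa. The sketch below is illustrative only; the reviewer decisions are hypothetical, not data from this review.

```python
# Illustrative only: Cohen's kappa for two raters making binary
# include/exclude screening decisions (True = include).

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two binary raters."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    p_a = sum(rater_a) / n  # proportion of 'include' decisions, rater A
    p_b = sum(rater_b) / n  # proportion of 'include' decisions, rater B
    expected = p_a * p_b + (1 - p_a) * (1 - p_b)  # agreement expected by chance
    return (observed - expected) / (1 - expected)

# Hypothetical decisions for eight records screened by both reviewers
rater_a = [True, True, False, False, True, False, True, False]
rater_b = [True, False, False, False, True, False, True, False]
kappa = cohens_kappa(rater_a, rater_b)  # 0.75 for these example decisions
```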

Data extraction
There are currently no established methods for assessing the quality of process evaluations, partly due to the wide variation in their reporting.16 Thus, it was decided not to formally review the methodological quality of the process evaluations. The data extraction process was informed by three published systematic reviews on process evaluations21–23 and a systematic review which included a set of studies that were process evaluations.24

Basic information (publication year, author, title, discipline, study type) was collected from all stream I studies. A bespoke template for data extraction was prepared including components identified in published frameworks on process evaluation, such as the one proposed by Steckler and Linnan11 (table 2).

In the case of stream II studies, all basic information was first extracted on type of publication, topic, aims, objectives and target audience. Information was collected on how stream II studies had addressed the following aspects (or a number of them) of process evaluation research: theory development, context, recruitment, staff characteristics, staff training, learning over time and adherence to protocol.

As recommended by the CRD,17 data extraction forms were piloted on a sample of 10 included studies in order to guarantee that the relevant information was captured and resources were not being wasted on extracting irrelevant data. Furthermore, a quality check on data extraction from a random 10% sample of studies included in stream I and stream II evidence was undertaken by two reviewers.

Table 1 Evidence streams and inclusion criteria

Stream I (research evidence)
Aim and description: To identify how process evaluations have been carried out alongside/linked to neurological research trials, and to provide data to answer the specific objectives: identifying terminology used in process evaluations, how results of process evaluations are used to understand the trial's overall outcomes, and what particular methods are mostly used by researchers.
Inclusion criteria:
- Qualitative and quantitative primary research studies; all study types.
- Reporting on process evaluations linked to neurological rehabilitation research trials conducted around the world.
- Reporting on one or more process evaluation components.11
- (Papers that only reported on impact evaluation were excluded.)

Stream II (methodological evidence)
Aim and description: Research studies which were not necessarily primary research; studies that would provide rich data to help answer methodological and theory-related research questions and aimed to explore frameworks and theory behind process evaluations.
Inclusion criteria:
- Studies and reports published in the English language.
- Exploring methodologies, guidance and opinions regarding process evaluation research methods.
- Studies in health research reporting on frameworks for implementation fidelity and its components.


Synthesis of extracted data
Descriptive statistics were used in order to map studies included in both streams of evidence. A separate synthesis was carried out by evidence type.

Synthesis 1
Data extracted from studies included in evidence stream I were analysed and themes were identified using a modified method described by Kavanagh et al.24 Results from this analysis were presented in the form of overarching narrative descriptive themes which were inductively generated.25 26 For quality assurance, initial patterns and subsequent themes were independently reviewed by another member of the research team. Emerging themes were then opened to debate, and disagreements among researchers were resolved via discussion.

Synthesis 2
Thomas and Harden's27 method for carrying out thematic qualitative synthesis on primary qualitative research was adapted, as required, in order to carry out the synthesis. The following steps were followed:

1. Finding descriptive themes: the researcher read each of the sections which had been extracted during the data extraction phase and coded each section according to its purpose and content. As the coding progressed, a bank of codes was generated and new ones created as necessary. The researcher followed an inductive process and looked at similarities between codes in order to group the initial codes. This process resulted in a number of descriptive themes (figure 1 is an example of how one of the descriptive themes, intervention staff factors, was generated).

2. The generation of analytical themes was performed by using the descriptive themes that emerged from the inductive analysis to answer the systematic review question. The lead author did this independently, and then, through discussion with the rest of the members of the team, more analytical (abstract) themes began to emerge. These final analytical themes constituted a list of recommendations regarding the undertaking of different aspects of process evaluation research.

    Figure 1 Example of grouping of initial codes to form broader descriptive themes.

Table 2 Data extraction criteria for stream I evidence

Data extraction criteria:
- Basic information: author, publication year, title, discipline, study type, design.
- Recruitment procedures, dose delivered, participant attitudes investigated, adherence/fidelity measured, implementation monitoring, context considered, intervention protocol, aims and objectives, PE study design and rationale, mechanisms to assess adherence to intervention protocol, description of intervention providers, training of intervention providers, learning over time measured, theory informing PE, PE findings, links between PE and outcome results.


Synthesis 3
The method described by Harden et al19 and Oliver et al20 was used to structure, compare and bring together findings from both research evidence streams. The themes that emerged from synthesis 1 were then mapped on to the recommendations identified in synthesis 2. As a result of this, potential gaps and strengths defining neurological rehabilitation process evaluation research were identified (figure 2). Synthesis 3 is presented as a list of synthesis recommendations on how to best carry out and report process evaluations in neurological rehabilitation research.

RESULTS

Included studies
A total of 3327 studies were found for stream I searches. A total of 1380 duplicates were removed, and after screening titles and abstracts, 1673 were excluded on the basis of not meeting inclusion criteria. The full text of the remaining 274 studies was screened, and on complete reading, 124 studies were included (figure 3). The main reason for exclusion at this stage was studies not including any reference to one or more process evaluation components. Some researchers published a separate process evaluation results paper in addition to an outcome evaluation paper and a process evaluation protocol paper; articles describing one intervention were combined and considered as one unit for the purpose of this review. After this grouping, a total of 124 studies reporting on 106 interventions remained for analysis (see online supplementary table S2).

Search strategies for stream II led to a total of 45 studies. The full text of these was screened, and finally 30 studies were deemed eligible for inclusion (see online supplementary table S3).
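As a quick editorial check (not part of the article), the stream I screening numbers reported above are internally consistent:

```python
# Consistency check on the screening flow reported in the text and figure 3.
# All input numbers come from the article; the arithmetic confirms they add up.
identified = 3327           # records found by stream I searches
duplicates = 1380           # duplicates removed
excluded_on_title = 1673    # excluded at title/abstract screening
included = 124              # studies included after full-text reading

full_texts_screened = identified - duplicates - excluded_on_title
excluded_on_full_text = full_texts_screened - included
```

This reproduces the 274 full texts screened, with 150 excluded on complete reading.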

Figure 2 Summary of the synthesising process.

Synthesis 1: research evidence
The majority of studies (89%) included in stream I were published between 2001 and 2014. Only four and eight studies were published during the 1990s and the year 2000, respectively. In terms of rehabilitation disciplines,

Figure 3 Flow diagram of search strategies and review process based on PRISMA statement. PRISMA, Preferred Reporting Items for Systematic Reviews and Meta-Analysis.


33 (31%) interventions were multidisciplinary, 20 (19%) involved occupational therapy, 21 (20%) were physiotherapy, 9 (8%) were psychological interventions and 4 (4%) were speech and language therapy interventions. A total of 14 (13%) were interventions involving alternative forms of exercise or therapy (eg, yoga, Tai chi, treadmill training).

In terms of research design, half of the interventions (50%) were investigated using randomised trial methods, including pilot, multicentre and cluster RCTs. The remaining studies used a range of approaches to investigate interventions, including pre-post one group designs, repeated measures three group designs and two group non-randomised designs. Out of the 124 studies, 10 were purely qualitative research studies.

All 106 interventions described in the 124 studies were being investigated for their effectiveness in treating a range of neurological conditions, such as stroke (28%), Parkinson's disease (PD) (10%), multiple sclerosis (MS) (8%) and traumatic brain injury (TBI) (7%). A total of 24 interventions (23%) were targeting cognitive impairments including Alzheimer's disease and dementia.

Following the method previously described, a total of 10 themes were developed (table 3):

Terminology
The term 'process evaluation' is yet to be widely used to describe the assessment and evaluation that takes place at the time of carrying out research on the implementation of a new or innovative intervention. Only 32 studies of those included in stream I used the term 'process evaluation'.

More specifically, among all studies included in this review there was a clear lack of consensus regarding terminology used to describe the processes that were being evaluated and their components; this led to confusion and lack of clarity. The term 'adherence' was a clear example of this, since it was variously referred to as 'dose', 'attendance rate', 'compliance', 'fidelity' or 'exposure'. Neurological rehabilitation studies failed to define terms in a unique and non-interchangeable manner. The term 'feasibility' was widely used; however, it was defined differently across studies. McGinley et al28, in their pilot RCT study looking at the impact of a physical therapy intervention for patients with PD, defined it in terms of safety, retention, adherence and compliance measures. Stephens et al29, in their pilot RCT investigating the potential benefits of an exercise intervention for children with fibromyalgia, defined it as comprising adherence and recruitment data.

Aims and strategies
A total of 50 studies (reporting on 43 interventions) among those included in stream I identified aims and research questions specific to the process evaluation. Forty-seven studies reported on strategies in place, at times referred to as 'feasibility' (12 and 42) or 'fidelity' outcomes (67 and 89), to answer those research questions. However, most studies provided a very broad description of these strategies, without much detail. Two studies provided details on specific tools used to investigate a process evaluation component: Alwin et al30 31 described the Patient Perspective on Care and Rehabilitation Process (POCR) instrument to investigate the significance of an assistive technology intervention for the relatives of people with dementia, and Khalil et al32 33 used the Intrinsic Motivation Inventory (IMI) to assess how individuals with Huntington's disease perceived (eg, enjoyment, value, usefulness) the activities included in a home-based exercise intervention.

Addressing context
The context in which the neurological rehabilitation intervention implementation and research trial took place was described for 49 (45%) of the included interventions; however, this description often lacked detail and focused solely on information regarding the trial's setting (eg, 1, 9, 57, 64) without accounting for the wider physical, social, economic, organisational and political environment. Only 11 studies (9%) described in detail the provision (with respect to the trialled intervention) that already existed and, crucially, how these contextual factors had changed over time. Of these 11, 5 studies (16, 17, 18, 93 and 108) described the strategies in place to explore how changes in contextual factors had possibly affected the implementation and/or impacts.

Recruitment
All trials investigating the 106 included interventions described the main trial's participant recruitment procedures, including chosen sites, participants' inclusion/exclusion criteria and diagrams describing the flow of

Table 3 Synthesis 1: emergent themes

Emergent themes:
- Terminology
- Aims and strategies
- Addressing context
- Recruitment
- Describing those in charge of delivering the trialled intervention
- Training and assessing intervention staff competence in delivering the intervention
- Tailoring and adherence to intervention protocols
- Investigating participants' opinions
- A tool to identify barriers and facilitators to implementation
- Linking trial outcomes to process evaluation findings

Masterson-Algar P, et al. BMJ Open 2016;6:e013002. doi:10.1136/bmjopen-2016-013002

Open Access. Protected by copyright. BMJ Open: first published as 10.1136/bmjopen-2016-013002 on 8 November 2016. Downloaded from http://bmjopen.bmj.com/ on 20 August 2018 by guest.

participants throughout the research process; only 21 of them (17%) provided a detailed description of identified barriers and facilitators to initial recruitment and consequent retention and engagement of individuals during the running of the neurological rehabilitation trials. Scianni et al34 provided a detailed account of recruitment procedures and identified lack of transport as the main barrier to patient participation in a gait training study. In relation to recruitment of participants for the process evaluation, 15 studies provided an explanation of what subsample was selected and why.

Describing those in charge of delivering the trialled intervention
A total of 29 studies (comprising 23 interventions) (22%) investigated the intervention providers' motivations for joining the research programme or their perceptions regarding treatment effects and possible impacts. This was mostly achieved by carrying out in-depth interviews and administering questionnaires (including ranking items and open questions) and, less often, through focus groups (five studies). Scobbie et al35 carried out a process evaluation of the implementation of a theory-based action planning framework (G-AP) to guide goal setting practice. In their study, interviews were chosen as a data collection method in order to investigate the experiences of intervention providers and the difficulties they faced. The study by Döpp et al36 is a clear example of the use of a variety of methods to achieve this; the authors used web-based questionnaires which included a number of statements regarding barriers to implementation, which the occupational therapists had to rate in terms of how much they agreed, and then carried out further focus groups and semistructured telephone interviews.

Of those studies included in stream I, only those

looking at 10 (9%) of the interventions provided details regarding the level of experience of intervention staff. A number of studies referred to health professionals' experience using expressions such as 'experienced health professionals' (51 and 95), 'multiple years of clinical experience' (58) or 'had prior experience' (75) without providing any further detail. Some studies reported on the years of experience that health professionals had (eg, 28, 32, 49, 60, 71, 100, 103, 107, 108 and 110), the grade of intervention staff (17) or their level of education (58 and 101).

Training and assessing intervention staff competence in delivering the intervention
Of the 106 interventions included in stream I studies, 40 (38%) were delivered by staff who had attended training workshops on the neurological rehabilitation intervention prior to the start of the trial. This training was delivered using a variety of methods such as lectures, role play (14 and 107), practical sessions and group discussions, and it varied in length from 2 hours (58 and 81), 2.5 hours (60), 4 hours (92), 16 hours (107) and 40 hours (89) to ½ day (57), 2 days (45), 3 days (23 and 93) and 5 days (50). A study looking at the impact of a structured training programme for caregivers of inpatients after stroke (TRACS study) (16, 17 and 18) relied on cascade training as a method of transferring knowledge from one health professional to another.

King et al37, in their feasibility study evaluating the

effects of paediatric therapy services in the school setting, and Voigt-Radloff et al38 39, in the process evaluation of the RCT looking at the impact of evidence-based community occupational therapy on dementia patients (WHEDA study), both provided a detailed description of the training and expected learning outcomes of staff. However, among the studies that included a training component, only seven defined performance criteria and measured (mainly via observations) post-training skill acquisition against a minimum standard that would allow the provider to be involved in the delivery of the intervention. Chung40 and Döpp et al36 used quizzes and questionnaires, respectively, to assess intervention staff's knowledge of the trialled dementia intervention. In Morris et al,41 all personnel who had undergone training were required to achieve a score equal to, or greater than, 90% of items correctly executed. Failure to meet this criterion required the tester or trainer to withdraw from the project and to resubmit standardisation videotapes for rating until the 90% or higher criterion was achieved.

Only 11 studies (reporting on 8 interventions) (16, 17,

18, 25, 28, 37, 39, 42, 45, 81 and 92) reported on methods in place to regularly refresh intervention staff's knowledge of the neurological rehabilitation intervention (eg, due to staff turnover throughout the research process). In the TRACS study (16, 17 and 18), local training sessions were arranged, if necessary, to provide feedback and support. Additionally, all centres involved in the trial were offered a local refresher course midway through the trial. A total of 22 studies (reporting on 17 interventions, 16%) described methods in place to maintain intervention staff's skills over time. This was achieved mainly via individual or group supervision (set or available when necessary), led by an expert in the field or an advisory group (87) and delivered in a number of ways: meetings, telephone conversations, emails or blogs, among others.

Only 16 studies (regarding a total of 8 interventions)

(7, 8, 21, 22, 23, 49, 52, 53, 54, 55, 63, 70, 93, 94, 122 and 123) discussed changes in how intervention staff delivered the neurological rehabilitation intervention over time. However, this was not described in detail. Morris et al,41 in the process evaluation of an RCT looking at an extremity constraint-induced therapy intervention, and Östensjö et al,42 in a trial of a goal setting rehabilitation programme, discussed how health professionals improved their standard of performance over time. Although these studies referred to learning over time,


none of them acknowledged learning curve effects at the time of evaluation.

Tailoring and adherence to intervention protocols
A total of 93 studies (comprising 79 interventions) (74%) reported having protocols/manuals which provided a description of the intervention with a varied level of detail. However, 42 of the included studies (34%) also discussed how the intervention remained flexible throughout the research period so that it could be tailored to the needs of individual patients. Studies included in this review generally failed to investigate and report whether the research team had reached a consensus regarding standardisation of the intervention, or whether health professionals were provided with a brief rationale to help them assess the right amount of tailoring that should take place. Mayo et al43 and McGinley et al28 describe two different exercise programmes for patients with stroke and PD, respectively. Both studies explain how exercise programmes were tailored to individual needs while remaining within the limits established by the study protocol. Although none of the studies in stream I discusses whether intervention staff were provided with specific guidance and advice on tailoring, 78 studies (63%) which described a protocol reported results and described a variety of strategies in place to monitor and measure adherence to research protocols. Adherence was assessed via a number of methods: reviews by experts of audiotaped/videotaped sessions of intervention staff with participants (4, 7, 8, 49, 58, 62, 63, 89 and 110), live observations (7, 8, 16, 17, 18, 38, 65, 67, 74, 100, 101, 119, 120, 121, 122 and 123), treatment log books, intervention staff reflexive accounts, therapy evaluation forms, diaries, field notes/case notes and various standardised scales and checklists for fidelity.

Investigating participants' opinions
A total of 84 studies (77 interventions) reported the experience, motivations and opinions of those exposed to the neurological rehabilitation intervention. Of these, 44 gathered this information via evaluation questionnaires that included itemised scaled questions, open-ended questions or both. A total of 33 studies used in-depth interviews, and 7 studies carried out focus groups among other methods.

Of all the studies that investigated patients' opinions,

29 described the use of enquiry tools and methods specifically designed to investigate whether or not participants understood the intervention (eg, its goals, outcome measures and rationale). Leuty et al44 and Macht et al45

assessed understanding by adding related questions to the participant opinions questionnaire. Others, such as Li et al46, enquired about the level of understanding during an exit interview. Observations of intervention implementation sessions were also reported as ways of assessing participants' understanding. This was the case for Taylor et al47, who investigated the impact of home-based strength training for young people with cerebral palsy, and for Resnick et al48 in their RCT evaluating the impact of an exercise training intervention for patients with stroke (Treadmill Study).

A tool to identify barriers and facilitators to implementation
Studies included in stream I often provided details about how emergent contextual and implementation issues had impacted on the results of the outcome evaluation. Twenty studies (18 interventions) described and discussed barriers and facilitators (enablers and inhibitors) to the implementation of the trialled intervention. Several studies (eg, 40, 41, 57, 75, 81, 96 and 109) used tables or appendices to present data about barriers and facilitators. Van't Leven et al49 and Döpp et al50 used focus groups and telephone interviews with occupational therapists and managers to explore barriers and facilitators to the implementation of an occupational therapy (OT) guideline for older people with dementia and their carers. Scobbie et al35 carried out in-depth interviews with health professionals and participants in order to identify views on the implementation and acceptability of a framework for goal setting in community-based stroke rehabilitation.

Linking trial outcomes to process evaluation findings
Only 24 studies (21 interventions) reported the use of a theoretical framework or a research framework to inform the decision-making and design of the process evaluation. As a result, the reasoning behind the approaches and methodologies used was rarely described. The MRC guidance on complex interventions5 was the most frequently mentioned among the neurological rehabilitation trials included in stream I evidence (16, 17, 18, 40, 81, 87, 93, 94, 103, 114 and 117). MacNeil Vroomen et al51 used the Adaptive Implementation Model, while Whiting et al52 used the Borrelli et al53 fidelity framework. Vluggen et al54 applied the method of Saunders et al,55 which recommends a number of themes to investigate as part of a process evaluation.

Forty-six studies (reporting on 39 interventions) presented and discussed relationships between the results of process evaluations and trials; these studies used the results from the process evaluation to make sense of what had been taking place during the implementation process and to build explanations about the impact on outcome measures. However, only 11 of these studies made use of theoretical frameworks and behavioural theories, such as Normalization Process Theory (TRACS study16–18), to guide the explanation building process. One example is Letts and Dunal,56 who developed, through consensus, a logic model in order to plan the implementation and integrate information about the process and outcomes of a community rehabilitation intervention for adults with brain injury.

Finally, nearly half of the studies (64; 55 interventions)

used process evaluation results to generate suggestions and develop recommendations to counterbalance the limitations of the research study. These


recommendations concerned aspects of the intervention that could be adapted or modified in order to increase the chances of success in future implementation and further neurological rehabilitation research work.

Synthesis 2: methodological guidance/resources
Of all the stream II included studies, 10 (33%) were guidelines and guideline-related studies (G). Eleven (37%) were published studies describing and providing useful resources for investigating fidelity alongside research trials (F). Six (20%) were specifically related to process evaluation research and design (P). Finally, 3 (10%) provided information regarding the assessment of learning curves in health research (L).

Following the method previously described, eight analytical themes, in the form of recommendations, emerged. These recommendations were developed to include potential strategies, measuring tools and methods to address process evaluation research (see online supplementary table S4).

Recommendation 1
Complex interventions should be clearly defined as such and described in terms of their active ingredients. By defining these, researchers can identify how the intervention works and how its active ingredients exert their effect. Creating a steering group of experts (eg, researchers, practitioners and stakeholders) is one recommended strategy for achieving a clear understanding of the intervention and its characteristics.

Recommendation 2
Process evaluations should be theory-based. Researchers should draw on existing evidence, guidance and frameworks in order to understand and theoretically explain the processes they expect to take place.

Recommendation 3
Context should be acknowledged and accounted for throughout the research process. The context in which the intervention was developed, implemented and finally evaluated should be clearly defined. It is necessary to describe and monitor changes in the social, physical, economic, political and organisational context in which the intervention is embedded. Understanding context will help researchers identify its potential impact on implementation and outcomes.

Recommendation 4
Recruitment strategies, and changes in recruitment over time, for both the main trial and the process evaluation should be clearly explained. Researchers need to identify and assess the strategies in place to approach and recruit participants for the research trial and the process evaluation. Barriers and facilitators to recruitment require close investigation. Interviewing staff involved in the recruitment process, or completing logbooks and questionnaires recording reasons for withdrawal, could be potential strategies for addressing this issue.

Recommendation 5
Information regarding intervention staff should be analysed in detail. A detailed description of their characteristics should be provided: numbers, background experience, incentives and motivations for joining the research, and their opinions regarding the potential need for the intervention under investigation.

Recommendation 6
The delivery of the complex intervention should be closely monitored. The complex intervention(s) must be described in a study protocol/manual. This protocol should be a tool that intervention providers can use to understand the level of tailoring that is considered appropriate. Furthermore, staff delivering the intervention should be trained in order to increase the chances of standardisation and to brief staff on the performance criteria. Process evaluations should have strategies in place to investigate whether the interventions are delivered as planned in terms of dose and content.

Recommendation 7
Results from process evaluations should be analysed in detail in order to identify possible links with the main trial's outcome results. Data collected and analysed during the process evaluation are of vital importance for avoiding type III errors when analysing the outcomes of complex intervention trials.

Recommendation 8
Methodological approaches and data collection methods used in process evaluations should be clearly defined. The chosen terminology, together with clear aims and objectives, should be stated at the start of the process evaluation.

Synthesis 3: overarching findings
Narrative themes from synthesis 1 and recommendations from synthesis 2 were collated to generate 57 synthesis recommendations, which were then grouped into the following 9 areas (see online supplementary table S5).

AREA 1: complex interventions and theoretical approaches
The use of theory to inform and guide process evaluations is recommended. However, to date, most process evaluations fail to do so. Theory should be used to gain an in-depth understanding of the neurological rehabilitation intervention under investigation and to identify its components. Researchers working on neurological rehabilitation research currently fail to draw on methodological guidance when designing how they will evaluate the processes taking place.


AREA 2: context
Little attention is currently given to the in-depth exploration of the contextual systems in which the neurological intervention is embedded. The evaluation literature discusses context extensively, with respect to both the need to describe it in detail and the need to understand how it can impact on implementation at different stages of the research process. Neurological researchers should consider moving away from the narrow definition of context as 'the setting where the research takes place'.

AREA 3: recruitment
Recruitment has been identified as one of the main challenges of rehabilitation research. Understanding the barriers and facilitators at play during the recruitment period is of vital importance for those attempting to evaluate the processes taking place and their potential impact on outcomes. To date, process evaluations alongside neurological rehabilitation research have rarely investigated recruitment with this purpose. Finally, recruitment strategies into the process evaluation should be carefully thought through.

AREA 4: describing intervention staff
A number of emergent themes from synthesis 1 identify how, to date, process evaluations are not focusing enough on understanding issues such as intervention staff's motivations for joining research or their perceptions of the potential benefits of the intervention under investigation. The role of intervention staff's level of experience in shaping potential outcome results is equally neglected. Researchers designing process evaluations alongside neurological rehabilitation research should attempt to record and investigate these.

AREA 5: describing the intervention
To date, process evaluations often fail to investigate and report whether the research team had reached a consensus regarding standardisation of the intervention, or whether health professionals were provided with a rationale to help them assess the right level of tailoring that should take place. Study protocols should be detailed enough to guide intervention staff through the research process.

AREA 6: preparing and assessing intervention staff
Process evaluations should attempt to gain a clear understanding of how intervention staff were trained for their research role. Although training is often provided, it is important that it includes well-defined performance criteria to guarantee the correct delivery of the intervention. At present, this is rarely the case. Furthermore, staff competence should be assessed at different time points in order to identify any potential changes that could ultimately impact on outcomes.

AREA 7: delivering the trial intervention
To date, process evaluations of neurological research trials often provide information which helps identify barriers and facilitators to the implementation process. However, process evaluations need to go deeper in order to generate a complete understanding of the quality of intervention delivery. This can be achieved by defining clear strategies not only to monitor quality but also to measure how much of the intervention was delivered (dose delivered) and how much was received by participants.

AREA 8: understanding and interpreting process evaluation results
One clear theme that emerged from synthesis 1 was that, at present, results from process evaluations are often not used to make sense of what has been taking place during the research process. Process evaluation (PE) results are required to build explanations about the impacts of the trialled intervention on outcome measures.

AREA 9: thinking about methodology
A process evaluation should be a piece of research in its own right and should therefore be described in a protocol. It should have a clear purpose and clear aims and objectives. To date, process evaluations alongside neurological rehabilitation research rarely provide detailed information regarding design and chosen strategies. Ultimately, a process evaluation should aim to answer a research question by making use of a variety of methods in order to gather sufficient data.

DISCUSSION
This mixed-evidence systematic review has identified a number of gaps in the evidence informing the undertaking of process evaluations in neurological rehabilitation research. At present, there is no consensus among researchers carrying out and reporting results from process evaluations regarding which terminology (and definitions) to use. These findings are in line with previous work by Steckler and Linnan11 and Carroll et al,57 who identified a considerable overlap in how terms like 'fidelity' or 'dose' are defined. Among the studies included in this review (stream I and stream II), a clear overlap of terms like 'adherence', 'dose', 'attendance rate', 'compliance', 'fidelity' or 'exposure' was identified. Taking 'adherence' as an example: although it may be simply defined as doing what is required, from a health services research perspective several authors58 59 have explained adherence as the component of implementation fidelity that measures the extent to which the intervention that has been delivered is consistent with the way the intervention was originally designed or planned. Carroll et al (ref. 57, p. 3) report that the measurement of implementation fidelity is 'the measurement of adherence', which includes the subcategories of content, frequency,


duration and dose. However, Steckler and Linnan11 propose this same definition for fidelity, which they consider a process evaluation component in its own right, in the same way as dose. The findings reported here therefore reinforce what others have suggested:11 a clearly defined set of terms for process evaluation still needs to be developed, universally recognised and applied in order to allow the number of neurological rehabilitation research studies which include a process evaluation alongside them to increase.

Our findings show that context is often acknowledged.

Two previous systematic reviews, looking at process evaluations in occupational stress management programmes21

and church-based health interventions,23 found that only 9% and 34% of the studies, respectively, included information concerning context. However, context was rarely defined. Neither of these reviews assessed the level of detail in which context had been described as part of the process evaluation, or which strategies had been used, if any, to assess the impacts that contextual changes over time might have had on outcomes. Our findings show that the way context is currently assessed as a process evaluation component is not detailed enough, and that the impact of wider contextual changes over time is rarely investigated or even acknowledged in neurological rehabilitation research. This runs contrary to the general recognition that context is important in the implementation of interventions and needs attention.60 61 Process evaluations should not only aim to identify and describe contextual factors but also investigate their association with variation in mediating responses to intervention components, and ultimately outcomes.9 Campbell et al62 argue that the investigation of context is all important and should include the wider socioeconomic background. They further report that contextual changes over time can influence whether an intervention succeeds or fails to show a significant impact. In other words, describing the context in which an intervention takes place is important, but understanding it is crucial, not only to inform intervention design but also to assess whether successful interventions might, or might not, work when implemented in different settings and conditions.

A further important point identified in this systematic

review is the lack of detailed information describing those delivering the trialled intervention, in terms of their previous experience and background and their opinions and perceptions of treatment effects and the possible impacts of the intervention. First, although nearly a quarter of the studies included in stream I investigated providers' perceptions of the quality of the intervention, its perceived effects and possible impacts, this number is relatively low and we therefore suggest more effort needs to be put into this aspect of process evaluations. This is in line with what other authors have suggested regarding staff's perceptions playing a role in influencing outcomes.9 63 64 Second, these results have identified a strong need for process evaluations to both clearly describe intervention staff skills and experience

prior to joining the research and investigate how they might influence outcomes. It is evident that careful recruitment of practitioners with the right level of experience is paramount to the success of implementation.65 However, there is currently a lack of evidence to help researchers decide what the essential requirements and optimum experience level should be for staff in charge of implementing a new neurological rehabilitation intervention.

Training of intervention staff has often been mentioned as a necessary component for increasing implementation fidelity.65–67 The results from this review show that, although training of staff often takes place, process evaluations rarely mention performance criteria or assessments to measure skill acquisition post-training and throughout the research programme. In other words, our results show that process evaluations in this field of research are currently not addressing the role that learning curve effects might play in influencing intervention outcomes. Although a number of studies have looked at learning curve effects in RCTs,68–70 the bulk of these are primarily focused on surgical trials (clinical health technologies) and the implementation of new surgical procedures. Learning curve effects can be defined as 'an improvement in performance over time' (ref. 69, p. 421), which indicates that, with time, there is generally a change leading to higher quality implementation of the tested intervention.68 A number of hierarchical factors have been reported as influencing learning.69

Cook et al69 reported that professional teams, the characteristics of the patients undergoing the procedure and the characteristics of the surgeons carrying out the intervention (eg, attitudes, abilities and previous experience) will further impact on learning. Further consideration should be given to how possible learning curves and their impact on outcomes should be studied within the context of a longer term, complex intervention. The process evaluation of an RCT looking at the impact of an occupational therapy intervention for stroke survivors living in care homes (OTCH)70 described how occupational therapists reported becoming better at implementing the OTCH intervention as time went by. Therapists learnt how to overcome challenges linked to, among others, resource limitations, institutional context and patient and care home staff engagement. These results provide evidence of how learning curves can impact on the quality of the implementation of neurological rehabilitation intervention(s). Process evaluations need to assist researchers in identifying the mechanisms underlying this possible impact. Ignoring the learning effect could potentially lead to inconclusive results, since the trialled intervention will often only have identifiable significant impacts once adequate experience has been gained.71 Learning curve effects also have implications for the length of the intervention period within the trial timetable and for the associated costs.

Several authors72 73 have argued that interventions

    that have been designed and tested with participants


from homogeneous or atypical population groups will have very limited generalisability. There is more to delivering the intervention than simply measuring how many elements were delivered.74 75 There is a need to tailor interventions to patients' limitations and cultural background in order to be able to replicate interventions across settings.76 However, this can lead to tension between tailoring and the need to maintain fidelity to the original intervention. Song et al76 further explain that tailoring does not mean that the provider may improvise; rather, what is standardised must be clearly defined and monitored against what is customised (including the delivery of unplanned components of the intervention). How this can be accurately performed remains unclear to date. Our findings show that tailoring guides for staff delivering rehabilitation trial interventions are rare. Song et al76 argue that the assessment of fidelity will have to be standardised and tailored to the actual level of standardisation and tailoring of the trialled intervention. In line with these suggestions, the results from this review have identified a strong need for process evaluations to have strategies in place to investigate and monitor in detail the level of tailoring to patients' needs that takes place when providers deliver the trialled intervention. Only by doing so can researchers prevent the tailoring process from having a negative impact on fidelity of implementation.
The recommendations presented here stress the importance of providing a detailed description of trialled interventions and their tailoring. The uptake by researchers of recently published tools such as the Consolidated Standards of Reporting Trials (CONSORT) 2010 statement77 or the more recent Template for Intervention Description and Replication (TIDieR) checklist and guide78 can potentially have a positive impact on the development of this vital component of process evaluations. Contrary to what process evaluation and complex interventions research guidelines strongly recommend, the results reported here indicate that links between process and outcome evaluation results are often not clearly addressed and that, to date, no widely accepted and standardised way to achieve this has been proposed. This finding is supported by previous research.16 21 23 In 2006, researchers involved in the process evaluation of the Randomised Intervention of Pupil Peer Led Sex Education (RIPPLE) study analysed process data in two stages. The first stage was carried out before outcome analyses had taken place and generated answers in the form of hypotheses, which were then tested during the second stage via statistical analyses that integrated process and outcome data. These comprised initial on-treatment analyses and then regression analyses followed, where appropriate, by tests for interactions assessing the impact of mediating factors identified in the process evaluation. The recommendations that this review proposes can be

    challenging to implement for researchers working with tight funding constraints. Researchers will be faced with

    choices and trade-offs about what aspects of the intervention and its delivery to focus on in process evaluations. These will need to be carefully designed to maximise their ability to gain an in-depth understanding of the anticipated factors that are likely to impact on outcomes. This will avoid wasting resources collecting unnecessary data.
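As one concrete illustration of the kind of focused, pre-specified analysis such trade-offs might prioritise, the sketch below mirrors the second-stage, RIPPLE-style linkage described earlier: a regression with an interaction term testing whether a process measure (here, implementation fidelity) moderates the treatment effect. All data are simulated and all variable names are hypothetical; this is not an analysis from the review.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 400
treat = rng.integers(0, 2, n)        # trial arm (0 = control, 1 = intervention)
fidelity = rng.uniform(0, 1, n)      # process-evaluation measure per participant

# Simulated outcome: the treatment only helps when fidelity is high
# (true interaction coefficient = 2.0, no main treatment effect at fidelity 0).
y = 1.0 + 0.0 * treat + 0.5 * fidelity + 2.0 * treat * fidelity \
    + rng.normal(0, 0.5, n)

# Ordinary least squares: y ~ intercept + treat + fidelity + treat:fidelity
X = np.column_stack([np.ones(n), treat, fidelity, treat * fidelity])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
interaction = coef[3]  # estimate of the fidelity-by-treatment interaction
```

A non-trivial interaction estimate here is the statistical expression of "process explains outcome": the trial effect depends on how faithfully the intervention was delivered.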

    Review limitations
    Although the database searches carried out to identify studies for stream I were informed by reliable sources, and an expert librarian reviewed the search strategies, it is possible that relevant studies were not identified. The main reason for this lies in the fact that the term 'process evaluation' is not yet a MeSH heading in any of the databases searched. As a consequence, a wide range of term combinations was used.

    A high proportion of the evidence included in stream I consisted of process evaluations alongside RCTs. Although the inclusion criteria were broader, the fact that so many included studies were RCTs could mean that the findings have greater relevance to researchers thinking about this type of research design. Finally, only studies written or translated into English were included in this review (because of limited financial resources for translation); by doing this, a number of relevant studies written in other languages may have been left out.
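By way of illustration, assembling a wide range of term combinations in the absence of a dedicated MeSH heading can be done systematically, for example as in the sketch below. The term lists shown are hypothetical and are not the review's actual search strategy.

```python
from itertools import product

# Hypothetical free-text term lists; each pairing becomes one candidate query.
PROCESS_TERMS = ['"process evaluation"', '"process analysis"', 'fidelity']
FIELD_TERMS = ['"neurological rehabilitation"', 'neurorehabilitation', 'stroke']

# Cartesian product: 3 x 3 = 9 query strings to run across databases.
queries = [f"{p} AND {f}" for p, f in product(PROCESS_TERMS, FIELD_TERMS)]
```

Generating the combinations programmatically makes the strategy auditable and easy to extend when a librarian suggests additional synonyms.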

    CONCLUSIONS
    This review identified the following key findings. (1) There remains no consensus regarding process evaluation terminology, and this provides an excellent opportunity to engage the rehabilitation research community in creating one that reflects the nuances of research in this field; (2) there is a need for process evaluations to address the nature of context and the role that contextual factors (and their changes over time) can play in influencing outcomes; (3) there is a strong need for process evaluations to clearly describe intervention staff skills and experience prior to joining the research and to investigate how these may influence outcomes; (4) process evaluations to date do not investigate learning curves and their potential impact on outcome evaluations; (5) there is a strong need for process evaluations to have strategies in place to investigate and monitor in detail the level of tailoring to patients' needs that takes place when providers deliver the trialled intervention; and (6) further research is needed to develop clear and standardised methods for linking outcome and process evaluation results.

    This systematic review has provided valuable insight into the design and quality of process evaluation research in neurological rehabilitation. Further research is needed to promote the use of process evaluation alongside health research trials, and more emphasis on providing specific training in process evaluation research is strongly recommended.


    Twitter Follow Jo Rycroft-Malone at @jorycroftmalone

    Acknowledgements The authors acknowledge the support of Professor Cath Sackley and Professor Marion Walker in completing this work. They also thank Marion Poulton for providing input in the development of this review's search strategy.

    Contributors PM-A made substantial contribution to conception and design, acquisition of data and statistical analysis. PM-A prepared the manuscript and revised it for important intellectual content. CRB made substantial contribution to the conception and design of the study and to the revision of the manuscript for important intellectual content. JR-M made substantial contribution to the conception and design of the study and to the revision of the manuscript for important intellectual content. All authors read and approved the final manuscript.

    Funding This research was funded by a Research Grant from the College of Occupational Therapists Specialist Section-Neurological Practice (COTSS-NP).

    Competing interests None declared.

    Provenance and peer review Not commissioned; externally peer reviewed.

    Data sharing statement No additional data are available.

    Open Access This is an Open Access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited and the use is non-commercial. See: http://creativecommons.org/licenses/by-nc/4.0/

    REFERENCES
    1. Tate DG. The state of rehabilitation research: art or science? Arch Phys Med Rehabil 2006;87:160-6.
    2. Hart T, Bagiella E. Design and implementation of clinical trials in rehabilitation research. Arch Phys Med Rehabil 2012;93:S117-26.
    3. Clark AM. What are the components of complex interventions in healthcare? Theorizing approaches to parts, powers and the whole intervention. Soc Sci Med 2013;93:185-93.
    4. Townsend E. Enabling occupation: an occupational therapy perspective. Ottawa: Canadian Association of Occupational Therapists, 2012.
    5. Medical Research Council. Developing and evaluating complex interventions: new guidance. London: Medical Research Council, 2008.
    6. Robinson L, Francis J, James P, et al. Caring for carers of people with stroke: developing a complex intervention following the Medical Research Council framework. Clin Rehabil 2005;19:560-71.
    7. Tilling K, Coshall C, McKevitt C, et al. A family support organiser for stroke patients and their carers: a randomised controlled trial. Cerebrovasc Dis 2005;20:85-91.
    8. Redfern J, McKevitt C, Wolfe CDA. Development of complex interventions in stroke care: a systematic review. Stroke 2006;37:2410-19.
    9. Oakley A, Strange V, Bonell C, et al. Process evaluation in randomised controlled trials of complex interventions. BMJ 2006;332:413-16.
    10. Moore G, Audrey S, Barker M, et al. Process evaluation of complex interventions: Medical Research Council guidance. London: MRC Population Health Science Research Network, 2014.
    11. Steckler A, Linnan L. Process evaluation for public health interventions and research. San Francisco (CA): Jossey-Bass, 2002.
    12. Hulscher M, Laurant MG, Grol RP. Process evaluation on quality improvement interventions. Qual Saf Health Care 2003;12:40-6.
    13. Bonell C, Oakley A, Hargreaves J, et al. Assessment of generalisability in trials of health interventions: suggested framework and systematic review. BMJ 2006;333:346-9.
    14. Byng R, Norman I, Redfern S, et al. Exposing the key functions of a complex intervention for shared care in mental health: case study of a process evaluation. BMC Health Serv Res 2008;8:274-84.
    15. Braun SM, van Haastregt JC, Beurskens AJ, et al. Feasibility of a mental practice intervention in stroke patients in nursing homes; a process evaluation. BMC Neurol 2010;10:74-83.
    16. Grant A, Treweek S, Dreischulte T, et al. Process evaluations for cluster-randomised trials of complex interventions: a proposed framework for design and reporting. Trials 2013;14:15-25.
    17. Centre for Reviews and Dissemination. Systematic reviews: CRD's guidance for undertaking reviews in health care. York: University of York, 2009.
    18. EPPI-Centre. EPPI-Centre methods for conducting systematic reviews. 2010. http://eppi.ioe.ac.uk/cms/LinkClick.aspx?fileticket=hQBu8y4uVwI%.3d&tabid=88&mid=6162
    19. Harden A, Garcia J, Oliver S, et al. Applying systematic review methods to studies of people's views: an example of public health research. J Epidemiol Community Health 2004;58:794-800.
    20. Oliver S, Harden A, Rees R. An emerging framework for including different types of evidence in systematic reviews for public policy. Evaluation 2005;11:428-46.
    21. Murta SG, Sanderson K, Oldenburg B. Process evaluation in occupational stress management programs; a systematic review. Am J Health Promot 2007;21:248-54.
    22. Cooper Robbins SC, Ward K, Skinner SR. School-based vaccination: a systematic review of process evaluations. Vaccine 2011;29:9588-99.
    23. Yeary KH, Klos LA, Linnan L. The examination of process evaluation use in church-based health interventions: a systematic review. Health Promot Pract 2012;13:524-34.
    24. Kavanagh J, Trouton A, Oakley A, et al. A systematic review of the evidence for incentive schemes to encourage positive health and other social behaviours in young people. London: EPPI-Centre, Social Science Research Unit, Institute of Education, University of London, 2006.
    25. Popay J, Roberts H, Sowden A, et al. Guidance on the conduct of narrative synthesis in systematic reviews: final report. Swindon: ESRC Methods Programme.
    26. Thomas J, Harden A, Newman M. Synthesis: combining results systematically and appropriately. In: Gough D, Oliver S, Thomas J, eds. An introduction to systematic reviews. London: Sage, 2012:179-226.
    27. Thomas J, Harden A. Methods for the thematic synthesis of qualitative research in systematic reviews. BMC Med Res Methodol 2008;8:1-10.
    28. McGinley JL, Martin C, Huxham FE, et al. Feasibility, safety, and compliance in a randomized controlled trial of physical therapy for Parkinson's disease. Parkinsons Dis 2012;2012:795294.
    29. Stephens S, Feldman BM, Bradley N, et al. Feasibility and effectiveness of an aerobic exercise program in children with fibromyalgia: results of a randomized controlled pilot trial. Arthritis Rheum 2008;59:1399-406.
    30. Alwin J, Krevers B, Johansson U, et al. Health economic and process evaluation of AT interventions for persons with dementia and their relatives: a suggested assessment model. Technol Disabil 2007;19:61-71.
    31. Alwin J, Persson J, Krevers B. Perception and significance of an assistive technology intervention: the perspectives of relatives of persons with dementia. Disabil Rehabil 2013;35:1519-26.
    32. Khalil H, Quinn L, van Deursen R, et al. Adherence to use of a home-based exercise DVD in people with Huntington disease: participants' perspectives. Phys Ther 2012;92:69-82.
    33. Khalil H, Quinn L, van Deursen R, et al. What effect does a structured home-based exercise programme have on people with Huntington's disease? A randomized, controlled pilot study. Clin Rehabil 2013;27:646-58.
    34. Scianni A, Teixeira-Salmela LF, Ada L. Challenges in recruitment, attendance and adherence of acute stroke survivors to a randomized trial in Brazil: a feasibility study. Rev Bras Fisioter 2012;16:40-5.
    35. Scobbie L, McLean D, Dixon D, et al. Implementing a framework for goal setting in community based stroke rehabilitation: a process evaluation. BMC Health Serv Res 2013;13:190-203.
    36. Döpp CM, Graff MJ, Teerenstra S, et al. A new combined strategy to implement a community occupational therapy intervention: designing a cluster randomized controlled trial. BMC Geriatr 2011;11:13.
    37. King G, Tucker MA, Alambets P, et al. The evaluation of functional, school-based therapy services for children with special needs: a feasibility study. Phys Occup Ther Pediatr 1998;18:1-27.
    38. Voigt-Radloff S, Graff M, Leonhart R, et al. WHEDA study: effectiveness of occupational therapy at home for older people with dementia and their caregivers: the design of a pragmatic randomised controlled trial evaluating a Dutch programme in seven German centres. BMC Geriatr 2009;9:44-59.
    39. Voigt-Radloff S, Graff M, Leonhart R, et al. Why did an effective Dutch complex psycho-social intervention for people with dementia not work in the German healthcare context? Lessons learnt from a process evaluation alongside a multicentre RCT. BMJ Open 2011;1:e000094.


    40. Chung JC. An intergenerational reminiscence programme for older adults with early dementia and youth volunteers: values and challenges. Scand J Caring Sci 2009;23:259-64.
    41. Morris DM, Taub E, Macrina DM, et al. A method for standardizing procedures in rehabilitation: use in the extremity constraint induced therapy evaluation multisite randomized controlled trial. Arch Phys Med Rehabil 2009;90:663-8.
    42. Østensjø S, Øien I, Fallang B. Goal-oriented rehabilitation of pre-schoolers with cerebral palsy: a multi-case study of combined use of the Canadian Occupational Performance Measure (COPM) and the Goal Attainment Scaling (GAS). Dev Neurorehabil 2008;11:252-9.
    43. Mayo NE, MacKay-Lyons MJ, Scott SC, et al. A randomized trial of two home-based exercise programmes to improve functional walking post-stroke. Clin Rehabil 2013;27:659-71.
    44. Leuty V, Boger J, Young L, et al. Engaging older adults with dementia in creative occupations using artificially intelligent assistive technology. Assist Technol 2013;25:72-9.
    45. Macht M, Gerlich C, Ellgring H, et al. Patient education in Parkinson's disease: formative evaluation of a standardized programme in seven European countries. Patient Educ Couns 2007;65:245-52.
    46. Li F, Harmer P, Fisher KJ, et al. Tai chi-based exercise for older adults with Parkinson's disease: a pilot-program evaluation. J Aging Phys Act 2007;15:139-51.
    47. Taylor NF, Dodd KJ, McBurney H, et al. Factors influencing adherence to a home-based strength-training programme for young people with cerebral palsy. Physiotherapy 2004;90:57-63.
    48. Resnick B, Michael K, Shaughnessy M, et al. Exercise intervention research in stroke: optimizing outcomes through treatment fidelity. Top Stroke Rehabil 2011;18(Suppl 1):611-19.
    49. Van't Leven N, Graff MJ, Kaijen M, et al. Barriers to and facilitators for the use of an evidence-based occupational therapy guideline for older people with dementia and their carers. Int J Geriatr Psychiatry 2012;27:742-8.
    50. Döpp CM, Graff MJ, Rikkert MG, et al. Determinants for the effectiveness of implementing an occupational therapy intervention in routine dementia care. Implement Sci 2013;8:131.
    51. MacNeil Vroomen J, Van Mierlo LD, Van de Ven PM, et al. Comparing Dutch case management care models for people with dementia and their caregivers: the design of the COMPAS study. BMC Health Serv Res 2012;12:132.
    52. Whiting DL, Simpson GK, McLeod HJ, et al. Acceptance and commitment therapy (ACT) for psychological adjustment after traumatic brain injury: reporting the protocol for a randomised controlled trial. Brain Impair 2012;13:360-76.
    53. Borrelli B, Sepinwall D, Ernst D, et al. A new tool to assess treatment fidelity and evaluation of treatment fidelity across 10 years of health behavior research. J Consult Clin Psychol 2005;73:852-60.
    54. Vluggen TP, van Haastregt JC, Verbunt JA, et al. Multidisciplinary transmural rehabilitation for older persons with a stroke: the design of a randomised controlled trial. BMC Neurol 2012;12:164.
    55. Saunders RP, Evans MH, Joshi P. Developing a process-evaluation plan for assessing health promotion program implementation: a how-to guide. Health Promot Pract 2005;6:134-47.
    56. Letts L, Dunal L. Tackling evaluation: applying a programme logic model to community rehabilitation for adults with brain injury. Can J Occup Ther 1995;62:268-77.
    57. Carroll C, Patterson M, Wood S, et al. A conceptual framework for implementation fidelity. Implement Sci 2007;2:40-9.
    58. Dane AV, Schneider BH. Program integrity in primary and early secondary prevention: are implementation effects out of control? Clin Psychol Rev 1998;18:23-45.
    59. Dusenbury L, Brannigan R, Falco M, et al. A review of research on fidelity of implementation: implications for drug abuse prevention in school settings. Health Educ Res 2003;18:237-56.
    60. Bate P. Context is everything. In: Perspectives on context. London: The Health Foundation, 2014:1-30.
    61. Dixon-Woods M. The problem of context in quality improvement. In: Perspectives on context. London: The Health Foundation, 2014:87-101.
    62. Campbell NC, Murray E, Darbyshire J, et al. Designing and evaluating complex interventions to improve health care. BMJ 2007;334:455-9.
    63. Elford J, Sherr L, Bolding G, et al. Peer-led HIV prevention among gay men in London: process evaluation. AIDS Care 2002;14:351-60.
    64. O'Cathain A, Thomas KJ, Drabble SJ, et al. Maximizing the value of combining qualitative research and randomised controlled trials in health research: the QUAlitative Research in Trials (QUART) study: a mixed methods study. Health Technol Assess 2014;18:1-197.
    65. Dumas JE, Lynch AM, Laughlin JE, et al. Promoting intervention fidelity. Conceptual issues, methods, and preliminary results from the Early Alliance prevention trial. Am J Prev Med 2001;20(1 Suppl):38-47.
    66. Santacroce SJ, Maccarelli LM, Grey M. Intervention fidelity. Nurs Res 2004;53:63-6.
    67. Horner S, Rew L, Torres R. Enhancing intervention fidelity: a means of strengthening study impact. J Spec Pediatr Nurs 2006;11:80-9.
    68. Ramsay CR, Wallace SA, Garthwaite PH, et al. Assessing the learning curve effect in health technologies. Lessons from the nonclinical literature. Int J Technol Assess Health Care 2002;18:1-10.
    69. Cook JA, Ramsay CR, Fayers P. Statistical evaluation of learning curve effects in surgical trials. Clin Trials 2004;1:421-7.
    70. Masterson-Algar P, Burton CR, Rycroft-Malone J, et al. Towards a programme theory for fidelity in the evaluation of complex interventions. J Eval Clin Pract 2014;20:445-52.
    71. Ramsay CR, Grant AM, Wallace SA, et al. Assessment of the learning curve in health technologies. A systematic review. Int J Technol Assess Health Care 2000;16:1095-108.
    72. Castro FG, Barrera M, Martinez CR. The cultural adaptation of prevention interventions: resolving tensions between fidelity and fit. Prev Sci 2004;5:41-5.
    73. Morrison DM, Hoppe MJ, Gillmore MR. Replicating an intervention: the tension between fidelity and adaptation. AIDS Educ Prev 2009;21:128-40.
    74. Harshbarger C, Simmons G, Coelho H, et al. An empirical assessment of implementation, adaptation, and tailoring: the evaluation of CDC's National Diffusion of VOICES/VOCES. AIDS Educ Prev 2006;18(4 Suppl A):184-97.
    75. Hawe P, Shiell A, Riley T. In response to Spillane V., Byrne M. C., Byrne M., Leathem C. S., O'Malley M. & Cupples M. E. (2007) Monitoring treatment fidelity in a randomized trial of a complex intervention. Journal of Advanced Nursing 60(3), 343-352. Important considerations for standardizing complex interventions. J Adv Nurs 2008;62:267.
    76. Song MK, Happ MB, Sandelowski M. Development of a tool to assess fidelity to a psycho-educational intervention. J Adv Nurs 2010;66:673-82.
    77. Schulz KF, Altman DG, Moher D, et al., CONSORT Group. CONSORT 2010 statement: updated guidelines for reporting parallel group randomized trials. BMJ 2010;340:c332.
    78. Hoffmann TC, Glasziou PP, Boutron I, et al. Better reporting of interventions: template for intervention description and replication (TIDieR) checklist and guide. BMJ 2014;348:g1687.

