Rapid evaluation of health and care services - planning a sustainable solution for the post-COVID reset - February 2021
Foreword

In the first wave of COVID-19, health and care services innovated and adapted at unprecedented speed to provide care and protect staff and patients during a rapidly developing global pandemic.

This White Paper, led by UCLPartners and the London School of Hygiene & Tropical Medicine, explores the barriers and facilitators to performing timely, rigorous and effective evaluations of these changes. Using learning from the pandemic, it sets out recommendations for how to prioritise and resource rapid service evaluations to enable more efficient and effective scale-up of health and care innovations, both regionally and nationally. These recommendations are relevant to the future health and care system both within and beyond the current pandemic.

The findings highlight examples of outstanding regional practice in rapid evaluation, as well as revealing where changes are needed on a broader national level.

This work was carried out as part of the AHSN Network's Health and Care Reset campaign, which seeks to shape what the health and care system should look like in the aftermath of the COVID-19 pandemic. As well as understanding what changes have taken place in response to COVID-19, through our Reset campaign we have been exploring what clinicians, academics, leaders and innovators believe should be retained, adapted, reinstated or stopped, and for which populations or settings. The insights in this White Paper – developed through interviews and a roundtable with leaders from health, research and the voluntary sector – are a key element of this work.

The recommendations presented here suggest the need for changes in how rapid service evaluations are resourced, co-ordinated and delivered. At the AHSN Network we are committed to working with the wide range of stakeholders involved to help ensure that rapid service evaluation underpins all significant future health and care service changes, providing confidence to commissioners, health care professionals and the public that changes are to the benefit of the health and care of the population.

Richard Stubbs, Lead for AHSN Network Health and Care Reset campaign and Chief Executive, Yorkshire & Humber AHSN

Professor Mike Roberts, Managing Director, UCLPartners
Executive summary

UCLPartners on behalf of the AHSN Network commissioned the London School of Hygiene & Tropical Medicine to undertake rapid research to inform recommendations for how to prioritise and resource rapid service evaluations, drawing on learning during the pandemic.

The findings have highlighted both areas of excellence and deficiencies. A number of recommendations are made that, if implemented, would help ensure that all significant health and care service changes in the future would be subject routinely to relevant evaluation that should provide confidence to commissioners and the public that change is for the better – and where it is not, that it is discontinued.

Overview of recommendations:

• There should be a national policy to promote evaluation of all significant service changes
• Large-scale service change should have an appropriate funding allocation to support a relevant evaluation programme
• Clarity is required on expectations of different funded entities regarding the balance of research and evaluation
• Greater parity for social care evaluation and research is needed
• There should be a system for ongoing dialogue between the NHS and care sectors and researchers to identify priority needs for service evaluation and research
• There should be greater national and regional co-ordination of effort across potential research and evaluation partners
• There should be a national repository of available evaluations and applied research
• There is a need to increase the capacity for evaluation and applied research, which can be met through increased staff training and collaborations across a wider range of providers with complementary skill sets.
Contents

Background
Approach
What we learned - Key findings from interviews and the roundtable
  Can we develop a shared agreement on what a good evaluation is?
  What resources are available for service evaluations and what are the roles of national bodies?
  How can we align research infrastructure with the NHS and agree a strategic approach at national, regional and local level?
Recommendations
Authorship
References
Appendix
Background

In 2014, the NHS Five Year Forward View set out a clear intention to strengthen innovation and develop new ways of working, arguing that future gains were as likely to come from changes in process and service delivery as from technology (NHS, 2014). However, implementing untested innovations without learning and sharing lessons about their impacts, or identifying the key ingredients that are required for them to succeed, can be harmful and costly. For this reason, it is generally accepted that innovations should be evaluated before they are extended to other areas across the NHS.

During the early stages of the COVID-19 pandemic, health and social care service providers across England transformed many aspects of service delivery, rapidly implementing new interventions and models of care, sometimes in the absence of a directly applicable evidence base. Health and care leaders had to act in the absence of a system-wide mechanism to evaluate and gather real time insights, leaving them to innovate in spite of the system rather than because of it. Yet, while many performed rapid service evaluations and gathered rapid insights, not all did, and it was unclear whether the innovations that service providers deployed were always appropriate or effective.

While pandemic plans provided a framework to prioritise funding for more traditional (often clinical) research, there were no pre-prepared plans for rapid evaluations or monitoring using routine data. Research priorities were defined nationally but it was unclear how these aligned with regional or integrated care system (ICS) level service needs, including evaluation of innovations. In many parts of the country there were no obvious mechanisms that allowed local health systems to signal their immediate priorities for rapid evaluation and activate a coordinated system to undertake them in the first wave. In addition, there was little clarity about where resources for rapid research and service evaluation could be mobilised from and how they could be co-ordinated at pace to support requests for rapid evaluations from NHS England and NHS Improvement (NHSE/I) and the care sector at ICS, regional or national level.

This challenge exists both during and outside of a health crisis. Although the National Institute for Health Research (NIHR) has a Health Services and Delivery Research Programme and commissions two teams, the NIHR BRACE (The Birmingham, RAND and Cambridge Evaluation Centre) and the Rapid Service Evaluation Team (RSET), to do rapid service evaluation nationally, these teams were reported to have had limited flexibility, as reflected in funding and priorities, in the first stages of the pandemic.

At the same time, the academic community mobilised quickly to produce additional research, but their work was not always coordinated (for example, multiple simultaneous studies of COVID-19 'risk factors' by different academic teams) and did not necessarily fulfil all the rapid service evaluation needs of health and care providers. This divergence between the research enterprise and the health care delivery system has been shaped by a legacy of policy decisions and investment over the past two decades (Walshe and Davies, 2013).

While there have been some good regional solutions to systems change, these vary across the country. The Beneficial Changes Network (set up by NHSE/I and built on realignment of existing capacity rather than new funding) is an example that seeks to extract learning at a national level to determine which innovations were successful, with widespread acceptance that these should be retained and scaled up. However, there remain bigger questions about how service evaluations should be prioritised, funded, resourced and conducted in order to better align with the needs of the system.
Approach

To understand the challenges in aligning rapid service evaluation with service needs, UCLPartners on behalf of the AHSN Network commissioned researchers at the London School of Hygiene & Tropical Medicine to perform a piece of rapid qualitative research with key stakeholders.

Eighteen independent semi-structured interviews were carried out with leaders from a range of health policy, research and service delivery organisations including NHSE/I, NIHR, AHSNs, regional medical directors, an applied research collaboration (ARC), universities, the National Institute for Health and Care Excellence (NICE), the Nuffield Trust, the Strategy Unit, and individuals with experience of carrying out regional service evaluation (see Appendix for further information about methods). Interviews sought to understand stakeholders' perspectives on two main questions:

1) How can rapid evidence reviews and rapid service evaluations be resourced and prioritised to inform meaningful service transformation in health and social care systems?

2) What are the facilitators, barriers and opportunities to performing rapid service evaluations, regionally and nationally, both during and outside of a rapidly developing emergency?

In December 2020, the London School of Hygiene & Tropical Medicine and UCLPartners presented the findings to an AHSN Network-sponsored roundtable of eminent leaders in health (including NHSE/I and NICE), research (including NIHR), the voluntary sector and independent thinktanks, to develop recommendations about how to prioritise and mobilise resources for rapid evaluation of services in the NHS and social care (for more details about how the roundtable was structured see Appendix).

A parallel evaluation has been undertaken by the AHSN Network, led by Oxford AHSN, that specifically addresses the insights of patients and frontline workers as key influencers of coproduction in evaluation and research. We therefore did not include these stakeholders as part of our research, and its findings should be considered alongside those presented here.
What we learned - Key findings from interviews and the roundtable

Can we develop a shared agreement on what a good evaluation is?

Evaluations are an important element of effective service change but there is no single framework that can be used for all purposes, to the disappointment of some. One stakeholder we spoke to reflected:

"The broader question is how do you respond to new areas of interest and is there a standard pre-packaged methodology that can rapidly be adopted off the shelf?"

The Bill and Melinda Gates Foundation define evaluation as 'the systematic, objective assessment of an ongoing or completed intervention, project, policy, program, or partnership. Evaluation is best used to answer questions about what actions work best to achieve outcomes, how and why they are or are not achieved, what the unintended consequences have been, and what needs to be adjusted to improve execution.'

The Medical Research Council has published a framework for performing complex evaluations (Craig et al., 2008; Moore et al., 2015), which includes a portfolio of methods, but the rigour with which they can be applied may be limited by time constraints facing service managers who need immediate answers. Other frameworks also exist, including the Treasury's Magenta and Green Books (HM Treasury, 2013, 2020) and various others that are often used in the international development sector. However, in practice, as one stakeholder we spoke to said:

"Other areas of practice and policy do this perhaps slightly more rigorously …in health…there is a slightly sporadic approach."

We identified an appetite among some key stakeholders to produce some kind of pre-prepared package of evaluation methods and designs that could be used in health and social care in a pandemic situation and beyond. In real life, however, this is not straightforward and evaluations exist on a continuum of scope and resource requirements (Lamont et al., 2016), ranging from:
• Those performed to monitor the progress of a project and adapt it – usually performed in-house
• Those that are intended for others to learn from and test further – which may require modest external research funding
• Those that are designed to influence a change in practice at a national or international level – which can require substantive research efforts that can last as long as 3-5 years.

Each requires different approaches, often involving a mix of methods, ranging from experimental designs, such as randomised trials, to complex adaptive system evaluations and process evaluations, depending on what is being evaluated and why.

Previous work by the Nuffield Trust has revealed that many evaluations of health and care services are poorly designed, fail to define clear research questions or evaluate the processes involved, and are often unable to achieve their desired outcomes (Kumpunen et al., 2019). This is partly attributable to dissonance between research, evaluation and practice, which can mean that the priorities and expectations of researchers and research commissioners are often misaligned, particularly with respect to what questions can reasonably be answered in short timescales and how likely the work is to be publishable. It is important also to recognise that whilst applied health researchers (who are in short supply) are required for substantive research evaluations, this level of expertise is not necessarily required for progress monitoring, which can be undertaken by those trained specifically as service evaluators.

Evaluations mean different things to different stakeholders so any evaluation must be planned in close consultation with those who will be affected by the resulting changes. This includes patient groups and carers as well as frontline health and care workers. It is also important to take account of stakeholder readiness to receive information, ensuring clarity of intent, and whether commissioners want to know if something that they have invested in does not actually work. It is crucial to ensure that commissioners and funders value the results enough to prioritise their implementation.

We asked what a good evaluation needed to be:

Researchers said:
• Independent
• Well designed
• Assess: impact, value, timing and process
• Generalisable
• Publishable (to satisfy research funders)

Commissioners said:
• Rapid
• Flexible
• Focused
• Easy to read
• Sufficiently well-resourced, including specialist evaluators
• Able to inform service change
It is important to note that not every innovation necessarily needs rigorous evaluation, and a variety of approaches exist to prioritise what to evaluate and what not to (McGill et al., 2015). In general, it is, however, ill-advised to scale up any major innovation without rigorously evaluating implementation, although this does happen, with a recent and somewhat controversial example being the national roll-out of mass repeated rapid antigen tests for COVID-19.

In times of crisis, where decisions must be taken quickly and evidence from rigorous evaluations is lacking, decisions to scale up interventions may be taken where it seems likely, a priori, that benefits to patients and the workforce will exceed the harms and where the intervention is backed by theory, has plausible mechanisms and justifies the opportunity cost. During the pandemic many of the innovations that were rolled out were already in the pipeline, with some benefiting from an existing evidence base. Most were presumed to have few unforeseen harms and, under the extreme circumstances, it was considered to be better to implement them than not to. Nevertheless, an evaluation of the costs, benefits and potential harms should be undertaken at the first opportunity.

The Beneficial Changes Programme is led by NHSE/I and seeks to catalogue local innovations that have been successfully implemented during the pandemic in order to identify those that can be scaled up nationally in partnership with AHSNs and ARCs. The programme's success will ultimately depend on the ability of service providers to evaluate the innovations they have implemented and compare them in different settings, accounting for differences in context.

For many commissioners and policy-makers, relevance is more desirable than academic rigour (Petticrew, Chalabi and Jones, 2012). However, poorly designed studies could cause harm and do not help identify where quality improvement or change is needed. For these reasons, a rigid evaluation framework is challenging, but a package of options may well provide credibility for the different methodological approaches adopted, matched to the topic in question.
What resources are available for service evaluations and what are the roles of national bodies?

Those interviewed were unable to identify any formal mechanism whereby NHS or care organisations could articulate their needs for rapid research and evaluation, at national or regional level, during the first wave, or any organisation with a formal role in conducting such work. Some regions have created their own solutions to this, which are outlined in the case studies later in this report.

What happened nationally?

The pandemic plans included pre-prepared frameworks for clinical trials, including treatments and vaccines. Research priorities were determined early in the pandemic by the Government Office for Science and SAGE, feeding into national funding calls by UKRI and NIHR. However, rapid service evaluations did not receive dedicated funding.

One participant said:

"There were some great examples of pre-specified work around pandemic response that worked quite well….having pre-specified teams was interesting. I don't know if it is a re-usable model more generally….but there was some really good practice from the pre-prepared pandemic response plans."

The national process for prioritising rapid research and evaluation needs is based in NIHR, which funds two specialised national rapid evaluation teams, the NIHR BRACE Rapid Evaluation Centre and the Rapid Service Evaluation Team (RSET). These teams reported having limited capacity but the ability to perform rapid, high-quality and generalisable research to inform national policy, a recent example being remote home monitoring of COVID-19 patients (Vindrola et al., 2020). Priorities are established in a horizon-scanning process that includes multiple stakeholders.

Other organisations also fund and undertake rapid evaluations, including the Health Foundation. NHSE has also supported the scale-up of the COVID-19 Oximetry @home programme, drawing on several regional service evaluations in collaboration with the AHSN Network (West of England Academic Health Science Network, 2020).

Earlier examples of national evaluations within the NHS include the 'New Care Models Programme', commissioned by NHSE and the NIHR Policy Research Programme and undertaken by various universities and the Health Foundation (Operational Research and Evaluation Unit NHS England, 2016; Checkland et al., 2019; Morciano et al., 2020). This evaluation of NHS vanguards played a key role in informing the NHS Five Year Forward View and is a good example of one of the ways in which national evaluations can support innovation in the NHS, but large-scale evaluations such as this can take time to set up and need to allow time to elapse in order to make valid pre/post comparisons.

More recently, an internal evaluation hub within NHSE/I has been established to enable collaboration across different teams to share knowledge, tools and experience. A wider evaluation community, the About Applied Evaluation Community of Practice, supports shared learning by bringing together people working on applied evaluation across the health and social care system.
What happened regionally?

We were unable to identify where the responsibility or funding to perform rapid evaluations of complex interventions was intended to come from regionally. There was a particular absence of direction in relation to social care, and in the largely private care home sector.

Many commented on the Applied Research Collaborations (ARCs), which are funded by NIHR, as best positioned to lead this work as they are designed to respond to local needs. When asked what recommendations they would make to NHSE/I and NIHR to develop frameworks for health and care leaders to articulate their rapid evaluation needs, one person told us:

"Work more closely with the ARCs because they have their networks. That would be the strongest message."

Although ARCs individually and collectively carried out local evaluations during the pandemic, they are mainly funded to carry out research. Rapid service evaluations may not result in publishable work, presenting a challenge for researchers who are normally judged and funded on their publication record both by the NIHR and their hosting higher education institutes.

The ARCs differ regionally in terms of the research themes and priorities they are funded to fulfil and how that funding is allocated in advance. All have some kind of responsive function to address service providers' needs, but the degree to which rapid service evaluations feature in this work varies. There are examples of outstanding individual leadership in supporting service evaluation at regional level amongst single ARCs and groupings of ARCs formed through shared interest, but these are not co-ordinated by a national response.

Some interviewees also gave examples of rapid evaluations being undertaken by the Academic Health Science Networks (AHSNs). Manchester and Yorkshire and Humber AHSNs, for example, pooled funding from multiple local partner organisations and regional budgets to fund rapid service evaluations. These examples of linkages across local geographies were seen as extremely positive, although it was noted that the AHSN boundaries do not always align with NHS regions. In terms of funding, AHSNs are commissioned mainly to support regional uptake and spread of innovations at pace and scale rather than to evaluate innovations (Ferlie et al., 2017).

What was clear was that the NHS was not viewed as an entity that systematically funds rapid service evaluations or training of staff to conduct them, instead seeing the NIHR as holding that function, and as a result the response has been inconsistent. Those organisations that innovated and performed rapid service evaluations felt that they did so in spite of the system and not because of it.
How can we align research infrastructure with the NHS and agree a strategic approach at national, regional and local level?

Barriers to rapid evaluation

Many stakeholders identified barriers to performing rapid evaluations, including a shortage of health services researchers and evaluators and a lack of funding for timely applied research, such as that using routine data.

It was clear that there is a range of organisations occupying the service evaluation landscape, including universities, consultancies, thinktanks, trusts, Public Health England and others, but there is often inadequate collaboration or coordination among them. One potential solution has been to view resources as 'capabilities', scanning across the system to identify what skills and contributions could be accessed in different ways, including performance of rapid reviews of evidence, which struggle to acquire resources and are frequently duplicated by different groups. For example, many third sector organisations have suffered large budget cuts but can contribute research and evaluation resources, including support for patient and user engagement and, in some cases, skilled evaluators. It was repeatedly emphasised that, when designing evaluations rapidly, the contribution of patients and staff should not be ignored.

One participant told us:

"I'd like to get to a point where we are not just thinking about what we did in an emergency but about how we create adaptable, responsive and flexible systems going forward."
Facilitators of rapid evaluation

Relationships, political will, existing capacity and pre-established programmes of work were identified as levers for resourcing and facilitating innovation and rapid evaluations.

We heard that it was much easier to resource and initiate rapid evaluations where there were pre-existing relationships between researchers and service providers. Many were based on previous personal connections and collaborations, while others involved creation of new formal and informal networks across boundaries. Areas that already had applied health researchers embedded in trusts were also more able to launch new evaluations more quickly.

A particular theme was that many programmes of work, including remote triage, social prescribing and virtual wards, were based on previously defined programmes of research, where it was easier to overcome immediate barriers (such as data access) and move to scale-up quickly.

Our discussions highlighted the importance of ensuring that this process is as inclusive as possible, drawing on as many stakeholders as possible in establishing the goals of evaluations, noting that wide engagement is essential for subsequent dissemination.

One participant commented that:

"Perhaps one way forward is to ensure we have the right network at national level including health, social care, voluntary sector and people with lived experience….I'm sure many good networks were happening before but the data-sharing and shared purpose through COVID-19 have definitely thrown people together in a way not seen before leading to new partnerships and relationships."

Allied to this was the call for a joined-up approach linking those involved in evaluation and implementation across organisations to agree priorities. There were also suggestions that a single repository might be created where regions could signal their research needs and intentions; this could facilitate pooling of resources and avoid duplication (see Recommendations).

Initiatives that were backed by political will were more likely to attract resources. One controversial example was the prioritisation, accompanied by direct government funding, of the University of Liverpool's simultaneous evaluation and scale-up of 'mass' testing for COVID-19 using rapid tests.

Participants highlighted the importance of developing early plans to disseminate and embed evaluation findings within wider practice, co-ordinating the cycle of continuous improvement and innovation, underpinned by research and evaluation.

Reduced restrictions on accessing and sharing data played a core role in this, and one participant said:

"Everything that we do in terms of research… and improvement relies on data and there is a golden opportunity to start to influence what data is collected. The Data Alliance Partnership has come about because of COVID-19…one of the biggest things that underpinned the ability to do things at pace was the data sharing across boundaries."
Case studies

The London Evaluation Cell

London Regional NHS has now convened an Evaluation Cell that incorporates the three London AHSNs and the three London ARCs, working with the regional clinical and transformation leads.

The cell has agreed a set of criteria with regional clinical and academic leaders, considering the scale of impact, generalisability, measurability etc. to prioritise regional evaluation plans. The cell is working to define and prioritise specific evaluation and research questions and to develop a regional learning health system programme using research grade evidence.

The London Evaluation Cell is chaired by an NHS chief executive and is a good example of collaborative engagement between key partners to perform rapid learning and evaluation. It benefits from regular meetings with clear actions and outputs that are aligned with NHS service needs.

Chief Executive of Health Innovation Manchester, Professor Ben Bridgewater, told us:

"Health Innovation Manchester incorporates Manchester's AHSN, academic health science centre (AHSC), ARC and integrated care system digital office. It also represents Greater Manchester's wider research and innovation system, which creates a powerful integrated cross-system perspective for research and evaluation, as well as transformation. This enabled us to collectively define trials and diagnostics to evaluate, and respond to the national priorities on research, along with local priorities for innovation and transformation from the city region NHS command and control structures.

Together, we harnessed regional funding and partners brought their own funding too. We supported some of these programmes using local funding at risk, and utilised national funding for others.

A key reflection from our experience is that technology has been at the heart of so much and 'digital' has rapidly become elevated right to the top of the agenda. Secondly, organisations must work as a network of capabilities (clinical, digital, academic, delivery, etc), not separate entities. Whilst national oversight and planning is essential in a pandemic, senior NHS leadership must also trust the local areas to know what they need to do and how to sustain it."
Evaluation Lead and Spread Fellow in the South West Academic Health Science Network, Sarah Robens, told us:

"In the South West, rapid learning from the pandemic has changed our usual approach to service evaluation. It allowed us to gather information quickly, in a way that is meaningful and useful to our stakeholders in real-time.

When COVID-19 hit, we immediately recognised the imperative of learning from the pandemic in a way that could help organisations and individuals to make decisions on a much shorter cycle than our usual evaluation work.

To ensure we harnessed the positive changes and new approaches, our focus was on how the response happened, rather than what happened. We engaged as many people as we could in a short space of time, using Twitter to share links to online questionnaires across our system, as well as maximising our existing networks and relationships. We built on information gathered through questionnaires, with in-depth interviews, and distilled information down to shareable summaries.

Building on this region-wide learning, we were commissioned across our three counties to undertake rapid learning work across each.

The model we developed set out eight conditions for rapid change, outlining the importance of organisational and cultural shifts in creating the environment for positive change. We have now adopted this as an assessment tool and framework going forward."
Recommendations

These headline recommendations, based upon stakeholder interviews and the subsequent roundtable discussion, are presented as learning from the COVID-19 pandemic relevant to the NHS and care reset period and the longer term, to ensure that resources required for rapid evaluation of changes to the NHS and care systems are prioritised, co-ordinated and appropriately applied to the benefit of patients and the wider public.

Resources and Infrastructure for rapid evaluations

NIHR should:
- Clarify expectations of different funded entities regarding the balance of research and evaluation
- Consider training more clinical academics to undertake applied research and evaluations, potentially with the support of the Royal Colleges
- Ensure that funding for rapid service evaluations reaches other relevant sectors, including social care, and ensure that the relevant expertise is made available to these partners
- Promote and fund work that shows demonstrable systems benefit, shifting the funding conditions and career progression of researchers to reward demonstrable systems benefit as well as academic outputs
- Provide opportunities for less established researchers to access funding for applied research, where they show the capacity to innovate to produce quality applied research (e.g. the OPENSAFELY group)
- Reduce the bureaucracy for regional service providers to access research funding for evaluating service innovations.

NHS regions should work with sustainability and transformation partnerships (STPs)/ICSs to:
- Ensure large-scale service change has an appropriate funding allocation to support a relevant evaluation programme to understand the benefits or otherwise of those programmes
- Align resources to create the infrastructure to engage with frontline organisations across all sectors in order to build learning systems through new ways of working
- Incentivise and build adaptable, responsible and flexible systems by building on existing expertise, e.g. the ARC and AHSN networks, to create an asset-based national network of regional structures for rapid service evaluations
- Build on existing partnerships, such as the Data Alliance Partnership, to permanently lower barriers to accessing data quickly to enable sharing of health and social care data across boundaries
- Prevent duplication of resources by performing some rapid reviews nationally or regionally rather than locally.

Co-ordinating function

NHSE/I and NIHR should:
- Consider creating a national database of evaluations that have been completed or are underway, as done for the global UKCDR COVID-19 research tracker or clinicaltrials.gov. This would have to be appropriately resourced and updated frequently
- Consider developing pre-prepared pandemic response evaluations mimicking the process for drug trials such as the RECOVERY trial
- Agree roles and responsibilities and key contributions of research users and agree a memorandum of understanding (e.g. PHE, NHSE, Health Education England (HEE))
- Consider approaches to co-ordinating expertise and resources across agencies, such as the Improvement Analytics Unit, the Health Services and Delivery Research Programmes, the rapid evaluation centres (RSET, BRACE), THIS Institute, Discovery, and the Q community
- Promote the ARCs and AHSNs to work as collaborators in service evaluation when appropriate, for example within the Beneficial Changes Network programme.

Signalling research and evaluation needs nationally and regionally

Research organisations (e.g. NIHR, MRC and the Economic and Social Research Council (ESRC)) and NHSE/I should:
- Develop and co-ordinate clear demand signalling processes for health and care providers to articulate their research and evaluation needs, incorporating local perspectives, and considering the priorities of the NHS Long Term Plan
- Promote a dialogue between the service and researchers to help frame the potential and the limitations of research and evaluation, to facilitate honest conversations about questions that can and cannot be answered with available data.

Evaluation design

ARC and AHSN Networks:
- Use the existing ARC/AHSN networks to build stronger relationships between sectors, potentially using the NHS Assembly as a forum to develop these relationships and ensuring that social care is not left behind
  o As part of this, charitable organisations could provide support in kind (e.g. data, support with patient participant involvement and potentially funding).

Evaluators should:
- Design evaluations to reflect the mixed economy of methodologies that can most appropriately address the desired outcomes, whether national, regional or otherwise
- Ensure that patients, service users and health and social care workers are central to all evaluations, being particularly mindful to include social care, where research infrastructure is less well developed
- Ensure that any comparisons of outcomes account for differences in populations, study design, capacity of recipients to benefit and key process outcomes
- Ensure, where feasible and appropriate, that evaluations have a strategic plan for disseminating the findings, with consideration of how they will contribute to any potential scale-up
- Ensure that process outcomes are monitored when evaluating any new intervention.

Implementers should:
- Ensure ongoing monitoring if an evaluation is incomplete before scaling up, including adverse effects with clear stopping rules
  o Specific examples included the use of non-invasive ventilation in COVID-19 respiratory failure patients and also the use of lateral flow devices for COVID-19 testing
- Align evaluations with systematic and co-ordinated data collection to complete the cycle back to implementation.
Authorship

Selina Rajan is an experienced Public Health Specialist and Honorary Research Fellow at the London School of Hygiene & Tropical Medicine. She also consults for the European Observatory on Health Systems and Policies, hosted by the WHO Regional Office for Europe, and has experience across health and care, performing local public health evaluations in social care settings, international policy analyses and global mental health policy and epidemiology research.

Professor Mike Roberts is Managing Director of UCLPartners. Mike is a Respiratory Physician, Professor of Medical Education for Clinical Practice at Queen Mary University of London, Deputy Director at NIHR Applied Research Collaboration North Thames and Senior Clinical Lead in the Clinical Quality and Effectiveness Department for the Royal College of Physicians of London.

Professor Martin McKee CBE is Medical Director and Professor of European Public Health at the London School of Hygiene & Tropical Medicine, and a member of the board of UCLPartners. He has published extensively on health and health systems in countries undergoing major economic, social, and political change, and on European health policy and law.

Katie Mantell is Director of Communications and Engagement at UCLPartners, where she leads on all aspects of communications and patient and public involvement. She has more than two decades' experience of working in the health and research sectors.

Strawberry.London managed the logistics, development and design of the report, roundtable and filming.
References

Checkland, K., Coleman, A., Billings, J., Macinnes, J., Mikelyte, R., Laverty, L. and Allen, P. (2019) National evaluation of the Vanguard new care models programme. Available at: https://www.necsu.nhs.uk/wp-content/uploads/2019/07/2019-07-InterimReportNCM-ManchesterUniversity.pdf (Accessed: 25 January 2021).

Craig, P., Dieppe, P., Macintyre, S., Michie, S., Nazareth, I. and Petticrew, M. (2008) 'Developing and evaluating complex interventions: The new Medical Research Council guidance', BMJ. doi: 10.1136/bmj.a1655.

Ferlie, E., Nicolini, D., Ledger, J., D'Andreta, D., Kravcenko, D. and de Pury, J. (2017) 'NHS top managers, knowledge exchange and leadership: the early development of Academic Health Science Networks – a mixed-methods study', Health Services and Delivery Research. doi: 10.3310/hsdr05170.

HM Treasury (2013) The Green Book: appraisal and evaluation in central government. Available at: https://www.gov.uk/government/publications/the-green-book-appraisal-and-evaluation-in-central-governent (Accessed: 15 January 2021).

HM Treasury (2020) The Magenta Book. Available at: https://www.gov.uk/government/publications/the-magenta-book (Accessed: 15 January 2021).

Kumpunen, S., Edwards, N., Georghiou, T. and Hughes, G. (2019) Evaluating integrated care: Why are evaluations not producing the results we expect? Available at: https://www.nuffieldtrust.org.uk/resource/evaluating-integrated-care-why-are-evaluations-not-producing-the-results-we-expect

Lamont, T., Barber, N., De Pury, J., Fulop, N., Garfield-Birkbeck, S., Lilford, R., Mear, L., Raine, R. and Fitzpatrick, R. (2016) 'New approaches to evaluating complex health and care systems', BMJ (Online). doi: 10.1136/bmj.i154.

McGill, E., Egan, M., Petticrew, M., Mountford, L., Milton, S., Whitehead, M. and Lock, K. (2015) 'Trading quality for relevance: Non-health decision-makers' use of evidence on the social determinants of health', BMJ Open, 5(4). doi: 10.1136/bmjopen-2014-007053.

Moore, G. F., Audrey, S., Barker, M., Bond, L., Bonell, C., Hardeman, W., Moore, L., O'Cathain, A., Tinati, T., Wight, D. and Baird, J. (2015) 'Process evaluation of complex interventions: Medical Research Council guidance', BMJ (Online). doi: 10.1136/bmj.h1258.

Morciano, M., Checkland, K., Billings, J., Coleman, A., Stokes, J., Tallack, C. and Sutton, M. (2020) 'New integrated care models in England associated with small reduction in hospital admissions in longer-term: A difference-in-differences analysis', Health Policy. doi: 10.1016/j.healthpol.2020.06.004.

NHS (2014) 'NHS five year forward view', NHS England. Available at: www.england.nhs.uk/wp-content/uploads/2014/10/5yfv-web.pdf.

NHS England (2020) Advice on how to establish a remote 'total triage' model in general practice using online consultations. Available at: https://www.england.nhs.uk/coronavirus/wp-content/uploads/sites/52/2020/03/C0098-total-triage-blueprint-september-2020-v3.pdf (Accessed: 15 January 2021).

Ogilvie, D., Cummins, S., Petticrew, M., White, M., Jones, A. and Wheeler, K. (2011) 'Assessing the evaluability of complex public health interventions: Five questions for researchers, funders, and policymakers', Milbank Quarterly, 89(2), pp. 206–225. doi: 10.1111/j.1468-0009.2011.00626.x.

Operational Research and Evaluation Unit NHS England (2016) New care models: evaluation strategy for new care model vanguards. Available at: https://www.england.nhs.uk/wp-content/uploads/2015/07/ncm-evaluation-strategy-may-2016.pdf (Accessed: 25 January 2021).

Petticrew, M., Chalabi, Z. and Jones, D. R. (2011) 'To RCT or not to RCT: Deciding when "more evidence is needed" for public health policy and practice', Journal of Epidemiology and Community Health, 66(5), pp. 391–396. doi: 10.1136/jech.2010.116483.

The Strategy Unit (2020a) COVID-19 Evidence - Helping you to keep up to date. Available at: https://www.strategyunitwm.nhs.uk/evidence-helping-you-keep-date (Accessed: 19 January 2021).

The Strategy Unit (2020b) How can we learn from changes in practice under COVID-19? A guide for health and care teams to learn from innovations during the pandemic. Available at: https://www.strategyunitwm.nhs.uk/publications/how-can-we-learn-changes-practice-under-covid-19-0

Vindrola-Padros, C., Sidhu, M. S., Georghiou, T., Sherlaw-Johnson, C., Singh, K. E., Tomini, S. M., Ellins, J., Morris, S. and Fulop, N. J. (2020) 'The implementation of remote home monitoring models during the COVID-19 pandemic in England', medRxiv 2020.11.12.20230318. doi: https://doi.org/10.1101/2020.11.12.20230318.

Walshe, K. and Davies, H. T. (2013) 'Health research, development and innovation in England from 1988 to 2013: from research production to knowledge mobilization', Journal of Health Services Research & Policy. doi: 10.1177/1355819613502011.

West of England Academic Health Science Network (2020) COVID Oximetry @home and COVID virtual wards. Available at: https://www.weahsn.net/our-work/transforming-services-and-systems/keeping-people-safe-during-and-after-covid-19/covid-oximetry-at-home/ (Accessed: 26 January 2021).
Appendix
Research methods for stakeholder analysis

This piece of qualitative research was undertaken by a researcher at the London School of Hygiene & Tropical Medicine who conducted 18 independent semi-structured key informant interviews with leaders across a range of health policy and service delivery organisations as well as selected applied health services researchers. Interviewees were selected purposively to include perspectives from a range of different types of organisations that fund, deliver or benefit from rapid research and evaluation, and snowball sampling was used to enable further investigation of evolving themes. Key organisations included NHSE/I, NIHR, AHSNs, regional medical directors, an individual ARC, universities, NICE, the Nuffield Trust, the Strategy Unit, and individuals with experience of carrying out regional service evaluation.

Results were analysed using thematic analysis, with key themes tested iteratively in subsequent interviews to produce a set of draft recommendations. Results were compiled into a set of slides outlining the key themes identified in the interviews, and these are available on request.

A roundtable on 10 December 2020 brought together 12 leaders from research, NHS and voluntary sector organisations, including national and regional NIHR organisations, NICE, NHSE/I (including leaders from the Accelerated Access Collaborative and Beneficial Changes Network programmes), Alzheimer's Society, the Health Foundation, AHSNs and the London School of Hygiene & Tropical Medicine. Participants discussed the results of the qualitative research and draft recommendations and considered three major challenges:

1) Delivering a shared aim: How to overcome the many different views about the role of evaluation in healthcare and work to deliver a common goal.

2) Funding and responsibilities: To establish what sources of funding already exist, to explore the roles of national bodies, including NIHR, and to improve transparency and understanding around how the NHS funds and resources rapid service evaluation.

3) Capability and aligning resources: To establish ways of creating a system to align research infrastructure with the NHS and to mutually agree a strategic approach at national, regional and ICS level, considering what this system could look like and how it could be implemented.
Semi-structured interview topic guide

Consent:

'I would like to start by explaining a little background to the research and to check that you are still happy to participate. This work is being carried out as part of the Academic Health Science Network Health and Care Reset campaign. It is led by UCLPartners in association with the London School of Hygiene & Tropical Medicine.

'During the first wave of COVID-19 infections, best practice guidance was not available so much was unknown about the virus and many health and social care providers innovated and adapted to provide the best possible care. In order to establish how successful these changes are, providers must rapidly collate emerging frontline clinical evidence and undertake service evaluations as quickly as possible, whilst awaiting traditional research findings, which tend to be considerably slower.

The purpose of this evaluation is to understand what went well and what didn't go so well during the first wave of COVID-19 in terms of prioritising local research needs, conducting rapid evidence reviews and rapidly evaluating frontline services during a rapidly evolving emergency. The hope is that this learning may also be applied to a future long-term relationship between the NHS and the health and social care system and, in the broadest sense, the research and evaluation provision.

Our aim is to establish how we can mobilise resources to support closer collaboration between academic partners and the health and social care system to rapidly evaluate frontline services to improve patient care in the longer term, using COVID-19 as an exemplar for change. These findings will inform policy recommendations that will be discussed at a roundtable event with health systems leaders in December, and will eventually be presented in a White Paper.'

'I would like to ask you some questions about your experiences of the systems and resources available for rapid service evaluations in health and social care. I will record this interview for my own records, but will only use it for the purposes of the evaluation and the analysis, and nothing you say will be directly attributed to you. We are also making a short video to showcase the highlights of the research at the roundtable in December and may ask some participants if they might be able to contribute a very short video clip to this a little later. We appreciate the pressures on your time and there is of course no obligation to do this.'
Questions:

1. From your experience, do service evaluations differ from traditional research and, if so, could you describe these differences to me?

2. Could you explain to me how research priorities were determined during the first wave nationally and how this differs regionally?

3. Is there a similar process for service evaluations and how are they then prioritised nationally and regionally?

4. Are those national and regional prioritisations linked in any way?

5. Were you involved personally in conducting or co-ordinating any rapid reviews or service evaluations during the first wave?
   a. Who was responsible for overseeing this and who was involved in implementing it? Who did what? What were the roles of NHSE/I and NIHR and other stakeholders, e.g. third sector, Wellcome, MRC?
   b. How was this taken forward and co-ordinated? Prompt: what links were there between academic and health and social care partners? (Process)
   c. What resources, if any, were available to facilitate these research and evaluation needs? (Financial, technical expertise, workforce)
   d. What resources would you have liked to have had but weren't easily available?

6. In your experience, what went well with respect to service evaluations and rapid evidence reviews during the first wave?
   a. What didn't go so well and what do you think we can learn for the future?
   b. Which bodies do you think should be responsible for overseeing service evaluations and rapid reviews in the NHS and who should be involved?

7. Who do you think should fund this?

8. And who do you think should actually carry out these evaluations?

9. What should a policy document recommend for a future relationship between the system and the research and evaluation community to ensure that the system's needs are met?
Brief overview of evaluation approaches

[Overview table of evaluation approaches not reproduced here.]

Source: Lamont, T., Barber, N., De Pury, J., Fulop, N., Garfield-Birkbeck, S., Lilford, R., Mear, L., Raine, R. and Fitzpatrick, R. (2016) 'New approaches to evaluating complex health and care systems', BMJ (Online).
However, it is also important to note that not every innovation necessarily needs rigorous evaluation, and there is a risk that the 'perfect' can become the enemy of the good in imposing such standards universally. The level of rigour required depends on the decisions that the evaluation is expected to inform, the plausibility of benefit, the risk of harm, the cost of the innovation and whether the benefits are large. If an innovation is highly likely to produce benefit, at low cost and low risk, then rigorous evaluation may be superfluous, although this decision must be taken carefully to avoid potential unintended harms. This has been the case for policies on minimum unit pricing of alcohol in Scotland and smoking bans, neither of which were evaluated before being implemented nationally. It is also important to note that evaluations of relatively new innovations are more easily and quickly assessed by process or formative evaluations than those that seek to capture outcomes (Ogilvie et al., 2011).

An example of a 'Network of Capabilities'

We spoke to the Strategy Unit in the West Midlands, which also described a new collaborative system that has been enabled by pooling local funds between numerous health, research and civic organisations. The Midlands Decision Support Network acts as a network of capabilities across a number of local footprints and includes training on evaluation methodologies as an integral part (The Strategy Unit, 2020a, 2020b).
For further information about The AHSN Network Health and Care Reset campaign please see: www.ahsnnetwork.com/reset Produced by Strawberry.London - February 2021