Is implementation science too inward-looking? Review of an #impscichat tweet chat

Alice Simon

Dr Alice Simon is a behavioural psychologist at King’s College London and leads NIHR CLAHRC South London’s short courses in improvement and implementation science.

Over the past two decades, the field of implementation science has grown rapidly. Policy makers, healthcare professionals and applied health researchers have increasingly recognised that a vast range of barriers has prevented research evidence from being incorporated into everyday clinical practice (How the NHS can translate research evidence into better care, Health Service Journal, 30 September 2013).

Implementation science aims to address this problem by ‘scientifically studying methods to promote the systematic uptake of research findings and other evidence-based practices into routine practice’ (Eccles and Mittman, Implementation Science, 2006).

Researchers have dedicated themselves to exploring how healthcare systems work, with a view to understanding how best to implement changes within those systems. On 29 June 2017, researchers were invited to join a tweet chat to discuss whether implementation science is too ‘inward-looking’, with researchers focusing on theories and frameworks that lack ‘real-world’ impact.

The tweet chat #impscichat was hosted by the NIHR Collaboration for Leadership in Applied Health Research and Care South London @CLAHRC_SL. You can read a summary of the discussion in this Storify. The chat posed four key questions, with the discussions summarised below:

Q1: Could anyone give a summary of what #ImpSci is in 140 characters?

The first question asked participants to define the term ‘implementation science’, and there were some useful and concise definitions. For example, James Harris (a midwife and implementation researcher at King’s College London; @james_harris) succinctly defined implementation science as ‘the use of systematic methods and theories to bring about useful change within healthcare settings’.

However, this question also stimulated discussion about whether using the term ‘implementation science’ is a help or a hindrance. Trisha Greenhalgh (Professor of Primary Care Health Sciences at the University of Oxford; @trishgreenhalgh) argued that implementation science ‘is a contradiction in terms, since implementation is not a science at all, it’s a science-informed, situated practice’. Her argument spoke to the wider issue of whether ‘implementation science is too inward looking’ by emphasising the need to apply any ‘implementation science’ ‘as a situated practice, not as a universal set of rules’.

The discussion of how to practise implementation science led neatly into the next two questions, about how to meet the needs of service leaders and policy makers:

Q2: What are the needs of health service leaders and policy makers?

Q3: At its best, how can #ImpSci help health service leaders and policy makers?

John Øvretveit (professor of Healthcare Improvement, Implementation and Evaluation at the Karolinska Institute; @jovret1) noted that ‘Does it work?’ is not the only question that practical improvers have of those investigating quality improvements. They also want to know, ‘Will it work here? What conditions do we need to implement and sustain it? Can we adapt it?’ This was supported by Trisha Greenhalgh’s point that implementation of any new healthcare practice is ‘relentlessly contextual’.

Liz Hoffman (journal development manager at BioMed Central; @LizHoffmanbmc) cited a useful paper (Cairney & Oliver 2017) that explored the gap between scientists’ and policy makers’ views on the importance of evidence-based policy. The paper raises an ethical consideration for scientists: how far should they go to persuade policy makers to act on their evidence? The authors point out that, to be effective, scientists must address and reduce ambiguity and complexity in their discussions with policy makers. From the scientists’ point of view, this presents a major challenge.

As Nick Sevdalis (professor of Implementation Science and Patient Safety and Director of the Centre for Implementation Science at King’s College London; @NickSevdalis) pointed out in the tweet chat, ‘There is no such thing as a simple intervention. The problem is that scientists know that changing behaviour and systems is never simple, but that to be persuasive a simple story needs to be told.’

The tweet chat discussion explored the use of persuasive storytelling as a method for helping both policy makers and healthcare implementers learn from the experience of others. A simple story initially engages the audience; however, to implement any changes effectively, a network of communication needs to develop that explores and critiques the detail of each implementation story over time. In this way, a complex intervention can be successfully adapted to the local situation without losing the key ingredients that made it work in the first place.

Kirsty Loudon (research impact fellow at the University of Stirling; @KirstyLoudon) pointed the group to a useful webinar on adaptation versus fidelity by Jess Power (PhD researcher and physiotherapist at Trinity College Dublin; @jesspower13), which helpfully summarises these issues: www.hrb-tmrn.ie/online-material-info/works-next-trial-results-adapted-fit-local-settings-webinar/

Q4: Is it true to say #ImpSci has focused, so far, on establishing core principles and terms, at the expense of reaching out?

This question was posed in response to the rapidly expanding academic literature on theories, models and frameworks in implementation science (see, for example, Nilsen 2015). Critics have suggested that this has simply led to confusion among practitioners, who are unable to select the most appropriate theory to apply to any given situation (Bhattacharyya et al 2006).

Nick Sevdalis argued that, in a ‘young’ field of enquiry, this kind of field-defining activity is to be expected. Proposing a range of theories, models and frameworks provides testable options; many models may ultimately be discarded in favour of those that prove to perform best. Core principles and terms are also needed to help build the field’s identity. Ultimately, this question brought the tweet chat full circle.

Trisha Greenhalgh argued that the focus on core principles and terms has indeed come at the expense of reaching out, because the term ‘implementation science’ is a misnomer that has led researchers to explore in the wrong direction. She suggested that ‘implementation practice’ was more appropriate.

Not all participants agreed. Kirsty Loudon pointed out that ‘practice sounds more like examples whereas science pulls together everything that needs to be put into practice’.

What next?

It was clear that the participants enjoyed this debate, and CLAHRC South London offered to host another tweet chat for implementation scientists in the future. Alexandra Ziemann (postdoctoral research fellow in implementation science at King’s College London; @_aziemann) suggested another topic: ‘How health implementation scientists could learn from other implementation science sectors, such as education, public administration, or business management’. Ideas and suggestions for topics are welcomed by @CLAHRC_SL.

The Implementation Science Masterclass will be held on Tuesday 17 and Wednesday 18 July 2018.


 

References

Bhattacharyya O et al (2006) Designing theoretically-informed implementation interventions: fine in theory, but evidence of effectiveness in practice is needed. Implementation Science 1:5.

Cairney P and Oliver K (2017) Evidence-based policymaking is not like evidence-based medicine, so how far should you go to bridge the divide between evidence and policy? Health Research Policy and Systems 15:35. DOI: 10.1186/s12961-017-0192-x

Nilsen P (2015) Making sense of implementation theories, models and frameworks. Implementation Science 10:53. DOI: 10.1186/s13012-015-0242-0

 

 

 
