Measuring Quality: Using Clinical Quality Indicators, Metrics and Dashboards to Measure Quality in Your Organisation

John Varlow, Director of Information Analysis, Health and Social Care Information Centre


Page 1:


Measuring Quality: Using Clinical Quality Indicators, Metrics and Dashboards to Measure Quality in Your Organisation

John Varlow, Director of Information Analysis, Health and Social Care Information Centre

Page 2:

Environment and Context

• System-wide changes:
  – A new system for commissioning, delivering, and accounting for health, public health and social care outcomes
  – New structures and responsibilities between NHS England, Public Health England, the Health and Social Care Information Centre (HSCIC), the Department of Health (DH) and Government
  – Attempt at genuine devolution to local organisations
  – New regulatory functions for statutory bodies

Page 3:

The Quality Framework

NHS OUTCOMES FRAMEWORK
• Domain 1 – Preventing people from dying prematurely
• Domain 2 – Enhancing the quality of life for people with LTCs
• Domain 3 – Recovery from episodes of ill health / injury
• Domain 4 – Ensuring a positive patient experience
• Domain 5 – Safe environment free from avoidable harm

Supporting the framework:
• NICE Quality Standards (building a library of approx. 150 over 5 years)
• Commissioning Outcomes Framework, now the Clinical Commissioning Group Outcomes Indicator Set
• Commissioning guidance
• Provider payment mechanisms: tariff, standard contract, CQUIN, QOF
• Commissioning / contracting: NHS Commissioning Board – certain specialist services and primary care; GP Consortia – all other services
• A duty of quality runs throughout the system

Page 4:

Outcomes Frameworks

• NHS Outcomes Framework (NHSOF)
• Clinical Commissioning Group Outcomes Indicator Set (CCGOIS)
• Public Health Outcomes Framework (PHOF)
• Adult Social Care Outcomes Framework (ASCOF)

Page 5:

Indicators in Context: What Can We Say?

The HSCIC’s website lists over 3,000 indicators, alongside other products, yet covers only part of the full range of clinical care. There are many more indicators in use locally. This illustrates the challenge we face in monitoring clinical quality.

[Chart: emergency admissions to hospital for acute conditions usually managed in primary care, all ages, England 2007/08; indirectly age-standardised rate per 100,000 with 95% confidence intervals, by ONS area group / local authority. Source: NHS IC Compendium, Crown Copyright]

Page 6:

The Move to Monitoring Outcomes

• Accountability shift from what is done, to what is achieved with available resources, demonstrating continuing improvement

• In the absence of evidence based standards for some services, comparative data, for example stroke deaths, may show that outcomes are less than optimal

• Evidence-based process indicators, for example those listed in NICE Quality Standards and the Outcomes Frameworks, act as a proxy for outcomes

• An intervention now may have an impact years / decades in the future; an outcome now may reflect interventions going back years / decades

• Attribution and apportioning credit, hence accountability is likely to be difficult

Page 7:

What is a Metric?

A metric is a measure of a known attribute:

• e.g. a speedometer in a car dashboard
• e.g. within clinical care, a blood pressure reading

Metrics, whether based on physical instruments or questionnaires, need rigorous testing and calibration, plus precision in use.

Page 8:

What is an Indicator?

An indicator describes how a measure is expected to be used to judge quality. It includes:

• clear statements about the intended goal / objective;
• whether it is expected to be used in isolation or in combination with other measures or indicators;
• any thresholds or standards which are expected to be applied.

e.g. a gauge showing whether speed is within legal limits in a car dashboard
e.g. within clinical care, the proportion of patients with controlled high blood pressure

An indicator may act as an alert to an issue that needs further investigation.

Page 9:

Indicator or Metric?

• Metric – number of emergency readmissions to an acute hospital trust following an appendectomy
• Indicator – rate of readmissions
• Consider the context; you may need to take into account:
  – whether the readmissions are avoidable
  – co-morbidities
  – whether a certain number are acceptable
  – casemix of patients
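The count-versus-rate distinction above can be sketched in code. This is an illustrative sketch, not an HSCIC method; the function name, the 5-per-100 threshold and the patient figures are all hypothetical.

```python
# Illustrative only: the raw count is the metric; the rate plus a
# threshold for judging it is the indicator. All figures hypothetical.

def readmission_indicator(readmissions: int, discharges: int,
                          threshold_per_100: float = 5.0) -> dict:
    """Turn a raw readmission count (metric) into a rate-based indicator."""
    rate = 100.0 * readmissions / discharges  # readmissions per 100 discharges
    return {
        "rate_per_100": round(rate, 2),
        "breaches_threshold": rate > threshold_per_100,
    }

# Hypothetical trust: 42 emergency readmissions in 1,180 appendectomy discharges
print(readmission_indicator(42, 1180))
# → {'rate_per_100': 3.56, 'breaches_threshold': False}
```

A real indicator would also risk-adjust for the contextual factors listed above (avoidability, co-morbidities, casemix) before any threshold is applied.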

Page 10:

Indicator Development

• Is the indicator rationale supported by evidence?

• Does the indicator relate to an aspect of clinical care or an outcome that is influenced by the actions of commissioners or providers?

• Has this aspect been identified as a priority?

• Can the indicator be developed so that it is measurable?

• Is there evidence of inappropriate variation in clinical care or outcomes?

• Could adoption of best practice significantly improve quality and outcomes?

• Is there scope for improvement?

Page 11:

Indicator Development

• Do you want/need to look at a single aspect of care or whole pathway?

• How will improvement be measured?

• Who is your intended audience?

• If you are comparing with other trusts are you comparing like with like?

• Do you need a simple or composite indicator?

• Provider or commissioner based?

• Longitudinal or cross-sectional?

• Selecting the number of indicators is not easy…

Page 12:

Deciding how many indicators to focus on

Single aspect (e.g. renal dialysis) versus whole pathway (e.g. obesity, uncontrolled high blood pressure, kidney disease, QOL, deaths)

Tension – too few may leave gaps and distort priorities, too many may overwhelm the organisation

Potential solution - hierarchies, with ability to drill down to detail, as necessary

Potential solution – menu, with ability to select those to be displayed in the dashboard

[Diagram: clinical quality stages – risk, disease / ill health, adverse events, quality of life, premature death – mapped against potential activities: avoiding risk, reducing risk, timely intervention, late intervention]
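The hierarchy-with-drill-down solution can be sketched as a simple nested structure; the indicator names below are hypothetical examples, not an assured national set.

```python
# Hypothetical indicator hierarchy: headline indicators at the top,
# with detail available on drill-down.
HIERARCHY = {
    "Kidney disease pathway": {
        "Prevalence of obesity": {},
        "Uncontrolled high blood pressure": {
            "BP recorded in last 12 months": {},
            "BP controlled to target": {},
        },
        "Deaths from kidney disease": {},
    },
}

def drill_down(tree: dict, depth: int = 0) -> list:
    """Flatten the hierarchy, indenting each level for display."""
    lines = []
    for name, children in tree.items():
        lines.append("  " * depth + name)
        lines.extend(drill_down(children, depth + 1))
    return lines

print("\n".join(drill_down(HIERARCHY)))
```

A dashboard “menu” is then just a filter over the same structure: display only the names the user has selected.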

Page 13:

Indicators: NICE Quality Standards

Information 5: Education and self-management

Page 14:

NHS Outcomes Framework

Page 15:

CCG Outcomes Indicator Set

Page 16:

Establishing Limits and Thresholds

• In the absence of evidence-based standards, it is important to establish a basis for judging quality and improvement

• The ‘National Average’ is not always the best marker as it combines good and poor quality

• It may be possible to arrive at some notion of ‘optimum’ based on best levels achieved elsewhere, for example cancer survival or emergency admissions in some parts of the country / other countries

• Dependent on clarity around purpose of indicator and audience e.g. clinician, patient, policy maker, manager, public etc.
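Judging a local rate against a “best achieved elsewhere” benchmark, rather than the national average, might look like the sketch below. It uses a crude normal approximation to a Poisson 95% confidence interval (assurance-grade work would use an exact or Byar’s method), and every figure is hypothetical.

```python
import math

def rate_with_ci(events: int, population: int, per: int = 100_000):
    """Crude rate per `per` population with an approximate 95% CI."""
    rate = per * events / population
    se = per * math.sqrt(events) / population  # normal approx to Poisson SE
    return rate, rate - 1.96 * se, rate + 1.96 * se

# Hypothetical local authority: 380 emergency admissions in a population
# of 110,000, judged against a benchmark of 250 per 100,000 achieved by
# the best-performing areas.
rate, lo, hi = rate_with_ci(380, 110_000)
significantly_above = lo > 250.0  # the whole CI sits above the benchmark
print(f"rate={rate:.1f} per 100,000, 95% CI ({lo:.1f}, {hi:.1f})")
```

Whether “above the benchmark” triggers action still depends on the purpose and audience of the indicator, as the slide notes.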

Page 17:

Indicator Assurance Process

• Hosted on behalf of the whole system
• Indicator Assurance Service
• Standard indicator assurance templates
• Methodology Review Group
• Independent Peer Review
• Indicator Assurance Process
• Indicator Governance Board
• National Library of Assured Indicators – repository

Page 18:

Indicator Assurance Process

Page 19:

Indicator Assurance Considerations

• Purpose of indicator

• Rationale, evidence based standard

• What is measured – numerator, denominator, construction, source of data, completeness of counts, quality of data

• How data are aggregated - type of analysis (direct/indirect standardisation), risk adjustment e.g. for age, gender, method of admission, diagnosis, procedure, co-morbidity etc. to compare ‘like’ with ‘like’

• Scientific validity – face, content, construct, criterion, predictive; validity for public, clinicians, performance

• Interpretation – identifying outliers, explaining observations

• Use – timeliness, gaming, costs, access, credibility, feasibility, usefulness

• Investigation and action – play of chance, artefacts (e.g. data quality), quality of care
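The numerator/denominator and indirect-standardisation points above can be illustrated with a toy calculation: apply reference age-specific rates to the local population to get an expected count, then compare observed with expected. The age bands and rates are hypothetical.

```python
# (age band, local population, locally observed events) – hypothetical
local = [("0-44", 60_000, 30), ("45-64", 30_000, 90), ("65+", 10_000, 180)]
# reference (e.g. national) age-specific event rates per person – hypothetical
ref_rates = {"0-44": 0.0004, "45-64": 0.0025, "65+": 0.0150}

observed = sum(events for _, _, events in local)                 # numerator
expected = sum(pop * ref_rates[band] for band, pop, _ in local)  # expected count
smr = observed / expected  # 1.0 = same as reference after age adjustment

print(f"observed={observed}, expected={expected:.0f}, ratio={smr:.2f}")
# → observed=300, expected=249, ratio=1.20
```

This is how comparing “like with like” works in principle; real risk adjustment may also cover gender, method of admission, diagnosis, procedure and co-morbidity, as listed above.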

Page 20:

Indicator Development and Assurance

• Skills and expertise from HSCIC and the wider system:
  – Methodologists
  – Epidemiologists
  – Statisticians
  – Subject Matter Experts
  – Informatics Specialists
  – Measurement Specialists
  – Clinicians and Patients

Page 21:

Dashboards

• “All that glitters is not gold” – Shakespeare, The Merchant of Venice
• “Simplify, simplify, simplify!” – Henry David Thoreau
• “Maximise the data-ink ratio” – Edward R Tufte, The Visual Display of Quantitative Information
• “Unless you know what you’re doing you’ll end up with a cluttered mess” – Stephen Few, Information Dashboard Design: The Effective Visual Communication of Data

Page 22:

Dashboards: 13 Common Mistakes

• Exceeding a single screen
• Supplying inadequate context
• Displaying excessive detail or precision
• Choosing deficient measures
• Choosing inappropriate visualisation
• Introducing meaningless variety
• Using poor design
• Encoding quantitative data inaccurately
• Arranging the data poorly
• Highlighting important data ineffectively
• Cluttering with useless decoration
• Misusing colour
• Unattractive display

Page 23:

Clinical Quality Dashboards: Maternity

Page 24:

Accident and Emergency Dashboard

The Rotherham NHS Foundation Trust Accident & Emergency Department Clinical Quality Indicators

Overview

• 95% of patients who needed admission to hospital waited under 389 minutes from arrival to departure.

In March, 95% of patients waited under 43 minutes for an initial assessment; the median time to assessment was 9 minutes. 6,793 patients attended the A&E department in March, which is 593 more than expected based on an average attendance of 200 per day. On average, patients waited 74 minutes to see a Doctor or Nurse Practitioner; 95% of patients waited less than 4 hours from arrival to departure, and of those admitted to hospital, 95% waited less than 6 hours 29 minutes. Several factors influence how long a patient waits before being admitted to a bed on a ward, one being the availability of a bed. This does not mean that we do not have enough beds to care for our patients; the hospital may be busier than normal, or providing a safe and secure discharge for patients may take additional time. Of those patients not admitted to hospital, 95% waited less than 3 hours 49 minutes. Rotherham Hospital is committed to improving the service provided to the people of Rotherham and always welcomes your views and ideas.
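The “95% of patients waited under …” figures are 95th percentiles of per-patient waiting times. A sketch of how such a figure could be derived (nearest-rank method; the sample waits are hypothetical):

```python
import math

def percentile_95(waits: list) -> float:
    """Nearest-rank 95th percentile: the time 95% of patients waited under."""
    ordered = sorted(waits)
    rank = max(1, math.ceil(0.95 * len(ordered)))  # 1-based nearest rank
    return ordered[rank - 1]

# 20 hypothetical admitted patients, minutes from arrival to departure
waits = [180, 210, 195, 240, 260, 300, 150, 220, 330, 280,
         310, 200, 170, 389, 250, 230, 270, 205, 290, 160]
print(percentile_95(waits))  # → 330 (the 19th of the 20 ordered waits)
```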

Summary of A/E Performance - March 2012

This dashboard presents a comprehensive and balanced view of the care delivered by our A&E department, and reflects the experience and safety of our patients and the effectiveness of the care they receive. These indicators will support patient expectations of high quality A&E services and allow our department to demonstrate our ambition to deliver consistently excellent services which are continuously improving.

• 4.58% of attendances this month left the department before being seen
• 4% of attendances this month were unplanned re-attendances

• Data will be available from an audit facilitated by the Royal College of Emergency Medicine. The date of the audit has yet to be confirmed.

• 44.82% of patients with cellulitis and 82.35% of patients with deep vein thrombosis attending A&E are admitted to hospital.

• 95% of patients waited under 43 minutes from arrival to full initial assessment (including all vital signs) for patients arriving by ambulance
• On average, patients waited 74 minutes from arrival to see a Doctor or Nurse Practitioner
• 95% of patients waited under 240 minutes from arrival to departure
• 95% of patients not requiring admission to hospital waited under 229 minutes from arrival to departure

[Pathway diagram: patient arrives → initial assessment (<15 mins) → treatment (<60 mins) → total time in A&E (<240 mins), with left without being seen (<5%) and re-attendance (<5%) tracked alongside]

Legend:
• Successfully meets performance threshold
• Does not meet threshold

Feedback from patients, carers and staff relating to experience is important to improve the service. Monthly surveys are undertaken in the department and the results are shared with the public. The results for March show that 80% of patients say they were treated with privacy and dignity when being examined or treated, and 88% of patients say they had their health problems explained to them in a way that they understood.

Other dashboard sections: Consultant Sign-off; Ambulatory Care; Service Experience

Page 25:

In Conclusion

• There are a lot of indicators out there
• Ultimate choice depends on whether they meet the criteria for good indicators
• National indicators for NHSOF and CCGOIS – assured and tested
• Local indicator development based on local priorities
• Consider triggers and alerts
• Uses for Board reporting and assurance
• Dashboards can be used to support delivery of safe and effective care – but only if they are well designed
• Integrating local data flows – instantaneous reporting