
September 13, 2017

SIMS Data Use

2017 PEPFAR Data and Systems Applied Learning Summit


Welcome & Introductions


Agenda

1. SIMS data lifecycle
2. SIMS data sources and reference documents
3. Practical experience with data analysis
4. Summary and conclusion

Session Learning Objectives

At the end of today's session, participants will be able to:

• Understand the SIMS data lifecycle
• Identify sources of SIMS data and reference documents
• Evaluate PEPFAR results in the context of service quality using SIMS data, both in Panorama and through ICPI tools

SIMS Background and Data Lifecycle


Purpose of SIMS

• Increase the impact of PEPFAR programs by introducing a standardized approach to monitoring for program quality and performance

• Primary objectives:
  • Monitor capacity to provide high-quality HIV/AIDS services in all PEPFAR-supported program areas
  • Provide data for regional, national, and global programmatic decision making
  • Facilitate use of these data and quality outcomes to improve services

SIMS Assessment Tools

(Diagram) There are three tool types, each composed of sets of Core Essential Elements (CEEs): Facility, Community, and Above Site. An assessment tool is 'built' from a coversheet plus CEE scoring sheets: the coversheet captures information on the IM/site and drives Set and CEE selection to tailor the assessment tool.

Assessment Types

SIMS Assessment Visit Type | Conducted by | CEEs to be Assessed
Initial | USG | All relevant Sets and CEEs
Comprehensive Follow-Up (1) | USG | All relevant Sets and CEEs
Focused Follow-Up USG (2) | USG | Only CEEs which previously scored Red/Yellow
Focused Follow-Up IP | Implementing Partner (3) | Only CEEs which previously scored Red/Yellow

(1) Comprehensive Follow-Up assessments occur annually at High Volume sites (Facility and Community) and National Entities (Above-Site).
(2) USG Focused Follow-Ups are conducted within six months for any assessment that did not pass the 25/50 Rule.*
(3) The Implementing Partner (IP) should coordinate with the USG Activity Manager to review and agree on rescored CEEs. USG staff are responsible for entering the results from the rescored CEEs for all Follow-Up assessments.

*25/50 Rule: any assessment that yields scores of >25% Red OR >50% Red + Yellow CEEs.

What is the 25/50 rule?

A = number of CEEs that scored RED
B = number of CEEs that scored RED or YELLOW
C = total number of CEEs that were assessed

To determine whether the 25/50 rule applies:

• Calculate % Red = A / C
• Calculate % Red + Yellow = B / C

The 25/50 Rule applies when more than 25% of CEEs are Red OR more than 50% are Red + Yellow.
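The arithmetic is simple enough to express directly. A minimal sketch, assuming the A/B/C counts defined above are available per assessment (the function and variable names are illustrative, not official SIMS terminology):

```python
def fails_25_50_rule(n_red: int, n_red_or_yellow: int, n_assessed: int) -> bool:
    """Return True if an assessment triggers the 25/50 Rule.

    n_red            -- A: number of CEEs that scored RED
    n_red_or_yellow  -- B: number of CEEs that scored RED or YELLOW
    n_assessed       -- C: total number of CEEs assessed
    """
    pct_red = n_red / n_assessed
    pct_red_yellow = n_red_or_yellow / n_assessed
    # Rule triggers when >25% of CEEs are Red OR >50% are Red + Yellow.
    return pct_red > 0.25 or pct_red_yellow > 0.50

# 3 Red and 6 Red-or-Yellow out of 20 CEEs: 15% Red, 30% Red+Yellow
print(fails_25_50_rule(3, 6, 20))   # False: rule does not apply
print(fails_25_50_rule(6, 11, 20))  # True: 30% Red exceeds the 25% threshold
```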

CEE Title

Example: F_1.18 [008] Injection Safety [ALL FACILITIES]

• Tool type: F (Facility; C = Community, A = Above Site)
• Set # and CEE #: 1.18
• Unique ID: [008]
• CEE name: Injection Safety
• Set name: [ALL FACILITIES]
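Where CEE titles are handled programmatically, the layout above can be parsed mechanically. A hedged sketch, assuming titles follow exactly the pattern shown (the group names are my own descriptive labels, not official SIMS field names):

```python
import re

# Illustrative pattern for a CEE title as shown above, e.g.
# "F_1.18 [008] Injection Safety [ALL FACILITIES]"
CEE_TITLE = re.compile(
    r"(?P<tool>[FCA])_"           # tool type: Facility, Community, Above Site
    r"(?P<set_num>\d+)\."         # set number
    r"(?P<cee_num>\d+)\s+"        # CEE number within the set
    r"\[(?P<unique_id>\d+)\]\s+"  # unique ID in brackets
    r"(?P<cee_name>.+?)\s+"       # CEE name
    r"\[(?P<set_name>.+)\]"       # set name in brackets
)

m = CEE_TITLE.match("F_1.18 [008] Injection Safety [ALL FACILITIES]")
if m:
    print(m.groupdict())
# {'tool': 'F', 'set_num': '1', 'cee_num': '18', 'unique_id': '008',
#  'cee_name': 'Injection Safety', 'set_name': 'ALL FACILITIES'}
```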

SIMS Life Cycle

The SIMS Coordination Team supports every phase of the cycle:

• Assessment Prep: confirm with partner; create coversheet and select Sets; prep Go Packs; logistics; assign assessors
• Conduct Assessment: travel; record data (paper & tablet); onsite checks/dashboard; partner communication
• Data Entry, Review and Cleaning: data entry/import (paper or tablet); data quality checks; error resolution
• Remediation & Follow-up: data analysis and use for program improvement and partner management; partner communication; corrective action plan; remediation
• Planning: fiscal/quarterly projections; partner communication; new assessor certification

SIMS Data System Components

Data Entry Application (tablet or offline laptop) → SIMS Assessment Results Data Management System (HQ database) → OGAC Data Exchange Interface → DATIM

ICPI pulls data from DATIM to populate:
• Panorama
• Quarterly Dashboards
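DATIM is a DHIS2-based system, so purely for orientation, here is what a programmatic pull against a generic DHIS2 analytics endpoint could look like. The base URL, dimension UIDs, and credentials below are placeholders, and this is not the documented SIMS extract path; ICPI products (Panorama, Quarterly Dashboards) are the intended route for analysts:

```python
import requests

# Hedged sketch only: a generic DHIS2-style analytics query.
# Endpoint, UIDs, and credentials are placeholders, not real SIMS identifiers.
BASE = "https://www.datim.org/api/analytics.json"  # illustrative endpoint
params = {
    "dimension": [
        "dx:DATA_ELEMENT_UID",  # placeholder: a data element of interest
        "pe:2017Q3",            # period, in DHIS2 quarterly format
        "ou:ORG_UNIT_UID",      # placeholder: an organisation unit
    ],
}

resp = requests.get(BASE, params=params, auth=("user", "password"))
resp.raise_for_status()
rows = resp.json().get("rows", [])
print(f"Retrieved {len(rows)} analytics rows")
```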

SIMS E-Learning Series

Training topics include: Orientation to SIMS Assessment Tools, Implementation Guide, Assessment Procedures, and Remediation.

The series serves as a '101'-level training for anyone who wants to increase their general knowledge of SIMS, and the e-learning is one of the required steps in the training process for all new SIMS Assessors.

URL: http://media.go2itech.org/sims/elearning.html

SIMS Data Sources and Reference Documents

Data Sources
• SIMS Quarterly Dashboard
• Panorama
• Agency Database

Reference Documents
• SIMS:MER Linkage Reference Table

SIMS Quarterly Dashboard


1. Produced every quarter.
2. Includes data from FY16 Q2 through the present.
3. Allows a quick overview by:
   • Assessment type (Initial, Comprehensive Follow-Up, Focused Follow-Up)
   • Tool type
   • Set
   • Individual CEEs
4. Allows a more thorough review of:
   • Individual assessment results
   • Aggregate results (IM, site, district)
5. Pivot tables and graphs can be manipulated to meet needs and interests in partner management and program improvement; see the sketch after this list.
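For analysts working outside Excel, the same kind of aggregation the dashboard pivots perform can be sketched in pandas. The column names and values below are assumptions for illustration, not the dashboard's actual schema:

```python
import pandas as pd

# Toy CEE-level scoring data; field names are illustrative assumptions.
df = pd.DataFrame({
    "assessment_type": ["Initial", "Initial", "Comprehensive Follow-Up",
                        "Focused Follow-Up", "Initial", "Focused Follow-Up"],
    "set_name": ["ALL FACILITIES", "C&T", "ALL FACILITIES",
                 "C&T", "KP", "ALL FACILITIES"],
    "score": ["Red", "Yellow", "Green", "Red", "Green", "Yellow"],
})

# Count CEE scores by assessment type and color, roughly what the
# "SIMS Scores by CEE" pivot does at a higher level of aggregation.
pivot = (df.pivot_table(index="assessment_type", columns="score",
                        aggfunc="size", fill_value=0)
           .reindex(columns=["Green", "Yellow", "Red"], fill_value=0))
print(pivot)
```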

The dashboard walkthrough covered the following tabs and views:

• Table of Contents, Overview, Data Considerations
• SIMS Implementation (table and graph)
• SIMS Scores by Set
• SIMS Scores by CEE (the totals shown must match across views)
• Individual Assessments (pivot)
• SIMS CEE Scores (graph and pivot; click + to expand rows)
• SIMS CEE Scores by Type (graph)
• Dataset for GIS

Panorama

Panorama views covered:

• Scores Scaled to Total Number (left-click any bar to see the score breakdown)
• Scores Scaled to 100%
• Scores Split at Meeting/Below Standard
• Implementation Graphs (Number of SIMS Assessments Reported by Tool & Type; Number of SIMS Assessments Reported by IM & Type)
• Implementation Grid (column sorting: ascending, descending, advanced)
• Mapping

SIMS:MER Linkage Reference Table


SIMS:MER Relationship

1) Data Use
   a) Program quality and implementation assessment
   b) Partner and site management: identify program and data quality issues and best practices
   c) Contextualize and inform MER (not a 1:1 relationship)

2) Limitations
   a) Sampling: SIMS data are not a representative sample [SNU analysis]
   b) Data availability: SIMS data are not available at all sites [site analysis]
   c) Time-bound: MER reporting (quarterly/semi-annual/annual) vs. SIMS (once or annually) [site/SNU analysis]

SIMS:MER Relationship

• Example 1: MER to SIMS (HTC_TST-high testing numbers/low SIMS scores)

Query: Has quality of HIV Testing been compromised to achieve testing targets? If F_1.11 scores low, are reported HTC_TST numbers accurate?

• Example 2: MER to SIMS (HTC_TST-low testing numbers/high SIMS scores)

Query: Are there testing implementation procedures that can be streamlined to increase testing numbers without compromising quality?

• Example 3 : MER to SIMS (HTC_TST-high testing numbers/high SIMS scores)

Query: Are there best practices that can be captured and used to improve performance at other sites/partners?

• Example 4: MER to SIMS (HTC_TST- low testing numbers/low SIMS scores)

Query: Has program quality and poor implementation impacted ability to achieve targets?
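These four examples amount to a two-by-two classification of sites by MER volume and SIMS quality. A hedged pandas sketch, with made-up thresholds and column names, purely to illustrate the triage logic:

```python
import pandas as pd

# Toy site-level data; columns and cutoffs are illustrative assumptions,
# not programmatic guidance.
sites = pd.DataFrame({
    "site": ["A", "B", "C", "D"],
    "htc_tst": [950, 120, 900, 100],             # MER testing results
    "sims_pct_green": [0.40, 0.85, 0.90, 0.35],  # share of CEEs scored Green
})

HIGH_TESTS = 500      # assumed volume cutoff
GOOD_QUALITY = 0.75   # assumed quality cutoff

def quadrant(row):
    high = row.htc_tst >= HIGH_TESTS
    good = row.sims_pct_green >= GOOD_QUALITY
    if high and not good:
        return "Ex.1: verify data and testing quality"
    if not high and good:
        return "Ex.2: streamline procedures to raise volume"
    if high and good:
        return "Ex.3: capture best practices"
    return "Ex.4: quality may be limiting achievement"

sites["follow_up"] = sites.apply(quadrant, axis=1)
print(sites[["site", "follow_up"]])
```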

The SIMS:MER Linkage Reference Table can be viewed several ways:

• Summary
• Linkages by MER indicator
• Linkages by SIMS CEE
• Linkages by program
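Used programmatically, the reference table is essentially a lookup from a MER indicator to its linked CEEs. A minimal sketch, using a toy stand-in for the table (the CEE/unique-ID pairs come from the examples in this deck; the real linkages live in the reference document itself):

```python
import pandas as pd

# Toy stand-in for the SIMS:MER Linkage Reference Table.
linkage = pd.DataFrame({
    "mer_indicator": ["HTS_TST", "HTS_TST", "TX_NEW", "TX_NEW", "KP_PREV"],
    "cee": ["F_1.11", "F_7.04", "F_1.10", "F_12.01", "C_04.01"],
    "unique_id": ["011", "079", "010", "109", "226"],
})

def linked_cees(indicator: str) -> pd.DataFrame:
    """Return the CEEs linked to a given MER indicator."""
    return linkage.loc[linkage.mer_indicator == indicator,
                       ["cee", "unique_id"]]

print(linked_cees("TX_NEW"))
```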

Practical Experience with Data Analysis


Section 3 Content

Handouts

• SIMS Analysis Cheat Sheet

• Programmatic CEE Selections


Example: 2nd 90 Care and Treatment

(Diagram) The cascade links four levels, with SIMS informing the tactical level:

• National strategic level: 2nd 90 — 83% of PLHIV on ART
• PEPFAR strategic level: number of PEPFAR-supported PLHIV on ART (TX_CURR)
• Program level: How many PLHIV were identified (HTC_TST_POS)? How many identified were initiated (TX_NEW)? How many initiated were retained (TX_RET)?
• Tactical level (host country context): policy environment, budget/cost, commodities, program quality, staffing levels — SIMS data inform this level

Example: Identification of HIV Positives

Framing question: Could program quality have impacted our ability to identify PLHIV?

Specific question | CEE (Unique ID) | Impact on HTS_TST_POS | Category
Are HIV test results reported accurately? | F_1.11 (011) | Inaccurate or incorrect HTC_TST numbers impact the ability to detect and enroll patients in care | Data Quality
Have there been any stock-outs that could impact the ability to test individuals? | F_1.20 (020); C_1.21 (221) | Stock-outs of RTKs could impact the total number of individuals tested | Testing Quality/Results
Is the proficiency of testers meeting standards to ensure correct test results are recorded? | F_7.04 (079); also F_7.01 (076), F_7.02 (077), C_1.20 (220), C_1.23 (223), C_1.34 (234) | Poor testing quality may influence the test results reported: impact on the HTC_TST_POS number? | Testing Quality/Results

Example: Identification of HIV Positives

(Screenshot; callout: low number.)

Example: Linkages

Framing question: Could program quality have impacted our ability to identify PLHIV?

Specific question | CEE (Unique ID) | Impact on linkage | Category
Do we have adequate referral systems for linkage of newly identified HIV positives to care? | F_7.03 (078); also F_2.08 (028), F_3.11 (028), C_1.19 (219) | Poor linkage and referral systems deplete the number of patients who can start treatment | Referrals

Example: Initiation

Framing question: Is the implementation of Test and Start having an impact on treatment initiation?

Specific question | CEE (Unique ID) | Impact on TX_NEW | Category
Is Test and Start being implemented? | F_12.01 (109) | A swift transition to Test and Start increases TX_NEW | Treatment Initiation
Are we tracking and offering treatment to all pre-ART patients where Test and Start is offered? | F_2.01 (021); also F_3.04 (021), F_4.06 (021), F_2.03 (023), F_3.06 (023), F_2.06 (026), F_2.07 (027) | Poor tracking of pre-ART patients results in fewer patients started on treatment | Treatment Initiation

Framing question: Could data quality have an impact on results?

Specific question | CEE (Unique ID) | Impact on TX_NEW | Category
Are TX_NEW results reported accurately? | F_1.10 (010); also F_2.04 (024), F_3.07 (024), F_4.07 (024), F_2.05 (025), F_3.08 (025), F_4.08 (025) | Poor data quality/reporting impacts the results reported for TX_NEW | Treatment Initiation

Framing question: Could the availability of commodities have impacted results?

Specific question | CEE (Unique ID) | Impact on TX_NEW | Category
Are there cases where treatment was not initiated due to ARV shortages? | F_1.16 (016); also A_10.01 (490), A_10.02 (491), A_10.03 (492), A_10.04 (493) | Management of the ARV supply chain and stock-outs may delay treatment initiation | Treatment Initiation

Example: Initiation (screenshots)

General Exercise

The three questions below can also be answered programmatically; see the sketch after this list.

1. How many sites did not meet the 25/50 Rule during their initial assessment?
   Instructions: Cheat Sheet 2.3, p. 14. Tab(s): SIMS Implementation or Individual Assessments Pivot.

2. How many sites received a follow-up assessment?
   Instructions: Cheat Sheet 2.4, p. 16. Tab(s): SIMS Implementation or Individual Assessments Pivot.

3. Which sites did not improve?
   Instructions: Cheat Sheet 2.5, p. 17. Tab: Individual Assessments Pivot.
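A hedged sketch of the same three questions answered in code, assuming a tidy per-assessment table with the 25/50 inputs already computed (column names are illustrative, not the dashboard's actual fields):

```python
import pandas as pd

# Toy assessment history standing in for the Individual Assessments data.
a = pd.DataFrame({
    "site":       ["S1", "S1", "S2", "S3", "S3"],
    "visit_type": ["Initial", "Focused Follow-Up", "Initial",
                   "Initial", "Focused Follow-Up"],
    "pct_red":        [0.30, 0.10, 0.05, 0.20, 0.20],
    "pct_red_yellow": [0.60, 0.30, 0.20, 0.55, 0.55],
})

# Apply the 25/50 Rule to every assessment.
a["fails_25_50"] = (a.pct_red > 0.25) | (a.pct_red_yellow > 0.50)

# Q1: sites that did not pass the 25/50 Rule at their initial assessment
initial_fails = a[(a.visit_type == "Initial") & a.fails_25_50]["site"]
print("Failed initial:", list(initial_fails))

# Q2: sites that received a follow-up assessment
followed_up = a[a.visit_type.str.contains("Follow-Up")]["site"].unique()
print("Followed up:", list(followed_up))

# Q3: sites that did not improve (still failing the rule at follow-up)
no_improve = a[a.visit_type.str.contains("Follow-Up") & a.fails_25_50]["site"]
print("Did not improve:", list(no_improve))
```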

How many sites did not meet the 25/50 rule during the initial assessment?
• Option 1: SIMS Implementation tab (overall, then by partner)
• Option 2: Individual Assessments (Pivot) tab

How many sites received a follow-up assessment?
• Option 1: SIMS Implementation tab
• Option 2: Individual Assessments (Pivot) tab

Which sites did not improve?
• (Screenshots: Example 1 and Example 2)

Programmatic Exercise: Retention/Adherence

Framing question: Could the level of retention/adherence support impact results?

Specific question | CEE (Unique ID) | Impact on TX_RET or TX_PVLS | Category
Are there adequate procedures in place for tracking patients who default on appointments? | F_2.02 (021); also F_3.04 (021), F_4.06 (021) | Defaulters are not retained and are not adherent | Retention/Adherence
Are patients receiving adequate adherence counseling? | F_2.10 (030); also F_3.13 (030), F_4.09 (030), C_2.01 (242), C_2.06 (247) | Poor adherence leads to reduced viral suppression | Adherence
Are patients monitored for treatment failure? | F_2.11 (031); also F_3.14 (031), F_4.20 (031) | A high viral load may indicate poor adherence or drug resistance (DR) | Adherence/Viral Load Suppression

Programmatic Exercise: Retention/Adherence (screenshots)

Programmatic Exercise – DREAMS/General Prevention

CEEs linked to PP_PREV

C_01.12 [212] Facilitation of Small Group Sessions for HIV Prevention [AP]

C_01.13 [213] Small Group Sessions for HIV Prevention [AP]

C_01.26 [226] Condom Availability (at the Service Delivery Point) [AP-HTC]

C_05.02 [255] Preventing HIV in Girls [OPP]

C_05.03 [254] Girls Secondary Education Transition [OPP]

C_05.06 [226] Condom Availability [OPP]

CEEs linked to GEND_GBV

C_01.17 [217] Standard Guidance for Gender-Based Violence Response in Community Setting [AP]

C_01.18 [218] Gender-Based Violence Referrals in Community Setting [AP]

F_06.01 [074] Capacity to Provide Post-Violence Care Services [GBV]

F_06.02 [075] Availability of Post-Violence Care Services [GBV]

Programmatic Exercise – DREAMS/General Prevention (screenshots)

Programmatic Exercise – KP_PREV

CEEs linked to KP_PREV

A_04.01 [430] National Guidelines for Key Populations (National level) [GUIDE]

C_01.12 [212] Facilitation of Small Group Sessions for HIV Prevention [AP]

C_01.13 [213] Small Group Sessions for HIV Prevention [AP]

C_04.01 [226] Condom Availability [KP]

C_04.02 [249] Lubricant Availability [KP]

C_04.03 [261] STI Screening and Management Among Key Populations [KP]

C_04.04 [262] Monitoring Outreach for Key Populations [KP]

C_04.05 [263] Peer Outreach Management [KP]

C_04.06 [250] Family Planning/HIV Integration Service Delivery in Community Settings [KP]

C_04.07 [264] Service Referral System [KP]

C_04.08 [265] Data Reporting Consistency – KP_PREV [KP]

F_03.01 [049] Lubricant Availability at Point of Service [KP]

F_03.02 [050] STI Screening and Management for Key Populations [KP]

F_03.03 [051] Service Referral System [KP]

F_03.19 [105] Systems for Family Planning (FP)/HIV Integration [C&T KP]

F_03.20 [106] Family Planning (FP)/HIV Integration Service Delivery [C&T KP]

F_03.21 [032] Partner HIV Testing [C&T KP]

An additional 26 CEEs are linked programmatically to KP.


Programmatic Exercise – KP_PREV (screenshots)

Feedback



Summary

During today's session we discussed how to:

• Understand the SIMS data lifecycle
• Identify sources of SIMS data and reference documents
• Evaluate PEPFAR results in the context of service quality using SIMS data, both in Panorama and through ICPI tools

Questions?

SIMS Saturday Session
September 16, 2017, 9:30am-12:00pm
• Country Presentations
• SIMS 4.0 Feedback

Thank You!