
Survey Methods & Applications in Healthcare

Nawanan Theera-Ampornpunt, M.D., Ph.D.

Faculty of Medicine Ramathibodi Hospital

February 17, 2016

www.SlideShare.net/Nawanan

2

• Overview of Surveys

• Survey Methodology

• A Sample Survey

Outline

3

OVERVIEW OF SURVEYS

4

• An activity in which many people are asked a question or a series of questions in order to gather information about what most people do or think about something

Survey

Merriam Webster Dictionary

5

• A written set of questions that are given to people in order to collect facts or opinions about something

Questionnaire

Merriam Webster Dictionary

6

• To know something
  – Personal information
  – Knowledge, Opinions & Attitudes
  – Behaviors & Practice
  – etc.
• About someone
  – Individuals
  – Organizations
• In order to understand, create knowledge, or make decisions

Why Do a Survey?

7

Examples of Surveys

Say2KFC.com

Service Satisfaction Survey (Online)

8

Examples of Surveys

http://www.kmutt.ac.th/building/pdf/application_form_2552_1.pdf

Service Satisfaction Survey (Paper-based)

9

Examples of Surveys

Image Source: http://pixabay.com/en/agent-business-call-center-18741/

Service Satisfaction Survey (Telephone)

10

SURVEY METHODOLOGY

11

• Survey Design
  – Study Design
  – Modes of Data Collection
• Instrument Design
• Sampling
• Survey Conduct
• Data Analysis
• Reports

Survey Methodology

12

WHAT WOULD YOU DO TO INCREASE SURVEY RESPONSES?

13

The Tailored Design Method
Dillman et al.: 1978, 1999, 2007, 2008

14

The Tailored Design Method
Dillman et al.: 2014

15

• “The development of survey procedures that create respondent trust and perceptions of increased rewards and reduced costs for being a respondent, which take into account features of the survey situation and have as their goal the overall reduction of survey error”

The Tailored Design

Dillman et al. (2007)

16

• Rewards

• Costs

• Trust

Social Exchange Theory

Dillman et al. (2007)

Image Source: http://horsebusinessschool.com/using-stick-and-carrot-to-motivate-employees/

17

• “The likelihood of responding to the request to complete a self-administered questionnaire, and doing so accurately, is greater when the respondent trusts that the expected rewards of responding will outweigh the anticipated costs”

Social Exchange

Dillman et al. (2007)

18

• Providing Rewards
  – Show positive regard
  – Say thank you
  – Ask for advice
  – Support group values
  – Give tangible rewards
  – Make the questionnaire interesting
  – Give social validation
  – Inform respondents that opportunities to respond are scarce

Survey & Social Exchange Theory

Dillman et al. (2007)

19

• Reducing Social Costs
  – Avoid subordinating language
  – Avoid embarrassment
  – Avoid inconvenience
  – Make questionnaires appear short & easy
  – Minimize requests to obtain personal information
  – Keep requests similar to other requests to which a person has already responded

Survey & Social Exchange Theory

Dillman et al. (2007)

20

• Establishing Trust
  – Provide a token of appreciation in advance
  – Sponsorship by legitimate authority
  – Make the task appear important
  – Invoke other exchange relationships

Survey & Social Exchange Theory

Dillman et al. (2007)

21

• Study Design
  – Cross-sectional
  – Longitudinal

Survey Design

22

• Mode of Data Collection
  – Self-administered survey
    • Paper-based
    • Online
    • Telephone (Automated/IVR)
  – Interviewer-administered survey (structured interview)
    • In-person
    • Telephone
  – Mixed-mode survey

Survey Design

23

• Choose simple over specialized words
• Choose as few words as possible to pose the question
• Use complete sentences to ask questions
• Avoid vague quantifiers when more precise estimates can be obtained
• Avoid specificity that exceeds respondent’s potential for having an accurate, ready-made answer
• Use equal numbers of positive & negative categories

Instrument Design (1)

Dillman et al. (2007)

24

Problem

• Number of years lived in Idaho: ________ Years
• Your city or town: ________ City or Town
• Your county: ________ County

Use complete sentences

Dillman et al. (2007)

25

Revision

• How many years have you lived in Idaho? ________ Years
• In what city or town do you live? ________ City or Town
• In what Idaho county do you live? ________ Idaho County

Use complete sentences

Dillman et al. (2007)

26

Problem

• How often did you attend religious services during the past year?
    Never
    Rarely
    Occasionally
    Regularly

Avoid vague quantifiers

Dillman et al. (2007)

27

Revision

• How often did you attend religious services during the past year?
    Not at all
    A few times
    About once a month
    Two to three times a month
    About once a week
    More than once a week

Avoid vague quantifiers

Dillman et al. (2007)

28

Problem

• About how many books have you read for leisure during the past year?
    Number of books: ________

Avoid too much specificity

Dillman et al. (2007)

29

Problem

• About how many books have you read for leisure during the past year?
    Less than 10
    11-25
    26-50
    51-75
    76 or more

Avoid too much specificity

Dillman et al. (2007)

30

• Distinguish undecided from neutral by placement at the end of the scale
• Avoid bias from unequal comparisons
• State both sides of attitude scales in the question stems
• Eliminate check-all-that-apply question formats to reduce primacy effects
• Develop response categories that are mutually exclusive

Instrument Design (2)

Dillman et al. (2007)

31

Problem

• Which one of the following do you feel is most responsible for recent outbreaks of violence in America’s schools?
    Irresponsible parents
    School policies
    Television programs

Avoid bias from unequal comparisons

Dillman et al. (2007)

32

Revision

• Which one of the following do you feel is most responsible for recent outbreaks of violence in America’s schools?
    The way children are raised by parents
    School policies
    Television programs

Avoid bias from unequal comparisons

Dillman et al. (2007)

33

Problem

• From which one of these sources did you first learn about the tornado in Derby?
    Radio
    Television
    Someone at work
    While at home
    While traveling to work

Mutually Exclusive

Dillman et al. (2007)

34

• Use cognitive design techniques to improve recall
• Provide appropriate time referents
• Be sure each question is technically accurate
• Choose question wordings that allow essential comparisons to be made with previously collected data
• Avoid asking respondents to say yes in order to mean no
• Avoid double-barreled questions

Instrument Design (3)

Dillman et al. (2007)

35

Problem

• Should the city build a new swimming pool that includes lanes for swimming laps that is not enclosed for winter use?
    Yes
    No

Double Barreled

Dillman et al. (2007)

36

• Soften the impact of potentially objectionable questions
• Avoid asking respondents to make unnecessary calculations

Instrument Design (4)

Dillman et al. (2007)

37

• Survey Design
  – Study Design
  – Modes of Data Collection
• Instrument Design
• Sampling
• Survey Conduct
• Data Analysis
• Reports

Survey Methodology

38

• Target Population

• Sample

• Inclusion & Exclusion Criteria

• Sampling Frame

• Sampling Method/Technique

Sampling Design

39

• Census

• Random (probability) sampling

– Simple random sampling

– Systematic random sampling

– Stratified random sampling

– Cluster random sampling

– Multi-stage random sampling

• Non-probability sampling

– Purposive sampling

– Convenience sampling

– Snowball sampling

– etc.

Sampling Design
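The probability sampling techniques listed above differ only in how units are drawn from the sampling frame. A minimal Python sketch of three of them, using a hypothetical 1,302-entry hospital frame and made-up region labels rather than the study's actual frame:

```python
import random

# Hypothetical sampling frame: one entry per hospital on a ministry list
# (illustrative names, not the study's actual frame).
frame = [f"hospital_{i:04d}" for i in range(1, 1303)]
n = 100  # desired sample size

# Simple random sampling: every hospital has the same chance of selection.
simple_sample = random.sample(frame, k=n)

# Systematic random sampling: random start, then every k-th hospital.
step = len(frame) // n
start = random.randrange(step)
systematic_sample = frame[start::step][:n]

# Stratified random sampling: draw within strata (made-up region labels here)
# so each stratum is represented roughly in proportion to its size.
region_of = {h: random.choice(["North", "Northeast", "Central", "South"]) for h in frame}
stratified_sample = []
for region in {"North", "Northeast", "Central", "South"}:
    members = [h for h in frame if region_of[h] == region]
    n_region = max(1, round(n * len(members) / len(frame)))
    stratified_sample.extend(random.sample(members, k=n_region))
```

Non-probability approaches (purposive, convenience, snowball) are defined by the recruitment strategy rather than a random draw, so they have no equivalent one-line selection step.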

40

Example sequence

• Pre-notice Letter

• 1st Questionnaire Mailout

• Reminder + 2nd Questionnaire Mailout

Survey Conduct

Simplified from Dillman et al. (2007)

41

• Sampling error

• Coverage error

• Measurement error

• Nonresponse error

– Questionnaire effects

– Data collection mode effects

– Interviewer effects

– Respondent effects

• Processing error

Errors in Surveys

Dillman et al. (2007) & Office of Management and Budget (2001)
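Of the error sources above, sampling error is the one that can be quantified directly from the design. A worked sketch of the usual margin of error for a proportion under simple random sampling, MoE = z * sqrt(p * (1 - p) / n), with illustrative numbers that are not from any survey in this talk:

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Margin of error for a sample proportion under simple random sampling."""
    return z * math.sqrt(p * (1 - p) / n)

# Illustrative only: a 50% proportion estimated from 900 respondents
# has a margin of error of about +/- 3.3 percentage points at 95% confidence.
print(f"{margin_of_error(0.5, 900):.3f}")
```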

42

• Sampling error

• Coverage error

• Measurement error

• Nonresponse error

• Processing error

– Data entry error

– Pre-edit coding errors

– Editing errors

– Imputation errors

Errors in Surveys

Dillman et al. (2007) & Office of Management and Budget (2001)

43

A SAMPLE SURVEY

44

NAWANAN’S DOCTORAL STUDY

45

Dual Opportunities: the unknown state of IT adoption in Thai hospitals and the need to improve theoretical knowledge, both addressed by this study

46

• To describe current state of IT adoption in Thai hospitals nationwide
• To test proposed conceptual framework & explore relationships between organizational characteristics, IT management, and IT adoption

Study Objectives

47

Hypothesized Model
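The hypothesized model relates organizational characteristics and IT management to IT adoption. As a generic illustration only, not the analysis actually used for this model, one way to explore such relationships is to regress an adoption score on organizational covariates; the data file and column names below are hypothetical:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical analysis file and column names (not the study's actual data
# or its actual statistical model).
df = pd.read_csv("hospital_survey.csv")
# expected columns, for illustration: it_sophistication, beds, public, it_budget_pct

model = smf.ols("it_sophistication ~ beds + C(public) + it_budget_pct", data=df).fit()
print(model.summary())
```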

48

• Study Design: Nationwide cross-sectional mail survey

• Sample: All hospitals in Thailand except pilot (N = 1,302)

• Pilot: 5 hospitals (10 respondents each)

• Sampling Frame: List of hospitals from Ministry of Public Health’s Web site

• Subjects: Hospital’s staff responsible for managing information systems (CIO/IT manager or equivalent; hospital director if N/A)

• Data Collection Period: 16 weeks

Design & Population

49

• Modified from original instrument

• Face & content validity established (Theera-Ampornpunt, 2009)

• Further modified based on pilot findings

• Translated to Thai

Section 1 Hospital Profile

Section 2 IT Adoption & Use Profile

Section 3 Respondent’s Information

English version

Survey Instrument

50

• Managerial: To what extent do you agree or disagree with each statement? e.g.,

• Those who will use the information systems are fully involved in hospital IT development

• Functional: How much is each activity supported by computerized information systems in your hospital?

• Technological: To what extent is each technology made available in your hospital?

• Integration: To what extent is information shared or transmitted among information systems within/outside your hospital?

Sample Questions

51

Week 0: Prenotice Letter
Week 1: 1st Questionnaire Mailout (Dec. 18, 2010)
Week 6: 2nd Questionnaire Mailout (Jan. 27, 2011)
Weeks 9-13: Planned Follow-up Phone Calls to Nonrespondents and 3rd Questionnaire Mailout If Necessary (Never Conducted Because of Satisfactory Response Rate)
Week 16: Survey Returns Closed (Apr. 8, 2011)

• 150-baht (~US$5) incentive if completed
• Endorsed by President of the Thai Medical Informatics Association
• Funded by a leading medical school with known informatics focus
• Anonymous unless contact information provided for incentive & results mailing

Survey Methodology (Nationwide)

52

• 64% response rate
• Some items problematic
  – Differing within-hospital responses on total & IT budgets, no. of IT staff, quality accreditation status
  – Poor interrater reliability for some dimensions
• Quality accreditation status dropped
• Item wording revised & survey shortened
• Integration sophistication items restructured

Pilot Study Findings

53

Hospital Characteristic | Site 1 | Site 2 | Site 3 | Site 4 | Site 5
Response rate | 40% | 50% | 70% | 70% | 90%
Hospital beds (reported, mean ± SD, range) | 30 ± 0 | 120.2 ± 0.4 (120-121) | 360 ± 0 | 303.1 ± 9.4 (282-307) | 1,058.1 ± 187.1 (863-1,500)
Hospital beds (authoritative source) | 30 | 120 | 335 | 305 | 938
Public | 100% | 0% | 100% | 100% | 100%
Private | 0% | 100% | 0% | 0% | 0%
Accreditation status
  Not accredited & without plan | 25% | 0% | 0% | 0% | 0%
  Not accredited, with plan but no significant progress | 75% | 40% | 0% | 14% | 0%
  Not accredited, with plan and significant progress | 0% | 40% | 0% | 86% | 0%
  Accredited | 0% | 20% | 100% | 0% | 100%
Number of IT staff
  None | 0% | 0% | 0% | 0% | 0%
  1-5 | 75% | 80% | 100% | 43% | 0%
  6-20 | 25% | 20% | 0% | 57% | 22%
  21-50 | 0% | 0% | 0% | 0% | 11%
  51 or more | 0% | 0% | 0% | 0% | 67%

Pilot Study Findings

54

Hospital Characteristic | Site 1 | Site 2 | Site 3 | Site 4 | Site 5
2009 total budget (million baht) | 22.0 [n=1] | 300.0 [n=1] | 578.0 [n=1] | 368.4 ± 93.7 [n=3] | 7,000.0 ± 1,414.2 [n=2]
2009 IT budget (million baht) | 0.4 [n=1] | 10.0 [n=1] | 2.1 ± 1.6 [n=3] | 5.5 ± 0.7 [n=2] | 93.0 ± 40.0 [n=3]
Number of computers in hospital | 23.8 ± 4.8 (20-30) | 106.7 ± 90.2 (20-200) | 170.0 ± 108.9 (10-300) | 207.1 ± 82.2 (100-290) | 2,350.0 ± 1,332.3 (100-4,000)
Calculated percentage of 2009 IT budget according to provided amount | 1.8% [n=1] | 3.3% [n=1] | 0.5% [n=1] | 1.4% [n=2] | 1.3% [n=1]
Subjective estimated percentage of 2009 IT budget (if amount not provided above)
  Below 1% | 0% | 0% | 20% | 0% | 0%
  1-4% | 75% | 40% | 60% | 86% | 17%
  5-8% | 25% | 40% | 20% | 14% | 50%
  Above 8% | 0% | 20% | 0% | 0% | 33%

Pilot Study Findings

55

Construct | Overall | Site 1 | Site 2 | Site 3 | Site 4 | Site 5
Managerial Sophistication | 3.6 ± 0.4 | 3.2 ± 0.2 | 4.2 ± 0.4 | 3.9 ± 0.5 | 3.8 ± 0.4 | 3.2 ± 0.9
Technological Sophistication | 3.5 ± 0.3 | 3.1 ± 0.2 | 3.4 ± 0.6 | 3.7 ± 1.0 | 3.4 ± 0.5 | 3.8 ± 0.5
Functional Sophistication | 4.0 ± 0.3 | 3.5 ± 0.5 | 4.4 ± 0.4 | 4.1 ± 0.6 | 4.2 ± 0.5 | 4.0 ± 0.3
Integration Sophistication (Within Hospital) | 3.8 ± 0.3 | 3.8 ± 0.3 | 4.2 ± 1.2 | 3.8 ± 0.6 | 3.7 ± 0.5 | 3.4 ± 0.7
Integration Sophistication (Outside Hospital) | 2.3 ± 0.9 | 1.1 ± 0.04 | 2.5 ± 1.4 | 3.6 ± 0.8 | 2.0 ± 0.9 | 2.1 ± 0.5
Overall IT Sophistication | 3.4 ± 0.4 | 2.9 ± 0.2 | 3.7 ± 0.7 | 3.9 ± 0.6 | 3.4 ± 0.3 | 3.3 ± 0.4

Pilot Study Findings
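The construct scores above are means (± SD) of the respondents' item ratings, reported overall and per pilot site. A small pandas sketch of that kind of aggregation, with a hypothetical file and column names standing in for the actual instrument:

```python
import pandas as pd

# Hypothetical respondent-level pilot data: one row per respondent, construct
# scores already averaged from their Likert items; names are illustrative.
df = pd.read_csv("pilot_responses.csv")
constructs = ["managerial", "technological", "functional"]

# Mean and SD per construct, overall and by site, as in the table above.
print(df[constructs].agg(["mean", "std"]).round(1))
print(df.groupby("site")[constructs].agg(["mean", "std"]).round(1))
```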

56

• IT Sophistication Items

Construct | Intraclass Correlation | Cronbach’s Alpha
Managerial Sophistication | 0.26* | 0.91
Technological Sophistication | 0.04 | 0.81
Functional Sophistication | 0.20 | 0.93
Integration Sophistication (Within Hospital) | 0.00 | 0.89
Integration Sophistication (Outside Hospital) | 0.50* | 0.97
Overall IT Sophistication | 0.30* | 0.96

*p < 0.05 on F-test.

Pilot Study Findings
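Cronbach's alpha in the table can be reproduced from item-level responses with the standard formula alpha = k / (k - 1) * (1 - sum of item variances / variance of the summed score); the intraclass correlations would additionally need an ANOVA-based estimator and are not sketched here. A minimal example on made-up Likert data:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a respondents-by-items matrix of scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Made-up data: 10 respondents rating 4 Likert items of one construct,
# generated from a shared "trait" so the items correlate.
rng = np.random.default_rng(42)
trait = rng.normal(3.5, 0.8, size=10)
items = np.clip(np.round(trait[:, None] + rng.normal(0, 0.5, size=(10, 4))), 1, 5)
print(round(cronbach_alpha(items), 2))
```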

57

• 4 of 1,302 hospitals ineligible
• Response rate 69.9%

Characteristic | Overall | Responding Hospitals | Non-Responding Hospitals
N of eligible hospitals | 1,298 | 908 | 390
Bed size** | 106.9 | 117.5 | 82.9
Public status**
  Private | 24.0% | 17.4% | 39.2%
  Public | 76.0% | 82.6% | 60.8%
Geography*
  Central | 33.4% | 31.1% | 39.0%
  East | 7.5% | 7.8% | 6.7%
  North | 11.1% | 13.5% | 5.4%
  Northeast | 27.1% | 26.9% | 27.7%
  South | 15.3% | 14.9% | 16.2%
  West | 5.6% | 5.8% | 5.1%

*p < 0.01, **p < 0.001.

Nationwide Findings
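The responding versus non-responding comparison above is a standard nonresponse-bias check that can be run on frame variables. A sketch of the chi-square test for the public/private breakdown, with the non-responder counts reconstructed approximately from the reported percentages:

```python
from scipy.stats import chi2_contingency

# 908 of the 1,298 eligible hospitals responded (the 69.9% response rate above).
# 2x2 table of ownership by response status; responding counts are reported
# directly in the next table (158 private / 750 public), non-responding counts
# are approximated from the percentages (39.2% / 60.8% of 390).
table = [[158, 750],   # responding: private, public
         [153, 237]]   # non-responding (approximate): private, public
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, p = {p:.3g}")
```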

58

Characteristic | Number of Responses | Statistic†
Public status | 908 |
  Private | 158 | 17.4%
  Public | 750 | 82.6%
Teaching status | 901 |
  Non-teaching | 716 | 79.5%
  Teaching | 185 | 20.5%
Total employees | 890 | 368.2 ± 573.5 (10-5,269)
IT employees | 901 | 4.3 ± 5.3 (0-60)
Total budget (million baht) | 443 | 146.67 ± 313.60 (0.25-3,067)
IT budget (million baht) | 598 | 2.77 ± 8.79 (0-100)
Ratio of IT budget to total budget‡ | 416 | 2.7% ± 4.6% (0-43.3%)
  < 1% | 135 | 32.5%
  1-4% | 218 | 52.4%
  5-8% | 40 | 9.6%
  > 8% | 23 | 5.5%
Extent of overall IT utilization | 905 |
  Very low | 5 | 0.6%
  Low | 35 | 3.9%
  Moderate | 169 | 18.7%
  High | 454 | 50.2%
  Very high | 242 | 26.7%
Total PCs in use | 883 | 126.1 ± 218.6 (0-3,000)

Nationwide Findings

59

Estimate (Partial or Complete Adoption) | Nationwide
Basic EHR, outpatient | 86.6%
Basic EHR, inpatient | 50.4%
Basic EHR, both settings | 49.8%
Comprehensive EHR, outpatient | 10.6%
Comprehensive EHR, inpatient | 5.7%
Comprehensive EHR, both settings | 5.3%
Order entry of medications, outpatient | 96.5%
Order entry of medications, inpatient | 91.4%
Order entry of medications, both settings | 90.2%
Order entry of all orders, outpatient | 88.6%
Order entry of all orders, inpatient | 81.7%
Order entry of all orders, both settings | 79.4%

Adoption Estimates
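Each adoption estimate above is a sample proportion from the 908 responding hospitals, so it carries sampling uncertainty. A sketch of a normal-approximation 95% confidence interval, using the basic EHR (both settings) figure as an illustrative input:

```python
import math

def proportion_ci(p: float, n: int, z: float = 1.96) -> tuple[float, float]:
    """Normal-approximation confidence interval for a sample proportion."""
    se = math.sqrt(p * (1 - p) / n)
    return p - z * se, p + z * se

# Basic EHR in both settings: 49.8% of the 908 responding hospitals.
low, high = proportion_ci(0.498, 908)
print(f"49.8% (95% CI {low:.1%} to {high:.1%})")
```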

60

Final Model

61

• High IT adoption rates

• Drastic changes in adoption landscape

• Local context might play a role

– Supply Side

– Demand Side

• International Comparison

– Relatively higher adoption

Discussion

62

• Overview of Surveys

• Survey Methodology

• A Sample Survey

Recap

63

• Survey Design
  – Study Design
  – Modes of Data Collection
• Instrument Design
• Sampling
• Survey Conduct
• Data Analysis
• Reports

Recap