

Workbook 5 · Cost Evaluations

WHO/MSD/MSB 00.2f

Workbook 5

Cost Evaluations


Evaluation of Psychoactive Substance Use Disorder Treatment


WHO: World Health Organization

UNDCP: United Nations International Drug Control Programme

EMCDDA: European Monitoring Centre for Drugs and Drug Addiction

© World Health Organization, 2000

This document is not a formal publication of the World Health Organization (WHO) and all rights are reserved by the Organization. The document may, however, be freely reviewed, abstracted, reproduced and translated, in part or in whole, but not for sale nor for use in conjunction with commercial purposes. The views expressed in documents by named authors are solely the responsibility of those authors.


Acknowledgements

The World Health Organization gratefully acknowledges the contributions of the numerous individuals involved in the preparation of this workbook series, including the experts who provided useful comments during its development for the Substance Abuse Department, directed by Dr. Mary Jansen. Financial assistance was provided by UNDCP/EMCDDA/Swiss Federal Office for Public Health. Christine Godfrey (United Kingdom) wrote the original text for this workbook, and Brian Rush (Canada) edited the workbook series in earlier stages. JoAnne Epping-Jordan (Switzerland) wrote further text modifications and edited the workbook series in later stages. Munira Lalji (WHO, Substance Abuse Department) and Jennifer Hillebrand (WHO, Substance Abuse Department) also edited the workbook series in later stages. This workbook's case examples were written by Michael French and Kerry Anne McGeary (U.S.A.) and by Ellen Williams and Dean Gerstein (U.S.A.), and were edited by Ginette Goulet (Canada). Maristela Monteiro (WHO, Substance Abuse Department) provided editorial input throughout the development of this workbook series.


Table of contents

Overview of workbook series 6

What is a cost evaluation? 7

Why do a cost evaluation? 7

How to do a cost evaluation: general steps 8
1. Identify the costs associated with PSU services 8
2. Measure resource use per unit of activity level 9
3. Value this unit of activity 10
One additional point: Allocating costs to units of activity, top-down or bottom-up? 11

Evaluation type 1: how to do cost evaluations within one agency 12
1. Outline the purpose of the study 12
2. Describe the service and interventions 13
3. Identify the resources needed for each intervention/unit of care 13
4. Collect and collate available data 14
5. Develop an analysis plan 14
6. Conduct new data collection 15
7. Conduct full analysis 16

Evaluation type 2: how to do cost evaluations across different provider agencies 17

Evaluation type 3: how to do cost evaluations of wider social costs of PSU interventions 18
Identifying the range of costs 18
Measurement and valuation issues 19
It's your turn 20

Conclusion and a practical recommendation 22

References 23

Comments about case examples 24

Case example of a cost evaluation 27
The drug abuse treatment cost analysis program 27
It's your turn 36
References 37

Case example of a cost evaluation 38
The California Drug and Alcohol Treatment Assessment (CALDATA): The costs and benefits of treatment 38


Overview of workbook series

Introductory Workbook
Framework Workbook

Foundation Workbooks
Workbook 1: Planning Evaluations
Workbook 2: Implementing Evaluations

Specialised Workbooks
Workbook 3: Needs Assessment Evaluations
Workbook 4: Process Evaluations
Workbook 5: Cost Evaluations
Workbook 6: Client Satisfaction Evaluations
Workbook 7: Outcome Evaluations
Workbook 8: Economic Evaluations

This workbook is part of a series intended to educate program planners, managers, staff and other decision-makers about the evaluation of services and systems for the treatment of psychoactive substance use disorders. The objective of this series is to enhance their capacity for carrying out evaluation activities. The broader goal of the workbooks is to enhance treatment efficiency and cost-effectiveness using the information that comes from these evaluation activities.

This workbook (Workbook 5) is about cost analysis. In general terms, it involves evaluating the use of resources 'spent' on the treatment of people with substance use disorders.


What is a cost evaluation?

A cost evaluation assesses the use of resources 'spent' on the treatment of people with PSU disorders. There are three different levels of costing studies. Instructions for how to complete each of these types of cost evaluations are located later in this workbook:

• Evaluating resource use within one particular agency

• Comparing the costs of different interventions across two or more agencies

• Wider studies of the fuller social cost consequences of different interventions

Cost evaluations can be done by different groups of people with different purposes. Different groups include treatment services, the funders of the services, wider regulatory authorities, or supporting agencies. Treatment providers may be most interested in tracing their own resource use, and/or the consequences of any changes in service provision. The funders of services or regulatory authorities may be interested in comparisons between agencies. In contrast, a wider society perspective attempts to examine all the resource consequences, regardless of who bears these costs. Taking a societal perspective can considerably extend the work required, but allows for better comparison across agencies.

Why do a cost evaluation?

The economic technique of cost evaluation is one of the tools available to help choose wisely from a range of alternatives and to design and implement efficient programs. Cost evaluations assess the gains and the costs of carrying out a set of activities. The purpose of this analysis is to identify ways to do the most with a limited budget. In other words, it is designed to identify the most efficient approach.

Resources available for treating those with PSU disorders are limited in all countries. A number of different groups, including treatment providers, want to monitor resource use. In addition to such monitoring, they want to understand the relationship between levels of activity and resource use and costs. Questions might include: how much will costs increase if the number of outpatient visits rises? What would be the fall in costs and resources if the number of PS users treated decreases? Other questions assess the overall level of resources and how this relates to population needs; the distribution of resources among different groups of the population with varying problems; and the efficiency of resource use within individual services.


How to do a cost evaluation: general steps

Before doing a cost evaluation, it is important to review Workbook 1 of this series, which outlines general steps to evaluation planning. In addition, review these three common steps within each of the different types of costing studies: (1) identify the resource use for the appropriate unit of activity; (2) measure resource use per activity level; and (3) value this unit of activity.

After this section, three specialised 'how to' sections are presented, for the three different levels of evaluations:

• Evaluating resource use within a particular agency

• Comparing the costs of different interventions across agencies

• Wider studies of the fuller social cost consequences of different interventions

1. Identify the costs associated with PSU services

There are three broad cost groups of PSU interventions:

• the direct costs of service provision
• the costs to the individuals or their families
• the costs (or averted costs) falling on other agencies as the result of the 'treatment' episode

The direct costs of services can be further broken down into:

• capital costs: building, equipment, furniture and fittings
• building-related expenditure: heating, lighting, property taxes, maintenance
• staff costs
• other service-related expenditure: stationery, telephone, travel costs, etc.
• overhead costs: management and administrative expenses often shared across interventions or services

These types of costs have different relationships with the level of activity of the service. Fixed costs (e.g., building rent, heating, etc.) do not vary with the level of activity. In contrast, other costs are variable; for example, the travel costs associated with caring for clients at home will vary directly with the number and location of these clients. Staff costs (e.g., salaries) tend to be of a semi-fixed nature and are important because for most PSU services they make up most of the costs of service provision. Staff may undertake some additional 'caseload', but there are limits to the numbers any one person can handle. Expansion of the service at this point would require some extra staff.

The mixture of fixed and variable costs determines how average costs change as levels of activity change. For example, a service with a large fixed cost element, say a large stand-alone residential unit, would yield a high cost per inpatient stay if occupancy of the unit was low. The average cost would fall as the unit approaches capacity and would rise again if some new accommodation was needed. Marginal cost is the cost at any point of providing one extra unit. It is important for planning purposes to calculate marginal as well as average costs.
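The interplay of fixed, variable, average, and marginal costs described above can be sketched numerically. All figures below are illustrative assumptions for demonstration, not data from the workbook:

```python
# Illustrative sketch: average vs. marginal cost for a residential unit.
# The cost figures are hypothetical assumptions, chosen only to show the shape
# of the relationship the text describes.

FIXED_COSTS = 200_000          # annual building and overhead costs (do not vary with activity)
VARIABLE_COST_PER_STAY = 150   # consumables, travel, etc. incurred per inpatient stay

def total_cost(stays: int) -> int:
    """Total annual cost at a given level of activity."""
    return FIXED_COSTS + VARIABLE_COST_PER_STAY * stays

def average_cost(stays: int) -> float:
    """Cost per stay at a given level of activity."""
    return total_cost(stays) / stays

def marginal_cost(stays: int) -> int:
    """Cost of providing one extra unit of activity."""
    return total_cost(stays + 1) - total_cost(stays)

for stays in (100, 500, 1000):
    print(f"{stays} stays: average {average_cost(stays):.2f}, "
          f"marginal {marginal_cost(stays)}")
```

As occupancy rises, the average cost per stay falls toward the variable cost while the marginal cost stays flat, until extra capacity (new accommodation or staff) is needed and both jump, which is why the workbook advises calculating marginal as well as average costs.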


The costs borne by individuals and their families will vary across health care systems. Some care has to be taken to count genuine resource use, but not simple transfers. For example, individuals may receive benefits from the state or an insurance scheme if they are participating in treatment. This is not a resource but a transfer from one group, the taxpayer or other members of the insurance scheme, to those who are ill. The resource loss in this situation (especially where inpatient or residential care is concerned) is that of the 'productivity' or leisure time of the individual. These indirect costs may be borne by the individual or the employer. There is considerable discussion about the role of such indirect productivity losses in economic evaluations. Generally, the advice is to show these costs as a separate item so that overall results can be calculated with and without allowance for these costs (Drummond et al., 1987). Individuals may have direct costs of treatment, such as out-of-pocket expenses and the time costs of treatment, which will not be included in any traditional, service-oriented tracing of resource use. Some treatment, particularly inpatient treatment, may involve individuals' loss of working or leisure time. As well as these tangible items, there are more intangible elements to interventions. These include the 'pain, grief and suffering' of the PS user and their families.

The last group of costs are those borne by other agencies. Some interventions may require input from other social, welfare and health care agencies. On the other hand, some interventions may reduce future demands for such services provided by other agencies. These are the types of costs which are identified for the social cost type of study.

The first case example at the end of this workbook (French and McGeary) presents a structured and scientifically-based instrument for estimating the costs of treatment services. This instrument, the DATCAP, includes costs from a variety of relevant categories.

2. Measure resource use per unit of activity level

Units of activity have to be measured along with the resources needed to deliver them. The 'units' of activity will be determined by the purpose of the study, available data and the type of intervention being delivered. Units could be in many forms. Some common examples would be counselling visits, inpatient stays, assessment interviews, or some division per time period such as cost per type of care per hour or per week. If comparisons are being made across different organisations, it is essential that units of activity are measured in a standardised way.

If the purpose of the study is to provide some general idea of overall resource use within a treatment agency, then direct costs of provision would generally be measured on annual data. However, capital costs (such as buildings or equipment) will not necessarily be incurred in that year, although they are one of the resources being used to provide care for PS users. The value of these assets can be included by estimating their actual value or replacement cost, and spreading this cost over the expected lifetime of the asset. Buildings are often estimated to have a 'life' of 60 years, whereas other capital equipment is estimated to have a life of five to ten years, depending on the item. Using an interest rate of 6%, for example, a building worth $1 million would have an annual value of $61,876 (the calculation can be made using standard interest rate tables).
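The annual value quoted above is the equivalent annual cost of the asset: the constant yearly amount whose present value over the asset's life equals its capital value. A minimal sketch of the standard annuity formula behind those interest rate tables, reproducing the workbook's $1 million, 60-year, 6% example:

```python
def equivalent_annual_cost(capital_value: float,
                           interest_rate: float,
                           life_years: int) -> float:
    """Spread a capital cost over an asset's lifetime using the annuity formula:

        annual cost = capital_value * r / (1 - (1 + r) ** -n)
    """
    r = interest_rate
    return capital_value * r / (1 - (1 + r) ** -life_years)

# The workbook's example: a $1 million building, 60-year life, 6% interest.
building = equivalent_annual_cost(1_000_000, 0.06, 60)
print(round(building))  # 61876, matching the $61,876 figure in the text
```

The same function with a five-to-ten-year life covers the equipment case mentioned in the text.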



The choice of the time period over which some of the wider consequences of an intervention are measured is not clear-cut. If, after treatment, individuals quit smoking or reduce drinking or other PSU, the potential savings in consequent health care costs could be spread over the rest of their lifetime. However, direct observation and measurement of such costs would not be possible. In some cases epidemiological modelling can be employed to calculate these potential savings in health and other welfare costs. However, if reliable models are not available, measurement may be curtailed to some arbitrary period after the end of the intervention.

The DATCAP case example located at the end of this workbook demonstrates how the basic cost estimate can be expressed in a variety of units: total annual cost per service, weekly cost per client, and cost per treatment episode. In the case example, conclusions differed depending on the unit of activity.

3. Value this unit of activity

Most agencies will have some information about costs in their routine accounting information system. You can use this information to value your units of activity.

As mentioned earlier, the value of capital items such as buildings or office equipment may be the actual value or replacement cost, and this cost would be spread over the expected lifetime of the asset. Buildings are often estimated to have a 'life' of 60 years, whereas other capital equipment is estimated to have a life of five to ten years, depending on the item.

In most instances, people, buildings, and vehicles have multiple functions. It is important to identify cost sources that are shared by other activities, and to find a reasonably accurate way of dividing these costs among the various activities. This process is called cost allocation.

Examples of cost allocation:

1. One room in a PSU treatment centre (annual cost estimated at $3,000) is used for a prevention programme. The room is 900 square metres and the whole centre is 20,000 square metres. The annual value of the space for the prevention programme is: 900/20,000 x 3,000 = 0.045 x 3,000 = $135.

2. A nurse receives $900 per year. In the past year, 20,000 patients were seen at her treatment centre and 1,000 were patients in the prevention programme. The allocation fraction is 0.05 (1,000/20,000). This means that $45 of her annual salary should be allocated to prevention programme costs.
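Both examples apply the same proportional rule, which can be written as a small helper (the function name is ours, not the workbook's):

```python
def allocate_cost(total_cost: float, units_used: float, units_total: float) -> float:
    """Allocate a shared cost in proportion to the share of the resource used:

        allocated cost = total_cost * (units_used / units_total)
    """
    return total_cost * units_used / units_total

# Example 1: room space (900 of 20,000 square metres; $3,000 annual cost)
print(allocate_cost(3_000, 900, 20_000))   # 135.0

# Example 2: nurse's salary (1,000 of 20,000 patients; $900 per year)
print(allocate_cost(900, 1_000, 20_000))   # 45.0
```

The choice of allocation base (floor area, patient counts, staff time) is the judgement call; the arithmetic is always this fraction.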

For PSU services, another common valuation problem is the use of volunteers to deliver part of the treatment programme. Volunteers may involve some direct costs to the services, for example, in reimbursing their expenses, training and administrative costs, but their labour time does not appear in the accounts. However, volunteers are also contributing time which may have an alternative value, for example, in work for a wage or leisure activity. One method would be to value volunteers' time by some market wage rate, say, the unskilled wage rate or a rate based on their past or current occupation. It is also clear that volunteer resources cannot be assumed to be directly substitutable with time from the professional workers. They may be less flexible in how much time they can offer to the service, and they may choose to avoid some or all of the activities of paid staff. As with many other aspects of costing methodology, there is no single answer. In general, most studies would give volunteers an initial zero value but explore the impact on


results of the cost analyses by valuing their time by different amounts.

The costs of resources which would be used in the future are generally valued less than those in the current period. This is called discounting and is a means of bringing all costs to a present value. Costs are usually discounted at the prevailing general interest rate. Drummond and colleagues (1987) provide tables that give the conversion from future amounts to this present value, given the interest rate.
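The conversion in those tables can also be computed directly. A minimal sketch of discounting a future cost to its present value (the rate and amount below are illustrative assumptions):

```python
def present_value(future_amount: float, interest_rate: float, years_ahead: int) -> float:
    """Discount a future cost to its present value:

        PV = future_amount / (1 + r) ** years_ahead
    """
    return future_amount / (1 + interest_rate) ** years_ahead

# A $1,000 cost incurred 5 years from now, discounted at a 6% rate:
print(round(present_value(1_000, 0.06, 5), 2))  # 747.26
```

Summing the present values of each year's costs brings a multi-year cost stream onto the same footing as costs incurred today.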


One additional point: Allocating costs to units of activity, top-down or bottom-up?

There are two basic methods for determining the direct cost of any intervention: 'top-down' and 'bottom-up'. The top-down approach involves estimating the total resource use for any organisation for a specified period and then allocating the resource use to the units of activity. This method ensures all the observed resources are allocated to the activities of the unit.

The bottom-up approach involves directly recording the resource use for each intervention, including the appropriate share of overhead and capital costs. With good resource management and time monitoring systems, this approach can give accurate estimates of costs for individual units of activity (e.g., cost per hour of assessment) or clients (e.g., average cost per client). This methodology will only value the resources that are used, and any 'spare' capacity would not be valued. In this situation the total value of resources could be below those shown in the accounts. Conversely, if staff are working more than their contracted hours, the total amount of resources may be above the official budget.

Ideally, costing studies should use the bottom-up approach for the costs of major interest. However, in practice it may only be possible to use a top-down approach in situations with limited data. Some costs, particularly overheads and capital, may always be allocated in a top-down fashion.

In some instances, both a top-down and bottom-up approach may be needed. A simple example would be a service delivering just one type of standardised intervention. In this case a simple top-down estimate of average cost per unit of activity would be straightforward to calculate from the estimate of all resources and all activity. However, this method would not indicate whether there was any variability in the costs. It may be, for example, that younger clients are in receipt of more counselling sessions than older clients. To examine this variability, some bottom-up costing would be needed.
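The contrast between the two approaches can be sketched with toy figures. Every number below is a hypothetical assumption, chosen only to illustrate why the two estimates can diverge:

```python
# Top-down: divide total observed expenditure by total recorded activity.
total_budget = 120_000     # annual expenditure from the accounts (assumed)
total_sessions = 4_000     # counselling sessions delivered that year (assumed)
top_down_unit_cost = total_budget / total_sessions
print(top_down_unit_cost)  # 30.0 per session

# Bottom-up: build the unit cost from directly recorded resource use.
staff_minutes_per_session = 50     # contact plus non-contact time (assumed)
staff_cost_per_minute = 0.40
consumables_per_session = 2.00
overhead_share_per_session = 5.00  # overheads still allocated top-down, as the text notes
bottom_up_unit_cost = (staff_minutes_per_session * staff_cost_per_minute
                       + consumables_per_session
                       + overhead_share_per_session)
print(bottom_up_unit_cost)  # 27.0 per session

# The gap between the two estimates (30.0 vs 27.0) is unvalued 'spare'
# capacity: resources in the accounts not absorbed by recorded activity.
```

If staff instead worked beyond their contracted hours, the bottom-up figure could exceed the top-down one, which is the other direction of divergence the text describes.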

The next four sections provide more detailed information about different types of cost evaluations.



The process of doing cost evaluations within one agency can be broken down into seven main steps:

• outline the purpose of the study
• describe the service organisation and the interventions under review
• identify the resources needed to deliver the interventions
• collate the existing data sources
• develop an analysis plan
• conduct new data collection
• conduct full analysis

The information provided in these steps is complementary to that outlined in Workbooks 1 and 2 of this series. Be sure to review Workbooks 1 and 2 if you have not already done so, and follow the steps outlined in those workbooks.

Evaluation type 1

How to do cost evaluations within one agency


1. Outline the purpose of the study

There may be several reasons for undertaking a cost evaluation within a particular organisation. For agencies with no current cost information, a first step may be to devise unit costs for different interventions offered. This may be undertaken with a narrow focus, tracing only the resources directly used in the interventions offered by the agency. This would normally be undertaken during a set period of time (usually a year), and with data collected retrospectively. A different kind of study would be to investigate the cost across clients rather than units of activity. However, this may need a different time period and be undertaken prospectively to trace the use of resources for a group of clients. This type of study would usually be undertaken as part of an economic evaluation (Workbook 8), even if some data were available through patient billing systems.

Other studies may be undertaken with specific changes in service provision in mind. For example, you may want to predict the consequences of increasing or decreasing the frequency of maintenance therapies. Or, you may be contemplating more drastic reorganisations of the whole agency. For this


type of study, it is necessary to relate current resources with the units of care that can be delivered, and how changes in different programmes may interact with one another. This requires knowledge of the potential caseload of different workers, taking into account both the contact time with clients and the non-contact time, for example, in coordinating with other agencies or writing up case notes. Planning such studies requires close partnership between the clinical staff and those doing the costing study.

2. Describe the service and interventions

The next step in any costing study is to examine the service and provide some breakdown of the activities. Development of programme logic models will help identify the specific services and activities to be costed. The purpose of the breakdown of activities within specific services is two-fold. First, it provides an overall framework over which the resources needed for the whole service can be allocated. The second purpose is to determine the units of activity for costs to be calculated. The detail of your breakdown will be related to the purpose of your study. Have you completed a programme logic model, as described in Workbook 1 of this series? Retrieve and review your programme logic model now, or complete one if you have not already done so.

3. Identify the resources needed for each intervention/unit of care

The next step is to consider what types of resources are required to deliver each type of care. As described earlier, the components will be staff time, consumables, space and building-related expenditure, and administrative and management expenses. The consumables will include items such as postage, telephone and stationery, as well as medication costs, testing procedures and travel costs directly linked to client care. Providing care can involve a long list of different items, including insurance, training, and recruitment expenditure.

Some items should be included even though they may not appear on the budget. For example, treatment services may have a number of 'free' resources, such as free or subsidised property, staff seconded from other welfare agencies, or volunteers. Although these resources are free in financial terms, they are not without value. Also, services may not be replicated at the initial cost, and failure to take account of free resources in planning could have financial implications. When 'free' resources are included, you will be costing the service from the societal perspective; when they are excluded, you will have adopted the agency perspective.


4. Collect and collate available data

Most agencies will have some data already available, for example, hours of work, classification and costs of staff. You can use this database for your data analysis.

The ability to use current information to allocate resource use across activities will depend in part on how the organisation manages its finances. In some agencies, there will be clear cost accounting across activities, and some confidence may be placed on the current tracing of resources if 'cost centres' are in operation. However, some care must be taken with such divisions. For example, in one study of a local agency (Godfrey et al., 1995), it was found that the whole of the director's salary was allocated to the core services even though the director also managed the work within prisons. This was a historical allocation based on sources of funding rather than actual resource use.

It is also necessary to examine current data sources for levels of activity. In some systems, detailed utilisation of services may be kept with details of each client. In general, however, some data will be available on total levels of activity of a unit, for example, the total number of clients seen, the total number of counselling appointments, etc. The more detailed the activity database, the more detailed the cost analysis can be.

The final source of information is records of individual staff's working practices. Some will only be assigned to certain activities, while others may work in a number of areas. Some organisations will have time budgets for their staff but, as with other budgets, there is a need to check that resources are used in line with the budgets. The cost of staff should include all employers' costs, such as payroll taxes and pension contributions. Other expenditures are usually available from the expenditure accounts.

5. Develop an analysis plan

There are four main aspects of planning the analysis of a costing study: the choice of units of activity; the measurement and valuation of resources; the mechanisms for allocating resources to units; and the choice of simulations from the calculations.

The choice of units will relate to the purpose of the study, the activity breakdown, and the viability of collecting data. Some of the general measurement and valuation issues have been discussed in preceding sections. You will need to decide whether to value resources by the actual value in the service or by some national average rate. Staff, buildings and many other resources may have different values according to the locality of the service. National valuation figures may be useful for generalising results, but for providers and funders of services, local variations in the cost of resources will be important. In general, it is useful for all costing studies to report resource use in physical units (e.g., 3 buildings: 20,000 square metres) as well as their value.

The next step of the analysis plan is to consider the means of allocating resources to units of activity and the mixture of top-down and bottom-up methodology to be adopted. This plan will depend on the type of service and the information available on suitable


allocators. For most PSU treatments, staffresources make up the majority of the reportedresources. As discussed earlier, percentagesof total time taken in face-to-face contact (orif available the total time including non-con-tact) is one of the most efficient ways to allo-cate staff expenses. Similarly, building andspace-related expenditure such as rent (capi-tal cost), maintenance, heating, lighting, clean-ing, etc., could be allocated by the proportionof floor area used in different activities.

The main part of the analysis plan for most studies is to obtain some current average cost per unit of activity, for example the cost per counselling appointment. Where available, it may be possible to examine how different units of activity are combined to give the cost of an intervention, for example the cost per client for an outpatient programme. These average costs, however, will depend on the overall level of activity of the unit and may not remain the same if activity rises or falls. Calculating the marginal, or extra, cost of an increase or decrease in activity is a useful addition to the analysis. For example, Bradley et al. (1994) calculated the cost of one additional client, and of an extra 25 clients, for a methadone treatment programme.
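The distinction between average and marginal cost can be made concrete with a small sketch. The fixed and variable figures below are hypothetical, not taken from Bradley et al.:

```python
# Hypothetical programme: fixed costs (rent, management) do not change with
# client numbers, while variable costs (counselling time, materials) do.
fixed_costs = 100_000
variable_cost_per_client = 400

def total_cost(clients):
    # Total annual cost at a given level of activity.
    return fixed_costs + variable_cost_per_client * clients

clients = 250
average_cost = total_cost(clients) / clients                    # 800.0 per client
marginal_cost = total_cost(clients + 1) - total_cost(clients)   # 400 per extra client
```

Average cost falls as activity rises because fixed costs are spread over more clients, which is why an average figure estimated at one level of activity can mislead at another.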

All cost calculations require assumptions to be made, and the analysis plan should include what attempts are to be made to assess the effects of changing these assumptions. Using the results in a number of simulations of treatment changes may also be part of the analysis plan. This sort of exercise may include some of the factors which may influence the future costs of delivering PSU treatments and types of interventions.
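One simple way to operationalise such a sensitivity analysis is to recompute a headline figure under alternative assumptions. All figures and scenarios here are invented for illustration:

```python
# Illustrative sensitivity analysis: how does cost per appointment change
# if key assumptions are varied? All figures are invented.
base_total_cost = 150_000
base_appointments = 3_000

def cost_per_appointment(total_cost, appointments):
    # Unit cost under one set of assumptions.
    return total_cost / appointments

scenarios = {
    "base case": cost_per_appointment(base_total_cost, base_appointments),
    "total costs 10% higher": cost_per_appointment(base_total_cost * 1.10, base_appointments),
    "attendance 20% lower": cost_per_appointment(base_total_cost, base_appointments * 0.80),
}
for name, unit_cost in scenarios.items():
    print(f"{name}: {unit_cost:.2f}")
```

Reporting the range across scenarios, rather than a single figure, shows readers how robust the results are to the assumptions made.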

6. Conduct new data collection

One of the main purposes of the analysis plan is to consider whether existing data are sufficient to fulfil the objectives of the study. If not, some plan has to be made to collect new data. This may be in the form of an observational study. For example, staff in settings such as an inpatient facility could be observed for a set period and resource use directly noted. This could form the basis for a detailed bottom-up costing study. There are problems with this type of data collection. It can be expensive, people may behave differently if observed, and the process may seem threatening to staff or disruptive to the care process.

An alternative is to use questionnaires or diaries with staff and/or clients. It is the allocation of staff time across the different activities performed which is most often missing. Asking staff to note their use of time in a diary for a set period is a reasonable way to understand how services work and gives some information about the resources required for different activities. It is particularly useful for examining the amount of non-contact time required for each treatment event. The first step is to explore all categories of staff time so that the full working day can be accounted for by the staff. For example, it is important to include training time, team meetings, supervision, case note writing, liaison work and travelling time. In some cases, it may also be useful to note the resources used by clients, either by asking them to complete diaries or by attaching recording forms to case notes. Data collection by survey presents similar problems to those encountered in observational studies. Although the method may not be so intrusive, there are problems with validating self-reports. Often this will need to be examined in relation to other sources of data, such as the level of activity. The timing of surveys is also important. For example, while a two-week survey may yield sufficient data, there would be problems if the weeks chosen fell, for example, in a main holiday period. In most cases, some cross-checking can be made with other sources. For example, total yearly time available can be calculated from the contracted hours minus holidays and average time off for sickness.
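The cross-check just described can be expressed directly. The contracted hours, holiday, and sickness figures below are illustrative assumptions, as is the diary total:

```python
# Cross-check diary data against total contracted time (illustrative figures).
contracted_hours_per_week = 37.5
weeks_per_year = 52
holiday_days = 25          # annual leave
average_sick_days = 5      # average time off for sickness
hours_per_working_day = 7.5

available_hours = (contracted_hours_per_week * weeks_per_year
                   - (holiday_days + average_sick_days) * hours_per_working_day)

diary_hours_reported = 1_690   # hypothetical total from staff diaries
discrepancy = available_hours - diary_hours_reported
print(f"Available: {available_hours} h, unaccounted for: {discrepancy} h")
```

A large discrepancy suggests that some activities, such as travel, supervision, or case note writing, have been left out of the diary categories.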

Gathering information requires resources. One question is whether these costs should be included as part of the costs of treatment. Again, there is no universal answer. Part of any good quality intervention should be systems to ensure quality. Monitoring systems, including resource tracing, can be seen as part of normal care, and hence their costs included. Similarly, there is a need to include some development costs in any situation. However, other costing studies may be undertaken as part of a wider research exercise, and if the costs of data collection are incurred for research purposes, these costs should be excluded.

7. Conduct full analysis

Only with the previous steps in place can the full analysis be undertaken. It is useful, as part of the analysis, to build in a number of checks on the calculations. For example, do the data on staff time correspond to the total number of hours available according to their contracts? Are resources being 'over' or 'under' used, and hence is the current level of activity sustainable for the future? Are the new data collected in line with staff expectations, or have some resources, including time, been omitted? It is important to state in the discussion of the results what items were excluded from the analysis.

For more details on data analysis and reporting of results, see Workbook 2 of this series.


Evaluation type 2

How to do cost evaluations across different provider agencies

Data need to be generated in a similar form across agencies. The ease of this process will depend on the comparability of existing data between agencies and on developing comparable new data collection methods. This may be easier when comparisons are being made between stand-alone services than where the PSU treatment is undertaken within a larger organisation, for example a hospital (Bradley et al., 1994). To facilitate comparability, it may be necessary to develop a checklist of the resources required for each unit or type of intervention, to check that all have been included.

It is also easier if programmes of care are delivered according to a similar protocol across agencies. While there is a need for comparable methodology, it could be counter-productive to have an overly rigid system of accounting for every type of resource use. It may be impossible, given current information systems, for allocations to be made at the level of detail of all individual resources, and a rigid system may lead to inaccuracies. Again, the needs depend on the purpose of the study and the detail of the analysis required. A more feasible approach may be to aim for broad comparability as a first round and then undertake more detailed analysis where large discrepancies are found between agencies delivering seemingly similar interventions or activities.

There are also particular valuation issues to be addressed when making comparisons over more than one agency. Local valuations of the same amount of resources may vary. If there is any substitutability of resources, these local variations in values may be expected to affect the treatments offered. However, the purpose of the study may not be to compare the values but the resource use itself. In that circumstance, figures giving values alone may be misleading.

You should work with each agency and devise a system which will meet the overall aims of the study and also provide useful information to individual agencies. Unless the agencies see the purpose of the costing exercise and do not feel overly threatened by the analysis, data reliability and continued data collection cannot be guaranteed. Over large areas, such as a country, such detailed work may not be possible, and there may be a reliance on charges or expert opinion on treatment costs (see, for example, Holder et al., 1991).

Another application of information on resource use across agencies is to examine the distribution of total resources compared to population needs. For this purpose, the focus is on measuring and valuing all resources rather than on accurately measuring the resources needed for each intervention. This type of study may take place at a more aggregate level and consist only of broad average costs. However, the loss of detail often means that no sensible comparisons can be made across services.


Evaluation type 3

How to do cost evaluations of wider social costs of PSU interventions

People with PSU disorders can have a number of health, social and legal problems. These problems can incur costs to a range of agencies and other members of society. Wider social cost studies attempt to trace out the full resource implications of PSU treatment (French et al., 1994; Godfrey, 1994). While these studies are commendable in their efforts to be comprehensive, they are typically complicated to conduct and have been criticised on a number of methodological issues.

The second case example located at the end of this workbook (by Williams and Gerstein) describes an evaluation of wider social costs and benefits for treatment programmes in the State of California, USA. Evaluators examined social costs and benefits of PSU treatment, such as criminal activity and employment earnings.

Identifying the range of costs

PSU treatment can involve costs for clients, families, other agencies and the rest of society. However, the treatment may also reduce future health and other welfare costs. A large range of effects could potentially be identified for inclusion in these types of studies. For example, some studies have been undertaken from the narrow focus of the health care sector, comparing the costs of PSU treatment with the reduced future health care costs attributed to that treatment (for example, Holder and Blose, 1987; Goodman et al., 1991). While such studies may be useful for advocacy purposes, there is a problem if comparisons are being made across programmes, because the programme with the highest savings in future health care costs may not necessarily be the one with the highest total benefits, including those to the individual client.

Most studies adopt a societal perspective and include only resource costs. Others, however, have attempted to trace out the impact on state finances or the taxpayer (see, for example, Gerstein et al., 1994). The taxpayer perspective focuses on who bears the costs. Thus, some transfer payments may be included to trace who loses and who gains as a result of the treatment process.

The main areas considered are:
• health and other welfare care use
• criminal costs
• productivity loss

Health and welfare use could include additional costs as a result of treatment, or future savings as a consequence of a successful treatment. Lost productivity has already been mentioned. As well as any losses as a consequence of treatment per se, there is also the potential effect on future job productivity.


Measurement and evaluation issues

The tracing of resource use after treatment usually requires a comparison at the client level. In other words, you need to trace a cohort of clients. However, this raises a major measurement problem of attributing any future costs to the intervention itself and not to other factors. Because most existing studies have used a comparison of costs before and after treatment for the same individuals, the data are subject to bias: the decline in resource use and costs may not be directly attributable to the treatment intervention. Another major problem of these studies is the lack of information about the social costs of those not in treatment.

With such wide effects to measure, there has to be some compromise with accuracy. In some systems, actual health care costs (or at least charges) can be traced. In general, however, self-reported outcomes have to be used. These are usually valued by taking an average cost figure from some official source of data. Some potential and costly effects, for example a road traffic accident, may be so low in incidence that either a very large sample or a long time period would be needed to obtain an accurate measurement. In some cases, epidemiological risks are known and, therefore, probable future consequences can be modelled via computer analyses.

A particular evaluation problem occurs with measuring the costs of criminal activity. The response to crime from the criminal justice system is one cost. More debatable is whether the value of goods stolen should be included as a social cost. It could be argued that the value of the goods in itself is not a resource loss: these goods are not lost but are 'transferred' to others in society. The resource costs of such crime are fear and other intangible costs, the criminal justice response, and the extra resources devoted to security measures which may result from high levels of crime. As with other controversial areas, a final decision will be linked to the study purpose. For purposes of generalisability, it is useful to present results which are capable of re-analysis by those who want to make other assumptions.

The measurement and valuation of lost productivity poses other problems. These indirect costs can be large for long residential and in-patient care and can overshadow all other costs. However, valuing the time spent in treatment as if the person were fully employed is likely to overestimate the resource loss to society. Also, it can rarely be assumed that successful treatment will result in large productivity gains if there is a high level of unemployment in the economy.

The final issue is the time period over which such studies are conducted. Some effects could be lifelong, although the longer the period from the intervention, the more difficult it is to attribute any resource change to the intervention rather than to some other factor. In general, studies are limited to relatively short periods of follow-up (e.g., one to three years).


It's your turn

Put the information from this workbook to use for your own setting. Complete the exercises below.

Remember to use the information from Workbooks 1 and 2 to help you complete a full evaluation plan. Review that information now, if you have not already done so.

1 Decide the scope of your study. Will you evaluate costs within an agency, across several agencies, or evaluate wider social costs?

• Within an agency
• Across several agencies
• Wider social costs

2 Determine what 'unit' of activity you will measure. Your unit of activity will depend largely on your research question (see Workbook 1). It could be a specific component of the programme: common examples would be counselling visits, inpatient stays, assessment interviews, or some division per time period, such as cost per type of care per hour or per week.

3 List programme cost sources that you want to evaluate. If evaluating services across agencies, decide the common measurement(s) you will use. Meet with planners from the other agency(ies) to achieve consensus on the evaluation methods.

For right now, do not assign specific monetary amounts to these sources; just record the different areas. We have started the list as an aid for you. Cross out the sources that do not apply to your situation, and add others that are not already listed.

Capital costs
• building
• equipment
• furniture
• vehicles

Building-related expenditure
• heating
• lighting
• property taxes
• maintenance

Staff costs
• salaries
• fringe benefits

Other service-related expenditure
• stationery
• telephone
• vehicle operating costs
• travel costs

Overhead costs
• management and administrative expenses

4 Assess the data that are available to you from existing sources, such as patient billing records, payroll accounts, etc. Determine what information you have available and what other information you will still need to find out. If you need to collect additional data, decide what method you will use to do this. Review Workbook 2 to help you choose an appropriate data collection measure.

5 Use your list of your programme's cost sources to begin to assign specific monetary amounts, per unit of activity. Consider capital costs, cost allocation, volunteers, and discounting in your calculations. If you do not have all the information right now, make note of what you know and also of what you need to find out later to complete the list. Decide whether you will use a top-down and/or a bottom-up approach for estimating costs. Consider your research resources and your available data while making this decision.
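For the discounting mentioned in step 5, one common approach is to convert a capital outlay into an equivalent annual cost using an annuity factor. The 5% discount rate, ten-year life, and 50,000 capital value below are illustrative assumptions:

```python
# Equivalent annual cost of a capital asset (illustrative rate and lifetime).

def annuity_factor(rate, years):
    # Present value of a payment of 1 per year for `years` years at `rate`.
    return (1 - (1 + rate) ** -years) / rate

def equivalent_annual_cost(capital_value, rate, years):
    # Spreads a capital outlay over its useful life, allowing for discounting.
    return capital_value / annuity_factor(rate, years)

eac = equivalent_annual_cost(50_000, rate=0.05, years=10)
print(f"Equivalent annual cost: {eac:.2f}")
```

Dividing instead by the simple asset life (50,000 / 10 = 5,000 per year) understates the annual cost, because it ignores the return the capital could have earned elsewhere.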


Example Case:

A research group was interested in measuring the cost of the prevention component of their programme during 1996. They compiled the following list:

Capital costs
• building (building value/60 years, then divided by 4 because the prevention programme occupies only 25% of the building) 15,000
• equipment (equipment value/10 years) 5,000
• furniture (furniture value/10 years) 2,000
• vehicles (vehicle value/10 years) 1,000

Building-related expenditure
• heating 500
• lighting 200
• property taxes 1,000
• maintenance 300

Staff costs
• total prevention staff salaries plus benefits 120,000

Other service-related expenditure
• stationery 400
• telephone 1,200
• vehicle operating costs 670
• travel costs 2,000

Overhead costs
• management and administrative expenses 80,000

Total 1996 costs 229,270
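As a check, the example list can be reproduced in a short calculation. This is a sketch only: the underlying asset values (e.g., a building worth 3,600,000) are back-calculated from the annualised figures in the list and are purely illustrative.

```python
# Reproduces the 1996 example case totals. Asset values (3,600,000 building,
# 50,000 equipment, 20,000 furniture, 10,000 vehicles) are back-calculated
# from the annualised figures and are illustrative only.

def annualised_capital(value, life_years, share=1.0):
    # Straight-line annualisation: asset value spread over its expected life,
    # scaled by the share of the asset the programme actually uses.
    return value / life_years * share

capital = (
    annualised_capital(3_600_000, 60, share=0.25)  # building: 15,000
    + annualised_capital(50_000, 10)               # equipment: 5,000
    + annualised_capital(20_000, 10)               # furniture: 2,000
    + annualised_capital(10_000, 10)               # vehicles: 1,000
)
building_related = 500 + 200 + 1_000 + 300         # heating, lighting, taxes, maintenance
staff = 120_000                                    # salaries plus benefits
other = 400 + 1_200 + 670 + 2_000                  # stationery, telephone, vehicles, travel
overheads = 80_000                                 # management and administration

total_1996 = capital + building_related + staff + other + overheads
print(f"Total 1996 costs: {total_1996:,.0f}")      # Total 1996 costs: 229,270
```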


Conclusion and a practical recommendation

In this workbook, we have outlined the basic principles and practices of cost evaluations within PSU services and systems. In undertaking cost evaluations, it is essential that you pay close attention to the principles and practices of planning and implementation as outlined in Workbooks 1 and 2. Trade-offs have to be made between the rigour with which you collect and analyse information to answer your evaluation questions and the resources you have available. You must strive to achieve the best possible information with the time and resources available to you. You must carefully document the limitations of your findings and conclusions. With these principles in mind, you will be able to undertake practical and useful cost evaluations within your treatment service or system.

After completing your treatment evaluation, you want to ensure that your results are put to practical use. One way is to report your results in written form (described in Workbook 2, Step 4). It is equally important, however, to explore what the results mean for your programme. Do changes need to happen? If so, what is the best way to accomplish this?

Return to the expected user(s) of the research with specific recommendations based on your results. List your recommendations, link them logically to your results, and suggest a period for implementation of changes. The example below illustrates this technique.

Based on the finding that operating costs in programme A are 20 percent lower than operating costs of programme B, yet the programmes serve the same number and type of clients, we recommend that programme administrators study information about client outcomes in these two settings. If outcomes are similar, it may be feasible to adopt the practices of programme A on a larger-scale basis.

Remember, cost evaluations are a critical step towards better understanding the day-to-day functioning of your PSU services. It is important to use the information that cost evaluations provide to redirect treatment services. Through careful examination of your results, you can develop helpful recommendations for your programme. In this way, you can take important steps to create a 'healthy culture for evaluation' within your organisation.


References

Bradley, C.J., French, M.T., & Rachal, J.V. Financing and cost of standard and enhanced methadone treatment. Journal of Substance Abuse Treatment, 1994, 11(5): 433-442.

Drummond, M.F., Stoddart, G.L., & Torrance, G.W. Methods for the Economic Evaluation of Health Care Programmes. Oxford, UK: Oxford Medical Publications, 1987.

French, M.T., Bradley, C.J., Calingaert, B., Dennis, M.L., & Karuntzos, G.T. Cost analysis of training and employment services in methadone treatment. Evaluation and Program Planning, 1994, 17(2): 107-120.

Gerstein, D.R., Johnson, R.A., Harwood, H.J., Fountain, D., Suter, N., & Malloy, K. Evaluating recovery services: the California drug and alcohol treatment assessment (CALDATA). Sacramento: California Department of Alcohol and Drug Programs, 1994.

Godfrey, C. Assessing the cost-effectiveness of alcohol services. Journal of Mental Health, 1994, 3: 3-21.

Godfrey, C., Hamer, S., & Sutton, M. Costing the York Drugs Resource Scheme. Working Paper, Centre for Health Economics, University of York, 1995.

Goodman, A.C., Holder, H.D., & Nishiura, E. Alcoholism treatment offset effects: a cost model. Inquiry, 1991, 28: 168-178.

Holder, H.D., & Blose, J.O. Changes in health care costs and utilization associated with mental health treatment. Hospital and Community Psychiatry, 1987, 39(10): 1070-1075.

Holder, H.D., Longabaugh, R., Miller, W.R., & Rubonis, A.V. The cost effectiveness of treatment for alcoholism: a first approximation. Journal of Studies on Alcohol, 1991, 52(6): 517-540.


Comments about case examples

The following cases present evaluations related to different aspects of cost evaluation. In the first case, Michael French and Kerry Anne McGeary describe the rationale and development of a cost data collection instrument called the Drug Abuse Treatment Cost Analysis Program (DATCAP). The second case, written by Ellen Williams and Dean Gerstein, presents an evaluation of wider social costs and benefits for PSU treatment in California, USA.

The DATCAP was developed and refined over a five-year period. The evaluators' goal was to create a structured and scientifically-based instrument for estimating the economic cost of PSU treatment services, thus providing a tool for making comparable cost estimates across agencies and over time. The DATCAP includes cost estimates for a variety of categories, such as major equipment, contracted services, and personnel. 'Free resources' such as volunteers are also included at estimated fair economic value. The basic unit of analysis is the total cost of an individual treatment programme, but this figure can be transformed to other units such as weekly cost per client. Examples of evaluations using the DATCAP are provided.

The California cost evaluation (CALDATA) was undertaken to provide scientific justification for continued PSU treatment costs. This was a broad-based evaluation, examining the costs of several treatments as well as the wider social impact of delivering these treatments in terms of clients' criminal activity, hospitalisations, employment earnings, and substance use. Evaluators used computerised record abstraction and client interviews to collect data. Results were generally supportive of PSU treatments, indicating that the economic benefits of PSU treatment outweighed the costs of providing treatment.

Authors of both cases point out the importance of obtaining reliable data for cost evaluations. In some settings, the majority of cost data are already compiled for accounting purposes. In other settings, no such data are available, or the data may be insufficient and/or unreliable. Evaluators must always consider whether they believe that existing data are sufficiently reliable to use in their cost evaluation. If not, alternative and more reliable sources of data should be considered.


The Drug Abuse Treatment Cost Analysis Program*

by
Michael T. French
University of Miami
[email protected]

and
Kerry Anne McGeary
University of Miami
[email protected]

Department of Economics and Health Services Research Center
University of Miami
1400 NW 10th Ave., Suite 1105 (D-93)
Miami, FL 33136

Acknowledgements: The authors would like to acknowledge support from the National Institute on Drug Abuse, which provided funding for earlier drafts of this paper. We are grateful to Marta Murillo and Valerie Deveux-Shepard for data entry assistance, and to numerous colleagues for comments and suggestions. Finally, we would like to thank Robert Anwyl for his editorial contributions to the present document.

Who was asking the question(s) and why did they want the information?

Drug abuse treatment providers need to know the cost of the services they provide. Indeed, continued public and private funding is now being linked to cost and outcome measures, and providers can use financial data to improve organisational efficiency.

One of the dangers, however, of promoting cost studies for treatment programs is that most program staff are not technically prepared to perform cost analyses, and little user-friendly information is available to offer them assistance. Furthermore, not all cost methods are consistent, which can lead to non-comparable estimates that are difficult to use for policy or planning purposes. Our paper provides treatment programs with a much-needed technical assistance tool. Specifically, we present a structured and scientifically-based instrument for estimating the economic cost of treatment services: the Drug Abuse Treatment Cost Analysis Program (DATCAP). The components of this instrument are outlined and findings from three actual case studies are presented.

Case example of a cost evaluation

The authors alone are responsible for the views expressed in this case example.

* This paper is a revised and condensed version of an earlier manuscript by French, M.T., Dunlap, L.J., Zarkin, G.A., McGeary, K.A., and McLellan, A.T. (1997).

Program evaluation research has a long tradition in health care, with several evaluation methods that are scientifically developed and empirically tested. Evaluation of substance abuse treatment programs is, however, a relatively new endeavour. Although the demand for such analysis is extensive, the availability of techniques and methodologies has been lacking. The recent increase in the demand for these evaluation studies has fostered the growth of distinct ways to conduct cost analyses. However, a lack of standardisation in cost analysis for the substance abuse field has led to an uneven set of initial studies that have varied considerably in terms of methods, terminology, and perspectives (e.g., Cruze et al., 1981; Harwood et al., 1984; Rice et al., 1991; Annis, 1986; McLellan et al., 1983; Holder and Blose, 1992; McCrady et al., 1986; NIDA, 1987; Horgan, 1991; CALDATA, 1994; Finigan, 1996). Consequently, most of the early studies cannot be compared across programs or to more recent studies because the measures are not equivalent (French, 1995; Dunlap and French, in press). As more economic evaluations are conducted, there will be an increasing need for comparable estimates of treatment services for treatment planning and policy recommendations. Performing a cost analysis is the first step in any complete economic evaluation of treatment interventions (Gold et al., 1996; French, 1995). If the cost analysis is unstructured or methodologically flawed, then a subsequent cost-effectiveness analysis, medical cost-offset analysis, or benefit-cost analysis will be compromised. Given the critical importance of following a structured and methodologically sound approach for estimating treatment cost, the primary purpose of this paper is to introduce a standardised data collection procedure that can meet these objectives.

The Drug Abuse Treatment Cost Analysis Program (DATCAP) has evolved over five years of research into a flexible cost instrument that follows economic principles, provides useful information to treatment programs and funding sources, and is user friendly. One of DATCAP's most appealing features is that the data and cost estimates from any particular program can be directly compared with similar data and estimates from other programs, and for individual programs over time.

The discussion and analysis that follow provide a conceptual framework for economic cost estimation and offer practical guidance on how to use the DATCAP instrument for substance abuse treatment programs. First, economic cost estimation through the DATCAP instrument is discussed, followed by the evolution of the instrument into its present form. Second, we review each section of the instrument and describe the information necessary to answer pertinent questions. The final sections outline current applications and limitations of DATCAP¹, along with policy implications and recommendations.

What resources were needed to develop and implement this instrument?

This instrument was developed over five years and with the assistance of several funding grants. Programs usually invest approximately two to three days to assemble the information and complete the instrument.

1 For a more extensive and detailed discussion of the DATCAP instrument, see French et al., 1997.


How were the data collected?

Conceptual framework for cost estimation

The following six elements are important steps to ensure reliability in cost estimation.

1 Select a single measurement perspective that is flexible, standardised, and widely supported (e.g., economic (opportunity) cost).

2 Focus on a single analysis perspective that is workable, standardised, and policy driven. In our case, we use the treatment program as our perspective for analysis.

3 Define a fixed set of cost categories that are consistent with standard economic theory and used in other cost analyses.

4 Develop and define a standard set of questions to arrive at cost estimates within each category.

5 Propose a relatively narrow range of acceptable sources for the resource use and cost data that will be used to calculate the estimates.

6 Describe and follow a consistent method for estimating costs for each category of data used.

The elements noted above can be considered strategic steps in the collection of reliable and accurate treatment cost estimates. Each step has been incorporated into the DATCAP instrument. By clearly defining and following these perspectives and steps, the reliability of the cost estimates will be maximised. The remainder of this section explains these steps in greater detail and shows how they are operationalized through DATCAP.

Regarding the actual content and organisation of DATCAP, the instrument reflects standard economic resource and cost variables that have appeared throughout the economic literature for many years (Mishan, 1975; Sugden and Williams, 1979). Specifically, the heart of economic cost analysis is opportunity cost or opportunity value (Drummond, Stoddart, and Torrance, 1987). Economists measure the cost of program resources based on the value of those resources in their next best use, while accountants and practitioners in most other disciplines typically estimate program costs based on actual expenditures (Horngren, 1982). The opportunity cost framework is operationalized in the DATCAP instrument, which fosters standardisation and comparability.

The concept of opportunity cost and other economic principles related to DATCAP are best explained in the context of examples. Recall that economic costs hinge on the opportunities that are foregone by using a resource. Thus, the economic cost of utilising a treatment counsellor for 40 hours per week over the course of a year is the value of the next best use of that counsellor's time. Accountants, on the other hand, would normally view the cost of this counsellor as the salary that she earned for a year of work. If the counsellor's salary was equal to the next highest value of her time, then economic cost and accounting cost are equivalent. However, if the counsellor was being paid a salary below the rate that she could obtain in the next best use of her time, then economic cost would be greater than accounting cost. Since competitive organisations do not typically acquire resources at a rate that is greater than market-clearing prices, accounting costs are almost always less than or equal to economic costs.
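The counsellor example above can be written out as a short calculation. This is a minimal illustration with hypothetical salary figures, not part of the DATCAP instrument itself:

```python
def counsellor_costs(annual_salary, next_best_use_value):
    """Return (accounting_cost, economic_cost) for one counsellor-year."""
    accounting_cost = annual_salary        # what the program actually paid
    economic_cost = next_best_use_value    # value of the time foregone
    return accounting_cost, economic_cost

# Hypothetical figures: the counsellor earns 28,000 per year but could
# earn 33,000 in her next best alternative, so the economic cost of her
# time exceeds the accounting cost by 5,000.
accounting, economic = counsellor_costs(28_000, 33_000)
```

When salary and next-best-use value coincide, the two measures are equal, matching the equivalence case described in the text.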

Economists are more concerned with opportunity costs than accounting costs because program evaluations require standardisation. For example, it would be unfair to compare the accounting costs of program A with program B when program A received significant donated resources and program B had to pay full market prices for essentially the same resources. If both of these programs used exactly the same resources within a particular year, program A would show a much lower accounting cost than program B, but the economic costs would be equal. In comparing the programs, the economic approach is the superior method because it is not distorted by program-specific differences in resource costs. In addition, program replicability would require information on economic costs rather than accounting costs because economic costs are a pure measure of resource value.

Another important issue for cost estimation is the perspective of the analysis. Depending on the study objectives, costs can be estimated from the perspective of treatment clients, treatment programs, third-party organisations (e.g., insurance companies), or society as a whole. Each perspective requires a different data collection process and estimation strategy. For example, treatment clients incur costs related to travel time, lost work time, and child care. Costs to the treatment program include personnel time, equipment, and facility rental. The DATCAP instrument adopts a treatment program perspective for data collection and analysis because programs require financial information for funding and performance measurement. Resource allocation rules are best developed for treatment programs; the treatment program is generally the unit for licensing and quality assurance evaluation, and the program is typically the contracting unit.

The final conceptual issue involves the actual presentation of economic costs. Most cost analyses will be based on annual costs because of record-keeping convenience and because one year represents a reasonable period to reflect a pattern of resource usage. The first variable that can be computed from annual data is total annual cost. This measure includes the cost of all treatment services and operations throughout the year. Total costs can in turn be divided into fixed and variable components. Fixed costs, such as physical capital and facilities, are invariant to the number of treatment clients served. Variable costs, such as personnel, vary with the size of a treatment program.

Information on total annual cost and client caseflow can be combined to estimate average annual costs, such as the cost of treating one client continuously for one year. Average cost estimates are useful because the data are normalised so that treatment programs that vary widely in size can be directly compared. For example, program A could have a much higher total annual cost than program B, but a smaller average cost because its costs are distributed over a much larger client base. This concept is referred to as economies of scale.

The total cost variable is an important measure for understanding the financial implications of program expansions. Specifically, marginal cost represents the incremental change in total cost due to a small change (e.g., one client) in program production or output. Depending on program operations, marginal cost may be very low up to a threshold level of additional clients (e.g., 10 or 15 clients) and then increase sharply because a new counsellor is needed or additional facility space is required as more clients are enrolled in the program.
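The threshold behaviour of marginal cost described above can be sketched as follows. The caseload limit and cost figures are hypothetical, chosen only to show the step in total cost when a new counsellor must be hired:

```python
def total_cost(clients):
    # Hypothetical cost structure: each counsellor handles up to 15
    # clients and costs 30,000 per year; other variable costs run
    # 500 per client per year.
    COUNSELLOR_CAPACITY = 15
    COUNSELLOR_COST = 30_000
    VARIABLE_PER_CLIENT = 500
    counsellors = -(-clients // COUNSELLOR_CAPACITY)  # ceiling division
    return counsellors * COUNSELLOR_COST + clients * VARIABLE_PER_CLIENT

def marginal_cost(clients_now):
    # Incremental change in total cost from enrolling one more client.
    return total_cost(clients_now + 1) - total_cost(clients_now)

# Below the threshold, marginal cost is just the per-client variable cost:
low = marginal_cost(10)    # 500: the 11th client needs no new counsellor
# At the threshold, the 16th client forces hiring a second counsellor:
high = marginal_cost(15)   # 30,500: the jump includes a full counsellor
```

The step function mirrors the text: marginal cost stays low until capacity is reached, then increases sharply.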

In summary, DATCAP collects resource use and cost data to enable the estimation of both economic and accounting costs, since each measure serves a different purpose. The unit of data collection and analysis is the individual treatment program rather than the client, the insurance company, or the taxpayer. Information from DATCAP can be used to estimate the total annual cost for a particular program, along with measures of average and marginal cost. Lastly, the basic structure of the DATCAP instrument has a long history in the economics literature, which strengthens the comparability and generalisability of the findings.


2 Given the precise formatting and length of DATCAP, it would be difficult to reproduce the instrument within this manuscript. However, the latest version of the DATCAP instrument and User's Guide is available from either Dr. French or Dr. McGeary at the University of Miami. Please direct all inquiries (e-mail preferred) to Dr. French or Dr. McGeary at the address on the first page of this paper.

Design and organisation of DATCAP2

DATCAP is usually administered as a face-to-face interview at treatment provider sites and collects and organises detailed information on the resources used in treatment operations and the associated economic costs (e.g., French, Dunlap, Zarkin, and Galinis, 1994). Table 1 presents the broad resource categories included in DATCAP along with specific items.

TABLE 1: Resource use and cost categories in the DATCAP instrument

Personnel:               Direct salaries; Fringe benefits; Volunteers
Supplies & materials:    Medical; Office; Housekeeping; Food; Residential
Major equipment:         Office furniture; Computers; Electronics; Medical
Contracted services:     Laboratory; Repairs/maintenance; Security; Housekeeping
Buildings & facilities:  Total space; Total usable space; Rate of use; Rental rate
Miscellaneous resources: Utilities; Insurance; Taxes; Telephone; Advertising
Not recorded elsewhere:  Goods; Services; Contracts; Printing
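The categories in Table 1 can be represented as a simple data structure in which itemised costs are summed to category subtotals and a program total. This is a sketch with invented example figures, not DATCAP's own software:

```python
# DATCAP-style resource categories (after Table 1) mapped to itemised
# annual costs; all figures below are invented for illustration only.
program_costs = {
    "Personnel": {"Direct salaries": 250_000, "Fringe benefits": 50_000,
                  "Volunteers": 12_000},
    "Supplies & materials": {"Medical": 8_000, "Office": 4_000},
    "Buildings & facilities": {"Rental rate": 36_000},
}

def category_subtotals(costs):
    # Sum the itemised costs within each resource category.
    return {cat: sum(items.values()) for cat, items in costs.items()}

def total_annual_cost(costs):
    # Total annual economic cost across all categories.
    return sum(category_subtotals(costs).values())
```

Items can be added or omitted per program, which corresponds to the flexibility the text describes, while the fixed category names preserve comparability across programs.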

These categories are generally set up to encompass the range of economic costs associated with many treatment programs. It is left to the program director to itemise resources and (with the assistance of an experienced administrator) estimate economic costs within each category. In some cases we have provided a list of items that may be appropriate only for a specific type of treatment program. Therefore, items may be added or omitted based on the program's needs. To ensure that the reliability of the instrument is not compromised by allowing this flexibility, the sections were developed to include cross checks so that the data are accurate and consistent. The strategy here is to maximise the reliability of information by (1) defining and standardising a fixed number of questions to use in determining each cost category; (2) facilitating the interpretation of the questions; and (3) recommending the types of data that will be acceptable to answer those questions. In general, the economic cost data are collected from general ledgers, personnel reports, expenditure reports, equipment requisitions, and inventory reports rather than from budgets, because budgets do not always coincide with actual resource use.

In focusing on the total economic costs involved with treatment, DATCAP also asks program directors to list free and/or subsidised resources used in treatment provision (e.g., volunteer workers). The market value of free or donated resources can be estimated by multiplying the share of a resource used by the drug abuse treatment program by the estimated fair-market value of the resource. For example, the estimated market value of a volunteer worker is the salary he or she could earn in a paid position doing similar tasks. Thus, the costs reported from DATCAP should include the values for all the resources used by a program in providing treatment, regardless of whether the program actually incurred an expense for the resource. We believe that this costing method provides a realistic measure of economic cost and more accurate comparisons between programs because it does not distort the true resource cost for programs that have better access to free, donated, and/or subsidised resources. The data collection process for DATCAP also includes questions on program revenues and client case flows. This information provides a useful link between the costs of the program, treatment services, and the sources of funds for the program (French et al., 1996). By collecting resource use and cost information we are able to gain a better understanding of treatment operations, which will enable a full economic evaluation.
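The valuation rule above (share of the resource used multiplied by its fair-market value) amounts to a one-line calculation. The figures in this sketch are hypothetical:

```python
def donated_resource_value(share_used_by_program, fair_market_value):
    # Market value of a free or donated resource: the share of the
    # resource used by the program times its estimated fair-market value.
    return share_used_by_program * fair_market_value

# Hypothetical example: a volunteer contributes half of a full-time
# schedule, and similar paid work commands 24,000 per year.
volunteer_value = donated_resource_value(0.5, 24_000)  # 12,000
```

Including such values alongside paid resources is what keeps programs with generous donors comparable to programs that pay full market prices.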

For each program, it is desirable to conduct a half-day site visit to administer the DATCAP instrument. However, much of the data collection can be done prior to the visit. Approximately one week before the site visit, the instrument can be sent to the programs with instructions on how to complete as many data categories as possible before a trained data collector arrives. Data collection often continues for a few days after the site visit to ensure that complete and accurate information is provided for all sections of the instrument.

How were the data analysed?

As noted earlier, DATCAP is a program-level instrument, and data collection would normally occur annually at each program. The specific variables that will be available from the instrument are diverse. For example, DATCAP information allows us to calculate the average cost of servicing one client continuously for one year at each program. This variable normalises the annual cost information so that average cost estimates can be compared across programs of different sizes and durations of treatment for the average client. Once again, these ratios convert the aggregate information into units that can be compared across programs for a standard period of time. The diagram below is an example of the type of information that can be compiled from DATCAP data.

Client/program characteristics:
• Clients served
• Average length of stay
• Static caseload
• Licensed capacity
• Range of services
• Type of program

Fixed opportunity costs:
• Equipment
• Office supplies
• Capital
• Facility
• Rent
• Contracted services

Variable opportunity costs:
• Personnel
• Equipment
• Medical supplies
• Physician services
• Advisory board

Revenue:
• Federal
• State
• Local
• Client fees
• Private grants
• Donations


What did they find out?

Recent case studies using DATCAP

Table 2 reports the results of three case study cost analyses that were completed late in 1996. The economic variables that are calculated for each site are total annual economic cost, weekly economic cost for a typical client, and economic cost per average treatment episode. All three programs are private, not-for-profit entities. Generally, public programs may have slightly lower economic costs compared to private, not-for-profit programs. However, if public programs are heavily subsidised at the federal, state, or local level, their accounting cost would be significantly lower than a similar private program in the same modality.

TABLE 2. Characteristics and costs of case study treatment programs (1995)a

Prog.  Financial structure      Modality               Avg. length of  Avg. daily  Total annual   Weekly economic  Economic cost per
                                                       stay (weeks)    census      economic cost  cost per client  treatment episodeb
A      Private, not-for-profit  Methadone maintenance  150             559         1,973,601      67.90            7,662
B      Private, not-for-profit  Outpatient drug-free   16              165         718,921        83.79            1,341
C      Private, not-for-profit  Outpatient drug-free   22              148         618,748        80.40            1,778

a The cost measures are reported in 1995 dollars.
b Economic cost per treatment episode may not exactly equal the product of weekly cost per client and average length of stay due to rounding.
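The per-client measures in Table 2 are consistent with two simple relationships: weekly economic cost per client is total annual cost divided by 52 weeks times the average daily census, and episode cost is weekly cost times the average length of stay. The following sketch checks this against Program B's figures; the formulas are inferred from the reported numbers, not quoted from the DATCAP documentation:

```python
def weekly_cost_per_client(total_annual_cost, average_daily_census):
    # Annual cost spread over the client-weeks delivered in one year.
    return total_annual_cost / (52 * average_daily_census)

def cost_per_episode(weekly_cost, average_length_of_stay_weeks):
    # Cost of one average treatment episode.
    return weekly_cost * average_length_of_stay_weeks

# Program B (Table 2): total annual cost 718,921; census 165; stay 16 weeks.
weekly = weekly_cost_per_client(718_921, 165)   # about 83.79
episode = cost_per_episode(weekly, 16)          # about 1,341
```

The same arithmetic reproduces the weekly figures for Programs A and C; small differences in episode cost reflect the rounding noted in footnote b.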

For our case studies, we found the two outpatient drug-free programs to have similar average lengths of stay: 16 and 22 weeks. These two programs were comparable in size, based on average daily censuses of 165 and 148 clients. However, a substantial difference of about $100,000 in total annual economic cost is present between the two programs. This total cost differential leads to some differences in other economic cost measures, such as the average weekly cost to service one client and the average cost for a treatment episode. Higher total and average costs for Program B relative to Program C can be partially explained by significant relocation and renovation expenses incurred by Program B during the relevant time period.

Cost differences are also found when comparing estimates across modalities. For example, Program A is a methadone maintenance clinic with an average daily census of 559 clients and an average length of stay of around 150 weeks. Given these relatively high caseflow statistics, the estimated total annual cost at Program A was considerably higher than at Program B or C. The estimated cost per treatment episode (weekly economic cost per client multiplied by the average length of stay) was also relatively high at Program A because methadone maintenance clients typically have a longer length of stay. However, the weekly cost per client for Program A is lower than for the other two programs. The methadone maintenance program therefore distributes its higher economic costs over a larger number of clients; in the field of economics this concept is known as economies of scale. While the purpose of these case studies and cost estimates is to illustrate the type of information that can be derived from DATCAP, comparisons with other DATCAP findings can be made by referring to French et al. (1996).

Table 3 presents a different perspective on resource use by showing the distribution of costs across resource categories. As expected, labour is clearly the dominant resource, with cost shares around 50 percent or higher. Building and facility costs account for a large proportion of Program A's total annual opportunity costs. This situation is unusual because Program A is using two very large facilities to accommodate its large case flows. The building and facility costs for Programs B and C are more representative of an average distribution. Additionally, we found that the distribution of costs across resource categories, both within and across modalities, is highly program specific. Program B had a large share of costs for major equipment, primarily due to the renovation and relocation expenses mentioned earlier. Program C had much lower major equipment purchases, and a significantly larger percentage of Program C's resources was allocated to labour compared with Program B. Otherwise, the small variation in the distribution of costs across programs is consistent with the organisational structure of most treatment programs and their reliance on labour-derived services.

TABLE 3. Distribution of costs across resource categories*

Prog.  Financial structure      Modality               Labour  Supplies and  Buildings and  Contract  Major      Other
                                                               materials     facilities     services  equipment  items
A      Private, not-for-profit  Methadone maintenance  62      3             20             5         4          6
B      Private, not-for-profit  Outpatient drug-free   58      2             5              3         30         3
C      Private, not-for-profit  Outpatient drug-free   71      5             4              8         4          8

* Percentages may not add to 100 due to rounding.

Potential limitations of DATCAP

The information reported in Tables 2 and 3 represents a significant improvement in the way treatment cost data have been collected, analysed, and reported. However, a few potential limitations are still present in the DATCAP approach to cost estimation. For example, some programs were not able to locate actual records to describe all resources used by their clinics. In these cases we had to rely on recall approximation and expert judgement. Obviously, reliability cannot be as high under these circumstances. Second, to estimate the opportunity cost of some resources we sampled local markets to estimate existing prices (e.g., real estate markets for rental rates of commercial property). These calculations may have some sampling error because we generally relied on a small number of vendors or agents for price quotes. Lastly, the most important contribution of DATCAP to treatment research is still limited by the relatively small sample of cost and effectiveness studies that have been completed to date. When standardised cost estimates are available from a larger sample of programs, we will have much more accurate information on treatment cost and cost-effectiveness. The method and data summarised here are certainly informative, but it would be incorrect to assume that the cost estimates are representative of all or even most substance abuse programs in the United States.

How were the results used?

Few studies have examined the costs of treatment programs, especially alcohol, drug, and mental illness treatment programs. The shortage of useful policy-relevant information in this area is especially unfortunate for federal and state agencies that are responsible for the allocation of public funds to treatment programs. These agencies often make policy decisions using outdated cost estimates and rough averages in their decision rules for Block Grants and other disbursements to individual programs. The lack of financing and cost information is also unfortunate for individual drug abuse treatment program directors as they try to develop strategies in anticipation of changes in health care markets and financing, and, more generally, to improve the operation and efficiency of their clinics.

In addition to the lack of cost and financing information noted above, proposed international health care reforms threaten to place new restrictions on the availability of funds for substance abuse services, posing challenges to the financial viability of drug treatment programs. In this changing environment, it is unclear whether treatment programs will receive more attention and possibly more funding from policy makers. However, policy makers lack critical information on the concerns and potential responses of substance abuse treatment programs to proposed changes.

We have tried to take the first step in addressing part of the information gap faced by agencies at all levels, and by treatment program managers, by developing DATCAP. DATCAP provides timely, accurate, and comparable cost estimates for all of the purposes and constituents noted above. This helps ensure that resource allocation decisions are made more reliably and accurately at the federal, state, and local levels. By focusing on economic cost, program administrators will be able to better estimate the opportunity costs of their services and identify which funding sources hold the most promise for continued growth. As more programs use DATCAP, more accurate and informative comparisons can be made across patient types, treatment modalities, and variations in therapeutic services.

Results from these types of cost analyses will provide program directors, other policy makers, and researchers with useful and current information on the cost and financial structure of a range of drug abuse treatment programs. Our past experience shows that program directors who completed DATCAP were grateful to have quantitative information. These directors have unanimously stated that the benefits of having these cost estimates significantly outweighed the investment of their time and other resources to assemble the data and participate in the data collection exercise.


It's your turn

What are the strengths and the weaknesses of the presented case example? List three positive aspects and three negative aspects:

Strengths of the case study

1

2

3

Weaknesses of the case study

1

2

3


References for DATCAP case example

Annis, H.M. Is inpatient rehabilitation of the alcoholic cost-effective? Con position. In Controversies in Alcoholism and Substance Abuse. The Haworth Press, 1986: 175-190.

The California Drug and Alcohol Treatment Assessment (CALDATA): General report. Sacramento, CA: State of California, Department of Alcohol and Drug Programs, 1994.

Cruze, A., Harwood, H., Kristiansen, P., Collins, J., & Jones, D. Economic costs of alcohol and drug abuse and mental illness: 1977. Research Triangle Park, NC: Research Triangle Institute, 1981.

Drummond, M.F., Stoddart, G.L., & Torrance, G.W. Methods for the economic evaluation of health care programmes. Oxford, England: Oxford University Press, 1987.

Dunlap, L.J., & French, M.T. A comparison of two methods for estimating the costs of drug abuse treatment. Journal of Maintenance in the Addictions, forthcoming.

Finigan, M. Societal outcomes and cost savings of drug and alcohol treatment in the state of Oregon. Report prepared for the Office of Alcohol and Drug Abuse Programs, Oregon Department of Human Resources, Salem, OR, 1996.

French, M.T., Dunlap, L.J., Zarkin, G.A., & Galinis, D.A. Drug abuse treatment cost analysis program (DATCAP): Cost interview guide for provider sites. Drug abuse treatment module, Version 2. Research Triangle Park, NC: Research Triangle Institute, 1994.

French, M.T. Economic evaluation of drug abuse treatment programs: Methodology and findings. American Journal of Drug and Alcohol Abuse, 1995, 21(1): 111-135.

French, M.T., Dunlap, L.J., Zarkin, G.A., McGeary, K.A., & McLellan, A.T. A structured instrument for estimating the economic cost of drug abuse treatment: Using the drug abuse treatment cost analysis program (DATCAP). University of Miami, Health Services Research Center, Miami, FL. Submitted to Journal of Substance Abuse Treatment, 1996.

French, M.T., McGeary, K.A., & McLellan, A.T. Drug abuse treatment cost analysis program (DATCAP): Cost interview guide for provider sites. Drug abuse treatment module, Version 3. Coral Gables, FL: University of Miami, 1996b.

Gold, M.R., Siegel, J.E., Russell, L.B., & Weinstein, M.C. (eds). Cost-effectiveness in health and medicine. New York: Oxford University Press, 1996.

Harwood, H.J., Napolitano, D.M., Kristiansen, P.L., & Collins, J.J. Economic costs to society of alcohol and drug abuse and mental illness: 1980. Final report prepared for the Alcohol, Drug Abuse, and Mental Health Administration, RTI 2734/00-01FR. Research Triangle Park, NC: Research Triangle Institute, 1984.

Holder, H.D., & Blose, J.O. Typical patterns and cost of alcoholism treatment across a variety of populations and providers. Alcoholism: Clinical and Experimental Research, 1992, 15(2): 190-195.

Horgan, C. Cost of drug treatment programs: Preliminary findings from the 1990 drug services research survey. Presented at the NIDA National Conference on Drug Abuse Research and Practice, Washington, DC, 1991.

Horngren, C.T. Cost accounting: A managerial emphasis (5th edition). Englewood Cliffs, NJ: Prentice Hall, 1982.

McCrady, B., Longabaugh, R., Fink, E., Stout, R., Beattie, M., & Ruggieri-Authelet, A. Cost effectiveness of alcoholism treatment in partial hospital versus inpatient settings after brief inpatient treatment: 12-month outcomes. Journal of Consulting and Clinical Psychology, 1986, 54(5): 708-713.

McLellan, A.T., Woody, G.E., Luborsky, L., O'Brien, C.P., & Druley, K.A. Increased effectiveness of substance abuse treatment: A prospective study of patient-treatment matching. Journal of Nervous and Mental Disorders, 1983, 171(10): 597-605.

Mishan, E.J. Cost-benefit analysis. London: George Allen and Unwin, 1975.

National Institute on Drug Abuse (NIDA). National drug and alcoholism treatment unit survey (NDATUS): Final report. Publication No. (ADM) 89-1626, 1987.

Rice, D.P., Kelman, S., & Miller, L.S. Economic costs of drug abuse. In Economic costs, cost-effectiveness, financing, and community-based drug treatment. Research Monograph Series 113, pp. 10-32. Rockville, MD: National Institute on Drug Abuse, 1991.

Sugden, R., & Williams, A.H. The principles of practical cost-benefit analysis. New York: Oxford University Press, 1979.


Case example of a cost evaluation

The California Drug and Alcohol Treatment Assessment (CALDATA): The costs and benefits of treatment

by

Ellen Williams
97 Hickory Ridge Road
Conway, Massachusetts USA 01341
Telephone 413-369-4983

and

Dean R. Gerstein
NORC/Washington DC Office
1350 Connecticut Avenue, Suite 500
Washington DC USA 20036
Telephone 202-223-6040

The authors alone are responsible for the views expressed in this case example.

Who was asking the question(s) and why did they want the information?

The California Drug and Alcohol Treatment Assessment (CALDATA) evaluated publicly supported drug and alcohol treatment programs delivered in the state of California from 1991 to 1992. The project involved an analysis of costs and benefits that applied to all drug users discharged from the major types of programs in the state. Before CALDATA, every U.S. study was somewhat limited, either by the treatment facilities being selected for convenience or by the study focusing on only some subgroup of participants. For example, researchers might study patients in facilities that co-operated because of professional affiliation with the research team. Other studies originated because of interest in special treatment philosophies or approaches, studying the effectiveness of medical, psychological, or social treatment models, or of various combination approaches at facilities chosen for special purposes. Evaluations based on these limited samples were suggestive but could not be confidently projected to treatment outcomes or costs in U.S. facilities at large, or in a large subdivision like the state of California (with one-eighth of the U.S. population).

Andrew Mecca, Dr.P.H., Director of the California Department of Alcohol and Drug Programs (ADP) in the administration of Governor Pete Wilson, had early experience overseeing drug treatment programs in Vietnam. He had seen outcomes that told him that "treatment works": drug and alcohol recovery programs changed the behaviour of their participants for the better and were a benefit to society. However, early in the 1990s, Dr. Mecca also knew that his department had no arguments that would rigorously justify the use of scarce public funds to pay for treatment costs. A statistical analysis of the treatment experience of a representative sample of clients was missing, even though California, given its population size and the scope of its economy, is larger than many nations. No other U.S. state had undertaken such a study, nor was such information available on a national level. Early in the 1990s, events came together to make the CALDATA analysis possible.

First, California had a database newly in place in 1991 that accounted for treatment programs. The California Alcohol and Drug Data System (CADDS) collected key data on admissions and discharges from all drug and alcohol treatment facilities legally required to report to ADP. Also in 1991, the National Opinion Research Center at the University of Chicago (NORC) received a federal award to design a survey gathering treatment outcome data from a nationally representative sample of persons discharged from drug and alcohol treatment facilities and to make a cost/benefit analysis of the results. By 1992, the research team at NORC had brought the design of the national treatment outcome study to completion, but a reorganisation of the federal agencies postponed the data collection phase. When Dr. Dean Gerstein, the Principal Investigator on the team, outlined the study to a meeting of the state directors of Alcohol and Drug Programs as a preliminary step to gaining their co-operation, Dr. Mecca saw the opportunity to fund a survey for the state of California alone that would anticipate the postponed national study, answering questions about the effectiveness, the costs, and the social utility of drug and alcohol treatment programs that were of critical importance to California.

Using federal funds earmarked for local evaluation research, ADP commissioned NORC to adapt the national design to gather data only in California and, in 1994, published the resulting CALDATA analysis of the effectiveness and the social costs and benefits of drug and alcohol treatment at the state level in the U.S. The report set forth the data from interviews with 1,850 individuals who had recently received treatment for drug and alcohol problems in facilities publicly supported by the state of California.

What resources were needed to collect and interpret the information?

Conduct of a study of this scope required the resources of a survey research center experienced in all aspects of survey design: questionnaire development, sample design, and informed consent and clearance procedures. Implementing the design required staff experienced in interviewer recruitment and training and in field management of data collection in dispersed localities, ranging from massive urban cores to remote rural enclaves. Delivering the data required resources for data-entering and cleaning information submitted on hard copy instruments, and for preparation of final data files accessible to analysis. Completing the data analysis required an expert in policy and clinical issues regarding treatment services, and an economist with general expertise in assessing the costs and benefits of problem behaviour and specific knowledge of drug treatment delivery systems in the U.S. NORC undertook to complete all tasks under a U.S. $2.2 million contract, which included subcontracting the cost/benefit analysis. The necessary staff included Dr. Gerstein as overall director, a sampling statistician, a statistical programmer, a project manager with operational responsibility for instrument design, data collection, data preparation, and budget management, a field project manager with a subordinate staff of five field managers overseeing seventy field interviewers, and a director of field training, with administrative support from NORC's financial centre and technical support from its centre for information services. Henrick Harwood conducted the economic analysis; a few years previously he had co-authored with Dr. Gerstein a major national study, sponsored by the U.S. Department of Health and Human Services, on drug treatment policy and programs. A field office was set up in Pasadena, near the residence of the field project manager, to facilitate communications within the large state of California, where the surveyed programs were as much as 750 miles apart and more than 1,800 miles from NORC's central office in Chicago.

How were the data collected?

The CALDATA study was carried out in four stages, 1992-1994: Stage 1 - Facility Sampling and Program Director Interview; Stage 2 - Respondent Sampling and Record Abstraction; Stage 3 - Respondent Interviews; Stage 4 - Analysis. The work in each stage is described in more detail in the sections that follow.

Stage 1 - Facility sampling and program director interview

Information in the CADDS database as of September 1992 provided the foundation for the sample of drug and alcohol treatment programs used by CALDATA. CADDS included all providers who received any type of public funding via ADP for treatment or recovery services during the current or previous year, including grants, contracts, and reimbursements from MediCal (the state-administered, federally funded program of medical care for the poor, the disabled, and the near-poor). State law required that they provide key administrative data about their programs and their participants in treatment. For practical purposes, the CADDS database yielded an exhaustive listing of drug and alcohol treatment facilities in California.

Facility sampling proceeded in three stages, under the direction of Dr. Robert A. Johnson. The ultimate goal was selection of a group of discharged participants that would yield data applicable to the whole population of participants discharged from publicly funded drug and/or alcohol treatment facilities in California in the sample year, October 1, 1991 - September 30, 1992. CALDATA's respondents had to be clustered geographically to economise on the costs of interviewer travel. (California extends 1,200 miles north to south along the Pacific coastline, with population concentrated in the southern two-thirds, and 300 miles west to east beyond the coast and the central valleys across thinly populated mountains and deserts.) The respondents also had to be related to the specific type of drug and alcohol treatment programs from which they were discharged, so that outcomes could be studied in relation to the kinds of services received and the settings in which they were delivered. The sampling team determined that at least 400 respondents were needed to analyse relationships between each type of treatment program, or modality, and participant outcome. Selection of a multi-stage probability sample met the sampling goals. Statistical accountability required that each member of the population of participants discharged in the sample year be given a known chance of selection, equalised to the extent possible within the strata of locality and treatment modality. At each stage, sample members received numerical weights reflecting the variation from random choice created by clustering and stratification.
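The weighting principle described above can be sketched in a few lines. The function and the example probabilities below are illustrative assumptions, not figures from the CALDATA design.

```python
# Hypothetical sketch: in a multi-stage probability sample, a member's
# weight is the inverse of the product of its per-stage selection
# probabilities, so that weighted totals estimate the full population.

def sampling_weight(stage_probs):
    """Weight for a sample member given per-stage inclusion probabilities
    (e.g. county, facility, participant-within-facility)."""
    p = 1.0
    for prob in stage_probs:
        p *= prob
    return 1.0 / p

# A participant from a certainty county (p = 1.0), a facility selected
# with probability 0.5, and a 1-in-3 within-facility discharge sample:
w = sampling_weight([1.0, 0.5, 1/3])
print(round(w, 1))  # 6.0: this respondent stands for about six discharges
```

Summing such weights over the interviewed cases, rather than counting cases, is what lets a clustered, stratified sample speak for the whole discharged population.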

To ensure appropriate geographical coverage, the 58 counties in California were first divided into five groups: Los Angeles County, the San Francisco Bay Area, Southern Urban, Central Valley, and Mountain/Rim counties. Because of the size of their treatment systems, six highly urbanised counties were selected with certainty: Los Angeles, San Francisco, San Diego, Orange, Alameda, and San Bernardino. Selection of 10 further counties proceeded by associating counties with smaller numbers of participants in recovery with geographically neighbouring counties with larger numbers of participants. The final county selection was made from these county clusters, so that Fresno, Kern, Riverside, Sacramento, San Mateo, Santa Clara, Solano, Stanislaus, Tehama, and Ventura counties were added to the six certainty counties.

The final stage of program sampling applied the modality principle, selection according to type of treatment program, to facilities in the 16 selected counties. Based on CADDS data, five distinctive treatment modalities were designated. Four modalities were to be sampled for participant discharges only, while the fifth added a subsample of participants currently in treatment. Treatment settings, services, and the characteristic population served varied among modalities:

Residential treatment - Facilities offering residential services with a variety of drug and/or alcohol treatments.

Social model recovery - Residential treatment plans specially developed in California that provide 31 days or more of recovery/treatment services to alcohol users in small programs that stress peer support and communal approaches.

Non-methadone outpatient - Facilities offering a variety of outpatient services and treatment plans. Programs may be large or small, and usually rely on psychological counselling.

Methadone/Detox - Outpatient facilities offering daily doses of methadone and/or other prescribed medicines to support planned withdrawal from heroin use, usually for short periods such as 30 days.

Methadone maintenance outpatient - Facilities offering chemical maintenance and other non-residential services for the opiate-dependent.

The CALDATA sampling plan worked with totals of participants discharged from treatment at its selected facilities in the course of the sample year, October 1, 1991 - September 30, 1992. Because participants receiving maintenance doses of methadone commonly remain in treatment for years, a subsample of participants currently in treatment at Methadone Maintenance Outpatient facilities (the CMM sample) was added to the study design.

NORC's nationally distributed field staff, with broad experience in social science research, was a project asset. Five California-based field managers joined Kay Malloy, CALDATA's California-based project field manager, to make telephone contact with the directors of the selected facilities, following up on an advance letter from ADP. During the initial phone contact, the field staff sought to schedule a site visit from a CALDATA abstractor and to arrange a mailing of the CALDATA Provider Questionnaire, an instrument that gathered basic administrative and cost data about the facility. It was expected that most facilities would require input from several staff members to complete the Provider Questionnaire. Individual questions sought data that measured administrative and staff turnover, qualifications of staff members, space owned or rented by the facility, the value of property and the cost of rentals, annual costs of equipment, supplies, external services, and volunteer services, and annual treatment revenues and public and private insurance coverage for services, with unit costs for various types of service. The data collection plan called for an advance mailing of the Provider Questionnaire to allow the facility to research answers, followed by a formal administration of the instrument by the abstractor in the course of the site visit.


Another important element in the campaign to gain co-operation of the facilities was Dr. Gerstein's presentation to a meeting of California directors of county alcohol and drug treatment agencies, explaining the methods and objectives of CALDATA. A free-flowing question and answer session allayed their numerous concerns about the approach to respondents, the confidentiality of records and privacy of respondents, and the uses of the data, and gained the assistance of many in reassuring other sampled providers concerned about participation in the study.

Two large chains of proprietary methadone maintenance facilities represented the major problem of provider enlistment, endlessly deferring or categorically refusing appeals for co-operation from field and project staff. When facility contacting had concluded, the field staff reported an 85% co-operation rate, scheduling site visits covering 83 of the 97 selected providers, which included 110 modalities of treatment. Some facilities were selected for more than one treatment modality; for example, methadone providers commonly offered maintenance as well as detoxification services. Facilities sampled twice for participant record abstraction were counted only once for Provider Questionnaire collection and site visit logistics. One of the 83 facilities withdrew from the study after abstraction was completed.

While scheduling site visits, the field staff worked energetically to recruit the required records abstractors. Many of the 55 abstractors trained in November 1992 to complete the sampling and abstraction phase of the study were interviewers with experience on other NORC studies, known to the field management staff and consequently recruited. However, the sampling and abstracting tasks at a CALDATA facility did not appeal to some experienced interviewers, who preferred more traditional interviewing. In contrast, individuals with backgrounds in medical records abstraction or drug treatment counselling were attracted to the CALDATA work.

Confidentiality considerations also constrained staffing. If an employee of the research team had a history of employment or treatment at a local facility, especially one selected for the CALDATA sample, it might appear to facility staff or to respondents that their information would circulate in the local community. That belief, however ill-founded, would compromise the candor of responses. All prospective abstractors, both experienced and new to NORC, were screened for associations with California treatment facilities, in either staff or participant roles. Only those without any ties to local treatment facilities were employed.

Respondent sampling and records abstraction

A two-day training session introduced the CALDATA abstractors to procedures for listing and selecting a random sample of discharged participants and for working with facility records to abstract data about each selected participant. Besides practising with sampling procedures and materials, abstractors practised with the two CALDATA abstraction instruments, the Participant Abstraction Record (PAR) and the Participant Locating Record (PLR). They were also trained on confidentiality principles and procedures, on managing their relations with facility staff, and on administering the Provider Questionnaire.

The Participant Abstraction Record was designed to capture analytical data about the sampled treatment episode and about prior participant history from administrative records. The PAR requested data on basic demographic characteristics, sources of payment for treatment, and living arrangements at admission. Other PAR items included sections on arrest and imprisonment, medical history, alcohol and drug history, drug test results, prior treatment, services received, discharge status, and charges for the sample episode. Nothing in the PAR permitted identification of the individual. The Participant Locating Record, by contrast, was designed to facilitate locating. Most of the PLR was given over to date of birth, Social Security Number, address and telephone number, driver's license, and other official numbers associated with the selected participant, including additional names and contact data for employers, relatives, friends, professional staff, and other treatment facilities.

The abstractors worked with sampling rules for each facility extrapolated from the information in CADDS. Sampling rates varied according to the size of the facility and were designed to yield an average of about 30 selected participants per sampled facility. Each modality at a facility had a set of sampling rules designed for it, furnished to the abstractor on a computer printout. A sampling interval, like "Take each third discharge starting with listed discharge #2", prescribed the selection rate and the random choice. The printout also described upper and lower limits for the total number of discharges in the sample year. The limits of one-half or double the expected number were designed to ensure that the number of discharges in the list created on-site by the abstractor corresponded within an acceptable range to the information reported to CADDS. When the total of listed discharges fell outside the limits, the abstractor phoned the NORC sampling department, which provided an adjusted sampling procedure.
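The printout's selection rule and its half-to-double range check can be illustrated with a short sketch. The function names and counts below are hypothetical, not CALDATA's actual procedures.

```python
# Hypothetical sketch of a systematic sampling rule such as "take each
# third discharge starting with listed discharge #2", plus the printout's
# half-to-double range check on the on-site discharge list.

def select_discharges(listed, interval, start):
    """Return every `interval`-th item from `listed`, beginning at the
    1-based position `start` (the randomly chosen starting discharge)."""
    return [item for pos, item in enumerate(listed, start=1)
            if pos >= start and (pos - start) % interval == 0]

def within_limits(n_listed, n_expected):
    """True when the on-site list falls between one-half and double the
    discharge total expected from CADDS."""
    return n_expected / 2 <= n_listed <= n_expected * 2

discharges = [f"D{n:03d}" for n in range(1, 11)]    # ten listed discharges
print(select_discharges(discharges, 3, 2))  # picks positions 2, 5, 8
print(within_limits(10, 30))   # False: abstractor phones the sampling dept.
```

The fixed interval with a random start is what makes the on-site selection both reproducible from the printout and statistically defensible.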

In some facilities, sampling proceeded quite smoothly. However, many abstractors discovered discharge totals that fell above the upper limit given in their sampling rules. Since the CADDS database from which the sampling rates were derived was new and contained less than one year's worth of data, the twelve-month discharge estimate for each program had been developed by projecting a straight line from the number of months of discharges that were thought to be reflected in the database. The actual 12-month list of discharges proved to exceed the estimate derived from CADDS by a factor of two or more in 31 of the 87 co-operating modalities. Overall adjustments were required to protect the integrity and statistical properties of the sample design, and also to keep the number of selected participants within the limits of the data collection budgets.
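The straight-line projection, and the factor-of-two overshoot that triggered adjustments, amounts to simple arithmetic. The counts below are invented for illustration.

```python
# Hypothetical sketch: CADDS held under a year of data, so each program's
# 12-month discharge estimate was a linear projection from the months
# observed. The numbers here are illustrative only.

def project_annual(discharges_observed, months_observed):
    """Scale an observed partial-year count linearly to 12 months."""
    return discharges_observed * 12 / months_observed

estimate = project_annual(40, 8)   # 40 discharges seen in 8 months of data
actual = 130                       # hypothetical on-site 12-month list
print(estimate)                    # 60.0
print(actual / estimate >= 2)      # True: the kind of overshoot found at
                                   # 31 of the 87 co-operating modalities
```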

When the participant sample was successfully selected at a facility, abstraction from program records with the PAR and the PLR usually proceeded without major difficulties. There were field stresses, because abstractors found it difficult to work efficiently in many facilities. The drug and alcohol treatment programs were open at irregular or inconvenient hours, and their staffs, preoccupied with their own work, often could provide only minimal assistance to the abstractors. Working space was cramped and improvised, and abstractors were forced to develop self-sufficiency at interpreting program record systems and finding the required data in participant records. A common disappointment for abstractors was the absence in the record of much of the detail needed for locating, on which success in finding the selected participant for interview in Phase 2 of the data collection depended. A number of completed PLRs had scant identifying detail; sometimes even the participant's name was borrowed from fiction or folklore, like Scaramouche or Humpty Dumpty.

At the close of Phase 1 data collection, CALDATA had selected and abstracted data for 2,746 discharged participants at 87 co-operating modalities, with an additional Continuing Methadone Maintenance (CMM) participant sample of 309 selected at 12 of the Methadone Maintenance facilities. (Four modalities were found ineligible at site visit because they discharged no participants in the sample year.) Adjusting for a handful of abstractions determined to be duplicates, a total of 3,045 respondents were to be sought for the Participant Interview. The 19 eligible modalities that refused co-operation represented 31,529 participant discharges in the sample year. Adjusted for participants lost from unco-operative providers as well as for missing records at co-operative providers, the Phase 1 participant abstraction completion rate was 76.5%. Table 1 below summarises Phase 1 sampling and facility co-operation.


Collecting completed Provider Questionnaires, the final Phase 1 CALDATA data collection activity, required a good deal of time and patience in the field. The requested information was not readily available at many facilities. Several different facility staff members were often needed to complete the questionnaire appropriately, and their responsiveness, indeed their perception of responsibility for the document, was confused and variable. Diplomacy and persistence eventually produced a completed Provider Questionnaire from 76 of the 82 facilities that contributed to the participant sample.

Respondent interviews

In Phase 2 of the CALDATA data collection, the field staff was increased to 70 interviewers, who were assigned the tasks of locating the 3,045 selected participants, explaining the study and securing their co-operation, and administering the Discharged Participant or Continuing Methadone Questionnaire as appropriate. The questionnaires were designed for a face-to-face interview averaging 1.5 hours.

The interview for discharged participants was structured around sharply defined sections covering experiences before, during, and after a sample treatment. After preliminary questions about the respondent's ethnic and educational background, the questionnaire repeated in three time frames a series of questions about drug and alcohol use, mental and physical health, illegal activities and criminal status, living arrangements and family, employment, and drug and alcohol and mental health treatments. A calendar designed for use with the questionnaire graphically displayed the three time segments for which behaviour questions were repeated: the 12 months just prior to the sampled treatment episode, the time span of the sample treatment, and the time elapsed since the sampled treatment episode. The questionnaire incorporated design features common to a number of large-scale drug outcome surveys under development at the time. The questionnaire used to interview the participants in continuing methadone maintenance (the CMM Questionnaire) was a modified version of the Discharged Participant Questionnaire. CALDATA Field Managers carried out a pretest of the questionnaire, the calendar, and the procedures for explaining the study with 32 respondents in January/February 1993; the pretest was particularly concerned to test respondents' ability to focus on time segments and keep information within the appropriate time frames, and interviewers' ability to keep the respondent on track through repetitive sets of questions.

TABLE 1: CALDATA phase 1 response rates by modality

Indicator                        Residential  Social  Outpatient  Methadone  Methadone    Total      CMM
                                              model   non-meth.   detox      maintenance  discharge  sample
                                                                                          sample
Sample providers                      19        23        27         19         18           106       18
Participants in target pop.       21,409     6,699    50,963     49,500      8,296       136,867    9,741
Co-operating providers                18        21        23         13         12            87       12
Participants represented in
  co-operating providers          20,370     6,079    40,034     32,940      5,916       105,338    6,946
Response rate based on
  co-operating providers           95.1%     90.7%     78.6%      66.5%      71.3%         77.0%    71.3%

In March 1993, a three-day training introduced the CALDATA interviewers to the Participant Questionnaires. In addition to scripted practice with their instruments, interviewers practised with protocols for approaching respondents, including a long script prescribed by the California Committee for the Protection of Human Subjects explaining the purposes of the study, its voluntary nature, and the confidentiality with which all information was held. A field procedures manual laid down the steps in contacting and locating respondents, beginning from the principle that CALDATA interviewers were never to explain the research except as a "health study" until they were sure that they were speaking to a selected respondent in private.

Training in field procedures drew upon the 32-person pretest. The field managers conducting the pretest had found their small group of pretest respondents dramatically more difficult to locate than respondents for research on issues like job market behaviour or health care expenditure. Once located, the pretest respondents as a group had been difficult to bring to the point of interview: evasive, suspicious, and prone to break appointments. Hence the CALDATA interviewer training prepared the field staff for special difficulty in contacting and locating, suggesting that patience, persistence, and a strong positive attitude from the interviewer would eventually reassure a respondent. Confidentiality protocols and issues were another training emphasis: interviewers' success in gaining the co-operation of CALDATA participants and gathering valid data from them depended above all on giving them confidence that their answers would in truth be held in utmost privacy, used only for research, and published only as summary statistics. To assist the field staff in making such assurances, the study obtained a Confidentiality Certificate from the U.S. Department of Health and Human Services that shielded the interviewers from legal action that might require them to testify in court about the whereabouts or criminal activity admissions of respondents.

At the conclusion of training, each interviewer received an initial assignment of about 30 cases. As expected, a proportion of respondents was located and interviewed fairly quickly, but many others were harder to pin down. To get word to this group, interviewers were encouraged to distribute, in the course of their locating inquiries, a distinctive blue business card with the neutral name used in fielding CALDATA and the toll-free phone number of the project's California office. Word was circulated among friends and contacts of the respondents that they were sought for a health study, and that a check in payment was due them. The project staff believed that the distinctive card would enhance the visibility of the study in the drug-using communities with which respondents had contacts. If it were generally acknowledged in the local drug cultures that the "health study" was legitimate, unthreatening research with a payment of $15, hesitant respondents could be won over to call the office and make appointments for interview. Consequently, about 400 respondents called the toll-free phone number.

In addition to circulating the card with the office phone number, the project instituted locating procedures with local agencies. A regular weekly check of a new-inmates jail list circulated by Los Angeles County (site of one-fourth of the sample) effectively located a number of respondents. The California state prison system endorsed the study and facilitated contacts with respondents in state prisons. Commercial databases giving recent addresses for persons using credit cards were accessed at modest cost. Computerised telephone directories were employed by field managers to track respondents with distinctive names to distant communities or to look for relatives who might have had recent contact. ADP arranged for inquiries about respondents' addresses with the California Department of Motor Vehicles, which maintains a huge database showing the addresses to which drivers' licenses as well as motor vehicle registrations are issued. The California Department of Social Services could not by statute divulge address information but agreed to search its large database of welfare recipients and to circulate a "health study" letter asking for phone contact to any respondents discovered.

The various database searches contributed significantly to the success of the data collection, but CALDATA staff found that the persistent, patient, unthreatening personal inquiries of interviewers were indispensable to the final outcome. Interviewers who could inspire confidence while maintaining patience and masking their eagerness to push the inquiry to a final result made the difficult contacts with homeless and all but nameless respondents. During the records abstraction, staff of facilities treating numbers of homeless people had asserted that CALDATA would never find and interview their discharged participants. But interviewers did find a reasonable number: 49 of the 121 assigned cases with no address information for the respondent or any relative or associate were successfully interviewed. The interviewers used every avenue open to them and were inventive about developing their own locating procedures, like blending into the background and lingering at centers maintained for the homeless, or in drug-dealing areas in metropolitan cities, until someone answered a casual question and furnished a clue to a missing respondent.

At the close of the nine-month Phase 2 field period, 1,821 respondents, 61.4% of the eligible participant sample, had completed the interview. Table 2 summarises the Phase 2 data collection by modality. Seventy-seven participant cases were ineligible and assigned a final Out of Scope status. Fifty-seven of these had died since the treatment episode; 8 spoke neither English nor Spanish and could not be interviewed (CALDATA developed a Spanish translation of the Discharged Participant Interview but was not prepared to gather data in other non-English languages); eight were physically or mentally incapacitated to the point that they could give no interview; and four were ineligible for other reasons.

TABLE 2: CALDATA final participant case status counts by modality

Modality type                            Original  Out of  Net     Completed  No         %
                                         sample    scope   sample  interview  interview  complete
1 - Social Model Recovery                   703       9      694      401        293       57.8%
2 - Methadone Detox                         474      24      450      294        156       65.3%
3 - Nonmethadone Outpatient                 641      11      630      382        248       60.6%
4 - Residential Treatment                   615      15      600      343        257       57.2%
5a - Methadone Maintenance Outpatient       302      13      289      182        107       63.0%
5b - Continuing Methadone Maintenance       310       5      305      219         86       71.8%
TOTALS                                     3045      77     2968     1821       1147       61.4%


One thousand one hundred forty-seven eligible participants were not interviewed. Two hundred thirty-one of this group, 7.8% of the eligible net sample, were contacted and refused to give an interview. Such a percentage of refusals is well within the acceptable range for most surveys. Five hundred ninety-two of the non-interview cases, 19.9% of the net sample, could not be located in the course of the nine-month field period. This percentage loss would not be expected in a population under no special stress. Given the population that CALDATA interviewers attempted to contact, and the uneven quality of identifying information, this loss rate is not surprising.
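The percentages quoted above follow directly from the net eligible sample of 2,968 cases; a quick recomputation (a check, not part of the original analysis):

```python
# Recomputing the non-interview percentages from the net eligible
# sample of 2,968 cases (3,045 sampled minus 77 out of scope).

net_sample = 2968
refused = 231
not_located = 592

print(round(100 * refused / net_sample, 1))      # 7.8
print(round(100 * not_located / net_sample, 1))  # 19.9
```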

Besides the refusals and unlocatables, the other non-interview cases were participants who had been located but for whom, for one reason or another, interviews had not been arranged. Small groups were in jails, prisons, or treatment centers that did not allow interviewer access. Seventy cases had been traced outside the borders of California but could not be reached to arrange a telephone interview. (An interview over a long-distance phone line was CALDATA's standard approach to remotely located respondents; altogether, 119 interviews were completed by phone.) Fifty-two non-interview cases had been located in California but had broken appointments, often repeatedly; time ran out on this group. The general CALDATA experience with evasive, slow-to-convince respondents suggests that many in the broken-appointment group would have given an interview had the deadline not halted field work. The final non-interview group comprised 185 respondents who were known to be welfare clients in California. Towards the end of the field period, the Department of Social Services forwarded letters to them describing the health study and giving the project office phone number for contact. Phone calls in response to the mailing were just beginning to come in to the project office as data collection ended. Indeed, the final 33 interview documents reached NORC's central office too late to be included in data processing and were set aside from the analysis. Had the field period been longer, a number of the 185 welfare cases would have completed interviews.

A final component of CALDATA Phase 2 data collection was the Validation Reinterview, organised to confirm the circumstances that interviewers reported when they submitted completed questionnaires. Such validation is routinely built into data collection plans employing in-person interviews with dispersed samples. Survey research interviewers seeking to conduct interviews at the convenience of randomly selected, scattered respondents necessarily work very independently. For CALDATA, NORC field managers took weekly phone reports of case status and field costs from interviewers and transmitted the results to the central office. Completed questionnaires received a quality control edit at the project's California office, and their receipt was then communicated to the central office. Objective verification of at least ten percent of each interviewer's completed cases with a brief telephone Validation Reinterview was required. The reinterview instrument confirmed that the interview took place with the sampled respondent when and where reported, checked the time elapsed, and verified content by re-asking a few innocuous data items. The validation process turned up one suspect interviewer when a sample respondent denied answering the questionnaire, asserting that he had refused to make an appointment when requested. The validation staff undertook a systematic review of the suspect interviewer's entire completed caseload and eventually determined to replace every interview. When validation of all interviewers' work was completed, 346 cases had responded to the validation reinterview, a validation rate of 19% of the completed caseload of 1,821, systematically distributed across the early, middle, and late field period.
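The ten-percent rule can be sketched as a per-interviewer random draw. The selection mechanics below are an assumption; the source states only the minimum rate, not how cases were chosen.

```python
# Hypothetical sketch: draw at least ten percent of each interviewer's
# completed cases for a telephone Validation Reinterview.
import math
import random

def validation_sample(caseloads, rate=0.10, seed=1):
    """caseloads maps interviewer IDs to lists of completed case IDs;
    returns ceil(rate * n) randomly chosen cases per interviewer."""
    rng = random.Random(seed)
    return {interviewer: rng.sample(cases, math.ceil(rate * len(cases)))
            for interviewer, cases in caseloads.items()}

picked = validation_sample({"int_01": list(range(25)),
                            "int_02": list(range(8))})
print({k: len(v) for k, v in picked.items()})  # {'int_01': 3, 'int_02': 1}
```

Rounding up with `ceil` guarantees that even a small caseload yields at least one validated case, which is what an "at least ten percent" rule requires.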

When Phase 2 data collection ceased on December 1, 1993, the large task of data preparation began. This meant cleaning and data-entering the information from Provider Questionnaires, Participant Abstraction Records, Discharged Participant Questionnaires, and Continuing Methadone Maintenance Questionnaires. As part of on-going quality control during the field period, the project office in California had given preliminary edits to the PQs, the PARs, and the Discharged Participant Questionnaires. On receipt at NORC's central office the instruments received a full edit, with coding of selected items in the Participant Questionnaire, before entry at computer terminals using a Computer Assisted Data Entry program with Autoquest software. Range and consistency checks for individual items were programmed into the software. Additional consistency or logic checks were added to the program by project staff. Quality control of data entry included a random re-keying of 10% of each operator's work by a second operator. After data entry, hard copy case materials were filed by case ID in the library maintained by NORC's data preparation facility. The final data files were delivered on diskette to the project team responsible for the analysis. Dean Gerstein, Robert A. Johnson, and Natalie Suter of NORC worked with Henrick J. Harwood and Douglas Fountain of Lewin-VHI, Inc., subcontractor for the cost/benefit analysis, to complete the report to ADP.

How were the data analysed?

The analytical team first inquired into the strengths and limits of the data. In Phase 1, the field staff had successfully completed work at 87 of 106 eligible modalities, for an 82% modality completion rate. In Phase 2, the field staff had interviewed 60.9% of 3,045 sampled participants in time to meet the data preparation deadline. Were the resulting data sufficient to represent the whole population of participants discharged from publicly supported treatment in the sample year in California, plus the population in continuing methadone maintenance? The participant completion rate was the product of the rate for participants represented in the Phase 1 abstraction times the rate for the participants data-entered from Phase 2: 60.9% of 76.5%, or 46.5%. Could the analysis proceed to discuss the effects and costs of drug treatment in California, looking at information that represented less than half of the participant sample? Answering this question meant separate consideration of sample bias due to provider non-response and participant non-response.
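The combined completion rate is simply the product of the two phase rates. A minimal sketch of that arithmetic (the two percentages are quoted from the report; the small difference from the published 46.5% reflects rounding of the inputs):

```python
# Combined participant completion rate: Phase 1 abstraction coverage
# multiplied by Phase 2 interview completion (rates as published in the text).
phase1_abstraction_rate = 0.765   # share of sampled participants with usable PAR data
phase2_interview_rate = 0.609     # share interviewed and data-entered in time

combined = phase1_abstraction_rate * phase2_interview_rate
print(f"combined completion rate = {combined:.1%}")
# About 46.6% from the rounded inputs; the report states 46.5%,
# presumably computed from unrounded case counts.
```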

Bias due to provider non-response was not equally distributed among modalities. Taking into consideration the number of discharged participants represented by successful abstractions, the Phase 1 completion rate was greater than 90% for the Residential and Social Model modalities, greater than 75% for the Outpatient Non-Methadone modality, and less than 75% for the Methadone Detox and Methadone Maintenance modalities (see Table 1, page 42). The lower co-operation rate from methadone programs stemmed from refusals by the owners of two large chains of methadone facilities operating for profit. To get some idea of the bias created by the relatively light sample of methadone providers, the analysts looked outside their CALDATA files. In the fiscal year before the CALDATA sample was drawn, California had reported on a number of participant and provider characteristics to the National Drug and Alcoholism Treatment Unit Survey (NDATUS), conducted by the National Institute on Drug Abuse. A comparison between CALDATA and NDATUS was run for participant age, sex, and ethnicity, and for the weekly work hours of several kinds of professional staff at facilities. For both residential and methadone providers, the CALDATA and NDATUS participant data were broadly similar. The data about professional staff levels in the two sorts of programs were likewise similar. The comparison with NDATUS suggested that bias in CALDATA results due to provider non-co-operation might not be severe.


To study the characteristics of participants whose data were missing because they had been sought but had not been interviewed during Phase 2 of data collection, the analysts had another resource: the data abstracted in Phase 1 from provider records into the Participant Abstraction Record. For most measures there was no difference between sampled respondents and sampled non-respondents. Where differences were statistically significant, a lack of data at the PAR item often seemed responsible. For example, participants described as Hispanic (Mexican Americans or other Americans reporting ethnic derivation from Spanish America) represented 37% of CALDATA respondents and 30% of non-respondents. But the Hispanic variable was missing in more than 20% of PARs for both respondents and non-respondents. This suggested data distortion from item non-response rather than a systematic difference between respondents and non-respondents. The overall conclusion from the comparison of PAR data was that the Phase 2 respondents could reasonably be taken as representative of the whole population of sampled discharged participants in CALDATA's sample year. It appeared that Phase 2 participant non-response depended primarily on poor quality address and other locating information obtained from provider records, and the quality of locating information in the provider record seemed largely independent of the characteristics and treatment outcomes of discharged participants.

The CALDATA analysts also considered the strengths and the possible limits of the design implicit in the Discharged Participant Questionnaire. The questionnaire asked about repeated categories of experience in three time segments: the 12 months before treatment, time during sampled treatment, and time since treatment. The data items permitted measurement and comparison of the behaviour of the same individual before and after treatment, or pre/post comparison. Pre/post comparison allows the analyst to use sampled participants as their own controls. Participant characteristics which tend to have a permanent influence on behaviour, like gender, ethnicity, personal appearance, early experience and upbringing, and other aspects of character and personality, are held constant in comparisons of experiences pre- and post-treatment. If there appear to be treatment effects, such background characteristics can be ruled out as causes. To get comparable controls for the sampled group using another sort of research design, it would be necessary to match each participant with a drug user not in treatment who had the same background characteristics: a sampling exercise of impossible complexity. The CALDATA design had another advantage for analysis. When statistical measures indicated pre/post treatment changes in the sample, the analysis could identify individuals who had changed and seek to describe the ways in which certain sub-groups who changed behaviour differed from each other and from other sorts of participants whose behaviour did not change.

In designing CALDATA to allow pre/post comparison, the analytical team relied on respondents' ability to recall and report details of their behaviour over several years with reasonable accuracy. The sample year began with discharges on October 1, 1991, while the last Phase 2 interviews were conducted in November 1993. The maximum retrospective recall for 12 months pre-treatment therefore extended over something more than three years. Given the familiar experience that some people cannot routinely recall events a week in the past, was the research team justified in expecting high quality retrospective recall from its participant sample? The pretest of the Discharged Participant Questionnaire conducted by field managers in January/February 1993 had sought to test sampled participants' ability to recall and to discriminate between past events in the three time frames. The results from that small sample indicated that administration of the questionnaire was indeed feasible. But as responsible students of behavioural change over time, the CALDATA analytical team felt obliged to address questions about the kind and degree of measurement error involved in retrospective recall when they examined their final data. They considered five possible sources of error:

Recall delay - People have progressively greater difficulty remembering events the more remote they are in time. CALDATA sought to minimise this source of inaccuracy by limiting recall to recent time. Questions focused on behaviour of high personal interest, more likely to be recalled than monotonous routines. The study's best safeguard against recall delay came from the fact that recall delay would minimise reporting that suggested beneficial treatment effects.

Telescoping - This measurement error derives from people's tendency to assign events to an earlier or later time period than the one in which they occurred. CALDATA sought to minimise telescoping by repeatedly focusing respondents' attention on the reference period of each question and by associating the beginning and end of each period with memorable events like admission to and discharge from treatment. To the extent that evidence of treatment effectiveness appeared in the data, it could be considered stronger for having risen above any error created by telescoping.

Under-reporting of sensitive behaviours - It is common wisdom that people are more likely to suppress responses that implicate them in criminal behaviour or otherwise tend to cast shame on them, such as having sex for money or drugs, armed robbery, diagnosis as HIV-positive, or use of drugs and alcohol. But participants discharged from drug treatment do not react to such questioning with the spontaneous alarm that might be expected of the general population. The research team knew that if CALDATA respondents could be reassured that they would incur no damage in answering, they would report illegal and illicit behaviour. Earlier drug research studies had run comparisons between self-reported illicit drug use and drug test results that showed respondent reporting was valid. In interviewer training CALDATA emphasised non-judgemental management of the interview, unbiased probing techniques, and confidentiality guidelines. Also, because the analysts knew of research showing that under-reporting is most common for sensitive events in the present or the near past, they did not rely on respondent reports of current illicit behaviour only, but based analyses on periods of 12 months or longer.

Reversion to more typical behaviour - Earlier studies of drug treatment had shown that participants use more drugs and alcohol and commit more crimes, often related to the need to support a drug habit, in the time period just before admission to treatment. Indeed, the high levels of substance abuse and related deviant behaviour are factors that induce participants to seek treatment. Thus CALDATA's analysis needed to be careful that it did not describe as a treatment effect changes caused by reversion to more typical behaviour. The year-long recall periods built into its data tended to smooth out pre-admission jumps in deviant behaviour. The analytical team agreed that small changes revealed by pre/post comparisons should be set aside as possibly influenced by reversion to more typical behaviour, but that reversion was a negligible issue when changes were large.

Differential non-response - A final bias in the data might arise because more participants whose treatment was beneficial co-operated with CALDATA than participants whose treatment was not. The analysts studied their PAR data and Non-Interview Reports on refusal cases and found little evidence of bias from this source. Those who rejected the study tended to come from more secure backgrounds, to have more financial and social resources, and to live in more comfortable circumstances post-treatment than those who co-operated.

Having reviewed potential sources of data distortion and satisfied themselves that the study design compensated appropriately for them, the analysts proceeded to run pre/post comparisons of the data for reported respondent behaviour in the areas covered by the repeated sections of the Discharged Participant Questionnaire, testing changes for statistical significance using standard methods.
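Because each participant serves as his or her own control, the pre/post comparison amounts to a paired analysis: each individual's change score is tested against zero. One such standard method is a paired t-statistic; the sketch below uses invented monthly crime counts (not CALDATA values) purely to illustrate the mechanics:

```python
import math
import statistics

# Hypothetical pre/post monthly crime counts for 8 participants.
# These numbers are illustrative only; each person is their own control.
pre = [12, 9, 15, 7, 11, 14, 8, 10]
post = [4, 5, 6, 3, 7, 5, 2, 6]

diffs = [b - a for a, b in zip(pre, post)]   # post minus pre; negative = decline
mean_d = statistics.mean(diffs)              # average change per participant
sd_d = statistics.stdev(diffs)               # sample standard deviation of changes
n = len(diffs)

t = mean_d / (sd_d / math.sqrt(n))           # paired t-statistic against zero change
print(f"mean change = {mean_d:.1f}, t = {t:.2f}")
# A large negative t indicates a decline unlikely to be due to chance alone.
```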

Analysing the costs and benefits of treatment called for use of measures derived from thirty years of research into the economics of drug treatment. Economic impacts of participant behaviour were calculated by measuring its negative drain on the overall economy, offset by values contributed by participant employment. When the post-treatment reduction in these economic impacts exceeded the cost of treatment itself, a benefit was judged to exist; when the cost of treatment exceeded the reduction, a loss was declared. This cost/benefit analysis distinguished costs to society as a whole from costs to taxpaying citizens. Costs to Society include losses of society's net productivity or losses in society's net wealth. Thus, participants who do not earn up to their potential because of drug and/or alcohol abuse represent negative economic impacts, as do the costs of health services, police forces, and corrections facilities deployed because of participants' drug use and related crimes. However, the values of goods or cash stolen by participants, and of welfare and disability payments they receive, do not count as costs but as transfer payments, in which money simply moves from one pocket to another within society as a whole. Costs to Taxpaying Citizens include only those losses to individuals in society who do not engage in any substance-abusing behaviour. For these people, the loss of earnings from drug- or alcohol-dependent participants who are not living up to their potential is of little concern, while the value of goods and cash stolen by substance abusers is a serious cost, as is money expended on welfare and disability payments made to drug and alcohol abusers.
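The two accounting perspectives can be made concrete with a toy ledger. All dollar figures below are invented for illustration; the point is only which items enter each total. Transfers (theft, welfare, disability) move money within society and so drop out of the societal total, but count in full for non-abusing taxpayers, while lost productivity counts only from the societal perspective:

```python
# Hypothetical economic impacts of one participant's behaviour, in dollars.
# These values are invented; only the classification logic mirrors the text.
impacts = {
    "lost_productivity": 100,  # real loss of society's output
    "health_services":   40,   # real resource use
    "criminal_justice":  60,   # real resource use
    "theft":             30,   # transfer: loss to victims, gain to thief
    "welfare_payments":  20,   # transfer within society
}

# Costs to Society: real losses only; transfers net out.
cost_to_society = (impacts["lost_productivity"]
                   + impacts["health_services"]
                   + impacts["criminal_justice"])

# Costs to Taxpaying Citizens: transfers hit taxpayers in full,
# but the participant's own lost earnings are excluded.
cost_to_taxpayers = (impacts["health_services"]
                     + impacts["criminal_justice"]
                     + impacts["theft"]
                     + impacts["welfare_payments"])

print(cost_to_society, cost_to_taxpayers)  # 200 150
```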

What did they find out?

CALDATA analysts calculated the costs related to participants' criminal behaviour and health care utilisation, and the value of their labour force productivity, by assigning average values to each criminal act, arrest, incarceration, health care utilisation, earning, and welfare/disability payment reported by participants, factoring in data from current statistical compilations such as the Sourcebook of Criminal Justice Statistics and Hospital Statistics. The cost of participants' sampled drug treatments was based on data from CALDATA's Provider Questionnaire augmented by CADDS data. The State of California spent $209 million in treating the 146,609 participants represented by CALDATA's discharged and continuing methadone samples. The average treatment for the 136,867 discharged participants lasted 95 days and cost $1,361. Residential treatment was substantially more expensive than outpatient treatment: $61.47 per day and $4,405 per episode for the Residential modality, versus $7.47 per day and $990 per episode for the Outpatient Non-Methadone modality.
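The per-day and per-episode figures quoted above imply an average episode length for each modality, which can be back-calculated directly. The costs are from the text; the implied lengths are my derivation, not reported values:

```python
# Implied average episode length = cost per episode / cost per day
# (per-day and per-episode figures as quoted in the text).
residential_days = 4405 / 61.47  # Residential modality
outpatient_days = 990 / 7.47     # Outpatient Non-Methadone modality

print(f"residential ~ {residential_days:.0f} days, outpatient ~ {outpatient_days:.0f} days")
# Outpatient episodes are far cheaper per day but run roughly twice as long.
```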

The analysis found that benefits to taxpaying citizens during the time participants were in treatment and in their first year post-treatment represented approximately $1.5 billion in savings, due mostly to reductions in crime. Each day of treatment paid for itself (the benefits to taxpaying citizens equalled or exceeded costs) on the day it was received. Table 3 summarises the pre/post comparison of costs both to taxpaying citizens and to society (earnings totals are bracketed as productive inputs counting against the other negative impacts).


The benefits of alcohol and other drug treatment outweighed the costs of treatment by ratios from 4:1 to greater than 12:1, depending on the type of treatment. For the whole society, the cost-benefit ratios ranged from 2:1 to more than 4:1 for all modalities except methadone maintenance outpatient.

The analysis supported the hypothesis that treatment was effective in changing participant behaviour. The level of criminal activity declined by two-thirds from before treatment to after treatment. Declines of approximately two-fifths occurred in the use of alcohol and other drugs from before treatment to after treatment. Reductions of about one-third in hospitalisations were reported from before treatment to after treatment, with corresponding significant improvements in other health indicators. Treatment for problems with the major stimulant drugs was as effective as treatment for alcohol problems and somewhat more effective than treatment for heroin problems. For each type of treatment studied, there were slight or no differences in effectiveness between men and women, younger and older participants, or among African-Americans, Hispanics, and Whites. Overall, treatment did not have a positive effect on the employment of participants. The most common source of income for participants before and after treatment was full-time employment. Welfare, illegal activities, and disability payments were the next most common income sources. Rates of employment and income from employment were both generally lower post-treatment than pre-treatment. Overall employment earnings declined by 29%. This finding is consistent with the depressed economic trend in California in 1991-1993. In every type of treatment, more participants enrolled in disability programs, and participants received more money in disability payments after treatment, with increases ranging from one-sixth to one-half. The analysis indicated that treatment increased eligibility for disability payments even though it led to overall improvements in health status.

TABLE 3: CALDATA costs, pre/post treatment

                              Total Dollar Costs                  Dollar Costs Per Person
                              Year Before       Year After        Year Before   Year After
Criminal Justice System       $1,086,043,000    $841,800,000      $7,935        $6,151
Victim Losses                    524,727,000     310,387,000       3,834         2,268
Theft Losses                     815,738,000     253,297,000       5,960         1,851
Health Care Costs                441,698,000     337,923,000       3,227         2,469
[Earnings]                    [1,378,105,000]  [1,101,356,000]    [10,069]      [8,047]
Lost Earnings                  2,343,151,000   2,619,912,000      17,140        19,164
Income Transfers                 250,466,000     275,563,000       1,830         2,013
Costs to Taxpayers            $3,118,672,000   $2,018,971,000     $22,786       $14,751
Costs to Society              $4,395,447,000   $4,109,605,000     $32,151       $27,035
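As a consistency check, the taxpayer-cost rows of Table 3 can be re-summed from their components (criminal justice, victim losses, theft losses, health care, and income transfers). Figures are in thousands of dollars as printed; the year-after total differs from the printed figure by $1,000, presumably a rounding artefact in the published table:

```python
# Component rows of Table 3, in thousands of dollars (as printed).
year_before = {"cjs": 1_086_043, "victims": 524_727, "theft": 815_738,
               "health": 441_698, "transfers": 250_466}
year_after = {"cjs": 841_800, "victims": 310_387, "theft": 253_297,
              "health": 337_923, "transfers": 275_563}

print(sum(year_before.values()))  # 3118672, matching the printed $3,118,672
print(sum(year_after.values()))   # 2018970, vs. the printed $2,018,971
```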

The cost-benefit analysis concluded that participants in the California treatment system reduced their criminal activity and health care utilisation, during and in the year following treatment, by amounts worth well over $1.4 billion, for an overall ratio of benefits to costs of 7 to 1. Savings included reduced criminal justice expenses (police protection, adjudication, and corrections), reductions in victim losses (stolen and damaged property, injuries, and lost work), and lower levels of health care utilisation (hospitalisations, emergency room use, and outpatient care). Savings were offset by modest increases in welfare and disability dependence.
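The headline 7:1 figure is consistent with the numbers quoted earlier: roughly $1.5 billion in savings to taxpaying citizens against the $209 million California spent on treatment. Both figures are from the text; the exact published ratio of 7:1 presumably rests on unrounded totals:

```python
# Back-of-envelope check of the overall benefit-cost ratio,
# using the approximate figures quoted in the text ($ millions).
savings_musd = 1_500         # approximate savings to taxpaying citizens
treatment_cost_musd = 209    # California's treatment spending

ratio = savings_musd / treatment_cost_musd
print(f"benefit-cost ratio ~ {ratio:.1f} : 1")
```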

How were the results used?

Dr. Mecca released CALDATA's findings at two back-to-back press conferences in September 1994 in San Francisco and Los Angeles. The report received extensive national press coverage and was influential in altering U.S. perceptions of the effectiveness and social value of drug and alcohol treatment. The fact that the study was brought to fruition so quickly, while pertinent national studies were not yet ready for release, was to its advantage. The White House Office of National Drug Control Policy and other federal agencies have referenced the study, and it is routinely cited in academic research into drug treatment and mental health evaluation.

CALDATA continues to yield results. Further analysis of the data has been supported by the Office of the Secretary of the U.S. Department of Health and Human Services and by the Robert Wood Johnson Foundation, one of the largest health-oriented philanthropies in the U.S. The March 1996 Treatment Protocol Effectiveness Study issued by General Barry R. McCaffrey, Director of the White House Drug Office, cited CALDATA as a major source of evidence for its White Paper on treatment effectiveness. Federal spending on drug treatment increased by over 17 percent between fiscal year 1994 and fiscal year 1996; the increase was over 25 percent when one compares 1994 with the federal budget request for fiscal year 1998.


It's your turn

What are the strengths and the weaknesses of the presented case example? List three positive aspects and three negative aspects:

Strengths of the case study

1

2

3

Weaknesses of the case study

1

2

3


References for CALDATA case example

Gerstein, Dean R., et al. Evaluating Recovery Services: The California Drug and Alcohol Treatment Assessment, General Report. Sacramento: California Department of Alcohol and Drug Programs, 1994.

Gerstein, Dean R., et al. Evaluating Recovery Services: CALDATA Methodology Report. Submitted to the State of California Department of Alcohol and Drug Programs by the National Opinion Research Center at the University of Chicago and Lewin-VHI, Inc. Chicago: 1994.

Gerstein, Dean R., and Harwood, Henrick J. Treating Drug Problems (Volume I: A Study of the Evolution, Effectiveness and Financing of Public and Private Drug Treatment Systems). Washington, D.C.: National Academy Press, 1990.

Suter, Natalie, et al. CALDATA Methodology Report Supplement 1: Data Collection Instruments for the California Outcomes Study (COS). Submitted to the State of California Department of Alcohol and Drug Programs by the National Opinion Research Center at the University of Chicago. Chicago: 1994.

Williams, Ellen, et al. CALDATA Methodology Report Supplement 3: Data Collection Manuals and Reference Materials. Submitted to the State of California Department of Alcohol and Drug Programs by the National Opinion Research Center at the University of Chicago. Chicago: 1994.