DOCUMENT RESUME

ED 297 171                          CE 050 692

TITLE           Measuring Up. Planning and Managing with Performance Standards--PY (Program Year) 88.
INSTITUTION     National Alliance of Business, Inc., Washington, D.C.
SPONS AGENCY    Department of Labor, Washington, D.C.
REPORT NO       ISBN-0-88713-604-4
PUB DATE        86
NOTE            65p.; Original version was written by the Employment and Training Institute, Inc. and funded by the New Jersey Department of Labor.
AVAILABLE FROM  National Alliance of Business Clearinghouse, 1015 15th Street, NW, Washington, DC 20005 (Related software package--$399.00; annual updates--$50.00).
PUB TYPE        Guides - Non-Classroom Use (055)
EDRS PRICE      MF01 Plus Postage. PC Not Available from EDRS.
DESCRIPTORS     Adult Education; *Employment Programs; Federal Legislation; Federal Programs; *Job Training; Performance; *Performance Factors; *Program Effectiveness; *Program Evaluation; Specifications; *Standards
IDENTIFIERS     *Job Training Partnership Act 1982
ABSTRACT
The purpose of this guide is to provide local policymakers and program managers with an approach for using performance standards as a tool for reviewing and improving performance in Job Training Partnership Act programs at the local level. The guide is keyed to the Department of Labor adjustment methodology for activities beginning July 1, 1988. Section I discusses the evolution of performance standards and their supporting rationale. Section II details a systematic approach to obtaining comprehensive baseline data on local performance, an analysis of the performance of specific activities and contractors within the service delivery area (SDA), an analysis of performance trends, and a comparison with other SDAs. Section III describes two approaches for improving performance: the Oversight Response, which identifies probable causes of poor performance, and the Planning Response, which provides tools to examine new strategies and approaches that will improve performance. Section IV summarizes the key issues related to planning and managing with performance standards. Sample reports, charts, and worksheets are provided as needed. Many of the reports and worksheets in the guide require routine, yet time-consuming, computations using local participant and financial data. A software option (not available here) is available for microcomputer users to speed up the analysis. (YLI)
***********************************************************************
     Reproductions supplied by EDRS are the best that can be made
                     from the original document.
***********************************************************************
U.S. DEPARTMENT OF EDUCATION
Office of Educational Research and Improvement

EDUCATIONAL RESOURCES INFORMATION CENTER (ERIC)

This document has been reproduced as received from the person or organization originating it.

Minor changes have been made to improve reproduction quality.

Points of view or opinions stated in this document do not necessarily represent official OERI position or policy.

"PERMISSION TO REPRODUCE THIS MATERIAL IN MICROFICHE ONLY HAS BEEN GRANTED BY

TO THE EDUCATIONAL RESOURCES INFORMATION CENTER (ERIC)."
Measuring Up
Planning and Managing With Performance Standards - PY 88
Activities of the National Alliance of Business are financed with both public and private resources. The largest share of public funding comes in a grant from the U.S. Department of Labor. Opinions reflected in this publication do not necessarily reflect official Labor Department policy.

Copies of this guide may be obtained by writing or calling the NAB Clearinghouse, 1015 15th Street, N.W., Washington, DC 20005, 202/289-2910.

Copyright © 1986 by the National Alliance of Business. All rights reserved.
TRN/068/1.5  TRN/056/2  TRN/126/1
ISBN 0-88713-604-4
Table of Contents

Introduction
Section I: Overview of Performance Standards
Section II: Identification of Local Performance
    Calculating Current Performance
    Baseline Program Review
    Trend and Comparison Analysis
Section III: Strategies for Improving Performance
    Oversight Response
    Planning Response
Section IV: Conclusion

Summary of Reports/Charts/Worksheets

Worksheet 1:  Performance Status Report
Worksheet 2:  Performance Status Graph
Worksheet 3:  Analysis of Enrollments and Expenditures
Worksheet 4:  Performance of Terminees by Characteristic - Adult
Worksheet 5:  Performance of Terminees by Characteristic - Youth
Worksheet 6:  Analysis of Programs/Contractors - Adult
Worksheet 7:  Analysis of Programs/Contractors - Youth
Worksheet 8:  Adult Trend Analysis
Worksheet 9:  Youth Trend Analysis
Worksheet 10: Follow-up Trend Analysis
Worksheet 11: Trend Analysis By Standard
Worksheet 12: Comparison Analysis
Oversight Response Summary
Systems Summaries
Effect of Changing Variables on Performance Standards
Worksheet 13: Model Adjusted Worksheet for Analyzing Client Characteristic Performance - Adult
Worksheet 14: Model Adjusted Worksheet for Analyzing Client Characteristic Performance - Youth
Worksheet 15: Model Adjusted Worksheet for Analyzing Client Characteristic Performance - Adult Post Program
Worksheet 16: Model Adjusted Worksheet for Analyzing Client Characteristic Performance - Adult Welfare
NAB Regional Service Office List
Acknowledgments
The original version of this guide was written by the Employment and Training Institute, Inc. and funded by the New Jersey Department of Labor through the Center for Human Resources, Institute for Management and Labor Relations, Rutgers University.

This guide, in its present form, was further developed and refined by Steven Golightly of the National Alliance of Business and Kenneth Ryan of the Employment and Training Institute, Inc.

A special thanks is extended to Kathleen Hacker for her creative and diligent document production.
Introduction
The Job Training Partnership Act is designed to be a performance-driven program. Performance standards are the tools that are used to assure that job training programs are a productive investment in human capital.
To fulfill the Secretary's mandate and assure that the goals of the Job Training Partnership Act are achieved, the U.S. Department of Labor has selected twelve performance measures, established a national standard for each measure, and designed a national adjustment methodology for adapting the national standards to local conditions. States must set standards for local service delivery areas for eight of the twelve, using either DOL's adjustment methodology or their own adjustment methodology established within parameters set by the Secretary. States must also determine whether these standards have been met, provide technical assistance, reward performance, and impose sanctions when standards are not met for two years. Governors must also set an entered employment rate standard for Title III formula funded programs and establish a goal for Title III wage at placement. Service delivery areas must formulate policies and operate programs that will meet or exceed the state-established performance standards, but they are encouraged to seek further adjustments if local conditions warrant.
Clearly, the most challenging aspects of implementing the new performance standards system fall to professionals at the local level. It is there that policies are made and programs are shaped. The purpose of this guide is to provide local policy makers and program managers with an approach for using performance standards as a tool for reviewing and improving performance at the local level. The guide is keyed to the PY 88 DOL adjustment methodology for activities beginning July 1, 1988. Earlier versions of this guide, which include the PY 84 through PY 87 models, are available through the National Alliance of Business. An outline of the guide follows.
Section I: Overview of Performance Standards - This section discusses the evolution of and rationale supporting performance standards.

Section II: Identification of Local Performance - This section details a systematic approach to obtaining comprehensive baseline data on local performance, an analysis of the performance of specific activities and contractors within the SDA, an analysis of performance trends, and a comparison with other SDAs.

Section III: Strategies for Improving Performance - This section describes two approaches for improving performance: first, the Oversight Response, which identifies probable causes of poor performance, and second, the Planning Response, which provides tools to examine new strategies and approaches that will improve performance.

Section IV: Conclusion - This section summarizes the key issues related to planning and managing with performance standards.
Readers are cautioned not to rely solely on performance standards to plan, manage, and evaluate JTPA; performance standards are only one measure of the effectiveness of JTPA. The mission established by the private industry council (PIC) and local elected officials may make other goals as significant as achieving the performance standards. For example, the use of JTPA in effecting institutional change at the local level or the use of JTPA in promoting job creation and economic development activities could be as important as local performance measures and must be considered in the self-evaluation process.
Many of the reports and worksheets presented in the guide require routine, yet time-consuming, computations using local participant and financial data. A software option is available for microcomputer users that speeds up the analysis and allows for graphic presentation of data. This program operates on MS DOS 2.0 and higher and requires an IBM PC, XT, AT, or true compatible with a minimum of 256K. The cost of the software is $399.00, with future year updates priced at $50.00. For more information on the software, contact the Employment and Training Institute at 1-800-932-0085.
The software option includes the following reports and worksheets which are referenced in this guide:
Title Worksheet Number
Performance Status Report 1
Performance Status Graph 2
Analysis of Enrollments and Expenditures 3
Performance of Terminees by Characteristic - Adult 4
Performance of Terminees by Characteristic - Youth 5
Analysis of Programs and Contractors - Adult 6
Analysis of Programs and Contractors - Youth 7
Model Adjusted Worksheet for Analyzing Client Characteristic Performance - Adult 13

Model Adjusted Worksheet for Analyzing Client Characteristic Performance - Youth 14

Model Adjusted Worksheet for Analyzing Client Characteristic Performance - Adult Post Program 15

Model Adjusted Worksheet for Analyzing Client Characteristic Performance - Adult Welfare 16
Section I: Overview of Performance Standards
Throughout the 20-year evolution of Federal job training programs, the government, local policy makers, and practitioners have agreed on the need to assess program performance. An equitable method of assessing these local programs, however, was a complex issue. Each community has a unique set of circumstances which directly affects its ability to operate effective job training programs. A program in an urban setting with a high unemployment rate serving a high percentage of "hard to serve" individuals may be performing well if 40 percent of trainees are placed in jobs. A similar program in a suburban setting with a low unemployment rate serving a low percentage of "hard to serve" individuals may be performing poorly if only 40 percent of trainees are placed in jobs. Since no standard method of making adjustments for local variables was available, the expected performance of past programs was determined by negotiations between local program operators and representatives of the Department of Labor.
Performance standards under JTPA are different: they allow for local variations in measuring the effectiveness of job training programs. Thus, performance standards have become an "adjustment system" in which the actual local standard reflects such factors as who is served and local economic conditions. For example, an SDA with a low unemployment rate serving a low percentage of difficult-to-place trainees will have comparatively high standards of performance. In contrast, an SDA with high unemployment serving a high percentage of difficult-to-place trainees will have relatively low standards of performance. If the model works as intended, the second SDA would not be penalized because of difficult local conditions; its standards would be adjusted downward.
The specific methodology incorporated in performance standards has evolved over the last six years. For the next two-year planning cycle (PY 88 - PY 89), five new measures have been added to the original seven performance standards. These are a youth employability enhancement measure and four post-program follow-up measures designed to assess the long-term impact of JTPA services. Governors are required to establish numerical performance standards in each SDA for at least eight of the 12 Federal standards. Governors are also required to establish a numerical standard for the entered employment rate in Title III (Dislocated Worker) programs and are encouraged to establish an average wage at placement goal for Title III terminees.
Specific Performance Measures
The Federal strategy for meeting performance management goals includes encouraging governors to use the authority granted to them under JTPA to further adjust SDA performance standards to address broader performance management concerns. THE GUIDE FOR SETTING JTPA TITLE II-A PERFORMANCE STANDARDS FOR PY 88, published by the U.S. Department of Labor, is an essential resource for seeking such adjustments.
The performance standards system is the first attempt to establish a national system for judging program performance that allows adjustment for the local factors that affect performance. In developing its adjustment approach, the Department of Labor analyzed data on prior programs and chose variables which were found to have a significant impact on performance. Each variable was weighted according to its average impact on performance. For example, by plotting the data from hundreds of communities, it was found that serving more dropouts resulted in fewer placements. The influence of this factor was determined mathematically and a corresponding weight was assigned. The current methodology is an extension of this simple example. Ultimately, 23 variables were found to affect performance significantly and are included in the model. Several critical policy issues were addressed in the establishment of the current methodology. Many of these issues included technical items such as statistical modeling, selection of data bases from CETA and JTPA, and tolerance level adjustments. Three of the most significant areas addressed deal with the selection of the performance measures, the establishment of national standards, and the identification of the local adjustment factors. A description of each of these issues follows.
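The weighting approach just described can be sketched in a few lines. Everything below, including the factor names, weights, and national averages, is invented for illustration; the actual PY 88 model uses 23 DOL-published variables and weights, and this sketch is not the official computation.

```python
# Illustrative sketch of a DOL-style adjustment: a national standard is
# moved up or down according to how far local factor values deviate from
# national averages, each deviation scaled by a model weight.
# All factor names, weights, and averages here are hypothetical.

def adjust_standard(national_standard, weights, national_averages, local_values):
    """Return a locally adjusted standard from weighted factor deviations."""
    adjustment = 0.0
    for factor, weight in weights.items():
        deviation = local_values[factor] - national_averages[factor]
        adjustment += weight * deviation
    return national_standard + adjustment

# Hypothetical SDA: more dropouts and higher unemployment than average,
# so the 68% adult entered employment standard is adjusted downward.
weights = {"pct_dropouts": -0.30, "unemployment_rate": -0.50}
national_averages = {"pct_dropouts": 25.0, "unemployment_rate": 7.0}
local_values = {"pct_dropouts": 40.0, "unemployment_rate": 10.0}

adjusted = adjust_standard(68.0, weights, national_averages, local_values)
print(f"Adjusted standard: {adjusted:.1f}%")  # 68 - (0.30 * 15) - (0.50 * 3) = 62.0%
```

An SDA serving a harder-to-place population in a weaker economy thus faces a lower numerical standard, which is exactly the equity argument the text makes.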
The twelve performance measures selected by the Department of Labor include four measures for adult programs, four measures for youth programs, and four measures for adult post-program performance under Title IIA. Separate measures were established for the youth and adult populations based on the differing needs and historical results for each population.
Adult Measures

Entered Employment Rate - the number of adults who entered employment at termination as a percentage of the total number of adults who terminated.
Welfare Entered Employment Rate - the number of adult welfare recipients who entered employment at termination as a percentage of the total number of adult welfare recipients who terminated.
Average Wage at Placement - average hourly wage for all adults who entered employment at the time of termination.

Cost per Entered Employment - total expenditures for adults divided by the number of adults who entered employment.
Youth Measures

Entered Employment Rate - the number of youth who entered employment as a percentage of the number of youth who terminated.
Positive Termination Rate - the number of youth who had a positive termination as a percentage of all youth who terminated. Positive terminations include entered employment, obtained employability enhancements, or attained PIC-approved youth competencies.
Cost per Positive Termination - total expenditures for youth divided by the number of youth who had a positive termination.

Employability Enhancement Rate - the number of youth who attained one of the employability enhancements, whether or not they also obtained a job, as a percentage of the total number of youth who terminated. Youth Employability Enhancements include:
a. Attained PIC-Recognized Youth Employment Competencies
b. Entered Non-Title II Training
c. Completed Major Level of Education
d. Completed Program Objectives (14-15 year olds)
Post-Program Measures

Follow-up Employment Rate -- the number of adult respondents who were employed during the 13th week after program termination as a percentage of the total of all respondents (terminees who completed a follow-up interview).

Welfare Follow-up Employment Rate -- the number of adult welfare respondents who were employed during the 13th week after program termination as a percentage of the total of all adult welfare respondents.

Weekly Earnings at Follow-up -- total gross weekly earnings of respondents who were employed at the 13th week divided by the total number of respondents employed at the 13th week.

Weeks Worked in the Follow-up Period -- total number of weeks worked in the 13-week follow-up period for all respondents divided by the total number of respondents.
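Each measure defined above reduces to a simple ratio of counts or dollars. The sketch below computes the principal adult and youth measures from hypothetical SDA totals; all figures and variable names are illustrative, not drawn from any real program.

```python
# Title II-A performance measures computed from raw SDA totals,
# following the definitions in the text. All totals are hypothetical.

def rate(numerator, denominator):
    """Return a percentage, guarding against an empty denominator."""
    return 100.0 * numerator / denominator if denominator else 0.0

# Adult totals
adult_terminees = 500
adult_entered_employment = 340
adult_expenditures = 1_500_000.00
welfare_terminees = 120
welfare_entered_employment = 70

entered_employment_rate = rate(adult_entered_employment, adult_terminees)
welfare_entered_employment_rate = rate(welfare_entered_employment, welfare_terminees)
cost_per_entered_employment = adult_expenditures / adult_entered_employment

# Youth totals
youth_terminees = 400
youth_entered_employment = 185
youth_positive_terminations = 310   # jobs, enhancements, or PIC competencies
youth_enhancements = 130
youth_expenditures = 1_400_000.00

youth_entered_employment_rate = rate(youth_entered_employment, youth_terminees)
positive_termination_rate = rate(youth_positive_terminations, youth_terminees)
cost_per_positive_termination = youth_expenditures / youth_positive_terminations
employability_enhancement_rate = rate(youth_enhancements, youth_terminees)

print(f"Adult entered employment rate: {entered_employment_rate:.1f}%")    # 68.0%
print(f"Cost per entered employment: ${cost_per_entered_employment:,.2f}")  # $4,411.76
```

The follow-up measures work the same way, with survey respondents rather than terminees in the denominators.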
Performance Standards - Title IIA

The Secretary of Labor established national performance standards (specific numerical levels) for each of the twelve measures based on an analysis of prior accomplishments. The Secretary's standards have been set at a point which 75% of the SDAs are expected to exceed based on previous performance. This differs from prior years, when meeting the Secretary's standards simply represented average performance. The national performance standards are as follows:
Adult Standards
Entered Employment Rate                 68%
Welfare Entered Employment Rate         56%
Average Wage at Placement               $4.95
Cost per Entered Employment             $4,500.00
Youth Standards
Entered Employment Rate                 45%
Positive Termination Rate               75%
Cost per Positive Termination           $4,900.00
Employability Enhancement Rate          30%
Post-Program Standards

Follow-up Employment Rate               60%
Welfare Follow-up Employment Rate       50%
Weeks Worked in the Follow-up Period    8
Weekly Earnings of all Employed at
  Follow-up                             $177.00
Local Adjustment Factors

The factors that are considered in adjusting the national standards for local conditions fall into three categories: the characteristics of those served, the length of training, and local economic conditions.
Who Is Served
The following trainee characteristics were found to affect performance significantly. They are either included in the law or are used in the adjustment model.
1. Female
2. Age 14-15
3. Age 16-17
4. Age 30+
5. Black
6. Hispanic
7. Asian/Pacific Islander
8. Dropout
9. Student
10. Post high school attendee
11. AFDC Recipient
12. GA/RCA Recipient
13. Unemployment compensation claimant
14. Handicapped
15. Offender
16. Person not in labor force
17. Unemployed 15+ weeks
Length of Training / Percent Terminees
Factors such as the length of training and percent of individuals in classroom and on-the-job (OJT) training were found to influence performance. However, for a variety of reasons, only one factor dealing with the length of the training program is included in the national model.

This factor takes into account the percentage of participants who terminate during the program year and only influences the two performance measures dealing with cost. Experience has shown that programs carrying over more participants into the next program year tend to have higher costs than those programs that carry over fewer participants.
Local Economic Conditions
Five factors which reflect local economic conditions are included in the model: local average wage, unemployment rate, population density, number of families below poverty level, and employee/resident worker ratio.
The state may establish an SDA's performance standards using DOL's adjustment methodology or its own adjustment methodology established within the parameters set by DOL. This adjustment takes into consideration local SDA conditions and calculates a numerical performance level for each of the twelve measures. SDAs must plan and operate programs to meet the established standards or seek adjustments to the state-established standards when local conditions warrant. The ability to perform at or above the standard is largely dependent on the policy direction provided by the PIC and the planning and management skills of local administrators.

The remainder of this guide provides an approach and some tools that will enable SDA/PIC staff to use the performance standards system as a "point of departure" for improving planning and management practices.
Section II: Identification of Local Performance
In order to begin a serious examination of performance, an SDA must have accurate, complete program information available on a continuous basis. A sound management information system is essential. Without these resources a manager does not have the ability to determine the status of current activities and cannot begin the process of improving performance. This first level of review, which includes calculating current performance, baseline program review, and trend and comparison analysis, allows the SDA to conduct a comprehensive analysis of overall performance. The following reports are required for this review:
State prepared SDA report
Locally prepared program reports
Calculating Current Performance

Calculating your current performance enables you to take a snapshot of local SDA performance and compare it to the twelve primary JTPA performance standards, either as approved in your Job Training Plan or as adjusted based on actual experience. Worksheet 1, Performance Status Report, will provide a report on current performance (actual performance) in relation to planned or readjusted performance standards (expected performance). The readjusted standards are obviously the most accurate way to analyze your performance since they base your standard on the actual characteristics of terminees, percent terminated, and current local economic data. For SDAs that wish to include the readjusted standards, Worksheets 13, 14, and 15 provide the methodology to generate readjusted standards. Depending on which method has been selected by your state to calculate the Welfare Entered Employment Rate, you may also need Worksheet 16. The Performance Status Report indicates how close you are to meeting your standards. The required information may be available through existing systems either within your SDA or from the state.
The following information is required to complete Worksheet 1.
1. Number of adults terminated
2. Number of adults entered employment
3. Total adult expenditures
4. Total youth expenditures
5. Average hourly wage at placement (adults)
6. Number of adult welfare recipients terminated
7. Number of adult welfare recipients entered employment
8. Number of youth terminations
9. Number of youth entered employment
10. Number of youth positive terminations
11. Number of youth employability enhancements
12. Number of adults employed 13 weeks after program termination
13. Number of adult welfare respondents employed 13 weeks after program termination
14. Average number of weeks worked in the 13-week period after program termination for adults
15. Average gross weekly earnings of all respondents employed at the 13th week after program termination
16. Planned performance standards (Job Training Plan) or readjusted standards as calculated on Worksheets 13, 14, and 15
If information on any category is unavailable, it is not possible to identify your current level of performance in relation to the twelve key performance standards. Adjustments to your Management Information System may be required.
Worksheets 1 and 2 are designed to provide an ongoing analysis capability, allowing SDAs to increase the frequency of performance analysis beyond that currently provided by state and local standardized reports.
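The "% of Standard" entries in Worksheet 1 are a ratio of actual to expected performance. The one subtlety is that the two cost measures run in the opposite direction (lower is better), so the sketch below inverts that ratio so that 100 or more always means the standard is met; that orientation is this sketch's assumption, not something the worksheet prescribes, and the figures are hypothetical.

```python
# Percent-of-standard calculation in the spirit of Worksheet 1, Section 2.
# For cost measures the ratio is inverted (an assumption of this sketch)
# so that >= 100 always reads as "standard met or exceeded".

def pct_of_standard(actual, expected, lower_is_better=False):
    """Return actual performance as a percentage of the expected standard."""
    if lower_is_better:
        return 100.0 * expected / actual if actual else 0.0
    return 100.0 * actual / expected if expected else 0.0

# Hypothetical SDA measured against the national adult standards:
print(round(pct_of_standard(61.0, 68.0), 1))
# 89.7 -> entered employment rate below standard
print(round(pct_of_standard(4050.0, 4500.0, lower_is_better=True), 1))
# 111.1 -> cost per entered employment better (lower) than standard
```

Charting these percentages period by period gives the same picture Worksheet 2's bar graphs provide on paper.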
Baseline Program Review

Baseline program review helps identify probable causes of poor performance through a comprehensive review of program outcomes and client characteristics. It begins the process of narrowing down the specific areas which influence performance by gathering data on program and subcontractor performance.
This process begins with an initial review of total enrollments and expenditures. Deviations from plan in the areas of total served, youth service levels, and expenditure rates will have a dramatic impact on actual performance. Worksheet 3 provides a suggested format for this review. This review may indicate, for example, that although expenditures are 90 percent of plan, adult and youth enrollments are only 60 percent of plan. This situation may result in failure to meet the cost standards since this would increase the "cost per" outcome significantly.
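The arithmetic behind that example is worth making explicit. Assuming entered employments fall roughly in proportion to enrollments, spending 90 percent of plan against 60 percent of planned outcomes raises the "cost per" figure by half. A quick check with hypothetical planned figures:

```python
# Quick check of the example in the text: expenditures at 90% of plan,
# outcomes at 60% of plan. Planned figures are hypothetical.
planned_expenditures = 1_000_000.00
planned_entered_employment = 250
planned_cost_per = planned_expenditures / planned_entered_employment   # $4,000

actual_expenditures = 0.90 * planned_expenditures                      # $900,000
actual_entered_employment = 0.60 * planned_entered_employment          # 150
actual_cost_per = actual_expenditures / actual_entered_employment

print(f"Planned ${planned_cost_per:,.0f} vs actual ${actual_cost_per:,.0f} per entered employment")
# 0.90 / 0.60 = 1.5, so cost per entered employment runs 50% above plan
```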
The next two steps in the baseline program review process involve a detailed analysis of actual performance: first by target groups and then by programs or contractors. Worksheets 4 and 5 will enable an SDA to analyze, by performance standard, youth and adult outcomes by terminee characteristics. Worksheets 6 and 7 will enable an SDA to analyze, by performance standard, youth and adult outcomes by
Worksheet 1                                          Report Period: __________

PERFORMANCE STATUS REPORT

SECTION 1 -- Enter current values in Columns A (Adult), B (Welfare), and C (Youth):

1. Number of Terminees
2. Number Placed
3. Positive Terminations
4. Employability Enhancements
5. Average Wage at Placement
6. Total Program Cost
7. F/U Employment Rate
8. F/U Weeks Worked
9. F/U Weekly Earnings

SECTION 2 -- For each measure below, enter Actual Performance, Expected Performance, and % of Standard in Columns A (Adult), B (Welfare), and C (Youth):

Entered Employment Rate
Cost per Entered Employment
Average Wage at Placement
Positive Termination Rate
Cost per Positive Termination
Employability Enhancement Rate
Follow-Up Employment Rate
Follow-Up Weeks Worked
Follow-Up Weekly Earnings
Worksheet 2                                          Report Period: __________

PERFORMANCE STATUS GRAPH

Transfer totals for expected performance (EP) and actual performance (AP) from Worksheet 1, PERFORMANCE STATUS REPORT, to complete the following bar graphs.

[Eight blank bar graphs, each with an EP column and an AP column:

ADULT -- Entered Employment (0-100%); Welfare Entered Employment (0-100%); Cost per Entered Employment ($500-$8,000); Wage at Placement ($2.75-$6.25)

YOUTH -- Entered Employment (0-100%); Positive Termination (0-100%); Cost per Positive Termination ($500-$7,000); Employability Enhancement (0-100%)]
WORKSHEET 3                                          Report Period: __________

ANALYSIS OF ENROLLMENTS AND EXPENDITURES

Columns: 12 Month Planned | % of Total | Planned | % of Total | Actual | % of Total | % of Planned for Period

Enrollments (Total)
    Adult
    Youth
Expenditures (Total)
    Adult
    Youth
Detail of Total Expenditures
    Administration
        Salary & Fringe
        Operating Costs
        Other
    Training
        Salary & Fringe
        Occupational Training
        Other Classroom Training
        OJT
        Special Programs
        Other
    Services
        Needs Based Payments
        Support Services
        Other
WORKSHEET 4                                          Report Period: __________

PERFORMANCE OF TERMINEES BY CHARACTERISTIC - ADULT

Columns (actual performance): Actual Terminees (# / %) | Entered Employment Rate | Cost Per Entered Employment | Average Wage | Follow-Up Employment Rate | Follow-Up Weeks Worked | Follow-Up Weekly Earnings

Characteristics:
Male
Female
Age 30+
Black
Hispanic
Dropout
UC Claimant
Unempl. 15+
Not in Labor Force
AFDC
GA/RCA
TOTAL
Others (Optional)
WORKSHEET 5                                          Report Period: __________

PERFORMANCE OF TERMINEES BY CHARACTERISTIC - YOUTH

Columns (actual performance): Actual Terminees (# / %) | Entered Employment Rate | Positive Termination Rate | Cost Per Positive Termination | Employability Enhancement Rate

Characteristics:
Male
Female
Age 14-15
Age 16-17
White
Black
Hispanic
Alaskan Native/American Indian
Asian/Pacific Islander
Dropout
Student
Not in Labor Force
Post High School Attendee
AFDC Recipient
GA/RCA Recipient
Handicapped
Offender
TOTAL
Others (Optional)
WORKSHEET 6                                          Report Period: __________

ANALYSIS OF PROGRAMS/CONTRACTORS - ADULT

Columns -- Performance Data: Entered Employment Rate | Welfare Entered Employment Rate | Cost Per Entered Employment | Average Wage at Placement; Post Program Follow-up: Follow-up Employment Rate | Follow-up Weeks Worked | Follow-up Avg. Weekly Earnings

Programs/Contracts:
Occupational Training
    1.
    2.
    3.
    4.
    5.
    6.
    Total Occupational Training
Other Classroom Training
    1. Job Finding Skills
    2. Remedial Programs
        A.
        B.
    3. Special Programs
        A.
        B.
        C.
OJT
Other
TOTAL ALL PROGRAMS
WORKSHEET 7                                          Report Period: __________

ANALYSIS OF PROGRAMS/CONTRACTORS - YOUTH

Columns -- Performance Data: Entered Employment Rate | Positive Termination Rate | Cost Per Positive Termination | Employability Enhancement Rate

Programs/Contracts:
Occupational Training
    1.
    2.
    3.
    4.
    5.
    6.
    Total Occupational Training
Other Classroom Training
    1. Job Finding Skills
    2. Remedial Programs
        A.
        B.
    3. Special Programs
        A.
        B.
        C.
OJT
Other
TOTAL ALL PROGRAMS
program or contractor. Taken together these reports will, in most cases, help isolate problem areas and provide a starting point for corrective action. This information will also be useful when you are planning new strategies to improve performance since you will know what programs work best, for what groups, and at what cost.

Although the type of analysis that can be done using Worksheets 4-7 can be very useful in pinpointing problem areas, SDAs are cautioned not to apply the SDA's overall expected performance levels to all programs and contractors or to specific target groups.

Programs must be looked at in the context of varying client needs and service strategies. For example, if an SDA has an adult entered employment rate of 55 percent and a cost per entered employment of $5,800, the SDA might be tempted to identify programs and target groups performing below this level as "problem areas." But, upon closer review, such programs may be exceeding model-based performance expectations based on the characteristics of participants being served and the type of training being provided. Similarly, a long-term program serving dropouts with an entered employment rate of 43 percent and a cost per entered employment of $6,400 may not necessarily reduce overall performance.
Not all activities will produce outcomes consistent with the overall expected performance levels established for an SDA: some will produce higher results and some lower. The challenge is to construct a job training system that, in total, exceeds overall established performance levels.

While analysis of the performance of specific program activities and subcontractors and the outcomes for specific client groups can be guided to a certain extent by the DOL model, DOL warns that a strict application of the model in these instances would be inappropriate.
Trend and Comparison Analysis

Once you have calculated your current level of performance and completed your baseline program review, you may want to conclude your overall performance analysis by examining performance trends and by comparing current performance with past SDA performance or with the performance of comparable SDAs. In order to complete both the comparison and trend steps you will need the following information:
Trend Analysis
1. Prior year's performance standards
2. Performance reports (available from this section)
3. Planned performance goals contained in the Job Training Plan
4. Statewide performance reports (available through your state JTPA office)
Trend analysis allows the SDA to visualize, on a regular basis, performance variations over a longer period of time. With this type of analysis, PIC members and SDA staff will easily be able to determine the direction in which the program is going. They can then take corrective action when a downward trend becomes apparent rather than waiting for performance to fall hopelessly below the standard.
Performance may fluctuate at monthly or quarterly review points for legitimate reasons. Fluctuations in performance may have a number of explanations: closed entry courses, late program start-ups, or funding delays. Trend analysis may help pinpoint a potential problem area. It may indicate, for example, that your particular SDA has developed a trend of decreasing entered employment rates with increased cost per placement for a six-month period. These trends would then be considered in analyzing SDA performance.
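The review the trend worksheets support can also be computed directly: express each review point as a percent of standard and flag a sustained decline before performance actually falls below 100 percent. A sketch under assumed quarterly data (the figures are illustrative, not from the guide):

```python
# Quarterly adult entered employment rates against a 55% standard (illustrative)
standard = 55.0
actuals = [62.0, 60.0, 58.5, 56.0]  # four review points

# Percent of standard at each review point, as the worksheets chart it
pct_of_standard = [round(a / standard * 100, 1) for a in actuals]

# Flag a downward trend: every period lower than the one before,
# even though all periods still exceed the standard.
declining = all(b < a for a, b in zip(pct_of_standard, pct_of_standard[1:]))

print(pct_of_standard)
if declining:
    print("Downward trend: consider corrective action before falling below standard")
```

The point of the exercise is exactly the one the text makes: every quarter here is still above standard, but the direction of movement is what warrants attention.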
Use Worksheets 8 and 9 to maintain an ongoing record of trends for each of the two performance standards groups, adult and youth, and Worksheet 10 for the post-program standards. Worksheet 11 can be used to track each standard on a yearly or more frequent basis.
Comparison Analysis

Comparison analysis offers a broader perspective on "how you're doing." It provides you with the opportunity to compare your current performance with:
Past local SDA experience
Performance of comparable SDAs (economy, labor market, and unemployment rate similarities)
Statewide SDA performance
The approved performance standards in your Job Training Plan represent minimal expectations. Many SDAs have established more ambitious performance goals or have planned incremental improvements in selected standards. You may find, for example, that although you have met all of your performance standards,
comparable SDAs are exceeding standards. Many SDAs are now competing for a portion of the "incentive" funds that are awarded by the states; thus, comparison analysis takes on new meaning.
Use Worksheet 12 to record and compare your SDA performance with various jurisdictions within a state. This worksheet enables the SDA to examine the extent to which individual standards were met by comparable SDAs. For example, if the SDA standard for adult entered employment is 55 percent and the actual performance is 52 percent, the chart would display the percent of standard at 94.5.
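The percent-of-standard figure in the example is simply actual performance divided by the standard. A one-line helper (the function name is mine, not the worksheet's):

```python
def percent_of_standard(actual: float, standard: float) -> float:
    """Express actual performance as a percentage of the standard."""
    return round(actual / standard * 100, 1)

# The example from the text: a 55 percent standard met at 52 percent actual.
print(percent_of_standard(52, 55))  # 94.5
```

Note that for the two cost standards the interpretation reverses: actual performance below the standard is the favorable direction.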
The approach and reports detailed in this section complete the first level of performance review and should provide sufficient information for SDAs to determine priority areas for further investigation and improvement. This baseline data should be carefully reviewed with local policy makers and professional staff. Section III: Strategies for Improving Performance describes steps that can be taken to improve performance - first the Oversight Response, which helps pinpoint reasons for poor performance, and then the Planning Response, which helps identify new strategies for improved results. In almost all cases, an SDA will be involved in some combination of the Oversight Response and the Planning Response.
WORKSHEET 8    Report    Period
ADULT TREND ANALYSIS
Complete the bottom portion of the worksheet and graph the quarterly/monthly trends in performance standards.
[Graph grid: vertical axis "% of Standard," 40% to 150% in 10% increments; horizontal axis, four "Period Ending" columns.]
Performance Standards Legend (% of Standard): Entered Emp.; Cost Per E.E.; Welfare E.E.; Wage at Plcmt.
WORKSHEET 9    Report    Period
YOUTH TREND ANALYSIS
Complete the bottom portion of the worksheet and graph the quarterly/monthly trends in performance standards.
[Graph grid: vertical axis "% of Standard," 40% to 150% in 10% increments; horizontal axis, four "Period Ending" columns.]
Performance Standards Legend (% of Standard): Entered Emp.; Positive Term.; Cost/Pos. Term.; Emp. Enhan.
WORKSHEET 10    Report    Period
POST PROGRAM TREND ANALYSIS
Complete the bottom portion of the worksheet and graph the quarterly/monthly trends in performance standards.
[Graph grid: vertical axis "% of Standard," 40% to 150% in 10% increments; horizontal axis, four "Period Ending" columns.]
Performance Standards Legend (% of Standard): F/U Emp. Rate; F/U Weeks Wkd.; F/U Wel. Emp.; F/U Wkly. Wage.
WORKSHEET 11    Report    Period
TREND ANALYSIS BY STANDARD
Performance Standard: ____________
Select the performance standard, complete the bottom portion of the worksheet, and graph the results.
[Graph grid: vertical axis "% of Standard," 40% to 150% in 10% increments; horizontal axis, four "Period Ending" columns.]
Legend: PY 86 EP / AP / %; PY 87 EP / AP / %; PY 88 EP / AP / %
WORKSHEET 12    Report    Period
COMPARISON ANALYSIS
Fill in the following worksheet using percent of standard accomplished within your SDA (Column A) and other comparable SDAs and/or the state as a whole (Columns B-E). Fill in actual values for lines 8-11 (e.g., retention rate 85% = 85% of placements still working after 60 days).
Columns: A. User SDA; B. Other; C. Other; D. Other; E. Other
1. Adult Entered Employment Rate
2. Cost Per Entered Employment
3. Welfare Entered Employment Rate
4. Average Wage at Placement
5. Youth Entered Employment Rate
6. Youth Positive Termination Rate
7. Cost Per Positive Termination
8. Youth Employability Enhan. Rate
9. Follow-up Employment Rate Adult
10. Follow-up Empl. Rate Adult Wlfr.
11. Follow-up Weeks Worked
12. Follow-up Weekly Earnings
Section III: Strategies For Improving Performance
The USDOL has significantly revised the Federal performance standards for PY 88. The general objective of the changes in the performance standards is to send a clear message to the JTPA system that resources and effort should be focused on those individuals with multiple barriers to employment and those program components which provide long-term benefit to participant labor market success. This policy direction supports the congressional intent in establishing JTPA.
For SDAs that need or want to improve performance, there are basically two approaches to take. The "Oversight Response" option helps pinpoint the factors in your existing programs and systems which are causing performance problems. The "Planning Response" option helps identify new strategies for improving your performance. Both responses should be viewed as ways of validating or modifying the key policies established by the PIC and elected officials.
Oversight Response

Once the key activities causing poor performance have been identified, the next step is to understand why performance is not meeting expectations. The "Oversight Response" provides the SDA with a diagnostic tool to review and identify the factors that contribute to poor performance.
In order for JTPA to function effectively at the local level, a complex and interrelated set of program and administrative systems must be in place. These systems and procedures cover a broad range, including areas such as participant assessment, job development, contracting, MIS, supportive services, and classroom training. These systems are operated either directly by the administrative entity or, in some cases, subcontracted to local service providers.
These basic elements are classified in the following three categories:
JTPA Services and Programs - This is the most critical component of the employment and training system and includes the following activities: outreach, eligibility determination, assessment, employability counseling, supportive services (including needs-based payments), job development, classroom training, on-the-job training, try-out employment, work experience, and other special programs.
Administrative and Management Systems - The quality and effectiveness of JTPA programs and services are directly determined
by the basic management and administrative systems which shape and support them. The most critical elements here include: planning (including youth competencies), organizational structure and staffing, fiscal management, MIS, contracting, and monitoring.
Organizational Roles and Relationships - JTPA allows wide latitude in determining the roles and relationships formed in local SDAs. The scope and nature of these roles can have a profound influence on the quality and cost effectiveness of the programs and services provided under JTPA. The critical areas here include: responsibilities of the grant recipient and administrative entity, planning responsibilities, and cooperative relationships with related agencies such as job service, economic development, welfare, and other special purpose agencies.
The Performance Status Reports, Baseline Program Review Reports, and Trend and Comparison Analysis Reports detailed in Section II can provide you with an abundance of data which offer specific "access points" to program areas, contractors, or activities with performance problems. Failure to meet a given performance standard should be viewed by management as a flag of a possible problem. However, the performance standards model cannot tell the program manager exactly what the problem is.
Using the observed performance problem as the departure point, the Oversight Response enables the SDA to determine which of the basic employment and training systems may be causing the performance problem and provides specific guidance on both the areas to be reviewed and the method of review.
For example, suppose an SDA's actual cost per entered employment is substantially above the standard predicted by the model. The manager would first identify programs or contractors significantly above the standard. This could be determined from the baseline program review conducted in Section II. Assuming that the entered employment rate meets the standard, the problem could be isolated to the cost of training. The manager might then determine if the problem was caused at the planning level, the contracting level, or if the actual costs of training differ from planned and contracted terms. Thus, a review would be conducted of the planning and contracting systems of the SDA and the fiscal system of the specific programs or contractors. This oversimplified example does not address many important factors, but it does illustrate the type of
branching analysis that can pinpoint the cause of performance problems.
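The branching logic of this example can be sketched as a small decision routine. The argument names and return messages below are illustrative only, not part of the guide's methodology:

```python
def diagnose_cost_problem(cost_actual, cost_standard,
                          ee_rate_actual, ee_rate_standard):
    """Illustrative branching analysis for a high cost per entered employment."""
    if cost_actual <= cost_standard:
        return "Cost standard met; no cost review indicated"
    if ee_rate_actual < ee_rate_standard:
        # Few placements inflate cost per placement: look at outcomes first.
        return "Review job development and training quality (low placement rate)"
    # Placement rate is fine, so the problem isolates to training cost itself:
    # compare planned, contracted, and actual costs to find where they diverge.
    return "Review planning, contracting, and fiscal systems (training cost)"

# The scenario from the text: cost above standard, placement rate met.
print(diagnose_cost_problem(6400, 5800, 0.57, 0.55))
```

Each branch corresponds to a different set of Systems Summaries to consult; a real review would of course weigh many more factors than two.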
Some SDAs may opt for a total review of all the suggested systems in order to maximize performance rather than confine the review to programs identified as having problems in the baseline review. By conducting a complete review of all systems, the SDA has an opportunity to approach performance improvement in a more comprehensive manner, allowing for, in most cases, a one-time alignment of these primary systems.
SDAs should use the Oversight Response Summary to identify the program, administrative, and organizational systems that may be causing poor performance; the JTPA systems which are most likely to influence the standard are marked with an X. Since all systems may affect performance standards to some extent, SDAs should be familiar with the interrelationship of the standards and concentrate their reviews on the primary problems.
Once an SDA has determined which systems should be examined further, it should refer to the Systems Summaries, which provide guidance on both the areas to be reviewed and the method of review. Finally, appropriate corrective action plans should be formulated and implemented.
Summary of Oversight Response Approach

Step 1 - Identify specific performance measures needing improvement through the Performance Status Report.

Step 2 - Identify programs, contractors, or problems with target groups which may be causing poor performance by using Worksheets 4, 5, 6, and 7. Make allowances for varying client needs and service strategies.
Step 3 - Identify probable systems causing poor performance, either within the program or contractor or throughout the entire program, using the Oversight Response Summary.
Step 4 - Use the Systems Summaries to conduct a review of each area identified above.
Step 5 - Design, implement, and follow up on corrective action recommendations.
OVERSIGHT RESPONSE SUMMARY

[Chart: for each system, X marks in the original indicate which of three standard groups it most likely influences: (1) entered employment rates and positive termination rates, (2) cost standards, (3) average wage at placement. The individual X marks did not survive reproduction; the column comments are retained below.]

JTPA Services & Programs
Services: Outreach; Eligibility Determination; Assessment; Counseling; Needs Based System; Other Supportive Services; Job Development/Placement
Programs: Skill Training; Other Classroom Training; Special Target Programs; Work Experience; OJT
Entered employment/positive termination rates: Initial outreach and assessment (matching client to training) directly affect dropout rate. Ongoing counseling, needs-based payments, and other supportive services also affect retention. The job development effort is most critical to these standards. The quality of training and OJT slots directly affects these standards.
Cost standards: Cost of needs-based payments and supportive services affects cost standards. A significant variance in average length of training and actual cost will affect the cost standards.
Average wage at placement: Job development most directly affects this standard. The nature and quality of training and OJT affect the starting wage of completers. Problems may occur at the planning or contracting stage.

Administration & Management Systems
Planning (including youth competencies); Organization/Staffing; Fiscal; MIS; Contracting; Monitoring
Entered employment/positive termination rates: Planning in demand occupations is essential. Status of youth competency system will have a dramatic effect on positive termination rate. Accurate MIS is required, as well as clear, performance-type contracts, in order to meet standards.
Cost standards: Inadequate youth competency system will affect "cost per" standards. Problems with cost and length of training may occur at the planning, contracting, or implementation stage.

Organizational Roles & Relationships
PIC-LEO Agreement; Coordination with: Job Service, Economic Development, Unemployment Insurance, Welfare
Problems at this level could have a negative effect on all performance standards, depending on organizational structure.
Entered employment/positive termination rates: The nature and vitality of coordination/linkage agreements with other agencies can have a significant influence on these standards.
Cost standards: Cost-effective interagency agreements can dramatically affect these standards.
Average wage at placement: Relationships with placement and job development/creation agencies will affect this standard.
SYSTEMS SUMMARIES
PROGRAMS AND SERVICES
1. OUTREACH
AREAS TO BE REVIEWED
Organization and staffing
Subrecipients' responsibilities
Linkages with community agencies serving targeted groups
Specific outreach goals by target group and geographic area
Outreach process - use of media, written material, personal contact
METHOD OF REVIEW
Review outreach goals by target group
Interview staff
Review sources of referrals
Track characteristics of applicants and enrollees
2. ELIGIBILITY DETERMINATION/SELECTION
AREAS TO BE REVIEWED
Intake/eligibility process
Staff/subrecipient responsibilities
Length of time from application to enrollment
Characteristics of eligibles compared to those enrolled
Existence of specific goals
METHOD OF REVIEW
Review the eligibility system as described in JTPA
Examine application/eligibility determination process
Analyze characteristics of eligibles not enrolled
3. ASSESSMENT
AREAS TO BE REVIEWED
Administration of assessment
Staffing and organization
Quality of assessment
Relevance of assessment data to entrance criteria
METHOD OF REVIEW
Review assessment procedures and tools
Sample assessment files
Sample employability development plans
Review skills and characteristics of successful completers
4. COUNSELING
AREAS TO BE REVIEWED
Organization and staffing
Procedures for counseling throughout the JTPA system
Use of the EDP as an ongoing tool
METHOD OF REVIEW
Interview clients
Review client folders for progress reports
Determine caseload trends
5. NEEDS BASED PAYMENT SYSTEM/SUPPORTIVE SERVICES
AREAS TO BE REVIEWED
Methodology for needs-based payment determination
Consistency of such payments for different target groups and activities
Effect of payments and services on dropout rates and enrollment trends
METHOD OF REVIEW
Review written procedures
Track dropout rates and underenrollment trends
6. JOB DEVELOPMENT/PLACEMENT
AREAS TO BE REVIEWED
Organization and staffing
Subrecipient responsibility
Performance-based contract usage
Coordination of job development with job service
Marketing plan and implementation
METHOD OF REVIEW
Interview job development and placement staff
Review job orders and placement documentation
Track specific vocational training activities with poor wage rates
7. SKILL TRAINING/OTHER CLASSROOM TRAINING
AREAS TO BE REVIEWED
Organization and structure of the course
Instructional materials
Staffing
Instructor's flexibility and responsiveness
Frequency and accuracy of trainee assessment
METHOD OF REVIEW
Review course curriculum
Conduct site visits
Interview instructor and trainees
8. SPECIAL TARGET PROGRAMS
AREAS TO BE REVIEWED
Ability of targeted programs to increase participation of the "hard to serve"
Services and support provided to "hard to serve" populations
METHOD OF REVIEW
Perform a desktop review of each targeted program
Review enrollments, positive terminations, and placement data
Compare enrollee profile to SDA goals for target groups
9. WORK EXPERIENCE (YOUTH EXEMPLARY PROGRAMS)
AREAS TO BE REVIEWED
Organization and staffing
Characteristics of clients served in this category
Quality of training experience
METHOD OF REVIEW
Review work experience agreements
Conduct site visits
Interview SDA staff responsible for monitoring worksites
Interview participants
Review timesheets
Review progress reports
10. OJT
AREAS TO BE REVIEWED
Organization and staffing
Written procedures describing specific responsibilities
Linkages with other job development activity at subrecipient level
Procedures for marketing
OJT contract development
METHOD OF REVIEW
Review OJT contracts
Conduct site visits
Review completion rates
ADMINISTRATION AND MANAGEMENT SYSTEMS
11. PLANNING
AREAS TO BE REVIEWED
Organization and staffing
Plan preparation
PIC involvement
Planning process
Youth competency system
Procedures for tracking performance of subcontractors
Procedures for collecting and distributing ongoing performance information

METHOD OF REVIEW
Review written procedures
Interview staff
Review program evaluations
Review PIC input through interviews or review of minutes
Examine youth competency system procedures and monitor implementation
12. ORGANIZATION AND STAFFING
AREAS TO BE REVIEWED
Number and functions of staff
Lines of authority
Formal and informal methods of exchanging information
Staff training and development efforts
Ratios of service staff to client population
METHOD OF REVIEW
Review organizational charts and job descriptions
Interview staff to validate lines of authority and information exchange procedures
Examine the reasonableness of agency departments or units
Analyze ratios of staff to clients for program and service systems
13. FISCAL
AREAS TO BE REVIEWED
Staffing and procedures
Budgeting procedures and cost categories
Accrual system, report preparation and distribution
Subrecipient reporting system
Timeliness of reports
METHOD OF REVIEW
Review back-up of total JTPA budget and subrecipient budgets
Interview staff; review and validate monthly reports, accruals, and report distribution
14. MIS
AREAS TO BE REVIEWED
Organization and staffing
Data collection
Expertise of MIS staff
Distribution of reports and validation process
METHOD OF REVIEW
Review written procedures
Validate written procedures and definitions through interviews with staff
Track sample reports to source documents
15. CONTRACTING
AREAS TO BE REVIEWED
Organization and staffing
Soliciting proposals, evaluating proposals
Negotiating contracts, modifying contracts
Evaluating contracts
Staff expertise in federal, state, and local procurement regulations
Clarity of request for proposals
Specificity of contract goals and objectives
METHOD OF REVIEW
Review written procedures
Interview staff
Track RFP process through contract execution and modification
Review use of performance contracts
16. MONITORING
AREAS TO BE REVIEWED
Organization and staffing
Monitoring procedures
Frequency and depth of monitoring and evaluation
Process for report distribution and corrective action
Utilization of evaluation results in planning and contract modification

METHOD OF REVIEW
Review written procedures
Interview staff and determine the level and frequency of monitoring
Track reports through corrective action process
Determine the extent to which evaluations are used in the planning process
ORGANIZATION AND COORDINATION
17. PIC - LEO AGREEMENT
AREAS TO BE REVIEWED
Clarity of roles and responsibilities
Procedures for developing job training plan
Level of involvement in policy guidance and oversight
METHOD OF REVIEW
Review written agreement
Interview PIC members
Review PIC meeting minutes
18. JOB SERVICE
AREAS TO BE REVIEWED
SDA/Job Service coordination agreement
Staffing and organization
Information exchange
Client flow
METHOD OF REVIEW
Review written agreement
Interview Job Service and SDA staff
Track number and results of referrals
19. ECONOMIC DEVELOPMENT
AREAS TO BE REVIEWED
Cooperative SDA/Economic Development agreements
Procedures used to share local economic development and labor market information
Information exchange:
  Listing of companies planning to relocate to the SDA
  Listing of companies anticipating expansion within the SDA
  Listing of planned or actual plant closings or reductions in force within the SDA
METHOD OF REVIEW
Review written agreements
Interview staff of economic development agencies and SDA
Track results of any cooperative projects
20. UNEMPLOYMENT INSURANCE DIVISION
AREAS TO BE REVIEWED
Coordination agreement with the local Unemployment Insurance Division
Procedures for timely information exchange and referral of UI claimants

METHOD OF REVIEW
Review written procedures
Interview appropriate UI and SDA staff
Analyze number and results of UI referrals
21. WELFARE
AREAS TO BE REVIEWED
Welfare recipient participation rates
Information exchange and referrals
METHOD OF REVIEW
Review written procedures
Interview staff of public assistance agencies and SDA
Track numbers and results of referrals
Planning Response

In addition to correcting deficiencies identified through the "Oversight Response," most SDAs engage in what is termed the "Planning (or Replanning) Response." That is, they rethink key policy decisions, after examining the results, and initiate new policies in such critical planning areas as:
Who will be served
Job training funds are limited. Only a portion of the eligible population can be served in most communities. Beyond the Federal requirements to serve particular target groups, SDAs decide which groups of individuals will receive priority.
What services and programs will be provided
Taking into account the characteristics of the people who will be served, SDAs determine the mix of training, remedial programs, and services that will be offered and how financial resources will be allocated.
What will be acceptable performance standards
In addition to the minimum standards established by DOL and the governor, SDAs may develop additional performance guidelines or request waivers of state standards when appropriate.
The performance standards system, while establishing accountability for such policy and planning decisions, does not usurp this important local policy making role. In fact, one of the strengths of the performance standards system is that it does not reward or penalize SDAs for these types of policy choices, but simply holds them accountable for achieving performance standards which reflect those choices. SDAs which allow performance standards to "drive" the decisions in these key areas rather than consciously establishing policies will quickly get lost in a maze of variables and formulas at the expense of quality programs.
Local policy makers should, however, have a clear understanding of the mechanics of the performance standards system so that they can be aware of the performance-related implications of their decisions. The following list shows that 20 of the 23 adjustment variables which affect the computation of an SDA's expected performance are directly influenced by the SDA's policy making and planning decisions in the areas noted above.
Adjustment Variables

Influenced By SDA

Who Is Served
1. Female
2. Age 14-15
3. Age 16-17
4. Age 30+
5. Black
6. Hispanic
7. Asian/Pacific Islander
8. Dropout
9. Student
10. Post High School Attendee
11. AFDC Recipient
12. GA/RCA Recipient
13. Unemployment Compensation Claimant
14. Handicapped
15. Offender
16. Person not in labor force
17. Unemployed 15+ weeks

What Type of Services and Programs Will Be Provided
18. Percent Terminees

Not Influenced By SDA
19. Average Annual Wage in Wholesale/Retail Trade
20. Unemployment Rate
21. Population Density
22. Number of Families below Poverty Level
23. Employee/Resident Worker Ratio
Moreover, as can be seen on the "Effect of Changing Variables on Performance Standards" chart, SDAs can determine precisely what impact each of the variables will have on their expected performance. The variables across from each of the performance standards are listed in order of their significance. For example, the variable which most significantly decreases the adult entered employment rate is the unemployment rate. The next most influential variable for decreasing this standard is GA/RCA Recipients.
EFFECT OF CHANGING VARIABLES ON PERFORMANCE STANDARDS - 1988 Model
(Listed in order of significance of impact)

Entered Employment Rate - Adult
(The hierarchy of variables for Welfare Entered Employment Rate is the same as it is for Entered Employment Rate.)
Raises standard if increased: 1. Population Density
Lowers standard if increased: 1. Unemployment Rate; 2. GA/RCA Recipient; 3. Dropout; 4. Age 30 and Above; 5. AFDC Recipient; 6. Not in Labor Force; 7. Female; 8. Emp/Res Worker Ratio; 9. Black; 10. Unemployed 15 weeks or +

Average Wage at Placement - Adult
Raises standard if increased: 1. Average Annual Earnings in Retail/Wholesale Trade; 2. Population Density; 3. UC Claimant; 4. Age 30 and Above
Lowers standard if increased: 1. Fam. Below Poverty Level; 2. Dropout; 3. Unemployment Rate; 4. Female; 5. Black; 6. Empl/Res Worker Ratio; 7. Unemployed 15 weeks or +

Entered Employment Rate - Youth
Lowers standard if increased: 1. Unemployment Rate; 2. 14-15 Years Old; 3. GA/RCA Recipient; 4. Student; 5. Dropout; 6. Handicapped; 7. Black

Positive Termination Rate - Youth
Raises standard if increased: 1. Population Density; 2. Post-H.S. Attendee; 3. Student; 4. Empl/Res Worker Ratio; 5. Black
Lowers standard if increased: 1. Asian/Pacific Islander; 2. GA/RCA Recipient; 3. Hispanic

Employability Enhancement Rate - Youth
Raises standard if increased: 1. 14-15 Years Old; 2. Student; 3. Dropout; 4. Not in Labor Force; 5. 16-17 Years Old; 6. Female
Lowers standard if increased: 1. Handicapped; 2. Black; 3. GA/RCA Recipient; 4. AFDC Recipient

Follow-Up Employment Rate - Adult
Raises standard if increased: 1. UC Claimant
Lowers standard if increased: 1. Unemployment Rate; 2. GA/RCA Recipient; 3. Offender; 4. Black; 5. Dropout; 6. Age 30 and Above; 7. AFDC Recipient; 8. Not in Labor Force; 9. Unemployed 15 weeks or +

Follow-Up Weeks Worked - Adult
Lowers standard if increased: 1. Unemployment Rate; 2. GA/RCA Recipient; 3. Unemployed 15 weeks or +; 4. Black; 5. AFDC Recipient; 6. Dropout

Follow-Up Weekly Earnings - Adult
Raises standard if increased: 1. Avg. Annual Earnings in Retail/Wholesale Trade; 2. Population Density; 3. UC Claimant; 4. Age 30 and Above
Lowers standard if increased: 1. Unemployment Rate; 2. Fam. Below Poverty Level; 3. Female; 4. Offender; 5. Dropout; 6. Empl/Res Worker Ratio; 7. Unemployed 15 weeks or +; 8. Black

Raising/lowering the cost standards below has different implications than raising/lowering the eight standards listed above. Exceeding your standard for the two cost standards means actual performance lower than the standard.

Cost Per Entered Employment - Adult
Raises standard if increased: 1. Population Density; 2. Avg. Annual Earnings in Retail/Wholesale Trade; 3. Unemployment Rate; 4. UC Claimant; 5. AFDC Recipient; 6. GA/RCA Recipient; 7. Not in Labor Force; 8. Hispanic
Lowers standard if increased: 1. % Terminees

Cost Per Positive Termination - Youth
Raises standard if increased: 1. Avg. Annual Earnings in Retail/Wholesale Trade; 2. AFDC Recipient; 3. Hispanic; 4. Handicapped; 5. Not in Labor Force
Lowers standard if increased: 1. % Terminees; 2. Student; 3. 14-15 Years Old
It is important to note that the impact of specific variations on the overall standard is not, in most cases, substantial. The cumulative effect, however, can significantly change the local SDA standards.
Familiarity with the dynamics of the performance standards system will enable SDA staff to examine how local factors are shaping current expected performance and to determine what effect program modifications would have on future expected performance levels. By following the "examining impact" instructions on the next page and using Worksheets 11 and 13, an SDA can identify the extent to which each factor is increasing or decreasing their expected performance levels. Users should also compare local values with extreme factor values detailed in Section G of the Department of Labor Guide for Setting JTPA Title II-A Performance Standards for PY 88. The value of this exercise is in understanding how local factors have shaped current expected performance levels, not in determining the extent to which the SDA has deviated from the national averages.
Once an SDA has completed the type of analysis recommended in this guide, it may consider changes in program design or emphasis for the balance of the program year. The Baseline Program Review conducted in Section II provides valuable information for this replanning effort. Obviously, any new program emphasis should be based on what has worked best, for what groups, and at what cost. By following the "dry run" instructions on the next page and using Worksheets 13, 14, 15, and 16, SDAs will be able to determine precisely what effect these various policy and planning options would have on their expected performance levels.
INSTRUCTIONS FOR WORKSHEETS 13, 14, 15 AND 16
Option A - Recalculation Based on Actual Terminee Characteristics
This option enables SDAs to recalculate their expected performance standards based on the actual terminee characteristics.
Complete only Columns B, D, E, and F of the top section and Columns A, B, and C of the bottom section.
Option B - Examining Impact of Local Factors on Expected Performance Levels
This option enables SDAs to determine the extent to which local factors have increased or decreased expected performance levels. (Examine Columns D, E, and F to observe the + or - effect of local factors on current expected performance levels.)
Complete only Columns B, D, E, and F of the top section.
Option C - Dry Run of Various Policy and Planning Options
This option enables the SDA to determine precisely what effect various policy and planning options would have on the expected performance.
Complete only Columns B, D, E, and F of the top section and Column B in the bottom section.
Description of top section columns

Column A. National average percent of terminees
Column B. Actual percent of terminees at the time of review
Column C. Percent of national average
Column D. Calculates the effect on each standard indicated within the column: (B1) actual terminee percent minus (A1) national standard, times the (.20) weight = effect
Column E. Same as above
Column F. Same as above
Description of bottom section columns
Column A. Actual local measures
Column B. Actual standards if calculated on actual current terminees (The SDA must add the effects from each column above and adjust the national standard to display the local model adjusted standard)
Column C. Percent of standard
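The column arithmetic described above (actual terminee percent minus national percent, times the model weight) can be sketched as follows. The weights and national percents come from the entered employment column of Worksheet 13, and the 68.0 departure point from its summary section; the actual terminee percents are hypothetical:

```python
# Worksheet 13, Column D (entered employment rate):
# effect = (actual % of terminees - national % of terminees) * weight
variables = [
    # (name, national %, actual % [assumed], Column D weight)
    ("Female",  55.1, 60.0, -0.073),
    ("Dropout", 24.8, 30.0, -0.177),
    ("AFDC",    23.8, 28.0, -0.159),
]

net_effect = sum((actual - national) * w
                 for _, national, actual, w in variables)

national_standard = 68.0  # summary-section departure point for this standard
adjusted_standard = national_standard + net_effect

print(f"Net effect: {net_effect:+.2f} points")
print(f"Locally adjusted standard: {adjusted_standard:.1f}%")
```

Serving more of a hard-to-serve group than the national average (here, dropouts and AFDC recipients) carries a negative weight, so the expected entered employment standard adjusts downward; a full calculation would of course sum the effects across all sixteen variables.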
Worksheet 13
MODEL ADJUSTEr WORKSHEET FOR ANALYZING CLIENT CHARACTERISTIC PERFORMANCE- ADULT - PY 88
Variables              A. Nat'l   B. Actual   C. [B/A]   D. Ent. Emp. Rate    E. Cost/Ent. Emp.   F. Average Wage
Adult
1. Female                55.1                            [B1-A1(-.073)]       --                  [B1-A1(-.0065)]
2. Age 30+               52.8                            [B2-A2(-.166)]       --                  [B2-A2(.0033)]
3. Black                 23.2                            [B3-A3(-.055)]       --                  [B3-A3(-.0051)]
4. Hispanic               8.3                            --                   [B4-A4(6.2)]        --
5. Dropout               24.8                            [B5-A5(-.177)]       --                  [B5-A5(-.0119)]
6. UC Claimant           10.3                            --                   [B6-A6(34.5)]       [B6-A6(.0290)]
7. Unempl. 15+           48.7                            [B7-A7(-.015)]       --                  [B7-A7(-.0015)]
8. Not in Labor F        11.9                            [B8-A8(-.082)]       [B8-A8(8.0)]        --
9. AFDC                  23.8                            [B9-A9(-.159)]       [B9-A9(24.9)]       --
10. GA/RCA                5.2                            [B10-A10(-.312)]     [B10-A10(15.2)]     --
11. % Terminees          73.2                            --                   [B11-A11(-22.0)]    --
12. Unemp. Rate           7.4                            [B12-A12(-.608)]     [B12-A12(63.8)]     [B12-A12(-.0089)]
13. Avg. Earn. W/R       12.5                            --                   [B13-A13(79.0)]     [B13-A13(.0936)]
14. Pop. Density          0.7                            [B14-A14(.633)]      [B14-A14(79.5)]     [B14-A14(.0880)]
15. Fam. Bel. Pov.        9.7                            --                   --                  [B15-A15(-.0307)]
16. Emp. Res. Ratio      99.9                            [B16-A16(-.064)]     --                  [B16-A16(-.0040)]
NET EFFECT

Summary Section
Performance Standard             Calculated Standard                Actual Performance   % of Standard
Entered Employment Rate          "D" Net Effect = + 68.0 =
Cost per Entered Employment      "E" Net Effect = + $4500 =
Welfare Entered Employment       "D" Net Effect = + 68.0 (x)* =
Wage at Placement                "F" Net Effect = + $4.95 =

*(x) is Public Assistance Employment Ratio (Method II)
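The summary section's arithmetic can be illustrated end to end: sum a column's effects into a net effect, add the net effect to the national standard to get the model-adjusted local standard, then divide actual performance by that adjusted standard. In the sketch below, only the 68.0 national entered employment rate standard comes from the worksheet; the individual effects and the actual performance figure are made-up values.

```python
# Hypothetical walk-through of the Worksheet 13 summary arithmetic.
# column_d_effects: row-by-row effects from Column D (illustrative values).
column_d_effects = [-0.7, 1.2, -0.4, 0.9]

net_effect = sum(column_d_effects)            # "D" Net Effect
adjusted_standard = 68.0 + net_effect         # national standard + net effect
actual_performance = 71.5                     # SDA's actual entered employment rate
pct_of_standard = actual_performance / adjusted_standard * 100

print(round(adjusted_standard, 1))  # 69.0
print(round(pct_of_standard, 1))    # 103.6
```

A percent of standard above 100 indicates the SDA is exceeding its model-adjusted standard.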
Worksheet 14
MODEL ADJUSTED WORKSHEET FOR ANALYZING CLIENT CHARACTERISTIC PERFORMANCE - YOUTH - PY 88
Variables                     A. Nat'l   B. Actual   C. [B/A]   D. Ent. Emp. Rate    E. Pos. Term. Rate   F. Emp. Enh. Rate   G. Cost/Pos. Term.
Youth
1. Female                       50.6                            --                   --                   [B1-A1(+.064)]      --
2. Age 14-15                     5.7                            [B2-A2(-.589)]       --                   [B2-A2(+.332)]      [B2-A2(-9.5)]
3. Age 16-17                    32.4                            --                   --                   [B3-A3(+.070)]      --
4. Black                        27.9                            [B4-A4(-.058)]       [B4-A4(-.045)]       [B4-A4(-.071)]      --
5. Hispanic                      9.5                            --                   [B5-A5(-.060)]       --                  [B5-A5(+9.0)]
6. Asian/Pacific Islander        2.0                            --                   [B6-A6(-.122)]       --                  --
7. Dropout                      25.2                            [B7-A7(-.273)]       --                   [B7-A7(+.139)]      --
8. Student                      39.6                            [B8-A8(-.30)]        [B8-A8(+.027)]       [B8-A8(+.253)]      [B8-A8(-11.7)]
9. Post High School              6.7                            --                   [B9-A9(+.202)]       --                  --
10. Handicapped                 15.9                            [B10-A10(-.068)]     --                   [B10-A10(-.103)]    [B10-A10(+8.4)]
11. Not in Labor Force          38.7                            --                   --                   [B11-A11(+.091)]    [B11-A11(+4.2)]
12. AFDC Recipient              20.4                            --                   --                   [B12-A12(-.035)]    [B12-A12(+15.1)]
13. GA/RCA Recipient             3.3                            [B13-A13(-.370)]     [B13-A13(-.116)]     [B13-A13(-.067)]    --
14. % Terminees                 75.9                            --                   --                   --                  [B14-A14(-18.5)]
15. Unemployment Rate            7.4                            [B15-A15(-1.333)]    --                   --                  --
16. Population Density            .7                            --                   [B16-A16(+1.353)]    --                  --
17. Emp. Res. Ratio             99.9                            --                   [B17-A17(-.051)]     --                  --
18. Avg. Earnings - W/R         12.5                            --                   --                   --                  [B18-A18(+64.3)]
NET EFFECT

Summary Section
Performance Standard              Calculated Standard                Actual Performance   % of Standard
Entered Employment Rate           "D" Net Effect = + 45.0 =
Positive Termination Rate         "E" Net Effect = + 75.0 =
Employability Enhancement Rate    "F" Net Effect = + 30.0 =
Cost per Positive Termination     "G" Net Effect = + $4900 =
Worksheet 15
MODEL ADJUSTED WORKSHEET FOR ANALYZING CLIENT CHARACTERISTIC PERFORMANCE - ADULT POST PROGRAM - PY 88
Variables              A. Nat'l   B. Actual   C. [B/A]   D. F/U Emp. Rate     E. F/U Weeks Work.   F. F/U Wkly Earn
Adult
1. Female                55.1                            --                   --                   [B1-A1(-.511)]
2. Age 30+               52.8                            [B2-A2(-.112)]       --                   [B2-A2(.488)]
3. Black                 23.2                            [B3-A3(-.130)]       [B3-A3(-.015)]       [B3-A3(-.081)]
4. Dropout               24.8                            [B4-A4(-.122)]       [B4-A4(-.013)]       [B4-A4(-.360)]
5. UC Claimant           10.3                            [B5-A5(.123)]        --                   [B5-A5(1.516)]
6. Offender               8.4                            [B6-A6(-.131)]       --                   [B6-A6(-.361)]
7. Unempl. 15+           48.7                            [B7-A7(-.038)]       [B7-A7(-.016)]       [B7-A7(-.097)]
8. Not in Labor F        11.9                            [B8-A8(-.049)]       --                   --
9. AFDC                  23.8                            [B9-A9(-.109)]       [B9-A9(-.014)]       --
10. GA/RCA                5.2                            [B10-A10(-.165)]     [B10-A10(-.023)]     --
11. Unemp. Rate           7.4                            [B11-A11(-.860)]     [B11-A11(-.060)]     [B11-A11(-1.167)]
12. Avg. Earn. W/R       12.5                            --                   --                   [B12-A12(2.867)]
13. Pop. Density          0.7                            --                   --                   [B13-A13(2.248)]
14. Fam. Bel. Pov.        9.7                            --                   --                   [B14-A14(-.762)]
15. Emp. Res. Ratio      99.9                            --                   --                   [B15-A15(-.222)]
NET EFFECT
Summary Section
NET EFFECT NET EFFECT
Performance Standard Calculated Standard Actual Performance % of Standard
Follow-Up Employment Rate "D" Net Effect = + 60.0 =
Follow-Up Weeks Worked "E" Net Effect = + 8.0 =
Follow-Up Weekly Earnings "F" Net Effect = + $177 =
F/U Welfare Employment Rate "D" Net Effect = + 60.0 (x)* =
Worksheet 16
MODEL ADJUSTED WORKSHEET FOR ANALYZING CLIENT CHARACTERISTIC PERFORMANCE - ADULT WELFARE - PY 88
Variables                     A. Nat'l   B. Actual   C. [B/A]   D. Ent. Emp. Rate    E. F/U Emp. Rate
All Adult Welfare
1. Black                        32.6                            [B1-A1(-.046)]       [B1-A1(-.126)]
2. Dropout                      29.9                            [B2-A2(-.155)]       [B2-A2(-.211)]
3. Unemployed 15+ weeks         54.8                            [B3-A3(-.068)]       [B3-A3(-.026)]
4. Not in Labor Force           19.0                            [B4-A4(-.151)]       [B4-A4(-.085)]
5. GA/RCA                       13.9                            [B5-A5(-.139)]       [B5-A5(-.089)]
6. Unemployment Rate             7.4                            [B6-A6(-.721)]       [B6-A6(-.928)]
NET EFFECT
Summary Section
NET EFFECT
Performance Standard Calculated Standard Actual Performance % of Standard
Entered Employment Rate "D" Net Effect = + 56.0 =
Follow-Up Employment Rate "E" Net Effect = + 50.0 =
"S
Section IV: Conclusion
The Job Training Partnership Act emphasizes the importance of program performance and outcomes. The law establishes specific criteria by which an SDA will be measured and provides for the award of incentives for local programs that perform exceptionally well. The purpose is to ensure that SDAs concentrate on results rather than on regulatory and compliance issues.
Performance standards are not "the" solution to effective JTPA planning and management but rather one of several tools available to administrators to improve performance. The challenge to local policy makers is to use performance standards to guide the development of programs which will meet the needs of the eligible population and business community.
In order to work effectively with performance standards, an SDA must have a clear vision of its role in the community. The key policy issues related to who is served and with what type of programs should be important building blocks in establishing the foundation for planning and management decisions. Since the performance standards model adjusts an SDA's standards primarily according to who is served and the program design, the degree of difficulty in meeting the standards will be the same, regardless of who is served and what is done for them. For local policy makers who fail to establish this foundation, performance standards can easily become a "numbers game," unrelated to the mission and purpose of JTPA in the community.
As JTPA matures, the performance standards system will continue to change. New measures will be added, standards will be modified, the adjustment methodology will be revised, and so on. However, the need to understand the dynamics of the performance standards system and its relationship to planning and management will remain constant. SDAs need to stay abreast of such changes and update and refine their planning and management practices as changes occur.
National Alliance of Business Regional Service Offices
New England
Connecticut, Maine, Massachusetts, New Hampshire, Rhode Island, Vermont
Joseph Fischer, Regional Vice President
20 Walnut Street
Wellesley Hills, Massachusetts 02181
(617) 235-1332
Atlantic
Delaware, District of Columbia, Maryland, New Jersey, New York, Pennsylvania, Puerto Rico, Virgin Islands, Virginia, West Virginia
Fredrick Radlmann
Regional Vice President
317 George Street
New Brunswick, New Jersey 08901
(201) 524-4110
Southeast
Alabama, Florida, Georgia, Kentucky, Mississippi, North Carolina, South Carolina, Tennessee
William 'Sonny' Walker
Regional Vice President
100 Edgewood Avenue, Suite 1800
Atlanta, Georgia 30303
(404) 522-9350
Midwest
Illinois, Indiana, Iowa, Michigan, Minnesota, North Dakota, Ohio, South Dakota, Wisconsin
Vincent Serritella
Regional Vice President
Eleven East Adams, Suite 1008
Chicago, Illinois 60603
(312) 341-9766
Central
Arkansas, Kansas, Louisiana, Missouri, Nebraska, New Mexico, Oklahoma, Texas
Henry McHenry
Regional Vice President
9400 North Central Expressway
Dallas, Texas 75231
(214) 373-0854
West
Arizona, California, Colorado, Guam, Hawaii, Nevada, Pacific Islands, Utah, Wyoming
Janet Keyes
Regional Vice President
350 Sansome St., Suite 1040
San Francisco, California 94104
(415) 391-4061
Pacific Northwest
Alaska, Idaho, Montana, Oregon, Washington
Rueben Flores, Regional Vice President
427 Skinner Building
Seattle, Washington 98101
(206) 622-2531