Workforce Innovation Technical Assistance Center
FORMATIVE AND SUMMATIVE EVALUATION: YEARS ONE, TWO, AND THREE
NOVEMBER 12, 2018
(Early Draft – Addendum to Come End of Year With All Data)
I. Table of Contents
I. Table of Contents
II. Executive Summary
III. Introduction
IV. Evaluation Methods and Sources
   A. Theoretical Frameworks Guiding WINTAC Structure and Services
   B. Theoretical Frameworks Guiding WINTAC Evaluation
      i. Logic Model and Theories of Change
VI. Formative Evaluation Focus Areas and Findings
   A. Implementing Conceptual, Theoretical, and Evidence-Based Frameworks
   B. Achieving Collective Impact
   C. Meeting Needs
   D. Summary
VIII. Summative Evaluation Focus Areas and Findings
   A. Universal Technical Assistance
      i. Website Resources
      ii. WINTAC Contacts Through Website
      iii. WINTAC Website Traffic
      iv. WINTAC Webpage Evaluations
      v. Summary
   B. Targeted Technical Assistance
      i. Overview
      ii. One-on-one targeted and specialized TA
      iii. Immediate Post-Training Evaluations
      iv. Six-Month Training Follow-up Evaluations
      v. Communities of Practice
      vi. Summary
   C. Intensive Technical Assistance
      i. Pre-ETS progress
      ii. Section 511 progress
      iii. Competitive Integrated Employment
      iv. Workforce Integration
      v. Common Performance Measures
      vi. Summary
IX. Workforce Innovation Pilot Projects (WIPPs) and Special Projects
   A. The Career Index Plus
   B. SARA
   C. Peer Mentoring
   D. Integration Continuum
   E. Customized Employment
   F. Other Special Project Possibilities
   G. Summary
X. Conclusions and Recommendations
XI. References
II. Executive Summary
The Workforce Innovation Technical Assistance Center (WINTAC) provides training and
technical assistance (TA) to State Vocational Rehabilitation Agencies (SVRAs), related agencies, rehabilitation professionals, and service providers to help them develop the skills and processes
needed to meet the requirements of the Workforce Innovation and Opportunity Act (WIOA). The
WINTAC offers three types of TA – (1) intensive, sustained TA, (2) targeted, specialized TA, and (3) universal, general TA – in five key areas: (1) Pre-Employment Transition Services, (2) Section 511, (3) Competitive, Integrated Employment, (4) Integration of VR into the Workforce Development System, and (5) Common Performance Measures, along with Workforce Innovation Pilot Programs and Special Projects.
This report documents the findings of the program evaluation for the first three years of
WINTAC implementation, with emphases on formative issues, outputs, and short-term outcomes
given the early stages of progress to be expected at this juncture.1 This report is a cumulative report
that includes analyses and findings from each year to ensure a complete and comprehensively
packaged summary of the first three years of WINTAC operation.
In Year 1, an aggregation across the conceptual frameworks guiding both WINTAC's programming and its evaluation yielded an integrated set of principles by which WINTAC's formative structure and progress could be evaluated. Using those principles (understanding SVRA context, ensuring stakeholder engagement, and assessing needs), evaluations to date find WINTAC to be making strong progress. These principles were met at the time of the proposal and remain important as WINTAC implements its activities in all areas. At the end of Year Three, for example,
1 This draft of the Year Three Report is being provided in early November of the fourth year, prior to the end of the first quarter after the prior year’s end. An addendum report will be provided at the beginning of the second quarter of Year Four which will more fully capture the results of the first three years’ performance.
these principles were used to plan (and begin to implement) a strategic approach to the various
Communities of Practice in place and starting anew in Year Four; thus, the framework remains
relevant and useful as WINTAC continues to build and implement its array of services. The
partnerships that drive and support stakeholder engagement and inform WINTAC of current needs
remain vibrant.
A comparison of stakeholders' articulated needs (from the collaborative needs assessment conducted in Year 1 by PEQA-TAC on behalf of the TACs) with the nature and amount of WINTAC services across topic areas over the three years clearly demonstrates that WINTAC is meeting needs and providing services beyond even those anticipated by the SVRAs themselves – notably, the need for Pre-ETS.
Finally, ongoing examinations of the collective impact achieved by WINTAC demonstrate (1) that the architecture and relationships in place enable the partners that form the WINTAC to succeed in advancing WIOA implementation nationally and in increasing SVRAs' use of best practices to improve employment outcomes for their clients, exceeding what any individual entity could have accomplished operating independently, and (2) that the behavioral and systems changes needed as a lever for client outcomes are occurring.
As the gateway to Universal TA, the WINTAC website must be effective in providing information that stakeholders need and ultimately use. Website traffic statistics showed an increase each quarter in Year 2 in unique visitors, page views, and visits per session, and a steady pace for returning visitors, new visitors, pages viewed per session, and duration per session. In Year 3 this pattern was largely the same, although new visitors, visits per session, and pages per session were all markedly up from Year 2, demonstrating an ongoing pattern of increasing utility of the WINTAC website. Cumulatively, since going online in October of 2015,
the WINTAC website has had 97,352 visitors, with 52,210 returning (53%) – a strong return rate demonstrating that stakeholders indeed view WINTAC as an important and trustworthy resource for information, updates, and assistance. Cumulatively, page views totaled 488,704. In all years, the topic in highest demand was Pre-ETS.
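As a rough cross-check of the cumulative figures above, the derived rates can be recomputed directly (an illustrative sketch; the visitor and page-view counts are those cited in this section):

```python
# Cross-check of the cumulative WINTAC website statistics cited above.
# Raw counts come from this section; the derived rates are computed here.

total_visitors = 97_352
returning_visitors = 52_210
total_page_views = 488_704

# Share of visitors who returned (the report cites 53%)
return_rate = returning_visitors / total_visitors * 100

# Average page views per visitor across the life of the site
pages_per_visitor = total_page_views / total_visitors

print(f"Return rate: {return_rate:.1f}%")                  # 53.6%
print(f"Page views per visitor: {pages_per_visitor:.1f}")  # 5.0
```

The small difference between the computed 53.6% and the 53% cited reflects rounding in the report; the underlying counts are consistent.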
Stakeholder evaluations for the website are done in two ways: (1) through a button on the
website that allows visitors to evaluate the specific page they are viewing and (2) through follow-up surveys sent to individuals who completed the webpage evaluations. Both forms of evaluation
demonstrated positive experiences with the majority of individuals predicting they would use the
information for general knowledge development.
In Year One, WINTAC engaged in targeted TA with 30 SVRAs. In Year Two, WINTAC
provided over 1,000 instances of targeted TA with 50 SVRAs through webinars, live trainings, calls, and emails. Over Year Three, WINTAC engaged in one-on-one targeted TA with 46 SVRAs and held several joint TA sessions with multiple SVRAs.
Targeted TA also includes trainings, which are evaluated immediately afterwards and again approximately six months later. Participants routinely express high levels of satisfaction with the trainings, both immediately following them and in the follow-up evaluation surveys. Participants have also started putting the knowledge gained through the targeted TA trainings into action at their agencies. In Year Two, suggestions for future improvement included lengthening the training sessions to allow more time to cover the materials, providing opportunities for ongoing and continuous learning, and offering more practical knowledge through other agencies' practices or hands-on learning. In Year Three, evaluations demonstrated that these
changes had been made and participants rated their experiences highly. Follow-up surveys
demonstrated that over half of participants were already using what they had learned and putting it
into practice. Community of Practice distribution lists collectively numbered 2,745 in Year Two and 3,393 in Year Three. Robust plans for evaluation have been outlined and will be collaboratively
finalized for implementation in Year 3. The Job-Driven Strategies for Serving Individuals Who Are
Blind/Visually Impaired Summit is an example of how CoPs, Summits, and Forums can be designed
with input from the beginning (needs assessment, goal-setting), evaluated over time, and made to yield tangible outputs and outcomes.
WINTAC is implementing 32 intensive TA agreements signed with 34 SVRAs. Seventy percent of the agreed-upon activities have been initiated, and more than a third (38%) are complete. In Year
Three (and the very early part of Year Four), a significant undertaking to refine and specify the
language of activities, outputs, and short-term outcomes was conducted and new topic area logic
models were developed for most topic areas. The Section 511 (subminimum wage) area did not require this, as it has fully accomplished its key objectives. Mechanisms for providing progress updates were also refined, so that instead of being reported as one "lump" category, incremental but important steps in progress can be identified.
WIPPs and Special Projects offer WINTAC an opportunity to engage in focused activity in innovative areas, using innovative techniques and technology, and to elevate the evaluations of impact associated with WINTAC's primary topic areas. Year Two saw the development of robust evaluation plans for several of these areas, notably SARA and the Integration Continuum Tools, and Year Three saw these evolve further and begin implementation. Efforts and discussions in Years
One and Two also reshaped some activities and features of tools being provided, a reflection of
WINTAC’s commitment to continuous quality improvement. Year Three saw Customized
Employment in particular ramp up significantly in terms of the number of pilot projects and the
progress in their implementation. Evaluation plans have been developed and implemented and allow
for comparison across common structures, while being responsive to the needs of SVRAs. Several
tools to facilitate effective CE implementation by agencies have been developed, and the Evaluation
Team has constructed data collection and evaluation tools as well. Findings will allow for evaluation
of effective training approaches, examination of integrity in implementation, comparison of
implementation approaches in terms of agency structures (e.g., fee schedules, turnover and training
approaches, etc.), and the connection between these factors and client outcomes. Unforeseen impacts
reported demonstrate significant and valuable systems changes that have already taken place.
WINTAC's performance demonstrates its unwavering commitment to providing the highest level of service to any SVRA that needs it. Meeting the minimum requirements is the least of any TA Specialist's concerns. WINTAC staff are driven to ensure WIOA implementation is effective, and
most importantly, impactful. WINTAC’s WIPPs and Special Projects also highlight this level of
commitment. When the tools to implement or measure WIOA implementation are not available or
are insufficient, WINTAC staff develop, validate, improve, and provide them. Resources developed
will not only have value for states currently receiving intensive TA, but also for the field of vocational
rehabilitation, workforce development, and disability employment at large.
III. Introduction
The Workforce Innovation Technical Assistance Center (WINTAC) provides training and
technical assistance (TA) to State Vocational Rehabilitation Agencies (SVRAs), related agencies, rehabilitation professionals, and service providers to help them develop the skills and processes
needed to meet the requirements of the Workforce Innovation and Opportunity Act (WIOA). Led by
the Interwork Institute at San Diego State University, the WINTAC is a collaboration of the National
Disability Institute and its LEAD Center, the George Washington University's Center for
Rehabilitation Counseling Research and Education, the University of Arkansas CURRENTS,
PolicyWorks, the Career Index, the Council of State Administrators of Vocational Rehabilitation,
and the Burton Blatt Institute at Syracuse University. The WINTAC is funded by a five-year grant
from the Rehabilitation Services Administration. The WINTAC focuses on five topic areas:
Pre-ETS: Provision of pre-employment transition services to students with disabilities
and supported employment services to youth with disabilities;
Section 511: Implementation of the requirements in section 511 of the Rehabilitation Act
that are under the purview of the Department of Education;
Competitive Integrated Employment (CIE): Provision of resources and strategies to help
individuals with disabilities achieve competitive integrated employment, including
customized and supported employment;
Integration of the State VR program into the workforce development system; and
Common Performance Measures (CPM): Transition to the new common performance
accountability system under section 116 of WIOA, including the collection and reporting
of common data elements.
The WINTAC engages in three categories of activities: knowledge development activities,
technical assistance and dissemination activities, and coordination activities. Its primary focus is on
providing TA for each topic area. The WINTAC offers three types of TA: (1) intensive, sustained
TA, (2) targeted, specialized TA, and (3) universal, general TA. Partners with a lead role in the five
topic areas are responsible for providing TA to SVRAs and their partners in that topic area, with the support of, and in concert with, the partners that have a supporting role in that area.
This report is an early presentation of the findings of the program evaluation for the first
three years of WINTAC implementation, with a review of key developments in Years 1 and 2, VR
staff members’ evaluations of TA and training services over the years informing continuous quality
improvement, and progress in implementation of intensive TA and special pilot project activities in
Year 3. Summative evaluation of intensive TA outcomes is presented more fully, with progress on outputs and short-term outcomes reported in both quantitative and qualitative terms as WINTAC crosses the
midpoint in the grant cycle.
IV. Evaluation Methods and Sources
The evaluation staff are represented on the WINTAC Leadership Team and participate in all
Leadership Team conference calls and in-person meetings. The evaluation thus includes first-hand
observation of the team’s performance through participation in activities with them. In addition, this
evaluation uses materials and communications produced by the WINTAC in the conduct of its
activities and formal reporting and tracking data. In Year 1, materials reviewed included quarterly reports, Google Analytics data, needs assessment data, the TA tracking system, the WINTAC Google Drive folder, meeting notes, draft and final TA agreements, and follow-up Targeted TA training
evaluations. In Year 2, additional sources of data included webpage evaluations and follow-up
surveys of webpage visitors, follow-up surveys of recipients of targeted TA, and TA Team-provided
responses on the output and outcome tracking spreadsheet. Year 2 also brought additional progress
on the development and implementation of individual plans of evaluation for the topical areas that
reflect a specialized approach tailored to specific subject matter areas and projects (e.g.,
development of the integration continuum assessment tool, SARA evaluation, and other areas). In
Year 3 the aforementioned sources were again consulted.
As described in the Year 1 report, formative evaluation questions were developed based on
the collaborative partnerships forming the WINTAC itself, as well as theoretical frameworks
grounding its approach to services and evaluation. Summative evaluation questions were developed based on the workplan, expected deliverables, and the nature and types of services being
provided by the WINTAC.
A. Theoretical Frameworks Guiding WINTAC Structure and Services
Three key conceptual frameworks or theoretical approaches, as well as additional evidence-based approaches and strategies, inform the structure and services of the WINTAC. These frameworks guide the formative evaluation of the WINTAC; although critically important to laying the foundation for WINTAC's establishment in the early years, an evaluation of WINTAC's incorporation of principles from these frameworks will be conducted in all years. These frameworks
include (1) Bryson’s (2011) model for organizational planning and systems change emphasizing
early and ongoing planning discussions that include stakeholders and clear definitions of inputs,
outputs, and outcomes as defined by the use of logic models; (2) the Quality Enhancement Research
Initiative (QERI) (Stetler, McQueen, Demakis & Mittman, 2008) noting the importance of
identifying cultural norms, capacity, and supportive infrastructures to ensure change efforts fit into
or modify those constraints (Van Achterberg, Schooven & Grol, 2008); (3) Bolman & Deal’s (2013)
Four Frames Model advocating integration across four key pillars (structural, human resource,
political and symbolic frames); and (4) principles from adult learning and implementation science
emphasizing evidence-based approaches to tailoring TA (Bryan et al., 2009; Knowles, 2006; Odom,
Cox & Brock, 2013).
B. Theoretical Frameworks Guiding WINTAC Evaluation
In addition to the conceptual, theoretical, and evidence-based frameworks reviewed above
which guide the approach to WINTAC’s structuring and provision of TA services, the WINTAC
turns for guidance to other models when implementing knowledge translation / dissemination
strategies for universal TA and when evaluating its impact. With an emphasis on performance
feedback and continuous quality improvement, the WINTAC evaluation draws from (1) several
effective evaluation practices including Utilization Focused Evaluation, Stakeholder Based
Evaluation, and Real World Evaluation (Bamberger, Rugh, & Mabry, 2006; Cousins & Earl, 1992;
Patton, 2008; Rossi et al., 2004); (2) the Knowledge-to-Action framework (Graham et al., 2006); (3)
and The Collective Impact Model (Kania & Kramer, 2011). The formative and summative
evaluation techniques examine “how well the WINTAC is working” and “what difference the
WINTAC is making,” key questions identified by all evaluation models.
i. Logic Model and Theories of Change
The WINTAC and its evaluation are guided by a logic model that outlines in broad strokes
the inputs (partnerships and expertise that form the backbone and resources available to WINTAC),
activities (from knowledge development to all levels of training and technical assistance), outputs
(from reports to curricula to communities of practice), and outcomes (short-, mid-, and long-term).
Figure 1 below represents the logic model as it was co-developed by partners (representing
stakeholders) at the time of the proposal. In the first two years of WINTAC operation, this logic
model has continued to be applicable and relevant.
Though some suggest that logic models and theories of change are the same, others have
argued a theory of change represents a far more detailed understanding of causal links between
program activities and outcomes achieved (De Silva, Breuer, Lee, Asher, Chowdhary, Lund, et al.,
2014) and that logic models are more of an overview and tool for conducting summative evaluation
that tracks results, usually for funders (Prinsen & Nijhof, 2015). WINTAC's mandate and scope of work are vast and broad, preventing an adequate one-size-fits-all framing or universal theory of change. Instead, its logic model is parsimonious and high-level, and intended to evolve towards
greater detail and specification. This evolution is occurring primarily through the development of a separate logic model for each of its specified subject matter areas, with details additionally identified
in intensive TA agreements developed collaboratively with SVRAs and other TACs. These more
specified and expansive models may be more appropriately considered theories of change and are
presented in the summative evaluation section below analyzing progress of intensive TA activities
by topic area. While the activities, outputs, and short-term outcomes for each of those topic area
logic models are customized and specified, the inputs and long-term outcomes for all of WINTAC
remain the same and are noted here.
Figure 1. WINTAC Logic Model (reconstructed here in text form from the original graphic)

Goal: Increasing SVRA skills and processes, implementing significant new activities, collaborating with the workforce development system, and meeting the requirements of WIOA.

Partners/Inputs: SDSU-II, CSAVR, GU, NDI-LEAD, BBI, and SMEs, with expertise in: state and federal VR programs and practices; existing capacity, needs, and resources of SVRAs; WIOA and the Rehabilitation Act; the workforce development system; pre-employment transition services; customized and supported employment; the common performance accountability system; adult learning principles; implementation science; and evaluation.

Activities:
Knowledge Development: needs assessment of all SVRAs and surveys of all stakeholders; systematic review of literature, RSA monitoring reports, RSA-911 data, DOE and DOL policies and guidance, and SVRA MOUs/MOAs.
Training and TA: intensive sustained TA; targeted specialized TA; universal TA; training on collaboration with the Workforce Development System; training on the performance accountability system, Career Index Plus, and Career GPS; webinars on emerging and best practices.
Create: collaborations through peer mentor networks and 5 CoPs; an IT platform.

Outputs: public reports of evidence-based and promising practices; documentation of existing SVRA capacity and needs for training and TA related to WIOA; 5 curriculum guides for SVRA training; 5 Communities of Practice; a comprehensive evaluation report in Year Three, with standard reports in other years and a summative report in Year Five.

Short/Mid-term Outcomes: increased skills of SVRAs to meet WIOA requirements; enhanced SVRA processes to meet WIOA requirements; increased pre-employment transition services to SWD; increased supported employment services to YWD; implementation of Section 511 of the Rehabilitation Act; increased access to supported employment and customized employment supports for youth and adults with MSD; implementation of the new common performance accountability system.

Long-term Outcome: Improved and increased competitive integrated employment outcomes for VR clients due to increased and improved service delivery and collaboration with the workforce development system as a result of innovative, WIOA-focused employment strategies.
VI. Formative Evaluation Focus Areas and Findings
In the first year, the emphasis of the evaluation was on formative issues: the coming together
of partners to create the consortium that serves as the WINTAC, the establishment of organizational
structures and processes to facilitate its operation, and the approaches to service delivery the
WINTAC has taken. After the second year, a deeper look at formative evaluation was again
conducted. In this third year evaluation we continue to examine collective impact through a
formative lens, but begin the shift to a summative perspective. Another formative evaluation issue revisited is responsiveness to the needs articulated by WINTAC's audience.
A. Implementing Conceptual, Theoretical, and Evidence-Based Frameworks
Several different frameworks inform both the services and evaluation of WINTAC as
reviewed above. Across those theoretical models, conceptual frameworks, and evidence-based
approaches are articulated several constructs or elements, with each model or framework having its
own terms of art. Despite the differences in nomenclature, we find these elements are defined
similarly and thus represent a consensus in the literature on core issues that impact a successful
approach to constructing a multi-partner center and providing effective, impactful services resulting
in change. An analysis conducted for the Year 1 report revealed the following core issues: (1)
Context: the values, cultures, norms, infrastructures, human resource capacity, politics, and symbolic
environments in which constituents operate, (2) Engagement: the degree to which constituents, as
key stakeholders, are a part of the development of plans for what services will be delivered, how
they will be delivered, and how services will be evaluated, and (3) Assessment of Need: surveying
one’s constituents to determine their priority areas of need so that expertise is matched and provided
appropriately.
Year 1 findings2 indicated a strong integration of all principles throughout WINTAC’s
operation. The "context" of SVRAs was part of the planning process for the WINTAC during its proposal stage through partnership with individuals from organizations that have previously worked with and/or provided services to SVRAs, as well as with CSAVR, the membership organization representing the interests of SVRAs nationally.3 These partnerships have continued from the
proposal phase through to WINTAC’s ongoing operation ensuring a continuous process of
reevaluating the context within which SVRAs operate – an important consideration given the
changing landscape of regulations, funding, and mandates at the state and federal levels (particularly
the ongoing regulations resulting from WIOA implementation).
"Context," "engagement," and "assessment of need"4,5 were also built into the very process
of TA provision itself and continue to be important as part of the collaborative process of both
drafting the intensive TA agreements and conducting ongoing updates. WINTAC intensive TA
agreements have been structured to ensure all elements are addressed and documented.6 These
agreements are not static contracts; rather, they are iterative and ongoing and can be amended to remain consistent with changing SVRA needs and resources when significant changes occur and expectations for outcomes become further clarified. Indeed, as
described more fully below, intensive TA agreements drafted at the very start of WINTAC were
2 Appendix A excerpts the full analysis from Year One of these findings in detail.
3 These groups are, represent, and/or serve stakeholder populations including youth and adults with disabilities; ethnically and geographically diverse groups with disabilities; rehabilitation professionals; administrators and executives; workforce development professionals; and researchers.
4 In Year 1 a baseline needs assessment was conducted of all 80 SVRAs. Data relevant to WINTAC were analyzed for the Year 1 Formative Evaluation Report.
5 WINTAC has commendably operationalized a distinction between "engagement" and "assessment of need." Though these elements have clear overlap when done well (assessing need through direct contact with affected stakeholders, rather than through use of secondary source reports only), perfect overlap would mean not only redundancy but also sub-optimal or superficial interaction with stakeholders. Rather, "engagement" must mean more than reporting on need (and later on satisfaction); "engagement" must also mean the involvement of stakeholder voices in the development of service plans and the ways in which to assess their efficacy or utility. In this way, "engagement" entails full participation and enhances relevance for targeted audiences.
6 Indeed, the template for the intensive TA agreements follows a logic model approach adapted from the 2004 Kellogg Foundation's Logic Model Development Guide.
more variable (even within the same subject matter area) than those developed later in Year 2 and in
Year 3, demonstrating a growth in understanding between both SVRAs and WINTAC staff about
critical needs to be addressed, services best addressing them, and common expectations for outputs
and outcomes that are both reasonable and important to achieve.
B. Achieving Collective Impact
Collective impact examines whether social sector initiatives that coordinate across sectors, involving multiple partners and systems, can create large-scale social change and impact better than individual organizations acting alone. Based on Kania and Kramer's (2011) articulation, collective
impact is successful when five specific conditions are met: (1) there is a common agenda, (2) there is
a shared management system, (3) there are mutually reinforcing activities, (4) there is continuous
communication, and (5) there are backbone support organizations for the overall initiative.
Across all three years, WINTAC continues to represent a successful implementation of these
five guiding principles. (1) Common Agenda: RSA's expectations for WINTAC set the foundation
for the common agenda that is prescribed by the provision of funding to SDSU and its partners.
These expectations involve provision of services in five core subject matter areas (see Section II.
Introduction), but also involve special projects. (2) Shared Management: SDSU has created multiple
mechanisms and platforms for shared management involving information technology solutions (e.g.,
a common email system, online document sharing and archiving, and online reporting of activities
amongst others), as well as leadership-based systems whereby each partner is represented on a
leadership team and has primary responsibility for either a subject matter area or a jurisdictional
area. (3) Reinforcing Activities: As a team, WINTAC brings an extensive depth of expertise across a
breadth of subject matter relevant to disability employment including, but not limited to,
rehabilitation counselor training, continuing education, and leadership; cross-agency partnering and
collaborations; serving populations in transition, across the spectrum of disability, and across diverse
geographic and ethnic populations; career pathways; supporting SVRA systems change efforts;
knowledge translation and knowledge brokering of innovative and evidence-based best practices;
implementation of Communities of Practice and Peer Mentoring Networks; development and use of
electronic tools supporting SVRA provision of services, administration, case management, access to
career information, and other activities; and program evaluation. This expertise is shared across
subject matter areas, rather than divided up piecemeal into silos. Additionally, subject matter areas
blend naturally into one another and leverage knowledge gained from one area to support
implementation in another. One example is the connection between improved integration (one area)
supporting reporting of common performance measures (another area). States that are piloting The
Career Index Plus also benefit from a tool to support improved integration and reporting. Thus,
different teams of the WINTAC work together to provide universal, targeted, or intensive technical
assistance seamlessly and collaboratively, ensuring not only effective but also (importantly for
SVRAs) efficient service provision. This commitment to collaborative and
reinforcing activities is also seen in the way “joint” intensive technical assistance activities are
carried out between the WINTAC and other TACs funded by RSA which may have partially
overlapping foci (e.g., JDVRTAC, Y-TAC and NTACT). (4) Continuous Communication: SDSU
has established multiple standard channels for ongoing and regular communication and strategic
decision-making through shared management systems as described above, and regularized meetings
which take place for all teams every two weeks virtually and twice a year in-person. In addition,
subject matter teams have ongoing meetings to organize their work and are regularly joined by staff
from SDSU. (5) Backbone Support: SDSU serves as the backbone to a cohesive set of partners who
function as one entity and have become more than the sum of their individual parts.7 WINTAC is not
a loose collaboration of disparate entities providing services in a silo. Rather, partners are all actively
and continuously engaged as part of a leadership group, ensuring that resources and
expertise flow into the WINTAC as needed on an ongoing basis. SDSU provides core funding for
WINTAC-related activities to all partners and requires defined workplans and updates, shared on
teleconferences, online via management systems, during in-person meetings, and as part of
documented update reports. As such, SDSU ensures consistent engagement with WINTAC's defined
mission and is able to adapt to a changing environment as informed by its partners.
WINTAC has been structured by SDSU and implemented collectively by SDSU and partners
to integrate all five key elements of successful collective impact. Over the course of Years Three
through Five, we will continue to examine achievement of collective impact by examining emergent
principles to address the significant complexity inherent in WINTAC’s mission (Kania & Kramer,
2013) and assessing specific indicators under each of the five key elements. In Year Three,
WINTAC leadership, topic area TA teams, and program evaluators dealt with a key aspect of
emergence and complexity by reviewing progress and redefining activity articulation and goal
performance measurements and indicators for intensive TA agreements. As Kania and Kramer note,
“In fact, developing a common agenda is not about creating solutions at all, but about
achieving a common understanding of the problem, agreeing to joint goals to address the
problem, and arriving at common indicators to which the collective set of involved actors
will hold themselves accountable in making progress. It is the process that comes after the
development of the common agenda in which solutions and resources are uncovered,
7 In addition to the partnerships that create the WINTAC, there are further collaborations or partnerships that have been developed between the WINTAC as a whole and other groups. These collaborations stem from: (1) a recognition of the substantially high performance of the WINTAC in a short period of time and its strong leadership in the field, (2) a significant need by a major segment of SVRAs for assistance related to WIOA, and (3) overlap between the expertise provided by WINTAC and that provided by other TACs.
agreed upon, and collectively taken up. Those solutions and resources are quite often not
known in advance. They are typically emergent, arising over time through collective
vigilance, learning, and action that result from careful structuring of the effort.”
(Kania & Kramer, 2013, p.6, emphasis added)
Guidance on additional processes to use in order to understand collective impact is provided
in The Framework for Performance Measurement and Evaluation of Collective Impact Efforts with
recommendations articulated in three stages of development: (1) "early years," which should see a
team understanding context (achieved by WINTAC), establishing the five principles of collective
impact (achieved by WINTAC), and coordinating all the parties and pieces under an overarching
theory of change (achieved by WINTAC); (2) "middle years," which should see significant changes
to behavior by professionals and individuals, as well as systems changes, both of which will serve as
"gateways" or levers for outcomes at the client level; and (3) "later years," which should demonstrate
measurable change to the ultimate goals – summative evaluation. The Year One and Two Evaluation
Reports examined the “early years” developmental stage of WINTAC’s collective impact. In this
year’s evaluation report covering years one through three, we evaluate the “middle years”
developmental stage by examining outputs and short-term outcomes of intensive TA and special and
pilot projects (see the summative evaluation section further below).
C. Meeting Needs
As reviewed earlier, all 80 SVRAs were surveyed in Year 1 to determine their need for
various services from the several TACs funded by RSA. Data from this survey relevant to WINTAC
were summarized in the Year 1 report.8 Here, we map the needs identified to the breadth of services
provided thus far by WINTAC to examine the degree of needs being met.9 As a reminder, these
8 Appendix B excerpts the full analysis from Year One of these findings in detail.
9 It is important to note that a lack of 1:1 correspondence between needs articulated in the Year 1 assessment and the current list of services in progress by WINTAC does not reflect a failure to meet needs. WINTAC has three more years
findings represent the needs articulated by the 53 of the 80 SVRAs that responded to the survey,
representing a 66% response rate.
Respondents were asked to identify their level of need for TA in each of the topic areas of the
WINTAC along a four-point scale: none, low, medium, and high. If ambivalent or uncertain,
respondents could also select “unsure,” though in practice very few did. Except for the topic area of
common performance measures, the other four areas each garnered about a third of respondents
expressing a “high” need. Collapsing the four levels of need, by combining “none” with “low” and
combining “medium” with “high,” yields a clearer way to distinguish which topic areas seem to be
of higher priority for respondents. Table 1 below presents need according to this
structure and compares it to the nature of services currently being provided by WINTAC through
targeted TA activities and intensive TA agreements.
Table 1. Needs and Intensive TA Services Provided by WINTAC by Topic Area

Topic Area                              Year 1 Expressed     Year 1-3 WINTAC
                                        Level of Need        Intensive TA Agreements10
Pre-Employment Transition Services      18%                  81%
Subminimum Wage                         38%                  34%
Competitive, Integrated Employment      35%                  75%
Integration of VR into WDS              37%                  47%
Common Performance Measures             41%                  72%
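The collapsing step described above can be sketched as a simple tabulation. The snippet below is an illustrative sketch only: the `sample` responses are hypothetical, not actual survey data, which is held by the evaluators.

```python
from collections import Counter

def collapse_need_levels(responses):
    """Collapse the four-point need scale into two bands:
    'none'/'low' -> 'lower', 'medium'/'high' -> 'higher'.
    Any other response (e.g., 'unsure') is tallied separately."""
    mapping = {"none": "lower", "low": "lower",
               "medium": "higher", "high": "higher"}
    return Counter(mapping.get(r, "unsure") for r in responses)

# Hypothetical responses for one topic area (not actual survey data):
sample = ["high", "medium", "low", "high", "none", "unsure", "medium"]
print(collapse_need_levels(sample))
```

Collapsing in this way trades granularity for a clearer split between higher- and lower-priority topic areas, which is what Table 1 reports.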
The national needs assessment also demonstrated a high need for services in areas involving
overlap between WINTAC and other TACs. For example, WINTAC's services in Pre-ETS and
Customized Employment engage youth, a population also served by both the Y-TAC and NTACT.
Services being provided by WINTAC and in collaboration with these other TACs demonstrated a
to provide services, and needs may have changed. Survey responses anticipating need in the face of a changing regulatory environment, where some mandates have not yet been clarified, are likely to shift significantly as regulations are clarified.
10 These figures represent agreements categorized by WINTAC as "definite," "likely," and "probable" in their tracking records.
high degree of responsiveness to their constituents’ needs: 43% of intensive technical assistance
agreements (ITAs) involve collaboration with Y-TAC and 9% with NTACT. Collaborations with
other TACs included JDVRTAC (12.5%), PEQA (3%), as well as other national collaborations with
the Disability Employment Initiative (6%).
D. Summary
In Year 1, an aggregation across conceptual programmatic and evaluation
frameworks yielded an integrated set of principles by which WINTAC's formative structure and
progress could be evaluated. Using those principles (understanding SVRA context, ensuring
stakeholder engagement, and assessing needs), Year 1 and 2 evaluations find WINTAC to be
making strong progress. These principles were met both at the time of the proposal and continue to
be important as WINTAC implements its activities in all areas. The partnerships that drive and
support stakeholder engagement and inform WINTAC of current needs remain vibrant.
A comparison of stakeholders’ articulated needs (from the collaborative needs assessment
conducted in Year 1 by PEQA-TAC on behalf of TACs) and the nature and amount of WINTAC
services across topic areas clearly demonstrates that WINTAC is meeting needs and providing
services beyond those which were even anticipated by the SVRAs themselves – notably, the need for
Pre-ETS. This is further evidenced when one looks at both “satisfaction” as evidenced by a
nationwide survey of SVRAs conducted by WINTAC in Year 3 and the progress of activities,
outputs, and outcomes discussed further below in the summative evaluation section.
And finally, an initial examination of collective impact achieved by WINTAC demonstrates
that the architecture and relationships are in place for the partners that form the WINTAC to
advance WIOA implementation nationally and to strengthen SVRA best practices that improve
employment outcomes for clients, in a way that exceeds what any individual entity
would have been able to do operating independently. Future evaluations of collective impact by the
WINTAC will examine and highlight the strategies it is using to deal with the highly complex
ecosystem of VR services, particularly with the rapidly changing economic and regulatory context
within which WINTAC and SVRAs operate.
VII. Summative Evaluation Focus Areas and Findings
A. Universal Technical Assistance
i. Website Resources
The WINTAC website serves as the main gateway for providing Universal TA to SVRAs
and other relevant organizations. Materials uploaded to the WINTAC website for Universal TA
include a mix of peer-reviewed journal articles and other publications, fact sheets, government
reports, webinars, links to self-paced training courses, laws and regulations, and policy documents.
In the Year 1 Evaluation Report, the Universal TA provided was assessed and reported in terms of
choice and flexibility, access, cumulative knowledge and skill building, accessibility, innovative
interactive challenges, and continuous quality improvement. This year,
the report addresses traffic and website generated contacts to WINTAC, as well as feedback
obtained from WINTAC website visitors through (1) webpage evaluations and (2) follow-up surveys
of webpage evaluators.
ii. WINTAC Contacts Through Website
State VR agencies and other groups can reach out to WINTAC by phone, email, or in person
with universal TA requests. Table 2 shows the number of requests that the WINTAC received and
responded to in the first three quarters of Year 3, by type of request:
Table 2. Number of non-intensive and targeted TA requests via website in Years 2 and 3

Type                                      Year 2   Year 3   Year 3   Year 3   Year 3
                                          Total    Q1       Q2       Q3       Q4
Pre-ETS                                   10       3        9        6        TBD
Section 511 implementation                12       1        1        1        TBD
Customized and supported employment       3        0        0        0        TBD
Integration of VR into the Workforce
Development System                        5        1        1        0        TBD
Common performance measures               5        1        3        2        TBD
Other (WIPPS)                             5        0        0        1        TBD
Total                                     40       6        14       10       TBD
iii. WINTAC Website Traffic
WINTAC regularly tracks website usage data using Google Analytics to understand traffic
volume as well as trends in usage over time. In Year Two, website traffic statistics showed an
increase each quarter in terms of unique visitors, page views, and visits/session and a steady pace for
returning visitors, new visitors, pages viewed per session and duration per session as shown in Table
3. Table 4 presents the same data for Year Three, and the pattern is similar: unique visitors, page
views, and visits per session increased across the quarters. Interesting differences emerge in
new visitors, visits per session, and pages per session: all are up from Year Two, new visitors
and visits per session markedly so, and pages per session nearly doubled in every quarter. This
pattern strongly suggests that WINTAC and its website continue to be an important resource, for
an increasingly large audience, for those wishing to learn about WIOA implementation. Given the
closely connected community of VR and disability employment professionals, it also suggests that
WINTAC's resources and services are satisfactory or even exemplary, and that as professionals
avail themselves of WINTAC's website and/or its direct services, others learn of it in positive
terms and seek it out as well.
Table 3. Website Traffic Data for Year 2

Metric                        1st Quarter      2nd Quarter      3rd Quarter      4th Quarter
Unique Visitors               2,889            4,689            4,569            6,312
Page views                    17,996           32,535           30,231           39,638
Visits/Sessions               4,848            8,261            7,766            10,654
Returning visitors            46%              47%              48%              47%
New visitors                  54%              53%              52%              53%
Pages per session             3.7              3.9              3.9              3.7
Average duration per session  3 min. 52 sec.   3 min. 51 sec.   3 min. 59 sec.   3 min. 48 sec.
Table 4. Website Traffic Data for Year 3

Metric                        1st Quarter      2nd Quarter      3rd Quarter      4th Quarter
Unique Visitors               6,032            8,188            9,799            31,126
Page views                    51,972           91,962           101,248          334,593
Visits/Sessions               10,274           14,527           16,538           57,113
Returning visitors            47%              23.7%            23.4%            19.7%
New visitors                  53%              76.3%            76.6%            80.3%
Pages per session             5.1              6.35             6.12             5.86
Average duration per session  4 min. 6 sec.    3 min. 53 sec.   3 min. 30 sec.   3 min. 42 sec.
Cumulatively, since going online in October of 2015, the WINTAC website has had 97,352
visitors, with 52,210 returning (53%) – a strong return rate demonstrating that stakeholders indeed
view WINTAC as an important and trustworthy resource for information, updates, and assistance.
Cumulative page views total 488,704.
During the reporting period October 1, 2017 – September 30, 2018, the WINTAC Website
was viewed by 31,126 users of which 30,415 were new users. The website had visitors every day.
There was a total of 56,480 entrances to the website, ranging from 3 to 517 with 245 days of “100+”
entrances and an overall daily average of 155 entrances. Once again, this data buttresses the
interpretation that stakeholders are coming to WINTAC as a trusted resource for WIOA information
that they need. Data from page views is similarly high, indicating that stakeholders are staying with
the website and browsing it for information rather than coming to the site and leaving right away;
during this reporting period total page views were 334,593 (ranging from 6 to 4,315 per day), with an
average of 917 page views per day.
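The daily averages cited above follow directly from the reporting-period totals; as a quick check, assuming the full 365-day reporting period:

```python
# Reporting period October 1, 2017 - September 30, 2018: 365 days.
DAYS = 365
entrances_total = 56_480
page_views_total = 334_593

avg_entrances = round(entrances_total / DAYS)    # 155 per day
avg_page_views = round(page_views_total / DAYS)  # 917 per day
print(avg_entrances, avg_page_views)
```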
Another way to examine website traffic is by days of particularly high intensity. These
patterns may be explicable by considering the importance of the day to the stakeholder (e.g., an
upcoming implementation date for a WIOA regulation may drive information-seeking to ensure
compliance) or an outreach effort of WINTAC or a related entity (e.g., CSAVR or RSA). Table 5
below examines two different ways of looking at high-intensity traffic days: the top 10 days of
highest page entrances and the top 10 days of page views. Looking at time-based trends in yet
another way, the following weeks had either 3 or 4 days in the “top 50” of page visits:
• 02/20/2018 (3)
• 03/05/2018 (3)
• 04/16/2018 (3)
• 05/07/2018 (4)
• 05/14/2018 (3)
• 07/09/2018 (3)
Table 5. Top 10 Days of Entrances and Page Views for WINTAC Website
Ranking Entrances Page Views
#1 5/15/2018 (517) 5/15/2018 (4315)
#2 5/24/2018 (504) 5/24/2018 (3652)
#3 7/31/2018 (433) 3/13/2018 (3323)
#4 12/12/2017 (417) 12/12/2017 (2593)
#5 3/13/2018 (378) 4/16/2018 (2524)
#6 7/10/2018 (372) 3/6/2018 (2430)
#7 8/27/2018 (370) 5/4/2018 (2424)
#8 4/17/2018 (368) 7/31/2018 (2368)
#9 4/16/2018 (367) 9/18/2018 (2213)
#10 6/12/2018 (327) 8/27/2018 (2176)
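Rankings like those in Table 5 can be reproduced from a daily analytics export with a simple sort. The sketch below is illustrative: the first three counts match Table 5's entrance figures, while the fourth date and count are hypothetical; the actual data come from Google Analytics.

```python
def top_days(daily_counts, n=10):
    """Return the n (date, count) pairs with the highest counts, descending."""
    return sorted(daily_counts.items(), key=lambda kv: kv[1], reverse=True)[:n]

# Daily entrance counts; first three from Table 5, last one hypothetical:
daily = {"5/15/2018": 517, "5/24/2018": 504, "7/31/2018": 433, "1/2/2018": 88}
print(top_days(daily, n=3))
```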
Prior analyses of high traffic timeframes and discussion with WINTAC TA Team members
indicated that many traffic spikes coincide with upcoming trainings and similar events. From a
continuous quality improvement perspective, this makes clear that relevant sections of the website
should be up-to-date before important webinars and site visits. These timeframes can also be
important opportunities to capitalize on a large incoming audience.
The WINTAC Year Three Annual Report also provides metrics for the “Top 10 Pages
Visited” and “10 States with Higher Traffic.” This data allows us to ask further questions beyond
“how much?”: “what do visitors see?”, “who are they?”, and “how do they get there?” Particularly
when looking at days of high traffic, it becomes clear that one topic consistently drives a majority of
the traffic thus far: Pre-Employment Transition Services (Pre-ETS). Given the significant changes
and complex regulations, it is apparent that stakeholders needed guidance related to these areas and
the WINTAC was a resource for them. In fact in Year Three, 80% of website visitors went to the
Pre-ETS landing page as their first or second page, and approximately 38% (21,285) started on the
Pre-ETS page instead of the home page (having come from the learning management system for
webinars).
iv. WINTAC Webpage Evaluations
As discussed in the Year 1 report, a review of best practices was conducted to inform
approaches to conducting evaluations of the website and key questions to ask of evaluators. Based
on the review and WINTAC Leadership preferences for maintaining an efficient experience for
stakeholders, the use of pop-up surveys was eliminated as an option and instead key pages of the
website have a clear (but not obtrusive) “evaluate this page” button (see green button in screenshot
below). Website visitors can click this button and are directed to a short survey.
Figure 2. Screenshot of WINTAC Webpage with Evaluation Button
As the volume of responses makes clear, this strategy of requiring active selection by the
respondent generates fewer survey respondents than a strategy of pop-up surveys that are
administered to all or a subset of website visitors (or as with some websites, to those that aim to
register for resources or download materials); however, the data may be more meaningful as
choosing to respond is entirely self-generated and not imposed upon the visitor. The WINTAC
website was evaluated by 35 respondents in Years 1 and 2, with the following breakdown in Table 6
once again demonstrating the importance of the Pre-ETS topic area during that time.
Table 6. WINTAC Webpage Evaluations by Topic Area, Years 1 and 2

Area                          # of Evaluations
State Liaisons                1
Announcements                 1
WIPPs                         3
Training                      5
Distribution List/User/Login  8
Pre-ETS                       17
The WINTAC website was evaluated by 18 respondents during Year Three, with the
following breakdown:
Table 7. WINTAC Webpage Evaluations by Topic Area, Year 3

Area                                                        # of Evaluations
Other                                                       8
Pre-ETS                                                     4
Transition to the Common Performance Accountability System  1
Other (left blank)                                          5
For Year Two, 33 out of 35 respondents (94%) indicated they found the information
they obtained useful, and the same number (though not the same respondents) said they planned to
use the information in the future as follows:
Figure 3. Planned Use of Information Obtained by WINTAC Webpage Evaluators (number of respondents)
• Staff development: 14
• Program development: 15
• Implementation of specific activities: 12
• Policy or procedure creation or revision: 11
• General knowledge development: 21
• Provision of TA or training to others: 8
• Resource development: 16
In Year Three, 13 out of the 18 respondents (72%) indicated they found the information they
obtained useful, and 14 out of 18 respondents (78%) said they planned to use the information
in the future as follows:
Figure 4. Planned Use of Information Obtained by WINTAC Webpage Evaluators (percentage of respondents)
• Other (please specify): 0%
• Program development: 29%
• Policy or procedure creation or revision: 29%
• Implementation of specific activities: 36%
• Provision of TA or training to others: 43%
• Resource development: 50%
• Staff development: 57%
• General knowledge development: 64%
Figure 5. Planned Use of Information Obtained by WINTAC Webpage Evaluators (number of respondents)
• Other (please specify): 0
• Program development: 4
• Policy or procedure creation or revision: 4
• Implementation of specific activities: 5
• Provision of TA or training to others: 6
• Resource development: 7
• Staff development: 8
• General knowledge development: 9
When stakeholders turn to the WINTAC as a resource, such as by visiting the website for
information-seeking purposes, it is important to examine the quality of that experience in several
ways. The data reviewed above demonstrates that stakeholders see WINTAC as a resource and turn
to it for information, spending time on the website and making plans for use of the information they
obtain. A particularly positive sign is that two-thirds of survey respondents also indicated they had
further contact with the WINTAC by returning to the website, applying for and engaging in
Intensive Technical Assistance agreements, and joining Communities of Practice.
Knowing that the WINTAC website is an important portal to information and services
provided by WINTAC, it becomes important to consider referral sources. As seen earlier, some
website visitors were coming over from the webinar portal directly onto the website. In follow-up
surveys in Year Two, two-thirds of respondents indicated they had heard of the WINTAC website
from RSA’s and others’ websites and RSA’s newsletter. Now that SVRAs have become much more
familiar with the various Technical Assistance Centers (TACs) funded by RSA, and particularly with
the WINTAC as seen from the high level of engagement with the website and in TA activities
described below, there is less of a pressing need to consider how best to redirect stakeholders to the
website. Nevertheless, this information on referral is instructive for those occasions in the future
when WINTAC will post particularly important and time-based information to the website or needs
stakeholders to register through it as a portal for some other activity. Existing RSA, CSAVR, and
other guild listservs, newsletters, and websites are likely to be good “connectors.”
Finally, no analysis is complete without assessing a baseline of satisfaction. Though satisfaction
is not a sufficient condition for achieving outcomes, it is a necessary one, as stakeholders will
simply not avail themselves of ongoing information, training, or technical assistance without it.
In follow-up surveys to website
visitors in Year Two, two-thirds of respondents found information they accessed “very relevant” to
their organization and rated it “high quality.” Another one-third of respondents felt the material
would be very useful to improving their agency’s policies, procedures, practices, capacities, or
outcomes.
v. Summary
As the gateway to Universal TA, the WINTAC website must be effective in providing
information that stakeholders need and ultimately use. Now that the WINTAC has completed
three years of implementation and conducted extensive personal outreach through dedicated
subject matter and regional TA representatives, fewer contacts for TA come through the website
itself (the pace decreased from Year 1 to Year 2, with Year 3 TBD); that any come through at all
indicates it is important to keep the mechanism available.
Website traffic statistics showed an increase each quarter in Year 2 in terms of unique
visitors, page views, and visits/session and a steady pace for returning visitors, new visitors, pages
viewed per session and duration per session. In Year 3 this pattern was largely the same, although
new visitors, visits per session, and pages per session were all markedly up from Year 2
demonstrating an ongoing pattern of increased utility of the WINTAC website. Cumulatively, since
going online in October of 2015, the WINTAC website has had 97,352 visitors, with 52,210
returning (53%) – a strong return rate demonstrating that stakeholders indeed view WINTAC as an
important and trustworthy resource for information, updates, and assistance. Cumulative page
views total 488,704. In all years the highest demand for information topically was
related to Pre-ETS.
Stakeholder evaluations for the website are done in two ways: (1) through a button on the
website that allows visitors to evaluate the specific page they are viewing and (2) through follow-up
surveys we send to individuals who conducted the web-page evaluations. Both forms of evaluation
demonstrated positive experiences with the majority of individuals predicting they would use the
information for general knowledge development.
B. Targeted Technical Assistance
i. Overview
SVRAs can reach out to WINTAC for targeted technical assistance under the five topic
areas. The WINTAC’s website includes a “Request TA” section that allows users to formally request
targeted, specialized, or intensive TA. In addition, SVRAs shared their need for TA in the needs
assessment. After an initial self-selection for targeted, specialized TA by an SVRA, either through the
initial assessment process or through a request received directly from the SVRA, WINTAC
followed up with the agency Point of Contact to gather more information and develop a plan to
address its needs.
ii. One-on-one targeted and specialized TA
Over Year Three, WINTAC engaged in one-on-one targeted TA with 46 SVRAs (30 in Year
1 and 50 in Year 2) and conducted several joint TA sessions with multiple SVRAs. Several SVRAs received
targeted TA multiple times, either to follow up on processes initiated through earlier TA sessions or
on other topic areas. Targeted TA can occur through different means including webinars, face-to-
face trainings, on-demand discussions over calls and emails, teleconferences, meetings, and
presentations at conferences. Targeted TA can include consultation, policy and procedure review,
development or revision of processes and documents, and sharing of best practices. WINTAC relays
SVRAs' questions and requests for clarification to RSA in order to provide accurate guidance. A total of
1,950 individuals received in-person targeted TA in Year 3. Figure 6 provides the percentage of
SVRAs who received one on one targeted TA for each topic area in Year 2. Year Three data by topic
area is TBD as it is still being disaggregated, but the quarterly program reports provide information
on state requests for targeted TA across topic areas. The reports demonstrate that targeted TA often
leads to additional engagement with WINTAC, ranging from CoP participation to ITA agreement
modification or initiation and pilot project initiation.
Figure 6. Percentage of SVRAs requesting targeted TA per topic area in Year 2
• Pre-ETS: 82%
• Section 511: 30%
• CIE: 32%
• Integration: 30%
• CPMs: 50%
In Year Three, WINTAC provided multiple joint TA sessions, i.e., targeted TA delivered to multiple
SVRAs jointly. Examples include a session at the RSA Fiscal Conference where the Pre-ETS Team
co-presented with RSA fiscal staff on the revised model of moving from required to authorized
activities (150 attendees); and a live training on Order of Selection and partnerships (180 attendees).
In each instance the information was tailored to address some of the implementation issues that
interested SVRAs face in those respective areas.
In addition to SVRAs, WINTAC also received requests for targeted TA in Year Three from
other stakeholders, such as National Goodwill, the Oklahoma Transition Council, and the Alliance
GetAWARELive conference. WINTAC has also continued to take advantage of conferences and
other gatherings to reach a wide audience of SVRAs and relevant stakeholders, including CSAVR
conferences, the NCSAB conference, and the National Rehabilitation Leadership Institute.
Across the years, states that already have an intensive TA agreement have also requested
targeted TA for topic areas not covered by their agreements and not requiring intensive, ongoing
assistance. Thus, targeted TA can also supplement ongoing intensive TA offered by WINTAC to
SVRAs.
iii. Immediate Post-Training Evaluations
WINTAC conducts onsite and online trainings as another form of targeted TA. CRC credit is
available for those who are CRC certified; this has become a valued service which likely explains
the high traffic to this section of WINTAC’s website. For online trainings, the eight live sessions
offered in Year 3 were attended by 538 people, with individual session attendance ranging from 26
to 104 participants. WINTAC issued 254 Certificates of Completion and awarded CRC credit to 290
attendees. WINTAC conducts brief evaluations with attendees of trainings immediately following
the trainings. Figure 7 offers a summary of planned use for the TA by attendees of onsite trainings
from Years One and Two.
Figure 7. Onsite training participant responses on planned use of TA, Years 1-2
• Staff development: 52%
• Program development: 45%
• Implementation: 56%
• Policy or procedure development: 43%
• General knowledge development: 56%
• Provision of TA or training to others: 29%
• Resource development: 36%
Participants at onsite trainings indicated that trainings:
Improved their understanding of topics and increased the clarity of regulations and
requirements
Spurred policies and practices and gave them a strategic plan
Gave them direction on how to move forward and get to implementation
Helped them learn how they could train staff
Gave them useful specific examples
Taught them how changes would impact their work and promote long-term change.
Participants at onsite trainings felt that the following would be useful changes in future
trainings:
Splitting training up over two days
Having more time to go over all the useful information
Getting PowerPoint materials in advance
Having examples from other state agencies
Having some hands-on activities.
Figure 8 offers a summary of planned use for the TA by attendees of online (live/archived)
trainings from Years One and Two.
Staff Development: 29%
Program Development: 25%
Implementation: 35%
Policy or Procedure Development: 13%
General Knowledge Development: 51%
Provision of TA or Training to Others: 14%
Resource Development: 21%
Figure 8. Online Training participant responses on planned use of TA, Years 1-2
These post-training surveys of online training participants were conducted in Year Three as
well, and satisfaction was high: 99% indicated they found the information useful and 98%
indicated they planned to use the information. Figure 9 below summarizes reported planned use.
Resource, staff, and program development plans ranged from 12% to 15%; implementation was at
16%; general knowledge development was selected by about a third; and policy or procedure
development and providing TA or training to others were each selected by 6% of respondents. The
“other” category represented 3%, much of it noted to be for the purpose of informing clients and/or
providing services to them.
Resource Development: 14%
Staff Development: 15%
Program Development: 12%
Implementation: 16%
Policy or Procedure Development: 6%
General Knowledge Development: 29%
Providing TA or Training to Others: 6%
Other: 3%
Figure 9. Online Training participant responses on planned use of TA, Year 3
In general, during Years One and Two, participants of online trainings:
Appreciated examples from other states when available
Planned to use information as a resource to understand state reports
Indicated that it was good to learn about new strategies (e.g., for data sharing) as that
was one of the most difficult things to achieve in the past
Found the webinars great for policy development and training tools for staff
Appreciated the clarification on the meaning of definitions
Felt the information would be useful in working with clients (some indicating they
were already doing that).
In Year Three, online training participants used the open-ended comment field on the
evaluations to report much of the same:
They appreciated concrete examples of policies and practices from other states
They appreciated concrete examples in the form of scenarios
They were overwhelmingly positive in their appreciation for the webinar and
WINTAC in general
iv. Six-Month Training Follow-up Evaluations
In addition, WINTAC’s evaluation team is conducting follow-up surveys six months
post-training that seek to assess some or all of the following:
Self-perceived change in knowledge about the topic area, by asking about perception of
knowledge before (ideally included in a baseline) and after the training;
Confidence in being able to apply the knowledge to their work;
Reasons for attending the training;
How knowledge was applied, and facilitators of and barriers to applying knowledge.
The survey also repeats questions on satisfaction with the training (e.g., relevance,
accessibility, benefits, evaluator feedback, etc.). Surveys are disseminated to participants who provide
contact information for follow-up evaluation. The full set of survey findings for Year 2 is available
in Appendix C. Here we present findings from Year 3, from fifty-nine respondents:
Relevance of training: Respondents listed different reasons for participating in the targeted
TA trainings (Figure 10): to improve skills and knowledge (69%), because it was required for their
job or responsibilities (29%), because it was requested by their manager (8%), for general knowledge
(34%), due to new processes introduced at work (22%), and for continuing education credits (19%).
Why did you participate in this webinar/training?
To improve my skills and knowledge: 69%
General knowledge: 34%
Required for my job or responsibilities: 29%
New processes have been introduced at work: 22%
Continuing education credits: 19%
I was asked to take part by my manager: 8%
Other: 2%
Figure 10. Reasons for participating in training
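Because respondents could select more than one reason, the percentages in multi-select figures like this one can sum to more than 100%. As a minimal sketch of how such responses are tallied (the response data and option labels below are hypothetical):

```python
from collections import Counter

# Hypothetical multi-select survey responses: each respondent may
# choose several options, so per-option rates need not sum to 100%.
responses = [
    {"Improve skills", "General knowledge"},
    {"Improve skills", "Required for job"},
    {"Improve skills"},
    {"Continuing education credits", "General knowledge"},
]

# Count how many respondents selected each option.
counts = Counter(option for r in responses for option in r)
n = len(responses)

# Percentage of respondents (not of selections) per option.
rates = {option: round(100 * c / n) for option, c in counts.items()}
print(rates)  # "Improve skills" was selected by 3 of 4 respondents -> 75
```

Dividing by the number of respondents rather than the number of selections is what allows the reported percentages to exceed 100% in total.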
The charts below show participant reactions to the relevance of the training they received,
after a 6-12 month period. Ninety-five percent of the respondents agreed (48% strongly agreed) that
the assistance received through targeted TA trainings was relevant to the goals of the agency
(Figure 11), and 88% agreed (43% strongly agreed) that the assistance they received would be useful
in improving their agency’s policies, practices, capacity, and/or outcomes (Figure 12). Eighty-nine
percent found the targeted TA received to be quality technical assistance (Figure 13).
The assistance provided was relevant to the goals of my agency: Strongly Agree 48%, Agree 47%, Neutral 3%, Disagree 2%
Figure 11. Relevance of training to agency goals
These technical assistance activities will be useful in improving my agency’s policies/practices/capacity and/or outcomes: Strongly Agree 43%, Agree 45%, Neutral 7%, Disagree 3%, Strongly Disagree 2%
Figure 12. Usefulness of TA activities
Overall I found this was quality technical assistance: Strongly Agree 42%, Agree 47%, Neutral 8%, Disagree 2%
Figure 13. Quality technical assistance
Using the information provided in the training: The evaluation surveys specifically asked
how the information was being used by the respondents. Ninety-one percent of the respondents said
that they planned to use the information received during the training. Fifty-six percent of the
respondents were already using the information, while 25% planned to use the information but had
not yet had the opportunity. A majority are already putting the information into practice: 45% of the
respondents stated that they have been able to use the knowledge or skills learned in their job to a
great extent, and 40% to some extent.
Do you plan to use this information in the future? Yes 91%, No 9%
Figure 14. Plan to use information
Are you currently using the training content in your job?
Yes, I am using it: 56%
I haven't had the opportunity to use it, but I plan to: 25%
No, I'm not using it: 13%
I don't need to use this skill in my job: 2%
Other, please specify: 4%
Figure 15. Current use of information
How much have you been able to use the knowledge or skills you learned in your job?
To a great extent: 45%
Somewhat: 40%
Very little: 12%
Not at all: 3%
Figure 16. Extent of information use
Information provided through targeted TA (See Figure 17) was mainly used for general
knowledge development (66%), implementation of specific activities (38%), resource development
(26%), and staff development (25%). Other uses included policy or procedure creation or revision
(23%), providing TA or training to others (17%), program development (11%), and other (8%).
How did you use the information in your organization?
General knowledge development: 66%
Implementation of Specific Activities: 38%
Resource development: 26%
Staff Development: 25%
Policy or procedure creation or revision: 23%
Provision of TA or training to others: 17%
Program Development: 11%
Other: 8%
Figure 17. How was information used
The main facilitator of applying the knowledge or skills was that the knowledge was relevant
to the participant’s role in the agency, as mentioned by 64% of the respondents. Other facilitators of
applying the knowledge or skills included having opportunities to apply it (25%), having work
processes that support use of the skills (23%), having effective tools to apply the knowledge (19%),
receiving support from a manager (17%), and having the time (11%). (See Figure 18.)
What has helped you to use the knowledge or skills you learned?
Knowledge/skills relevant to my role: 64%
Plenty of opportunities to apply: 25%
Work processes support use of skills: 23%
Effective tools available on the job: 19%
Received support from my manager: 17%
Had the time: 11%
Encouraged by early success: 9%
Encouraged by my workgroup: 6%
Did a similar course previously: 4%
Nothing: 4%
Other, please specify: 2%
Figure 18. Facilitators to knowledge use
A majority (60%) of the respondents said that they had not faced any barriers to
putting the knowledge into action. A small number of participants cited barriers to use such as lack
of opportunities (15%), lack of time (13%), work processes that do not support use (4%), tools not
available on the job (4%), and no support from their manager (4%). (See Figure 19.)
What has stopped you from using any of the knowledge or skills you learned?
Nothing: 60%
No opportunity to apply: 15%
Have not had the time: 13%
Other, please specify: 8%
Work processes do not support use: 4%
Tools not available on the job: 4%
No support from my manager: 4%
Tried without success: 2%
Do not remember course content: 2%
My job has changed: 2%
Was applying knowledge or skills already: 0%
Discouraged by my workgroup: 0%
Figure 19. Barriers to knowledge use
Training Experience
Participants were asked to rate their experience with the training using the following scale:
1 = Strongly Disagree, 2 = Disagree, 3 = Neutral, 4 = Agree, 5 = Strongly Agree. The majority of the
respondents rated their experience as positive. Eighty-two percent agreed or strongly agreed that the
webinar/training adequately explained the knowledge, skills, and concepts it presented (Mean = 4.26,
SD = 0.84). Eighty-nine percent agreed or strongly agreed that the technical level of the information
met their expectations (Mean = 4.19, SD = 0.88). Ninety-one percent agreed or strongly agreed that
the presenter(s) had relevant knowledge and expertise (Mean = 4.26, SD = 0.86). Ninety-one percent
agreed or strongly agreed that the information was presented in a clear manner (Mean = 4.19, SD =
0.84). Ninety-three percent agreed or strongly agreed that they would feel comfortable recommending
the session to peers and coworkers (Mean = 4.09, SD = 1.00). (See Figure 20.)
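For reference, means and standard deviations of this kind can be computed directly from the distribution of 1-5 ratings. A minimal sketch, using hypothetical rating counts rather than the actual survey data:

```python
import math

# Hypothetical counts of ratings on the 1-5 scale
# (1 = Strongly Disagree ... 5 = Strongly Agree).
counts = {1: 2, 2: 1, 3: 4, 4: 25, 5: 18}

n = sum(counts.values())

# Weighted mean of the rating values.
mean = sum(rating * c for rating, c in counts.items()) / n

# Population standard deviation of the rating distribution.
variance = sum(c * (rating - mean) ** 2 for rating, c in counts.items()) / n
sd = math.sqrt(variance)

print(round(mean, 2), round(sd, 2))  # prints 4.12 0.93
```

Whether the report's SDs were computed as population or sample statistics is not stated; the sketch uses the population form.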
Figure 20. Training Experience (distribution of ratings from Strongly Disagree to Strongly Agree for each of the five statements above)
Follow-up Training Topics
An indicator of a successful training event is that participants would like additional
training or technical assistance. Eighty-nine percent of participants indicated an interest in additional
training topics or technical assistance. Over half of the participants were interested in further technical
assistance on Pre-Employment Transition Services (55%). Strong interest in receiving technical
assistance (over 25% of participants) was also shown in the following topics: Integration of VR into
the Workforce Development System (49%), Common Performance Measures (39%), Competitive
Integrated Employment (39%), and Implementation of Section 511 Requirements (27%).
After attending this webinar/training, what topic(s) would you like to receive more technical assistance on?
Pre-Employment Transition Services: 55%
Integration of VR into the Workforce Development System: 49%
Common Performance Measures: 39%
Competitive Integrated Employment: 39%
Implementation of Section 511 Requirements: 27%
The Career Index Plus: 22%
SARA: 12%
Other: 8%
Figure 21. Technical Assistance topics requested
v. Communities of Practice
Another effective and increasingly popular method of providing targeted technical assistance
comes in the form of Communities of Practice (CoPs). Over the course of Year Two, CoPs began
implementation (the distribution lists for the five topic-area CoPs had over 500 members each, for a
total distribution list of 2,745 members). The number of CoPs increased in Year Three, and the
number of participants grew dramatically, as evidenced by the increased number of individuals
signed up by topic area on WINTAC’s distribution lists (a total of 3,393).
Understanding the nature of audience participation and engagement is a “first look” at
evaluating the CoPs. As noted by RSA attendees of the Year Three All-Team Meeting in San Diego,
a high and repeated level of engagement certainly suggests that the CoPs are disseminating useful
information. The Career Pathways CoP certainly met this “test” in calendar year 2018, as evidenced
by the following measures demonstrating high participation, strong levels of repeat engagement,
and diverse member participation:
Meeting attendance: April 23, 2018: 229; August 2, 2018: 268; October 9, 2018: 160
Figure 22. Participation Rate for 2018 Career Pathway CoPs (Total=657)
Repeat attendance: 33 attended two of the three meetings; 5 attended all three meetings
Figure 23. Repeat Participators for 2018 Career Pathway CoPs (Total=38)
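Repeat-engagement counts of this kind can be derived from per-meeting attendance rosters by counting how many meetings each person appears in. A minimal sketch, using hypothetical attendee IDs rather than actual roster data:

```python
from collections import Counter

# Hypothetical rosters for three CoP meetings (attendee IDs).
meetings = [
    {"a1", "a2", "a3", "a4"},
    {"a2", "a3", "a5"},
    {"a3", "a6"},
]

# How many meetings each person attended.
appearances = Counter(person for roster in meetings for person in roster)

# Repeat participators: attended exactly two, or all three, meetings.
attended_two = sum(1 for c in appearances.values() if c == 2)
attended_all = sum(1 for c in appearances.values() if c == 3)

print(attended_two, attended_all)  # prints 1 1
```

Using sets per meeting ensures each person is counted at most once per roster before tallying across meetings.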
State representation: Alabama, Alaska, Arkansas, California, Colorado, Connecticut, Delaware, Florida, Georgia, Guam, Hawaii, Illinois, Iowa, Kentucky, Louisiana, Maine, Maryland, Michigan, Minnesota, Nebraska, New York, North Carolina, Ohio, Oregon, Pennsylvania, Tennessee, Texas, Virginia, Washington, West Virginia, Wyoming
Figure 24. States Represented for 2018 Career Pathway CoPs (Total=31)
Additionally represented groups: training providers, service providers, DD councils, other disability stakeholders, independent living, health organizations, workforce, disability rights organizations, Goodwill Industries, APSE, Medicaid, RSA, post-secondary, ODEP
Figure 25. Non-VR Participant Groups for 2018 Career Pathway CoPs (Total=14)
Several other CoPs are in place or set to begin in Year Four. In Year Three, an evaluation
framework for CoPs was collaboratively refined with WINTAC Leadership and TA Specialists.
Sample substantive progress metrics for CoPs will include: identification of evidence-based,
promising, and emerging practices; successful implementation of new practices; the capacity to
expand and sustain new practices; and the capacity to replicate new practices in other locations.
In Year Four, TA Specialists and Program Evaluation Specialists will identify or confirm
goals for each CoP group, so that these may be evaluated throughout the year to ensure
responsiveness to audience need and progress towards creating meaningful impact that results in
tangible outcomes. As done in prior years, evaluations will examine CoP participation and
engagement through attendance and CoP site metrics and responsiveness to posts for asynchronous
activity where applicable. In addition, the evaluation can include follow-up surveys to determine
satisfaction with experience, follow-up use of WINTAC and CoP websites, actual use and outcomes
of information obtained, whether participants are converted to Intensive TA recipients, and the
impact of participation and learning for existing ITA recipients. An example of this took place for an
activity similar to a CoP which benefited from the same evaluation framework – a “summit” meeting
to bring a community of agencies together around a common topic area of interest. (Forums can also
follow this format.)
The “Job Driven Strategies for Serving Individuals Who Are Blind/Visually Impaired
Summit” was developed specifically to bring together agencies serving individuals with blindness or
low vision around five topic areas, in collaboration with the JDVRTAC, where there is overlap with
WINTAC areas of focus. As with CoPs, the goal was to bring those with common interests together,
identify promising and effective practices, discuss solutions to challenges, document and
disseminate recommendations and practices, and follow up to examine implementation of new or
improved practices by participants. There were thirty attendees, and the Summit was evaluated
immediately after the conclusion of the event and again several months later to determine satisfaction
and impact. Results of the immediate evaluation conducted by JDVRTAC demonstrated satisfaction
with the event via ratings, as shown in the table below.
Question | Response | % | n
The content presented was highly relevant to my job. | Strongly Agree | 90 | 19
  | Agree | 10 | 2
The training activity greatly increased my knowledge/skills/abilities. | Strongly Agree | 76 | 16
  | Agree | 24 | 5
Overall, I found that this was a high quality training activity. | Strongly Agree | 95 | 20
  | Agree | 5 | 1
This training was relevant to my professional development. | Strongly Agree | 86 | 18
  | Agree | 10 | 2
The objectives were appropriate and useful. | Strongly Agree | 86 | 18
  | Agree | 14 | 3
The presenters were knowledgeable. | Strongly Agree | 86 | 18
  | Agree | 14 | 3
The presenters were prepared and organized. | Strongly Agree | 81 | 17
  | Agree | 19 | 4
I had an opportunity to participate, interact, and ask questions. | Strongly Agree | 86 | 18
  | Agree | 10 | 2
Table 8. Job Driven Strategies for Serving Individuals Who Are Blind/Visually Impaired Summit Results
Participants also commented via open-ended text fields on aspects of the Summit, indicating
what they appreciated, what they felt could be eliminated, and how they envisioned using their
learning. These comments revealed that the content of the Summit was well chosen, with respondents
suggesting no eliminations and appreciating the interactive components where discussions took place
in small-group formats. Going forward, participants aimed to discuss their learning with management,
train staff, develop procedures, and collaborate with partners and clients. Respondents were also
interested in follow-up or future sessions to continue learning.
Three months later, WINTAC followed up with a survey to determine the impact of the training.
Participants were asked to identify which of the sub-topic areas they had been interested in learning
about at the meeting across each of the main focus areas (Collaborative Implementation of WIOA,
Youth-Focused Initiatives, Career Pathways, Use of Labor Market Information, and Collaborative
Business Engagement), and then whether they were pursuing or planning to pursue practices in that
area, continuing conversations with others they met at the Summit related to that area, and/or
whether they needed further TA or training in that area. So far, there have been twelve respondents
to the survey; responses are summarized below in Table 9.
Sub-Topic Area | Was Interested in Area | Continue to Pursue Area | Continue Discussions with Others in Area | Desire More Training/TA in Area
Collaborative Implementation of WIOA
  Integrated Resource Teams | 1 | 1 | 1 | -
  Serving on Workforce Investment Boards | 2 | 1 | 1 | 1
  Data Sharing Systems with General Agencies | 3 | 2 | 3 | 1
Youth-Focused Initiatives
  Summer Work Programs | 4 | 4 | 2 | 1
  Student Internships | 1 | - | 1 | -
  Partnerships with Consumer Groups for Transition Services | 3 | 1 | 2 | 1
  STEM Camps, Events, Partnerships | 1 | 1 | 1 | -
  Post-Secondary Education Programs | 2 | 1 | - | 1
Career Pathways
  Building Project Search Partnerships | 1 | - | - | -
  Partnering with Community Colleges & OCB Training Centers | 1 | 1 | - | 1
Use of Labor Market Information (LMI)
  Use of LMI with Assistive Technology | 2 | 2 | 2 | -
  Utilizing The Career Index Plus as an LMI Tool | 1 | 1 | - | -
Collaborative Business Engagement
  Use of Varied Training Models & Platforms | 2 | 2 | 1 | 1
  Co-Locating with Partners | 3 | 1 | 2 | 1
Table 9. Collaborative Implementation of WIOA, Youth-Focused Initiatives, Career Pathways, Use of Labor Market Information, and Collaborative Business Engagement Results
As noted above, Year Four evaluation plans for CoPs, Summits, Forums, and other
dissemination-focused activities will follow the planned framework of needs identification and goal
setting, output tracking, and outcomes/impact identification. Surveys and/or interviews and
discussions with CoP members will be employed as appropriate to the group and topics. These
strategies comport with previously discussed approaches by scholars and practitioners of CoPs as
recommended evaluation practices. For example, Wenger and colleagues outline cross-sectionally
examining issues of: Domain (topics, issues), Community (relationships, roles, conflict, and
structure), and Practice (learning activities and knowledge repositories developed) and longitudinally
examining immediate value, potential value, applied value, realized value, and reframing value
(Wenger, Trayner, and de Laat, 2011). Similarly, the World Bank utilizes CoPs frequently in its work and
identifies the following key evaluation questions:
1. What kinds of knowledge are the CoPs creating?
2. Is the knowledge being used?
3. What has been the impact of the CoPs on the members?
4. How will/are the CoPs sustaining themselves?
This last element, sustainability, will begin to be addressed in Year Four but will be focused on
more directly in Year Five.
vi. Summary
In Year One, WINTAC engaged in targeted TA with 30 SVRAs. In Year Two, WINTAC
provided over 1,000 instances of targeted TA with 50 SVRAs through webinars, live trainings, calls
and emails. Over Year Three, WINTAC engaged in one-on-one targeted TA with 46 SVRAs, and
several joint TA sessions with multiple SVRAs.
Targeted TA also includes trainings, which are evaluated immediately afterwards and
approximately six months hence. Participants routinely express high levels of satisfaction with
trainings provided immediately following the trainings as well as in the follow-up evaluation
surveys. Participants have also started putting the knowledge gained through the targeted TA
trainings into action at their agencies. In Year Two, suggestions for future improvements have
included making the training sessions longer time to allow more time to go over the materials,
opportunities for ongoing and continuous learning, and getting more practical knowledge through
other agencies’ practices or hands-on learning. In Year Three, evaluations demonstrated that these
changes had been made and participants rated their experiences highly. Follow-up surveys
demonstrated that over half of participants were already using what they had learned and putting it
into practice. Community of Practice distribution lists collectively numbered at 2, 745 in Year Two
and 3, 393 in Year Three. Robust plans for evaluation have been outlined and will be collaboratively
finalized for implementation in Year 3. The Job-Driven Strategies for Serving Individuals Who Are
Blind/Visually Impaired Summit is an example of how CoPs, Summits, and Forums can be designed
with input from the beginning (needs assessing, goal-setting), evaluated over time, and yield tangible
outputs and outcomes.
C. Intensive Technical Assistance
i. Overview
The WINTAC is required in this first five-year cycle to provide intensive, sustained TA to a
minimum of 23 State VR agencies and their associated rehabilitation professionals and service
providers in the topic areas. WINTAC already has 32 intensive technical assistance (ITA)
agreements, which include 34 agencies.11
As discussed in the section on Formative Evaluation, WINTAC TA specialists engage in a
collaborative process to develop ITA agreements with agencies (and partners where relevant). Over
time, these processes continue to evolve and as a result, activities, outputs, and outcomes of
importance to agencies have begun to coalesce into a common set of needs. Thus, TA Teams have
an increasingly robust and comprehensive “menu” to guide initial agreement development with each
agency. Because each agency’s needs can vary, final agreements can look very different from one
agency to the next as a result of differential selection of “menu” items. The final product is a
customized one, particular to the agency and state. In this third year of WINTAC’s operation,
Program Evaluators worked with WINTAC Leadership and TA Team Specialists to refine and
increase the specificity of each core topic area’s set of activities, the outputs to which they
correspond, and the short-term outcomes to which they lead. These have resulted in the more
specified logic model frameworks alluded to earlier in the Formative Evaluation review and are
presented below under their relevant topic-area discussions of progress. The “Inputs” and
“Long-Term Outcomes” bookending these logic models remain the same as specified earlier for the
WINTAC as a whole, because all resources remain available and the common goal for RSA,
WINTAC, and the agencies is the same: improved competitive, integrated employment outcomes for
VR clients. Thus, they are not repeated in the models presented here.
11 These figures include agencies whose ITA agreements are definite, likely, and probable. They also include agreements incorporating topics outside of WINTAC’s original core five areas such as their workforce innovation pilot projects and business engagement and they also include agreements that are collaborative with other TACs.
ii. ITA Feedback, Generally
In Year Three of WINTAC, a survey was disseminated to garner feedback from all VR
agencies. The survey was sent to Vocational Rehabilitation (VR) agencies via CSAVR, which
disseminated the link to its membership of VR agencies nationwide to garner as broad a response as
possible from the population of VR personnel who have received intensive TA services from
WINTAC. For this reason, there can be more than one response from a given state, as different
personnel can be receiving assistance related to different topic areas of technical assistance (e.g.,
Pre-Employment Transition Services, Common Performance Measures, etc.).
Respondent Demographics. In total, there were 31 respondents with 15 from combined
agencies, 12 from general agencies, and 4 from agencies serving the blind. As noted, there are
duplicates in this sample, evident in the identification of states from which respondents hailed:
Florida (2), Nevada, South Carolina, Montana (3), Kentucky, North Dakota (2), Vermont, Idaho,
Michigan, Alabama, Guam (2), Pennsylvania, Louisiana, Texas, Maryland, Minnesota (4), Iowa,
District of Columbia, Wyoming, Missouri, Rhode Island, Arkansas, and Alaska.
In terms of professional roles, the majority of respondents hold senior positions in their
agencies with 45% being executives (14), 29% being managers (9), and 3% being supervisors (1).
Counselors (2) were 6% of the sample and “other” (16%) made up the remainder representing
positions in data, IT, and administration, as well as rehabilitation and transition specialists.
Overall Feedback. Respondents were evenly split in terms of awareness of whether the
agency was receiving intensive technical assistance with 42% responding in the affirmative (13) and
39% responding in the negative (12) (20% or 6 respondents were not sure). Given their differing
roles and interactions with WINTAC, it may be the case that someone other than the respondent in a
given state agency is managing the ITA agreement with WINTAC. Overall, feedback on the quality
of WINTAC services was extremely positive, with respondents rating it either Excellent or Good,
with few exceptions.
Pre-Employment Transition Services. Looking at WINTAC’s primary topical areas of
technical assistance (TA), 25 respondents (a strong majority of 83%) indicated receiving TA or
training on Pre-Employment Transition Services (Pre-ETS), and 22 of them (88%) found these
services excellent (another 8%, or 2, found services good, and 4%, or 1, found services average).
Open-ended feedback was overwhelmingly positive as well, with respondents valuing the expertise,
the deep level and closeness of assistance provided, the quick and timely responsiveness, and the
accessibility and positivity in communications.
“We work very closely with the WINTAC team on Pre-Employment Transition Services. We
send them materials to review and request information on a regular basis. They are quick to
respond and always extremely helpful. I have really appreciated their assistance!”
“They were instrumental in helping us define and deploy our pre-employment transition
services.”
Section 511. About a third of the sample, 11 respondents (37%) affirmed having received TA
or training to help them implement the requirements of Section 511. Of these, all were positive in
their ratings of the services with 73% (8) finding them excellent and 27% (3) finding them good.
Comments were similarly positive, noting:
“WINTAC helped us change our service delivery model to meet our 511 demands and create
our best practice.”
“Resources on WINTAC website helpful in the development of agency guidance pertaining to
Section 511 requirements and implementation.”
Supported Employment. A third of the sample, 10 respondents (35%), also affirmed having
received TA or training in Supported Employment. Once again, they were uniformly positive about
these services, rating them as excellent (5, or 50%) or good (5, or 50%). Services and resources
noted as particularly useful included those informing guidance on the practices and the development
or documentation of policies and procedures, as well as assistance with extended services for youth
with disabilities. Respondents look forward to the resolution of issues presented to RSA that are still
under consideration and feel the Community of Practice could benefit from revitalization.
Customized Employment. Eleven respondents (41%) indicated receiving TA or training
regarding Customized Employment with the vast majority categorizing the services as excellent (8
or 73%) and 3 (27%) indicating they were good. Comments indicate agencies are pleased with on
site trainings, website resources, the involvement of WINTAC staff in pilot efforts, and the customer
service of WINTAC staff.
Competitive, Integrated Employment. Services pertaining to competitive, integrated
employment (CIE) other than supported and customized employment strategies were rated
separately. Nine respondents (33%) received training or TA on CIE and almost all (8 of 9) rated
them excellent (the remaining person rated it good). Respondents were positive about resources and
their value in developing agency guidance, and assistance with local and state education agreements,
business engagement, and collaboration with other Technical Assistance Centers:
“Assisted with information on performance planning, One Stop Services integration, Career
Index and use in employment planning, job candidate engagement and business
development.”
“AR[kansas] is receiving TA from JDVRTAC & WINTAC is always willing to assist in
anyway necessary”
Integration of VR into the WDS. Nearly two-fifths of respondents have received TA or
training pertaining to integration of VR into the larger WDS (11 respondents, 39%). Once again,
services are rated positively, with roughly half indicating they are excellent (6) and half indicating
they are good (5). Open-ended responses indicate that services facilitated the development of data
sharing agreements, unified state plans, collaboration, and alignment with the rest of the WDS.
Common Performance Measures. Slightly more than half of respondents (16, or 55%)
received services pertaining to the Common Performance Measures (CPMs). A majority (76%, or
13) rated the TA or training excellent, and the remaining 24% rated it good. Respondents were
glowing in their open-ended comments about WINTAC staff’s support and assistance and indicated
that they are looking forward to future activities such as the Community of Practice focusing on
internal controls.
“Rachel is AMAZING!!!!”
“[R]esources and technical assistance in this area have been helpful in drafting agency
guidance pertaining to the common performance measures, especialy measureable skill gain
and credenital attainment.”
“WINTAC has been very helpful in assisting our state in capturing Common Performance
Measures especially Measure 6 Effectiveness in serving employers.”
Survey Summary and Future Needs.
Technical Assistance or Training Topic Area | Received Services in Area (%) | Rated Excellent (%) | Rated Good (%)
Pre-Employment Transition Services | 83 | 88 | 8
Section 511 | 37 | 73 | 27
Supported Employment | 35 | 50 | 50
Customized Employment | 41 | 73 | 27
Competitive, Integrated Employment | 33 | 89 | 11
Integration of VR into Larger WDS | 39 | 55 | 45
Common Performance Measures | 55 | 76 | 24
Table 10. Survey Summary and Future Needs
Survey respondents were also asked to consider the future of WINTAC service provision and whether they had suggestions or recommendations for improving it. Responses were provided solely in an open-ended format, and many used the opportunity to reiterate their positive impressions of WINTAC resources, staff, and services. Looking ahead, participants anticipate continued restructuring of the information and resources provided as universal TA, training on currently developing WIOA changes (with credit for training), continued training in WINTAC’s key topic areas of CPMs, CIE, CE, and SE, and further support in implementing workforce development integration.
“We are very appreciative of WINTAC and all it's help.”
“I really appreciate the staff at WINTAC who are always willing to assist me with transition
related questions specifically around Pre-ETS”
“We rely on the WINTAC on many areas which they have always provided great services
and information.”
iii. ITA Agreement Progress Metrics
As noted earlier, ITA agreements are structured in the form of a logic model, with activities,
outputs, and outcomes clearly articulated. Each quarter TA Team Specialists update the ongoing
progress in the conduct of activities, the development of outputs, and the accomplishment of
outcomes. To facilitate both a concrete and a deep understanding of progress, updates include metric-driven assessments and open-ended fields for elucidating important contextual details. The latter have been summarized in WINTAC’s annual performance report, which presents narrative descriptions of TA Team engagement and the emerging narrative of each state’s landscape as it relates to its progress. In this section, we present summaries of progress derived from the metric-driven assessments, by topic area (Appendix D provides progress metrics broken out state-by-state).
Overall, out of the total 425 activities for WINTAC noted in the intensive TA agreements,
70% have been initiated. Nearly two-fifths (38%) are already completed and the rest are at different
stages of implementation (see Figure 26). In examining progress through Year Three, Program
Evaluators noted that the structure of the spreadsheet created to capture data left too many activities
in an open-ended category of “in progress” that did not differentiate movement towards completion
as interim steps were taken. Until an activity was absolutely complete, there was no way for TA
Team Specialists to change the status. For the fourth quarter of Year Three (and going forward), the spreadsheet for updates has been modified to categorize progress into sub-categories that allow for a more fine-grained examination. These categories are now:
0% (not started); 25%; 50%; 75%; 90%; and 100% (completed).
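As an illustration of how these sub-categories can be rolled up into the metrics reported in this section (share initiated, share completed, average progress), the following is a minimal sketch using invented status data; the labels and figures are hypothetical and do not represent WINTAC's actual tracking tooling.

```python
# Hypothetical roll-up of activity status categories into summary
# metrics; category labels mirror the spreadsheet's sub-categories.
STATUS_TO_FRACTION = {
    "not started": 0.00,
    "25%": 0.25,
    "50%": 0.50,
    "75%": 0.75,
    "90%": 0.90,
    "completed": 1.00,
}

def summarize(statuses):
    """Return (average progress, share initiated, share completed)."""
    fractions = [STATUS_TO_FRACTION[s] for s in statuses]
    n = len(fractions)
    avg = sum(fractions) / n
    initiated = sum(1 for f in fractions if f > 0) / n
    completed = sum(1 for f in fractions if f == 1.0) / n
    return avg, initiated, completed

# Ten invented activities at various stages
statuses = ["completed"] * 4 + ["50%"] * 2 + ["25%"] + ["not started"] * 3
avg, initiated, completed = summarize(statuses)
```

With these invented data, 70% of activities are initiated and 40% are complete, mirroring the style of the aggregate figures reported above.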
Figure 26 below presents the overall status of all activities, and Figure 27 presents the same data with these finer gradations of progress.
Figure 26. Implementation Progress for Intensive TA Activities (Completed 38%, In Progress 32%, Not started 30%)

Figure 27. Implementation Progress for Intensive TA Activities, by Gradation (Completed 38%, 90% done 3%, 75% done 7%, 50% done 8%, 25% done 13%, Not started 30%)
iv. Pre-ETS progress
Twenty-six of the 32 intensive TA agreements cover Pre-ETS (81%). The logic model for
Pre-ETS, as refined in the 4th quarter of Year Three, is presented below. While not fixed or static, the
current menu delineates 10 key activities, concomitant outputs and 4 resulting short-term outcomes.
Activity: Review existing documentation, policies, procedures and internal controls for Pre-ETS, including the tracking and reporting of Pre-ETS
Output: Documented feedback on documentation, policies, procedures and internal controls for Pre-ETS, including the tracking and reporting of Pre-ETS

Activity: Assist in the development of new or revised policies and procedures and internal controls for Pre-ETS, including the expenditures, tracking and reporting of Pre-ETS
Output: Draft of new and/or revised policies and procedures and internal controls for Pre-ETS

Activity: Review existing expenditures allocated to the reserved funds and determine if they are allowable costs
Output: Documented feedback on allowable expenditures

Activity: Assist the agency in the development of processes and internal controls for accurate financial reporting of Pre-ETS
Output: Draft of written processes and internal controls provided to the agency

Activity: Review current interagency agreement between VR and SEA that encompasses the required elements in WIOA
Output: Documented feedback on interagency agreement

Activity: Assist in the development of an updated interagency agreement between VR and SEA that encompasses the required elements in WIOA, to use as a model for LEA agreements
Output: Draft updated agreement with recommendations

Activity: Assist agency in demonstrating that it has met the requirement for the provision of required pre-employment transition services and coordination activities before assigning authorized services to reserved funds
Output: Completion of RSA-approved model of movement from required to authorized activities

Activity: Provide training to VR staff regarding Pre-ETS and Section 113 of the Act as amended by WIOA
Output: The number of individuals that complete the training(s)

Activity: Provide training to VR staff regarding the agency’s updated policies and procedures related to Pre-ETS
Output: The number of individuals that complete the training(s)

Activity: Assist in the development of a strategy to explore and expand Pre-ETS service delivery, including possible electronic/online options and modalities
Output: Dissemination of Explore-Work.com to the agency

Short-Term Outcomes:
- There will be an increase in the number of students with disabilities that receive at least one of the five required services from year to year.
- The agency will completely and accurately report on the expenditure of the 15% reserve.
- The VR program will increase the allocation of Pre-ETS expenditures towards the minimum 15% reserve until it is fully expended.
- Improved interagency coordination and collaboration regarding pre-employment transition services at a state level (systems change).

Table 11. Pre-ETS progress
Pre-ETS Intensive TA Agreement Status:
Number of intensive intervention sites: 26
Number of activities articulated: 177
Activity progress on average: 69%
Number of outputs completed: 94 (70%)
Number of short-term outcomes articulated: 96
Number of short-term outcomes completed: 19 (20%)
Table 12. Pre-ETS Intensive TA Agreement Status
Figure 28. Progress of Pre-Employment Transition Services Activities (Completed 54%, In Progress 28%, Not started 19%)

Figure 29. Progress of Pre-Employment Transition Services Activities, by Gradation (Completed 54%, 90% done 3%, 75% done 10%, 50% done 7%, 25% done 7%, Not started 19%)
Short-Term Outcome 1: Increase in number of students with disabilities receiving at least one of five required services. All states with Pre-ETS ITA agreements demonstrated an increase, ranging from 0.7% to 1,111%. The TA Specialist Team notes that these figures are affected by
reporting capabilities across states, which are continuing to improve. As an example of data that can
be examined to measure these outcomes, sample reports of RSA data for the numbers of clients
receiving Pre-ETS services overall by quarter and by type of service are provided in Appendix E.
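The underlying calculation is a simple year-over-year percent change in counts of students served, as reported to RSA; the sketch below uses invented counts for two hypothetical states.

```python
# Percent change in students receiving at least one required Pre-ETS
# service, prior year vs. current year (all counts invented).
def percent_change(prior, current):
    return (current - prior) / prior * 100

states = {
    "State A": (1400, 1410),  # modest gain
    "State B": (90, 1090),    # large jump as reporting capability matures
}
changes = {s: percent_change(p, c) for s, (p, c) in states.items()}
```

Small baselines (as in hypothetical State B) produce very large percent increases, which is one reason reporting capability affects these figures.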
Short-Term Outcome 2: Agency will completely and accurately report on expenditure of 15% reserve. In the states with Pre-ETS ITA agreements, expenditures have increased, and states are successfully meeting carry-forward requirements and progressing in expending the current year’s requirement.
v. Section 511 progress
Eleven of the 32 intensive TA agreements cover Section 511 (34%), the same figure as last year. The logic model and menu of activities, outputs, and outcomes for this topic area was not refined in Year Three because all ITA activities have been completed. The Section 511 team activities included:
Providing technical assistance to agency staff to increase their knowledge of WIOA and
Section 511 requirements;
Strategic planning for implementation of Section 511 requirements. Strategic plans,
developed over several interactions through email and phone calls, lay out responsibilities
for different team members and timelines;
Developing and/or reviewing CC&I&R and self-advocacy resources and materials; and
Based on agency demand, assisting in the development of policies and procedures that will assure compliance with all Section 511 requirements, and evaluating 511 compliance.
The TA Specialist Team provided an update at the Annual September All-Team Meeting in
San Diego detailing the completion of all Section 511 activities and outputs and the achievement of
outcomes. Demonstrating impacts at a systems change level and a client outcome level, the Team
has conducted data summaries for every state examining changes in 14(c) certificate holders and subminimum wage participants as well. These data show 809 fewer 14(c) certificate holders (1,761 remaining) between January 2016 and April 2018, and between 80,345 and 123,212 fewer subminimum wage participants (243,684 originally; the final number depends on corrections to the publicly available data). In the future, the Team notes that continued reductions in youth entering
subminimum wage could be examined, though there are data challenges to overcome. In connection to other WINTAC areas, an increase in competitive, integrated employment (CIE) could be examined further, as could access to customized employment to achieve CIE outcomes.
Figure 30. Progress of Subminimum Wage Activities (Completed 81%, In Progress 19%)

Figure 31. Progress of Subminimum Wage Activities, by Gradation (Completed 81%, 90% done 6%, 50% done 13%)
vi. Competitive Integrated Employment
Twenty-four of the 32 intensive TA agreements cover CIE (75%), up from 14 last year. The logic model, or menu of activities, outputs, and outcomes, for CIE overall was not altered this year. Instead, because of the expansive increase in Customized Employment pilots, the logic model for Customized Employment was refined, and a comprehensive, customized framework for its evaluation has been developed and its implementation begun. These are reported further below. Here, a brief review of CIE is provided. The CIE team assists SVRAs through the following types of activities:
- Reviewing and/or developing policies and procedures for CE, and developing long-range plans to build the capacity to implement CE;
- Providing training on supported employment (SE) to VR staff and CRPs to assess when SE should be provided and to develop capacity to provide SE;
- Providing information to SVRAs on customized employment (CE) models and rates as a foundation for the development of their model and rate structure for the provision of CE;
- Conducting a survey of CRP and in-house staff to determine CE experience and capacity; and
- Expanding integrated Business Engagement to include Customized and Supported Employment (CE & SE) strategies and LMI training for counselors, with development of plans to expand and sustain LMI capacity and of agency policy and procedures regarding LMI training.
CIE Intensive TA Agreement Status:
Number of intensive intervention sites: 24
Number of activities articulated: 63
Activity progress on average: 51%
Number of outputs completed: 21 (33%)
Number of short-term outcomes articulated: 52
Number of short-term outcomes completed: 4 (8%)
Table 13: CIE Intensive TA Agreement Status
Figure 32. Progress of Competitive, Integrated Employment Activities (Completed 33%, In Progress 35%, Not started 32%)

Figure 33. Progress of Competitive, Integrated Employment Activities, by Gradation (Completed 33%, 90% done 5%, 75% done 10%, 50% done 3%, 25% done 17%, Not started 32%)
vii. Workforce Integration
Fifteen of the 32 ITA agreements cover Workforce Integration (47%), up from 9 last year.
The logic model for Workforce Integration, as refined in the 4th quarter of Year Three, is presented
below. While not fixed or static, the current menu delineates 12 key activities, concomitant outputs
and 5 resulting short-term outcomes.
Activity: Assist agency to analyze readiness and capacity to increase integration with the workforce development system
Output: A completed readiness and capacity assessment documenting implementation and role challenges and opportunities

Activity: Assist agency in completing the WIOA “Are You Ready for Integration?” checklist
Output: Completed checklist

Activity: Assist agency in individual agency self-assessments regarding level of workforce system integration
Output: Self-assessments conducted, baseline level of integration determined; Integration Continuum Report provided

Activity: Assist agency and partner programs with a system-wide self-assessment regarding level of workforce system integration
Output: Report on assessment and priorities; initial action plan with collaborative process developed

Activity: Assist the agency in assessing its current utilization of career pathways in the IPE development process and the resources needed to increase the use of career pathways in future planning with consumers
Output: A completed assessment with tools and resources provided to agency staff

Activity: Assist in the development of new or revised policies and procedures that will increase the use of career pathways in the vocational planning and service delivery process for VR consumers
Output: Draft of new and/or revised policies and procedures related to career pathways

Activity: Assist agency to identify resources that will promote and increase the use of apprenticeships in the VR service delivery system
Output: The information and resources provided to the agency

Activity: Assist in the development of new or revised policies and procedures to implement apprenticeship opportunities
Output: Draft of new and/or revised policies and procedures related to apprenticeships

Activity: Assess the capacity of the VR agency and core partners to utilize Integrated Resource Teams (IRTs) to serve individuals with disabilities in the AJCs
Output: A completed assessment identifying challenges to utilizing IRTs and recommendations on how to implement IRTs

Activity: Assist the VR program and core WDS partners to develop a pilot site that will utilize IRTs to serve individuals with disabilities in the AJC
Output: The establishment of a pilot site for IRT implementation

Activity: Provide training to VR (and any other partners) on (name activity focus like career pathways, apprenticeships, IRTs, etc.)
Output: The number of individuals that complete the training, by agency, and the content delivered

Activity: Assist VR and other core and required partners to develop MOUs, cost-sharing agreements, data sharing agreements, IFAs, and local MOUs with Boards as needed
Output: A draft agreement or MOU
Short-Term Outcomes:
- There will be an increase of at least one level on the Integration Continuum Scale for VR in relation to at least one of the core partners per year until the integration level is achieved, at which point it will remain in subsequent years.
- An increase of at least (x number) of events or other demonstrations of collaborative partnerships between VR and the core partners per year.
- There will be an increase of at least (x percent) in co-enrollment of VR consumers with at least one core partner program per year.
- There will be an increase of (x percent or number) in the number of VR consumers that obtain employment through accessing the services of the AJC per year.
- The number of consumers that utilize the career pathway model in pursuit of their education and employment goals will increase by at least (x percent) per year.

Table 14. Workforce Integration progress
Workforce Integration Intensive TA Agreement Status:
Number of intensive intervention sites: 15
Number of activities articulated: 43
Activity progress on average: 23%
Number of outputs completed: 4 (9%)
Number of short-term outcomes articulated: 32
Number of short-term outcomes completed: 0 (0%)
Table 15. Workforce Integration Intensive TA Agreement Status
Figure 34. Progress of Integration of VR into WDS Activities (Completed 12%, In Progress 33%, Not started 56%)

Figure 35. Progress of Integration of VR into WDS Activities, by Gradation (Completed 12%, 90% done 2%, 50% done 7%, 25% done 23%, Not started 56%)
The topic area of Integration is another one for which WINTAC has taken on the effort of
bringing together a diverse array of resources and approaches to create a common, model approach
to implementation. In Year One, WINTAC TA Specialists conducted a comprehensive literature and
practice review and developed a measurement tool to assess the nature (or lack) of integration across
entities. In Year Two, the WINTAC Evaluation Team worked closely with the TA Specialists to
69
create and begin implementation of a pilot testing framework of the tool to receive feedback and
identify implementation challenges. In Year Three, this process continued and was completed. The tool is
now live and Year 4 will result in collection of data from VR agencies and other entities in the
workforce development system across the country that can be analyzed for patterns in performance
and established as a baseline against which agencies can measure future progress towards greater
integration. Further details are provided in the next section on WIPPS and Special Projects.
viii. Common Performance Measures
Twenty-three of the 32 intensive TA agreements cover Common Performance Measures
(CPMs) (72%), up from 14 last year. The logic model for Common Performance Measures, as
refined in the 4th quarter of Year Three, is presented below. While not fixed or static, the current
menu delineates 6 key activities, concomitant outputs and 3 resulting short-term outcomes.
Activity: Analyze VR agency readiness and capacity to collect and report CPMs
Output: Completed assessment with identified implementation strategies and timelines necessary to accurately track and report the RSA-911 data elements and the Common Performance Measures

Activity: Review and/or develop/revise as needed policies and procedures for the tracking and reporting of 911 data elements and CPMs (including internal controls)
Output: Completed draft policies and procedures for tracking and reporting (including internal controls)

Activity: Review, develop and/or revise, as needed, internal controls necessary for WIOA, RSA-911, and CPMs
Output: Completed draft internal controls written procedures

Activity: Review and/or develop/revise as needed work performance standards for staff evaluation reflecting CPMs
Output: Completed draft policies and procedures for staff evaluations

Activity: Assist VR leadership in the development of a program improvement plan in response to the transition to the CPMs
Output: Completed plan with identified future implementation needs, as they relate to Common Performance Measures

Activity: Provide training to VR (and any other partners) on (name activity focus like career pathways, apprenticeships, IRTs, etc.)
Output: Training provided to agency staff with regard to CPM and 911 definitions and connections between 911 quarterly reports, CPM requirements, and impacts on the rehabilitation process and systems
Short-Term Outcomes:
- SVRA accurately reports the RSA-911 data elements necessary for the common performance measures.
- SVRA successfully collects and reports data required for baseline-year information and future target negotiations.
- SVRA successfully meets or exceeds Common Performance Measure targets.

Table 16. Common Performance Measures progress
CPM Intensive TA Agreement Status:
Number of intensive intervention sites: 23
Number of activities articulated: 100
Activity progress on average: 33%
Number of outputs completed: 17 (17%)
Number of short-term outcomes articulated: 54
Number of short-term outcomes completed: 2 (4%)
Table 17. Common Performance Measures Intensive TA Agreement Status
Figure 36. Progress of Common Performance Measures Activities (Completed 16%, In Progress 40%, Not started 44%)

Figure 37. Progress of Common Performance Measures Activities, by Gradation (Completed 16%, 90% done 2%, 75% done 6%, 50% done 12%, 25% done 20%, Not started 44%)

ix. Summary
WINTAC is implementing 21 intensive TA agreements signed with 23 SVRAs. Eighty-three percent of the agreed-upon activities have been initiated, and close to a third are complete. Evaluation of activities over the course of Year 2 shows steady incremental progress towards achieving the short-term outcomes for each topic area. In most cases, SVRAs and WINTAC topic area teams are still collecting the data that will be used to determine whether intended outcomes have been met. The preliminary data and information available show that SVRAs are taking targeted steps to fill areas where they are lagging, such as developing targeted spending plans and improving agency capacity and resources to achieve expected outcomes.
Progress metrics are summarized in aggregate in the figure below. Overall, there is strong progress in several areas, ensuring that preliminary examinations of impact will be possible in the final years of the funding cycle. As with most of the states, Pre-ETS and Section 511 are the most progressed areas. Given the timing of WIOA implementation and its regulations coming into force, it is to be anticipated that these topic areas would be the agencies’ first priorities.
Intensive TA Activities Progress (Completed / In Progress / Not started):
Pre-Employment Transition Services: 54% / 28% / 19%
Subminimum Wage: 81% / 19% / 0%
Competitive, Integrated Employment: 33% / 35% / 32%
Integration of VR into WDS: 12% / 33% / 56%
Common Performance Measures: 16% / 40% / 44%
All: 38% / 32% / 30%
Figure 38. Intensive TA Activities Progress
Intensive TA Activities Progress, by Gradation (Completed / 90% / 75% / 50% / 25% / Not started):
Pre-Employment Transition Services: 54% / 3% / 10% / 7% / 7% / 19%
Subminimum Wage: 81% / 6% / 0% / 13% / 0% / 0%
Competitive, Integrated Employment: 33% / 5% / 10% / 3% / 17% / 32%
Integration of VR into WDS: 12% / 2% / 0% / 7% / 23% / 56%
Common Performance Measures: 16% / 2% / 6% / 12% / 20% / 44%
All: 38% / 3% / 7% / 8% / 13% / 30%
Figure 39. Intensive TA Activities Progress
VIII. Workforce Innovation Pilot Projects (WIPPs) and Special Projects
A. The Career Index Plus
WINTAC is offering all SVRAs and their workforce partners The Career Index Plus (TCI+) labor market information (LMI) system to facilitate their integration into the workforce development system. The Career Index Plus adds significant capabilities and functionality to the basic, free Career Index site, which provides easy, convenient, and fast access to best-of-breed labor market, occupational, job-opening, and training-provider information. TCI+ is currently an element of ITAs with five states.
The TCI+ team has developed and conducted trainings with SVRAs on how to implement TCI+. In Year 3, trainees gained access to online, accessible, self-paced trainings that allow testing for CRC credits. These tests will also serve as evaluation data to inform post-training knowledge gain, which may serve as a predictor of effective TCI+ use and, consequently, improved outcomes for clients. Evaluation of TCI+ continues to develop, but preliminary reviews of usage by VRCs in SVRAs that have undergone training suggest that a small minority of users engage heavily with the system, many engage periodically, and many engage once. It is critical to emphasize that this early pattern may not be a reliable estimate; rather, it is indicative of exploratory use of a newly available resource. At this time, there are over 11,000
registered users. Users include VRCs, but in some states, also include clients who are encouraged to
use TCI+ to engage in job exploration activities and procure labor market information that can guide
their planning and training choices. To date, the TA Team reports that TCI+ has conducted 52 in-person trainings involving 887 staff and 85 training webinars for pilot states. An additional 17 in-person trainings involving 200+ staff, plus 15 more webinars, are scheduled before the end of 2018, for a total of 69 in-person training sessions with almost 1,100 staff.
Future evaluation of ITAs and pilot efforts will specifically focus on SVRA adoption of
TCI+ and the interaction between TCI+ and other topic areas. These more systematic initiatives will
be evaluated to determine patterns and predictors of effective usage and the relation between
effective engagement with TCI+ and improved client outcomes.
As more SVRAs adopt TCI+, additional impact evaluation opportunities will arise, such as
comparing various outcomes between “TCI+ implementing” and “non-TCI+ implementing” sites.
These outcomes can be specifically related to other topic areas as noted above (e.g., level of
integration). A set of common outcome measures that will always be tracked includes mid- and long-term employment outcomes achieved by clients. Shorter-term outcomes will involve VR service improvements such as increased efficiency of the job development process; increased client engagement, responsiveness, and satisfaction; and an increased number of job matches and job placements. RSA data can be examined to assess credential attainment and other outcomes of
interest (see Appendix F for examples of these data available from RSA). Currently, data can be
examined statewide. Even with a few, or just one, SVRA adopting TCI+ more systematically, an
evaluation plan that uses a within-groups design is possible to examine impact through the use of
pre-test, post-test measures. On certain client outcomes, more stable estimates of “pre-test” scores can be obtained by using a 3-5 year average prior to intervention. Given the pattern of TCI+ use described above, however, being able to obtain data by individual users (and, for VRCs, to link to their client outcomes) will allow for a more effective impact analysis of TCI+. To approximate or supplement this approach, discussions between the TA Team Specialists and the Evaluation Team include consideration of brief surveys for users at log-on and log-off from the system.
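A minimal sketch of the within-groups design described above, stabilizing the pre-test estimate with a multi-year average; the outcome measure and all numbers are invented for illustration.

```python
# Pre/post comparison against a multi-year pre-intervention baseline.
def pre_post_change(pre_years, post_value):
    """Percent change from the average of the pre-intervention years."""
    baseline = sum(pre_years) / len(pre_years)
    return (post_value - baseline) / baseline * 100

# Hypothetical annual job placements: 4 years before TCI+ adoption,
# then the first full post-adoption year.
pre_placements = [480, 510, 495, 515]
post_placements = 560
change = pre_post_change(pre_placements, post_placements)
```

Averaging several pre-intervention years dampens the year-to-year noise that a single baseline year would carry into the estimate.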
B. SARA
The second WIPP is focused on facilitating SVRA transition to the common performance
measures and the need to share common data elements among Workforce Development partner
agencies under Section 116 of WIOA. SARA is a client engagement and communications system that automatically gathers needed information at the right time from consumers and providers without staff intervention. Using artificial intelligence (AI) and natural language processing, it can engage in intelligent, two-way communications with consumers and third parties via SMS (texting), email, and IVR (interactive voice response). All interactions result in detailed case notes.
SARA’s AI engine can be easily taught to collect any kind of information a human could collect. It
undertakes structured interviews with clients and providers to determine progress, barriers and
milestones reached, and can make basic decisions accordingly. As noted earlier, SARA is demonstrating response rates of over 60% on client satisfaction surveys, and even higher rates for routine engagement with clients.
In Year 1, WINTAC focused on building SVRA interest in SARA through demonstrations,
holding 14 demonstrations for Nevada Combined and Workforce partners, Arkansas General,
Hawaii Combined, North Carolina General, Washington General, Pennsylvania Combined,
Pennsylvania Combined, Michigan General, Hawaii Combined and Adult Ed, Alaska Combined,
and Washington Blind. Some agencies received multiple demonstrations. Currently, SARA is an
element of ITAs with three different states: Alaska, Kentucky, and Nevada.
Over the course of Year 2, discussions began regarding evaluation plans for SARA to
systematically document its impact, which are reviewed here. Similar to the discussion above for
TCI+, short-, mid-, and long-term outcomes would be documented spanning the continuum from
system usage to client outcomes: increased efficiency of the job development process; increased client engagement, responsiveness, and satisfaction; an increased number of job matches and job placements; increased time spent on direct client interaction, follow-up, and services, with improved responsiveness to clients; and increased continuous quality improvement as managers work with underperforming staff and teams share strategies. Some of these outcomes overlap with TCI+ and reflect what is expected
when VRCs have their time freed by SARA to focus on improved client services to improve
employment attainment and retention.
Also similar to the impact evaluation possibilities for TCI+, comparing various outcomes
between “SARA implementing” and “non-SARA implementing” sites will be possible and valuable.
In the states where ITAs are present and an explicit initiative to implement SARA exists, an
evaluation plan that uses a within-groups design is being developed to utilize pre-test, post-test
measures as possible. A further set of “SARA-implementing” and “non-SARA implementing”
comparisons may also be possible in states with ITAs, if SARA is implemented only at certain sites.
For example, in Kentucky SARA is being implemented in one of 10 workforce regions: TENCO.
Kentucky also exemplifies the type of evaluation conundrum that arises from employing only a “within-groups” or “pre-test/post-test” study design: when other changes take place during the intervention implementation phase (in our case, the SARA implementation phase), it is not possible to determine how outcomes are affected by these “other” changes. In Kentucky, these “other changes” include an over 40% reduction in the VRC workforce and a simultaneous new mandate from the Governor to conduct outreach and follow-up to thousands of additional clients. Examining VRC client management outcomes and client employment outcomes between TENCO and Kentucky’s other nine workforce areas will allow one to understand the impact of these “other” changes (changes that are significantly relevant to the outcomes of interest for WINTAC) and “partial” out their effects to reveal the true impact of SARA.
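The comparison described here amounts to a difference-in-differences estimate; the following sketch shows the arithmetic with invented numbers (not actual Kentucky data).

```python
# Difference-in-differences: change in the SARA region (TENCO) net of
# the change in comparison regions exposed to the same "other" changes.
def diff_in_diff(treat_pre, treat_post, control_pre, control_post):
    return (treat_post - treat_pre) - (control_post - control_pre)

# Hypothetical employment outcomes per 100 closed cases
effect = diff_in_diff(
    treat_pre=40.0, treat_post=46.0,     # TENCO, before/after SARA
    control_pre=41.0, control_post=38.0, # other nine regions
)
# The comparison regions' decline (-3) is netted out of TENCO's gain (+6).
```

Under this design, statewide shocks such as the workforce reduction affect both groups and are subtracted away, isolating the intervention's contribution.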
A general plan of evaluation for SARA (and TCI+) can be outlined as follows in Figure 40.
First, examine the effective use of SARA by VRCs; if SARA is not being utilized (or not utilized
sufficiently or effectively), it cannot lead to improved downstream outcomes with clients. Barriers to
its usage need to be understood to facilitate continuous quality improvement in services and to
effectuate the intent of SARA as an innovation. Next, understand the intermediate or "process"
outcomes derived from SARA usage that should lead to the ultimate outcomes of interest: how
SARA is changing VRC case management and client-VRC interactions. This understanding also
serves a continuous quality improvement function, as SARA can be adapted to emphasize those
functions and utilities that serve as levers for improved client outcomes, while extraneous aspects
of the system need not be prioritized by SVRAs for VRCs using the system. Finally, of course,
there is the important last step of understanding the client outcomes that derive from SARA
as an intervention.
Figure 40. Conceptual Framework for SARA Impact Evaluation
The possibilities extend further; additional evaluation activities could examine SARA
implementation from a cost-benefit perspective, or could examine its impact in states with order of
selection where resources are constrained and an efficient client management tool could be very
beneficial. As with TCI+, SARA should also improve integration outcomes by freeing up VRC time
to focus on interacting with other agencies to benefit their clients.
Importantly, SARA can become an instrument of change management. SVRAs adopting
SARA are doing so to facilitate improved VRC client engagement and service provision. While
SARA can help manage some aspects of case management and client engagement, its value lies in
its ability to allow the VRC to focus on client interaction as a part of improving counseling and
guidance efforts that lead a client to employment outcomes. SVRAs adopting SARA are thus
adopting a shift in service provision in many ways. As seen from extant literature (and real-world
experience), change management is not always easy or effective. Even when all parties agree that a
goal and a shift in priorities are important, effectuating the shift can be difficult. Thus, another
possible aspect of the evaluation, now in the planning stages, involves the use of an intervention
strategy to improve effective SARA implementation and use. SARA use (follow-up with clients) should
become a habit. Charles Duhigg wrote in "The Power of Habit" that a habit occurs when three
key components are present: a cue, a response, and a reward. How do we use this to improve SARA usage
and see the benefits that we expect? We need this to happen with three sets of parties: (1) VRCs,
(2) Managers, and (3) Administrators.
When SARA was rolled out to Alaska, Nevada, and Kentucky, it became apparent that not all
VRCs were acting on “alerts” the system provides to let them know that it was time to re-engage
with a client. It was also unclear whether managers were reading reports they receive from SARA
regarding VRC system usage and alert rates. In order for SVRA administrators and managers to
effectively manage SARA implementation by VRCs, they need to review reports and check in with
VRCs when the data demonstrates a lower than desired level of usage. After WINTAC Evaluation
Team and SARA Implementation Team discussions, SARA was modified to track whether managers
were reading reports. This was an immediate change resulting from a continuous quality
improvement process, but also facilitates additional data collection for a broader evaluation effort.
Now that usage by all parties can be tracked, the next step is to conduct an intervention aimed at
improving administrator, manager, and VRC use of the system.
The proposed intervention would involve each party (VRCs, Managers, Administrators)
getting reports and being asked to respond to those reports. All parties could see icons indicating
their report was sent and read by supervisors. In Duhigg’s schema, this would represent the “cue.”
The system would then "ask" for a recorded response to provide to each other: supervisors check in
with supervisees to offer strategies and support, and supervisees explain their performance (e.g., "I was
away at a conference, so alerts built up."). These documented explanations for performance status
represent the “response” and serve to draw attention to the levels and types of VRC performance
regularly and repeatedly. All parties will become accustomed to evaluating and planning for action.
The final step is to build in a “reward” – preferably an intrinsic one rather than an extrinsic one.
Reports through SARA would be constructed to situate performance benchmarked against
acceptable ranges for key measures for a given type of agency. Then, on a regular schedule (set by
the agency), a report would be sent to each frontline staff member in the system. For each measure,
it would graphically show them where they lie within (or outside) that range, where they lie in
comparison to the rest of the staff, and ask for a comment. The comment can be dynamic (i.e., if they
are outside the range of expected performance, we can ask why they are outside and what they plan
to do to get back inside; if performance is well inside recommended parameters, we can ask to what
they attribute their success). SARA can track non-response and send reminders every two days, with
an alert to the manager after five consecutive non-responses. When the staff member hits "submit,"
the report is sent to the manager, who will have a corresponding comment box for responding to
each staff comment, which SARA can also track. SARA can send reminders to
managers in the same way as with frontline staff. Managers would also get a report staggered by one
week. It would graphically show where their division falls in the acceptable ranges as well as
compared to the rest of the agency with comment boxes for each. In addition, it would list each of
their staff with a marker indicating whether a response has been received and an encouragement to
respond to each as well as a link that allows them to send a message to non-responding
staff. Similarly, SARA will also submit reports to administrators, staggered by one week from
managers. This will be the same report as managers get, except it will show information for the
entire agency with links to drill down all the way to frontline staff. The data will be saved to a
reporting table for analysis. As the evaluation evolves, response options will be standardized for as
many elements as possible to drop downs or radio buttons to ensure efficiency for SVRA staff.
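The reminder-and-escalation rule described above (a reminder every two days while a report goes unanswered, and a manager alert after five consecutive non-responses) can be sketched as follows. The function and class names are hypothetical illustrations, not SARA's actual interface.

```python
# Illustrative sketch of the reminder-escalation rule described in the text.
from dataclasses import dataclass

REMINDER_INTERVAL_DAYS = 2
ESCALATION_THRESHOLD = 5  # consecutive non-responses before manager alert

@dataclass
class ReportStatus:
    days_since_sent: int = 0
    consecutive_nonresponses: int = 0
    responded: bool = False

def tick_day(status: ReportStatus) -> list[str]:
    """Advance one day and return any notifications to emit."""
    actions = []
    if status.responded:
        return actions
    status.days_since_sent += 1
    if status.days_since_sent % REMINDER_INTERVAL_DAYS == 0:
        status.consecutive_nonresponses += 1
        actions.append("remind_staff")
        if status.consecutive_nonresponses >= ESCALATION_THRESHOLD:
            actions.append("alert_manager")
    return actions

# Simulate ten days with no staff response
status = ReportStatus()
log = [a for day in range(10) for a in tick_day(status)]
print(log)
```

Over ten unanswered days this yields five staff reminders, with the manager alerted on the fifth, matching the cadence described in the text.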
In Year Three, discussions on the above plans continued, and memos were developed
outlining evaluation possibilities for Kentucky and Nevada officials to review and consider. Ongoing
changes with governance, shifting state priorities, and initial implementation efforts delayed
evaluation implementation. Further discussions with WINTAC Leadership elucidated a priority for
focusing on SARA’s utility in improving collection of Common Performance Measures and this will
be a focus of evaluation efforts early in Year Four. Implementation of SARA is a significant
undertaking in any state, and documenting the systems changes achieved by the effort is important,
as those changes themselves reflect the sort of systems change WIOA
seeks to achieve. For example, MOUs and data sharing agreements have been established with the
assistance of the WINTAC SARA TA Team between VR and other agencies, ensuring more
effective CPM reporting and VR integration into the workforce development system. These impacts
are no small feat: Kentucky took seven months to finalize its data sharing agreements and was
the fastest; Alaska just finalized its agreement last month; and Nevada's is underway but
still pending. The TA Team reports that it was originally anticipated that SARA would engage with
15,000 clients on behalf of approximately 150 users in the pilot areas. As of today, SARA has
engaged with approximately 30,000 clients and has 1,000+ staff users in the pilot areas.
C. Peer Mentoring
The third WIPP has been developed in response to topic areas 1-3 and includes the
development and use of peer mentoring networks for young people with disabilities to help them
transition from secondary education to postsecondary education and employment through the power
and influence of high expectations, self-determination and the development of self-advocacy skills.
Research has shown that mentors, especially peer mentors, can positively affect the movement of
individuals with disabilities towards self-sufficiency through the establishment of high expectations,
support, and empowerment. The WINTAC has currently established pilot sites/ITAs involving peer
mentoring in four states.
The evaluation planning for peer mentoring is underway and involves both formative aspects
and summative aspects. From a formative perspective, the evaluation can examine the structure,
number, nature, and functionality of the peer networks. In addition, it can examine which
evidence-based approaches to implementation are being used, such as mentor matching (by
disability, work experience/interest), setting high expectations, and goal-setting strategies.
The summative aspect can of course examine changes to self-confidence and value, general
and career self-efficacy, self-determination, agency, self-advocacy, self-sufficiency, an increase in
persistence towards goals, academic success for youth in transition, and achievement (progress
towards substantive goals set, e.g., employment).
It is possible to implement a randomized-controlled trial or experimental evaluation of peer
mentoring and its impacts as well. An example would involve identifying a large group of youth in
each SVRA who are interested in participating, then randomly assigning (matched pairs) some to
peer mentoring groups and some to a wait-list for a period of time before they are assigned in as
well. Alternatively, "control" group participants could be assigned to a peer mentoring group, but
given no structure or peer mentoring training. This would allow for a "business-as-usual" versus
“model/intervention” comparison. Additional groups could be added if groups (networks, regions, or
states) wish to implement more than one model allowing for a test of not just one but multiple
models of peer mentoring. Outcomes assessed would include the summative outcomes outlined
above.
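The matched-pairs random assignment described above can be sketched in a few lines. The participant records and the matching variable (age) are illustrative assumptions; in practice, matching could use multiple characteristics.

```python
# Sketch of matched-pairs randomization: pair interested youth on a
# matching variable, then randomly assign one member of each pair to
# peer mentoring and the other to the wait-list.
import random

def matched_pair_assign(participants, key, seed=0):
    """Sort by the matching variable, form adjacent pairs, and flip a
    coin within each pair."""
    rng = random.Random(seed)
    ordered = sorted(participants, key=key)
    mentoring, waitlist = [], []
    for i in range(0, len(ordered) - 1, 2):
        a, b = ordered[i], ordered[i + 1]
        if rng.random() < 0.5:
            mentoring.append(a); waitlist.append(b)
        else:
            mentoring.append(b); waitlist.append(a)
    return mentoring, waitlist

# Hypothetical pool of interested youth
youth = [{"id": n, "age": age} for n, age in
         [(1, 16), (2, 18), (3, 17), (4, 16), (5, 19), (6, 18)]]
mentoring, waitlist = matched_pair_assign(youth, key=lambda y: y["age"])
print(len(mentoring), len(waitlist))  # 3 3
```

Pairing before randomizing keeps the two groups balanced on the matching variable, which strengthens the comparison when the wait-list group later enters mentoring.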
In Year Three, plans to evaluate peer mentoring followed the approach articulated above and
implementation sites were engaged for discussions. Initial steps have involved identifying the
models of peer mentoring being implemented across the sites while outlining a common framework
for their evaluation where possible. These are described immediately below. Based on discussions with
WINTAC Leadership, a key priority is identifying systems changes by SVRAs to adopt and
implement Peer Mentoring services, in addition to the client outcomes that are the ultimate long-
term outcomes for all WINTAC services.
In Year Three, a literature review was conducted to identify the types of models of peer
mentoring identified in the past and the constructs of importance to consider in determining
successful models and interim changes that further lead to successful individual outcomes. Table 18
below results from the literature review and suggests the data collection constructs that can be
employed in the evaluation. A documentation of existing models across the three key pilots was also
conducted and yielded the following classification. (1) A Statewide Certification Model: Florida’s
model includes training and certifying peer mentors across the state. (2) University Models – Two
Types: A. Recruitment (AK): helping students to be successful on campus when they first start.
B. Recruitment AND Retention (MS): assisting students who are referred by VR when they start on
campus, but also referring students to VR and the peer mentor program when students on campus
(even in advanced years of matriculation) are identified as struggling (for example: on leave or in
danger of going on leave and needing to get back on track). Mississippi is tracking clients and
interested in linking participation to outcomes.
Table 18. Summary Literature Review - Models of Peer Mentoring & Successful Individual Outcomes
Pre-Training Baseline Info from State/Setting: Prior Training/Initiatives; Characteristics of Program (3 models currently); Characteristics of Mentor Population; Characteristics of Mentee Population; Mentor Expectations; Mentor Confidence
Post-Training Evaluation: Post-training Quizzes, Satisfaction; Mentor Expectations; Mentor Confidence
Baseline Mentee Data: General & Career Self-Efficacy; Confidence, Expectations; Self-Determination, Agency; Self-Advocacy; Self-Sufficiency
Goals/Plans Data: Quant and Qual Analysis of Goals Set (e.g., type/nature; education goals aligned with career pathway preferences? enrollment? persistence?; work-based learning included?; career-relevant skills included?)
Goals Progress: Quant and Qual Analysis of Goals Progress (e.g., education goals aligned with career pathway preferences being met? enrollment/persistence; work-based learning experiences had?; career-relevant skills improving?)
Post-intervention Mentor Data: Mentor Expectations; Mentor Confidence; Mentor evaluation of mentees; Mentor-Mentee Match (also final evaluation of goals progress column)
Post-intervention Mentee Data: General & Career Self-Efficacy; Confidence, Expectations; Self-Determination, Agency; Self-Advocacy; Self-Sufficiency
Major Outcomes: Community Engagement (FL) – CPI?; Employment; Job Retention; Earnings
D. Integration Continuum
In Year 2, a specialized evaluation plan for Integration was developed to facilitate the
validation of a measurement tool to examine the nature and level of integration of VR into the
workforce development system (WDS). The WINTAC Evaluation Team worked with WINTAC
TA Specialists in this area to implement a three-phase rollout and testing of two tools: (1) a self-
assessment for VR to use to determine its integration with the rest of the workforce development
system and (2) an assessment for agencies across a state to use independently and as part of a
facilitation to determine overall integration.
PHASE 1 (ALPHA TEST): Review structure, format, appearance, wording. Starting: Tests 3/17/17, Revise 5/29/17. Ending: Tests 5/28/17, Revise 6/11/17.
PHASE 2 (BETA TEST): Review content, usability. Starting: Tests 6/12/17, Revise 7/10/17. Ending: Tests 7/9/17, Revise 7/30/17.
PHASE 3 (PILOT TEST): Review learning impact. Starting: Tests 7/31/17, Revise 9/18/17. Ending: Tests 9/17/17, Revise 9/30/17.
PHASE 4 (YR 1): Use, provide intensive TA, re-evaluate at end-of-year. (Test dates not specified.)
Figure 41. Timeline for Conducting Piloting of Integration Continuum Tool
The first phase was an “alpha” test where one small group of individuals was recruited
directly by WINTAC TA Specialists to do individual reviews of the tools (structured as online
surveys), followed by them coming together in a teleconference to review and share their
feedback. Based on their feedback, adjustments were made to the tools and next a “beta” test was
conducted where two small groups were recruited by CSAVR to participate via webinar in a
discussion of the tools after they had examined them individually to provide further feedback. At
the same time, the beta test also involved two SVRAs participating in onsite facilitated trainings
on the use of the tools and providing feedback to the facilitator, who works with the WINTAC TA
Specialist Team. Once again, based on the feedback received during this “beta” test, the tool was
refined.
Next, we conducted our “pilot” testing phase which began in the summer of 2017. During
this period, several (4+) SVRAs received facilitated training on the use of the tools onsite. The
process of having SVRAs conduct their own independent self-assessments and then review the
answers during a facilitated session provided some of the richest feedback regarding the tools
and also allowed for the WINTAC TA Specialists to gain a deeper understanding of how agency
staff understand and utilize the tools. Follow-up discussions of the feedback and changes to be
made will take place during the first quarter of Year 3 (this represents a slight delay in the
planned timeline as represented in Figure 41 above due to scheduling the facilitated sessions with
SVRAs), with the final version of integration continuum tools ready for use during the second
quarter. These tools can be used during the process of WINTAC TA regarding integration at
“baseline” or initiation of TA and again during regular intervals (depending on agency goals and
priorities the time period can vary, but minimally it is suggested the assessment be done
annually) to measure progress in integration.
"Expert" assessments of SVRA integration (by WINTAC TA Specialists engaging with
the SVRA to understand changes and developments that have been scored as progress on the
tool) will validate the tools. Additional external criteria that are useful both for validation purposes
and to demonstrate the connection between effective integration and client outcomes include
examining the relation between levels of integration and co-enrollment statistics; co-development
of a unified/combined state plan; holding an annual meeting with core partners to
review/update plans; using a common case management system with core partners; conducting a
process flow map/chart with core partners; effectiveness of the implementation of common
performance measures reporting; employment metrics; earning metrics; credentials metrics;
measurable skills gains; and levels of business engagement.
Though it took a bit longer than anticipated, the initial alpha, beta, and pilot testing of the
tool took place and the tool was revised based on feedback in Year Three. Plans to roll out the
tool also evolved over Year Three to ensure that data collected would be subject to further
validation. The tool is now online and available nationwide and as data rolls in during Year Four,
it will be automatically summarized and posted online. Individual agencies that take the
assessment will receive email reports of their agency’s scores. They can go to the online page
summarizing all data to benchmark themselves against national data. Agencies undertaking
greater efforts to integrate can continue to take the assessment and, year upon year, progress
towards greater integration can be tracked at individual and national levels. This can lead to the
ascertainment of a “state of the states” with respect to VR/WDS integration. We can also
compare states receiving intensive TA from WINTAC to those generally making progress on
their own and could examine the relationship of integration to VR’s location in their state agency
structure. Finally, as noted above, level of integration should also be impacted by, and impact,
progress in other WINTAC topic areas such as TCI+, SARA, and common performance
measures.
E. Customized Employment
Customized Employment is one of the WINTAC topic areas in which evaluation
planning discussions took place over Year 2 to go beyond tracking attainment of outcomes
outlined in ITAs. While that process will still take place, a “substantive” process and impact
evaluation is also possible to examine training and TA of customized employment in a deeper
way.
Broadly speaking, an evaluation of customized and integrated employment (CIE) can
examine an increase in implementing the essential elements of evidence-based practices/models
such as customized and supported employment, determine (and create instruments as needed to
measure) fidelity, and compare outcomes for clients prior to implementation of these practices
and models (as with other topical area evaluations, such an approach should aim for a more
stable estimate of data by examining three to five year averages).
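The multi-year averaging suggested above is simple to express; the rates below are hypothetical placeholders, not agency data.

```python
# Illustrative pre/post comparison using three-year averages, for a more
# stable estimate than single-year figures.
from statistics import mean

pre_years  = [0.38, 0.41, 0.40]   # three years before CE implementation
post_years = [0.44, 0.47, 0.46]   # three years after

change = mean(post_years) - mean(pre_years)
print(f"Change in average employment rate: {change:+.3f}")
```

Averaging over several years dampens year-to-year fluctuations (caseload mix, economic conditions) that can otherwise swamp the effect of a practice change.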
In a review of customized employment (CE), for example, Riesen, Morgan, and Griffin (2015)
noted the existence of 15 non-data studies and 10 studies with descriptive data only, identifying
a need for further evidence and evaluation.
Importantly, because of the WINTAC's much-needed aggregation of CE approaches by
training providers, and the development of the "Essential Elements" of implementation, a clear
opportunity exists for examining the fidelity of CE implementation. In Year Two, specific
outcomes of CE identified included: increased exploration of jobs with the individual; increased
work with employers to facilitate placement, including job customization; increased
development of job duties, schedule, job arrangement, supervision and location; increased
representation of client or professional chosen by client in working with employer; and increased
provision of supports and services at job placement. Training and TA of SVRA staff and
community rehabilitation professionals should lead to increases in: trained agency and provider
staff and, ultimately, the number of CE providers; involvement and resources of
community, agency, and workforce partners; the number of job placements, as noted above, and the
satisfaction of individuals with the employment outcome (which should support the next
outcome, retention); and job retention.
In Year Three, a more comprehensive evaluation planning framework was established
and implementation began. Below, we first present a review of WINTAC's CE Pilot Initiative
and its overall status, approach, activities and outputs, and the impact on the adoption of CE
nationally. Next, we review the specific pilots; experience to date including their activities and
outputs, pilot outcomes, anecdotal and unforeseen impacts, and challenges and solutions.
i. WINTAC’s CE Pilot Initiative
Overall Status. At the end of the third program year, WINTAC is actively working with
8 state agencies that are in various stages of implementing CE. In each of the pilot states,
agencies and providers are serving consumers. Agencies are experiencing unprecedented levels
of integration with partner agencies and braiding of resources. In these states, VR is taking the
lead in introducing or expanding CE. Those agencies are:
o California-C
o Idaho-G
o Colorado-C
o Louisiana-C
o Montana-C
o Minnesota-G
o Nevada-C
o South Carolina-B
Currently WINTAC is engaged in discussion with the following state agencies that have
expressed a desire to implement CE:
o Michigan (G and B)
o Minnesota-B
o Missouri-(G and B)
o Rhode Island-C
o District of Columbia
o Arizona-C
o North Carolina-G
o Oklahoma-C
o Arkansas-B
Approach. WINTAC assists these sites by structuring pilots that offer a platform for (1)
understanding CE implementation options in the VR context, (2) promoting sustainability
measures and evaluation to justify expansion or full adoption, and (3) shaping next steps.
WINTAC’s emphasis on evaluation and sustainability is based on the following assumptions:
Evaluation: An evaluation plan will help demonstrate impact and justify future investment,
expansion, or full adoption after the temporary supports go away. Pilots and other new
initiatives often fail because an evaluation plan was not in place from the
outset. For example, Montana observed: "We're not very good at closing the loop. We never
know if it was any good because we don’t have any data.”
Sustainability: Pilots are often able to achieve great results in an “incubator” setting but not
sustain them once the initial infusion of resources and supports has expired. For example,
staff turnover is a known challenge in the VR industry, and impacts the continuity and
quality of services in general. Without a plan and resources to compensate for this known
risk, quality services cannot be sustained, especially where continuity is so important, as with
the CE process.
Other features of WINTAC’s pilot approach include:
Facilitating pilot planning via logic models and/or roadmaps based on feasibility, readiness
and risk assessment, impact measurement, creative design and resource allocation, project
management and accountability.
Supporting consensus-building around standards for what CE is and is not, building capacity
to deliver CE with fidelity and assess quality and competence. This not only creates a pool of
qualified practitioners, but also enables VR to say to them, “Does your model look like this?”
and “Show me how it is up to these standards.”
Engaging other WINTAC teams and other TACs as needed. For example, the CE TA team
collaborates with the PreETS and TCI+ teams in sites where there is overlap in needs and
strategies. Also, WINTAC is moving toward formalizing a partnership with the Y-TAC and
NTACT to provide TA to agencies seeking to provide CE to in- and out-of-school youth.
TA Activities and Outputs. WINTAC’s TA model for CE pilots consists of a sequence of
activities (guided by the CE Pilot Roadmap tool) for the state agency and partners to:
Learn about CE
Understand and commit to the levels of investment and partnership required for
implementation, capacity-building and sustainability
Design, execute and evaluate a CE implementation project that meets the state’s needs
o Supporting these activities is a series of tools, developed by WINTAC, that have
been well-received by sites considering a CE pilot. For example, at South
Dakota’s CE planning meeting stakeholders saw what a commitment it was going
to be to undertake this pilot, but when they pulled up the Roadmap they saw
WINTAC had done a lot of the fundamental work for them so they could move
forward in a calculated way and didn’t have to reinvent the wheel. Among the
tools developed to date are:
CE Roadmap for Implementation
CE Overview/Orientation for VR agencies and partners (day-long session)
Discovery/Pre-ETS Cross Walk (revised draft under review by RSA)
CE Milestone Fee Structure
CE Policy Template to guide the development of policies and practices
CE template for Request of Proposals for training
Impact on the Adoption of CE Nationally. As a result of WINTAC’s actions and
leadership relating to CE, the following observations can be made regarding national impact:
o Where there had been a patchwork of interpretations as to what constitutes CE, there is
now national consensus and there are measurable parameters, competency standards and
certification protocols for its application and implementation. “The Essential Elements of
Customized Employment” continues to be a valuable and widely promulgated document
that serves as an anchor for consensus on what CE is. It is used to develop training,
evaluation and certification systems. Most recently, WINTAC has worked with the
Association of Community Rehabilitation Educators (ACRE) to develop basic and
professional level CE Certification. The Essential Elements of Customized Employment
served as the foundation of both certificates.
o The CE pilots have demonstrated the power of building consensus and translating it into
action. One pilot site leader observed, “The pilot has already jump started a consensus
among stakeholders on what CE is. We are now grounded. WINTAC gave us a
foundation and we are well on the road to creating a system of training and support that is
sustainable and continuous.”
o Where there had been a handful of state agencies attempting to implement – and
struggling to sustain – CE prior to WIOA, and more agencies skeptical about CE even
after WIOA, WINTAC has put in place a framework to support and advance CE adoption
through a process of establishing its feasibility, testing it, incrementally expanding it and
planning to invest in taking it to scale. WINTAC is now providing intensive technical
assistance to 8 state agencies in various stages of CE implementation, and anticipates
supporting as many as 20 agencies implementing CE by the end of the grant in 2020.
VR is increasingly coming to the table with all stakeholders, saying “we have
employment options for people with the most significant disabilities,” and in some instances
taking the lead in system-wide implementation of CE.
ii. CE Pilots: Experience to Date
An overall evaluation framework, including sample outcomes, has been drafted for each
site to use in the further design and development of their pilot and in tracking and applying the
results. This framework was used to begin assessing the first six pilots and will support technical
assistance to and evaluation of all CE Pilots going forward. The framework and sample
outcomes documents may be found in Appendices G & H.
The first six sites to work with WINTAC on CE pilots serve as a strong “test sample” to
demonstrate the effectiveness of WINTAC’s technical assistance as well as the impact of pilots
to date. The original sites are led by California-Combined, Idaho-General, Minnesota-General,
Montana-Combined, Nevada-Combined and South Carolina-Blind. So far in these sites, a
foundation has been laid for the VR community to begin offering employment services and
options for individuals with the most significant disabilities. For the most part, these are states
where little to no foundation had existed previously and very little in the way of effective
services had ever been considered, much less offered to achieve competitive integrated
employment for this population. Through the pilots, WINTAC, its partners, and the SVRAs and
their partners invested technical assistance, funds, training and supports to build this foundation.
As a result, the pilots are seeing preliminary, measurable results that will help inform future
actions and demonstrate progress toward achieving outcomes. They are also producing
unforeseen impacts with broader implications for the VR system and services.
Activities and Outputs. Laying a foundation to introduce an innovative and paradigm-
shifting practice is no small feat. For each of these sites it has involved investments in actions
such as making the case for CE, building consensus, developing a framework (vision, plans,
resources, partnerships), creating or adapting service delivery infrastructure to pilot and
potentially adopt CE, and providing multiple rounds of intensive training and certification
opportunities.
Individual pilot outputs for planning, infrastructure, capacity-building and
implementation are reported in the CE Evaluation Worksheets in Appendix I. For pilot planning,
examples include: logic models (including expected outcomes), presentations and strategy
sessions. Examples of infrastructure outputs include: policies, procedures, fee codes and
structures, tools, guidance, job descriptions and performance expectations. Two of the most
impressive infrastructure outputs came from MN-G’s pilot. The first is a CRP CE competency
review protocol and process developed and launched in September 2018. On its first cycle, 24 of
37 (65%) practitioners were found to meet competency standards, which meant that 13 of the 20
(65%) CRPs requesting authorization are now contracting to provide CE services to MN-G
consumers. A second infrastructure output is MN-G’s intensive CE training curriculum and
protocol, developed, tested and ACRE-certified in the course of the pilot and with the support of
WINTAC and Y-TAC. It is currently in its soft-launch, with the hard launch scheduled for
January 2019.
The pilots’ capacity-building outputs include twelve in-depth, nationally certified CE
training sessions conducted by certified, national trainers and completed by 405 professionals
holding various positions with VR, CRPs, DD agencies, education and other partners. Trainings
typically included a 5-to-6 day classroom portion followed by a 6-to-12 month series of
performance-based consultations and mentoring where practitioners receive guidance in
completing the entire CE process with a consumer “learning partner” and submit documents and
case notes for mentor feedback and approval. Some sites have also incorporated an online
learning component.
In terms of implementation outputs, each pilot site defined or selected implementation
regions and teams – for example, CA-DOR and IDVR each designated multiple regional sites,
whereas the SCCB and MN-G pilots have more of a statewide focus. There are a total of 13
implementation sites across the six pilots, each with teams and cohorts of consumers or “learning
partners” referred according to pre-determined selection criteria and receiving CE services.
CE Pilot Outcomes. Individual pilot outcomes are reported in the CE Evaluation
Worksheets in Appendix I. Overall, given the need for CE to be sustainable and to merit future
investment, the fact that these pilots have laid a foundation for CE implementation where
previously there was none (i.e., a baseline of zero) is itself a substantial and measurable
outcome for each of them. The fact that they have all encountered
challenges (see “Challenges and Solutions”) confirms that there is more work to be done in
stabilizing these foundations, with sustainability and evaluation continuing to be key pilot
imperatives.
Other measurable outcomes for the CE Pilots relate to service delivery, consumers and
capacity. Ultimately the types of short- and long-term impacts to look for will parallel those
identified in “Sample Outcomes for Customized Employment Pilots” document in Appendix I.
At this point in the pilot initiative while capacity-building is still in progress, it is unrealistic to
expect to see big numbers in any of those areas. But to paraphrase most pilot leads, “going from
zero to a handful is huge." For example, given what it takes for a practitioner to meet
competency standards, not to mention to develop a track record of helping individuals achieve
customized employment outcomes, the fact that a site has any certified practitioners where there
used to be none – or even several who are trained and poised for certification – is a success.
As a result of the CE Pilots, there are now a total of 131 professionals certified or
qualified to perform functions related to CE or Discovery services in 32 organizations in the six
sites. This includes 63 active, certified practitioners, 29 of whom are VR staff providing direct
services. An additional 68 VR staff or supervisors are qualified to make CE referrals, monitor or
assess the quality of the CE process and products, or supervise practitioners. Once these
practitioners are functioning at full capacity, it can be projected that these pilot sites will have the
ability to provide CE services to a significant number of individuals per year.
In the meantime, it is instructive to look at other service delivery outcomes. The two that
are tracked by all six sites relate to the number of individuals referred for CE or Discovery and
the number who started the Discovery process. Over the implementation of all six pilots, 118
individuals were referred and 83 started the process. Numbers starting the process ranged from
as few as one client in one site, to as many as 53 in another. This is significant in light of a
baseline of virtually zero. The lack of additional aggregate case data points to one evaluation and
sustainability challenge, i.e., data collection and tracking systems not being adapted for CE at the
beginning of the pilot.
In terms of consumer outcomes, site leads again testify that seeing the progression of the
CE process result in competitive integrated employment for any individual who in the past may
have lingered on SE caseloads, or who heretofore would have been considered unemployable, is a
profound success. While aggregate data are not available at this time, case studies prepared by
MN-G yielded helpful information on a sample of twelve individuals. Five of those individuals
are now working in customized jobs, two of whom have returned for job search services to
increase their hours. One other is in job search and another is starting an internship. The five
remaining are not pursuing employment at this time, in most cases due to parent or guardian
decisions on their behalf.
While these results and the stories behind them cannot be projected onto the full cohort of
individuals served in the pilots, it is likely that a considerable portion of the individuals served
will achieve competitive, integrated employment.
Anecdotal and Unforeseen Impacts. The pilot initiative has provided a laboratory to
observe other impacts – some of them not yet deliberately measured, others unforeseen. Those
not deliberately quantified at this point in time require more data and rigorous study, but have
been previewed anecdotally in the initial pilots. For example, return on investment, which figures
heavily in decisions to move from pilot to expansion or full adoption, was not a focus of the
evaluation in Year 3, yet anecdotal evidence is promising. One pilot team member put it best,
referencing her experience with SE when compared with CE from a strong CE provider: "SE has
been a revolving door and that's after spending $10,000 on a case for job coaching where the
individual typically only lasts a year on the job. If you don't have outcomes after recycling
people in SE, the investment isn't worth it. When people say CE is expensive they should be
presented this information. We pay very little in ongoing supports from one CE provider because
they got paid more up front to make a better job match."
On a related note, implementing CE in many of the sites has showcased both the necessity of
strategic alliances and shared resources for successful CE implementation and the natural
environment CE creates for them. The following examples demonstrate rich potential for studying
system-wide impact on resources and service delivery.
• The CA project emerged from the state's "Competitive Integrated Employment Blueprint"
initiative and, from the outset, has been led and financed by a coalition of VR, DD and
education partners.
• NV collaborates with the DD regional center, where CE started as an Employment First
initiative. The majority of CE referrals come from the regional center and arrangements
for shared funding are in progress.
• IDVR consulted extensively with the DD agency where the state's exposure to CE had
begun with the Employment First initiative. As work on the pilot began, the DD agency
was instrumental in helping break down barriers with parents and families. Idaho’s
education sector was also heavily represented in planning and site selection, and
participated in the original trainings to facilitate referral and transition. Schools continue
to be interested in the CE model and are considered a significant referral partner for the
pilot.
• Through South Carolina’s Employment First Initiative, SCCB and partners are
collaborating to build capacity to use Customized Employment to assist individuals
statewide in achieving competitive integrated employment outcomes.
• MN has partnered with MSOCS to develop and implement the state’s CE training
curriculum and with the blind agency to underwrite a portion of the new CE trainings.
• The CE Pilot evaluation strategy includes participant surveys to assess knowledge gains
and feedback on the classroom portions of the intensive CE training. The results will be
presented as soon as returns are in. The strategy does not yet include measures or
methods to assess the “performance-based” or mentoring portions of the training models.
Preliminary observations, however, suggest that such feedback would be valuable for
future training and technical assistance purposes. For example: “The pilot has provided a
great learning experience. The workers who are doing the best are the ones who are the
most tenacious - they "stuck like glue" to the mentor, made him break everything down
and took full advantage of the in-person mentoring opportunities.” It has also been noted
that closer pilot team monitoring and support of trainee participation in the performance-
based activities might yield more timely and productive results.
• Finally, the magnitude of the systems and cultural changes that have already occurred must not
be overlooked. Already in these sites, the following observations can be made:
• “We’ve done a lot to enlighten people and let them know there’s a better practice.
Even if they never use it, it helps them appreciate that people they used to think
are unemployable are in fact employable.”
• "There has been such a paradigm shift going from finding people employment to
discovering someone's personal genius. It's a different mindset."
• “Especially with the newer Section 511 population, MN is embracing the
realization that these individuals with the most significant disabilities who want to
pursue integrated employment may come to VR multiple times throughout their
working life. It's changing the paradigm from ‘here we go again’ to ‘they’re ready
for the next step’ – someone referred to it as ‘failing forward.’ The agency is
looking at how to learn from this – getting staff to recognize that ‘this is who
we’re serving, this is who we’re here for.’”
• “We need the believers – the ones who say, ‘let’s give it a try.’ They’ll be the
early adopters – then everyone else comes along eventually.”
• "CE resources and trainings available through WINTAC have helped build
capacity to provide disability services differently in the state of Minnesota.
Thanks to WINTAC, YTAC and MG&A we were able to put foundational pieces
in place and transform, and help [staff] look at and identify a new way to serve
individuals referred under Section 511.”
• “SCCB believes that the CE pilot has significantly changed the culture, practices,
resources, policy and capacity of SCCB’s VR program. Prior to the pilot, SCCB
did not have a supporting infrastructure, staff knowledge, skills or abilities,
experience or history, nor partnerships, necessary to effectively provide
customized employment. During the training pilot, SCCB invested considerable
time and money to build the core infrastructure and program alignment to support
customized employment. Due to this pilot, SCCB has the foundation in place to
continue toward building out customized employment services.”
Beyond the internal impacts, the pilot has propelled CE onto the statewide stage. A prime
example is South Carolina’s Employment First Initiative, with CE at its core, which represents a
monumental systems change for this state. Moreover, it is groundbreaking that a VR-Blind
agency would be leading this coalition. SCCB attributes these achievements to its partnership
with WINTAC and Y-TAC in implementing the Customized Employment pilot.
Implementation Challenges and Solutions. The following challenges arose in the
course of planning and implementation for these pilot sites:
• Ongoing CE training and support: Opportunities for ongoing CE certification, refresher trainings
or mentoring are limited.
• Provider competencies: Pilot practitioner credentials, experience and competency vary from
provider to provider, with some being slow to understand or accept the distinction between SE
and CE - or lacking in experience and confidence to trust the CE process.
• Staff hiring and retention: Most sites report serious attrition issues throughout the VR industry as
a major challenge. While one site lost a trained worker due to an inaccurate position
description and interview questions, there is wide consensus that hiring and retaining qualified
workers is a challenge and CRPs in particular are churning through staff. This is especially the
case when they train the best and the brightest for what is viewed as a higher order of work,
and once trained, lose them to better-paying positions. 30-35% of trained CRP workers leave
the industry entirely. "Nationwide, the numbers tell the story: If you pay people $15/hr for this
level of work you will keep losing them."
• Consensus on CE definition and standards: With increased demand for CE, some members of
the vendor community try to take shortcuts or pass off partial or substandard versions of the
practice as CE. Further, without proper preparation, training participants may feel they are
already using this approach and refuse to listen or to differentiate between CE and business as
usual.
• Provider buy-in and accountability: Some CRPs struggle with their investment in and support of
CE, due to a variety of factors such as lack of understanding of the model, lack of worker
confidence or competence, implementation delays and staff turnover. This can result in failure
to meet training and reporting commitments and ultimately impede pilot implementation.
• Consumer selection: Delays in selecting or engaging consumers due to lack of advance
preparation, unanticipated consumer barriers or unavoidable circumstances have held up pilot
implementation.
• Setting expectations: The implementation team did not fully anticipate what the process would
involve, so it was not able to accurately set clients' expectations about what the process would
look like or how long it could take, or to set CRPs' expectations about timelines and
accountability.
• Data collection and tracking: Several sites have no process for tracking the status of cases that
have started or completed the CE process. These data are critical in determining whether
progress is being made toward project goals, whether changes are needed in the pilot design
and whether expansion or full adoption of CE is justified.
• Training – methods, process and oversight: Some sites struggled with excessive time between
classroom and performance-based training, some of the training methods and tools, and the
need to hold trainees accountable for commitments they make to the training process. A few
sites experienced significant delays in completing the mentoring portion of the training,
making it difficult to sustain momentum, enthusiasm and confidence in the process.
• Geography: One site found that the implementation model as initially designed would not reach
consumers in remote areas.
• Institutional or leadership issues: One VR management team reported limited experience and
unrealistic expectations about the CE process and timelines. Initiatives of any kind can fall
victim to organizational changes, shifting priorities, diminished resources, and the like. While
risk management measures are always advisable, such factors can still seriously impede or even
terminate a project. One pilot site experienced major institutional upheaval halfway through
implementation and is only now able to regroup and determine how to proceed.
The following solutions were found by sites over the course of pilot implementation:
• CE definition, standards and messaging: Some sites have positioned themselves to send clear
messages about CE, ranging from the focus on practitioner certification and fidelity standards
to what to expect from the CE process and how to participate.
• CE evaluation protocol: Several teams have developed tools and protocols for VR to judge
the quality of CE case results and deliverables.
• In-house training and certification: To address practitioner turnover and ongoing support
needs, several sites are considering accessible and affordable alternatives to the national
trainers. One site's solution has been to develop and certify an in-house curriculum delivered
by certified trainers four times per year. Others are coupling online certification training with
expert mentoring. Many have leveraged partner resources to help pay for the training.
• Accommodating nontraditional work practices: One pilot provided an opportunity to work
through state HR barriers to VR staff providing direct CE services (e.g., nontraditional hours,
use of state vehicle, releases for services).
• Continuity of CE implementation: One pilot that was interrupted by major organizational
upheaval has demonstrated - in retrospect - the value of following through on, and evaluating
the results of, the investments already made in CE training and certification by the agency
and certified providers.
• Sustainability: One site is successfully leveraging the pilot experience to broaden the scope
of CE implementation across the entire workforce and Employment First system, with the
expectation that a shared commitment and investment among all partners will ensure that it
continues to be supported and sustained irrespective of the limitations of any one partner.
These challenges and solutions will be taken into account by the pilots in further shaping
the design and implementation of their projects, and by the WINTAC team in enhancing the
technical assistance and evaluation approaches going forward.
F. Other Special Project Possibilities
The other topic areas covered by the WINTAC could also have associated evaluation activities
that look beyond the outcomes specifically articulated in ITAs to examine broader trends and
SVRAs' national progress for comparison purposes, both as part of the WINTAC evaluation of
TA and training and to determine relationships between areas of TA. For example, WINTAC TA
Specialists have developed a checklist for Common
Performance Measures to determine SVRA activity and efforts that would align with effective
implementation. Results from this tool could then be related to expected progress on additional
criteria such as: levels of integration and co-enrollment statistics; co-development of a
unified/combined state plan; holding an annual meeting with core partners to
review/update plans; using a common case management system with core partners; conducting a
process flow map/chart with core partners; employment metrics; earning metrics; credentials
metrics; measurable skills gains; and levels of business engagement.
In the area of Pre-ETS, a broader evaluation could similarly determine the state of
practice nationally, serving as a benchmark for WINTAC ITA states and providing examples of
peer practices. The assessment could document and analyze how SVRAs are spending their 15%
reserve and which strategies correlate with more effective implementation and improved client
outcomes.
G. Summary
WIPPs and Special Projects offer WINTAC an opportunity to engage in focused activity
in innovative areas, using new techniques and technology, and to elevate the evaluations of
impact associated with WINTAC’s primary topic areas. Year Two saw the development of robust
evaluation plans for several of these areas, notably SARA and the Integration Continuum Tools,
and Year Three saw these evolve further and begin implementation. Efforts and discussions in
Years One and Two also reshaped some activities and features of the tools being provided,
reflecting WINTAC’s commitment to continuous quality improvement. Year
Three saw Customized Employment in particular ramp up significantly in terms of the number of
pilot projects and the progress in their implementation. Evaluation plans have been developed
and implemented and allow for comparison across common structures, while being responsive to
the needs of SVRAs. Several tools to facilitate effective CE implementation by agencies have
been developed and the Evaluation Team has constructed data collection and evaluation tools as
well. Findings will allow for evaluation of effective training approaches, examination of integrity
in implementation, comparison of implementation approaches in terms of agency structures (e.g.,
fee schedules, turnover and training approaches, etc.), and the connection between these factors
and client outcomes. Unforeseen impacts reported demonstrate significant and valuable systems
changes that have already taken place.
IX. Conclusions and Recommendations
The WINTAC has completed its first three years of implementation with incredible
success, exceeding targets in several cases. One sign of credibility and trust from the VR
community is the linkage between universal, targeted and intensive TA requests. Several
universal TA requests subsequently developed into targeted or intensive TA relationships;
similarly, one-on-one and joint targeted TA sessions have resulted in the development of
intensive TA agreements. The response from SVRAs to the opportunity of receiving intensive
TA underlines the need for the WINTAC. While the work plan called for developing intensive
TA agreements with 23 SVRAs in five years, the WINTAC has already established 32
agreements with 34 SVRAs.
The WINTAC has also worked to build partnerships with other TACs and organizations
(including Y-TAC, NTACT, JDVRTAC, and PEQA and DEI centers), and leverage resources to
deliver collaborative trainings, develop resources jointly, and partner in providing intensive TA
to SVRAs.
Universal TA metrics and evaluation demonstrate consistently high usage of the website
and value for the resources, as well as utilization of information obtained. Evaluation of Targeted
TA metrics also demonstrates remarkably high levels of activity and service provision by
WINTAC and follow-up evaluations indicate participants of trainings have begun to put the
knowledge they obtained into action at their agencies. In Year Two, some participants suggested
that they would like more information about peer practices; in Year Three, WINTAC delivered
on this, as reflected in appreciative Year Three evaluation survey responses.
Communities of practice demonstrate high levels of engagement; Summits and Forums will
be evaluated in the same way as CoPs and one such evaluation of a Summit for agencies serving
individuals with blindness or low vision showed very positive evaluations immediately after and
three months later. Intensive TA, as noted above, reflects WINTAC’s high performance in one
simple metric: in one-and-a-half years of operation, WINTAC has already implemented 32
intensive TA agreements in 34 states, thus exceeding the minimum of 23 ITAs required of it.
This reflects WINTAC’s unmitigated commitment to providing the highest level of service to
any SVRA that needs it. Meeting the minimum requirements is the least of any TA Specialist’s
concern. WINTAC staff are driven to ensure WIOA implementation is effective, and most
importantly, impactful. WINTAC’s WIPPs and Special Projects also highlight this level of
commitment. When the tools to implement or measure WIOA implementation are not available
or are insufficient, WINTAC staff develop, validate, improve, and provide them. Resources
developed will have value not only for states currently receiving intensive TA, but also for the
fields of vocational rehabilitation, workforce development, and disability employment at large.
X. References
Bamberger, M., Rugh, J., & Mabry, L. (2006). RealWorld Evaluation: Working Under Budget,
Time, Data, and Political Constraints. Thousand Oaks: Sage Publications, Inc.
Bolman, L., & Deal, T. (2013). Reframing Organizations: Artistry, Choice and Leadership (5th
ed.). San Francisco: Jossey-Bass.
Bryson, J. (2011). Strategic Planning for Public and Nonprofit Organizations: A Guide to
Strengthening and Sustaining Organizational Achievement. San Francisco: Jossey-Bass.
Cogburn, D. L. and Levinson, N. S. (2003), U.S.–Africa Virtual Collaboration in Globalization
Studies: Success Factors for Complex, Cross-National Learning Teams. International
Studies Perspectives, 4: 34–51. doi:10.1111/1528-3577.04103
Cousins, J.B., & Earl, L. (1992). The Case for Participatory Evaluation. Educational Evaluation
and Policy Analysis, 14(4), 397-418.
De Silva, M. J., Breuer, E., Lee, L., Asher, L., Chowdhary, N., Lund, C., et al. (2014). Theory of
Change: A theory-driven approach to enhance the Medical Research Council’s framework
for complex interventions. Trials, 15, 267. doi:10.1186/1745-6215-15-267
Duhigg, C. (2012). The Power of Habit: Why We Do What We Do in Life and Business. New
York: Random House.
Graham, I., Logan, J., Harrison, M., Straus, S., Tetroe, J., Caswell, W. and Robinson, N. (2006).
Lost in knowledge translation: Time for a map? The Journal of Continuing Education in
the Health Professions, 26, pp 13-24.
Kania, J., & Kramer, M. (2011). Collective impact. Stanford Social Innovation Review, 9(1),
36–41.
Kania, J., & Kramer, M. (2013). Embracing emergence: How collective impact addresses
complexity. Stanford Social Innovation Review, 11(1), 1–8.
Knowles, M., Holton, E., & Swanson, R. (2005). The Adult Learner: The Definitive Classic in
Adult Education and Human Resource Development. Boston, MA: Elsevier.
Odom, S., Cox, A. and Brock, M. (2013). Implementation Science, professional development
and autism spectrum disorders. Exceptional Children, 79(2), 233-251.
Patton, M. Q. (2008). Utilization-Focused Evaluation. Thousand Oaks: Sage Publications, Inc.
Prinsen, G., & Nijhof, S. (2015). Between logframes and theory of change: Reviewing debates
and a practical experience. Development in Practice, 25(2), 234–246.
doi:10.1080/09614524.2015.1003532
Riesen, T., Morgan, R. L., & Griffin, C. (2015). Customized employment: A review of the
literature. Journal of Vocational Rehabilitation, 43(3), 183–193.
Rossi, P., Lipsey, M., & Freeman, H. (2004). Evaluation: A Systematic Approach (7th ed.).
Thousand Oaks: Sage Publications, Inc.
Stetler, C.B., Richie, J., Rycroft-Malone, J., Schultz, A., & Charns, M. (2007). Improving quality
of care through routine, successful implementation of evidence-based practice at the
bedside: An organizational case study protocol using the Pettigrew and Whipp model of
strategic change. Implementation Science, 2, (3).
Van Achterberg, T, Schoonhoven, L. and Grol, R. (2008). Nursing implementation science: How
evidence-based nursing requires evidence-based implementation. Journal of Nursing
Scholarship, 40(4), 302-312.
Wenger, E., Trayner, B., and de Laat, M. (2011). Promoting and assessing value creation in
communities and networks: a conceptual framework. Rapport 18, Ruud de Moor
Centrum, Open University of the Netherlands. Retrieved from http://wenger-
trayner.com/documents/Wenger_Trayner_DeLaat_Value_creation.pdf.